> He has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.
This is not really true. There were complex multilayered systems before computers.
In large systems, Western Electric #5 Crossbar was comparable to a large real-time program. General Railway Signal's NX system had the first "intelligent" user interface. But that level of complexity was very rare.
Both mechanical design and electronic design are harder than program design. The number of people who did really good mechanism design is tiny. There were only two good typesetting machines, over most of a century - Mergenthaler's Linotype and Lanston's Monotype. Everybody else's machine was a dud. In the printing telegraph/Teletype business, Howard Krum and Ed Kleinschmidt designed the good ones, and the other twenty or so designs over many decades were much inferior. There have been lathes for centuries, but all modern manual lathes strongly resemble Maudslay's design from 1800.
There are far more good programmers than there were good mechanism designers or electronics engineers. Programming is not really that hard by the standards of other complex engineering.
> Both mechanical design and electronic design are harder than program design. The number of people who did really good mechanism design is tiny. There were only two good typesetting machines, over most of a century - Mergenthaler's Linotype and Lanston's Monotype.
That’s a good example, and of course it immediately brings to mind TeX, which is an equally monumental if not greater achievement. Certainly there’s no denying that TeX has considerably higher dimensionality than the pre-computer hot-metal typesetting machines, especially when you include all the ancillary stuff like Metafont.
Also recall that Dijkstra was a systems programmer in his industry career. He was well aware of the complexity of the computing hardware of the day—which was cutting edge electronic design. The semaphore wasn’t invented as a cute mathematical trick; he needed it to get hardware interrupts to work properly. Something which THE managed and Unix, among others, never quite did (although it did get to mostly good enough if you don’t mind minefields).
> There are far more good programmers than there were good mechanism designers or electronics engineers. Programming is not really that hard by the standards of other complex engineering.
Most programmers are incapable of writing a correct binary search, let alone something the size and complexity of TeX with only a handful of relatively minor errors. Programmers capable of that level of intellectual feat are indeed few and far between. I suspect they’re more rare than competent EEs or MEs.
Most programmers are more comparable to the guys cleaning the typesetters, not the ones designing them.
> TeX, which is an equally monumental if not greater achievement.
TeX didn't come out of nowhere. It's the successor to a line of macro-based document formatting systems that began with RUNOFF and ran through roff, nroff, tbl, eqn, MM, troff, ditroff, and groff. The last remaining use of those tools seems to be UNIX-style manual pages. There was so much cruft that a restart was required.
Linotype didn’t come out of nowhere either. Printers used to cast type manually.
And don’t just gloss over TeX’s astounding correctness. It’s a truly remarkable feat of the human intellect to design something so large with so few errors.
For those who haven't seen Knuth's own analysis of his errors while writing TeX, it's well worth reading his 1989 article "The Errors of TeX" [1] and glancing through the full chronological list of errors [2].
> Most programmers are incapable of writing a correct binary search
If you exclude the programmers who are incapable of writing FizzBuzz (who I would consider "not programmers", no matter what job title they managed to acquire), then I'm pretty sure your statement is false.
If you mean "could sit down and write one that worked the first time without testing", then yes, you could be right. But could not write one at all? I don't buy it.
> If you exclude the programmers who are incapable of writing FizzBuzz
FizzBuzz really is trivial. Binary search on the other hand is deceptively tricky[1]. It was well over a decade from its discovery to the first correct published implementation! No doubt if asked to write it, you'd look it up and say that's trivial, all the while double-checking Internet sources to avoid the many subtle pitfalls. You might even be familiar with one of the more famous errors[2] off the top of your head. And even then the smart money at even odds would be to bet against your implementation being correct for all inputs.
And if you had to do it just from a specification with no outside resources? Much harder. At least unless you know how to formally construct a loop using a loop invariant and show monotonic progress toward termination on each iteration. Which brings us back to the original submission. There are some programs that are pretty much impossible to prove correct by testing, but that can, relatively easily, be shown to be correct by mathematical reasoning. Since this is a comment on a submission by Dijkstra, here[3] is how he does it in didactic style.
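A sketch of that invariant-driven construction (my illustration, not Dijkstra's own code): the invariant is that any occurrence of the key lies in a[lo..hi-1], and hi - lo shrinks on every iteration, which guarantees termination.

```c
#include <stddef.h>

/* Binary search over a sorted int array. Returns the index of key,
   or -1 if key is absent.
   Invariant at the top of each iteration: if key is in the array,
   it lies in a[lo..hi-1]. Progress: hi - lo strictly decreases. */
ptrdiff_t bsearch_idx(const int *a, size_t n, int key) {
    size_t lo = 0, hi = n;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;  /* overflow-safe midpoint */
        if (a[mid] < key)
            lo = mid + 1;   /* key, if present, is right of mid */
        else if (a[mid] > key)
            hi = mid;       /* key, if present, is left of mid  */
        else
            return (ptrdiff_t)mid;
    }
    return -1;              /* interval empty: key absent */
}
```

Each branch either returns or shrinks the interval while preserving the invariant, so correctness follows without enumerating test cases.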
> If you mean "could sit down and write one that worked the first time without testing", then yes, you could be right. But could not write one at all? I don't buy it.
Yes that's what "correct" means. Code that only works sometimes is not correct.
It wouldn’t surprise me if there were fewer programmers who know of the overflow edge case (https://research.google/blog/extra-extra-read-all-about-it-n...) and will think of it when writing a binary search than programmers who know of it but will forget to prevent it.
If the implementation language doesn’t automatically prevent that problem (and that is fairly likely), the latter group likely would introduce a bug there.
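Concretely, the fix described in the linked post is a one-line change to the midpoint computation; a sketch with 32-bit int indices:

```c
/* With 32-bit ints, (low + high) / 2 overflows whenever
   low + high exceeds INT_MAX, even though both indices are valid.
   low + (high - low) / 2 computes the same midpoint without ever
   forming the oversized sum. */
int safe_mid(int low, int high) {
    return low + (high - low) / 2;
}
```

Both indices below are individually representable, but their sum is not, so the naive formula would overflow where `safe_mid` does not.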
Yeah, if I wanted to just get started with electronics engineering, the easiest, cheapest way would be… to use software. Programming / digital engineering is lower-stakes than physical stuff.