An outtake left on the cutting room floor from the ms. of "Slouching Towards Utopia?: An Economic History of the Long 20th Century": One set of lenses with which to view the technological core of modern economic growth brings into focus General Purpose Technologies: those technologies where advances change, if not everything, then a lot, as they ramify across sector upon sector.... In the 1950s there came another GPT. Electricity was no longer a sector, for it was everywhere. But there emerged something called “microelectronics”...
This is the hardware side, but the software side has its own story. Modern accounting might flow from late Medieval Italy with its bankers and adoption of Arabic numerals, but modern information processing was developed in the 19th century. At mid-century, Matthew Fontaine Maury at the U.S. Naval Observatory set out to organize the data being collected in ships' logs, which until then had been rather idiosyncratic. It took a lot of work to fill in the missing entries, slog through the irrelevant ones, and attempt to discern patterns and understand the oceans. Maury encouraged a standard reporting form that is still familiar today. He developed, in effect, what we would now call a relational database, though the idea wasn't formalized until the late 1960s.
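A minimal sketch of what "standard reporting form plus relational organization" amounts to in modern terms: every observation shares one fixed set of fields, so pattern-finding becomes a query. The table layout, column names, and sample values below are assumptions for illustration, not anything Maury specified.

```python
# Illustrative sketch only: a modern, relational rendering of the kind of
# standardized ship's-log observation Maury pushed for. Schema and sample
# values are invented for illustration, not historical data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE log_entry (
        ship        TEXT NOT NULL,   -- vessel reporting the observation
        obs_date    TEXT NOT NULL,   -- date of the observation
        latitude    REAL NOT NULL,   -- decimal degrees, north positive
        longitude   REAL NOT NULL,   -- decimal degrees, east positive
        wind_dir    TEXT,            -- compass point, e.g. 'SE'
        wind_force  INTEGER,         -- Beaufort-style scale
        current_kt  REAL             -- observed surface current, knots
    )
""")

# A few made-up observations standing in for thousands of standardized reports.
conn.executemany(
    "INSERT INTO log_entry VALUES (?, ?, ?, ?, ?, ?, ?)",
    [
        ("Flying Cloud", "1852-03-01", -10.5, -32.0, "SE", 4, 0.8),
        ("Flying Cloud", "1852-03-02", -12.1, -33.4, "SE", 5, 1.1),
        ("Sovereign",    "1852-03-01", -11.0, -31.5, "E",  3, 0.7),
    ],
)

# Once every report shares one schema, discerning a pattern is a query:
# average wind force and current by coarse latitude band.
for row in conn.execute("""
    SELECT CAST(latitude / 5 AS INTEGER) * 5 AS lat_band,
           AVG(wind_force) AS avg_wind,
           AVG(current_kt) AS avg_current
    FROM log_entry
    GROUP BY lat_band
    ORDER BY lat_band
"""):
    print(row)
```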
By the late 19th century, all sorts of data were being moved into the new form, and the sheer volume generated by the census led to the development of data-processing hardware; but the software was pervasive well before Hollerith's or von Neumann's breakthroughs. By the 1920s, tabulating machines improved with high-torque electric motors, together with flexible filing systems, were changing the information structure of business, government, and industry. The tabulators and sorters were visible as technology; the filing systems (large-scale rotary and linear files, printed cards for data structuring, and formalized procedures for data access and modification) were hidden in plain sight.
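To make the "hidden in plain sight" software a little more concrete, here is a small hypothetical sketch of what a sorter-and-tabulator workflow did: distribute cards into bins keyed on one field, then total each bin. The record fields and values are assumptions for illustration only.

```python
# Hypothetical sketch of the pre-computer "software": a card sorter distributes
# records into bins by one keyed field, and a tabulator totals each bin.
# The record layout and values below are invented for illustration.
from collections import defaultdict

# Each "card" carries a fixed set of fields, the way a printed punched card
# imposed a fixed layout on the data it held.
cards = [
    {"district": "03", "occupation": "clerk",     "count": 1},
    {"district": "03", "occupation": "machinist", "count": 1},
    {"district": "07", "occupation": "clerk",     "count": 1},
    {"district": "07", "occupation": "clerk",     "count": 1},
]

def sort_into_bins(cards, key_field):
    """Mimic the sorter: one pass, each card dropped into the bin for its key."""
    bins = defaultdict(list)
    for card in cards:
        bins[card[key_field]].append(card)
    return bins

def tabulate(bins):
    """Mimic the tabulator: total the counters in each bin."""
    return {key: sum(card["count"] for card in batch) for key, batch in bins.items()}

bins = sort_into_bins(cards, "district")
print(tabulate(bins))   # e.g. {'03': 2, '07': 2}
```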
By the 1950s, computers had caught up with the software, and while computers became faster and more commodious, software devolved into confusion and complexity. It wasn't until the 1970s that there was a successful effort to untangle things. Unlike hardware, with Moore's Law and its easily understood benchmarks (nanometer process scale, transistor size, components per square centimeter), software appears only as a problem or a nuisance; otherwise it is invisible. It's like compression attachment technology: either the flint stays on the head of the spear or the spear is a piece of junk.
As a software engineer, I have always been impressed with software's invisibility. It is still evolving. Silicon Valley is full of startups trying to make it easier to make sense of the massive flow of data. Machine learning, for all its flaws and biases, is playing a part. The real triumphs will be invisible. Advertisers will quietly be able to target potential customers without invading privacy. Security attacks will be silently thwarted. Systems will be reconfigured transparently in the face of spikes or outages. As with electricity, the lights might flicker for an instant, but no one will notice a thing.
Yes. I don't have a good handle on the software side, however, save for some scattered observations about SAGE, System 360, WIMP, UNIX—& brain-hacking via dopamine loops & gamification...
I see software as having the same relationship to hardware as language has to human wetware. Bigger brains are useful, but it was the invention of language and its expressive power that really helped launch human civilization. Software languages have rapidly evolved to capture different requirements and continue to do so. It is even possible that new software languages will facilitate AGI and really change everything.
It's a tempting frame, but just as steam and machining have no meaning without the imperial projects of navies and railroads (the social constructs that produced the collective ability to do the logistics of empire, which in turn created the demand for better tools), VLSI has no meaning without the mammonite project of knowing where absolutely all the money is.
(yes, there was a huge flowering of interest and effort and creativity; very little of it did anything and none of it did anything systemic, because we're still using quill-pen-and-ledger organization, just very fast.)
It took some time to extend the customary control systems and the channels of guaranteed profit, but it happened. While it remains possible to get very rich, the same thing will keep happening with biotech and the wet nanotech and the witchcraft that is solid-state materials science. The rich cannot tolerate meaningful change. (Which is why it matters that the English Pirate Kingdom was a marcher state, and why it matters that VLSI's beginnings have something to do with the holy and unquestionable need for better ICBMs.)