No doubt, one has to marvel at all the recent technologies (and your understanding of them). But here's the conundrum: either economic growth has moderated because of something else, or true economic growth is not being measured. Or perhaps we shouldn't be studying technology from the perspective of its effects on economic growth alone. There are so many other ways modern life has been altered by technological advancements.
That is such a satisfying song of the flow of technology, rising straight and smooth on a logarithmic scale, that I really hate to quibble. But there is one area where I think your dates and rates understate how early the advances and geometric increases began, and that is data communications. (Voice communications lagged by ten or more years because of the inertia and indolence of the traditional leaders in communications, the telephone companies.)
By 1980, email messaging had been in place for a decade. Chip and machine designers in one state could send the details of a completed design, once checked, to a factory in another state. Not long after, the factory could be in another country. At first this only worked for companies that could afford large satellite antennas on their rooftops, but by 1990 the internet in its several early forms was firmly in place with the capacity required. By 2000, cables stretched across almost all oceans; certainly the Atlantic and Pacific were densely covered. The telecom bankruptcies of 1999-2000 left much of this buildout still dark, so it was snapped up for pennies on the dollar by the software survivors of the tech crash about a year later, Google among them. This provided the platform for the explosion in distributed services that took place in the first decade of the 2000s.
There was also quite a struggle to replace valves with transistors. The US space program helped kickstart that change, although, god knows, audiophiles are enraptured by valves, and I read some years ago of a valve factory being reopened to meet demand. What next, re-creations of venerated early computers built with valves? It isn't just the silicon hardware either. It is the explosion of software languages and tools that make use of the hardware's power. The world might look very different if we were stuck coding in binary, or even assembly language, rather than in higher-level languages. [There are so many, in so many diverse applications, that there is no consensus on which language to pick as your first. Once it was easy: BASIC for home computers, or Fortran for big machines. Then C became available and we were off to the races. The evolutionary tree of languages is huge today.]
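Just to make that gap vivid, here is a toy comparison of my own (not from the essay): the same little home-computer BASIC loop, shown in the comments, next to its C equivalent. In raw assembly the same job would take dozens of instructions plus explicit register bookkeeping.

    /* The same program at two rungs of the language ladder.
       Classic home-computer BASIC (as comments):
         10 FOR I = 1 TO 5
         20 PRINT I * I
         30 NEXT I
       And the C version below, a single small loop. */
    #include <stdio.h>

    int main(void) {
        for (int i = 1; i <= 5; i++) {
            printf("%d\n", i * i);  /* squares of 1..5 */
        }
        return 0;
    }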
If Tim Berners-Lee hadn't invented the WWW, would someone else have? I recall being asked at the beginning of the 1990s what I thought would be the next big thing in computing. All I could say was a vague "communications". But I dismissed email as inferior to fax, based on my experience of BB text messages. Doh! I will say that the focus on huge social platforms is a disappointment. I would have thought there were more socially and economically useful directions for information technology to attack. AI has been waiting in the wings almost since its inception. Maybe now it is becoming really useful, much as BASIC brought computing to the enthusiasts and the web has made communication and services available to almost anyone willing to use them.
There is a steampunk subculture in computing, which you can encounter at the Computer Museum in San Jose. I go back far enough that I even have experience programming one or two of the late models. And even today, there are groups using the 6502 microprocessor (the roughly 3,500-transistor heart of the Apple I) as a target design for all sorts of retro technologies, some large enough to see with the naked eye.
Some are delighted by the accumulation of layers of software that allow each of us to say "Hello World" to each other at ever higher levels of encapsulation and objectification. Others despair that the pure brilliance of machine language lies so deep in this ocean, buried under thick layers of whaleshit. Ask Google to find you "Don Knuth's Tears," a seminar he gave not long ago at Stanford.
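To make the layering concrete, here is a minimal sketch of my own (assuming a POSIX system): the same greeting sent once through a raw kernel write() and once through the buffered stdio layer built on top of it.

    /* "Hello World" at two levels of the stack: a raw POSIX write()
       on file descriptor 1, then the buffered stdio layer built on
       top of it. Each added layer trades closeness to the machine
       for convenience. Assumes a POSIX system. */
    #include <stdio.h>      /* high-level, buffered I/O     */
    #include <string.h>
    #include <unistd.h>     /* low-level POSIX system calls */

    int main(void) {
        const char *msg = "Hello World\n";

        /* Near the bottom: hand the bytes straight to the kernel. */
        write(STDOUT_FILENO, msg, strlen(msg));

        /* Several layers up: formatting, buffering, locale handling. */
        printf("Hello World\n");
        return 0;
    }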
I will look up that reference. When I lived in San Jose/Los Gatos I visited the Computer Museum. Fun to see hardware from the deep past and hardware that I used even pre-Apple II. (Wasn't it shut down a while back?) Facebook has those sorts of groups too, like "Minimalist Computing". As for layers of abstraction, Vernor Vinge describes such deep layers of code in the far future, where no one wants to recode old code, so they just add a new layer on top of the old. Not so different conceptually from adding interfaces to old mainframe [COBOL] code so that more modern languages could provide new UIs and communications. Conceptually it's just a wrapper, something like the sketch below. If the purpose is to make things easier and to be more productive, then that is fine with me. I'm not a purist.
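A toy version of that wrapper pattern, with hypothetical names standing in for the legacy routine:

    /* The "new layer on top of old code" pattern. legacy_calc_balance()
       stands in for an untouched legacy routine (name and signature
       hypothetical); the wrapper gives it a friendlier interface
       without recoding the original. */
    #include <stdio.h>

    /* Decades-old code nobody wants to touch: works in integer cents. */
    static long legacy_calc_balance(long cents_in, long cents_out) {
        return cents_in - cents_out;
    }

    /* The new layer: clearer units, same old engine inside. */
    static double account_balance(double deposits, double withdrawals) {
        long cents = legacy_calc_balance((long)(deposits * 100.0),
                                         (long)(withdrawals * 100.0));
        return cents / 100.0;
    }

    int main(void) {
        printf("balance: $%.2f\n", account_balance(125.50, 40.25));
        /* prints: balance: $85.25 */
        return 0;
    }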
I was somewhat saddened by the video of Knuth's talk at Stanford, "The Tears of Donald Knuth." I think Haigh's critique of Knuth's lecture was more valid: "Historical Reflections: The Tears of Donald Knuth. Has the history of computing taken a tragic turn?"
The power of technology has transformed the world. As much as it quickly arbitraged goods and labor prices around the world, it also spawned new industries. In a way previously impossible, imagination and creativity can now become marketable commodities.
I think you were indulging yourself with the explanation of why doped silicon (and what about germanium?) is a semiconductor. It would have been better to devote some space to the rise of information handling, especially the replacement of valves with transistors and why information technology is so important. You give a few examples, but it really is a transformative technology, even if economists cannot seem to detect the productivity gains from computers. [I think economists miss the real value of infotech in terms of deeper analyses in science and technology, and improving product design and quality. IIRC, hedonic pricing captures features, but not intangibles like design quality.]
It's a good explanation. Thanks for sharing it.