8 Comments

Lovely set of thoughts and questions. Every task and every individual carrying out a recognized function operates along a distribution, with the simple, boring tasks stretching out to the left and the challenging, innovative, original tasks stretching out far to the right. Most of the productivity enhancers scoop up the stuff on the left and (as long as the fellow sitting in the center has the wit and the tools to make sure it is correct and in the style desired -- there are tools already under development for this) productivity goes up many-fold. But the real gains are made on the right, where something new happens, which must be recognized, evaluated, shared, and built on. These are two very different problems. I hope the monetizable gains on the left will pay for some of the right-hand stuff.

In the stone age, we explore and extend a new idea by explaining it to others until we understand it well enough to take it further and build something new. We can't all do this at once in a global market square -- the cacophony, even if all the ideas are brilliant, is overwhelming. What seems to be missing from the blogosphere and the world of startup accelerators is some economic structure that pulls good ideas together until they reach a survivable size.


Back in 2000, Bruce Sterling wrote an article for one of the big business magazines about life in 2050. One thing stuck in my mind. He said that the production of new technology would be so rapid that if any particular technology failed to deliver, there were a number of others that could do the same thing. For the public this meant ennui would set in. To some extent we see that in the plethora of computer languages. In the 1980s, there was little disagreement about which computer language beginners would use on their PCs (BASIC), which they would advance to (C/Pascal), and then which object-oriented language to progress to (C++ was the default for C users). Can we agree on what language beginners should start with today? OTOH, for established coders, there are so many viable languages, all freely available, each with their pros and cons. You can even get free Fortran compilers today, and Linux and FreeBSD have pretty much destroyed the value of proprietary Unix OSs such as System V. In other domains, we see the same thing happening - overlapping science and technology "advances" mean that there are very few really breakthrough advances that are unique and have no comparable competitors.

This has an impact on the value of knowledge-worker output. The "unique value" AI will generate for the individual will be eroded by the sheer volume of near-identical output. Produce an analysis that gains attention and almost immediately there will be other analyses doing the same or better. That ease of competition has scientists maintaining tight control over their expensive, hard-won data, as the "crowd" could probably do useful analyses even faster, and more comprehensively, than the originator, and potentially publish faster.


"3 or 4 doublings of productivity of knowledge workers"

And yet economists such as yourself suggest that there is little evidence of productivity gains from computers. That seems like a disconnect.

Possible explanations:

1. Only a few knowledge workers experience the productivity gains, and so the gains have little impact on the national economy.

2. The productivity gains are illusory, the equivalent of mathematical masturbation. Remember the backlash against MS PowerPoint?

3. The productivity gain is real, but the gains are not correctly measured by economic metrics. I consider the improved quality of the output to be such an unmeasured metric.

4. Related to #3, the increased productivity gains are absorbed by the harder knowledge work needed to make progress. IOW, scientific and technological gains are harder to make and require the productivity gains just to match the progress that pencil and paper delivered in a less advanced period. [Example from my past: in the 1970s there was a fear that the time spent on library citation tracking would eventually prevent any new experiments from being done, because repeating past experiments was unhelpful. Electronic search has largely solved that problem. Even in the mid-2000s, we had a Stanford alumnus travel to the library to get physical copies of requested journal papers. This has been largely, but incompletely, solved by online search, legal and "illegal" access to articles, open-science journals, etc. The "runners" are no longer needed (a sketch of such a search is below).]
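To make the contrast with the library "runners" concrete, here is what that citation tracking looks like as an API call today. This is only a minimal sketch, assuming the public Crossref REST API at api.crossref.org; the query string and the fields pulled out are illustrative choices of mine, not anything from the comment above.

```python
# Illustrative replacement for a trip to the stacks: query Crossref's public
# /works endpoint for papers matching a topic and list DOI, title, and
# citation count. Assumes network access; no API key is required.
import requests

def find_papers(topic, rows=5):
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": topic, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    return [
        {
            "doi": item.get("DOI"),
            "title": (item.get("title") or ["(untitled)"])[0],
            "cited_by": item.get("is-referenced-by-count", 0),
        }
        for item in items
    ]

# What used to be an afternoon of citation tracking:
for paper in find_papers("knowledge worker productivity"):
    print(f'{paper["cited_by"]:>5}  {paper["doi"]}  {paper["title"][:60]}')
```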

author

Individual knowledge workers produce more stuff, but that means that the attention of other knowledge workers is fragmented as they have to process more stuff. Each individual worker is more productive, but that does not translate to the whole...
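A toy way to see the aggregation problem: hold each worker's reading budget fixed while individual output multiplies, and the share of peers' work any one person can actually absorb collapses. A minimal sketch, with every number invented for illustration:

```python
# Toy model of "more output, fragmented attention": each worker produces more,
# but can still only read a fixed number of items per year, so the fraction
# of peer output that gets any attention falls as output scales up.
workers = 100            # knowledge workers in a field (invented)
base_output = 10         # items each produces per year, pre-boost (invented)
attention_budget = 200   # items one worker can seriously read per year (invented)

def fraction_absorbed(output_per_worker):
    peer_output = (workers - 1) * output_per_worker
    return min(1.0, attention_budget / peer_output)

for boost in (1, 2, 4, 8):
    produced = base_output * boost
    print(f"{boost}x output: each produces {produced}, "
          f"absorbs {fraction_absorbed(produced):.1%} of peers' work")
```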


Very timely. The BBC just published an article on this very question:

BBC News: Why is technology not making us more productive?

https://www.bbc.co.uk/news/business-66233654


That strikes me as a rationalization. Is there evidence to support that statement? To take my example of searching journal stacks and citation indexes: the individual doing that searching is not available to give attention to others on the team. If a computer collapses the search time, the time saved can be used to work with team members. More generally, if the computer is doing work that was done manually in the past, more work can be done in the same time, or the time can be allocated to other activities, such as meetings and discussions.

Let me use an area of work that I am unfamiliar with but that does use extensive computational effort: weather forecasting. What was once a slow manual process of collecting data, plotting it out, and then making expert guesses for a forecast is now done with remote data acquisition (including satellites) and computational effort to plot the conditions and forecast the weather up to a week ahead. In my youth, UK weather forecasts were little more than weather fronts and expected sunshine/rain/snow/fog symbols on a national map. Today the data is displayed as movements over the last 24 hours, with very granular cloud density and predicted rainfall. The news channels still provide daily forecasts, and the weather data on the internet is far more granular than in the past and much more accurate. How is the productivity of the various weather services measured? Is there some improved quality factored in? What about the benefits to life (and property) from imminent storm warnings? It seems to me the "whole" gains in this domain.

Now let's introduce AI into this domain. What might change? The human interaction might change when producing the forecasts. The AI might run possible scenarios, access prior weather events, and produce a forecast without human intervention. The human weather forecaster might be replaced by a CGI avatar for the TV consumer. Jobs presenting the weather will be lost, much as typing pools disappeared. OTOH, some forecasters will find new roles in determining how weather can be better displayed and integrated into other economic activities.

But if the forecast output stays the same, just produced with fewer people, that will be a productivity gain.
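To put the measurement question in the simplest possible terms, here is a toy calculation; all numbers are invented, and the "quality index" merely stands in for the unmeasured improvements (granularity, accuracy, storm warnings) mentioned above.

```python
# Toy labour-productivity calculation for the forecasting example: the same
# output from fewer people shows up as a measured gain, while better
# forecasts only show up if quality is explicitly factored in.
def productivity(forecasts_per_year, staff, quality_index=1.0):
    return forecasts_per_year * quality_index / staff

before  = productivity(365, staff=40)                      # manual era (invented)
after   = productivity(365, staff=10)                      # automated pipeline (invented)
after_q = productivity(365, staff=10, quality_index=1.5)   # plus better forecasts (invented)

print(f"measured gain, same quality:    {after / before:.1f}x")
print(f"gain with a quality adjustment: {after_q / before:.1f}x")
```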

Now, hypothetically, what if computers were just generating busywork? Instead of hand-drawing charts, we futzed endlessly with spreadsheet charts (and more of them), tarting them up to look "prettier". Then we embedded them in presentation slides. Maybe that was a chunk of the failure to increase productivity. Add in AI and the time wasted on all this busywork largely disappears. What I do worry about is that the busywork will shift to managing AI prompts and instructions, trying to get better output. I once spent some time trying to get a good image from Stable Diffusion - classic busywork.
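For what that prompt-futzing busywork looks like in practice, here is a minimal sketch. It assumes the Hugging Face diffusers library and a Stable Diffusion checkpoint; the runwayml/stable-diffusion-v1-5 identifier and the prompts are illustrative assumptions, not anything from my actual session.

```python
# Sketch of prompt iteration as busywork: hand-tweaking prompt variants and
# eyeballing the results, one GPU run at a time.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompts = [
    "a weather forecaster presenting a storm map",
    "a weather forecaster presenting a storm map, studio lighting, 4k",
    "a weather forecaster presenting a storm map, photorealistic, rim lighting",
]

# Each pass costs GPU time plus a judgment call on the result; the tweaking
# can easily swallow an afternoon.
for i, prompt in enumerate(prompts):
    image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
    image.save(f"attempt_{i}.png")
```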



Brad, what you said at my event last night was terrific. You added some nice material here as well, but also left out some nice material from the talk, like your example of you and your fellow physics students together acing a test that none of you could have done alone.

author

It is a very good story, but it has its place only to reinforce that we are intelligent together, not individually. An individual thinking that he, in his mother's basement (or some computer somewhere), is going to outsmart everybody by himself and take control of things is simply engaging in a pre-adolescent fantasy. That is an important point to make. But I am hoping here to talk to people further along in their understanding.
