
Back in 2000, Bruce Sterling wrote an article for one of the big business magazines about life in 2050. One thing stuck in my mind: he said that new technology would be produced so rapidly that if any particular technology failed to deliver, a number of others could do the same thing. For the public, this would mean ennui setting in. To some extent we see that in the plethora of computer languages. In the 1980s, there was little disagreement about which language beginners would use on their PCs (BASIC), which they would advance to (C/Pascal), and which object-oriented language they would progress to (C++ was the default for C users). Can we agree on what language beginners should start with today? OTOH, for established coders there are so many viable languages, all freely available, each with their pros and cons. You can even get free Fortran compilers today, and Linux and FreeBSD have pretty much destroyed the value of proprietary Unix OSs such as System V. In other domains we see the same thing happening: overlapping advances in science and technology mean there are very few genuine breakthroughs that are unique and have no comparable competitors.

This has an impact on the value of knowledge-worker output. The "unique value" AI will generate for the individual will be eroded by the sheer volume of near-identical output. Produce an analysis that gains attention, and almost immediately there will be other analyses doing the same or better. That ease of competition has scientists maintaining tight control over their expensive, hard-won data, since the "crowd" could probably do useful analyses even faster, and more comprehensively, than the originator, and potentially publish first.
