
It is also important for OS choice. MS stuck with the x86 architecture for Windows and its software and is only now porting to ARM. MS must not only do that successfully for Windows and its associated suites of apps, but also ensure that legacy apps can be supported in some way; backward compatibility with older apps matters, so this is a real danger. Is MS going to have to provide virtual x86 support for legacy apps in ARM-based Windows? Messy. Why not just use another OS and run a virtual x86 Windows for the x86 apps? One can even run apps from the cloud, and locally in browsers with WASM. This could spell not just the decline (and demise?) of Intel, but of MS as well. I don't care for Apple, as it now acts like MS did around the turn of the century. Computers are fragmenting in hardware, in OS, and in how software is run. This was the original hope for languages like Java that ran on virtual machines, and it may yet provide the solution to avoid OS lock-in and enable more competitive computing platforms. The hardware should be as efficient as possible, and OSs and applications should be able to run on any hardware with sufficient resources.


Am I hallucinating, or has Apple written a bunch of low-level and highly successful translation "Rosettas"? -B.


Yes, panic for sure. What's the alternative magic strategy to Gelsinger's? And, given the talent wars, what sort of magical execution of Gelsinger's strategy could have been expected? But I guess last summer's layoffs were a sign that time for renewal-through-investment had run out.


These C-suiters aren't good at questioning themselves, past a certain near point. They think the sun shines out their orifices.


RISC machines were first developed back in the 1970s with the hope that having a simple instruction set, almost like microcode, would make it possible to rapidly iterate better, more powerful processors. The whole technology was sidelined by the success of the Intel-IBM-Windows triad, which provided the economic muscle to push CISC machines, particularly the x86 family, to new levels. It wasn't that long ago that Apple switched over to x86 as RISC fell far behind the curve.

From a technological point of view, RISC had an extreme advantage, if nothing else in its sheer simplicity. Despite this, the CISC world had the money to stay well ahead. I remember Bob Frankston explaining this to me in the early 1990s. RISC might have been technically superior, but CISC was where the money was, at least until it wasn't. (There's a whole pile of history here involving the IBM antitrust case, the unbundling of the OS, the reverse engineering of the BIOS, and then the increased competition that turned general-purpose PCs into commodities.)

For a long time, the CISC-based x86-Windows combination drove the market, but after Apple released the iPhone, it soon had the resources to develop its own RISC processors. There's a lot of contingency here. Apple had been designing its own support chips since the first Macintosh, but designing processors is expensive. It's no surprise that RISC machines are outpacing CISC machines. The only real surprise is that business conditions finally allowed it to happen.

P.S. Technologies often fail to develop for a variety of reasons often unrelated to their own merits. Those GLP-1 diabetes and diet drugs that are in the news these days were first demonstrated as effective drugs for diabetics back in the early 1990s. Pfizer, who had the patents and had done the research, chose not to develop them because they were injectables. At the time, management wanted a drug for diabetics that didn't require needles, so GLP-1 was sidelined. As it turned out, diabetics didn't consider needles to be a big problem. I remember an expert at reverse engineering Soviet military technology explaining that it often helped to remember the road not taken.


Intel was at the forefront for a long time, but they missed the laptop transformation in the 1990s that emphasized power management. Apple had to do a lot of work to get good battery life on Intel laptops, and they were limited by Intel's chip design. Eventually, Intel caught on and produced a reference laptop system in the early 2000s, but by then they were behind the curve. The big money for Intel was in high-end servers, so laptops were just a sideshow.

One problem Intel had since the early 1980s was its isolation from the end users. They produced CPUs for companies that assembled PCs, not for PC users. In some ways it was like the automotive industry in the 1920s. There's a reason they called them ASSEMBLY lines. Auto makers bought parts and assembled them. The people making those parts - engines, wheels, struts, body panels and so on - were isolated from the end user. If a car company did it right, they could buy parts on a 30-day float and deliver cars to a dealer with a sight draft to cover that float. There were a lot of innovative cars assembled, but less innovation in the automotive parts business.

By the time smartphones moved from being a niche item like the Palm Pre, Intel was far behind. Intel still owned, with AMD, the high-end processor market, but the growth was elsewhere. End users at the low end were demanding higher ratios of CPU power to power consumption, and at the high end they were more willing to move to specialized processors like GPUs.

It isn't clear where Intel goes from here. They may be stuck. Meanwhile, Qualcomm could move into the general-purpose processor market, especially now that Apple has developed its own communications chips.


Owning INTC is a hedge for a blockade or invasion of Taiwan.


TSMC + suppliers feels weird to me because it's a geopolitically/world economy-significant manufacturer with a dominant position that's not driven by geographical location (actually, their location might be a net negative these days, or soon enough) but rather by accumulated investment and know-how.

My default expectation would have been that the governments of the US, China (and maybe Japan and the EU as a whole) have enough money to throw at the problem that replicating investment and buying know-how should be expensive but straightforward. China and Japan, at least, have a lot of know-how on *that*.

I don't think I know whether or to which degrees/which ones (1) are doing it and it just takes time, (2) they have run the numbers and it's too expensive to do more than gesture at it, (3) they are trying to leapfrog e.g. through quantum (bad bet this generation IMHO but what do I know), (4) there are thresholds of technical and infrastructural complexity that are impossible to replicate within say a couple of decades, even when money isn't a first-order limiting factor.

My guess is that it's (2) for some actors and a mixture of (4) and (1) at different time scales, but if (4) were a driver at significant time scales (and in a self-sustaining way) then that'd be *fascinating*; I don't buy into the techno-apocalyptic misreading of Vinge's Singularity[1], but the idea of long-term self-sustaining complexity/technological advantages resilient even to would-be competitors leads to some interesting scenarios.

[1] Although it's interesting that it's the same sort of misunderstanding that goes on wrt singularities in black holes; it's as if it's very hard to conceptualize a *breakdown in the theory* without objectivizing it into some sort of real-world phenomenon.


One interesting thing about TSMC is that they don't manufacture their own chip-making machines. Their key advantage is their large team of engineers and factory managers who know how to make a chip factory work. A lot of important knowledge is poorly documented because it seems obvious and is frequently the result of painful experience. There are many companies that can economically produce chips at some distance from the cutting edge, but TSMC seems to have learned how to operate at the frontier.


Lots of people could buy ASML machines. But who other than Samsung & TSMC can make them work?


> A lot of important knowledge is poorly documented because it seems obvious and is frequently the result of painful experience.

Yeah. I kind of think of "poorly documented knowledge" as both expanding and overlapping with the concept of "culture" - it's whatever you can't easily buy because it's not in tools/services you can acquire but in social norms, corporate myths, and such. The easier it is to access everybody else's technologies and supplier networks, the more critical this becomes for competitive advantage.

As a side note: In theory, one could expect AI to make it easier for companies to subscribe to "CultureAsAService." Perhaps over the long term that'll happen, but right now it feels like the opposite is true: oligopolistic AI suppliers all using similar training sets and technologies leads to an AI monoculture of sorts, and the more companies leverage AI to outsource knowledge work, the more they lose any potential competitive advantage derived from it.


All those cases in which the instruction manual is just too low-bandwidth relative to an actual expert...


"Culture" is a good word for it. They used to talk about "machine culture" which embodied all the things people need to know about machinery to live in an industrial society. We saw China absorbing manufacturing culture over the last 30 years or so even as more developed societies were abandoning it. Bunnie Huang was a surprising chronicler of that transition. (His discussion of phone chips a decade ago gave a fascinating glimpse.)

I like your idea of CultureAsAService. It speaks to the same management fantasies as Taylorism or RTO. As with fire and flint knapping, people need to experience the benefits of the new culture if they are going to adopt it. Development economists know this, but their bosses don't like them to talk about it.

If you watch closely, we're seeing an AI-using culture developing. For example, programmers use it as a combination help system and macro expander, but they've learned to check the code produced for particular types of mistakes. We see this in other AI-using communities.
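To make that concrete, here's a minimal sketch of what such a check might look like, in Python. The specific checks (does the generated code parse, and do its imports name real modules) are my own illustration of "particular types of mistakes", not any standard practice, and `check_generated_code` is a hypothetical helper name.

```python
import ast
import importlib.util


def check_generated_code(source: str) -> list[str]:
    """Cheap sanity checks on AI-generated Python before trusting it."""
    problems: list[str] = []

    # 1. Does the code even parse?
    try:
        tree = ast.parse(source)
    except SyntaxError as exc:
        return [f"syntax error: {exc}"]

    # 2. Do the imported top-level modules exist in this environment?
    #    Hallucinated module names are one common failure mode.
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            names = [alias.name for alias in node.names]
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            names = [node.module]
        else:
            continue
        for name in names:
            root = name.split(".")[0]
            if importlib.util.find_spec(root) is None:
                problems.append(f"unknown module: {root!r}")

    return problems


if __name__ == "__main__":
    snippet = "import json\nimport totally_made_up_lib\nprint('hi')\n"
    for problem in check_generated_code(snippet):
        print(problem)
```

Obviously this catches only the shallowest problems; the point is just that "check the code it gives you" has become a concrete, teachable habit in that culture.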


> idea of CultureAsAService. It speaks to the same management fantasies as Taylorism or RTO. [...] Development economists know this, but their bosses don't like them to talk about it.

True. There's always been a mythological basement to management practices, but now that some of the most influential people in the world have gone techno-Apocalyptic, it's getting really, really weird.
