4 Comments

Am I the only one commenting here? Is that a Substack thing? In any event, I'm back from a long sybaritic road trip and a stay in wartime Portland, so here goes:

Anyone who stayed even vaguely awake in Econ 101 and 102 is aware that inflation is a sign of economic growth. Rising prices are an economy's way of letting suppliers and potential suppliers know that they need to produce more, and if they do so, that they will be well rewarded. The relatively low inflation from the 1980s to the present has been worrying. Surely, there should be demand to produce more, but where was the economic signal?

In the Regency era, beloved of romance novelists, inflation was violent, but it was also a sign of a growing economy. Mr. Darcy notwithstanding, land was no longer the sole source of wealth. The Industrial Revolution was happening, and a new class was emerging, even if they felt a need to "sink the shop" and marry their sons and daughters into the landholding aristocracy. Plenty of landowners were more than eager for their posterity to have a piece of the new wealth.

Besides, who fears inflation? Sure, if it gets out of hand, it is possible to fall behind. If prices soar in the interval between when one gets one's paycheck and when one buys one's groceries, it makes sense to fear inflation. But inflation is rarely that rapid. Inflation usually gives one plenty of time to buy groceries and the like. The people who really fear inflation are the wealthy who have a fixed sum of assets and fixed contracts for income and who, having no useful economic capacity, fear having to actually work for a living rather than simply assuming that the bills will be paid.

I never understood the fuss in the 1970s. Inflation then was about rising living standards and the need to move to a sustainable economy. My father, who lived on his investment income, didn't fear inflation, since he recognized that rising prices were a sign of economic growth. As Il Gattopardo put it, everything has to change for things to stay the same. Still, a lot of rich people feared inflation and were willing to sacrifice economic growth for the next 50 years or so rather than even trying to have a useful economic function.

We're seeing this debate now, and the sides are familiar. There are the entrenched and the comfortable, and those who actually want or have to perform an economically useful function. In the Middle Ages, there were the poor and their sympathizers, and then there were those who chose crucifixes showing Jesus Christ on the cross, dying for mankind's sins, with a fat purse fastened to his belt. I wish I were making that up. Surely, one of the various Romans who prodded, flogged, and tortured Jesus would have stolen his purse. (Check out The Pursuit of the Millennium for that one.)

P.S. We stayed at a top-of-the-line Portland hotel in what was surely a hospitality suite for wine producers, what with its built-in wine cellar, gas fireplace, multi-LAN router and the like. The city was far from the nightmare portrayed in the media. There were homeless camps, but perhaps OR and REI are the future of American housing and not the Toll Brothers. I kept thinking of The Winds of War, with Ali MacGraw trapped in Warsaw between the advancing Germans and Russians and just wanting a bath. Our hotel had an excellent steam shower. It also had two people at the desk, and as one explained, they were doing everything: check-ins, check-outs, valet parking, luggage assistance, concierge duty and possibly physical plant repair. I had to sympathize and did what I could to lighten their load. Portland was lovely, with lots of restaurants with outdoor dining = big tips, skateboarders, some retail - I remember retail - and good coffee. I'll throw in a thank you here for all the weird-ass, long-shot researchers studying alternative nucleic acids, mRNA trickery and probably beer making as applied to vaccine production, who produced the vaccines that made our trip possible.


I, too, find the huge fuss made in the 1970s very hard to understand—but there was a huge fuss, and it was not confined to people who owned bonds...


Re: the M1 processor

When RISC (reduced instruction set computer) processors were developed in the late 1970s and 1980s, the argument was that RISC made it easier to improve performance with a lower marginal effort than for a CISC processor. A single instruction cycle only did a limited number of things, so only a limited amount of logic was needed to make it fast, and then faster still. What this ignored was that CISC processors, like the x86 series, could take advantage of rising chip densities to improve their performance as well, especially if they were in higher demand and more engineering resources could be applied.

Internally, modern CISC processors convert the complex instruction set into a simpler RISC-like instruction set and then dynamically optimize the sequence. This worked well, but it came at the cost of the translation and optimization hardware; a RISC instruction stream arrives already converted and optimized. That meant CISC processors could stay ahead for a long time if suitable engineering resources were available. Besides, what could one do with the increased processing power in the face of relatively static and slow data access speeds?
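To make that translation step concrete, here is a toy Swift sketch (not any real ISA; all the names are invented for illustration) of a single CISC-style read-modify-write instruction being decoded into three RISC-style micro-ops. The decode function is a software stand-in for the extra front-end hardware a RISC design largely avoids:

```swift
// Toy model: one CISC-style instruction that adds a register into a memory
// location, and the three RISC-style micro-ops it decodes into.

enum MicroOp {
    case load(dst: String, addr: String)    // dst <- memory[addr]
    case add(dst: String, src: String)      // dst <- dst + src
    case store(addr: String, src: String)   // memory[addr] <- src
}

enum CISCInstruction {
    case addRegToMem(addr: String, reg: String)  // roughly `add [addr], reg`
}

// Stand-in for the decode/translation logic on a modern x86 front end.
func decode(_ instruction: CISCInstruction) -> [MicroOp] {
    switch instruction {
    case let .addRegToMem(addr, reg):
        return [
            .load(dst: "tmp", addr: addr),
            .add(dst: "tmp", src: reg),
            .store(addr: addr, src: "tmp"),
        ]
    }
}

// One architectural instruction becomes three micro-ops; a RISC compiler
// would simply have emitted the three directly.
print(decode(.addRegToMem(addr: "counter", reg: "r1")))
```

The point of the sketch is only that the translation is real work: it has to happen on every instruction, in hardware, before any of the clever out-of-order machinery gets to run.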

I remember arguing about this with Bob Frankston, who did his PhD thesis on computational utilities at MIT back in the 1970s. He was right for several decades, but eventually RISC moved ahead. If nothing else, RISC was designed to make moving ahead easier by reducing the complexity of each instruction cycle. I'd argue that it was multi-processing that made it practical.

People had been experimenting with using multiple processors to improve the user experience since the late 1980s, but it wasn't until fairly recently that designers realized the processors could be heterogeneous and that the OS could use this to its advantage. The shift was driven by portable devices, where power and cooling are at a premium. The nirvana was a processor or set of processors that would use no more power and generate no more heat than a task actually required.
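As a rough illustration of how software exploits that today: on Apple platforms, a quality-of-service hint is more or less how a task tells the scheduler which kind of core it deserves. A minimal Swift sketch using Grand Central Dispatch; the QoS classes are real API, but the example workloads in the comments are made up:

```swift
import Foundation

// Latency-sensitive, user-facing work: a high quality-of-service class
// hints the scheduler toward the fast (performance) cores.
DispatchQueue.global(qos: .userInitiated).async {
    // e.g. decode and display the photo the user just tapped
}

// Deferrable housekeeping: a background QoS lets the scheduler park the
// work on the efficiency cores, spending less power and making less heat.
DispatchQueue.global(qos: .background).async {
    // e.g. re-index the photo library overnight
}

// (In a real program, something keeps the process alive long enough
// for this queued work to actually run.)
```

The scheduler is free to override the hint, which is rather the point: the OS, not the application, decides which core a given piece of work lands on and how much power it burns doing it.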

Apple has always been an old-fashioned computer company. I used to say that Steve Jobs was a software guy in a hardware guy's body. In the early days of computing, companies would design and build their processors in parallel with the design of their OS, compilers and the rest of their software stack. They controlled everything. When IBM standardized the industry around the Intel x86 architecture, software was done elsewhere and everything stagnated. It was like the automobile industry starting in the 1920s. It really was an assembly line. Cars were built from stock parts, and if you played the game well, you could cover your suppliers' 30-day float with a sight draft attached to the bill of lading.

The need for improved fuel efficiency and safety in the 1970s ended this, and the move to portable computing this century changed the way computer systems were designed. Suddenly, the sheer overhead of a CISC processor became a liability. Once you chose a CISC architecture, you could juggle the pipeline and add or remove functional units, but you really couldn't put a mix of compatible cores on one chip, each handling the same instruction set at its own point on the power and performance curve.

Being an old-fashioned computer company, Apple could take advantage of this. The software and hardware could be designed by a single team. Instead of each side having to adapt and adjust on a multi-year schedule, they could plan together, and so here we are. It's not a surprise at all, at least not to anyone paying attention in the 1970s and 1980s.
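For what it's worth, recent macOS on Apple silicon makes the two core types visible directly. A small Swift sketch, assuming the hw.perflevel0/hw.perflevel1 sysctl names that, as far as I know, exist only on Apple silicon Macs:

```swift
import Darwin

// Read a 32-bit integer sysctl by name; returns nil if the key doesn't exist.
func sysctlInt32(_ name: String) -> Int32? {
    var value: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname(name, &value, &size, nil, 0) == 0 else { return nil }
    return value
}

// On an Intel Mac the perflevel lookups fail and only the total count remains.
let performanceCores = sysctlInt32("hw.perflevel0.physicalcpu")
let efficiencyCores = sysctlInt32("hw.perflevel1.physicalcpu")
let totalCores = sysctlInt32("hw.physicalcpu")

print("P-cores:", performanceCores ?? 0,
      "E-cores:", efficiencyCores ?? 0,
      "total:", totalCores ?? 0)
```

Two kinds of core, one instruction set, one scheduler deciding who gets which: the thing the commodity x86 ecosystem couldn't easily deliver.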


Very interesting...
