2 Comments

Matrix multiplication and inversion will never go out of style. Since the physicists tell us that everything is locally linear, at some point linear algebra will come crawling out of the snarls and tangles. The integer banging, as in crypto, comes and goes, but big matrices needing heavy processing will always be with us.
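For what it's worth, "locally linear" is just the first-order Taylor expansion: near any point $a$, a smooth map $f$ is approximated by its Jacobian, which is a matrix:

$$ f(x) \approx f(a) + J_f(a)\,(x - a) $$

So anything differentiable reduces, locally, to matrix arithmetic.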

Computer hardware seems to come in cycles. There's some great new idea, let's say hardware hashing support, floating-point arithmetic, video compression/decompression, or vector computation, and suddenly there is a growing market for a specialized piece of hardware to be hooked up to one's CPU. I remember when Mercury made vector-processing boards back when CAT scanners and petroleum geologists needed to number-crunch.

In the next part of the cycle, those special operations become CPU features, with a new set of instructions and instruction prefixes and perhaps a new mode added to the repertoire. Nowadays you have to move way down the CPU ladder to find a processor without a few CRC instructions or built-in floating point. Most modern CPUs have a good chunk of graphics processor on chip or chip-adjacent, but Nvidia is still way out ahead.

I think the next barrier to fall is going to be "good enough." Already, LLM companies are offering ways to perturb (that is, fine-tune) their existing LLMs with a customer's data to produce a specialized LLM with much less computing than building one from scratch. Right now, this is still beyond the capability of a typical laptop or desktop, but there's no reason it will still be true in, let's say, five years. For one thing, there will be more precomputed starting-state LLMs and, for another, the algorithms will improve. Meanwhile, baseline CPUs will just keep getting better and faster. (I'd be surprised if no one at Apple is working on this for some future version of Spotlight.)
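To make that "perturbing" concrete: the usual technique is low-rank fine-tuning. Here is a minimal sketch using LoRA via the Hugging Face transformers and peft libraries. The base model, rank, and target modules are illustrative assumptions chosen so the example runs on a laptop, not what any vendor actually ships:

```python
# Minimal LoRA fine-tuning setup: freeze the base model's weights and
# train only small low-rank matrices. Assumes `pip install transformers peft`.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Placeholder base model; real offerings start from far larger checkpoints.
base = AutoModelForCausalLM.from_pretrained("gpt2")

# LoRA leaves each frozen weight matrix W intact and learns a low-rank
# update BA, so the adapted layer computes W + BA. Only A and B are
# trained, which is why specializing an existing model costs so much
# less compute than pretraining one from scratch.
config = LoraConfig(r=8, lora_alpha=16, target_modules=["c_attn"])
model = get_peft_model(base, config)
model.print_trainable_parameters()  # on GPT-2, well under 1% of the weights
```

Training those small matrices against a customer's data is the whole "perturbation"; the bulk of the matrix multiplications stay exactly where the base model left them.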

There will still be a big market for high-end processing boards, but it will no longer be driven by LLMs as it is now. I can't predict the next frontier, but it's a fair guess that it will involve matrices.


Although Nvidia's stock price seems to reflect a belief not that "some of the work will be multiplying matrices" but rather that "all of the work will be"...
