What To Do About the Dependence of the Form Progress Takes on Power?: Quick Takes on Acemoglu & Johnson's "Power & Progress"
Janeway, Smith, Farrell, & DeLong all take their shots...
Bill Janeway gives his take on Acemoglu and Johnson’s new book Power & Progress:
Bill Janeway: The Political Economy of Technology: ‘The economic outcomes we experience have never been wholly the consequence of markets efficiently allocating resources to their optimal uses. On the contrary, how the costs and benefits of technological progress are distributed is a matter of social choice—even if it does not always seem so…. [There is] a fundamental tension between the industrialized Western world’s two systems for distributing and exercising power: political democracy and the market economy. Each, in its own way, documents how the dynamics of capitalism have concentrated economic and financial power, which then is used to influence and even dominate the political process….
Melvin Kranzberg’s “First Law of Technology”… states that “technology is neither good nor bad; nor is it neutral”…. Successive technologies… both… “displace” labor from existing tasks and… “reinstate” labor in new tasks…. New technologies… augment… surplus…. But the sharing of that surplus is determined by the balance of power in markets and in the political process, which always has the potential to mitigate or even reverse market outcomes…. From the point of view of the handloom weaver, the technology of the textile mill was unequivocally bad…. Exploitation… was reinforced by the “Bloody Code,” which made machine breaking and more than 100 other acts felonies punishable by death or transportation to Australia…. Progressive reform… was won through aggressive public pressure… peaceful assemblies and petitions… insurgent riots…. Power and Progress assigns a special, shaping role to the “vision” of the entrepreneurs who have led successive waves of technological innovation, and who constitute a “vision oligarchy.”… Acemoglu and Johnson worry that the vision of today’s Big Tech entrepreneurs will dominate how today’s new technologies are applied….
Identifying the actual distributional effects of a technology’s deployment is extremely challenging, and formulating interventions to move deployment toward machine-useful applications is even more so…. Carrots, rather than sticks… antitrust… an active government role in supporting innovative technologies… as the first, collaborative customer… academia’s “central role in the cultivation and exercise of… social power”… changing the narrative and, with it, cultural norms; building countervailing power; and generating relevant policy solutions… <https://www.billjaneway.com/the-political-economy-of-technology>
And Noah Smith gives his take:
Noah Smith: Book review: “Power and Progress”: ‘“Technology can be used for bad purposes” should be a simple truism, [but] Acemoglu and Johnson pick some very odd examples… [of] “new inventions that brought nothing like shared prosperity”…. “Fritz Haber developed artificial fertilizers that boosted agricultural yields. Subsequently, Haber and other scientists used the same ideas to design chemical weapons that killed and maimed hundreds of thousands on World War I battlefields…” The idea that… Haber-Bosch process… “brought nothing like shared prosperity” is an absolutely wild…. About half of the entire population of Earth—3.5 billion people—is only sustained thanks to this technology….
Acemoglu and Johnson argue that “digital technologies became the graveyard of shared prosperity” over the last few decades…. But as… Larry Mishel and Josh Bivens noted, when Acemoglu and Restrepo measured the effect of workers’ “exposure to IT capital”… they found either no effect or a positive effect on employment and wages…. Acemoglu and Johnson… include persuasion and compulsion in a single category of “power”…. I do not understand why we should put accidental success in a nonviolent marketplace of ideas in the same conceptual category as chattel slavery and feudalism….
You could write a very interesting book about how technologies that complement humans are better for both productivity and broad-based prosperity than technologies that try to substitute for humans wholesale. I would definitely read that book! But Acemoglu and Johnson did not choose to write that book; instead, they warn against a focus on productivity, claiming that it's a seductive but dangerous narrative used by the greedy, fast-talking techbros….
How do we know in advance, before a technology is invented, whether it will increase or decrease the labor share?… Fundamentally, it… boils down to some sort of mandarins… trying to assess the economic effects of a technology that doesn’t even exist yet…. This is probably an impossible task… inferior to… policies to increase labor share ex post… co-determination… sectoral bargaining… wage subsidies funded by taxes on capital income… policies [that] will act like a Pigouvian tax on the kind of cost-cutting that Acemoglu and Johnson decry. With a wage subsidy, for example, the higher the market rate you can afford to pay your workers, the more of a rebate you can get from the government. So if there are technologies that augment your workers and let you hire new workers, a wage subsidy gives you an incentive to create them...
And Henry Farrell:
Henry Farrell: Dr. Pangloss’s Panopticon: ‘Noah represents a style of economics that has an overly Panglossian view of power, economics and progress…. Acemoglu and Johnson’s notion of non-coercive power… [sees that] power is a kind of social influence… some combination of “social influence” and “agenda control”…. Acemoglu and Johnson… are worried about some very specific ideas… unhappy with Silicon Valley’s “techno optimism.” But they don’t push back against progress in general. Instead, they tell us that we can’t just opt for progress and sort out the distributional implications post hoc….
Acemoglu and Johnson’s core claims, as I read them, are:
That the debate about technology is dominated by techno-optimists….
That this dominance can be traced back to the social influence and agenda setting power of a narrow elite….
That their influence, if left unchecked, will lead to a trajectory of technological development in which aforementioned very rich tech people likely get even richer, but where things become increasingly not-so-great for everyone else.
That the best way to find better and different technology trajectories is to build on more diverse perspectives, opinions and interests...
There are two parts to the Glasgow weaver’s complaint…. Different technological trajectories… have long term distributional implications (they lock in economic patterns of who gets what)… [and] you can’t assume that these problems will sort themselves out in some fair and equitable fashion in the long run…. The second part… is a social concern—that maximizing on aggregate wealth and power may have adverse effects for society, and may hurt some groups particularly badly…. Acemoglu and Johnson worry that machine learning… will not only remake the bargain between capital and labour, but radically empower authoritarians…. A decade ago, there were ferocious blogospheric debates about left neo-liberalism… a different version of the fight over whether we could just solve for progress and economic growth and assume that the distributional issues would somehow take care of themselves. I found myself on the opposite side of some very sharp disagreements with Noah’s podcast-mate and intellectual partner, Brad DeLong. I am not at all sure that we’d find ourselves on the opposite sides now… <https://crookedtimber.org/2024/02/27/dr-panglosss-panopticon/>
So where do I come down on this? Ten theses:
First: Janeway, Smith, and Farrell are all well worth reading and thinking about.
Second: Indeed, we do not know the effects of any additional act of increasing our knowledge about how to manipulate nature and organize ourselves in the interest of “enlarging the bounds of human empire, to the effecting of all things possible”, as Francis Bacon used to say. It might make a better society. It might make a worse society.
Third: But that very uncertainty means that we, today, should stay in our lane.
Fourth: If we have any confidence in future humanity considered as an anthology intelligence, we should act so as to empower it—which means pushing forward technology today, in the expectation that, once they can see what the technology actually is, they will be able to do a better job of assessing how to handle it than we can now.
Fifth: That is, for me, a very powerful point on the Noah Smith side of the argument: Trust the people who will be able to see what the problems are to deal with them, rather than foreclosing their freedom of action by our decisions today.
Sixth: Yes, there are vicious circles by which technology and society enable domination, which then shapes technology and society to make them more friendly to further domination—look at the Roman Empire, where technological progress in technologies of production was limited to aqueducts and to ways of expanding scale and routinizing production to take advantage of the Roman Peace, while there was immense “progress” in technologies of military road-building, siege weapons, legionary organization, administration, and taxation. But that technology-domination-society-technology loop is likely to be stronger and more durable the slower overall technological progress is.
Seventh: All that said, large oligopolistic firms facing downward-sloping demand curves with market power and large scale are highly likely to focus on developing technologies that substitute for rather than complement labor, and that deskill and degrade workers wherever possible.
Eighth: By contrast, nonprofit bureaucracies—and crazed individuals who want to change the world rather than just make lots of money—are much, much more likely to want to try to develop, in Steve Jobs’s phrase, "bicycles for the mind".
Ninth: Hence empowering nonprofit bureaucracies and somehow incentivizing them to finance crazed world-remakers is much more likely to generate genuinely good outcomes than leaving everything to large oligopolistic corporations with substantial market share facing downward-sloping demand curves.
Tenth: The Silicon Valley Ideology is and has always been much more complicated and nuanced than Acemoglu and Johnson, and also Farrell, recognize—libertarian, but libertarian in a left-wing sense of liberation as well as a right-wing sense of deregulation and untaxation. The key task is to keep the locus of innovation outside of organizations that have a strong incentive to eliminate workers, and to deskill those workers they do not eliminate. The key task is not to smash the machines.
References:
Acemoglu, Daron, & Simon Johnson. 2023. Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity. New York: PublicAffairs.
Farrell, Henry. 2024. "Dr. Pangloss’s Panopticon." Crooked Timber, February 27. <https://crookedtimber.org/2024/02/27/dr-panglosss-panopticon/>.
Janeway, Bill. 2023. "The Political Economy of Technology." Project Syndicate, August 23. <https://www.billjaneway.com/the-political-economy-of-technology>.
Smith, Noah. 2024. "Book review: 'Power and Progress'." Noahpinion, February 21.
#7 (like everything) reminds me of tax reform. [Key joke about the psychiatric patient for whom all the Rorschach blobs remind him of sex.]
Tax reform to
a) untax wage labor, delinking health insurance from employment and financing stage-of-life transfers of consumption with a VAT.
b) tax away some of the wealth created by the new technology to create demand for the technologically displaced labor in new products and services.
c) reduce deficits so as to increase investment (including R&D) and the creation of future income.
Brad, for your point #6, why do you believe that the “technology-domination-society-technology loop is likely to be stronger and more durable the slower is overall technological progress”? As you’ve shown in "Slouching…", technology is increasing MUCH faster than in the past, and look where we are, with our present polarization and existential threat of climate change. Which brings me to your points #4 and #5, if we aren’t getting the solutions to present-day problems with our present-day “anthology intelligence,” why do you expect it will be any better in the future? I think you’re reading into A&J what you want to read, based on your biases, which seem to be similar to Noah’s. My biases tend to be along the same lines as Farrell’s and I read that A&J want exactly what you stated in your point #10: “The key task is to keep the locus of innovation outside of organizations that have a strong incentive to eliminate workers, and deskill those workers they do not eliminate. The key task is not to smash the machines.”