"Regulating AI will be harder than regulating nuclear power. AI grew up in the wild."
New technology always seems Frankenstein-monster scary at first.
We are in the Frankenstein-monster stage.
Tech billionaires at the Trump inauguration
There is plenty of material for the writer of a political blog. Tulsi Gabbard and Donald Trump are saying "treason;" business people are dealing with tariff uncertainty; Ghislaine Maxwell is cutting some sort of deal with Trump; the U.S. dollar is down; the stock market is up; and the Portland Trail Blazers are making crazy trades. Amid this, college classmate Jim Stodder shared an observation about a technology that I expect will change the world as profoundly as did the steam engine.
Stodder teaches international economics and securities regulation at Boston University. He left school for a decade to knock around as a roughneck in the oil fields. Then he returned to formal studies and received a Ph.D. in economics from Yale. His website is www.jimstodder.com.
Guest Post by Jim Stodder
Big Tech: Tired of Trump
Elon was the first to jump ship, but he will not be the last.
Before the election the "tech bros" were aghast at the Biden administration’s clear intent to regulate the hell out of AI. An Instagram exchange between Marc Andreessen and Ben Horowitz shows them recalling with incredulity how Biden staffers told them that AI was a national security issue every bit as serious as nuclear power. So it would be regulated just as stringently, with basic research results fenced off as “state secrets.”
As a result of such pronouncements, Andreessen and other tech bros decided to go all-in for Trump. We all saw Trump’s inauguration seating chart. Why has this ardor started to cool? Why are people like Dario Amodei, CEO of Anthropic, calling for more regulation, not less? Let me advance several reasons based on what economists call “Increasing Returns to Scale.”
Increasing Returns to Scale (IRS) means that when you double all the inputs, you more than double the output. Many people, with their instinctive distrust of the rich, think that’s always how Big Biz gets big – that everything works that way. It doesn’t. If it did, every industry would be dominated by just one gigantic firm – whoever got big first.
Virtually all firms – including ones based on AI – face a production function that looks like the letter “S”. With inputs collected into one variable on the X-axis and output on the Y-axis, we get a giant “S” curve, tilted and stretched up and to the right. In the early stages, the output curve grows steeper. Output per input is growing – we have IRS. But about half-way up, output starts to grow more slowly.
Most studies of AI scaling patterns can be summarized by similar S-curves, although that “fast first half” seems to last longer than just half the time or resources. We are now very much in the first part of the curve.
Most new technologies show such IRS in their early days, and it usually leads to cut-throat competition. What is unusual about AI technology is that this IRS stage may last a very long time. What does this say about our near future?
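To make the S-curve intuition concrete, here is a minimal sketch using a logistic production function. The particular function and its parameter values (capacity, steepness, midpoint) are my own illustration, not numbers from the post; any S-shaped curve would show the same pattern: doubling inputs more than doubles output on the early stretch, and less than doubles it past the midpoint.

```python
import math

def output(x, cap=100.0, k=0.08, x0=60.0):
    """Logistic S-curve production function.

    x   : all inputs collected into one variable (the X-axis)
    cap : maximum attainable output (illustrative)
    k   : steepness of the curve (illustrative)
    x0  : inflection point, where returns stop increasing (illustrative)
    """
    return cap / (1.0 + math.exp(-k * (x - x0)))

# Early stage: doubling inputs from 10 to 20 more than doubles output.
early = output(20) / output(10)   # ratio > 2 -> Increasing Returns to Scale

# Past the midpoint: doubling inputs from 60 to 120 less than doubles output.
late = output(120) / output(60)   # ratio < 2 -> diminishing returns

print(f"early doubling ratio: {early:.2f}, late doubling ratio: {late:.2f}")
```

The argument about AI is that the inflection point sits unusually far out, so the industry stays in the `early` regime – where small head starts compound – for a long time.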
Why AI Must be Highly Regulated
1. A long IRS means small leads turn into much bigger ones.
2. The resources needed for “frontier” level AI are unprecedented, with some CEOs predicting we will soon need data centers in the hundred-billion-dollar range.
3. Gigantic scale makes government control unavoidable, since:
--- a. Governments will have to help raise, protect, and ensure this investment.
--- b. The power of the AI-elite will make the robber barons look like small-town hustlers. Either the government controls them, or they own the government. I’m betting on the latter, at least for the medium-term.
4. The AI companies are starting to demand government regulation because:
--- a. It provides a screen against the anger of the public at this new concentration of wealth and power.
--- b. Regulation will reinforce the dominance of established U.S. firms like OpenAI, Anthropic, Microsoft, Amazon, and Google.
--- c. Investors want more predictability, not Trumpian chaos.
--- d. Given the deep concern of most AI experts about the human control and “alignment” of Artificial General Intelligence (AGI), an all-out “arms race” for AGI makes catastrophic outcomes more likely.
--- e. The AGI arms-race with China is in full swing. This makes it harder for U.S. leaders to tell our own companies to tread more carefully. Nonetheless, we have no hope of “arms control” – persuading the Chinese to increase regulation and safety-checking – unless we are doing it with our own companies.
--- f. The computer power of AI is centralized – but the data it needs are everywhere. We have a massive opportunity for data sharing with our allies. This will require not just U.S. regulation, but U.S. laws for data privacy and protection – such as those the EU has been pioneering. If we want to compete with China, we need the full cooperation of all our former allies. Someone should tell Trump.
5. Regulating AI will be much harder than regulating nuclear power. Nuclear power was developed and initially provided by the federal government alone. AI grew up “in the wild”. It will remain so unless it can somehow be corralled.
[Subscribe. Don't pay. The blog is free and always will be.]
The problem with new technologies so irresistible that they invite massive investment is that they may never pay back that investment. When I began in the consulting business in 1972 with The Boston Consulting Group, I was exposed to the common-sense principle that it is cash that counts, not profits. In public policy, governments often look at “cash traps”: businesses that will never produce as much cash as their overall investment requires. Semiconductors are one such business. Their investments are in effect subsidies, typically from Asian governments, which see them as strategic and as critical inputs to other businesses that do generate positive cash flow. The bailout of the American auto industry in 2008-09 is a good example of investment that kept the United States in the game, but now China is out-investing us without caring whether it gets a return. I’m not sure AI will ever return its investment unless it becomes a valuable input to other business areas such as health care.
Trump announced yesterday that his great plan for AI is to not burden it with regulations, other than to make sure it isn’t “woke.” He described it as “a beautiful baby that’s born.” “We have to grow that baby and let that baby thrive. We can’t stop it. We can’t stop it with politics, we can’t stop it with foolish rules.”
What could possibly go wrong?
I realize that Trump & Chumps couldn’t care less about the planet all our lives depend on, but one of my big concerns with AI is its impact on our already stressed-out environment. “Rapid development and deployment of powerful generative AI models comes with environmental consequences, including increased electricity demand and water consumption.”
https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117