How Applying Abundance Thinking to AI Can Help Us Flourish

Realizing AI’s full potential requires designing for opportunity—not just guarding against risk.

Apr 9, 2025
Guest Commentary

The Compute Shortage of 2028-2031

The following scenario is hypothetical.

– Wikipedia entry [last edited 03/25/2034]

Most historians point to two key incidents as the catalysts of the Great Compute Shortage. First, China successfully blockaded Taiwan from late 2028 through 2029. Taiwan Semiconductor Manufacturing Company, the world’s leading chipmaker, saw its Taiwan-based operations grind to a halt. Second, Russia sabotaged ASML’s operations via a novel cyberattack, stalling production of the lithography systems at the core of chip manufacturing.

Attempts to mitigate these production losses were largely ineffectual. Operation Heavy Lith, a coordinated EU initiative, experienced significant delays due to the bloc’s extensive monitoring and reporting process. Meanwhile, in 2029, the US Congress passed the CHIPS 2.0 Act, appropriating over $750 billion for semiconductor manufacturing. The Act, however, required every new lithography and chip production project to complete a multi-stage proposal and review process and secure unanimous approval from a board of representatives drawn from five different executive agencies. Most applicants failed to have their projects approved. By 2031, CHIPS 2.0 had allocated just $50 billion. While major AI labs, such as OpenAI and Google DeepMind, leaned on their existing stores of compute, startups and research institutions were forced to halt their AI projects.

Historians suspect that this pause undermined AI progress in a number of ways. Venture capital funding for AI fell by 65 percent between 2028 and 2031. Numerous publicly funded AI projects – including those for improving education and health care among at-risk populations – were cancelled due to the exorbitant costs of training and deploying new models. Some historians even contend that the compute shortage led to a higher death toll during the destructive 2030 hurricane season. Model Storm, a severe-weather research initiative from leading higher education institutions, folded. While Model Storm’s tools could not prevent storms, its prediction capabilities could have helped guide evacuations, saving countless lives.

Procedural Paralysis

The hypothetical presented above imagines a near future in which policymakers fail to align procedural checks – laws, regulations, bureaucratic safeguards – with substantive aims when drafting and implementing AI policy. That failure stems from excessive reliance on procedure as a way to appease the concerns of various vocal factions. The emphasis on protective measures and risk avoidance ultimately undermines the very laws and programs meant to respond to the problem. Sound familiar?

The alternative is Abundance. You’ve likely seen the concept pop up in your podcast feed, streaming service, and social media chats. A recent book of the same name has spurred a lot of conversation. Its authors, Ezra Klein and Derek Thompson, have been making the rounds to spread the word about this approach to solving contemporary public policy problems.

Ezra Klein (right) and Derek Thompson (left), the authors of "Abundance" (center). Credit: Simon & Schuster, Shaughn and John, Inc, and Lucas Foglia

What is it? In short, it’s a framework for remedying the scarcity that defines much of daily life – too few starter homes, affordable doctors, quality schools, and so on. The solution is relatively simple – identify substantive policy goals and work backwards to ensure that procedural hurdles align with those goals rather than block progress. Klein, Thompson, and other adherents of Abundance (notably Marshall Kosloff and Steve Teles) point to numerous instances in which well-intentioned procedural steps not only fail to align with the underlying purpose of those checks but actively worsen the problem in question. California’s high-speed rail project stands out among their myriad examples as particularly emblematic of this procedure-impact misalignment.

Supporters of high-speed rail dreamt of a future of lower emissions as drivers from Los Angeles, San Francisco, and various communities in between ditched their cars for a relaxing, affordable, speedy train. That greener future has not been realized. The project, initially planned for completion by 2020, has ballooned in cost from $33 billion to $128 billion, in part because of costly and lengthy environmental reviews. Californians continue to drive – and drive up emissions – in the interim.

Klein and Thompson cite myriad other examples: There’s the CHIPS Act and its litany of labor requirements that threaten to drive up the costs of new semiconductor manufacturing projects. There’s the National Institutes of Health underfunding novel research, like targeted cancer therapies, in order to meet return-on-investment expectations. There are zoning ordinances that stall or block housing development, leading to higher costs and increased homelessness. The list goes on. These ironic and tragic outcomes explain the growing interest in Abundance.

Abundance in Action

The proper mix of political will and procedural creativity can bring scarcity to an end. Operation Warp Speed, as detailed by Klein and Thompson, provides an excellent example. In May 2020, the Trump Administration announced a plan to develop a COVID-19 vaccine in record time – and designed procedures and policies tailored to that end. It gave pharmaceutical companies new incentives to focus on vaccine development. It reformed testing requirements. It leaned on private companies – Walgreens, CVS, and the like – to get shots in arms. The first vaccines were ready just seven months later, in December 2020, thanks in large part to Operation Warp Speed. The operation’s success shows that when political will, financial capital, and policy design align with the public good, scarcity can quickly fade away.

Abundance has ramifications for contemporary AI policy debates. And though the many actors motivated to use AI for the public good each have their own ideal substantive objectives in mind, let’s accept “human flourishing” as the broad goal of AI policy. This encompasses access to housing, meaningful employment, health care, education, nature, and more. Working backwards from that goal, Abundance calls for asking what mix of procedural checks will encourage the development and diffusion of AI systems in service of human flourishing.

Abundance and AI

The concept of Abundance offers a crucial lens for understanding how America might harness AI for widespread prosperity. Just as previous generations transformed America through new materials, energy sources, and technologies, AI represents our generation's opportunity to "build a new world," to borrow from David Potter. But this transformation is far from inevitable.

We face critical shortages across multiple domains essential to AI progress. The scarcity of compute resources has created a landscape where only the largest tech companies can afford to train and deploy advanced models. Research institutions, nonprofits, and startups focused on developing AI tools primarily for advancing public welfare – rather than solely for commercial gain – find themselves unable to compete. 

Data, too, has become increasingly siloed and proprietary, limiting both the diversity and quality of information available for those who wish to train models that could address many societal challenges. 

We are also facing a scarcity of talent, as our educational systems struggle to produce enough AI specialists with both the technical skills and ethical understanding needed to build systems that benefit society. 

Perhaps most crucially, our energy infrastructure has not kept pace with AI's growing demands, creating bottlenecks that further restrict who can participate in this technological revolution.

These scarcities are already slowing AI progress, particularly in areas that could most benefit society. While buzzy applications like commercial chatbots and image generators proliferate, we see far less development of AI systems designed to advance educational equality, improve public health outcomes, or enhance democratic participation. This imbalance reflects not a lack of imagination or desire, but rather structural barriers that keep motivated groups with good ideas from getting the resources they need.

The U.S.’s patchwork approach to AI regulation threatens to worsen these scarcities. Numerous states have enacted disparate and often contradictory laws governing AI development and deployment. California and Utah, for instance, have distinct rules for what information must be disclosed to consumers when they engage with a chatbot. 

This regulatory fragmentation, combined with the absence of a unified federal approach, forces startups and research organizations to contend with inconsistent requirements across jurisdictions — increasing expenses, delaying development, and handing systemic advantages to established companies with substantial legal departments. The result is artificial scarcity: the flow of ideas, talent, and capital across state lines is constrained, and innovation will likely concentrate in a handful of regions and companies rather than spreading across the broad-based ecosystem that abundance requires.


Solving AI Scarcity

Addressing these challenges demands specific reforms that embrace the abundance mindset. 

First, we must establish public compute resources: a national AI infrastructure. Just as the interstate highway system supercharged commerce by enabling the free flow of goods across America, a national AI infrastructure would accelerate AI development by democratizing access to the processing power needed for meaningful research and development.

Second, and related to the first goal, we need to ensure that high-quality, diverse, and ethically-sourced data is available to all. This can be achieved through the establishment of public AI data banks, which would democratize access to data for researchers, smaller AI labs, and public initiatives. These data banks would provide qualifying AI developers and researchers with troves of training data, reducing Big Tech's data advantage and fostering broader innovation across the AI ecosystem, particularly for applications focused on public benefit like education and healthcare.

Third, to address the workforce shortage, we need educational programs that go beyond technical training. Universities, community colleges, and vocational schools must develop curricula that combine computer science with domain expertise in healthcare, education, energy, and other sectors where AI can drive social benefit. These programs should be accessible through multiple pathways, including mid-career retraining.

Finally, overcoming energy scarcity will require the most ambitious solutions. The Institute for Progress’s Tim Fist and Arnab Datta recommend establishing so-called special compute zones dedicated to energy development for AI infrastructure. These zones would streamline the permitting processes for building renewable energy projects designed specifically to power data centers. By co-locating energy production and compute resources, these zones would maximize both the efficient generation and use of electrical power.

Future Abundance

A person reading Wikipedia in 2034 could find a different story — one where America embraced abundance over scarcity. Imagine an alternative entry describing how, during the critical years of 2025-2029, policymakers acted decisively in the face of compute scarcity. They streamlined approvals for domestic chip production while maintaining reasonable standards of quality. They established public compute resources that democratized access to AI capabilities. They created data commons that made quality information available to all innovators, not merely those with the deepest pockets.

In this alternative history, Model Storm's weather prediction system received priority compute allocation, saving thousands of lives during that devastating 2030 hurricane season. Such a success did not emerge despite government involvement, but because of a government approach that prioritized substantive outcomes over well-meaning, but ultimately paralyzing, procedural hurdles.

We can realize this better alternative by leaning into Abundance — rejecting the false choice between innovation and safeguards, and between technological progress and human flourishing. The Abundance mindset teaches us to build systems with the hope of what might go right rather than the fear of what might go wrong, and it holds that well-designed systems can generate both material prosperity and broadly shared benefits. As we stand at this technological inflection point, our challenge is clear: to create the conditions where AI’s potential flows not to a privileged few, but to all Americans. History is still being written, and the choice of abundance over scarcity is ours to make.

Cover image: The production line at a semiconductor factory. Credit: SweetBunFactory / iStock