Key Takeaways

  • The AI datacenter boom delivers huge technological benefits but is driving sharp increases in electricity use, water consumption and memory chip demand, with costs that are already reaching ordinary consumers.

  • Steel - and especially Nucor’s low carbon steel products - has become the backbone of AI datacenter construction, with each large facility requiring approximately 15,000 to 30,000 tons of steel and helping support a stronger steel price environment.

  • Communities from Georgia to Arizona and Michigan are pushing back as AI datacenters strain local power grids and water supplies, forcing policymakers, utilities and steel makers to rethink how and where this infrastructure is built.

  • The U.S. is projected to add 2,788 new data centers in addition to the over 4,000 existing ones, with $134 billion in investments led by Virginia (595 projects), Texas, Georgia, and Ohio.

Introduction: AI Datacenter Steel, Water, Energy And Price At A Crossroads

Artificial intelligence now touches everything from search results to factory automation and medical imaging, and nearly all of that capability runs inside large warehouse scale AI datacenters packed with specialized chips. These facilities helped push global data center electricity demand to roughly 415 terawatt hours in 2024, about 1.5 percent of total global electricity use, and the International Energy Agency projects that figure could more than double to around 945 terawatt hours by 2030, largely because of AI workloads.

Building that much new capacity is not just a software story; it is an industrial story about land, transformers, cooling towers and enormous volumes of structural and fabricated steel, much of it supplied by companies like Nucor. At the same time, AI datacenters are concentrating energy and water demand in places that already struggle with scarce Colorado River supplies, aging grids or drought, raising local bills and political tensions.

Key takeaway: The AI boom sits at the intersection of digital promise and physical constraint, and understanding steel, water, energy and chip supply is essential for anyone trying to forecast AI’s real world costs and the future path of steel prices.

How AI Datacenters Work And Why Steel Is Their Backbone

Inside An AI Datacenter

An AI datacenter is essentially a highly optimized factory for computation, built around racks of servers loaded with graphics processing units (GPUs) and memory chips, along with high speed networking equipment and massive power and cooling systems. To train or serve large AI models, these GPU clusters consume far more electricity per rack than traditional cloud or enterprise servers, pushing power densities into tens of kilowatts per rack and forcing changes in cooling and electrical design.

From a resource perspective, those GPUs and memory modules draw continuous power, which becomes heat that must be removed using chilled water, evaporative cooling or energy intensive air cooling, all of which in turn drive additional electricity and water use. Studies of US facilities suggest that total data center electricity consumption could reach between 325 and 580 terawatt hours per year by 2028 in the US alone - between about 6.7 and 12 percent of national electricity demand - with AI as a major driver.

Key takeaway: AI datacenters are not abstract “clouds” but large industrial plants with high power densities that translate directly into heavy energy and water requirements.

Where Steel Shows Up In AI Datacenters

To host those dense computing systems safely and reliably, operators build what are effectively steel intensive industrial buildings: frames, joists and deck for the shell, plus structural platforms, cable trays, server racks, battery housings, cooling equipment supports and security fencing. Nucor, the largest US steel producer and recycler, reports that it now supplies over 95 percent of the steel products used in a typical data center, from the outer structure to interior infrastructure such as joists, decking and white space products.

Because AI hyperscale campuses can reach 1 to 2 million square feet per building (approximately 15,000 to 30,000 tons of steel) and often include multiple buildings plus substations and support structures, each project represents a large, multi year pull on steel mills for plate, bar, sheet, rebar and fabricated joist and deck. Nucor executives describe the environment created by this AI and e commerce build out as “white hot” and a “tsunami of earnings power,” highlighting how datacenter demand has become central to the company’s growth strategy.

US Data Center Boom: Nearly 3,000 New Facilities Planned

The United States is witnessing an unprecedented surge in data center development, with nearly 3,000 facilities either under construction or announced for future build out as of early 2026. Driven by explosive AI demand, this expansion - totaling 2,788 projects per the American Edge Project's December 2025 report - adds to over 4,000 existing centers and promises $134 billion in investments across leading states like Virginia (595 planned), Texas, Georgia, and Ohio. These developments are reshaping regional economies with jobs and infrastructure growth.

Key takeaway: AI datacenters are fundamentally steel projects, with a single large campus requiring enough structural and fabricated steel to become a meaningful driver of mill volumes and steel price dynamics.

How Much Steel A Typical AI Datacenter Uses (Estimates)

While developers rarely publish exact tonnages for individual projects, we can get a reasonable sense of scale by combining industry structure data with common engineering rules of thumb. A large hyperscale AI datacenter might occupy around 1 million square feet under roof, with heavy structural members to support dense equipment and rooftop cooling systems; similar industrial buildings often use on the order of 15,000 to 20,000 tons of structural steel alone for a facility of that size.

On top of the building frame, datacenters need extensive steel based interior systems: cable ladders, server racks, security cages, cooling skids, generator housings and transformer platforms, plus rebar in foundations and yards. When you add these subsystems, an entire AI campus with multiple buildings and on site power infrastructure can easily consume tens of thousands of tons of steel over its build out, especially when operators plan future expansion phases from day one.
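These rules of thumb are easy to make concrete. The sketch below is a back of envelope calculation only; the steel intensity of roughly 30 pounds per square foot and the 1.25x interior multiplier are illustrative assumptions chosen to match the ranges above, not published engineering data.

```python
# Back-of-envelope steel tonnage for a hyperscale datacenter building.
# The intensity and multiplier are illustrative assumptions, not project data.

def steel_tons(floor_area_sqft, lbs_per_sqft=30, interior_factor=1.25):
    """Estimate total steel tonnage for one building.

    lbs_per_sqft: structural steel intensity; heavy industrial shells are
    often quoted in the 25-40 lbs/sq ft range, so 30 is a midpoint guess.
    interior_factor: rough uplift for racks, cable trays, cooling skids,
    platforms and foundation rebar on top of the structural frame.
    """
    structural_tons = floor_area_sqft * lbs_per_sqft / 2000  # short tons
    return structural_tons * interior_factor

# A 1 million sq ft building at 30 lbs/sq ft:
print(steel_tons(1_000_000, interior_factor=1.0))  # 15000.0 tons, frame only
print(steel_tons(1_000_000))                       # 18750.0 tons with interiors
```

At 1 to 2 million square feet per building, this simple model reproduces the 15,000 to 30,000 ton range cited above.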

Key takeaway: Even conservative estimates suggest that every hyperscale AI datacenter represents a multi tens of thousands ton steel opportunity, magnified by the global rush to build hundreds of such sites.

Nucor, AI Datacenters And Steel Price Dynamics

Nucor’s latest earnings commentary makes the relationship between AI datacenters and steel demand explicit. In the third quarter of 2025, the company reported that steel mill shipments rose 12 percent year over year to about 6.4 million tons, with strong growth in rebar, joist and deck tonnage, and management attributed a significant share of that strength to data center construction. The CEO told investors that Nucor now supplies more than 95 percent of all the steel products that go into a data center and characterized the market as a “white hot” source of a “tsunami of earnings power.”

To lock in this role, Nucor has acquired companies such as Southwest Data Products, which manufactures data center equipment, and launched a Nucor Data Systems business unit focused specifically on AI and warehouse infrastructure, effectively integrating steel supply with specialized fabrication and installation. At the same time, the firm highlights that its electric arc furnace (EAF) process for making “sustainable steel” generates less than one third of the carbon emissions of traditional blast furnace-basic oxygen furnace (BF-BOF) routes and already accounts for roughly 70 percent of US steel production, compared with only about 22 percent of global output.

From a steel price perspective, the combination of surging AI datacenter demand, trade protection through elevated Section 232 tariffs on certain imports and the capital intensity of new EAF capacity all reinforce a structurally tighter market for the high strength plate, structural and joist products these campuses require. That does not mean steel price will simply march upward in a straight line, but it does mean that large AI and cloud projects are likely to support higher floor prices and margins for specialized long products even if more cyclical flat rolled markets soften.

Key takeaway: AI datacenters are reshaping steel markets by creating a large, relatively inelastic new source of demand, helping sustain strong steel prices and supporting investments in lower carbon EAF production at companies like Nucor.

Water Usage: AI Datacenters In A Thirsty World

Cooling is the most visible way AI datacenters affect local water supplies. Large US data centers can consume up to 5 million gallons of water per day for cooling, equivalent to the daily water needs of a town of 10,000 to 50,000 people, according to analysis from the Environmental and Energy Study Institute. The Lincoln Institute of Land Policy similarly reports that even mid sized facilities can use as much water as a small town, and that water use often peaks on the hottest, driest days when local utilities are already under the most stress.

This is especially contentious because many AI datacenters are being sited in regions that already face water scarcity. A Bloomberg News investigation found that more than 160 new AI oriented data centers were built across the US over three years, concentrated in areas with high competition for scarce water resources, a 70 percent increase compared with the prior three year period. A Project Censored review of AI datacenter impacts notes that Microsoft reported 42 percent of its data center water in 2023 came from water stressed areas and that new facilities in Arizona and Virginia are being built in regions already grappling with drought and shrinking river flows.

Researchers warn that AI specifically could become a major driver of future water stress. One 2025 analysis summarized in Project Censored’s report estimated that AI could consume between about 312.5 and 764.6 billion liters of water in 2025 alone, once both direct cooling and water used to generate electricity for datacenters are factored in. An article in Communications of the ACM likewise highlights that a single large tech company reported withdrawing 29 billion liters of freshwater for on site cooling across its self owned datacenters in 2023, and that the company’s datacenter water consumption grew roughly 20 percent from 2021 to 2022 and 17 percent from 2022 to 2023.

Arizona, Colorado River Cuts And AI Datacenters

These dynamics are particularly salient in the US Southwest, where cities like Phoenix and Tucson rely heavily on the Colorado River via the Central Arizona Project (CAP) canal system. An Arizona Mirror article describes a 2026 bill that would have required municipal water providers receiving CAP water to notify customers of potential rate increases if that water supply were cut, warning that Arizona is effectively “sleepwalking” into deeper Colorado River constraints. The same article notes that CAP, as a relatively junior user under the Colorado River Compact, would be among the first to face cuts if basin states cannot agree on a new sharing deal, putting added pressure on utilities already raising water rates for infrastructure upgrades.

Layering water hungry AI datacenters on top of these pre existing stresses attracts scrutiny. Bloomberg’s water analysis reports that operators have been drawn to arid states such as Arizona because dry air reduces corrosion risk in servers and because of favorable land and power conditions, even though surface and groundwater are already under intense pressure. Residents and tribal communities in the broader Colorado River basin have raised concerns that large new datacenter loads could force further cuts elsewhere in the system or accelerate price hikes for households and farms.

Community Pushback Over Water

In Saline Township, Michigan, for example, local residents are fighting a 2.2 million square foot data center planned to serve Oracle and OpenAI, fearing it will become the region’s largest water user and overwhelm local infrastructure. Project Censored reports that across the US roughly $98 billion worth of datacenter investments have faced opposition or delays as communities organize around water and energy concerns and demand stricter regulation. In drought prone Northern Virginia, home to the world’s largest concentration of datacenters, advocates warn that water use and related energy demand could double electricity consumption over the next 15 years, putting further strain on the region’s aquifers and rivers.

Key takeaway: AI datacenters are colliding with water scarcity in places like Arizona, Michigan and Virginia, turning local water policy into a central constraint on where future AI infrastructure - and the steel that builds it - can be deployed.

Energy Usage: From Grid Strain To Consumer Bills

On the energy side, the International Energy Agency estimates that global datacenters consumed about 415 terawatt hours of electricity in 2024, roughly 1.5 percent of total global demand, and that this could more than double to around 945 terawatt hours by 2030, with AI accelerated computing as the primary driver. The US, Europe and China account for about 85 percent of current data center electricity use, and the bulk of projected growth will occur in these same regions. In the US, the Lawrence Berkeley National Laboratory and WRI estimate that datacenters could consume between 325 and 580 terawatt hours annually by 2028, corresponding to roughly 6.7 to 12 percent of national electricity use.
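As a consistency check, the cited percentages and terawatt hour figures imply the same underlying totals. The short sketch below simply inverts the shares; the inputs are the numbers quoted above, and the implied totals are arithmetic, not independent estimates.

```python
# Cross-checking the cited shares against the cited TWh figures.

global_dc_twh = 415          # global datacenter electricity use, 2024
global_share = 0.015         # ~1.5 percent of world demand
implied_world_twh = global_dc_twh / global_share
print(round(implied_world_twh))  # ~27667 TWh of total global demand

us_low_twh, us_high_twh = 325, 580       # projected US datacenter range
us_low_share, us_high_share = 0.067, 0.12
# Both ends imply a US total of roughly 4,800-4,900 TWh per year:
print(round(us_low_twh / us_low_share), round(us_high_twh / us_high_share))
```

That the low and high ends of the US range imply nearly the same national total is a sign the cited figures are internally consistent.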

Meeting this demand requires both new generation and major upgrades to transmission and distribution. The IEA report cited by Nature and Scientific American notes that roughly 27 percent of data center electricity currently comes from renewable sources such as wind, solar and hydro, while natural gas supplies around 26 percent and nuclear about 15 percent, with gas generation expected to grow by roughly 175 terawatt hours to support new datacenter loads. The European Commission similarly warns that fast growing data center demand is a challenge for energy efficiency goals and that a lack of grid capacity is already slowing connections for new projects.

For customers living near major clusters, this can translate into higher electricity bills. Project Censored highlights Bloomberg analysis showing that electricity costs in areas near large data centers have risen as much as 267 percent over five years, and cites a Carnegie Mellon study projecting that the average US electricity bill could rise by about 8 percent by 2030 as a result of combined data center and cryptocurrency mining demand. In some cases, utilities are building new gas plants largely dedicated to serving datacenter loads, which can lock communities into higher emissions and potentially higher power prices for decades.

Key takeaway: AI datacenters are on track to become one of the largest single sources of new electricity demand worldwide, with ripple effects for grid planning, generation mix and the prices paid by households and businesses.

Chip And Memory Shortages: AI Datacenters Versus Automakers

The AI boom is also reshaping global semiconductor markets, particularly for GPUs and memory chips, and those shifts can echo into industries like automotive, consumer electronics and industrial equipment. Jalopnik reports that AI datacenter demand for memory has already contributed to a 90 percent jump in memory chip prices between the fourth quarter of 2025 and the first quarter of 2026, citing data from Counterpoint Research. The same article notes that hard drive makers Western Digital and Seagate had essentially sold out most of their 2026 inventory early, while PC vendors such as Dell responded by raising computer prices by 15 to 20 percent, and Lenovo was expected to follow.

Market analysts warn that this dynamic could lead to a new wave of chip shortages that again affects auto manufacturing. Jalopnik quotes a Counterpoint analyst who expects DRAM shortages to persist across electronics, telecom and automotive through the year and reports signs of “panic buying” in the auto sector as companies fear a repeat of the pandemic era shortages. An S&P Global Mobility analysis similarly finds that DRAM makers are increasingly prioritizing high bandwidth memory (HBM) for AI datacenters over lower margin automotive DRAM, since each GPU module requires large amounts of HBM and yields much higher profits per wafer than the chips typically used in cars.

A separate market intelligence briefing from EnkiAI describes this as an “AI vs auto chip war,” noting that AI datacenters are projected to consume around 70 percent of all memory chips produced by 2026. Because AI clusters buy very expensive GPUs and high end memory in enormous volumes and are willing to sign long term contracts, foundries naturally focus scarce capacity on these orders, leaving automakers to compete for remaining, lower margin capacity. Bloomberg also reports that rampant AI demand for memory is fueling a broader chip crisis that is eroding corporate margins and pushing up prices across a wide range of products.

From the consumer standpoint, this supply tug of war shows up as higher prices or delayed availability for new vehicles, game consoles, laptops, smartphones and industrial equipment that rely on DRAM and flash memory. It can also push manufacturers to redesign products to use less memory or older process nodes, potentially limiting functionality or slowing innovation at the edge even as AI capabilities in the cloud leap forward.

Key takeaway: AI datacenter demand is not only concentrating water, energy and steel use, it is also concentrating semiconductor capacity, driving memory and GPU shortages that spill over into auto manufacturing and everyday electronics.

Communities, Policy Pushback And Local Case Studies

As AI datacenters multiply, communities and policymakers are beginning to treat them less like invisible digital infrastructure and more like power plants or heavy industry, warranting zoning limits, environmental review and sometimes outright moratoriums. In the US, a coalition of roughly 230 environmental organizations has called for a halt to new datacenter construction until stronger environmental safeguards are in place, warning Congress that the largely unregulated rise of AI and crypto datacenters is “disrupting communities across the country and threatening Americans’ economic, environmental, climate and water security.”

Georgia’s Emerging AI Datacenter Battles

Georgia has become one of the key flashpoints. Social media posts circulating a Guardian article describe the state as leading the first statewide push to ban or pause new AI datacenters, positioning Georgia as a ground zero in the fight over the resource costs of America’s AI boom. A detailed LinkedIn analysis of Georgia’s situation notes that the state has roughly 160 operational datacenter facilities with dozens more planned, including a massive 615 acre campus by QTS in Fayette County, Microsoft expansion in Atlanta and new projects by DC BLOX and Amazon. The same analysis warns that an average sized datacenter can use more than 10 million gallons of water per year, while a very large facility may consume over a million gallons per day, and that AI era datacenter electricity demand is growing at around 2 percent per year compared to roughly 0.01 percent previously.

Faced with this rapid build out, some Georgia legislators and local governments are exploring temporary bans, more restrictive zoning or requirements that new facilities “bring your own generation” by financing dedicated renewable or nuclear capacity rather than leaning entirely on existing grids. Residents argue that without such guardrails, the benefits of AI will accrue mainly to distant tech companies while the environmental and price impacts will be borne locally.

Water Politics In Arizona And The Colorado River Basin

In Arizona, political debate over the Colorado River is increasingly colored by concerns about large new industrial users, including data centers seeking access to relatively cheap power and dry air near Phoenix. As noted earlier, an Arizona Mirror report describes a failed state bill intended to notify customers of potential rate spikes if Central Arizona Project water were drastically reduced, reflecting anxiety about how cities like Phoenix and Tucson would cope with deeper cuts. That article also notes that CAP, as one of the newest major users under the Colorado River Compact, will likely be among the first to see allocations cut if basin states cannot agree on a new sharing plan, putting more pressure on cities already raising water rates.

Bloomberg’s analysis of AI and water finds that arid regions such as Arizona, Saudi Arabia and the United Arab Emirates are now courting AI datacenters as engines of economic growth, even as climate change tightens local water budgets. For residents of Phoenix suburbs already seeing water and sewer fees rise to maintain aging infrastructure, the prospect of additional hundreds of megawatts of power and millions of gallons of water going to AI facilities raises questions about who should get priority when supplies are finite.

Michigan, Virginia And Beyond

The Saline Township dispute in Michigan illustrates how AI datacenters can become flashpoints even in relatively water rich regions. As Project Censored reports, residents are suing to stop Oracle and OpenAI’s planned 2.2 million square foot campus, fearing it will drain aquifers, increase electricity bills and pave over farmland. In Northern Virginia, which hosts an estimated 30 percent of global datacenter capacity, local coalitions are pressing for greater transparency around water and energy use, warning that meeting projected datacenter electricity demand could double or triple the state’s consumption over 15 years.

In response, some states have considered bills requiring datacenters to run entirely on clean energy by 2040 or to report detailed energy and water use, although industry lobbying has defeated several of these proposals. Utilities and planners emphasize that modernizing grids and accelerating renewable build out will be essential if AI datacenters are to expand without blowing through climate targets or overwhelming local systems.

Key takeaway: From Georgia’s proposed AI datacenter moratorium to Arizona’s Colorado River politics and community lawsuits in Michigan and Virginia, local resistance is forcing AI developers, steel suppliers and utilities to engage with the full physical footprint of the AI boom.

Balancing The Benefits Of AI Datacenters With Their Costs

None of these challenges negate the real and sometimes transformative benefits that AI datacenters can deliver. AI models running in these facilities already support medical imaging analysis, fraud detection, logistics optimization, language translation and climate modeling, often improving accuracy or speed relative to traditional software. Utilities themselves are exploring AI tools hosted in datacenters to better detect leaks, forecast load and optimize maintenance, which can reduce water loss and energy waste in other parts of the system.

The key question is how to capture these benefits while reducing the resource intensity of the underlying infrastructure. Steel makers like Nucor argue that one part of the answer is using lower carbon EAF steel, which can cut embodied emissions per ton by more than two thirds relative to traditional blast furnace steel and is highly recyclable at end of life. Data center operators are experimenting with closed loop cooling systems that recycle rainwater and treated wastewater, potentially reducing freshwater use by roughly 70 percent compared with once through cooling, and with air cooled designs that trade some extra electricity use for lower direct water demand.

On the energy side, the IEA and WRI both stress the importance of pairing new AI datacenter loads with additional renewable generation, grid scale storage and demand response agreements that reduce datacenter consumption during peak grid stress. Policy makers can encourage this by tying tax incentives or permits to concrete metrics on power usage effectiveness (PUE), carbon intensity and water usage effectiveness (WUE), effectively making the business case for more efficient designs and location choices.
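The PUE and WUE metrics mentioned here have simple standard definitions (PUE is specified in ISO/IEC 30134-2; WUE comes from The Green Grid). The sketch below encodes them; the sample inputs are hypothetical illustrations, not data from any real facility.

```python
# Power usage effectiveness and water usage effectiveness, as referenced above.
#   PUE = total facility energy / IT equipment energy   (ideal -> 1.0)
#   WUE = site water consumed (liters) / IT equipment energy (kWh)
# Sample inputs below are hypothetical.

def pue(total_facility_kwh, it_kwh):
    return total_facility_kwh / it_kwh

def wue(site_water_liters, it_kwh):
    return site_water_liters / it_kwh

print(pue(130_000, 100_000))  # 1.3 -> 30% overhead for cooling and power loss
print(wue(180_000, 100_000))  # 1.8 liters of water per kWh of IT load
```

A facility that shifts from evaporative to closed loop or air cooling trades a lower WUE for a somewhat higher PUE, which is exactly the tradeoff described above.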

Key takeaway: The benefits of AI depend on physical infrastructure choices; smarter decisions about steel production, siting, cooling and grid integration can significantly reduce the environmental and price costs of AI datacenters.

Conclusion: The Future Of AI Datacenter Steel, Water, Energy And Price

AI datacenters have quickly become one of the defining projects of our era, pulling together advanced chips, enormous power and water flows and vast amounts of steel into a new kind of critical infrastructure. They promise powerful improvements in productivity, science and daily convenience, but they also risk deepening chip shortages, stressing already fragile water systems and driving up electricity and steel prices if left entirely to market forces.

For steel producers, the message is that AI datacenter construction is not a passing fad but a structural growth driver that justifies investment in low carbon EAF capacity, specialized joist and deck manufacturing and integrated data center steel systems, even as they stay alert to potential policy shifts or community resistance that might slow the build out. For communities and policymakers, the challenge is to harness the economic and technological upside of AI while setting clear rules around water use, grid impacts and transparency, so that households are not left footing the bill for higher power, water and memory chip prices.

Ultimately, whether AI datacenters become a net positive will depend on choices being made today about where they are built, how they are powered and cooled, and what materials - including sustainable steel - go into them. As a consumer, investor, engineer or voter, the important question to ask is simple: are we building AI’s physical footprint in a way that aligns with the long term health of our water, energy and industrial systems, or are we sleepwalking into a future where the cloud quietly drains our rivers and grids?
