AI Data Centers Use Fracked Gas: Environmental Impact & Concerns

The Intersection of AI and Fracking

The artificial intelligence revolution is quietly forging an unexpected alliance with one of the most controversial energy industries: hydraulic fracturing. As AI companies race to build massive data centers to power their computationally intensive models, many are turning directly to fracked natural gas as their primary energy source. This trend is particularly pronounced in Texas, where AI infrastructure is increasingly co-located with major gas-production sites. The environmental concerns surrounding hydraulic fracturing (fracking) – a method of extracting natural gas or oil from deep underground by injecting high-pressure fluid into rock formations – are now intersecting with the explosive growth of AI. A prime example is Poolside’s ambitious project, which involves constructing a data center complex on more than 500 acres in West Texas that will generate its own power by tapping natural gas from the Permian Basin [1]. The sheer scale of this operation and its implications for local communities highlight the complex trade-offs between technological progress and environmental sustainability.

The Scale of AI Data Centers

The energy demands of modern AI data centers are staggering, representing a new era of industrial-scale power consumption. These facilities require massive amounts of electricity, often sourced directly from fossil fuels to meet their computational needs. Poolside’s Horizon project in West Texas, for instance, will consume two gigawatts of power for computing, equivalent to the entire electric capacity of the Hoover Dam. To understand this scale, a gigawatt is a unit of power equal to one billion watts, enough to supply approximately 750,000 average homes simultaneously. This unprecedented energy consumption has significant environmental implications when powered by natural gas extracted through hydraulic fracturing. The trend extends beyond startups to industry giants: Meta plans to build a $10 billion data center the size of 1,700 football fields that will require two gigawatts of power for computation alone [4]. This massive energy appetite raises critical questions about sustainability and environmental impact, particularly as these facilities often locate near gas-production sites to secure reliable power. The sheer scale of this infrastructure development represents a fundamental shift in how we power technology, with consequences that extend far beyond computational performance to local communities and global climate goals.
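The homes-per-gigawatt comparison above is simple arithmetic to reproduce. A minimal sketch in Python, assuming an average US home draws about 1.3 kW continuously (a common ballpark figure, not taken from this article):

```python
# Back-of-envelope check of the scale figures cited in the text.
# Assumption (not from the article): an average US home draws ~1.3 kW
# continuously, i.e. roughly 11,400 kWh per year.

WATTS_PER_GW = 1_000_000_000

def homes_powered(power_gw: float, avg_home_kw: float = 1.3) -> int:
    """Approximate number of homes a given capacity could supply at once."""
    return int(power_gw * WATTS_PER_GW / (avg_home_kw * 1000))

print(homes_powered(1.0))  # ~769,000 per gigawatt, close to the article's 750,000
print(homes_powered(2.0))  # a two-gigawatt campus like the Horizon project
```

The article’s implied ratio of 750,000 homes per gigawatt corresponds to a per-home draw of about 1.33 kW, so the two figures are consistent.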

Community Impact

While AI companies tout the geopolitical necessity of their energy-intensive data centers, the human cost is being borne by residents living in their shadow. For people like Arlene Mendler, who moved to West Texas 33 years ago seeking “peace, quiet, tranquility,” the reality of living across from OpenAI’s Stargate facility has been a rude awakening. The constant soundtrack of construction and the bright lights that have spoiled her nighttime views are daily reminders of how her quality of life has been sacrificed. Residents express concern that noise and light pollution from data center construction are fundamentally altering their way of life. Beyond the immediate sensory intrusion, the demands of new data centers are straining the local water supply in a region already grappling with drought. The reservoirs of nearby Abilene were at roughly half capacity during Sam Altman’s visit, with residents limited to a twice-weekly outdoor watering schedule. While companies claim minimal water usage for closed-loop cooling systems, in which water is continuously recirculated and cooled, experts point to the indirect water consumption at the power plants generating the massive amounts of electricity these facilities require. This creates a cascading environmental impact that extends far beyond the data center’s fence line. The broader implications for community well-being are profound: once-quiet rural landscapes are being transformed into industrial zones without meaningful community consultation or consent.

Environmental Concerns

The environmental footprint of AI infrastructure extends far beyond electricity consumption, with water usage emerging as a critical concern that often goes overlooked. In drought-prone areas like West Texas, where reservoirs were at roughly half capacity during OpenAI CEO Sam Altman’s recent visit, the water supply is a pressing concern for residents already facing twice-weekly outdoor watering restrictions. Companies like Oracle claim their facilities will consume minimal water after initial setup, stating that each of their eight buildings will need just 12,000 gallons per year following an initial million-gallon fill for closed-loop cooling. In a closed-loop system, the same water is continuously recirculated, so consumption after the initial fill is minimal compared with designs that constantly draw fresh water. However, this narrow focus on direct cooling ignores the broader hydrological impact. As Professor Shaolei Ren of the University of California, Riverside explains, these efficient cooling systems require more electricity to operate, which means more indirect water consumption at the power plants generating that electricity. Experts warn that this indirect water consumption may be significantly underestimated in corporate environmental assessments. The result is a troubling paradox: while companies tout water-efficient cooling technologies, they are simultaneously driving increased demand for power generation that consumes massive amounts of water elsewhere in the system. A complete accounting of AI infrastructure’s environmental footprint must include both direct and indirect water usage across the entire energy supply chain [3]. As AI companies continue building in arid regions already struggling with water scarcity, this hidden hydrological cost threatens to exacerbate local environmental stresses while remaining largely invisible to end users enjoying AI-powered applications.
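The gap between direct and indirect water use can be illustrated with rough numbers. A minimal sketch, where the water intensity of gas-fired generation (0.5 gallons per kWh here) is an illustrative assumption, not a figure from the article:

```python
# Rough comparison of the stated direct cooling water vs. the indirect water
# consumed at power plants. The per-kWh water intensity is an assumed,
# illustrative value; real intensities vary by plant and cooling technology.

HOURS_PER_YEAR = 8760

def indirect_water_gal(power_gw: float, gal_per_kwh: float = 0.5) -> float:
    """Annual gallons consumed at the generator to supply a facility's load."""
    kwh_per_year = power_gw * 1_000_000 * HOURS_PER_YEAR  # GW -> kW, times hours
    return kwh_per_year * gal_per_kwh

direct = 8 * 12_000                  # the article's figure: 12,000 gal/year x 8 buildings
indirect = indirect_water_gal(2.0)   # a two-gigawatt campus at the assumed intensity

print(f"direct:   {direct:,} gal/year")       # 96,000
print(f"indirect: {indirect:,.0f} gal/year")  # billions of gallons
```

Even if the assumed intensity is off by several-fold, the indirect figure remains orders of magnitude larger than the direct one, which is precisely the experts’ point.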

The Geopolitical Context

The strategic rationale behind the rapid expansion of natural gas infrastructure for AI data centers is increasingly framed as a matter of national security and technological supremacy. AI companies argue that energy independence is crucial for competing globally, positioning reliable, domestic fossil fuel sources as essential to maintaining an edge over geopolitical rivals. This perspective was starkly articulated by OpenAI’s Chris Lehane, who pointed to China’s massive energy buildout – 450 gigawatts and 33 nuclear facilities constructed in the last year alone – as a benchmark the U.S. must match or exceed [1]. The implication is clear: speed and scale are paramount, even if it means prioritizing fossil fuels over a more deliberate transition to renewables. This geopolitical competition is often cited as a justification for fossil fuel dependency, creating a narrative where environmental considerations become secondary to strategic imperatives. The Trump administration’s July 2025 executive order explicitly fast-tracks gas-powered AI data centers by streamlining permits and offering financial incentives, while notably excluding support for renewable projects [2]. This policy shift reflects a broader prioritization of immediate energy security and technological leadership, fundamentally reshaping U.S. energy policy toward supporting the AI industry’s colossal power demands. Industry leaders consistently highlight that falling behind in the AI race is not an option, framing natural gas as the pragmatic, readily available solution to power the computational arms race. However, this approach raises critical questions about long-term energy strategy and whether short-term geopolitical gains justify locking in decades of fossil fuel infrastructure that could hinder the transition to cleaner alternatives.

Expert Opinion

From our perspective at NeuroTechnus, the rapid expansion of AI infrastructure represents not just a technological challenge, but a profound environmental and ethical crossroads. While the industry’s explosive growth demands unprecedented computational power, the current trajectory toward fossil fuel dependency – particularly natural gas extracted via hydraulic fracturing – creates a dangerous paradox. We are building systems intended to solve humanity’s greatest challenges while simultaneously exacerbating one of them: climate change. The strategic placement of data centers near major gas fields like the Permian Basin creates what amounts to energy islands, disconnected from broader grid modernization efforts and locking in carbon-intensive infrastructure for decades. This approach fundamentally misunderstands the concept of energy resilience, prioritizing short-term availability over long-term sustainability. The geopolitical argument that this fossil fuel buildout is necessary to compete with China represents a false dichotomy – true technological leadership requires innovating beyond outdated energy paradigms, not doubling down on them. As we’ve explored in our analysis of AI’s broader ecosystem in ‘The AI Race: Investing in Environments for Training AI Agents’ [1], sustainable infrastructure is not a constraint on innovation but its essential foundation. The Duke University finding that utilities typically use only 53% of their available capacity suggests significant room for optimization before resorting to new fossil fuel plants [2]. Rather than racing to build the most power-hungry systems, the industry should prioritize efficiency gains, load flexibility, and accelerated transition to truly clean energy sources. The environmental costs being externalized to communities like Abilene and Richland Parish represent not just local grievances but systemic failures in how we approach technological progress.
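The 53% utilization figure implies concrete headroom before new plants are needed. A minimal sketch of that arithmetic, in which the 100 GW grid size is purely illustrative and not from the Duke study:

```python
# Headroom implied by average utilization: if utilities use only 53% of
# available capacity on average, the remainder could in principle absorb
# flexible loads before new plants are built. Note that average headroom
# overstates peak-hour headroom, which is why load flexibility matters.

def headroom_gw(capacity_gw: float, utilization: float = 0.53) -> float:
    """Unused capacity, in gigawatts, at a given average utilization."""
    return capacity_gw * (1.0 - utilization)

slack = round(headroom_gw(100.0), 1)  # illustrative 100 GW regional grid
print(slack)       # 47.0 GW of average headroom
print(slack // 2)  # 23.0: two-gigawatt campuses that slack could notionally host
```

The caveat in the comments is the crux of the debate: averages hide peak-hour scarcity, so realizing this headroom depends on data centers being willing to curtail load during peaks.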

The Future of AI and Energy Infrastructure

The future of AI infrastructure remains profoundly uncertain, poised between technological innovation and potential environmental degradation. As we’ve examined, the current trajectory sees massive data centers consuming energy at unprecedented scales, often powered by natural gas extracted through hydraulic fracturing – a process involving high-pressure injection of fluids to release gas from shale formations. This raises critical questions about long-term sustainability and community impacts in regions like West Texas and Louisiana. Looking forward, three distinct scenarios emerge. In a positive outcome, massive private investment in small modular reactors and advanced renewables successfully transitions AI infrastructure to cleaner energy sources within the next decade. A neutral scenario sees continued reliance on natural gas but with improved efficiency and carbon capture technologies mitigating environmental harm. The negative path involves unchecked fossil fuel expansion that locks in decades of emissions while leaving local communities with environmental burdens and financial liabilities long after corporate contracts expire. The Duke University study suggests utilities could handle projected AI energy demands through better load management rather than new power plant construction [1], offering a potential middle path. Ultimately, whether AI becomes an environmental liability or a catalyst for clean energy innovation depends on whether short-term geopolitical competition continues to override long-term sustainability considerations.

Frequently Asked Questions

Why are AI companies turning to fracked natural gas for their data centers?

AI companies are increasingly using fracked natural gas as their primary energy source because it provides reliable power for their computationally intensive models. This trend is particularly strong in Texas, where AI infrastructure is co-located with major gas-production sites to secure this energy supply.

What are the environmental concerns associated with AI data centers powered by natural gas?

The environmental concerns include significant greenhouse gas emissions from fossil fuel consumption and water usage issues. In drought-prone areas like West Texas, data centers exacerbate local water scarcity through both direct cooling needs and indirect water consumption at power plants generating electricity for these facilities.

How do AI data centers impact local communities where they’re built?

Local communities face noise and light pollution from constant construction, along with diminished quality of life as quiet rural landscapes transform into industrial zones. Residents also worry about water supply issues in drought-affected regions, where data center demands compound existing resource constraints.

What geopolitical justification do AI companies give for relying on fossil fuels?

AI companies argue that energy independence through domestic fossil fuels is crucial for competing globally with China, which has built 450 gigawatts and 33 nuclear facilities in one year. This geopolitical competition frames fossil fuel dependency as necessary for maintaining technological supremacy.

What alternative approach does NeuroTechnus suggest for powering AI infrastructure?

NeuroTechnus advocates prioritizing efficiency gains, load flexibility, and accelerated transition to clean energy sources rather than doubling down on fossil fuels. They point to Duke University research showing utilities typically use only 53% of available capacity, suggesting optimization potential before building new power plants.
