
The Sustainability of Data: Compute, Cost, and Carbon

Naveed Anjum
9-10 Min Read Time

For much of the last decade, enterprise data strategy has been driven by scale. More data. More models. More compute. But as AI moves from experimentation to core business infrastructure, a new reality is emerging: compute has a physical footprint, and that footprint carries financial, energy, and carbon consequences.

 

For CIOs and investor relations leaders, sustainability is no longer a moral add-on or a regulatory footnote. It is becoming a material variable in cost control, operational resilience, and long-term valuation. The sustainability of data now sits at the intersection of compute efficiency, financial discipline, and carbon accountability.

 

Jonathan Koomey, a veteran researcher who has studied data-center energy use for more than three decades, puts the situation in perspective:

 

“It’s not like we’re helpless in the face of this massive increase in the demand for AI. There’s all sorts of things we can do, and we’re only at the beginning of doing those things.”

 

At the same time, Koomey cautions against simplistic projections and panic-driven narratives:

 

“We don’t know how much electricity AI will use next year, never mind five years from now, and anyone who comes in with a projection beyond that is just nuts, because [information and communication tech] changes so fast.”

 

The message for enterprise leaders is clear. The challenge is real, but it is also manageable if data and AI systems are designed with sustainability in mind.

 

Compute Is Growing – But Inefficiency Is a Choice

Enterprise compute demand is rising rapidly. AI training, real-time analytics, and increasingly complex data pipelines are pushing infrastructure to scale at unprecedented speed. What is often misunderstood is that rising energy consumption is not driven by compute volume alone. It is shaped by the architectural decisions organizations make — particularly around model design and workload orchestration.

 

As Boris Gamazaychikov, senior manager on the emissions reduction team at Salesforce, explains, sustainability starts with questioning whether the largest AI models are always necessary:

 

“It’s really important, when we look at AI, to ask ourselves if the most general purpose, largest models are necessary.”

 

Gamazaychikov highlights that energy use increases with model size and parameter count, which is why many enterprises are shifting away from single, massive models toward more focused alternatives. Instead of deploying one oversized system to handle every task, organizations are increasingly using collections of smaller, specialized models that are better aligned to specific business functions.
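
To make the pattern concrete, the sketch below routes each task to the smallest model that can handle it. The model names, the fleet, and the lookup-table routing are all hypothetical; a production system would use a trained router or an orchestration framework, but the principle is the same.

# A minimal sketch of routing tasks across smaller, specialized models.
# The fleet, model names, and lookup-based routing are hypothetical;
# real systems would use a trained classifier or an orchestration layer.

from dataclasses import dataclass

@dataclass
class ModelProfile:
    name: str
    parameters_b: float     # parameter count, in billions
    relative_energy: float  # rough proxy: energy per request grows with size

# Hypothetical fleet: small task-specific models plus one large
# general-purpose fallback, used only when nothing else fits.
FLEET = {
    "summarize": ModelProfile("doc-summarizer", 1.3, 1.0),
    "classify":  ModelProfile("ticket-classifier", 0.4, 0.3),
    "extract":   ModelProfile("entity-extractor", 0.8, 0.6),
    "fallback":  ModelProfile("general-llm", 70.0, 50.0),
}

def route(task_type: str) -> ModelProfile:
    """Send a task to the smallest model that can handle it."""
    return FLEET.get(task_type, FLEET["fallback"])

if __name__ == "__main__":
    for task in ("classify", "summarize", "translate"):
        model = route(task)
        print(f"{task:10s} -> {model.name} ({model.parameters_b}B params, "
              f"~{model.relative_energy}x energy per request)")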

 

Describing this shift, Gamazaychikov notes:

 

“Instead of having this giant model that’s super inefficient, trying to parse through the entire internet, you have these smart, smaller, modular models that are more efficiently getting that information.”

 

For CIOs, the implication is clear. Compute growth does not automatically require proportionally higher energy use or carbon emissions. Efficiency emerges from deliberate architectural choices — selecting appropriately sized models, orchestrating workloads intelligently, and designing data pipelines that minimize unnecessary computation.

 

Carbon Is Not Abstract – AI Has a Physical Cost

One of the most persistent misconceptions about artificial intelligence is that it is immaterial. In reality, AI systems are deeply physical. They depend on extractive supply chains for hardware, power grids that remain largely carbon-based, water-intensive cooling systems, and data-center infrastructure that must operate continuously at scale.

 

Kate Crawford, author of Atlas of AI and a leading scholar on the environmental impact of machine learning, captures this reality plainly:

 

“AI is made of minerals, energy, and vast amounts of water.”

 

For investor relations teams, this framing has become increasingly material. Carbon exposure now influences capital allocation, regulatory scrutiny, and long-term reputational risk. AI-driven growth that treats compute as an abstract, infinite resource — rather than a physical system with environmental constraints — risks becoming an ESG liability.

 

The pace of AI deployment is also colliding with real-world infrastructure limits. Brian Janous, former Vice President of Energy at Microsoft and now an energy-infrastructure entrepreneur, underscored this tension in a 2024 interview:

 

“This load is coming up faster than utilities can actually build out infrastructure to support it.”

 

Janous’s warning highlights a critical point for enterprise leaders. The sustainability challenge is not only about future technology breakthroughs, but about today’s grid capacity, energy availability, and carbon intensity. 

 

For CIOs, sustainable AI does not mean slowing innovation. It means disciplining it, with visibility into where energy is consumed, when carbon intensity peaks, and how architectural decisions amplify or reduce environmental impact across the data and AI stack.
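
One way that discipline can show up in practice is carbon-aware scheduling of deferrable work such as retraining runs or batch ETL. The sketch below is illustrative only: get_grid_intensity() stands in for a real carbon-intensity feed for the data-center region, and the threshold and wait budget are assumptions, not recommendations.

import time

CARBON_THRESHOLD = 250.0  # illustrative cutoff, gCO2e per kWh

def get_grid_intensity() -> float:
    """Placeholder for a live carbon-intensity signal (gCO2e/kWh)
    for the region where the workload runs."""
    return 310.0  # hypothetical reading

def run_when_grid_is_clean(job, max_wait_s=6 * 3600, poll_s=900) -> bool:
    """Run a deferrable job when grid carbon intensity drops below the
    threshold, or when the wait budget is exhausted (to protect SLAs)."""
    deadline = time.monotonic() + max_wait_s
    while time.monotonic() < deadline:
        if get_grid_intensity() <= CARBON_THRESHOLD:
            job()
            return True    # ran in a low-carbon window
        time.sleep(poll_s)  # check again later
    job()                   # deadline reached: run anyway
    return False

# Usage: run_when_grid_is_clean(nightly_batch_job)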

 

Measurement Is the Foundation of Sustainability

Measurement is where most sustainability conversations stall.

 

Not because leaders do not care, but because the data is fragmented. Energy use sits in one system. Cloud spend in another. Emissions are often estimated, not observed. AI workloads move across regions, platforms, and time windows faster than reporting can keep up.

 

Without reliable numbers, planning becomes guesswork. Infrastructure decisions are made without a clear view of demand. Trade-offs remain implicit. Risk is discussed in broad terms instead of operational detail.

 

This gap is visible at the global level as well. The International Energy Agency (IEA), which tracks energy demand across industries, has repeatedly pointed out how little concrete data exists on the energy impact of AI systems. 

 

Fatih Birol, Executive Director of the IEA, summed up the issue succinctly:

 

“AI is one of the biggest stories in the energy world today – but until now, policymakers and markets lacked the tools to fully understand the wide-ranging impacts.”

 

That lack of transparent energy data is especially concerning because projections show rapid growth. According to the IEA:

 

“Data centres worldwide consumed around 415 terawatt-hours (TWh) of electricity in 2024, a figure expected to more than double to 945 TWh by 2030 as AI workloads expand.”
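
As a back-of-envelope check on what that projection implies (the compound-growth framing here is ours, not the IEA's):

# Growing from 415 TWh (2024) to 945 TWh (2030) over six years:
growth = (945 / 415) ** (1 / 6) - 1
print(f"Implied annual growth: {growth:.1%}")  # roughly 14.7% per year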

 

For enterprise leaders, this puts measurement in a different category. It is not a reporting exercise. It is a prerequisite for decision-making. Without clear metrics that connect compute usage to cost and carbon, sustainability goals remain detached from day-to-day operations.
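
At the workload level, the arithmetic that connects compute to cost and carbon is simple: energy is roughly device-hours times average device power times the facility's PUE, and emissions are energy times the grid's carbon intensity. The sketch below applies that standard estimate; every input value is an illustrative assumption, not a measured figure.

# Connecting compute usage to energy, carbon, and cost.
# All defaults are illustrative assumptions.

def workload_footprint(gpu_hours: float,
                       avg_power_kw: float = 0.4,        # per GPU, assumed
                       pue: float = 1.2,                 # facility overhead
                       grid_gco2_per_kwh: float = 350.0, # grid intensity
                       usd_per_gpu_hour: float = 2.50):
    energy_kwh = gpu_hours * avg_power_kw * pue
    co2e_kg = energy_kwh * grid_gco2_per_kwh / 1000.0
    cost_usd = gpu_hours * usd_per_gpu_hour
    return energy_kwh, co2e_kg, cost_usd

energy, co2e, cost = workload_footprint(gpu_hours=5_000)
print(f"Energy: {energy:,.0f} kWh   CO2e: {co2e:,.0f} kg   Cost: ${cost:,.0f}")
# -> Energy: 2,400 kWh   CO2e: 840 kg   Cost: $12,500

The same three numbers, tracked per workload, are what turn implicit trade-offs into operational detail.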

 


 

Efficiency Is the Highest-ROI Sustainability Strategy

The most credible sustainability strategies are built on efficiency at the system level. In data and AI infrastructure, efficiency determines how much compute is required to deliver outcomes, which directly affects cost, energy use, and carbon exposure.

 

Clear evidence of this already exists in large-scale AI deployments. In a widely cited study conducted by Google and the University of California, Berkeley, researchers showed that the carbon footprint of training a large AI model fell by more than 700 times over four years.

 

The reduction came through the use of more efficient models, optimized processors, and greater reliance on low-carbon energy sources.
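
Gains of that magnitude arise because independent efficiency levers compound multiplicatively. The factors below are purely hypothetical, chosen to illustrate the compounding effect; they are not the study's actual decomposition.

# Independent efficiency gains multiply rather than add.
factors = {
    "more efficient model architecture": 10.0,  # hypothetical
    "optimized accelerators": 14.0,             # hypothetical
    "lower-carbon energy mix": 5.0,             # hypothetical
}

total = 1.0
for lever, gain in factors.items():
    total *= gain
    print(f"{lever:35s} x{gain:>5.1f}  (cumulative x{total:,.0f})")
# cumulative reaches x700 from three modest, independent improvements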

 

Reflecting on those results, the researchers concluded:

 

“These drastic overall improvements, as well as their trajectory over time, suggest that extrapolating current parameters to predict future carbon dioxide emissions is fraught with peril.”

 

The implications for enterprise leaders are substantial. Efficiency gains can meaningfully alter the environmental profile of AI systems even as overall compute demand increases. Model selection, hardware optimization, and energy-aware infrastructure decisions influence cloud spend, energy consumption, and emissions at the same time.

 

For CIOs, efficiency becomes a strategic lever that shapes long-term infrastructure economics. For investor relations leaders, it provides a sustainability narrative grounded in measurable performance outcomes. Efficiency delivers financial return while reinforcing environmental responsibility, making it one of the most practical paths toward sustainable AI at scale.

 

Smarter Data and AI Pipelines Change the Equation

Sustainability becomes actionable when data and AI pipelines are designed deliberately. Optimized pipelines allow enterprises to extract more value from existing compute capacity while reducing waste across storage, processing, and model execution.

 

Intelligent workload orchestration, model efficiency techniques, and unified data architectures help reduce redundant computation and unnecessary data movement. Together, these practices limit energy consumption without constraining analytical capability or model performance.
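
One small, concrete example of eliminating redundant computation is caching the results of expensive, repeated operations such as embedding generation, so identical inputs are processed only once. In the sketch below, embed_text() is a placeholder for a real model call.

from functools import lru_cache

@lru_cache(maxsize=100_000)
def embed_text(text: str) -> tuple:
    """Placeholder for an expensive embedding call. Cached, so repeated
    inputs (common in tickets, logs, and documents) cost compute once."""
    return tuple(ord(c) % 7 for c in text)  # stand-in for a model call

docs = ["reset password", "reset password", "billing question"]
for d in docs:
    embed_text(d)
print(embed_text.cache_info())  # hits=1: one duplicate call avoided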

 

Many enterprises are addressing this by bringing more discipline into their AI workflows, reducing fragmentation across data, models, and infrastructure.

 

The impact of system-level optimization has been demonstrated quantitatively by Google’s researchers, who observed:

 

“The right combination of model, processor, data center and energy source can reduce the carbon footprint of training an ML system by 1000 times.”

 

For CIOs, this reinforces a practical reality. Well-designed data and AI pipelines support performance, scalability, and sustainability at the same time. Investments in architecture, orchestration, and efficiency deliver benefits that compound across cost control, energy use, and operational resilience.

 

For investor relations leaders, this creates a clear narrative. Capital allocated to modern data infrastructure translates into measurable returns across financial efficiency, environmental performance, and long-term risk management.

 

Why the Sustainability of Data Now Matters to Leadership

The sustainability of data has become a leadership priority. It already influences procurement decisions, cloud contracts, regulatory expectations, and investor conversations. Compute choices made today shape cost structures, operational resilience, and carbon exposure over the long term.

 

Data and AI systems now function as physical infrastructure. They consume energy continuously and operate within real-world constraints. Leadership teams that recognize this reality can plan growth with discipline and allocate capital with greater clarity.

 

For CIOs, sustainability supports control over infrastructure complexity, operating costs, and system reliability. For investor relations leaders, it strengthens credibility through narratives grounded in measurable performance and environmental accountability.

 

Eco-friendly enterprise AI reflects a mature approach to innovation. Organizations that treat sustainability as an engineering and governance discipline are better positioned to scale responsibly and build durable enterprise value.

 

Build Sustainable Enterprise AI with Arbisoft and Databricks

Sustainable enterprise AI depends on disciplined data pipelines, measurable outcomes, and architectures designed to scale responsibly.

 

Arbisoft partners with Databricks to help enterprises design, optimize, and govern data and AI systems that balance performance, cost efficiency, and environmental accountability. From modern data platforms to production-ready AI pipelines, we help teams turn sustainability into an operational advantage.

 

Let’s talk about building enterprise AI that scales with control.
