UBC community members have well-founded concerns and questions about the environmental impacts of AI use. As we face the reality of an expanding digital footprint for our institution, a small group of us came together to educate ourselves and share what we learned with the rest of the UBC community.
Read on to find out what the university is doing, what we as individuals can do, and how all UBC’ers can join the conversation as we collectively learn about this new technological and educational landscape.
About the Group

Tara Ivanochko is a Professor of Teaching in the Department of Earth, Ocean and Atmospheric Sciences. She is also the Academic Director of the UBC Sustainability Hub, the Director of the Environmental Science (ENSC) program, and the Co-Director of the Certificate in Climate Studies and Action.

Simon Bates is the Vice Provost and Associate Vice President, Teaching and Learning, and is a Professor of Teaching in Physics.

Christina Hendricks is the Academic Director of the Centre for Teaching, Learning, and Technology, and a Professor of Teaching in Philosophy.

Elisa Baniassad is the Academic Director of the Learning Technology Innovation Centre and a Professor of Teaching in Computer Science.
What the University Is Doing
UBC, like many other institutions, is beginning to respond to the rising environmental impact of digital tools, including AI. While broader institutional plans are evolving, key areas of effort include:
- Supporting responsible use through education: Hosting conversations and providing guidance on ethical AI use, including sustainability considerations.
- Infrastructure planning with ARC: Partnering with ARC (Advanced Research Computing) to consolidate and centralize compute needs in energy-efficient environments going forward, rather than in scattered, less efficient deployments.
- Learning technology vendor accountability: When working with learning technology vendors, we are asking them to disclose their environmental practices, and we will prioritize vendors with strong renewable energy commitments.
- Promoting sustainable inquiry: Encouraging researchers to adopt energy-efficient practices and to report on model and data efficiency in their publications, in line with emerging norms in “Green AI.” Our LT Incubator will be promoting power-aware software engineering practices.
- Aligning with UBC’s climate action goals: Ensuring AI infrastructure decisions are consistent with UBC’s broader climate and sustainability commitments.
What Individual UBC Community Members Can Do
Even though the per-query impact of AI tools is small, individual choices can still play a role in shaping demand and promoting sustainable AI. UBC community members can:
- Be aware of AI’s environmental footprint: Using AI tools contributes to global energy demand. A year of daily AI chatbot usage can be equivalent to a short car trip in terms of carbon emissions. While this is small, it is not zero, and it scales with global adoption and frequent use.
- Rightsize the model for the task: Where you have options, choose smaller or task-specific models rather than general-purpose large ones. Avoid using, for example, GPT-4 for tasks that a simpler model or traditional search could handle.
- Use efficient prompting strategies: Minimize the number of prompts by being deliberate in your input. Precise, well-formed prompts reduce computation and result in fewer redundant queries. For instance, you can hone a prompt using a locally running model and then, after iteration, send that single refined prompt to the larger model (see the sketch after this list).
- Build lightweight applications: If developing AI-powered tools or projects, use efficient code and avoid bloated or over-engineered solutions. Consider model compression and latency-optimized backends. Employ power-usage-aware software engineering strategies.
- Incorporate cost into digital literacy training: Help students understand the environmental cost of their digital actions, just as we might for food production, printing or travel.
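To illustrate the prompting strategy above, here is a minimal sketch of a two-stage workflow: iterate on a prompt cheaply against a small local model, then send the final prompt to a larger hosted model exactly once. It assumes a local Ollama server running on its default port and the openai Python package with an API key configured; the model names and prompt are illustrative, not a recommendation of specific tools.

```python
# Two-stage prompting sketch: iterate cheaply on a small local model, then
# send the final prompt to the large hosted model exactly once.
# Assumes a local Ollama server (default port 11434) and the `openai`
# package with OPENAI_API_KEY set; model names are illustrative.
import requests
from openai import OpenAI

def draft_locally(prompt: str, model: str = "llama3") -> str:
    """Test a prompt against a small local model before spending a cloud query."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

# Iterate locally until the prompt is precise and well-formed.
prompt = "Summarize this abstract in two plain-language sentences: ..."
print(draft_locally(prompt))  # inspect the draft, refine the prompt, repeat

# Only once the prompt is settled, make a single call to the larger model.
client = OpenAI()
reply = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)
print(reply.choices[0].message.content)
```

The point is not the specific tools but the pattern: cheap local iteration, one remote call.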
How to Contribute to the Conversation
This is an emerging area of shared responsibility, and we would like to adapt and improve this guidance document over time. Faculty, staff, and students are invited to:
- Read “Our Current Understanding” below: this is where we are now, and this understanding is helping to drive our approach, mitigations, and further explorations.
- Help us learn more: Do you see gaps in our understanding? Are you concerned that we are missing nuance, or have blind spots? If so, please contribute, and help us as an institution learn.
Our Current Understanding of the Environmental Footprint of AI Tools: Concerns, Costs, and Mitigations
The widespread expansion of artificial intelligence (AI) use will continue to affect numerous fields of work, study, and society. This expansion has raised concerns about its environmental impact, particularly the high energy consumption and carbon emissions associated with training and operating AI models. Major tech companies report significant jumps in greenhouse gas (GHG) emissions, partly due to expanding data centres for AI workloads – Alphabet’s emissions rose 48% from 2019 to 2023, and Microsoft’s grew 29% from 2020, with both citing AI data centre energy use as a key factor (Van der Ven, 2025).
Some projections warn that AI and related digital technologies could consume up to 20% of global electricity by 2030 (Van der Ven, 2025), straining power grids and complicating climate goals. In 2024, the United States hosted ~45% of data centres globally (IEA, 2025). While a typical existing AI data centre consumes as much energy as a small city, the largest of the new AI data centres under construction will consume the equivalent of 2 million households. Data centres are concentrated in as few as five regions in the United States, and therefore have profound local impacts on energy systems (IEA, 2025). These energy demands have an accompanying water footprint: cooling the hardware in data centres requires enormous volumes of water. Training a single large AI model can use tens of millions of litres of water – for example, Meta’s LLaMA 3.1 consumed an estimated 22 million litres (about 9 Olympic-size pools’ worth) during training (Masley, 2024). The carbon emissions from such training runs are likewise huge, on the order of hundreds of tons of CO₂ – roughly equivalent to the lifetime emissions of five average cars for one model’s training (Masley, 2024). As a further example, training GPT-3 is estimated to have consumed 1,287 MWh of electricity and emitted over 500 metric tons of CO₂ (Patterson et al., 2021).
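As a quick sanity check on the GPT-3 figures above, the implied grid carbon intensity can be computed directly. This is a minimal sketch using only the numbers quoted in this section:

```python
# Sanity check: what grid carbon intensity do the GPT-3 training figures imply?
training_mwh = 1287        # estimated training energy (Patterson et al., 2021)
training_tons_co2 = 500    # estimated training emissions, metric tons

kwh = training_mwh * 1_000
grams_co2 = training_tons_co2 * 1_000_000
print(f"implied intensity: {grams_co2 / kwh:.0f} gCO2/kWh")
# ~389 gCO2/kWh, consistent with a typical fossil-heavy grid mix.
```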
This level of resource use stands in stark contrast to tech companies’ public commitments to sustainability, highlighting an “AI energy dilemma”: how to continue AI advancement without derailing climate targets (Samhan et al., 2025). In fact, even as AI promises efficiency gains in other domains, its own infrastructure today is powered largely by electricity from fossil fuels, prompting delays in coal plant retirements and even the reopening of a nuclear power plant to meet data centre demand (Van der Ven, 2025). Such trends have spurred greater scrutiny and calls for more sustainable “Green AI” practices in research and industry (Samhan et al., 2025).
This document explores the key concerns surrounding the environmental footprint of AI and the costs of querying models, and highlights potential mitigations – many of which require action from providers, but also some that can be taken by UBC and community members.
Carbon footprint of AI usage
The carbon footprint of using AI tools (inference) comes primarily from the electricity powering the data centre servers that handle AI queries. Globally, data centres (which host AI models among other services) accounted for roughly 1.5% of energy-related GHG emissions in 2024 (IEA, 2025). In the United States, they already draw about 6% of national electricity and, with surging AI demand, that share is expected to double by 2026 (Samhan et al., 2025). By 2030, data centres are projected to consume only ~0.1% of global energy – less than projections for electric vehicles or home air conditioning, which are also continuing to increase over time. However, in nations with advanced economies, data centres are projected to account for up to 20% of demand growth by 2030 (IEA, 2025).
Generative AI services like large language models are computationally intensive – running a query on an AI tool like ChatGPT entails far more computation than a basic web search using a search engine like Google. Early estimates suggested each prompt to ChatGPT consumed around 3 Wh of energy (about 10 times a Google search) – equivalent to keeping a light bulb on for 20 minutes (Van der Ven, 2025; You, 2025). If true, the aggregate impact of billions of AI queries would be massive. For instance, one back-of-the-envelope analysis using that figure estimated ChatGPT might consume on the order of 227 GWh of electricity per year to handle around 78 billion queries – roughly half the electricity that Netflix used globally for streaming in 2019 (Masley, 2024).
However, more recent data paints a somewhat less dire picture for the typical carbon cost per query. Thanks to efficiency improvements in models and hardware, a typical ChatGPT prompt (using OpenAI’s latest GPT-4 model) likely consumes only about 0.3 Wh of electricity – an order of magnitude lower than the initial estimate (You, 2025). In context, 0.3 Wh is a relatively trivial amount: less energy than an LED light bulb uses in a few minutes. Thus, the direct carbon emissions for a single short AI query are on the order of only a few tenths of a gram of CO₂. Even when a user engages in a longer conversation, the energy usage remains relatively small. One study found that on the order of 20–50 ChatGPT queries in total might be required to consume the energy needed to evaporate just 500 mL of water for server cooling (Masley, 2024) – highlighting that each individual query’s water and energy impact is small but non-zero. Even a heavy user who prompts an AI assistant dozens of times a day would only consume a few watt-hours total, a tiny fraction of the 28,000 Wh the average U.S. household uses in a day (You, 2025). One analyst translated a year’s worth of daily AI chatbot use into familiar terms: 12 months of ChatGPT usage consumes about as much energy as driving a car 10 km, or taking around 5 hot showers (5 minutes each), or running 2 hot baths (Salathé, 2025). In other words, the carbon footprint of frequent AI queries over an entire year might equate to a few kilograms of CO₂ – on the order of a single short car trip. This puts personal AI use in perspective: it is orders of magnitude less impactful than, say, daily commuting by car or long video streaming sessions. The same can be argued for water use to cool data centre infrastructure: it has been estimated that an hour of videoconferencing consumes between 2 and 12 litres of water for cooling (Obringer et al., 2021).
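To make these magnitudes concrete, the back-of-the-envelope arithmetic behind both the older and newer estimates can be reproduced in a few lines. The figures below are the estimates quoted above, not measurements:

```python
# Reproducing the back-of-the-envelope estimates quoted above.

# Older estimate: ~3 Wh per prompt across ~78 billion prompts per year.
aggregate_gwh = 3.0 * 78e9 / 1e9           # Wh -> GWh
print(f"aggregate: ~{aggregate_gwh:.0f} GWh/year")  # ~234 GWh, near the ~227 GWh figure

# Newer estimate: ~0.3 Wh per prompt for a heavy user (30 prompts/day).
daily_wh = 0.3 * 30                         # ~9 Wh/day
yearly_kwh = daily_wh * 365 / 1_000         # ~3.3 kWh/year
print(f"heavy user: ~{daily_wh:.0f} Wh/day, ~{yearly_kwh:.1f} kWh/year")
print(f"share of a 28,000 Wh/day US household: {daily_wh / 28_000:.4%}")
```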
So, the per-usage carbon footprint of AI tools is low in absolute terms; the aggregate footprint becomes significant only at very large scales of usage. This aggregate impact is what concerns many observers: as millions of users incorporate AI assistants into daily life, the small per-query emissions multiply. The overall energy demand from AI is indeed growing quickly, but whether it will lead to an exponential rise in global ICT emissions is uncertain. Historical trends in computing suggest a counterbalance from efficiency gains: over 2010–2018, data centre compute workloads increased by over 550% but energy use grew only ~6% (Ritchie, 2024), thanks to improved chip and cooling efficiency. Continued efficiency improvements may similarly keep the carbon footprint of AI usage in check even as usage soars. Still, given current trajectories, AI’s electricity hunger is already prompting infrastructure shifts (like dedicated power plants for data centres) and requires close monitoring.
Why, then, the concern? Partly, it stems from conflating the one-time training cost of AI models with the incremental cost of using them. Training a large AI model is extremely energy-intensive – it requires running massive computations on power-hungry hardware for days or weeks on end. For example, training a cutting-edge language model has been estimated to consume hundreds of megawatt-hours: on the order of 300 tons of CO₂ emissions for a single model training run (Samhan et al., 2025). However, that training cost is amortized over the model’s useful life and its many users. Once deployed, each additional query does not retrigger the full training; it only incurs the much smaller inference computation. In effect, the marginal cost (energy and carbon) of each query is very low, since the substantial upfront footprint is a sunk cost spread across billions of inferences. Of course, if usage increases dramatically, more servers and maybe more frequent retraining or fine-tuning will be needed, which adds periodic bumps in resource usage. But for now, individual AI queries have a negligible environmental cost when considered in isolation. The key issue is scaling: millions or billions of such queries globally can collectively draw enough power to be noticeable at the grid level, which is why efficiency and clean energy for AI remain important.
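The amortization argument can also be made concrete. In the sketch below, the 300-ton training figure and the few-tenths-of-a-gram inference figure come from the text above, while the assumed lifetime query count (10 billion) is purely illustrative:

```python
# Amortizing a one-time training cost over inference queries.
training_g_co2 = 300 * 1_000_000      # 300 tons of CO2 (figure cited above)
lifetime_queries = 10e9               # ASSUMPTION: 10 billion queries served
per_query_training = training_g_co2 / lifetime_queries   # ~0.03 g/query

per_query_inference = 0.3             # a few tenths of a gram (see above)
total = per_query_training + per_query_inference
print(f"training adds ~{per_query_training:.2f} g/query "
      f"({per_query_training / total:.0%} of the per-query total)")
```

Under these assumptions the one-time training cost adds only a few percent to each query’s footprint, which is why the marginal cost of use is so low.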
Mitigations
While the per-use environmental impact of AI is small, the cumulative footprint of widespread AI adoption is prompting action to make AI more sustainable. Researchers, policymakers, and industry experts are advancing several strategies to mitigate AI’s environmental footprint through a variety of measures.
Improving Data Centre Sustainability: Because nearly 97% of AI’s carbon footprint comes from operational energy use (running the servers) rather than manufacturing (Samhan et al., 2025), a top priority is transitioning data centres to cleaner energy. Tech companies are investing heavily in renewable power and novel energy sources to run AI infrastructure. For example, Microsoft has even struck a deal to power its data centres with electricity from a reopened nuclear plant (Three Mile Island) to secure carbon-free, reliable energy (Ritchie, 2024).
Locating AI data centres in regions with abundant clean energy (solar, wind, hydro, etc.) and cooler climates can dramatically cut emissions. Studies find that choosing an optimal site can reduce a data centre’s operational carbon output by 60% or more (Samhan et al., 2025). Companies are increasingly signing power purchase agreements for renewables dedicated to their data centres and exploring on-site generation to ensure AI workloads are powered with as little carbon impact as possible. Additionally, better cooling technologies (such as advanced liquid cooling) can reduce water usage and energy loss, making each watt of computation more efficient.
Energy-Efficient Hardware: Another mitigation is the use of specialized, high-efficiency hardware designed for AI computations. AI models run fastest on accelerators like GPUs, but even those can be outclassed by newer AI-specific chips. For instance, Google’s Tensor Processing Units (TPUs) and other neural processors are tailor-made for the matrix operations common in AI, and they offer significantly greater performance per watt than general-purpose chips (Samhan et al., 2025). Recent hardware comparisons show AI accelerators achieving 1.5× or greater energy efficiency than the best GPUs on the same tasks (Samhan et al., 2025).
By upgrading to efficient hardware (and regularly decommissioning older, less efficient servers), data centres can handle growing AI workloads without a proportional rise in power consumption. This includes exploring emerging processor designs optimized for machine learning workloads and using metrics like TOPS/W (tera-operations per second per watt) to guide procurement towards the most energy-efficient AI systems. The rapid pace of innovation in AI hardware – from GPUs to TPUs to NPUs (neural processing units) – is encouraging, as each generation tends to deliver more performance for the same energy input.
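As a concrete illustration of using TOPS/W to guide procurement, here is a minimal sketch; the accelerator names and figures are hypothetical, not vendor specifications:

```python
# Hypothetical procurement comparison using TOPS/W (tera-operations per
# second per watt). Accelerator names and figures are illustrative only.
accelerators = {
    "general-purpose GPU": {"tops": 300.0, "watts": 400.0},
    "AI-specific ASIC":    {"tops": 400.0, "watts": 250.0},
}

for name, spec in accelerators.items():
    tops_per_watt = spec["tops"] / spec["watts"]
    print(f"{name}: {tops_per_watt:.2f} TOPS/W")
# Higher TOPS/W means more useful compute per joule: here the ASIC delivers
# ~1.6 TOPS/W vs ~0.75 TOPS/W, roughly a 2x efficiency advantage.
```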
Algorithmic and Model Optimization (Green AI): There is a concerted push in the AI research community toward “Green AI” – developing algorithms that achieve the needed results with much lower computational cost (Schwartz et al., 2020). This includes improving the efficiency of training algorithms, so models reach target accuracy with fewer runs or less data, and employing techniques like model compression, distillation, pruning, and weight quantization to reduce the size and inference cost of models without compromising performance (Xu et al., 2021). Optimizing code and libraries for efficiency, and choosing more efficient model architectures, can yield substantial energy savings.
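To ground one of these techniques, here is a minimal sketch of post-training dynamic quantization in PyTorch, which stores the weights of Linear layers as 8-bit integers to shrink the model and cut inference compute; the toy model is illustrative:

```python
# Minimal illustration of post-training dynamic quantization in PyTorch:
# Linear-layer weights are stored as 8-bit integers, shrinking the model
# and reducing inference compute. The toy model is illustrative only.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
with torch.no_grad():
    print(model(x).shape, quantized(x).shape)  # same interface, smaller weights
```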
Importantly, not every application requires the largest possible model; using smaller, task-specific models where appropriate can slash energy use. A broad, general-purpose AI (like a 175-billion-parameter language model) will consume far more energy than a streamlined model tuned for a specific task (Samhan et al., 2025). Deploying AI solutions with an eye to “right-sizing” the model to the task is a practical way to cut waste. Overall, Green AI advocates for treating efficiency as a key evaluation metric alongside accuracy in AI research. By reporting energy or carbon metrics for model training and inference, researchers can incentivize innovations that reduce resource usage. This cultural shift – akin to focusing on fuel efficiency in car design, not just speed – is aimed at bending the curve of AI’s future resource demands.
Policy, Transparency, and Offsets: Finally, broader policy and transparency measures can help mitigate AI’s environmental impact. Experts argue for more transparent reporting of AI energy use by companies (Ritchie, 2024). If AI providers regularly disclose the power and carbon footprint of their data centres and services, it allows for tracking progress and accountability. Such transparency can also inform policy: governments and regulators could set efficiency standards for data centres or provide incentives for using clean energy. In some regions, integrating large AI data centres into grid planning is crucial – ensuring there is sufficient renewable supply or grid infrastructure so that local spikes in AI power demand don’t lead to more fossil fuel burn elsewhere (Ritchie, 2024).
Additionally, companies are exploring carbon offsetting and carbon removal investments to compensate for the emissions from AI workloads that cannot yet be eliminated. While offsets are not a long-term solution, they can bridge the gap as the sector transitions to sustainable energy. Looking ahead, strategies like scheduling non-urgent AI training jobs for times of surplus renewable energy (sketched below), and sharing pre-trained models to avoid redundant computations, could further reduce the total energy cost of AI to society. In summary, a combination of clean energy, efficient hardware, optimized algorithms, and transparent practices constitutes the path forward to reconcile AI’s growth with environmental stewardship.
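As a sketch of the scheduling idea, the gate below delays a non-urgent job until grid carbon intensity falls below a threshold. The intensity lookup is a hypothetical placeholder (a real deployment would query the grid operator or a commercial carbon-intensity service), and the threshold is arbitrary:

```python
# Carbon-aware gate: run a non-urgent job only when the grid is clean.
# The intensity lookup is a hypothetical stub; a real deployment would
# query the grid operator or a commercial carbon-intensity service.
import time

THRESHOLD_G_PER_KWH = 150  # arbitrary illustrative threshold

def current_grid_intensity() -> float:
    """Stub: return current grid carbon intensity in gCO2/kWh."""
    return 120.0  # placeholder value so the sketch runs end to end

def run_training_job() -> None:
    print("starting training job...")  # stand-in for the real workload

def wait_for_clean_power(poll_seconds: int = 900) -> None:
    # Poll every 15 minutes; launch the job once intensity dips below threshold.
    while current_grid_intensity() > THRESHOLD_G_PER_KWH:
        time.sleep(poll_seconds)
    run_training_job()

wait_for_clean_power()
```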
References
International Energy Agency. (2025). Energy and AI. https://www.iea.org/reports/energy-and-ai
Masley, A. (2024, January 27). Using ChatGPT is not bad for the environment. Substack. https://andymasley.substack.com/p/individual-ai-use-is-not-bad-for
Obringer, R., Rachunok, B., Maia-Silva, D., Arbabzadeh, M., Nateghi, R., & Madani, K. (2021). The overlooked environmental footprint of increasing Internet use. Resources, Conservation and Recycling, 167, 105389. https://doi.org/10.1016/j.resconrec.2020.105389
Patterson, D., Gonzalez, J., Le, Q., Liang, C., Munguía, L., Rothchild, D., & Dean, J. (2021). Carbon emissions and large neural network training. arXiv preprint arXiv:2104.10350.
Ritchie, H. (2024, November 18). What’s the impact of artificial intelligence on energy demand? Sustainability by Numbers. https://www.sustainabilitybynumbers.com/p/ai-energy-demand
Salathé, M. (2025, April 3). AI energy use in everyday terms. Engineering Prompts (Substack). https://engineeringprompts.substack.com/p/ai-energy-use
Samhan, O., Madhok, S., Bhimarao, M., Fathi, S., Rege, S., & Mokashi, S. (2025, March 11). Solving the AI energy dilemma. Willis Towers Watson. https://www.wtwco.com/en-gb/insights/2025/03/solving-the-ai-energy-dilemma
Schwartz, R., Dodge, J., Smith, N. A., & Etzioni, O. (2020). Green AI. Communications of the ACM, 63(12), 54–63. https://doi.org/10.1145/3381831
Van der Ven, H. (2025, February 24). Is artificial intelligence bad for the environment? UBC Magazine. https://magazine.alumni.ubc.ca/2025/environment-technology/artificial-intelligence-bad-environment
Xu, J., Zhou, W., Fu, Z., Zhou, H., & Li, L. (2021). A survey on green deep learning. arXiv preprint arXiv:2111.05193.
You, J. (2025, February 7). How much energy does ChatGPT use? Epoch AI (Gradient Updates). https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use
