Little

Apr 3, 2026

by Garrett Herbst

The Power Behind Your Prompt: Responsible AI Begins With How We Use It

This article is co-authored by Grant Saso.

The climate and infrastructure concerns surrounding artificial intelligence models are hard to ignore. The numbers are uncomfortable, the implications are complex, and the calls for regulation are absolutely warranted. While all of this is true, there is yet another layer to this story, one that sits closer to the scale of our everyday decisions.

This article offers a complementary perspective to the broader conversation—including Scott Brideau’s recent article “Will AI Be the Final Straw in the Climate Change Equation”—by examining AI through the lens of individual use. It is not intended as a contradiction or a correction, but as an additional angle that considers how personal habits, choices, and expectations around AI contribute to its larger impact. Understanding why, when, and how we use this technology is an essential part of shaping what it ultimately becomes.

A NOTE ON PERSPECTIVE

In our roles on Little’s Emerging Technologies team, we spend a lot of time thinking about how emerging technologies shape the built environment and the systems that support it. In our work, artificial intelligence is no longer an abstract concept or a future scenario. It is already influencing how we design, analyze, and make decisions, often in ways that feel both promising and unresolved. As practitioners who care deeply about sustainability, resilience, and long-term impact, we believe it is important to look not only at what AI can do, but at how it is used and what it quietly but firmly demands in return.

THE PERSONAL SCALE STILL MATTERS

At times, the resource conversation around AI feels abstract and overwhelming. Nuclear plant planning, global water usage projections, and grid resilience are all issues far beyond the control of any single person. And yet, there is another lever, quieter and less visible, but still meaningful: the habits we collectively develop around AI use.

Part of the disconnect comes from how AI’s infrastructure appears. Data centers look enormous because they aggregate millions of tiny actions into one visible place. If everyday devices like microwaves or TVs were collocated, they would look much more alarming. The issue is not the scale of individual use, but the visibility of aggregation, which makes it harder to connect a massive building with a very small per-person impact.

To us, it feels very similar to the way we approach recycling. Throwing one plastic bottle into a blue bin will not save the rainforest, and tossing one into the trash will not doom the planet. The point was never the individual bottle. It was the accumulation of choices and the messaging behind a movement.

The same logic applies to AI. One frivolous query will not break the grid, and one thoughtful use will not fix global inequity. But the lifetime habits and norms we cultivate now will scale far beyond our individual impacts, all the more so while the culture of AI is still being written.

UNDERSTANDING WHERE THE ENERGY GOES

People often talk about “AI’s energy use” in all-encompassing terms, but it really comes from two primary activities: training the models themselves and serving the day-to-day queries that people run.

Training state-of-the-art models requires substantial energy. The energy required to train GPT-4 is estimated to have been almost 50 GWh, and while the exact figures are not publicly disclosed, that estimate is enough power to run San Francisco for almost three days.¹ That number can sound alarming in isolation, but it is important to consider a few other factors:

  • Training is a one-time event, not a daily drain.
  • That one-time cost is spread across hundreds of millions of users.
  • Training efficiency is improving quickly. For example, researchers have shown that the L-Mul algorithm, when implemented in hardware, could cut energy use by up to 95 percent.²
  • Advances in quantum computing could one day deliver billions-of-GPU–level performance using little more than the power of a home outlet.³
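The amortization point above can be made concrete with quick arithmetic. A sketch, assuming the article’s ~50 GWh training estimate and a hypothetical user base of 300 million (an assumed figure for illustration, not a disclosed one):

```python
# Rough amortization of a one-time training cost across a user base.
# Both figures below are assumptions: ~50 GWh from the article's
# estimate, 300 million users chosen purely for illustration.
TRAINING_WH = 50e9    # ~50 GWh expressed in watt-hours
USERS = 300e6         # hypothetical user count
WH_PER_QUERY = 3.0    # the article's per-query figure

per_user_wh = TRAINING_WH / USERS
print(f"~{per_user_wh:.0f} Wh of training energy per user")        # ~167 Wh
print(f"~{per_user_wh / WH_PER_QUERY:.0f} query-equivalents each")  # ~56
```

Under these assumptions, the entire training run amounts to roughly the energy of a few dozen queries per user, which is why the one-time cost shrinks quickly once it is spread across a large audience.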

Individual, day-to-day queries consume more energy than a traditional Google search, but not by much. We use the term traditional purposefully, as most Google searches today are accompanied by a response from Google’s Gemini model. A single AI query uses around 3 watt-hours of energy, which, placed in the context of your daily life, yields the following comparisons:

  • 15 minutes of social media scrolling ≈ 12 AI queries⁴
  • Streaming Netflix for 1 hour ≈ 27 AI queries⁵
  • A compact sedan idling for 5 minutes ≈ 146 AI queries⁶
  • A 5-minute hot shower ≈ 833 AI queries⁷
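Most of these equivalences can be reproduced directly from the footnote assumptions. A minimal sketch using the article’s 3 Wh-per-query figure (small differences from the listed numbers come down to rounding):

```python
WH_PER_QUERY = 3.0  # the article's figure for a single AI query

def queries(watt_hours: float) -> float:
    """Express an energy cost as a number of AI-query equivalents."""
    return watt_hours / WH_PER_QUERY

# Footnote assumptions, converted to watt-hours:
netflix_wh = 0.08 * 1000           # 1 hour of streaming at 0.08 kWh/h
sedan_wh = 19e6 / 3600 * (5 / 60)  # idling at 19 MJ/h, for 5 minutes
shower_wh = 0.5 * 1000 * 5         # 5-minute shower at 0.5 kWh/min

for label, wh in [("Netflix, 1 hr", netflix_wh),
                  ("Idling sedan, 5 min", sedan_wh),
                  ("Hot shower, 5 min", shower_wh)]:
    print(f"{label}: ~{queries(wh):.0f} query-equivalents")
```

The social media comparison is omitted here because its footnote is stated in milliamp-hours, which requires an additional battery-voltage assumption to convert into watt-hours.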

Most of us have let an episode of The Office play in the background while making dinner without giving it much thought, yet that single habit consumes more energy than an entire day of typical AI use. Framed this way, the question shifts. It stops being whether an AI query produces emissions, because everything we do does. The more useful question becomes what that query replaces.

USING AI WITH INTENTION

This is where personal agency starts to matter.

If we all treat AI like a novelty generator we flip on for entertainment, convenience, or boredom, demand will skyrocket and become hard to pull back. Infrastructure will grow to match our habits, not our ideals.

We have started thinking about AI the same way we think about household appliances. Running something longer uses more energy, but usefulness matters. Heating leftovers for an extra minute rarely feels irresponsible if it prevents food waste. In the same way, more complex AI prompts, which do consume slightly more energy, often replace hours of research, coordination, or repetitive work. In many cases, that tradeoff feels not only reasonable, but worthwhile.

The choice, then, is not really about avoiding AI. It is about avoiding waste.

When AI is used intentionally and applied to focused, meaningful tasks tied to real value, it often stands in for far larger energy, time, or resource costs elsewhere. And because we are still early in this transition, the habits we form now will help determine whether AI demand grows recklessly or settles into something more deliberate and constructive.

A PRACTICAL PATH FORWARD

While we are proponents of artificial intelligence, we believe responsible use at both individual and organizational scales starts with a few simple ideas:

Use AI with Intent
If a task is trivial without AI, do it without AI. If AI meaningfully accelerates work that supports human wellbeing, creativity, equity, or sustainability, use it with confidence.

Encourage Institutional Guidelines
Not to restrict creativity, but to define purpose. A culture of thoughtful use inside companies can ripple outward quickly.

Early Habits Matter
How we choose to use AI today sets the tone for how society builds, regulates, and values it tomorrow.

AI is neither inherently harmful nor inherently sustainable. It carries both promise and pressure. Unchecked growth could undermine decades of progress on climate goals. At the same time, walking away from this technology entirely is neither realistic nor desirable. Used intentionally and supported by cleaner infrastructure, AI has the potential to accelerate solutions we have not yet fully imagined.

Recycling did not solve pollution, but it did reshape public awareness around material lifecycles. Artificial intelligence sits at a similar inflection point. Purposeful use will not eliminate its environmental challenges, but it can help steer AI toward supporting scientific discovery, medical advancement, and a more regenerative built environment.

That is the path worth aiming for: to pursue progress through informed intent.


FOOTNOTES

  1. Crownhart, Casey and O’Donnell, James. “We did the math on AI’s energy footprint. Here’s the story you haven’t heard.” MIT Technology Review, 20 May 2025, https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/#:~:text=This%20is%20a%20time%2Dconsuming,San%20Francisco%20for%20three%20days ↩︎
  2. Luo, Hongyin and Sun, Wei. “Addition is All You Need for Energy-efficient Language Models.” arXiv, 1 October 2024, https://arxiv.org/abs/2410.00907 ↩︎
  3. Constantino, Tor. “Is Quantum Computing An Unlikely Answer to AI’s Looming Energy Crisis?” Forbes, 2 October 2024, https://www.forbes.com/sites/torconstantino/2024/10/02/is-quantum-computing-an-unlikely-answer-to-ais-looming-energy-crisis/ ↩︎
  4. Social media: assuming an average of 10.73 mAh per minute ↩︎
  5. Streaming Netflix: assuming an average of 0.08 kWh per hour ↩︎
  6. Compact sedan idling: assuming 0.16 gallons, or 19 MJ, per hour ↩︎
  7. Hot shower: assuming 0.5 kWh per minute ↩︎

About

Garrett Herbst

Based in Little’s Charlotte office, Garrett is the Director of Emerging Technologies and along with Grant Saso leads the firm’s exploration of advanced computation and Artificial Intelligence tools. A licensed architect and adjunct professor at UNC Charlotte, he balances research and practice while enjoying fabrication, model making, and tackling the never-ending house projects.
