Google Wants You To Know The Environmental Cost of Quizzing AI
PLUS: NASA & IBM Unleash “Surya”, an AI Forecaster for Solar Storms
Google Lifts the Lid on AI’s Real Environmental Cost

Google has published a technical deep-dive into the environmental impact of AI inference—that is, how much energy, carbon, and water each Gemini App text prompt consumes in real-world conditions. This data provides the most comprehensive picture to date of AI usage’s ecological footprint.
Key Points:
Tiny for a Prompt, Huge at Scale - A single Gemini text query uses just 0.24 watt-hours of energy, emits 0.03 grams of CO₂, and consumes 0.26 mL of water—about five drops. That’s roughly the energy of watching eight to nine seconds of a TV show.
Efficiency Gains That Matter - Over the past year, the average energy use per prompt has dropped 33×, and its carbon footprint 44×, thanks to hardware innovations, smarter software, and greener data center operations.
Transparent, Full-Stack Measurement - Google’s approach covers everything: active AI accelerators, idle machines, CPUs and RAM, cooling infrastructure, and data center overhead—not just GPU usage. This transparency enables reliable comparison and optimization across AI systems.
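To get a feel for what these per-prompt figures mean in aggregate, here is a back-of-the-envelope sketch. The per-prompt numbers (0.24 Wh, 0.03 g CO₂, 0.26 mL water) come from the article; the daily prompt volume of one billion is a purely hypothetical assumption for illustration, not a figure Google has published.

```python
# Per-prompt figures reported in Google's study.
ENERGY_WH = 0.24   # watt-hours per Gemini text prompt
CO2_G = 0.03       # grams of CO2e per prompt
WATER_ML = 0.26    # millilitres of water per prompt

def footprint(prompts: float) -> dict:
    """Aggregate footprint for a given number of prompts."""
    return {
        "energy_mwh": prompts * ENERGY_WH / 1e6,   # Wh  -> MWh
        "co2_tonnes": prompts * CO2_G / 1e6,       # g   -> tonnes
        "water_m3":   prompts * WATER_ML / 1e6,    # mL  -> cubic metres
    }

# Hypothetical volume: one billion prompts per day.
daily = footprint(1e9)
print(daily)  # {'energy_mwh': 240.0, 'co2_tonnes': 30.0, 'water_m3': 260.0}
```

Even with tiny per-prompt costs, a billion daily prompts would draw hundreds of megawatt-hours, which is why the conclusion below stresses the collective impact at scale.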
Conclusion
Google’s study sets a new standard for environmental accountability in AI, revealing just how resource-efficient prompt-based interactions can be—even within massive, latency-sensitive systems. However, while individual usage seems negligible, the collective impact across billions of prompts demands continued innovation, transparency, and greener infrastructure. This is a pivotal move toward truly responsible AI deployment.
NASA & IBM Unleash “Surya” - An AI Forecaster for Solar Storms

NASA and IBM have jointly released Surya, a powerful, open-source AI foundation model that forecasts solar storms—from flares to solar wind—by analyzing nine years of high-resolution solar imagery from NASA's Solar Dynamics Observatory (SDO). With the ability to visually predict solar flares up to two hours in advance and around 16% greater accuracy than prior methods, Surya marks a new era in space weather forecasting.
Key Points:
Predictive Power Meets Visual Precision - Surya doesn't just detect solar flares - it visually forecasts their exact location on the Sun's surface, offering a clear, high-resolution preview of where solar eruptions will occur—up to two hours ahead.
Built on Massive SDO Data and Novel Architecture - Trained on nearly a decade of full-resolution (4096×4096 pixel) multi-channel data—including Atmospheric Imaging Assembly (AIA) and Helioseismic and Magnetic Imager (HMI) observations - Surya blends spatiotemporal transformers with spectral gating and long-short attention mechanisms to capture both fleeting and global solar dynamics.
Democratizing Space Weather with Open Science - Released under an open-source license via Hugging Face, GitHub, and IBM’s TerraTorch, Surya also comes with SuryaBench—a curated dataset and benchmarks for tasks like flare forecasting, solar wind prediction, active region segmentation, and EUV spectral modeling.
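The spectral gating mentioned above can be illustrated in toy form: transform a signal into the frequency domain, multiply it by a gate (learnable in a real model), and transform back. The NumPy sketch below shows the generic technique only - it is a hypothetical illustration, not Surya's actual implementation or API.

```python
import numpy as np

def spectral_gate(x: np.ndarray, gate: np.ndarray) -> np.ndarray:
    """Toy spectral gating: FFT along the last axis, elementwise
    multiply by a gate, then inverse FFT. In a trained model the
    gate would be a learned parameter; here it is fixed."""
    freq = np.fft.rfft(x, axis=-1)        # real FFT -> frequency domain
    gated = freq * gate                   # suppress/boost frequency bands
    return np.fft.irfft(gated, n=x.shape[-1], axis=-1)

# Example: low-pass gate that strips a high-frequency component.
t = np.linspace(0, 1, 256, endpoint=False)
signal = np.sin(2 * np.pi * 3 * t) + 0.3 * np.sin(2 * np.pi * 60 * t)
gate = np.zeros(129)                      # rfft of a length-256 input has 129 bins
gate[:10] = 1.0                           # keep only the lowest frequencies
smoothed = spectral_gate(signal, gate)    # 60 Hz component is removed
```

Applied across space and time instead of a 1-D signal, this kind of frequency-domain filtering is one way a model can separate slowly evolving global structure from fast, transient activity.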
Conclusion
Surya represents a visionary leap in heliophysics—transforming solar data into actionable, visual forecasts that could safeguard satellites, power grids, and navigation systems. By openly sharing Surya and its tools, NASA and IBM are empowering scientists globally to innovate in space weather mitigation. This model isn’t just about predicting the Sun—it’s about preparing Earth.
Runway Unveils Game Worlds Beta: AI Meets Non-Linear Storytelling
Runway has officially launched the Game Worlds Beta, a pioneering AI-powered platform designed to drive non-linear, immersive narratives—melding game dynamics with AI storytelling. Initially announced via their X (formerly Twitter) channel, this beta brings creators a fresh sandbox for dynamic, interactive world-building.
Key Points:
AI-Guided, Non-Linear Narratives - Game Worlds Beta empowers creators to craft branching stories and dynamic experiences with AI at the narrative helm—a creative leap from static visuals toward interactive world-building.
Runway’s Evolution from Tools to Worlds - Known for tools like Gen-4 and Aleph, Runway is elevating its platform—shifting from content generation to fully AI-driven simulation environments that redefine storytelling.
Promising Industry Buzz - Early responses from creators and AI communities—especially on Reddit and X—highlight excitement and curiosity around Game Worlds Beta’s potential. It’s already sparking conversations about the future of interactive AI-powered storytelling.
Conclusion
Runway’s Game Worlds Beta signals a bold evolution: it's no longer just about generating videos or images but about crafting living, adaptable AI-driven narratives. As the platform shifts from static creativity to interactive storytelling, its potential extends into games, film, education, and beyond.
Thank you for reading.