
Why AI must shrink to reach its enterprise potential


From copilots and chatbots to advanced analytics and automation, AI systems are now embedded in how organizations operate and compete. Yet as adoption accelerates, a less visible issue is coming sharply into focus: energy.

Enrique Lizaso, Co-founder and CEO at Multiverse Computing.


Training and running large language models (LLMs) requires enormous computational resources, and every additional layer of complexity translates directly into higher energy use.


This trajectory raises a critical question for the future of AI: how long can innovation continue on a path that depends on ever-increasing power consumption?

Power constraints are shaping AI’s future

The AI industry has spent the past decade chasing scale. Larger models, more parameters and bigger datasets have driven impressive gains in performance. At the same time, the cost of delivering those gains has risen sharply.


Electricity prices, grid capacity and data center availability are no longer background considerations. They are becoming limiting factors. In many regions, access to sufficient power is now a strategic constraint, shaping where AI infrastructure can be built and who can afford to use it.

For businesses, this creates growing tension. Advanced AI promises efficiency and competitive advantage, yet the operational costs of running large models can be prohibitive. For governments and regulators, the challenge is even broader: balancing AI-led economic growth with sustainability targets and grid resilience.

Without changes in how AI systems are built and deployed, energy demand risks slowing progress at exactly the moment when momentum is strongest.


Cost-effective AI is essential for wider adoption

The conversation around democratizing AI often focuses on access to tools or models. In practice, affordability plays an equally important role. If advanced AI remains expensive to run, its benefits will concentrate in the hands of a few large organizations with the deepest pockets and the most robust infrastructure.

Most companies do not need the largest model available. They need systems that deliver reliable results at a predictable cost. That applies just as much to public sector organizations, manufacturers and mid-sized enterprises as it does to startups.

Energy-efficient AI lowers the barrier to entry. Reduced power requirements mean lower operational costs, simpler deployment and fewer infrastructure constraints. For data centers, this translates into more efficient use of existing capacity, reduced cooling demands and less need for constant expansion.

Optimized models allow organizations to do more with the infrastructure they already have, easing pressure on energy supply while improving overall economics.

Efficiency also enables new deployment models. Smaller, compressed AI systems can run locally on devices such as smartphones, laptops, vehicles and even home or industrial appliances.

By bringing intelligence closer to where data is generated, organizations can reduce latency, improve reliability and limit dependence on centralized cloud infrastructure. For many use cases, this is a practical advantage as well as a sustainability win.


Smaller models can still deliver strong results

There is a widespread assumption that cutting down models inevitably means sacrificing accuracy. Advances in model optimization are challenging that idea.

Techniques such as quantization, pruning and knowledge distillation allow LLMs to be significantly reduced in size while preserving performance on real-world tasks.

This allows organizations to deploy efficient AI models in environments where large-scale systems would be impractical or uneconomical, without sacrificing the performance required for enterprise applications.
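As an illustrative sketch of how two of these techniques combine, the toy example below applies magnitude pruning (zeroing the weights closest to zero) and symmetric int8 quantization to a random matrix standing in for a single model layer. The numbers, shapes and 80% pruning ratio are hypothetical choices for demonstration, not a production compression pipeline:

```python
import numpy as np

# Toy stand-in for one LLM weight matrix (random values, fp32).
rng = np.random.default_rng(0)
weights = rng.normal(size=(256, 256)).astype(np.float32)

# Magnitude pruning: zero out the 80% of weights closest to zero.
threshold = np.quantile(np.abs(weights), 0.80)
pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)

# Symmetric int8 quantization of the surviving weights:
# map the range [-max, +max] onto [-127, 127] with one scale factor.
scale = float(np.abs(pruned).max()) / 127.0
quantized = np.round(pruned / scale).astype(np.int8)

# Compare dense fp32 storage with storing only the surviving
# int8 values (index overhead for sparsity is ignored here).
dense_bytes = weights.nbytes                # 4 bytes per weight
sparse_bytes = int((quantized != 0).sum())  # 1 byte per survivor
saving = 100 * (1 - sparse_bytes / dense_bytes)
print(f"dense fp32: {dense_bytes} bytes")
print(f"pruned int8 values: {sparse_bytes} bytes ({saving:.0f}% smaller)")

# Dequantize and check the surviving weights are closely approximated.
recovered = quantized.astype(np.float32) * scale
err = float(np.abs(recovered - pruned).max())
print(f"max reconstruction error: {err:.4f}")
```

Real compression pipelines are more sophisticated (per-channel scales, calibration data, retraining after pruning), but the arithmetic is the same: fewer stored values at lower precision means a much smaller memory footprint.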


The impact is dramatic. Compressed models can be up to 95% smaller, requiring far less memory and compute. That reduction translates directly into lower energy consumption and faster inference, while maintaining the level of accuracy organizations expect.
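To make the scale of that reduction concrete, here is back-of-envelope arithmetic for a hypothetical 7-billion-parameter model (illustrative numbers, not benchmarks of any specific system):

```python
# Memory footprint of a hypothetical 7B-parameter model at
# different precisions, and after a 95% size reduction.
params = 7_000_000_000

fp32_gb = params * 4 / 1e9       # 32-bit floats: 4 bytes per weight
fp16_gb = params * 2 / 1e9       # 16-bit floats: 2 bytes per weight
compressed_gb = fp32_gb * 0.05   # 95% smaller than the fp32 baseline

print(f"fp32 baseline:  {fp32_gb:.1f} GB")   # 28.0 GB
print(f"fp16:           {fp16_gb:.1f} GB")   # 14.0 GB
print(f"95% compressed:  {compressed_gb:.1f} GB")  # 1.4 GB
```

At that size, a model that once demanded data-center GPUs fits comfortably in the memory of a laptop or edge device, which is what makes the local deployment scenarios described above plausible.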

This approach shifts the emphasis from brute-force scaling to intelligent design. Rather than treating size as a proxy for quality, it prioritizes efficiency, precision and real-world applicability.

Sustainability and competitiveness go hand in hand

As AI becomes a core part of digital infrastructure, its environmental footprint will increasingly matter. Businesses are under pressure to meet ESG commitments, and customers are paying closer attention to how digital services are delivered. Governments, meanwhile, are assessing how AI fits into long-term energy planning.


Energy-efficient AI aligns with all of these priorities. Lower power consumption reduces emissions, eases strain on grids and improves the economics of deployment. It also makes AI more resilient, less dependent on scarce resources and better suited to global scale.

The shift toward efficiency does not require slowing innovation. On the contrary, it creates room for growth by removing one of the most significant constraints facing the industry.

Building the next phase of AI

The next chapter of AI will be shaped less by how large models can become and more by how effectively they can be deployed. Progress depends on systems that are powerful, practical and sustainable.


Achieving that balance requires collaboration across the ecosystem – from researchers developing leaner architectures to organizations rethinking how and where AI is deployed. It also calls for a broader definition of innovation, one that values efficiency alongside raw performance.

AI has the potential to transform industries, improve productivity and address complex global challenges. Ensuring that transformation remains accessible and sustainable will determine how widely those benefits are shared.

Solving AI’s energy challenge is part of that work. Done well, it opens the door to a future where advanced intelligence is not limited by power consumption, but enabled by smarter design.



This article was produced as part of TechRadarPro’s Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

