    The Power & Energy Crunch for AI
    Indian Market & Economy
    Religare Broking
    April 30, 2026

    Demand for artificial intelligence (AI) is soaring everywhere, and the power needed to run the data centres behind it is creating a new crunch in the energy sector. AI-integrated technology is now present in almost every field, putting ever more pressure on data centres. AI-enabled chatbots like ChatGPT, Gemini, Meta AI and Grok process huge amounts of data, which requires an enormous level of power to run the data centres that host them.

    This surge in power demand challenges the energy sector to keep AI running smoothly. Let's look at how the massive use of AI is creating an energy crunch for data centres, including the key factors behind such high power demand, future projections, and how AI itself can help deal with the energy crisis.

    Read also: AI Infrastructure Boom in 2026

    AI Energy Consumption

    Generating results with AI models takes an immense amount of energy. A single AI query can use around ten times the energy of a normal Google search, while a video generated through AI can consume up to 110 Wh, roughly what a typical laptop consumes over several hours of use. Nor is it only queries: AI's power consumption spreads across every activity that supports the infrastructure of AI development, deployment, and usage. Let's look at the key drivers of AI power consumption that make energy demand so crucial for this sector.

    1. Query Generation: Answering even a simple query can consume between 0.17 and 2 watt-hours (Wh) of energy per request. ChatGPT alone processes approximately 2 billion queries a day.
    2. Image Generation: Generating a single high-quality AI image can consume around 0.002 to 0.5 kWh, roughly comparable to fully charging a smartphone. Generating 1,000 images can produce as much CO2 as driving a car about 4.1 miles.
    3. Video Generation: Creating a video through AI requires around 20–110 Wh, though consumption varies with the model and the length of the video.
    4. Model Development: Training is another stage where AI consumes enormous amounts of energy. Training a model like GPT-3 reportedly consumed around 700,000 litres of freshwater for cooling, along with enough electricity to power hundreds of households for a year.
    5. Powering & Cooling the Data Centre: AI data centres are the most power-hungry segment, with energy demand as high as that of 100,000 households. An immense power supply is needed not only to run the GPUs but also to keep the data centres cool.
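The per-query figures above can be turned into a rough yearly total. This is a back-of-envelope sketch in Python; the 0.17–2 Wh range and the 2 billion daily queries come from the text, and everything else is simple unit conversion:

```python
# Back-of-envelope: yearly energy for ChatGPT-scale query volume.
# Figures from the article: 0.17-2 Wh per query, ~2 billion queries per day.

WH_PER_QUERY_LOW = 0.17
WH_PER_QUERY_HIGH = 2.0
QUERIES_PER_DAY = 2_000_000_000

def yearly_energy_twh(wh_per_query: float, queries_per_day: float) -> float:
    """Convert per-query watt-hours into terawatt-hours per year."""
    wh_per_day = wh_per_query * queries_per_day
    wh_per_year = wh_per_day * 365
    return wh_per_year / 1e12  # 1 TWh = 1e12 Wh

low = yearly_energy_twh(WH_PER_QUERY_LOW, QUERIES_PER_DAY)
high = yearly_energy_twh(WH_PER_QUERY_HIGH, QUERIES_PER_DAY)
print(f"Yearly query energy: {low:.2f}-{high:.2f} TWh")
```

Even the high end of this range, about 1.5 TWh a year, is a small slice of the roughly 415 TWh total discussed below, which suggests that training, cooling and the wider data centre infrastructure, rather than individual queries, dominate AI's energy bill.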

    AI Energy Consumption Forecast

    Currently, AI power consumption stands at around 415 terawatt-hours (TWh), about 1.5% of global electricity demand. As per industry reports, it is expected to more than double by 2030, reaching roughly 945–1,350 TWh, driven mainly by the deployment of AI-optimised servers. A large AI data centre campus can draw around 100 MW, and AI-ready sites are scaling up from 10–20 MW to over 1 GW, enough to power around 800,000 homes. The key drivers of AI power consumption in the coming years will be ever larger model training and the infrastructure needed to constantly power and cool the data centres.
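The jump from roughly 415 TWh today to a projected 945–1,350 TWh by 2030 implies a steep compound growth rate. A quick sketch, assuming a five-year horizon (the article's figures do not state the base year explicitly):

```python
# Implied compound annual growth rate (CAGR) for AI electricity demand,
# from ~415 TWh today to a projected 945-1,350 TWh by 2030.
# Assumption (not stated in the article): a five-year growth horizon.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

BASE_TWH = 415
YEARS = 5

low = cagr(BASE_TWH, 945, YEARS)
high = cagr(BASE_TWH, 1350, YEARS)
print(f"Implied growth: {low:.1%} to {high:.1%} per year")
```

That works out to roughly 18–27% growth per year, far faster than overall electricity demand, which is why grid planners treat AI as a distinct load category.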

    Additional Read: The Next Phase of Artificial Intelligence

    AI Energy Consumption Problem

    The rapid increase in AI energy consumption is straining electricity generation and distribution companies. Owing to massive AI demand, electricity consumption in data centres is forecast to double by 2030. With queries flowing through various AI models around the clock, meeting this demand is becoming a new problem for power grids.

    1. Exponential Demand: AI data centres are becoming industrial-scale consumers of energy. Some already consume as much electricity as 100,000 households, and the biggest upcoming projects may draw up to 20 times more.
    2. Extra Stress on the Grid: Based on current trends in AI usage, AI data centres are projected to consume 1,050 TWh of electricity by 2030. This rapid increase will strain power grids, making it difficult to keep data centres running smoothly.
    3. Environmental Effect: Generating this much electricity increases demand for fossil fuels, and cooling data centres requires massive amounts of water, straining local water supplies. Both factors have an adverse environmental impact.
    4. Training and Inference: AI training, testing and development consume massive amounts of energy, and inference, the serving of queries to billions of users globally, adds further to the potential energy crisis.

    Energy Requirements for AI Data Centers

    An average-sized data centre demands around 5–10 MW, while large hyperscale data centres can consume 100 MW or more, with yearly electricity consumption equivalent to that of roughly 350,000 to 400,000 electric cars. The energy demand from AI is mainly for running and cooling the data centres. AI-optimised servers are expected to push power usage up nearly fivefold, from 93 TWh in 2025 to 432 TWh in 2030.
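The hyperscale figures above can be sanity-checked with simple arithmetic. Assuming a 100 MW facility running at full load around the clock (a simplifying assumption; real load varies), its yearly energy and the implied per-car figure behind the electric-vehicle comparison work out as:

```python
# Yearly energy of a 100 MW hyperscale data centre, and the implied
# per-vehicle figure behind the 350,000-400,000 electric car comparison.
# Assumption: the facility draws its full 100 MW around the clock.

FACILITY_MW = 100
HOURS_PER_YEAR = 24 * 365  # 8,760 h

yearly_mwh = FACILITY_MW * HOURS_PER_YEAR   # 876,000 MWh
yearly_gwh = yearly_mwh / 1_000             # 876 GWh

# Split across 350,000-400,000 electric cars:
kwh_per_car_high = yearly_mwh * 1_000 / 350_000
kwh_per_car_low = yearly_mwh * 1_000 / 400_000
print(f"{yearly_gwh:.0f} GWh/year, "
      f"about {kwh_per_car_low:.0f}-{kwh_per_car_high:.0f} kWh per car")
```

Around 2,200–2,500 kWh per car per year is roughly consistent with a typical EV driving 12,000–15,000 km at 15–18 kWh per 100 km, so the article's comparison is plausible.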

    Power Demand from AI Data Centers to Quadruple in 10 years

    As per industry reports, electricity demand from AI data centres is expected to quadruple within the next ten years, with AI-driven data centres projected to consume 1,600 terawatt-hours by 2035, a significant share of global electricity consumption. Data centres are shifting from traditional CPU racks, where chips draw around 150–200 W, to high-density AI GPU racks running at 700 W to over 1,200 W per chip, sharply raising the power demand per rack. Similarly, large hyperscale facilities and the expansion of AI data centres require more power, putting more strain on power grids and placing around 20% of projects at risk of delays due to shortages of energy supply.

    However, AI can also help limit this demand by optimising the grid and supporting renewable energy management to produce lower-cost electricity. In addition, more power-efficient chips and advanced cooling systems can help manage AI-related energy demand.

    Read also: How AI Will Change Stock Broking in India

    Energy Efficiency in AI Computing

    Training and deploying large-scale AI models is already driving high energy demand and is expected to double data centre electricity consumption by 2030. However, AI can also help optimise energy production and consumption through the following techniques.

    1. Optimised Power Management: AI can be deployed in power distribution with smart electricity meters, adjusting the supply as demand shifts across different sectors. It can also help reduce the load on the grid and predict possible breakdowns and energy wastage.
    2. Improved Efficiency: Total energy demand from AI is rising, but thanks to better algorithms and model design, the energy consumed per unit of computing has improved by a factor of roughly 100,000 over the past decades.
    3. Quantization of AI Models: Quantization reduces the numerical precision of a model's weights, shrinking model size and speeding up inference with little loss of accuracy, which lowers the energy consumed per query.
    4. Efficient Hardware Systems: Accelerators such as NVIDIA's GPUs and Google's TPUs are far more power-efficient at the matrix multiplication that dominates AI workloads than general-purpose CPUs, and chip makers continue to push efficiency further.
    5. AI-Optimised Data Centres: AI-based monitoring can manage AI infrastructure itself, optimising cooling and heat management and scheduling workloads, which helps optimise the energy consumption of AI data centres.
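To illustrate the quantization point above, here is a minimal sketch of symmetric int8 post-training quantization in NumPy. This is a generic illustration, not the method of any particular model: weights stored as 8-bit integers take a quarter of the memory of 32-bit floats, and integer arithmetic is cheaper per operation.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric int8 quantization: map floats onto [-127, 127]."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 values."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)  # toy weight matrix

q, scale = quantize_int8(w)
w_approx = dequantize(q, scale)

# 4x smaller storage, with only a small reconstruction error.
print("size ratio:", w.nbytes / q.nbytes)           # 4.0
print("max abs error:", np.abs(w - w_approx).max())
```

The rounding error per weight is at most half the scale factor, which is why accuracy loss stays small for well-behaved weight distributions; production systems refine this with per-channel scales and calibration data.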

    Conclusion

    Overall, AI energy consumption has surged to a new level, mainly due to the power demand of AI data centres. A single AI query consumes roughly ten times the energy of a normal search engine query, and the development of more advanced models for more complex problems will push demand for data centres further. Powering and cooling these facilities could multiply electricity demand several times over the next ten years, straining power grids and harming the environment through greater water and fossil fuel consumption. However, expanding power generation capacity, developing more efficient AI chips and better data centre management can help deal with this energy crisis.

    Frequently Asked Questions (FAQs)

    Why is AI causing a surge in power demand?

    AI models, especially large-scale systems like generative AI and machine learning algorithms, require massive computational power. Training and running these models in data centres consume significant electricity, leading to a sharp rise in global energy demand.

    How much energy does AI consume compared to traditional computing?

    AI workloads can consume several times more energy than traditional computing tasks. For instance, training a large AI model may use as much electricity as thousands of households consume in a year.

    What is the role of data centres in AI energy consumption?

    Data centres are the backbone of AI operations. They house powerful GPUs and servers that process AI workloads, making them one of the largest contributors to increased electricity consumption globally.

    Why are GPUs more energy-intensive for AI workloads?

    Graphics Processing Units (GPUs) are designed for parallel processing, which is ideal for AI tasks. However, this high-performance capability also leads to greater power consumption compared to standard CPUs.

    How does AI impact global electricity demand?

    The rapid adoption of AI is expected to significantly increase global electricity demand over the next decade, putting pressure on existing power infrastructure and energy resources.

    Can renewable energy solve the AI energy crisis?

    Renewable energy sources like solar and wind can help offset AI’s energy demand. However, challenges such as intermittency and storage limitations mean they cannot fully solve the issue without supportive infrastructure.

    What are tech companies doing to reduce AI energy consumption?

    Major tech companies are investing in energy-efficient hardware, optimised AI models, and green data centres powered by renewable energy to reduce their carbon footprint.

    What is “green AI”?

    Green AI refers to developing and deploying AI systems that are energy-efficient and environmentally sustainable, focusing on reducing computational costs and emissions.

    How does AI contribute to carbon emissions?

    AI contributes to carbon emissions through high electricity usage, especially when powered by fossil fuels. Training large AI models can produce a carbon footprint comparable to multiple transatlantic flights.

    Will AI lead to an energy crisis in the future?

    If not managed properly, the growing demand for AI could strain global energy supplies. However, advancements in energy efficiency, infrastructure, and renewable adoption can help mitigate this risk.

    Are there any regulations on AI energy consumption?

    Currently, there are limited direct regulations specifically targeting AI energy use. However, broader environmental and energy policies are increasingly influencing how AI infrastructure is developed and operated.

    How can AI itself help solve the energy crisis?

    AI can optimise energy grids, improve efficiency in power generation, and enhance renewable energy forecasting, making it a powerful tool to address the very energy challenges it creates.

    What industries are most affected by AI’s energy consumption?

    Industries such as cloud computing, finance, healthcare, and autonomous systems are heavily reliant on AI, making them key contributors to increased energy usage.

    Is AI energy consumption a concern for investors?

    Yes, rising energy costs and sustainability concerns are becoming important factors for investors evaluating tech companies, especially those heavily invested in AI infrastructure.

    What is the future of energy-efficient AI?

    The future lies in developing smaller, more efficient models, specialised chips, and integrating AI systems with sustainable energy sources to balance innovation with environmental responsibility.
