“We’re maybe at 1% of where the AI adoption will be in the next two to three years. The world is actually headed for a really bad energy crisis because of AI unless we fix a few things,” said the head of an AI solutions firm.

The following report is by Yahoo Finance:

Artificial intelligence is expected to have a bigger impact on practically everything than any technology since the advent of the internet. Wall Street sure thinks so. The tech-heavy Nasdaq is up 26% year to date thanks to the frenzy over AI-related stocks.

But AI’s big breakout comes at a cost: much more energy.

Take, for example, OpenAI’s chatbot ChatGPT. Research done at the University of Washington shows that hundreds of millions of queries on ChatGPT can cost around 1 gigawatt-hour a day, or roughly the daily energy consumption of 33,000 US households.
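Those figures can be sanity-checked with back-of-envelope arithmetic. The exact query count and per-household usage are not stated in the report, so the values below (300 million queries a day, roughly 30 kWh a day per US household) are illustrative assumptions:

```python
# Back-of-envelope check of the ChatGPT energy figures quoted above.
# Assumed values (not from the report): ~30 kWh/day per US household,
# 300 million queries per day.

DAILY_ENERGY_WH = 1e9          # 1 gigawatt-hour, in watt-hours
HOUSEHOLD_WH_PER_DAY = 30e3    # ~30 kWh/day for a typical US household
QUERIES_PER_DAY = 300e6        # "hundreds of millions" of queries

households = DAILY_ENERGY_WH / HOUSEHOLD_WH_PER_DAY
wh_per_query = DAILY_ENERGY_WH / QUERIES_PER_DAY

print(f"Equivalent households: {households:,.0f}")  # ~33,333
print(f"Energy per query: {wh_per_query:.1f} Wh")   # ~3.3 Wh
```

Under those assumptions the numbers line up: about 33,000 households, and a few watt-hours per query, which is consistent with the 10–100x-an-email comparison quoted below.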

The energy consumption of something like a ChatGPT inquiry, compared to some inquiry on your email, for example, is going to be probably 10 to 100 times more power hungry.

Sajjad Moazeni, professor of electrical and computer engineering at the University of Washington, told Yahoo Finance.

Industry participants say this is only the very beginning of what’s to come.

We’re maybe at 1% of where the AI adoption will be in the next two to three years. The world is actually headed for a really bad energy crisis because of AI unless we fix a few things.

Said Arijit Sengupta, founder and CEO of Aible, an enterprise AI solutions company.

Data centers are the heart of the advanced computing process. They are the physical locations housing thousands of processing units and servers at the core of the cloud computing industry, which is largely managed by Google, Microsoft, and Amazon.

As you think of this shift towards these larger foundation models, at the end of the day you’re going to need these data centers to require a lot more energy as a whole.

Angelo Zino, VP and senior equity analyst at CFRA Research, told Yahoo Finance.

Data centers have increasingly shifted from using simpler processors, called CPUs, to more advanced graphics processing units, or GPUs. Those components, made by companies like Nvidia, are the most energy intensive.

For the next decade, GPUs are going to be the core of AI infrastructure. And GPUs consume 10 to 15 times as much power per processing cycle as CPUs do. They’re very energy intensive.

Energy consumption is going to dramatically increase on a global scale, simply because of the energy-intensive nature of AI. But if you look at the nuances, what’s interesting is AI is also incredibly efficient at things that humans are not as efficient at.

Explained Brady Brim-Deforest, CEO of Formula Monks, an AI technology consulting company.

‘Huge Massive Infrastructure Cost’

Research done by Benjamin C. Lee, professor of electrical engineering and computer science at the University of Pennsylvania, and professor David Brooks of Harvard showed that data center energy usage grew 25% a year on average between 2015 and 2021. This was before generative AI grabbed national headlines and ChatGPT usage skyrocketed.

Meanwhile, US Energy Information Administration data revealed an annual growth rate in renewable deployment of 7% during the same period, though that number is expected to increase with initiatives like the Inflation Reduction Act.
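The gap those two growth rates open up compounds quickly. A rough illustration, starting both quantities at an arbitrary index of 100 over the 2015–2021 window the researchers studied:

```python
# Illustration of the growth-rate gap described above: data center energy
# grew ~25%/year (2015-2021) while renewable deployment grew ~7%/year.
# Both start at an arbitrary index of 100 to show relative divergence.

dc_energy, renewables = 100.0, 100.0
for year in range(2015, 2022):   # seven years of compounding
    dc_energy *= 1.25
    renewables *= 1.07

ratio = dc_energy / renewables
print(f"After 7 years: data centers {dc_energy:.0f}, "
      f"renewables {renewables:.0f} (gap factor {ratio:.1f}x)")
```

At those rates, data center energy roughly quintuples in seven years while renewables grow by about 60%, leaving data center demand growing nearly three times faster in index terms.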

There’s already this fairly large gap between the growth rates of data center energy and renewable energy deployments.

We call it cloud computing; it feels like there’s no cost associated with it. There’s a huge massive infrastructure cost.

Lee told Yahoo Finance.

To counteract such consumption, the major cloud providers like Google Cloud, Microsoft Azure, and Amazon Web Services all invest in renewable energy to match their annual electricity consumption. They hold net-zero pledges, meaning they remove as much carbon as they emit.

Microsoft’s Azure has touted its 100% carbon neutral status since 2012 and says that by 2030 it will be carbon negative. Amazon has said it expects to power its operations with 100% renewable energy by 2025, as part of its goal to reach net-zero carbon emissions by 2040. For its part, Google aims to achieve net-zero emissions across all of its operations by 2030.

Net zero doesn’t mean you’re carbon-free potentially. There will be hours of the day where you don’t have enough sun or enough wind, but you’re still going to be drawing energy straight from the grid at whatever mix the grid will provide to you.

Said Lee.

He is studying ways in which data centers can shift or reschedule their computation based on the availability of carbon-free energy.

Maybe you compute much more in the middle of the day when there’s a lot of solar energy and you compute much less in the middle of the night.

Said Lee.
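The rescheduling idea Lee describes can be sketched in a few lines: given an hourly grid carbon-intensity profile, run deferrable compute in the cleanest hours. The profile values below are hypothetical, chosen only to mimic a day with midday solar:

```python
# A minimal sketch of carbon-aware scheduling: pick the hours of the day
# with the lowest grid carbon intensity for deferrable compute jobs.

def cleanest_hours(intensity_by_hour, hours_needed):
    """Return the hours (sorted) with the lowest carbon intensity."""
    ranked = sorted(range(len(intensity_by_hour)),
                    key=lambda h: intensity_by_hour[h])
    return sorted(ranked[:hours_needed])

# Hypothetical daily profile (gCO2/kWh): dirtier overnight,
# cleaner around midday when solar output peaks.
intensity = [500, 510, 520, 515, 505, 480, 430, 360,
             280, 210, 160, 130, 120, 125, 150, 200,
             270, 350, 420, 470, 495, 505, 510, 505]

print(cleanest_hours(intensity, 6))  # -> [10, 11, 12, 13, 14, 15]
```

With this profile the scheduler lands the workload on the midday hours, exactly the "compute much more in the middle of the day" behavior Lee describes; real systems would use live grid data rather than a fixed profile.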

It’s no surprise, then, that the energy usage challenge has created a crop of companies aimed at creating more efficient ways of using AI models.

We can literally cut down energy use … in these [types] of AI workloads by one-third by just moving to serverless.

Aible’s Sengupta told Yahoo Finance, describing a technology that uses server resources on demand for more efficiency.

Analysts point out that reducing costs will naturally drive the industry towards energy solutions.

Whether it’s because of emissions or financial efficiencies or investor pressure or anything else, we do see companies looking more at how to be more efficient. It’s an operational cost. The more efficient you are, the lower your operational cost.

Said Tegan Keele, KPMG’s US climate data and technology leader.

As Zino of CFRA Research pointed out, the winners in the space are the data center operators.

The actual data usage and how all this comes together is going to be more concentrated to a couple of companies out there.

More and more companies are essentially renting space in the cloud rather than kind of investing and building their own data centers, just because in the future I think it’s going to be a lot more costly in nature.

Said Zino.

AUTHOR COMMENTARY

So, once again, we have the geeks, technocrats, and politicians touting and investing in AI, promoting it as a savior to the masses, while at the same time talking about conserving energy and lowering the carbon footprint, even though the reliance upon AI will continue to consume vastly more energy, both forcing prices up and frying the grid at the same time!

As a thorn goeth up into the hand of a drunkard, so is a parable in the mouth of fools.

Proverbs 26:9

Again, let me remind you that the green agenda has nothing to do with going green: it’s just a propaganda spin curated to cover up the real goals of greater control.


[7] Who goeth a warfare any time at his own charges? who planteth a vineyard, and eateth not of the fruit thereof? or who feedeth a flock, and eateth not of the milk of the flock? [8] Say I these things as a man? or saith not the law the same also? [9] For it is written in the law of Moses, Thou shalt not muzzle the mouth of the ox that treadeth out the corn. Doth God take care for oxen? [10] Or saith he it altogether for our sakes? For our sakes, no doubt, this is written: that he that ploweth should plow in hope; and that he that thresheth in hope should be partaker of his hope. (1 Corinthians 9:7-10).

The WinePress needs your support! If God has laid it on your heart to want to contribute, please prayerfully consider donating to this ministry. If you cannot gift a monetary donation, then please donate your fervent prayers to keep this ministry going! Thank you and may God bless you.


2 Comments

  • I’m just a dumb mechanic and even I could have told ya that it would take more electric to run AI.
    More electronics=more electric power; do these green morons even know how ya make electricity?
    Just as Jacob states, it’s a complete control agenda.

  • Proverbs 26:15 The slothful hideth his hand in his bosom; it grieveth him to bring it again to his mouth.

    from the true word of God the KJV
