I’ve said it before, and I’ll say it again: Fuck this “AI” nonsense, the techbros shoving it into everything, and the Bitcoin cryptobros that came before them.
I don’t understand, clean nuclear power has never been easier. Why not just build some current gen nuke plants?
As I understand it, planning and building new grid-scale nuclear power plants takes 10-20 years. While this isn’t a reason not to start that process now, it does mean something needs to fill the demand gap until the nuke plants (and other clean sources) come online to displace the dirty generation, or demand has to be artificially held down through usage regulation or techniques like rolling blackouts, all of which I imagine would be pretty unpalatable.
I think it’s fair to predict energy consumption will continue to rise. With that timescale, it’s basically “the best time to plant a tree was 20 years ago, the second best time is today”. Doesn’t solve the immediate issue, but if we keep not starting new nuclear projects, it’s going to remain an issue forever.
Oh, I totally agree – didn’t mean to give any impression otherwise. Filling the energy demand gap as quickly as possible with the least impactful generation source should be very high on the list of societal goals, IMO. And it seems like that is what’s happening, mostly. Solar, wind, and storage are the largest share of what’s being brought online this year:
That’s an amazing chart!
It takes a long time to get a nuclear plant up and running. While it would be great to replace coal plants with nuclear, it wouldn’t help with all of the power being wasted on AI right now.
Time…
And a lot of concrete.
It takes a long time to see the climate gains from a nuclear reactor.
Hell, depending on size it can take a decade or longer to finish curing, and part of curing is releasing CO2 into the atmosphere.
I always bring up waste disposal, and always get waved away.
Nuclear waste isn’t that big of an issue.
That part is kind of overblown.
Hell, for nuclear waste from naval nuclear reactors, I’m pretty sure we still sell it to France. I know we did up to at least a decade ago. They just refine it again and keep using it.
If it’s radioactive nuclear waste, that means it’s still radioactive.
All you gotta do is get rid of the non-radioactive bits and it’s fuel again. By the time you can’t do that anymore due to a prohibitive cost-to-gain ratio, it’s not a big problem to get rid of it, because it’s not that radioactive.
The above comment is an example of this getting waved away.
I mean yeah…
Because that part should be…
I mean, statistically speaking I’m probably the only person that will see this thread who had the US government drop over six figures on teaching them nuclear engineering…
But feel free to do some googling about reusing spent fuel to verify for yourself.
This is the part that has always confused me. Radioactive “waste” should either be radioactive enough that it can continue to be used in some capacity, or it’s inert enough that it’s not too complicated to just bury it, given the relatively small scale. I guess I assumed that there must have been a large gap between being useful and being inert and that must have been the problem with managing waste, but if spent fuel can be refined back into new fuel and inert waste, then I don’t see the issue.
Yes, because if you read their previous comment you’ll see their primary concern is the CO2 released by curing concrete that is the equivalent of running a coal plant for DOZENS of seconds.
Very sustainable technology, this AI 😎
And the techbro AIs are mostly a novelty so far…
I mean, it’s all running on general purpose hardware. If we decoded 4K video on general purpose hardware we’d use more power than every AI company put together, but once that became popular we developed chips capable of decoding it at the hardware level that consume barely any power.
And the exact same thing is starting to happen with dedicated machine learning chips / NPUs.
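A rough sketch of that point, with made-up but plausible wattages (neither number is a measurement), just to show why fixed-function silicon changes the math:

```python
# Back-of-envelope: software vs. fixed-function 4K decode.
# Both wattages are assumed, illustrative figures, not measurements.

SOFTWARE_DECODE_W = 40.0   # assumed: CPU doing 4K decode entirely in software
HARDWARE_DECODE_W = 1.0    # assumed: dedicated decoder block on the same chip
HOURS_WATCHED = 2.0

software_wh = SOFTWARE_DECODE_W * HOURS_WATCHED
hardware_wh = HARDWARE_DECODE_W * HOURS_WATCHED

print(f"software decode: {software_wh:.0f} Wh, hardware decode: {hardware_wh:.0f} Wh")
print(f"roughly {software_wh / hardware_wh:.0f}x less energy with dedicated silicon")
```

The bet behind NPUs is the same: move the common inference work onto dedicated silicon and the per-task energy drops by an order of magnitude or more.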
Let’s have a round of slow claps for the tech industry.
First the “whole-ass country’s worth of energy use to make fake money” that is bitcoin, and now this?
Lovely.
How fucking convenient.
This is a bad article, with a misleading headline.
It shows no direct connection between the two; it just talks about how AI models are less power efficient than search engines, and then talks about how all industries, including normal non-AI data centers, manufacturing, etc., are all increasing power usage.
There’s that window closed.
There was always going to be something that prevented this; they never seriously wanted it to happen, since that would hurt fossil fuel donations to politicians.
I like conspiracy theories as much as the average bloke does, but have you seen quarterly reports from big tech? Their energy consumption and costs are skyrocketing. Are they in cahoots with big oil? Big coal? What?
AI is the excuse to burn more fossil fuels.
If it was just about AI, it could be used as an excuse to build up renewables.
What I meant was: if the excuse wasn’t AI, it would be something else.
Neither the current admin (nor trump if he wins) has any desire to cut back on fossil fuels. It’s one of the few things propping up “the economy”, which is why Biden is shattering the domestic fossil fuel production records that trump set just a few years ago.
There is tangible data on how much energy it’s using.
…
I’m sorry, I just don’t seem to follow this train of comments.
Each reply just seems kind of random and not related to what I’m saying.
AI is not an excuse to burn fossil fuels. AI exploded and its energy consumption exploded; there’s lots of data to back it up. You’re saying if not for AI there would be some other excuse. What excuse would that be? Would the fossil fuel industry have to invent something that would consume this much energy?
AI is not an excuse to burn fossil fuels
AI is not a valid reason to burn more fossil fuels.
It is definitely being used as an excuse to do so.
And if it wasn’t, something else would be.
I’m sorry if I’m still not explaining that in a clear way, but if it still isn’t clear at this point, I don’t think we’re going to resolve this.
Microsoft and other big tech already power their data centers with their own renewables, and they will continue to do so. In the latest quarterly report from MS they admit they didn’t anticipate the AI boom being this big, so they have to buy more power externally. This is not good for them and they wouldn’t do it on purpose. They will catch up, because it’s the profitable thing to do.
That’s just a link to all datacenters and doesn’t break out how much energy is going to AI vs how much energy is being used to stream Netflix.
You might as well say we should shut down the internet because it uses too much electricity.
OK… warning: wall of text incoming.
TL/DR: We end up comparing LLM executions with Google searches (a single prompt to ChatGPT uses about 10x as much electricity as a single Google search execution). How many Google searches do you need to run, and how many links do you need to click, vs. requesting information from ChatGPT? I also touch on different use cases beyond just the use of LLMs.
The true argument comes down to this: Is the increase in productivity worth the boost in electricity? Is there a better tool out there that makes more sense than using an AI Model?
For the first article:
The only somewhat useful number in here just says that Microsoft had 30% higher emissions than its 2020 goals… that doesn’t break down how much more energy AI is using, despite how much the article wants to blame the training of AI models.
The second article was mostly worthless, again pointing at numbers from all datacenters, but conveniently putting 100% of the blame on AI throughout most of the article. But, at the very end of the article it finally included something a bit more specific as well as an actual source:
AI could burn through 10 times as much electricity in 2026 as it did in 2023, according to the International Energy Agency.
Link to source: https://www.iea.org/reports/electricity-2024
A 170 page document by the International Energy Agency.
Much better.

Page 8:
Electricity consumption from data centres, artificial intelligence (AI) and the cryptocurrency sector could double by 2026.
Not a very useful number, since it’s lumping cryptocurrency in with all data centers and “AI”.
Moreover, we forecast that electricity consumption from data centres in the European Union in 2026 will be 30% higher than 2023 levels, as new data facilities are commissioned amid increased digitalisation and AI computations.
Again, mixing AI numbers with all datacenters.
Page 35:
By 2026, the AI industry is expected to have grown exponentially to consume at least ten times its demand in 2023.
OK, I’m assuming this is where they got their 10x figure, but this does not necessarily mean the same thing as using 10x more electricity, especially if you’re trying to compare traditional energy use for specific tasks to the energy used when executing a trained AI model.
Page 34:
When comparing the average electricity demand of a typical Google search (0.3 Wh of electricity) to OpenAI’s ChatGPT (2.9 Wh per request)
Link to source of that number: https://www.sciencedirect.com/science/article/abs/pii/S2542435123003653?dgcid=author
It’s behind a paywall, but if you’re on a college campus or at certain libraries you might be able to access it for free.
Finally we have some real numbers we can work with. Let’s break this down. A single Google search uses a little more than 1/10th the electricity of a request made to ChatGPT.
So here’s the thing: how many times do you have to execute a Google search to get the right answer? And how many links do you need to click on to be satisfied? It’s going to depend on what you’re looking for. For example, if I’m working on doing some research or solving a problem, I’ll probably end up with about 10-20 browser tabs open at the same time by the time I get all of the information I need. And don’t forget that I have to click on a website and load it up to get more info. However, when I’m finally done, I get the sweet satisfaction of closing all the tabs down.
Compare that to using an LLM: I get a direct answer to what I need, then do a little double-checking to verify that the answer is legitimate (maybe 1-2 Google-equivalent searches), and I’m good to go. Not only have I spent less time overall on the problem, but in some cases I might have even used less electricity after factoring everything in.
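Here’s that comparison as a quick back-of-envelope. The 0.3 Wh and 2.9 Wh per-query figures are the IEA numbers quoted above; the per-page-load energy and the search/tab counts are my own assumptions, so treat the result as illustrative only:

```python
# Back-of-envelope: research session via Google vs. one LLM prompt plus verification.
# 0.3 Wh/search and 2.9 Wh/request come from the IEA figures quoted above;
# everything else below is an assumption for illustration.

GOOGLE_SEARCH_WH = 0.3
CHATGPT_REQUEST_WH = 2.9
PAGE_LOAD_WH = 0.3            # assumed: loading and reading one result page

# Scenario A: traditional search session (assumed 5 searches, ~15 tabs opened)
session_wh = 5 * GOOGLE_SEARCH_WH + 15 * PAGE_LOAD_WH

# Scenario B: one LLM prompt plus two verification searches with one page each
llm_wh = CHATGPT_REQUEST_WH + 2 * (GOOGLE_SEARCH_WH + PAGE_LOAD_WH)

print(f"search session: {session_wh:.1f} Wh")   # 6.0 Wh under these assumptions
print(f"LLM + checking: {llm_wh:.1f} Wh")       # 4.1 Wh under these assumptions
```

Change the assumed counts and the comparison flips easily; the point is just that the per-query gap isn’t the whole story.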
Let’s try a different use case: Images. I could spend hours working in Photoshop to create some image that I can use as my Avatar on a website. Or I can take a few minutes generating a bunch of images through Stable Diffusion and then pick out one I like. Not only have I saved time in this task, but I have used less electricity.
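Same kind of rough math for the image case; every number here is an assumption, not a measurement:

```python
# Rough comparison of the image example; all figures below are assumptions.
WORKSTATION_W = 200.0          # assumed: desktop + monitor while editing in Photoshop
PHOTOSHOP_HOURS = 2.0          # assumed session length

GPU_W = 350.0                  # assumed: GPU under load while generating
GENERATION_MINUTES = 5.0       # assumed time to generate a batch of candidates

photoshop_wh = WORKSTATION_W * PHOTOSHOP_HOURS
diffusion_wh = GPU_W * (GENERATION_MINUTES / 60.0)

print(f"Photoshop session: {photoshop_wh:.0f} Wh")       # ~400 Wh
print(f"Stable Diffusion batch: {diffusion_wh:.0f} Wh")   # ~29 Wh
```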
In another example, I could spend time/electricity watching a video over and over again trying to translate what someone said from one language to another, or I could use Whisper to quickly translate and transcribe what was said in a matter of seconds.
On the other hand, there are absolutely use cases where using some ML model is incredibly wasteful. Take, for example, a rain sensor on your car. Now, you could set up some AI model with a camera and computer vision to detect when to turn on your windshield wipers. But why do that when you could use a little sensor that shoots a small laser against the window and, when it detects a difference in the energy that’s normally reflected back, activates the windshield wipers? The dedicated sensor with a low power laser will use far less energy and be way more efficient for this use case.
Of course we still need to factor in the amount of electricity that’s required to train and later fine-tune a model. Small models only need a few seconds to a few minutes to train; other models may need a month or more. Once the training is complete, no more training electricity is required, and the model can be packaged up and spread over the internet like any other file (of course electricity is used for that, but then you might as well complain about people streaming 8K video to their homes for entertainment purposes).
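For a sense of how that one-time training cost amortizes, here’s a sketch with assumed round numbers (the training energy and request count are made up for illustration; only the 2.9 Wh/request figure comes from the IEA numbers above):

```python
# Amortizing a one-time training cost over the model's lifetime of requests.
# TRAINING_MWH and REQUESTS_SERVED are assumed round numbers, not real data.

TRAINING_MWH = 1_300.0            # assumed: energy for one large training run
REQUESTS_SERVED = 1_000_000_000   # assumed: requests served over the model's lifetime
INFERENCE_WH_PER_REQUEST = 2.9    # per-request figure quoted above

training_wh_per_request = (TRAINING_MWH * 1_000_000) / REQUESTS_SERVED
total_wh_per_request = training_wh_per_request + INFERENCE_WH_PER_REQUEST

print(f"training share: {training_wh_per_request:.2f} Wh per request")  # 1.30 Wh
print(f"total:          {total_wh_per_request:.2f} Wh per request")     # 4.20 Wh
```

Under these assumptions the training run adds roughly another Google-search-to-ChatGPT gap per request; serve fewer requests and the training share dominates, serve more and it fades into noise.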
So everything being said, it really comes down to this:
Does the increase in productivity warrant the bump in electricity usage?
Is there a better tool out there that makes more sense than using an AI Model?

Thank you for your effort.
Couple of takeaways:
I think we can use the Bitcoin difficulty chart to approximate how much crypto weighs in the AI / crypto mix. BTC difficulty stopped increasing in 2024, which could be partially explained by both competing for the same resources. The other big one, Ethereum, moved to proof of stake fairly recently, and I think that’s an attractive proposition for other crypto given the above. With this in mind it’s fair to say crypto won’t be a big factor compared to AI growth, and I would expect researchers to come to somewhat similar conclusions.
As to how good AI is at things:
- Effectiveness of AI powered search is debatable but it’s a subjective thing so I don’t want to get into it.
- Translation tech was one of the early ML implementations and it’s good to see it improving even more. Transcription is one of the great uses, but how many people need that on a frequent basis?
- I remain unconvinced that much multimedia generative AI use is legal, given how the training data was obtained. We’re in limbo until this gets decided by the US / EU, etc.
- As you’ve mentioned, there is concern that we’ll see a lot of wasteful applications of AI. I was horrified when Google demoed an assistant that would find your car’s plates by scanning your photo library.
The last one is key, I think. Since AI is the current buzzword, companies will try to shoehorn it everywhere, regardless of whether it makes sense.
What kind of world are we going to leave behind for the AI though?
It’ll be a coal-powered Dyson sphere sustaining a data center tasked with generating pictures of celebrity porn, Jesus, flight attendants, babies, and seafood. By then AI will enjoy them as much as my mother does.
“I’ve cut out these goatees made of felt for us all to wear until we can grow real ones.”