I don’t think energy consumption matters to most people. I still can’t believe the whole planet is “on board” with taking a metal box that weighs a ton or much more, filling it with fossil fuel, and burning it to move that heavy box, and usually a single person, everywhere they go. Then they complain about the price of the fuel that is slowly choking the environment they need to drink, eat, and live. But we can’t say anything bad against cars. Just buy an electric car, because it fixes all the problems if you ignore wasted energy, microplastic pollution, noise pollution, the impact of parking, the millions of dead humans every year, and the billion animals killed as roadkill every year. There is no more gas problem!
So, I wish this would make some people realize how power-hungry and destructive AI can be, but I’m pretty sure most don’t give a fuck because “it can be useful”, just like cars. So far I’ve read people defending it by saying power usage and efficiency will improve with time. Or that AI will find a tech solution to the AI problem.
It will go the same way as the car industry. People will begin to use it, find some use for it despite its flaws, and defend it tooth and nail while it brings us closer to environmental doom. You will consoom.
It’s kind of hard for me to criticize cars when without one I wouldn’t have a job, food, clothes, or really anything of the sort. Absolutely zero thought put into public transportation where I live, and the only bike lanes are about an hour’s bike ride away, with my job being the same distance. I’m just so tired, boss.
People will start caring when the energy costs spike. And with our little energy crisis looming, it is a good time to tell them how much of their electricity bill is subsidizing idiots like Sam Altman and Dario Amodei.
Techbro bots: “this is the same energy wasted in: [shit that’s actually fun or useful]”. As if AI infrastructure isn’t built ON TOP of existing infrastructure, actively leeching off it or lobbying to deregulate energy to make it even more wasteful.
The difference is that the microwave is not nearly as cancerous.
As an American I appreciate how they included both metric (Joules) and imperial (hours running a microwave) measurements
I only know how long bald eagles burn for.
How do I convert microwave hours to bald eagle burn time (in number of football games including all dead ball times and the halftime shows)?
Wait, are we talking mid-day NFL, primetime (SNF/MNF/TNF/playoffs), or the Super Bowl? Or are you talking CFB, and if so, is it one of the big conference games or is it only showing on ESPN+?
Yes
(And all owls are superb)
…taking into account up to 5 testicles at 60 mph, but take away 3 standard rats (not McDonald’s rats) to the power of 3 number 2 pencils HD if using mechanical pencils. That comes out to 37 burnt eagles and 3 frozen yogurts.
Oh, I forgot to divide by the length of a Phillips screwdriver! That’s right!
Anyhow, it would be running a microwave at 944 watts for an hour 😁
I have a new retirement plan.
- Buy a gun
- Shoot it at datacenters.
- Eat some chips or something

A truck and container with a microwave beam like this:
https://www.rtx.com/raytheon/what-we-do/integrated-air-and-missile-defense/phaser-high-power-microwave
Hell yeah, road trip to all the data centers! Now that’s a retirement!
A gun!? You need a rocket launcher if you want to do any worthwhile damage!
Truly every time period of human civilization had something in it that would make future generations shake their heads in disbelief.
“How could they use gigawatt-hours for crypto farms and useless AI applications while fully aware of a climate crisis caused by fossil fuels?” a student might ask their history teacher some day. “Because they were dumb as fuck,” the teacher might answer.
Did you see the stockarket?!
Stonkrocket
Stronkracket
Shut up, Pam
3.4 giga joules! Great Scott!
Mega, not giga. It’s more of a medium-sized Scott.
Metric Scott or imperial Scott?
They definitely need to make measures like this widely available.
Who the hell measures energy in joules?
That’s a little under 1 kWh. Or playing on your gaming PC for 3 hours.
It’s the SI unit of energy. So, a lot of people.
> Who the hell measures energy in joules?
I do.
Who the hell measures energy in Wh? I always convert back to joules; watt-hours don’t make any sense.
kWh is for practical use; joules are for physics, because the joule is universal across all forms of energy.
That’s 3 hours running completely flat out, which does happen with some games.
Even in those cases the gaming PC is producing roughly 2160× the amount of ‘content’ per unit of energy (10,800 seconds of gameplay vs. 5 seconds of video). Though I guess some (not me) might argue that 5 seconds of AI-generated cats fighting politicians has more ‘worth’ than some 5-second spans of gameplay.
I’d much prefer to pay an artist to be using their PC for three hours than a corporation
Also those 3 hours count as “practicing.”
Imagine if we counted “training” in this energy cost.
1 kWh for 3 hours of playtime? Not on a gaming PC of today. My 32″ 240 Hz display alone consumes more than that.
That is a choice. My desktop setup, including speakers and monitor (measured at the power outlet), uses below 140 W at full load in games, for 4060 levels of performance. Yes, your system is likely faster but I can play all modern games with it, at a level that is good enough for me. And I don’t sit in a sauna while gaming as a consequence. In other words, for that 5 sec AI video I can play 7 hours on my system and that does not even consider the tons of energy spent on training the model.
Yeah, I know. I also have that kind of setup for my kids. Or actually there are 5 gaming setups altogether. Mine isn’t primarily for gaming despite the specs.
I just wanted to point out that gaming PC setups generally consume more than ~333 W of power. Like yours. The numbers given are estimated averages, as I can’t know the precise specifics.
GPU 140 W, CPU 65 W, MB ~35 W, RAM 6 W, SSD 3 W, KB+mouse 1 W, monitor 35 W ≈ 285 W. Everything without the monitor comes to about 250 W. PSU efficiency is about 80%, so to supply that 250 W it has to draw a minimum of ~313 W from the wall. Combined with the monitor’s 35 W, that sums to ~348 W, which already surpasses that ~333 W average (1 kWh per 3 hours) when playing an intensive game from a few years back (quick sanity check in the sketch below).
And your 4060 setup, while being a gaming PC, isn’t very average in its power draw, as that GPU specifically is very low-power. Go older, to the 3000 or 2000 series, or up to the 5000 series, or even to any AMD card, and the power consumption goes up. And there are lots of older setups, even X99 setups with Xeons drawing up to 125 W.
Yes, it is possible to have a gaming setup run at the given power consumption, but on average I would say you’ll reach 1 kWh within 2 hours.
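For what it’s worth, here’s the budget above as a minimal Python sketch, using the estimated component draws and the assumed 80% PSU efficiency (ballpark figures, not measurements):

```python
# Rough wall-outlet draw for the example gaming PC above; component
# figures are the estimates quoted in the thread, not measured values.
components_w = {
    "GPU": 140, "CPU": 65, "Motherboard": 35,
    "RAM": 6, "SSD": 3, "Keyboard+mouse": 1,
}
psu_efficiency = 0.80  # assumed; typical for an 80 PLUS unit
monitor_w = 35         # the monitor plugs into the wall, not the PSU

internal_w = sum(components_w.values())           # ~250 W at the parts
wall_w = internal_w / psu_efficiency + monitor_w  # ~348 W at the outlet
print(f"{internal_w} W internal -> {wall_w:.0f} W at the wall")
print(f"1 kWh lasts about {1000 / wall_w:.1f} h at that draw")
```

So even modest estimated components blow past the ~333 W average that would give 3 hours per kWh.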
I have to correct myself. The 140 W was the desktop system. With everything included (screen etc.) I just checked and had a power draw of 166 W at the power outlet. Those aren’t TDPs; that is the actual power draw including PSU losses, and it is also the max draw, it doesn’t really get higher than that. My system is only 4060-like in performance; it is actually a Strix Halo with an integrated 8060S, with a combined CPU+GPU TDP limit of 100 W. That has the advantage that I basically have no VRAM limitation (in Indiana Jones: The Great Circle I saw a pretty continuous 12 GB of memory used by the GPU), at the downside of limited bandwidth, still quite close to a 4060 but much lower than high-end GPUs of course.
Yes, my system is absolutely not representative. It was one of my goals to get as energy-efficient a setup as I could while keeping the necessary performance to play modern games.
On modern gaming PCs, 500 W of actual power draw during gaming does sound possible. My previous system had pretty similar performance but with a dGPU (6750 XT); there I had a power draw of roughly 280-300 W without the screen during gaming, if I remember correctly.
I know, right? It’s electron volts or nothing; got to get back to basics.
Coulomb-volts are better for practical use, though.
Or ~100 hours of my laptop doing simple tasks.
I can’t wait until the Terror 2.0, when we destroy it.
3.4 megajoules = 944 watt-hours
Microwaves are typically rated at 1200-1500 watts, sometimes more. Do they actually use that much? I’m not sure; stuff typically uses less than the rated power on the label.
The headline is a little stupid, but I’m assuming they took 800 W or something similar as a reference, otherwise they wouldn’t have ended up with over an hour (see the quick check below).
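For what it’s worth, here’s that guess checked in a few lines of Python; the wattages are assumed ratings, since the article doesn’t say which one it used:

```python
# How long a microwave must run to burn through 3.4 MJ, at a few
# assumed input wattages (the article doesn't state which it used).
ENERGY_J = 3.4e6  # the headline figure for one 5-second AI video

for watts in (800, 1000, 1200, 1500):
    hours = ENERGY_J / watts / 3600  # J / W = seconds; /3600 -> hours
    print(f"{watts:4d} W -> {hours:.2f} h")
```

Only at roughly 944 W or less does the runtime exceed an hour, which fits the ~800 W guess.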
> typically rated at 1200-1500 watts, sometimes more. Do they actually use that much?
I’d say that if it says it’s a 1500-watt microwave, then that means 1500 watts are delivered to your food. The microwave might consume more than that to account for losses.
Yes, that is how they are rated: by output.
A quick search says they typically consume around 25% more power than they deliver.
I’ve seen more 700-1200 watt microwaves.
That’s like 2-4 miles in my electric car, depending on outdoor temperature.
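That roughly checks out; a minimal sketch, assuming typical EV efficiencies of 250-450 Wh/mile (the spread is mostly the temperature effect):

```python
# 3.4 MJ pushed through an EV at assumed efficiencies;
# 250 Wh/mi is mild weather, 450 Wh/mi is cold-winter territory.
energy_wh = 3.4e6 / 3600  # ~944 Wh

for wh_per_mile in (250, 350, 450):
    print(f"{wh_per_mile} Wh/mi -> {energy_wh / wh_per_mile:.1f} miles")
```

Which lands right in that 2-4 mile range.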
That’s watts, not watt-hours, though, so it’s like microwaving something for a little under an hour, which is unrelatable for most people.
But take your 300-watt gaming PC, play on it for 3 hours, and suddenly you’re at 900 watt-hours; that’s probably easier to picture for most people.
Except when you do that, you only get 3 hours of boring lame video games, rather than 5 whole seconds of thrilling slop.
Thank you for the conversion. We have a common unit for electrical energy already, and megajoules is not it. Trying to make it sound like a bigger number by changing the unit only muddies the waters and honestly makes me slightly less sympathetic to the issue.
Seriously?
The joule is the SI unit for energy. 1 joule = 1 watt-second, so 3600 joules = 1 watt-hour.
They teach this in middle school.
Ok, but when it comes to electrical energy, nobody uses “watt-seconds” in the real world. Devices use hundreds of watts and run for minutes and hours. Dividing by 3.6 million isn’t exactly easy mental math to get to the unit (kWh) we all see on our electric bills.
> nobody uses “watt-seconds”
Joules. They don’t say watt seconds because they say joules.
But they don’t use that either in the context of real-world electricity usage. Maybe in the middle school classroom, where you can make up the numbers you work with, but when I’m trying to quantify how much energy something uses at home, I multiply how many watts it uses by how many hours it’s running. Divide that by 1000 for kilowatt-hours, and multiply by $0.11 to know the cost to do it at home (the sketch below does exactly this). If I need to do a multiplication/division by 3.6 million when nobody else is, something’s not right.
Similarly, a meter is a standard unit for length, but we don’t use it when measuring the distance to different galaxies because light-years are more practical at that scale. If you start using meters you’d get some funny looks, just like the ones I’m giving joules instead of kilowatt-hours here. But you know, “almost a kilowatt-hour” makes for a pretty boring headline.
Also, you’ll notice that I specifically mentioned electrical energy. Electrical power is almost universally measured in watts, the product of voltage and current, not joules per second (even if that’s the same thing). So going from instantaneous power measurements to energy accumulated over time, it’s not crazy to use the term “watt-second” the way one would use “kilowatt-hour”… even if that’s also called a “joule”.
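For what it’s worth, that everyday calculation fits in a few lines of Python; the $0.11/kWh is just the rate quoted above, not any kind of standard:

```python
# Watts x hours -> kWh -> dollars: the mental math described above.
# rate_per_kwh is an assumed local electricity price.
def cost_usd(watts: float, hours: float, rate_per_kwh: float = 0.11) -> float:
    kwh = watts * hours / 1000  # Wh -> kWh
    return kwh * rate_per_kwh

# e.g. the headline's ~0.94 kWh (a 1000 W microwave for ~56 minutes):
print(f"${cost_usd(1000, 0.94):.2f}")  # about $0.10
```

No division by 3.6 million required.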
Yeah, that’s true, but joules typically aren’t used today. When people talk about energy consumption, it’s almost always in watts or watt-hours. I’ve seen/heard people use joules fewer than 5 times since college.
A human consumes about 8 MJ of chemical energy per day.
Okay Sam Altman.
That’s not what I meant.
Now they have heard it used 6 times.
Oh! Well, Sam Altman basically used this to try to defend AI energy usage. Training a human takes a lot of energy and water, after all. I don’t think he realized this made him sound like a fucking supervillain.
The way a headline was phrased makes you change your mind about objective truth…? This isn’t Family Feud, where you’re being asked to take sides. It’s still climate change even if someone tries to trick you into thinking it’s slightly worse than it actually is.
Not speaking for the sponge, but I know it gives me pause to consider what else they may be manipulating, and also why they’re manipulating it
Would any manipulation at all justify caring less about climate change? We know from a million sources that AI takes a ton of electricity.
@[email protected] spoke for me perfectly. When you make things weird, I have to start by assuming malice or incompetence - both of which should be red flags.
No doubt AI is sucking a lot of electricity and that presents loads of problems to consider. But instead of (for example) 5 seconds being converted to an hour running a microwave (because who even does that?) how about 3 minutes being about as much as a typical American home uses in a day? Or something like that?
You specifically said you became less sympathetic to this cause though. Even if you exaggerated by many times this would still be an issue, which you just admitted.
I get thinking it’s a bad idea for people first learning about this, but it’s not changing my mind, exaggerated or not.
I generally get skeptical when people go out of their way to use weird units. I don’t disagree with the message, just the way it’s conveyed whether it’s this or giving the price in Zimbabwean dollars (outside of Zimbabwe, of course). If something is weird, one should ask why. And I wish this headline didn’t make things weird leading people to ask why.
Fuck, this just reminded me I left my microwave on.
Depends on the model. If you’re using Wan 2.x on your RTX 5090, it’s like 600 W for 10 minutes, so roughly 600 × 600 = 360 kJ.
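Taking those numbers at face value, local generation comes out at roughly a tenth of the headline figure; a quick sketch, assuming the 600 W / 10 minute estimate above:

```python
# One local clip (the RTX 5090 estimate above) vs. the headline's
# 3.4 MJ per 5-second datacenter clip.
local_j = 600 * 10 * 60  # 600 W for 600 s = 360,000 J (360 kJ)
headline_j = 3.4e6

print(f"local: {local_j / 1e3:.0f} kJ, headline: {headline_j / 1e3:.0f} kJ")
print(f"headline is ~{headline_j / local_j:.1f}x the local estimate")
```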
Lighting the planet on fire for SpongeBob police chase videos
Probably more like SpongeBob with bouncing juggs videos.
Where are those?
Same as playing 10 minutes of Cyberpunk 2077.
3.4 megajoules is equal to 0.94 kWh.
If we assume a gaming PC running at 800 W (incl. monitor), then it would mean slightly over an hour of gaming.
How much does Peter Thiel pay you?