yeah i mean ofc if you also put everyone in the world that that datacentre is serving in a human datacentre, I’m sure it’d also consume tons of power (in food)
Natural intelligence would not consume Twix and cocaine.
Real Genius runs on cigarettes, coffee, and cheating on your cousin-wife.
How can we know if the AI is intelligent unless we can prove it is horny?
“Or so I’ve heard”
We use about 20% of our caloric intake (at rest, not doing math) for our bio intelligence. Having superpowers of social organization is expensive and power hungry.
So it’s really no surprise that the computation machines that can run AI require tens of megawatts to think.
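Rough numbers on that comparison, as a sketch (assuming a ~2,000 kcal/day resting intake and taking "tens of megawatts" as 20 MW — both assumed figures, not from any specific datacentre):

```python
# Back-of-envelope: human brain power vs. an AI datacentre.
# Assumptions: ~2,000 kcal/day resting intake, ~20% of it spent on the brain.
KCAL_TO_JOULES = 4184
daily_intake_j = 2000 * KCAL_TO_JOULES          # ~8.4 MJ per day
brain_watts = 0.20 * daily_intake_j / 86_400    # joules per second = watts
print(f"Brain: ~{brain_watts:.0f} W")           # ~19 W

datacentre_watts = 20e6                          # "tens of megawatts", assumed 20 MW
print(f"Ratio: ~{datacentre_watts / brain_watts:,.0f}x")
```

So the gap is on the order of a million brains per datacentre, give or take.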
“Pretend to think” at that lmao.
Yeah, it’s nowhere near thinking. More like arranging things into a pattern.
it’s really good at writing termination notices without making middle managers feel bad about letting their employees go.
It’s not artificial intelligence. A Large Language Model is not intelligent.
And yes yes, scientifically, LLM belongs there and whatnot. But what matters is what people expect.
Not to be pedantic, but the original use of the word intelligence in this context was “gathered digested information.”
Unfortunately, during the VC funding rounds for this, “intelligence” became the “thinky meat brain” type, and a marketing term associated with personhood, and the intense personalization along with it.
I completely agree that LLMs aren’t intelligent. On the other hand, I’m not sure most of what we call intelligence in human behavior is any more intelligent than what LLMs do.
We are certainly capable of a class of intelligence that LLMs can’t even approach, but most of us aren’t using it most of the time. Even much (not all) of our boundary pushing science is just iterating algorithms that made the last discoveries.
On the other hand, I’m not sure most of what we call intelligence in human behavior is any more intelligent than what LLMs do.
Human intelligence is analog and predicated on a complex, constantly changing, highly circumstantial manifestation of consciousness rooted in brain chemistry.
Artificial Intelligence (a la LLMs) is digital and predicated on a single massive pre-compiled graph that seeks to approximate existing media from descriptive inputs.
The difference is comparable to the gulf between a body builder’s quad muscle and a piston.
Not to be pedantic, but the original use of the word intelligence in this context was “gathered digested information.”
Unfortunately, during the VC funding rounds for this, “intelligence” became the “thinky meat brain” type, and a marketing term associated with personhood, and the intense personalization along with it.
Btw, you got it double-posted.
See, the thing is, I watch piss porn. Hear me out. I told my friend that the thing is, to do piss porn, you kind of have to be into it. You could try and fake it, but it wouldn’t be very convincing. So, my contention is, piss porn is more genuine than other types of porn, because the people partaking are statistically more likely to enjoy doing that type of porn. Which is great, I think, because then they really get into it, which is hot. It’s that enjoyment that gets me off. Their enjoyment.
She said, “Krooklochurm, you’re an idiot. Anyone can fake liking getting pissed in the face.”
So I said, “Well, if you’re so adamant, get in the tub and I’ll piss in your mouth, and let’s see if it’s as easy as you claim.”
So she said, “All right. If I can fist you in the ass afterwards.”
Which I felt was a fair deal, so I took it.
My (formal) position was strengthened significantly by the former event. And I can also attest that I could not convincingly fake enjoying being ass-fisted.
What does that have to do with anything, you ask? Genuinity. The real deal. That’s what.
I could see it. A lot of people don’t like it, and it’s not my personal thing, but sex can get messy so it’s pretty much whatever. Always makes me think of this though:

What the fuck did I just read
AI poison.
Fresh lemmy copypasta
Some lost green text post or the internet comment etiquette guy.
Plot twist, she really did like getting pissed on, and she knew it ahead of time. She was gaming you for that golden shower.
Honestly disgusted to read that, but you do you… :) If you feel like that’s your thing, I guess that’s what you are here to enjoy.
What a good piece of meal, thank you
Not to be confused with shit porn. Which is just not very good.
In other words, it is shit.
Valid, but not the first two things that I’d come up with.
I’d trade cocaine for massive amounts of caffeine!
How much have you got? I’ve got about 3kg of coffee.
I have about a gallon of liquid caffeine, comes with a pump so you can add it to home-made soda one dose at a time.
I suspect you could do the same with coke…
So, you are saying, I should mix my cocaine with twix bars for maximum efficiency? (Would still be stupid, but now more efficiently)??
The left Twix has cocaine in it. The right one does, too, but it comes from the Right Cocaine Twix factory.
Same stupid, more fun
I think if you add some Twix bars to the mix you have a good chance that it won’t get worse because of it. The only logical choice is to go the Twix route.
To be fair a lot of people think they’re intelligent and they really really aren’t.
Why do people keep telling me this?
Especially if they’ve had cocaine.
And then the LLMs get trained on those idiots.
I’m not trying to grade on potential, but betting on human potential vs AI potential feels like rewarding ourselves for being better than a machine. Would we have Albert Einstein if we didn’t have Isaac Newton?
That’s kind of a false dichotomy. They may be separate today, but there’s no reason to believe we won’t augment human minds with artificial neural networks in the future. Not in the magical cure-all, fix-all way techbros like to sell it, but for really boring and mundane things initially. Think replacing a small damaged part of some brain region, like the visual or auditory cortex, to repair functional deficiencies.

Once they get the basic technology worked out to be reliable, repeatable, and not require too much maintenance (cough subscriptions and software licenses), there’s no reason to believe we won’t progress rapidly to other augmentations and improvements. A simple graphical interface for a heads-up display or a simple audio interface for direct communications both come to mind, but I’m sure our imaginations will be comically optimistic about some things and comically pessimistic about others.

All that to say that any true AI potential will be human potential in time. We won’t stop at making super-intelligent AGI. We will want to BE super-intelligent AGI. Since we already know highly efficient and capable intelligence is possible (see yourself), it’s only a matter of time until we make it ourselves, provided we don’t kill ourselves somehow along the way.
Don’t forget the Red Bull and Vodka system coolant…
Isn’t it more like they’re comparing all the hamburgers and everything else you have eaten since you were born?
That’s what they’re doing with AI energy usage, isn’t it? I thought it was including the training, which is where the greatest costs come from, vs just daily running.
No. “In practice, inference [which is to say, queries, not training] can account for up to 90% of the total energy consumed over a model’s lifecycle.” Source.
I think the entire idea of AI and the Internet in general taking up power and water needs to be fleshed out and explained to everyone. Even to me it’s a vague notion; I heard about it a few years back but can’t explain it to someone like my parents, who would have no idea the Internet requires water to run.
It’s not too hard. AI requires a LOT of work. Work requires energy. Some energy is wasted during this and the byproduct is heat. The heat has to be removed for many reasons, and water is very good at doing that.
It’s like sweating, it cools you down. But you need water to sweat.
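The sweating analogy can be put in rough numbers. A sketch, assuming a 20 MW heat load (the "tens of megawatts" figure above, not any specific facility) and evaporative cooling, where vaporizing water absorbs roughly 2.4 MJ per kg:

```python
# Why cooling eats water: evaporating 1 kg of water absorbs a lot of heat.
# Assumption: ~2.4 MJ/kg latent heat at cooling-tower temperatures,
# and a hypothetical 20 MW facility rejecting all heat evaporatively.
LATENT_HEAT_J_PER_KG = 2.4e6
heat_watts = 20e6

kg_per_second = heat_watts / LATENT_HEAT_J_PER_KG   # water boiled off per second
litres_per_day = kg_per_second * 86_400             # 1 kg of water ≈ 1 litre
print(f"~{kg_per_second:.1f} L/s, ~{litres_per_day / 1000:,.0f} m³/day")
```

That’s hundreds of cubic metres a day for a single site, which is exactly the "sweating" the comment above describes. Real facilities recirculate and don’t evaporate everything, so treat this as an upper-bound sketch.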
Wasn’t there an article posted yesterday about a group trying to create a biological computer from living cells, due to their efficient use of less power? (They are far from close; they basically took skin cells, ionized them, and had no idea how they were going to keep them alive long term yet.)
Even that won’t be anywhere close to the efficiency of neurons.
And actual neurons are not comparable to transistors at all. For starters the behaviour is completely different, closer to more complex logic gates built from transistors, and they’re multi-pathway, AND don’t behave as binary as transistors do.
Which is why AI technology needs so much power. We’re basically virtualising a badly understood version of our own brains. Think of it like, say, PlayStation 4 emulation - it’s kinda working, but most details are unknown and therefore don’t work well, or at best have a “close enough” approximation of behaviour, at the cost of more resource usage. And virtualisation will always be costly.
Or, I guess, a better example would be one of the many currently trending translation layers (e.g. SteamOS’s Proton or macOS’ Rosetta or whatever Microsoft was cooking for Windows for the same purpose, but also kinda FEX and Box86/Box64), versus virtual machines. The latter being an approximation of how AI relates to our brains (and by AI here I mean neural network based AI applications, not just LLMs).
There’s already been some work on direct neural network creation to bypass the whole virtualization issue. Some people are working on basically an analog FPGA style silicon based neural network component you can just put in a SOM and integrate into existing PCB electronics. Rather than being traditional logic gates they directly implement the neural network functions in analog, making them much faster and more efficient. I forget what the technology is called but things like that seem like the future to me.
I’m very much aware of FPGA-style attempts, however I do feel the need to point out that FPGAs (and FPGA-style computing) are even more hardware-constrained than emulation.
For example, the DE10-Nano, a current mainstream emulation FPGA, has up to 110k LEs/LUTs, and that gets you just barely passable PS1 emulation (primarily, it’s great for GBA emulation and for mid-to-late-80s and early-90s console hardware). In fact it’s not even as performant as GBA emulation on ARM - it uses more power, costs more, and the only benefit is true-to-OG-hardware execution (which isn’t always the case for emulation).
Simply said, while FPGAs provide versatility, they’re also much less performant than similarly priced SoCs with emulation of the specific architecture.
I meant in the context of machine learning not gaming
The context doesn’t matter. The bottom line is that FPGAs provide flexibility, not improved performance. Period.