this post was submitted on 06 May 2024
346 points (95.1% liked)

Technology

AI’s voracious need for computing power is threatening to overwhelm energy sources, requiring the industry to change its approach to the technology, according to Arm Holdings Plc Chief Executive Officer Rene Haas.

top 50 comments
[–] assassinatedbyCIA@lemmy.world 42 points 6 months ago

I wonder if they made an error as simple as this in their projections. There's no guarantee that interest in AI will continue to grow.

[–] whoreticulture@lemmy.blahaj.zone 31 points 6 months ago (2 children)

I can't think of a single thing AI does that is worth the amount of energy consumption.

[–] Ibuthyr@lemmy.wtf 4 points 6 months ago (1 children)

The only really useful AI thing is the denoiser in Adobe Lightroom. I can shoot pictures in pitch black darkness with the highest ISO settings. Obviously it is a grainy mess. The denoiser manages to clean that up while retaining all of the details. It's really fucking great!

Anything else is just novelty bullshit.

[–] whoreticulture@lemmy.blahaj.zone 2 points 6 months ago (1 children)

Sounds useful, but not at all worth the amount of energy being used to produce AI. You could just use that energy to feed/house people who could do the labor of denoising.

[–] Ibuthyr@lemmy.wtf 3 points 6 months ago (1 children)

I know what you mean, but it's not really possible to manually denoise a picture the way the AI denoiser does. Let alone within 10 seconds. Plus, it's more of a niche usage. I don't think it consumes all that much energy.

Generating shitty images, creating deepfakes, prompting all kinds of bullshit... now that is a waste of energy, as it really just makes the world worse. AI-generated articles are popping up all over the internet. They aren't even reviewed anymore. Enshittification of the internet has taken some gigantic strides since the AI boom.

[–] frezik@midwest.social 2 points 6 months ago (1 children)

Do you know if the model is running locally or some cloud shit? If locally, the actual energy usage may be modest.

Energy spent training the model initially may have been prohibitive, though.
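A rough back-of-envelope supports the "local inference is modest" part; every figure below (GPU wattage, runtime, training energy) is an illustrative assumption, not a measurement:

```python
# Back-of-envelope: energy for one local AI denoise vs. a one-time training run.
# All numbers are assumptions chosen for illustration, not measurements.

GPU_POWER_WATTS = 300          # assumed draw of a desktop GPU while denoising
DENOISE_SECONDS = 10           # roughly the runtime reported above
TRAINING_ENERGY_KWH = 500_000  # assumed order of magnitude for training a large model

inference_kwh = GPU_POWER_WATTS * DENOISE_SECONDS / 3_600_000
print(f"one local denoise: ~{inference_kwh * 1000:.2f} Wh")                      # ~0.83 Wh
print(f"denoises per training run: ~{TRAINING_ENERGY_KWH / inference_kwh:.1e}")  # ~6e8
```

On those assumptions the per-use cost of running it locally is trivial; the training run is where the energy went, and that happens once per model version.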

[–] Ibuthyr@lemmy.wtf 1 points 6 months ago

Good question, I'll look it up!

[–] FiniteBanjo@lemmy.today 0 points 6 months ago (7 children)

> I can’t think of a single thing AI does

[–] loutr@sh.itjust.works 7 points 6 months ago* (last edited 6 months ago) (1 children)

Come on, that's not fair, it's very good* at drawing album covers and video game assets, which gives artists more time to go work at Starbucks or Amazon instead of doing something they actually enjoy.

* passable actually, but much cheaper.

[–] FiniteBanjo@lemmy.today 3 points 6 months ago

I'm not in the industry per se, but I wouldn't hire an AI to do art for my games or album covers.

[–] UnderpantsWeevil@lemmy.world 5 points 6 months ago (1 children)

[–] FiniteBanjo@lemmy.today 1 points 6 months ago

I'm glad to know AI can't think of a single thing AI does, although if it thought otherwise I still wouldn't care.

[–] ArkyonVeil@lemmy.dbzer0.com 2 points 6 months ago (1 children)

Correction: AI in the LLM/diffusion sense is a decent tutor for cheap. It can cobble together rough temp art and, if used by an actually capable artist, make cool stuff.

Anything else and it's a garbage firehose; it's the undisputed king of mediocrity. Which, given the standards of spam and the modern web, is exactly what it's being used for.

What a shame.

[–] FiniteBanjo@lemmy.today 2 points 6 months ago (2 children)

Anything you learn from AI has a margin of error that could ruin you.

[–] UnderpantsWeevil@lemmy.world 4 points 6 months ago (1 children)

And we're rapidly liquidating the reserves of useful information in order to feed this beast.

Google results are declining as websites like Stack Exchange and Reddit crap out, Wikipedia pages are filling up with misinformation, and news articles are increasingly full of nonsense and procedurally generated fluff.

It's not just garbage on its face. It's a cancer that's spreading through the rest of our internet archives, blotting out the good and bloating front pages with bad data.

[–] FiniteBanjo@lemmy.today 4 points 6 months ago

It also gets worse every generation as it recursively feeds on the bad data.

[–] Rognaut@lemmy.world 27 points 6 months ago (1 children)

Sounds like some sensationalized bullshit. They don't give a single number or meaningful statement and they are paywalled.

[–] kakes@sh.itjust.works 20 points 6 months ago (9 children)

I don't disagree that they should back up their claim, but it does intuitively make sense. AI models, GPT LLMs in particular, are typically designed to push the limits of what modern hardware can provide, essentially eating whatever power you can throw at them.

Pair this with a huge AI boom and corporate hype cycle, and it wouldn't surprise me if it was consuming an incredible amount of power. It's reminiscent of Bitcoin, from a resource perspective.

[–] NeoNachtwaechter@lemmy.world 19 points 6 months ago (1 children)

I wonder why countries let them.

Using up more electric power than is available is NOT a simple matter of supply and demand.

If they actually pull too much from the grid, they are going to cause damage to others, and maybe even to the grid itself.

[–] john89@lemmy.ca 2 points 6 months ago (1 children)

Because they're not actually pulling enough from the grid to cause damage to others, or even to the grid itself.

Any musings about curtailing AI due to power consumption are just bullshit for clicks. We'll improve efficiency and increase productivity, but we won't reduce usage.

[–] frezik@midwest.social 1 points 6 months ago* (last edited 6 months ago)

Improving the models doesn't seem to work: https://arxiv.org/abs/2404.04125?

> We comprehensively investigate this question across 34 models and five standard pretraining datasets (CC-3M, CC-12M, YFCC-15M, LAION-400M, LAION-Aesthetics), generating over 300GB of data artifacts. We consistently find that, far from exhibiting "zero-shot" generalization, multimodal models require exponentially more data to achieve linear improvements in downstream "zero-shot" performance, following a sample inefficient log-linear scaling trend.

It's taking exponentially more data to get better results, and therefore exponentially more energy. Even if something like analog training chips cuts energy usage tenfold, the exponential curve will just catch up again, and very quickly, with results only marginally improved. Not only that, but you have to gather that much more data, and while the Internet is a vast datastore, the AI models have already absorbed much of it.
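To make the quoted log-linear trend concrete, here's a minimal sketch of what "exponentially more data for linear improvements" implies; the constants are arbitrary, only the shape of the curve matters:

```python
import math

# Minimal sketch of a log-linear scaling law: score = a * log10(samples) + b.
# The constants a and b are arbitrary illustrations, not fitted values.
a, b = 5.0, 10.0

def score(samples: float) -> float:
    return a * math.log10(samples) + b

def samples_needed(target_score: float) -> float:
    # Inverting the law: every fixed gain in score multiplies the data needed
    # (and, roughly, the training energy) by a constant factor of 10**(gain / a).
    return 10 ** ((target_score - b) / a)

for target in (40, 45, 50, 55):
    print(f"score {target}: ~{samples_needed(target):.1e} samples")
# Each +5 points costs 10x the data, so linear gains mean exponential cost.
```

Swap in a tenfold efficiency gain and, on these assumptions, you've only bought one more +5-point step before the curve eats it.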

The implication is that the models are about as good as they will be without more fundamental breakthroughs. The thing about breakthroughs like that is that they could happen tomorrow, they could happen in 10 years, they could happen in 1000 years, or they could happen never.

Fermat's Last Theorem remained an open problem for 358 years. Squaring the Circle remained open for over 2000 years. The Riemann Hypothesis has remained unsolved after more than 150 years. These things sometimes sit there for a long, long time, and not for lack of smart people trying to solve them.

[–] MonkderDritte@feddit.de 18 points 6 months ago

Yeah, don't AI everything, please.

[–] dutchkimble@lemy.lol 13 points 6 months ago

Soon they'll need to make Duracells out of humans

[–] iAvicenna@lemmy.world 11 points 6 months ago

main use cases: government surveillance and chatbot girlfriends

[–] JoShmoe@ani.social 10 points 6 months ago

They finally reached crypto miner level awareness.

[–] foggy@lemmy.world 8 points 6 months ago

Weird metric, but ok.

[–] Chessmasterrex@lemmy.world 4 points 6 months ago

It won't be needed because nobody will have a job to pay for it. I foresee Kurt Vonnegut's "Player Piano" on steroids.

[–] explodicle@sh.itjust.works 2 points 6 months ago

This focus on individual applications shifts blame onto consumers, when we should be demanding that energy prices include the external cost of production. It's like guilt-tripping over the "carbon footprint" (a term invented by big oil) of your car.

[–] bilb@lem.monster 0 points 6 months ago

Take that, India! 😎
