Super. Link to the best critique of AI on here?
Its energy consumption is absolutely unacceptable; it puts the crypto market to utter shame in terms of ecological impact. I mean, Three Mile Island Unit 1 is being recommissioned to serve Microsoft's data centers instead of the roughly 800,000 homes its 835-megawatt output could supply, and that restart is being made possible by taxpayer-backed loans from the federal government. So Americans' tax dollars are being funneled into a private energy company, to provide a private tech company with 835 megawatts of power for a service it is trying to profit from, instead of delivering clean, reliable energy to people's households.
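A back-of-the-envelope check on that homes figure, as a minimal sketch; the average household consumption value here is my assumption (roughly the published US average), not something from the announcement:

```python
# Sanity check: how many average homes could 835 MW of continuous output supply?
# ASSUMPTION: ~10,500 kWh/year per US household (approximate national average).

PLANT_OUTPUT_MW = 835
HOURS_PER_YEAR = 8_760
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_500  # assumed average, not from the announcement

plant_kwh_per_year = PLANT_OUTPUT_MW * 1_000 * HOURS_PER_YEAR  # MW -> kW -> kWh/yr
homes_served = plant_kwh_per_year / AVG_HOUSEHOLD_KWH_PER_YEAR

print(f"~{homes_served:,.0f} homes")
# ~697,000 with this assumption; assuming ~9,000 kWh/year per home instead gives
# ~810,000, so the ~800,000 figure is the right order of magnitude either way.
```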
Power consumption is only one half of the ecological impact AI brings to the table, too. The cooling required for AI text generation has been found to consume just over one bottle of water (519 milliliters) per 100 words of output, roughly the length of a brief email. In areas where electricity costs are high, data centers lean harder on evaporative cooling and pull an enormous amount of water from the local supply. In one case, The Dalles, Oregon, Google's data centers were using nearly a quarter of all the water available to the town. Some of these data centers use cooling towers in which outside air passes over a wet medium so that the water evaporates, which means the cooling water is not recycled: it is consumed and removed from whatever water supply they are drawing on.
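To put that per-generation figure in perspective, here is the 519 mL scaled up; a minimal sketch in which the daily query volume is purely hypothetical, chosen only to illustrate scale:

```python
# Scale the ~519 mL per 100-word generation figure quoted above.
# ASSUMPTION: the daily generation count is an arbitrary illustrative number.

WATER_PER_GENERATION_ML = 519       # from the figure cited above
GENERATIONS_PER_DAY = 1_000_000     # hypothetical volume, for illustration only

liters_per_day = WATER_PER_GENERATION_ML * GENERATIONS_PER_DAY / 1_000
liters_per_year = liters_per_day * 365

print(f"{liters_per_day:,.0f} L/day, {liters_per_year / 1e9:.2f} GL/year")
# ~519,000 L/day and ~0.19 GL/year at a million such generations per day
```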
These data centers consume resources but often bring no economic advantage to the people living in the areas where they are built. Instead, those people are subjected to the noise of the cooling systems (if the facility is electrically cooled), a hit to their property values, strain on the local electric grid, and often a massive draw on local water (if it is liquid cooled).
Models need to be trained, that training happens in data centers, and it can take months to complete. Training is an expense the company pays just to get these systems off the ground, so before any productive benefit can be gained from these AI systems, a massive amount of resources has already been consumed just to train the models. According to the Washington Post, Microsoft's data centers used 700,000 liters of water while training GPT-3, and Meta used 22 million liters of water training its open-source LLaMA-3 model.
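For a sense of scale, the same training figures in a more familiar unit; the ~2.5-million-liter Olympic-pool volume is a standard reference value I'm assuming here, not something from the reporting:

```python
# Convert the training-water figures quoted above into Olympic swimming pools.
# ASSUMPTION: ~2.5 million liters per Olympic pool (standard reference volume).

OLYMPIC_POOL_LITERS = 2_500_000

training_water_liters = {
    "GPT-3 (Microsoft)": 700_000,
    "LLaMA-3 (Meta)": 22_000_000,
}

for model, liters in training_water_liters.items():
    print(f"{model}: {liters / OLYMPIC_POOL_LITERS:.1f} Olympic pools")
# GPT-3: ~0.3 pools; LLaMA-3: ~8.8 pools
```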
And for what, exactly? As others have pointed out in this thread, and more broadly outside this community, these models only succeed wildly when placed into a bounded test scenario. As commenters on this NYT article point out:
These systems are only capable of performing within the bounds of existing content; they are incapable of producing anything new or unexplored. When one data scientist looked at the o1 model, he had this to say about the speed at which it constructed code that had taken him months to write:
He makes these remarks with almost no self-awareness. The likelihood that this model was trained on his very own research is very high, so naturally the system was able to hand him a solution. The data scientist labored for months creating a solution that presumably did not exist beforehand, and the o1 model simply internalized it; when asked to provide that solution, it did so. That isn't an astonishing accomplishment. It's a complicated, expensive, and damaging search engine that will hallucinate an answer whenever you ask it to produce something outside the bounds of its training.
The vast majority of public use cases for these systems are not cutting-edge research. It's writing the next 100-word email you don't want to write, and sacrificing a bottle of water every time it does. It's taking jobs held by working people and replacing them with a system that is often exploitable, costly, and inefficient at actually performing the job. These systems are a parlor trick at best, and a demon with an insatiable hunger for electricity and water at worst.
The classic 🙉🙈 of the "futurist". We are on our way to AI Utopia, and you won't tell me otherwise!
I had someone do that with NFTs. Aged like fine milk!
You said this less than 15 minutes after the good comment.
Crypto is still worse on electricity usage. I haven't seen actual stats for AI-only electricity consumption, but crypto uses about 0.4 percent of the global electricity supply compared to roughly 1.5 percent for all data centres, and I don't think AI accounts for a full third of that data-centre total (a third of 1.5 percent would be about 0.5 percent, already more than crypto). The AI hype crowd have projections that it will increase significantly, but that would require much better "AI" and actual use cases to produce the sort of growth that would make it a substantial issue.
Evaporative losses are vastly worse for agriculture with open irrigation. 22 million litres of water seems like a lot, but it's only 0.022 gigalitres. Google used a total of ~25 gigalitres across all their data centres, while Arizona uses about 8,000 gigalitres a year.
For The Dalles example, Google used ~1.3 gigalitres in a town with a population of roughly 15,000-25,000 people, so 25 percent for a massive data centre is not unreasonable.
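Rough numbers behind that, as a sketch; the per-capita municipal water figure is an assumption of mine, not something from the thread or a cited source:

```python
# Check the unit conversions and The Dalles share discussed above.
# ASSUMPTION: ~500 L/person/day of total municipal use (all uses, not just homes).

LITRES_PER_GIGALITRE = 1_000_000_000

llama3_training_gl = 22_000_000 / LITRES_PER_GIGALITRE   # 0.022 GL
google_total_gl = 25                                      # all Google data centres
arizona_annual_gl = 8_000                                 # statewide annual use

dalles_google_gl = 1.3
dalles_population = 16_000                                # within the 15,000-25,000 range
municipal_l_per_person_day = 500                          # assumed value

town_annual_gl = dalles_population * municipal_l_per_person_day * 365 / LITRES_PER_GIGALITRE
google_share = dalles_google_gl / (dalles_google_gl + town_annual_gl)

print(f"LLaMA-3 training: {llama3_training_gl:.3f} GL")
print(f"Google total vs Arizona: {google_total_gl / arizona_annual_gl:.2%}")
print(f"The Dalles: rest of town ~{town_annual_gl:.1f} GL/yr, Google share ~{google_share:.0%}")
# Lands around 30 percent with these assumptions, the same ballpark as the ~25 percent figure.
```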
As you note, it's junk, so it's a waste of resources, but unless they manage to double the industry year on year (doubt), it won't be a huge issue.
See?
This is so much more credible than going "I hate AI, AI is shit"
Posters like UlyssesT making everyone look bad.
Shut the fuck up. Nobody wanted to respond to you (except RedWizard, he must have the patience of a saint) because we’ve already done this topic to death, and leading with an ableist meme doesn’t exactly imply you’re acting in good faith.
Hey, you're talkin' about my man UlyssesT all wrong, it's the wrong tone. You do it again, and I'll have to pull out the PPB. Still nothing to say though, I see. Do you not have much of a defense against the idea that the slop slot machine everyone worships is destroying communities and the ecosystem at large? I'm not sure how you can look at the comment I left and have so little to say about these truths. Do you believe the ends justify the means in some way? What is it?
You’re going to have to wait a while.
Oh well!
What a stupid thing to say. We don't sit around writing long posts DEBOONKING every techbro talking point about AI for reference every time someone comes through. Since you care enough to post this way I'm sure you've seen all the criticism of others online and still come away with