this post was submitted on 28 Jan 2025
1088 points (97.6% liked)

Microblog Memes


A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

[–] Tartas1995@discuss.tchncs.de 17 points 1 day ago (1 children)

"ai bad" is obviously stupid.

Current LLM bad is very true. The method used to create is immoral, and are arguably illegal. In fact, some of the ai companies push to make what they did clearly illegal. How convenient...

And I hope you understand that pointing out that running an LLM locally consumes about the same amount of energy as gaming is completely missing the point, right? The training, and the required ongoing training, is what makes it so wasteful. That is like saying eating bananas in the winter in Sweden doesn't generate much CO2 because the distance to the supermarket is not that far.

[–] daniskarma@lemmy.dbzer0.com 10 points 1 day ago* (last edited 1 day ago) (1 children)

I don't believe in Intellectual Property. I'm actually very against it.

But if you believe in it for some reason, there are models trained exclusively on open data. The Spanish government recently released a model called ALIA; it was done 100% with open data, none of the data used for it was proprietary.

Training energy consumption is not a problem because it happens so rarely. It's like complaining about animated movies because rendering takes months and uses a lot of power. It's an irrational argument. I don't buy it.

[–] Tartas1995@discuss.tchncs.de 5 points 1 day ago* (last edited 1 day ago) (1 children)

I am not necessarily for intellectual property, but as long as they want to have IPs on their shit, they should respect everyone else's. That is what is immoral.

How does it happen rarely? The training time for e.g. GPT-4 was 4 months. ChatGPT (GPT-3.5) was released in November 2022, GPT-4 was released in March 2023. How many months are between those? Oh look at that... They train their AI 24/7.

Training GPT-4 consumed 7200 MWh. The average American household consumes a little less than 11000 kWh per year. So in 1/3 of a year, they consumed 654 times the annual electricity of the average American household. Extrapolated to a full year, that is around 2000 times the electricity of an average American household. And that is just training, and just electricity; we aren't even talking about the water. We are also ignoring that they are scaling up, so they didn't even use the same resources for their next models, they used more.

Edit: sidenote, in 2024, ChatGPT was projected to use 226.8 GWh.
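The arithmetic in the comment above can be sanity-checked with a few lines. All input figures are the commenter's claims, not independently verified numbers:

```python
# Back-of-envelope check of the figures quoted in the thread.
TRAINING_MWH = 7200             # claimed GPT-4 training consumption
HOUSEHOLD_KWH_PER_YEAR = 11000  # "a little less than 11000 kWh" per US household
TRAINING_MONTHS = 4             # claimed training duration

training_kwh = TRAINING_MWH * 1000
households_per_run = training_kwh / HOUSEHOLD_KWH_PER_YEAR
households_per_year = households_per_run * (12 / TRAINING_MONTHS)

print(round(households_per_run))   # prints 655: one run ~= 654-655 households/year
print(round(households_per_year))  # prints 1964: i.e. roughly 2000 households
```

This reproduces the "654 times" and "around 2000 times" figures, given the claimed inputs.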

[–] daniskarma@lemmy.dbzer0.com 4 points 1 day ago* (last edited 1 day ago) (2 children)

2000 times the usage of a household, taking your approximations as correct, for something that's used by millions, or potentially billions, of people is not bad at all.

Probably comparable with 3D movies or many other industrial computer uses, like search indexers.

[–] Tartas1995@discuss.tchncs.de 4 points 1 day ago* (last edited 1 day ago)

Yeah, but then they start "gaming"...

I just edited my comment, so no wonder you missed it.

In 2024, ChatGPT was projected to use 226.8 GWh. You see, if people are "gaming" 24/7, it is quite wasteful.

Edit: just in case it isn't obvious: the hardware needs to be produced, the data collected, and they are scaling up. So my point was that even if you occasionally run a little bit of LLM locally, more energy is consumed than just the energy used for that one prompt.

[–] Zos_Kia@lemmynsfw.com 1 points 1 day ago

Yeah, it's ridiculous. GPT-4 serves billions of tokens every day, so if you take that into account, the cost per token is very, very low.
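The amortization argument can be made concrete with the 226.8 GWh projection from upthread. The tokens-per-day number below is a hypothetical placeholder for "billions of tokens every day", not a real statistic:

```python
# Rough per-token amortization of ChatGPT's projected 2024 energy use.
ANNUAL_GWH = 226.8       # projection quoted upthread
TOKENS_PER_DAY = 10e9    # HYPOTHETICAL: "billions" taken as 10 billion/day

wh_per_day = ANNUAL_GWH * 1e9 / 365   # GWh/year -> Wh/day
wh_per_token = wh_per_day / TOKENS_PER_DAY

print(f"{wh_per_token:.3f} Wh per token")  # prints 0.062 Wh per token
```

Under that assumed volume, each token costs on the order of hundredths of a watt-hour, which is the "very low per-token cost" point; the counterargument upthread is that the aggregate, not the per-token figure, is what matters.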