this post was submitted on 11 Oct 2023
178 points (100.0% liked)

Archive link

Silicon Valley has bet big on generative AI but it’s not totally clear whether that bet will pay off. A new report from the Wall Street Journal claims that, despite the endless hype around large language models and the automated platforms they power, tech companies are struggling to turn a profit when it comes to AI.

Microsoft, which has bet big on the generative AI boom with billions invested in its partner OpenAI, has been losing money on one of its major AI platforms. GitHub Copilot, which launched in 2021, was designed to automate some parts of a coder’s workflow and, while immensely popular with its user base, has been a huge “money loser,” the Journal reports. The problem is that users pay a $10-a-month subscription fee for Copilot but, according to a source interviewed by the Journal, Microsoft lost an average of $20 per user during the first few months of this year. Some users cost the company more than $80 per month, the source told the paper.
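The report doesn't spell out the accounting, but taking the quoted numbers at face value, here is a minimal back-of-the-envelope sketch of what they imply, assuming the reported losses come on top of the $10 subscription revenue (an assumption, not something the Journal states):

```python
# Back-of-the-envelope reading of the figures reported by the Journal.
# Assumption: the reported per-user "loss" is net of the subscription fee,
# so cost_to_serve ~= subscription + loss. Microsoft's real accounting is not public.

SUBSCRIPTION = 10   # USD per user per month (Copilot individual plan)
AVG_LOSS = 20       # USD average loss per user per month (reported)
HEAVY_LOSS = 80     # USD loss per month for the heaviest users (reported)

avg_cost_to_serve = SUBSCRIPTION + AVG_LOSS      # ~ $30 per user per month
heavy_cost_to_serve = SUBSCRIPTION + HEAVY_LOSS  # ~ $90 per user per month

print(f"Implied average cost to serve: ${avg_cost_to_serve}/user/month")
print(f"Implied cost for heavy users:  ${heavy_cost_to_serve}/user/month")
```

Under that reading, serving an average Copilot user would cost roughly three times what the subscription brings in, and heavy users closer to nine times.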

OpenAI’s ChatGPT, for instance, has seen an ever-declining user base while its operating costs remain incredibly high. A report from the Washington Post in June claimed that chatbots like ChatGPT lose money pretty much every time a customer uses them.

AI platforms are notoriously expensive to operate. Platforms like ChatGPT and DALL-E burn through an enormous amount of computing power and companies are struggling to figure out how to reduce that footprint. At the same time, the infrastructure to run AI systems—like powerful, high-priced AI computer chips—can be quite expensive. The cloud capacity necessary to train algorithms and run AI systems, meanwhile, is also expanding at a frightening rate. All of this energy consumption also means that AI is about as environmentally unfriendly as you can get.

[–] psudo@beehaw.org 3 points 1 year ago (1 children)

The hype cycle. And just like every other hype cycle, even a reasonable read of the supposed benefits is going to leave most people very disappointed when reality catches up. I'm glad you're one of the people who have found a good use for LLMs, but you're in the vocal minority, as far as I can tell.

[–] Lanthanae@lemmy.blahaj.zone 1 points 1 year ago (1 children)

That's a weird argument. Most technological advancements are directly beneficial to the work of only a minority of people.

Nobody declares that it's worthless to research and develop better CAD tools because engineers and product designers are a "vocal minority." Software development and marketing are two fields where LLMs have already shown massive worth, and even if those users are a vocal minority, they're not a negligible one.

[–] psudo@beehaw.org 1 points 1 year ago (1 children)

I don't see how pointing out that these things are failing to live up to their promises, and helping a mere fraction of the people claimed, is a weird argument. And I can't speak to marketing, but I can speak to software development, and it really is not having the impact claimed, at least in my professional network.