this post was submitted on 10 Dec 2023
229 points (96.7% liked)

OpenAI says it is investigating reports ChatGPT has become ‘lazy’

[–] rtfm_modular@lemmy.world 124 points 9 months ago (7 children)

Yep, I spent a month refactoring a few thousand lines of code using GPT4 and I felt like I was working with the best senior developer with infinite patience and availability.

I could vaguely describe what I was after and it would identify the established programming patterns and provide examples based on all the code snippets I fed it. It was amazing and a little terrifying what an LLM is capable of. It didn’t write the code for me, but it increased my productivity twofold... I’m a developer getting rusty now, 5 years into management rather than delivering functional code, so just having that copilot was invaluable.

Then one day it just stopped. It lost all context for my project. I asked what it thought we were working on and it replied with something to do with TCP relays instead of my little Lua pet project dealing with music sequencing and MIDI processing… not even close to the fucking ballpark’s overflow lot.

It’s like my trusty senior developer got smashed in the head with a brick. And, as others have described, it would just give me nonsense, hand-wavy answers.

[–] backgroundcow@lemmy.world 18 points 9 months ago (5 children)

Was this around the time right after "custom GPTs" were introduced? I've seen posts since basically the beginning of ChatGPT claiming it got stupid, and I thought it was just confirmation bias. But somewhere around that point I felt a shift myself in GPT-4's ability to program: where it used to find clever solutions to difficult problems, it now often struggles with basics.

[–] Linkerbaan@lemmy.world 19 points 9 months ago (3 children)

Maybe they're crippling it so that when GPT-5 releases it looks better. Like Apple did with CPU throttling on older iPhones.

[–] tagliatelle@lemmy.world 17 points 9 months ago* (last edited 9 months ago) (2 children)

They probably have to scale down the resources used for each query as they can't scale up their infrastructure to handle the load.

[–] backgroundcow@lemmy.world 4 points 9 months ago

This is my guess as well. They have been limiting new signups for the paid service for a long time, which must mean they are overloaded, and then it makes a lot of sense to just degrade the quality of GPT-4 so they can serve all paying users. I just wish there were a way to know the "quality level" the service is operating at.

[–] monkeyslikebananas2@lemmy.world 2 points 9 months ago

This is most likely the answer. Management saw the revenue and cost and said, “whoa! Turn all that unnecessary stuff off!”
