this post was submitted on 24 Dec 2024
71 points (96.1% liked)


The 200 year-old company may soon go public on the back of AI-powered education products.

[–] PhilipTheBucket@ponder.cat 48 points 4 days ago (3 children)

More general-purpose models like ChatGPT suffer from hallucinations because they have hoovered up the entire internet, including all the junk and misinformation.

Incorrect. ChatGPT hallucinates because that’s how LLMs work. Hoovering up misinformation is a separate problem.
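
To spell that out: generation is just repeated next-token sampling from a probability distribution, and nothing in that loop checks the output against reality. A toy sketch of one sampling step (made-up tokens and scores, nothing like the actual implementation):

```python
import math
import random

# Toy version of the step an LLM repeats for every token it emits:
# score candidate tokens, turn the scores into probabilities, sample one.
# The candidates and scores below are made up for illustration.
logits = {"Paris": 4.0, "Lyon": 2.5, "Geneva": 2.2}  # continuations of "The capital of France is ..."
weights = [math.exp(v) for v in logits.values()]
total = sum(weights)
probs = {tok: w / total for tok, w in zip(logits, weights)}

choice = random.choices(list(probs), weights=list(probs.values()), k=1)[0]
print(probs)   # roughly {'Paris': 0.72, 'Lyon': 0.16, 'Geneva': 0.12}
print(choice)  # usually "Paris", but sometimes a confident wrong answer
```

Cleaner training data shifts those probabilities around, but the sampling step itself never consults a fact, which is why hallucination doesn't disappear even with a spotless dataset.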

A company in the space of selling educational books that has seen its fortunes go the opposite direction is Chegg. The company has seen its stock price plummet almost in lock-step with the rise of OpenAI’s ChatGPT, as students canceled their subscriptions to its online knowledge platform.

Incorrect. Chegg is a cheating platform. It is the opposite of a knowledge platform.

Why is Gizmodo paying people who apparently know pretty much nothing about the subject to write articles about it?

[–] FaceDeer@fedia.io 9 points 4 days ago

Because they know their audience.

[–] Alexstarfire@lemmy.world 6 points 4 days ago

Having bad information in your dataset surely has to increase the odds of hallucinations though.

[–] Cuervo@lemmygrad.ml 3 points 4 days ago (1 children)

that's not completely fair; they could have done their research (using AI)

[–] PhilipTheBucket@ponder.cat 3 points 4 days ago

I read some of the author’s other articles. They have a habit of regurgitating highly suspect claims from press releases or company self-descriptions as if they were reality.

OpenAI is confident in o3 and offers impressive benchmarks: it says that in Codeforces testing, which measures coding ability, o3 got a score of 2727. For context, a score of 2400 would put an engineer in the 99th percentile of programmers. It gets a score of 96.7% on the 2024 American Invitational Mathematics Exam, missing just one question.
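
As an aside, the 96.7% only squares with "missing just one question" if it counts both 2024 AIME exams, AIME I and AIME II, at 15 questions each; that's my assumption, the article never says:

```python
# Assuming the figure covers AIME I + AIME II 2024 (15 questions each):
questions = 15 * 2
correct = questions - 1          # "missing just one question"
print(correct / questions)       # 0.9666..., i.e. the quoted 96.7%
# A single 15-question AIME with one miss would only be 93.3%.
```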

There’s also the article claiming that AI puts the entire power grid at risk; read it and you learn that for that to be true, you have to lump AI in with crypto mining, other datacenter expansion, electric cars, and climate control for people’s homes.