this post was submitted on 21 Nov 2023
995 points (97.9% liked)

Technology


Over half of all tech industry workers view AI as overrated

[–] Admax@lemmy.world 4 points 11 months ago* (last edited 11 months ago) (1 children)

I've seen it referred to as AGI, but I think that's wrong. ChatGPT isn't intelligent in the slightest; it only guesses which word is statistically most likely to come next. There is no thinking or problem solving involved.

A while ago I saw an article with a title along the lines of "spark of AGI in ChatGPT 4", because it chose to use a calculator tool when facing a problem that required one. That would be AI (and not AGI): it has a problem, and it learns and uses available tools to solve it.

AGI would be on a whole other level.
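The calculator behavior described above is essentially tool routing: notice a problem the language model is bad at and delegate it. A minimal sketch of that idea, with all names and the arithmetic heuristic invented for illustration (a real system would let the model itself decide when to call the tool):

```python
import re

def looks_like_arithmetic(prompt: str) -> bool:
    # Crude stand-in for the model's learned "I should use a tool here" decision.
    return re.fullmatch(r"\s*\d+(\s*[-+*/]\s*\d+)+\s*", prompt) is not None

def calculator(expr: str) -> str:
    # Only ever called on input vetted by the regex above (digits and operators).
    return str(eval(expr, {"__builtins__": {}}))

def answer(prompt: str) -> str:
    # Route arithmetic to the exact tool; everything else falls back to generation.
    if looks_like_arithmetic(prompt):
        return calculator(prompt)
    return "(fall back to plain text generation)"

print(answer("12 * 34"))  # → 408
```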

Edit: Grammar

[–] thedeadwalking4242@lemmy.world 11 points 11 months ago (1 children)

The argument "it just predicts the most likely next word," while true, massively undervalues what it even means to predict the next word or token. Largely these predictions are based on sentences and ideas the model has trained on from its data sets. It's pretty intelligent if you think about it: you read a textbook, then when you apply the knowledge or take a test, you use what you read to form a new sentence in relation to the context of the question or problem. For the model's "text prediction" to be correct, it has to understand certain relationships between complex ideas and objects to some capacity.

Yes, it absolutely is not as good as human intelligence, but what it's doing is much more advanced than the text prediction on your phone keyboard. It's a step in the right direction: overhyped right now, but the hype is funneling cash into research, and the models are already getting more advanced. Right now half of what it says is hot garbage, but it can be pretty accurate.

[–] eronth@lemmy.world 6 points 11 months ago (1 children)

Right? Like, I, too, predict the next word in my sentence to properly respond to inputs with desired output. Sure I have personality (usually) and interests, but that's an emergent behavior of my intelligence, not a prerequisite.

It might not formulate thoughts the way we do, but it absolutely emulates some level of intelligence, artificially.

[–] NightAuthor@lemmy.world 1 points 11 months ago

I think so many people overrate human intelligence, thus causing them to underrate AI. Don’t get me wrong, our brains are amazing, but they’re also so amazing that they can make crazy cool AI that is also really amazing.

People just hate the idea of being meat robots, I don’t blame em.