this post was submitted on 30 Oct 2023
546 points (94.8% liked)

  • Big Tech is lying about some AI risks to shut down competition, a Google Brain cofounder has said.
  • Andrew Ng told The Australian Financial Review that tech leaders hoped to trigger strict regulation.
  • Some large tech companies didn't want to compete with open source, he added.
[–] jarfil@lemmy.world 2 points 1 year ago* (last edited 1 year ago) (1 children)

three-way race between AI, climate change, and nuclear weapons proliferation

Bold of you to assume that the people behind maximizing profits (high-frequency trading bot developers) and behind weapons proliferation (wargame strategy simulation planners) are not using AI... or haven't been using it for well over a decade... or won't keep developing AIs that blindly optimize for their narrow goals.

The first StarCraft AI competition was held in 2010; think about that.

[–] henfredemars@infosec.pub 1 points 1 year ago (1 children)

I will appeal to my previous ignorance. I had no idea that AI saw that much usage over 10 years ago!

[–] jarfil@lemmy.world 2 points 1 year ago

We were already running "machine learning" and "neural networks" over 25 years ago. The "AI" term has always been kind of a sci-fi thing, somewhere between a buzzword, a moving target, and simply undefined, since we lack a fixed, comprehensive definition of "intelligence" to begin with. The limiting factors of the models have always been the number of neurons one could run in real time and the availability of good training data sets. Both have increased over a million-fold in that time, progressively turning more and more previously intractable problems into solvable ones, to the point where the results are as good as or better than, and/or faster than, what people can do.
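For context, the kind of "neural network" that already ran decades ago fits in a few lines of code; here's a minimal sketch (the layer sizes, weights, and input are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, w1, b1, w2, b2):
    """Two-layer perceptron forward pass: the same math that ran
    25 years ago, just with far fewer neurons than today's models."""
    hidden = np.tanh(x @ w1 + b1)      # hidden layer activations
    return np.tanh(hidden @ w2 + b2)   # output layer activations

# A toy network: 4 inputs -> 8 hidden neurons -> 2 outputs.
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
w2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

out = mlp_forward(rng.normal(size=(1, 4)), w1, b1, w2, b2)
print(out.shape)  # (1, 2)
```

The million-fold scaling the comment describes is in the sizes of those weight matrices and the amount of data run through them, not in the underlying math.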

Right now, there are supercomputers out there orders of magnitude more capable than what runs stuff like ChatGPT, DALL-E, or any of the public-facing "AI"s that made the news. Bigger ones keep getting built... and memristors are coming, poised to be a game changer the moment they can be integrated at anywhere near current GPU/CPU densities.

For starters, a supercomputer with the equivalent neural network processing power of a human brain is expected for 2024... that's next year... but it won't be able to "run a human brain", because we lack the data on how "all of" the human brain works. It will likely be made obsolete by machines with several orders of magnitude more processing power well before we can simulate an actual human brain... but the question will be: do we need to? Does a neural network need to mimic a human brain in order to surpass it? A calculator already surpasses us at arithmetic, and it doesn't use a neural network at all. At what point does the integration of some size and kind of neural network with some kind of "classical" computer start running circles around any human... or all of humanity taken together?
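The "brain-equivalent supercomputer" claim is back-of-the-envelope arithmetic; here's a rough sketch using commonly cited (and genuinely very uncertain) ballpark figures, not measurements:

```python
# All of these are loose, commonly cited estimates -- the real numbers
# are uncertain by orders of magnitude.
SYNAPSES = 1e14            # ~100 trillion synapses in a human brain
FIRE_RATE_HZ = 10          # ~10 signaling events per synapse per second
brain_ops_per_s = SYNAPSES * FIRE_RATE_HZ   # ~1e15 "synaptic ops"/s

EXAFLOP = 1e18             # an exascale supercomputer's peak FLOP/s
ratio = EXAFLOP / brain_ops_per_s
print(ratio)  # 1000.0 -- naively, ~1000x the brain estimate
```

Depending on which brain estimate you pick (they span several orders of magnitude), exascale machines land anywhere from rough parity to far beyond it, which is exactly why "equivalent to a human brain" is such a slippery claim.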

And of course we'll still have to deal with the issue of dumb humans telling/trusting dumb "AI"s to do things way over their heads... but I'm afraid any attempt at "regulation" is going to end up like "international law": those who want to, obey it; those who should, DGAF.

Even if all the tech giants and all the lawmakers agreed on the strictest regulations imaginable, like giving all "AI"s the treatment of weapons of mass destruction, there is a snowflake's chance in hell that any military in the world would care about any of it.