this post was submitted on 21 Apr 2024
31 points (100.0% liked)
TechTakes
1397 readers
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
you are viewing a single comment's thread
view the rest of the comments
As soon as some of these LLMs get a math module to do math correctly (and not just via the LLM lookup-table thing), people could write scripts to externalize some of the more intensive calculations needed for crypto mining. Sure, it will be inefficient as fuck, and the chance of getting a coin reward will be low, but it will be free.
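For context, the "intensive calculation" being externalized here is proof-of-work hashing. A toy sketch (deliberately simplified; not any real coin's algorithm) of the kind of loop such a script would be offloading:

```python
import hashlib

def toy_mine(block_data: bytes, difficulty: int, max_nonce: int = 1_000_000):
    """Toy proof-of-work: find a nonce whose SHA-256 digest of
    block_data + nonce starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    for nonce in range(max_nonce):
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
    return None, None  # gave up within max_nonce attempts

# Even at a trivial difficulty this takes thousands of hash calls,
# which is why routing it through an LLM would be absurdly slow.
nonce, digest = toy_mine(b"block header", 3)
```

Each extra hex digit of difficulty multiplies the expected number of hash attempts by 16, which is the whole reason miners want someone else's compute for it.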
Last week there were a couple of articles about how easy it is to craft an input that makes public ChatGPT bots execute scripts (usually as root) on their hosting containers, which is almost definitely the result of a module like that being implemented for better programming-related results (aka fucking cheating), so this is very likely already happening.
WHAAAT
links plz?
Found the original post! https://mastodon.social/@kennwhite/112290497758846218 The prompt to make them execute code is incredibly basic. No idea right now whether the exploit is in the chatbot framework or the model itself, though.
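For anyone wondering how this class of bug works: here's a hypothetical sketch (not the actual exploited framework, whose internals the post doesn't show) of a naive "code interpreter" tool. If the host just executes whatever text the model emits, anyone who can steer the model's output has arbitrary code execution:

```python
import subprocess
import sys

def run_model_code(model_output: str) -> str:
    """Naively execute model-emitted code with no sandboxing.
    If a crafted prompt talks the model into emitting something like
    `import os; os.system(...)`, the prompter effectively gets a shell
    on the hosting container."""
    result = subprocess.run(
        [sys.executable, "-c", model_output],
        capture_output=True, text=True, timeout=5,
    )
    return result.stdout

# A prompt-injected "model output" probing the host environment:
print(run_model_code("import getpass; print(getpass.getuser())"))
```

The fix is the obvious one the naive version skips: run tool code in an isolated, unprivileged sandbox, never as the service's own user (let alone root).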
Oh shit, somehow I figured you knew already! I'll skim through my browser history and masto boosts and see if I can find one of the articles.
Happy to at least not be the first to think of that idea, and sad to hear people will wreck the commons more.