200fifty

joined 1 year ago
[–] 200fifty@awful.systems 14 points 1 year ago (1 children)

What I don't get is, ok, even granting the insane Eliezer assumption that LLMs can become arbitrarily smart and learn to reverse hash functions or whatever because it helps them predict the next word sometimes... we humans don't entirely understand biology ourselves! How is the LLM going to acquire the knowledge of biology to do things humans can't do, when it doesn't have access to the physical world, only things humans have written about it?

Even if it is using its godly intelligence to predict the next word, wouldn't it only be able to predict the next word as it relates to things that have already been discovered through experiment? What's his proposed mechanism for it to suddenly start deriving all of biology from first principles?

I guess maybe he thinks all of biology is "in" the DNA and it's just a matter of simulating the 'compilation' process with enough fidelity to have a 100% accurate understanding of biology, but that just reveals how little he actually understands the field. Like, come on dude, that's such a common tech nerd misunderstanding of biology that xkcd made fun of it, get better material

[–] 200fifty@awful.systems 13 points 1 year ago (1 children)

Well, all I know is I definitely trust the research and knowledge and informed-ness about biological sex of the person who uses the word "hermaphroditism" with regard to humans. Now that's a person who knows what they're talking about, I think to myself.

[–] 200fifty@awful.systems 5 points 1 year ago

I mean they'll use an LLM instead of going to therapy too...

[–] 200fifty@awful.systems 10 points 1 year ago (1 children)

yeah, my first thought was, what if you want to comment out code in this future? does that just not work anymore? lol

[–] 200fifty@awful.systems 10 points 1 year ago* (last edited 1 year ago)

I definitely think the youths are stressed because of 'environmental pollution,' but not in the way this commenter means...

[–] 200fifty@awful.systems 23 points 1 year ago (5 children)

"We are told that technology is helping redistribute wealth from the common people to a small subset of extremely rich men. But, as an extremely rich man, I don't really understand why this is a bad thing? Technology seems pretty cool to me!"

[–] 200fifty@awful.systems 10 points 1 year ago

> since we both have the High IQ feat you should be agreeing with me, after all we share the same privileged access to absolute truth. That we aren’t must mean you are unaligned/need to be further cleansed of thetans.

They have to agree, it's mathematically proven by Aumann's Agreement Theorem!
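(For context, and hedged as an editorial aside rather than part of the original comment: Aumann's Agreement Theorem (1976) really does say that two Bayesian agents with a common prior whose posterior probabilities for an event $E$ are common knowledge must assign $E$ the same posterior:

$$
\text{if } q_i = P(E \mid \mathcal{I}_i),\ i = 1, 2, \text{ are common knowledge, then } q_1 = q_2.
$$

The joke lands because the theorem's preconditions, a shared prior, perfect Bayesian rationality, and common knowledge of posteriors, essentially never hold between actual humans arguing on the internet.)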

[–] 200fifty@awful.systems 10 points 1 year ago* (last edited 1 year ago) (14 children)

This is good! Though, he neglects to mention the group of people (including myself) who have yet to be sold on AI's usefulness at all (all critics of practical AI harms are lumped under 'reformers', implying they still see it as valuable, just currently misguided).

Like, ok, so what if China develops it first? Now they can... generate more convincing spam, write software slightly faster with more bugs, and starve all their artists to death? ... Oh no, we'd better hurry up and compete with that!

[–] 200fifty@awful.systems 8 points 1 year ago* (last edited 1 year ago) (1 children)

> a boring person’s idea of interesting

Agh, this is such a good way of putting it. It has all the signifiers of a thing that has had a lot of detail and care and effort put into it, but none of the actual parts that make those things interesting or worth caring about. And of course it's going to appeal to people who don't understand the difference between those two things and only see the surface signifiers (marketers, executives, and tech bros being prime examples of this type of person).

ETA: and also, of course, this explains why their solution to bias is "just fake it to make the journalists happy." Why would you ever care about the actual substance when you can just make it look ok from a distance?

[–] 200fifty@awful.systems 10 points 1 year ago

I had the same thought as Emily Bender's first one there, lol. The map is interesting to me, but mostly as a demonstration of how anglosphere-centric these models are!
