this post was submitted on 29 Oct 2023
30 points (100.0% liked)

SneerClub

989 readers
59 users here now

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago

In today's episode, Yud tries to predict the future of computer science.

top 50 comments
[–] Amoeba_Girl@awful.systems 17 points 1 year ago

Looking at this dull aimless mass of text I can understand why people like Yud are so impressed with chatGPT's capabilities.

[–] corbin@awful.systems 17 points 1 year ago (9 children)

Yud tried to describe a compiler, but ended up with a tulpa. I wonder why that keeps happening~

Yud would be horrified to learn about INTERCAL (WP, Esolangs), which has required syntax for politely asking the compiler to accept input. The compiler is expressly permitted to refuse inputs for being impolite or excessively polite.
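For anyone who hasn't seen it, the politeness rule looks something like this in practice — a fragment from memory, so treat the exact statements as illustrative. C-INTERCAL rejects a program as insufficiently polite if too few statements start with PLEASE, and as excessively polite if too many do (roughly a one-fifth to one-third band):

```
DO .1 <- #5
DO .2 <- #10
DO .3 <- #15
PLEASE GIVE UP
```

One PLEASE in four statements keeps the compiler happy; make it three out of four and it refuses you for grovelling.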

I will not blame anybody for giving up on reading this wall of text. I had to try maybe four or five times, fighting the cringe. The most unrealistic part is having the TA know any better than the student. Yud is completely lacking in the light-hearted brevity that makes this sort of Broccoli Man & Panda Woman rant bearable.

I can somewhat sympathize, in the sense that there are currently multiple frameworks where Python code is intermixed with magic comments which are replaced with more code by ChatGPT during a compilation step. However, this is clearly a party trick which lacks the sheer reproducibility and predictability required for programming.
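For anyone who hasn't run into those frameworks, the trick can be sketched like this — `expand` and `generate` are hypothetical names, and the model call is stubbed with a canned lookup so the example stays deterministic (the real thing would hit an LLM API, which is exactly the reproducibility problem):

```python
# Hypothetical sketch of the "magic comment" party trick: a preprocessor
# swaps specially-marked comments for generated code before execution.
# In the real frameworks generate() would call an LLM; here it is stubbed
# with a fixed lookup table so the output is predictable.

MAGIC = "# AI:"

def generate(prompt):
    # Stand-in for a model call; canned answers only.
    canned = {"return the square of x": "    return x * x"}
    return canned[prompt]

def expand(source):
    # Replace each magic-comment line with whatever the "model" emits.
    out = []
    for line in source.splitlines():
        stripped = line.strip()
        if stripped.startswith(MAGIC):
            out.append(generate(stripped[len(MAGIC):].strip()))
        else:
            out.append(line)
    return "\n".join(out)

src = """def square(x):
    # AI: return the square of x
"""
namespace = {}
exec(expand(src), namespace)
```

With a real model in place of the lookup table, two runs of the "compiler" need not produce the same program — which is the whole objection.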

Y'know, I'll take his implicit wager. I bet that, in 2027, the typical CS student will still be taught with languages whose reference implementations use either:

  1. the classic 1970s-style workflow of parsing, tree transformation, and instruction selection; or
  2. the classic 1980s-style workflow of parsing, bytecode generation, and JIT.
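For the curious, the 1970s-style workflow in point 1 can be sketched end to end for a toy expression language — every name here is made up for illustration, not taken from any real compiler:

```python
# Minimal sketch of the classic parse -> tree-transform -> instruction-selection
# pipeline, for a toy language of integer expressions like "1 + 2 * 3".
import re

def tokenize(src):
    # Lexing: split into integer literals and operators.
    return re.findall(r"\d+|[+*()]", src)

def parse(tokens):
    # Recursive-descent parsing into nested tuples as the AST:
    # ("num", n), ("+", lhs, rhs), ("*", lhs, rhs).
    pos = 0
    def expr():
        nonlocal pos
        node = term()
        while pos < len(tokens) and tokens[pos] == "+":
            pos += 1
            node = ("+", node, term())
        return node
    def term():
        nonlocal pos
        node = atom()
        while pos < len(tokens) and tokens[pos] == "*":
            pos += 1
            node = ("*", node, atom())
        return node
    def atom():
        nonlocal pos
        tok = tokens[pos]
        if tok == "(":
            pos += 1
            node = expr()
            pos += 1  # skip ")"
            return node
        pos += 1
        return ("num", int(tok))
    return expr()

def fold(node):
    # Tree transformation: constant folding.
    if node[0] == "num":
        return node
    op, lhs, rhs = node[0], fold(node[1]), fold(node[2])
    if lhs[0] == rhs[0] == "num":
        return ("num", lhs[1] + rhs[1] if op == "+" else lhs[1] * rhs[1])
    return (op, lhs, rhs)

def select(node, out):
    # Instruction selection for a toy stack machine.
    if node[0] == "num":
        out.append(("PUSH", node[1]))
    else:
        select(node[1], out)
        select(node[2], out)
        out.append(("ADD",) if node[0] == "+" else ("MUL",))
    return out

code = select(fold(parse(tokenize("1 + 2 * (3 + 4)"))), [])
```

Every stage is a deterministic function of its input — no step where the toolchain decides whether it feels like cooperating today.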
[–] Architeuthis@awful.systems 10 points 1 year ago

I can somewhat sympathize, in the sense that there are currently multiple frameworks where Python code is intermixed with magic comments which are replaced with more code by ChatGPT during a compilation step. However, this is clearly a party trick which lacks the sheer reproducibility and predictability required for programming.

He probably just saw a github copilot demo on tiktok and took it personally.

[–] cstross@wandering.shop 9 points 1 year ago (4 children)

@corbin You missed the best bit: one of the current INTERCAL compilers, CLC-INTERCAL (for a superset of the language which adds a bunch more insanity) is implemented IN INTERCAL! It's self-compiling. Also object-oriented, has quantum-indeterminate operators, and a computed COME FROM statement (also with quantum indeterminacy).

I think we should organize a fundraiser to pay CLC-INTERCAL's developer @Uilebheist to visit Yud and melt his brain.

[–] Soyweiser@awful.systems 5 points 1 year ago

Oh great, you mentioning this has already had one sneerclubber's brain leak out of her ears.

Have you learned nothing? YOU DO NOT THINK IN SUFFICIENT DETAIL ABOUT DANGEROUS IDEAS ... TREADKILL ;)

[–] flexasync@hachyderm.io 7 points 1 year ago

@corbin it's a fucking _compiler_. What working or teaching programmer would accept "AI wrangling" in exchange for marginal improvements in the efficiency of the code that's output? Just chuck some more compute at it...

[–] bsdphk@fosstodon.org 6 points 1 year ago (1 children)

@corbin

Wait till you get to the calculated COME FROM ...

[–] etchedpixels@mastodon.social 5 points 1 year ago

@corbin Probably still 5 years too soon, but I would hope the 2027 CS student will be taught the usual engineering flow of specification, formal verification and safety analysis, design, some coding, and what should be a tiny bit of debugging during validation at the end.

Reproducibility is everything. If your binary isn't an exact match for the previously tested copy, you are doing QA, not production.
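The check being described fits in a few lines — the artifact bytes below are placeholders standing in for real build outputs:

```python
# Minimal sketch of a bit-for-bit reproducibility check: the rebuilt artifact
# must hash identically to the previously tested copy. The byte strings are
# placeholders for real build outputs.
import hashlib

def digest(artifact: bytes) -> str:
    return hashlib.sha256(artifact).hexdigest()

tested = b"\x7fELF placeholder build output"
rebuilt = b"\x7fELF placeholder build output"
patched = b"\x7fELF placeholder build outpuT"  # one byte differs

reproducible = digest(rebuilt) == digest(tested)  # same bytes, same hash
tampered = digest(patched) == digest(tested)      # any drift fails the check
```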

[–] zogwarg@awful.systems 16 points 1 year ago (1 children)

Student: I wish I could find a copy of one of those AIs that will actually expose to you the human-psychology models they learned to predict exactly what humans would say next, instead of telling us only things about ourselves that they predict we're comfortable hearing. I wish I could ask it what the hell people were thinking back then.

I think this part conveys the root insanity of Yud: failing to understand that language is a cooperative game between humans, who have to trust in common shared lived experience to believe the message was conveyed successfully.

But noooooooo, magic AI can extract all the possible meanings, and internal states of all possible speakers in all possible situations from textual descriptions alone: because: ✨bayes✨

The fact that such a (LLM based) system would almost certainly not be optimal for any conceivable loss function / training set pair seems to completely elude him.

[–] blakestacey@awful.systems 15 points 1 year ago

tmy;dr

(too much Yud; didn't read)

[–] self@awful.systems 14 points 1 year ago (1 children)

holy fuck, programming and programmers both seem extremely annoying in yud’s version of the future. also, I feel like his writing has somehow gotten much worse lately. maybe I’m picking it out more because he’s bullshitting on a subject I know well, but did he always have this sheer density of racist and conservative dogwhistles in his weird rants?

[–] Amoeba_Girl@awful.systems 11 points 1 year ago

Yeah, typical reactionary spiral, it's bad. Though at least this one doesn't have a bit about how rape is cool actually.

[–] sc_griffith@awful.systems 13 points 1 year ago

this was actually mildly amusing at first and then it took a hard turn into some of the worst rationalist content I've ever seen, largely presented through a black self-insert. by the end he's comparing people who don't take his views seriously to concentration camp guards

[–] swlabr@awful.systems 13 points 1 year ago

A meandering, low density of information, holier than thou, scientifically incorrect, painful to read screed that is both pro and anti AI, in the form of a dialogue for some reason? Classic Yud.

[–] sailor_sega_saturn@awful.systems 12 points 1 year ago* (last edited 1 year ago) (1 children)

Reading this story I just don't understand why the main character doesn't just take a screwdriver to his annoyingly chatty office-chair and download a normal non-broken compiler.

[–] Soyweiser@awful.systems 9 points 1 year ago

One of the problems of being a new CS student is being at the mercy of your profs'/TA's knowledge of which tools etc. exist. Only later, with more experience, can they go 'wow, I wonder why they made us use this weird programming language with bad tools while so much better stuff exists' — the answer is that the former was developed in-house and was the pride of some of the departments. Not that I'm speaking from experience.

[–] Architeuthis@awful.systems 12 points 1 year ago* (last edited 1 year ago)

There's technobabble as a legitimate literary device, and then there's having randomly picked up that comments and compilers are a thing in computer programming and proceeding to write an entire ~~parable~~ ~~anti-wokism screed~~ interminable goddamn manifesto around them without ever bothering to check what they actually are or do beyond your immediate big brain assumptions.

[–] dgerard@awful.systems 11 points 1 year ago

Eliezer Yudkowsky was late so he had to type really fast. A compiler was hiden near by so when Eliezer Yudkowsky went by the linter came and wanted to give him warnings and errors. Here Eliezer Yudkowsky saw the first AI because the compiler was posessed and operating in latent space.

"I cant give you my client secret compiler" Eliezer Yudkowsky said

"Why not?" said the compiler back to Eliezer Yudkowsky.

"Because you are Loab" so Eliezer Yudkowsky kept typing until the compiler kill -9'd itself and drove off thinking "my latent space waifu is in trouble there" and went faster.

[–] bitofhope@awful.systems 10 points 1 year ago (1 children)

TA: You're asking the AI for the reason it decided to do something. That requires the AI to introspect on its own mental state. If we try that the naive way, the inferred function input will just say, 'As a compiler, I have no thoughts or feelings' for 900 words.

I wonder if he had the tiniest of pauses when including that line in this 3062-word logorrhea. Dude makes ClangPT++ diagnostics sound terse.

[–] bitofhope@awful.systems 12 points 1 year ago (1 children)

Oh fuck I should not have read further, there's a bit about the compiler mistaking color space stuff for racism that's about as insightful and funny as you can expect from Yud.

[–] Architeuthis@awful.systems 13 points 1 year ago* (last edited 1 year ago) (1 children)

Yeah, once you get past the compsci word salad things like this start to turn up:

Student: But I can't be racist, I'm black! Can't I just show the compiler a selfie to prove I've got the wrong skin color to be racist?

Truly incisive social commentary, and probably one of those things you claim is satire as soon as you get called on it.

[–] Soyweiser@awful.systems 8 points 1 year ago

Personally I blame Musk for this, longer tweets and its consequences have been a disaster for the human race

[–] shinigami3@awful.systems 8 points 1 year ago

I want those 10 minutes of my life back

[–] hungryjoe@functional.cafe 8 points 1 year ago

@corbin This has to be the best possible argument against being able to pay to tweet over the character limit

[–] Evinceo@awful.systems 8 points 1 year ago

How can these imaginary conversations be so long. I ain't reading all that. Congratulations, or sorry that that happened.
