titotal

joined 1 year ago
[–] titotal@awful.systems 5 points 1 year ago (7 children)

Hey, thanks so much for looking through it! If you're alright with messaging me your email or something, I might consult you on some more related things.

With your permission, I'm tempted to edit this response into the original post, it's really good. Have you looked over Yudkowsky's word salad in the EA forum thread? Would be interested in getting your thoughts on that as well.

[–] titotal@awful.systems 14 points 1 year ago

Thanks! I strive for accuracy, clarity, humility, and good faith. Aka, everything I learned not to do from reading the sequences.

[–] titotal@awful.systems 12 points 1 year ago* (last edited 1 year ago)

EA as a movement was a combination of a few different groups (this account says Giving What We Can/80,000 Hours, GiveWell, and Yudkowsky's MIRI). However, the main early influx of people came from the rationalist movement, as Yud had heavily promoted EA-style ideas in the sequences.

So if you look at surveys, right now a relatively small percentage (like 15%) of EAs first heard about it through LessWrong or SSC. But back in 2014 and earlier, LessWrong was the number one on-ramp into the movement (like 30%). (I'm sure a bunch of the people giving other answers heard about it from rationalist friends as well.) I think it would have been even higher if you go back earlier.

Nowadays, most of the recruiting is independent of the rationalists, so you have a bunch of people coming in and being like, what's with all the weird shit? However, they still adopt a ton of rationalist ideas and language, and the EA forum is run by the same people as LessWrong. It leads to some tension: someone wrote a post saying that "Yudkowsky is frequently, confidently, egregiously wrong", and it was somewhat upvoted on the EA forum but massively downvoted on LessWrong.

[–] titotal@awful.systems 2 points 1 year ago (1 children)

Do you have any links to this, out of curiosity? I looked a bunch and couldn't find any successor projects.

[–] titotal@awful.systems 2 points 1 year ago* (last edited 1 year ago) (1 children)

> what are the other ones?

I guess the rest of the experimental setup that recombines the photon amplitudes. Like if you put 5 extra beam splitters in the bottom path, there wouldn't be full destructive interference.

> when i’m thinking about splitter with pi/4 phase shift, i’m thinking about coupled line coupler or its waveguide analogue, but i come from microwave land on this one. maybe this works in fibers?

I'm not sure how you'd actually build a symmetric beam splitter: Wikipedia says you'd need to induce a particular extra phase shift on both transmission and reflection. (I'm fully on the theoretical physics side, so I'm not too familiar with the hardware.)
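Since it came up, here's a minimal numerical sketch of the amplitude bookkeeping, assuming the symmetric convention (reflection picks up a factor of i); the `BS` matrix and the leaky-arm model are just my illustration, not anything from the thread:

```python
import numpy as np

# Symmetric 50/50 beam splitter: reflection picks up a factor of i.
BS = np.array([[1, 1j],
               [1j, 1]]) / np.sqrt(2)

psi = BS @ np.array([1, 0])   # first splitter: photon spread over both arms

# Plain Mach-Zehnder: the second splitter recombines the amplitudes,
# and port 0 cancels exactly.
print(np.abs(BS @ psi)**2)    # -> [0. 1.]

# Five extra splitters in the bottom arm: each transmits only 1/sqrt(2)
# of the amplitude onward (the reflected part leaves the interferometer),
# so the arms arrive unbalanced and the cancellation is only partial.
leak = (1 / np.sqrt(2))**5
print(np.abs(BS @ (psi * np.array([1, leak])))**2)  # port 0 no longer zero
```

(The two outputs in the last line don't sum to 1 because the rest of the amplitude left through the extra splitters' unused ports.)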

[–] titotal@awful.systems 11 points 1 year ago (5 children)

If you want more of this, I wrote a full critique of his mangled intro to quantum physics, where he forgets the whole "conservation of energy" thing.

[–] titotal@awful.systems 5 points 1 year ago (2 children)

What I think happened is that he got confused by the half-mirror phase shifts (because there's only a phase shift if you reflect off the front of the mirror, not the back). Instead of asking someone, he invented his own weird system which gets the right answer by accident, and then refused to ever fix the mistake, saying that the alternate system is fine because it's "simpler".
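To illustrate with a toy calculation of my own (not something from his post): a physically motivated convention, where only a front-face reflection flips the phase, and an "i on every reflection, no normalization" rule happen to produce the same dark port, but the unnormalized version doesn't conserve probability:

```python
import numpy as np

# Physically motivated 50/50 splitter: phase flip only on front-face reflection.
BS_front = np.array([[1, -1],
                     [1,  1]]) / np.sqrt(2)
# The ad-hoc rule: i on every reflection, 1 on transmission,
# and no 1/sqrt(2) normalization.
BS_adhoc = np.array([[1, 1j],
                     [1j, 1]])

psi = np.array([1, 0])
print(np.abs(BS_front @ BS_front @ psi)**2)  # -> [0. 1.]: dark port, total 1
print(np.abs(BS_adhoc @ BS_adhoc @ psi)**2)  # -> [0. 4.]: same dark port, but
                                             #    total "probability" is 4
```

The relative phase between the two paths comes out the same either way, which is why the ad-hoc rule lands on the right interference pattern; the factor-of-4 total is the conservation-of-energy problem.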

[–] titotal@awful.systems 12 points 1 year ago

My impression is that the toxicity within EA is mainly concentrated in the bay area rationalists and in a few of the actual EA organizations. If it's just a local meetup group, it's probably just going to be some regular-ish people with mistaken beliefs who are genuinely concerned about AI.

Just be polite and present arguments, and you might actually change minds, at least among those who haven't been sucked too far into Rationalism.

[–] titotal@awful.systems 4 points 1 year ago (1 children)

Obvious reminder: do not assume that anonymous Tumblr posts are accurate. (This is the only post the Tumblr account made.)

Has anyone attempted a neutral unpacking of the mess of claims and counterclaims around Ziz and related parties?

[–] titotal@awful.systems 17 points 1 year ago* (last edited 1 year ago) (2 children)

I roll a fair 100-sided die.

Eliezer asks me to state my confidence that I won't roll a 1.

I say I am 99% confident I won't roll a 1, using basic math.

Eliezer says "AHA, you idiot, I checked all of your past predictions and when you predicted something with confidence 99%, it only happened 90% of the time! So you can't say you're 99% confident that you won't roll a 1"

I am impressed by the ability of my past predictions to affect the roll of a die, and promptly run off to become a wizard.
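(If anyone wants to check the basic math, here's a throwaway simulation of my own:)

```python
import random

# A fair 100-sided die comes up "not 1" with probability 99/100,
# regardless of anyone's forecasting track record.
rolls = [random.randint(1, 100) for _ in range(1_000_000)]
print(sum(r != 1 for r in rolls) / len(rolls))  # ~0.99
```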

[–] titotal@awful.systems 4 points 1 year ago

Yeah, I've been writing up critiques for a year or two now, collected over at my substack. I've been posting them to the EA forum and even Lesswrong itself and they've been generally well received.

[–] titotal@awful.systems 42 points 1 year ago (15 children)

As a physicist, this quote got me so mad I wrote an excessively detailed debunking a while back. It's staggeringly wrong.
