pyrex

joined 6 months ago
[–] pyrex@awful.systems 6 points 4 months ago* (last edited 4 months ago) (2 children)

You've pegged me OK! I know how I want to feel about my writing; wanting it hasn't made it happen. Telling myself "well, this is the emotion I should have" hasn't changed the emotions I do have, and telling myself "time to not eat" doesn't make me starve any less.

In the past I've tried to mutilate the impulse out of my own brain, but I think it mostly made me hate myself. Right now I'm doing the experiment of admitting -- I'm probably going to crave adoration until I die -- and asking "OK, what happens next?"

On Scott -- as far as I can tell, Scott's playing a version of the "debate in good faith" game. The rules are that you only say things you believe, and when someone convinces you of something, you admit it.

Every philosopher in the world, good or bad, plays a version of this game. A third, secret rule is always implicit, and it takes the form of an answer to the question: "When do I become convinced of something?"

How Scott answers this question is clearly part of his success and a key commonality with his audience. He's willing to state strong belief in things he hasn't thought very much about, and he's unusually easy to convince. I assume that whatever rules are etched in his brain, similar rules are etched in his audience's brains.

He plays the game the way he likes it, other people are moved, and I'm not -- so the particular rules in his head clearly aren't the same ones in mine. Or at least I've decided not to be moved by this particular guy. I also think people say they've been moved when they haven't, as a rhetorical strategy -- Marc Andreessen says he's only just now becoming a Republican. Scott's commenters act as if they've only just now considered that eugenics might be the answer.

In other responses I've offered some opinions on why he would choose to play this particular game: I think the way he happens to play the game is a second-order phenomenon of "the extreme ambivalence of wanting to hold terrible social attitudes and strong belief in your own personal virtue at the same time." I think you observe this: "enabling psychiatrist [...] happy to overmedicate his patients" is a good figurative characterization.

(Actually, is it literally true? It feels like it would be invasive to check.)

[–] pyrex@awful.systems 5 points 4 months ago* (last edited 4 months ago) (1 children)

I don't think you sent this to me personally, but it has been sent to me. I still like it quite a bit. I reread it now to make sure of that!

I think your summary (and additional analysis) is pretty accurate. I think I would add a few things:

  • He's not being evil in every post. Some of the posts are OK.
  • [Elizabeth Sandifer observes this.] He tends to compare a bad argument to a very bad argument, and he's usually willing to invite snark or ridicule.

There's a crunchy systemic thing I want to add. I'm sure Elizabeth Sandifer gets this; it's just not rhetorically spotlit in her post --

A lot of people who analyze Scott Alexander have difficulty assigning emotional needs to his viewers. Consider that Scott Alexander chose to align himself with Gamergate supporters in his feminism post: Gamergate isn't a thing you do when you're in a psychologically normal place.

An old Startup Guy proverb says you should "sell painkillers, not vitamins" -- you want people to lurch for your thing when they're doing badly, because you're the only thing that will actually solve their problem. When people treat Scott Alexander's viewers as smug, psychologically healthy startup twits, they turn those viewers' engagement with him into something optional, a thing his audience could give up or substitute at any time. By this account, his influence is vitamin-like.

This makes the tech narcissists seem oddly stronger than normal people, who are totally distorted by their need for approval. We treat them as permanent twisted reflections of normal people, and therefore act as if no funhouse mirror is needed to distort them further. We make the even more fundamental error of treating them as if they know who they are.

This is how I think it actually works: the narcissists you meet are not completely different from you. They're not unmoored from ethics or extremely sadistic. They're often extremely ambivalent -- there's a clash of attitudes in their heads that prevents them from taking all the contradictory feelings inside them and reifying them as an actual opinion.

From what I can tell, Scott is actually extremely effective at solving the problem of "temporarily feeling like a horrible person." He's specifically good at performing virtue and kindness when advocating for especially horrible views. He's good at making the thing you wanted to do anyway feel like the difficult last resort in a field of bad options.

I'll admit -- as a person with these traits, this is another place where the basis for my analysis seems completely obvious to me, yet I see an endless dogpile of nerds who seem to willfully not engage with it. I assume they've thought of it, find it convincing on some level, and therefore make a significant effort to repress it. If I'm going to be conceited for a moment, though, this is probably expecting both too much intelligence and too much conventionally narcissistic behavior from my audience, who are, demographically, the same people who thought Scott was brilliant in the first place.

[–] pyrex@awful.systems 5 points 4 months ago (2 children)

Hey, thank you! Actually, as a person who can produce extremely large amounts of coherent text really fast, I find this oddly reassuring. I have a limited number of things to say but I can certainly say them a lot.

I might be overestimating how much of his success is him, but look at the situation as you've drawn it: he's not in a fishbowl with 40 million readers, he's in a fishbowl with 40 similar fish. He's the biggest one. Well, how did that happen? The other 39 fish would like to know.

[–] pyrex@awful.systems 5 points 4 months ago (2 children)

Hey! Thank you for liking the things I write!

I think you're right that neither early-stage nor late-stage Scott is doing the thing I implied I should be doing (exaggerated, hamfisted system-building arranged around eventual predictions of doom). A thing I didn't mention: I wrote an article in this style on a throwaway on LessWrong years ago and they totally ignored it. So I still don't know if they hated it or if it just wasn't their deal.

Soupy vague praise of powerful people is a separate thing he also seems to do, which you have clearly noticed. I don't think it's the only thing he does.

(What does he do? I'm systematically responding to everyone here, so I won't paraphrase other people's comments on what he does and will instead respond to them directly as I get to their posts.)

Anyway: I refuse to act as if he's bad at the thing he's doing. Even the people who criticize him generally refuse to summarize him accurately, which is the behavior of people who have recognized that someone else's rhetoric has power over them and don't like it.

I'm also not sure yet if I'm unwilling to do it myself. One: I'm the cofounder of a startup. Doing what he does means more money for me. Two: right now I'm chewing on 8 responses to my post, so I'm "hungry" but not starving. Ask me what I'm selling in a week and my catalogue may have changed.

(PS: It might interest you to know that the original draft of this OP was about Paul Graham! I switched the mentioned figure to Scott Alexander because I had more to say about him and everyone here hates him more.)

[–] pyrex@awful.systems 3 points 4 months ago

Actually, as a furry, I'm obligated not to hate this.

[–] pyrex@awful.systems 2 points 4 months ago (3 children)

This seems bleak but not inaccurate. Not a big fan of it. I'll be economical by not explaining why.

[–] pyrex@awful.systems 3 points 4 months ago (1 children)

Ack, I meant to go around responding to everyone and I missed this one! Hope it was good.

[–] pyrex@awful.systems 7 points 4 months ago* (last edited 4 months ago)

Before I was posting about tech on the internet I was posting about philosophy. I don't know enough about philosophy to be good at it -- I've read almost nothing -- but I noticed you could get pretty far by saying "Kant probably didn't have anything valuable to say -- he was a massive racist." A balm for people who are looking for an excuse not to have read Kant.

My bleak theory is that to be convincing I'd have to switch to calculatedly mediocre text deliberately orchestrated to be unsurprising. My experience is that when an extremely successful article contains genuine insight, it separately contains an absolutely mediocre take that is the real explanation for why it went viral.

Let's start with "Scott is a bigot" as an example claim. That's true, but the evidence is basically just a bland admission of "yeah." Nobody can spin that into a detailed and personal story about how Scott got mindhacked, which is the single part of Scott Alexander's bigotry that can be discussed at a level interesting to bored idiots. Discussing his bigotry directly would make the problem obvious: he hasn't stated any takes that aren't incredibly commonplace for tech-adjacent eugenics losers, and he has waffled publicly about whether or not to disavow even those stances.

What options are left? I could write a history of the ideas involved and risk boring people to sleep: such a story would contain basically zero concrete events, because we only have his distant past-tense account of how he came to his current conclusions. Or I could write something wildly speculative and commit defamation: "here's how it might have happened: a fictionalized account of how a mediocre person became racist." Or I could go into hyperbole: Eliezer Yudkowsky is Scott Alexander is Mencius Moldbug is George Lincoln Rockwell.

Would that last post do OK? I'm afraid to try it: one, because I'm afraid it wouldn't and I'd feel like more of a failure; two, because I'm afraid it would.

These are opinions I don't like having about other people, but they feel increasingly vindicated when I look at what text performs well on Reddit, and when I observe the basically zero correlation between the topic of an article and the text of its responses. I've seen an enormous number of successful posts that can be summarized as "the author presents their grand unifying theory of X, with the understanding that the reader will never attempt to apply it to examples outside the post."

[–] pyrex@awful.systems 10 points 4 months ago* (last edited 4 months ago)

I don't understand why people take him at face value when he claims he's always been a Democrat up until now. He has historically made large contributions to candidates from both parties, generally more to Republicans than to Democrats, as well as to Republican PACs like Protect American Jobs. Here is his personal record.

Since 2023, he has donated ~$20,000,000 to Fairshake, a crypto PAC that predominantly funds candidates running against Democrats.

Has he moved right? Sure. Was he ever on the left? No: this is the donation record of someone who wants to buy power from candidates of both parties. If it implies anything, it implies he currently finds Republicans to be corruptible.

[–] pyrex@awful.systems 3 points 4 months ago

The plan isn't totally serious, but the worldview I'm promoting, which you seem to be picking up on, actually is serious.

The observation I have made is that most people in positions of power were selected by people in previous positions of power, usually for their affability and willingness to comply. Most of the most powerful people I have met were total conformists in practically every way, although they usually had high general intelligence.

[–] pyrex@awful.systems 3 points 4 months ago

Oh. I don't know how to get other people to vote better. I know things about software, I guess!

[–] pyrex@awful.systems 3 points 4 months ago

Stuck in my brain: I used to work at a dating site for Indian people, and one of the things we tried was "LLM-generated pickup lines."

I don't remember most of them, and we never made the feature public, but one sticks out in my mind as the most incomprehensible.

Guy-to-girl:

What's your remedy for a Bollywood love affair?
