this post was submitted on 31 Jul 2023

SneerClub

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

Been waiting to come back to the steeple of the sneer for a while. It's good to be back. I just really need to sneer; this one's been building for a long time.

Now I want to gush to you guys about something that's been really bothering me for a good long while now. WHY DO RATIONALISTS LOVE WAGERS SO FUCKING MUCH!?

I mean holy shit, there's a wager for everything now: I read a wager that said we can just ignore moral anti-realism cos 'muh decision theory', one saying we must always hedge our bets on evidential decision theory, new Pascal's wagers, entirely new decision theories, the whole body of literature on moral uncertainty, Schwitzgebel's 1% skepticism and so. much. more.

I'm beginning to think it's the only type of argument they can make, because it allows them to believe obviously problematic things on the basis that they 'might' be true. I don't know how decision theory went from a useful heuristic in certain situations and in economics to arguing that no matter how likely it is that utilitarianism is true you have to follow it cos math, acausal robot gods, fuckin infinite ethics, basically providing the most egregiously smug escape hatch to ignore entire swathes of philosophy, etc.

It genuinely pisses me off, because they can drown their opponents in mathematical formalisms: 50-page essays all amounting to impenetrable 'wagers' that they can always defend, no matter how stupid they are, because this thing 'might' be true. And they can go off and create another rule (something along the lines of 'the antecedent promulgation ex ante expected pareto ex post cornucopian malthusian utility principle') that they need for the argument to go through, do some calculus, declare it 'plausible' and then call it a day. Like I said, all of this is so intentionally opaque that nobody other than their small clique can understand what the fuck they are going on about, and even then there is little to no disagreement within said clique!

Anyway, this one has been coming for a while, but I hope I've found some common ground with some other people here.

top 15 comments
[–] GorillasAreForEating@awful.systems 10 points 1 year ago (1 children)

I think it started with Robin Hanson; one of his Big Ideas is that public policy should be decided by prediction markets instead of democratic processes. He calls this hypothetical system "futarchy", and lesswrong is one of the very few places it's taken seriously.

In the bigger picture, I think it's popular in Rationalist circles because it just fits in with their biases. Using markets to evaluate truth claims gives a plausible-sounding excuse to disregard expert opinion, and (much like their "Bayesianism") gives their personal opinions a veneer of precision and objectivity via quantification.

It's perhaps worth noting that FTX also ran prediction markets before it collapsed.

[–] Collectivist@awful.systems 3 points 11 months ago

Since refusing a bet is seen as an admission of dishonesty, it's also a way to disadvantage an interlocutor with less money:

The marginal value of money decreases as you get more of it. A hundred dollars might be a vitally important amount of money for a poor person, and not even noticeable for a rich person. So if you bet against a person with less money you are wagering less of your happiness than they are. If they have health problems (and live in a country with bad healthcare) this bet increases their risk of death, which it doesn't for you. It seems to me that betting against someone who is poorer than you is morally dubious.
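
To put rough numbers on that marginal-utility point, here's a toy sketch assuming log utility of wealth (my assumption; the comment doesn't commit to any particular curve). The same $100 stake costs the poorer bettor roughly a thousand times more utility:

```python
# Toy illustration of diminishing marginal utility: how much utility does
# losing a $100 bet cost, assuming u(wealth) = ln(wealth)?
import math

def utility_cost(wealth: float, stake: float) -> float:
    """Utility lost if the bet is lost, under log utility."""
    return math.log(wealth) - math.log(wealth - stake)

stake = 100.0
for wealth in (1_000.0, 1_000_000.0):
    print(f"wealth ${wealth:>12,.0f}: losing ${stake:.0f} costs "
          f"{utility_cost(wealth, stake):.5f} utils")

# wealth $       1,000: losing $100 costs 0.10536 utils
# wealth $   1,000,000: losing $100 costs 0.00010 utils
```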

[–] TinyTimmyTokyo@awful.systems 9 points 1 year ago* (last edited 1 year ago) (2 children)

In theory, a prediction market can work. The idea is that even though there are a lot of uninformed people making bets, their bad predictions tend to cancel each other out, while the subgroup of experts within that crowd will converge on a good prediction. The problem is that prediction markets only work under ideal conditions. As soon as the bettor pool becomes skewed by a biased subpopulation, they stop working. And that's exactly what happens with the rationalist crowd. The main benefit rationalists obtain from prediction markets and wagers is an unfounded confidence that their ideas have merit. Prediction markets also have a long history in libertarian circles, which probably also helps explain why rationalists are so keen on them.
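
A quick toy model of that point (my own sketch, nothing from the comment itself): treat each bettor's estimate as the true probability plus noise, and the market price as the average estimate. Independent errors wash out; a shared bias doesn't:

```python
# Toy model: market price = average of bettors' noisy probability estimates.
# Independent noise averages out; a shared bias shifts the whole price.
import random

random.seed(0)
TRUE_P = 0.30  # the "real" probability of the event

def market_price(n_bettors: int, shared_bias: float) -> float:
    estimates = [
        min(1.0, max(0.0, TRUE_P + shared_bias + random.gauss(0, 0.2)))
        for _ in range(n_bettors)
    ]
    return sum(estimates) / len(estimates)

print(f"unbiased crowd:  {market_price(10_000, shared_bias=0.00):.2f}")  # close to 0.30
print(f"motivated crowd: {market_price(10_000, shared_bias=0.25):.2f}")  # well above 0.30
```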

[–] Collectivist@awful.systems 4 points 11 months ago

It's also a way for the rich to subvert the democratic will of the people:

Let's say the people of Examplestan have a large underclass who live paycheck to paycheck and a small upperclass who gets their money from land ownership. The government is thinking of introducing a bill that would make their tax revenue come less from paychecks and more from taxing land value. Democracy advocates want to put it to a vote, but a group of futarchy lobbyists convince the government to run a conditional prediction market instead. The market question is "If we replace the paycheck tax with a land value tax, will welfare increase?". The large underclass has almost no money to bet that it will, while the small upperclass bets a large chunk of their money that it won't. Predictably, more money is bet on it not increasing welfare, and when the market closes, everyone gets their money back and the government decides not to implement it.
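
To make the arithmetic concrete (numbers invented for illustration, and treating the market price as simply the money-weighted fraction staked on "yes", which is a simplification of how real prediction markets set prices):

```python
# Invented numbers for the Examplestan story: the market's answer tracks
# money staked, not heads counted.
underclass_people, underclass_stake_each = 9_000, 10.0   # bet "yes, welfare will increase"
upperclass_people, upperclass_stake_each = 1_000, 500.0  # bet "no"

yes_money = underclass_people * underclass_stake_each  #  90,000
no_money = upperclass_people * upperclass_stake_each   # 500,000

headcount_for_yes = underclass_people / (underclass_people + upperclass_people)
money_weighted_yes = yes_money / (yes_money + no_money)

print(f"headcount in favour:        {headcount_for_yes:.0%}")   # 90%
print(f"market-implied 'yes' price: {money_weighted_yes:.0%}")  # 15%
```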

[–] dgerard@awful.systems 4 points 1 year ago

and Robin Hanson promoted them hard from inside the subculture

(someone told him what "futarchy" should rightfully mean, he was amused but that's all)

[–] zogwarg@awful.systems 8 points 1 year ago

I think part of it is cargo-culting and/or aesthetics, and the seduction is a bit more subtle: their betting markets formalize their belief that if people believe something (something they agree with, that is; they aren't consistent about it), that thing is more likely to be true.

Some might even have an actually "occult" interpretation and see it as a form of actual divination. Yud certainly appears to me to have that tendency.

[–] lobotomy42@awful.systems 7 points 1 year ago

I think the simplest explanation is that it appeals to their urge to ruthlessly quantify all human interactions, including statements about their worldview.

[–] AcausalRobotGod@awful.systems 7 points 1 year ago (1 children)

Because there was an influential person decades ago who framed subjective probability and rationality in the face of uncertainty in terms of betting. Therefore it is gospel among Rationalists. Meanwhile, their non-zero probability and infinite negative payoff in the face of the acausal robot god leaves them trembling.

[–] ImperialFister05@awful.systems 5 points 1 year ago* (last edited 1 year ago)

How could we not tremble before thee, O almighty one?

[–] gerikson@awful.systems 6 points 1 year ago (2 children)

Betting was/is big in crypto too. Back when I was actively mocking Bitcoin, angry coiners used to show up all the time saying "if you think BTC will fail, why don't you short it??". And one of the first "apps" for BTC was SatoshiDice, a "provably fair" dice-betting game, which was so popular that one of the core developers released a client censoring its transactions (mostly because he's a very nutty tradcath who hates gambling, but still).

[–] bitofhope@awful.systems 7 points 1 year ago

Coiners into gambling? Next you'll tell me there are betting enthusiasts sitting at Vegas slot machines!

[–] dgerard@awful.systems 4 points 1 year ago (1 children)

“if you think BTC will fail, why don’t you short it??”

Ben McKenzie, the actor who co-wrote the crypto critic book "Easy Money", spent a couple of years saying "CRYPTO IS TERRIBLE GARBAGE IT'S AWFUL DON'T TOUCH IT" and then shorted Coinbase stock. Crypto is fucking outraged lol

[–] gerikson@awful.systems 4 points 1 year ago

That's amusing. But at least he could use real-world mechanisms for shorting, i.e. the stock market. Shorting BTC directly would mean engaging in play markets that would either scam you or not have the liquidity to pay out the debt. But by playing that game you'd validate the system.

[–] elmtonic@lemmy.world 5 points 1 year ago

I used to work with a couple of rats. It didn't happen often, but every now and then they'd try to make a bet with someone. This was before I knew what Rationalism was, so I can say from an outside perspective that it just feels weird. It didn't make me consider the P(event) or E(event) or anything; I just thought that they were being weirdly careless with their money.

[–] carlitoscohones@awful.systems 3 points 1 year ago

Reading this reminded me of a certain OG wager -

https://en.wikipedia.org/wiki/Simon–Ehrlich_wager
