speaking as somebody who I don't think actually got any money directly from FTX Future Fund that I can recall
'All my firearms sank in a boating accident' energy.
Far as I can tell, none of them had the attention span for the BtB live show on Yudkowsky either, which is a shame because I actually learned quite a bit from it (being that I'm not going to slog through HPMOR, even out of spite).
Individualist mindrot
He keeps talking though. This is my favorite bit:
The part I don’t understand is that you feel Gawker was scum. Thiel removed Gawker’s ability to be scum. Thus, by logic, the world was improved. Isn’t that the core of what you’re saying you wish billionaires did? Improve the world?
Wish we'd gone with software carpenter or software plumber.
Most realistic Musk sales pitch.
Nobody tell these guys that the control problem is just the halting problem and first year CS students already know the answer.
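For anyone who missed that first-year lecture: the "answer" is undecidability, and the argument fits in a few lines. A minimal sketch of Turing's diagonalization below, where `halts` is a purely hypothetical oracle (the name and signature are made up for illustration; no such function can actually exist, which is the point).

```python
# Sketch of the classic halting-problem diagonalization.
# `halts(program, arg)` is a hypothetical oracle that supposedly
# returns True iff program(arg) eventually terminates.

def halts(program, arg):
    # Assume, for the sake of contradiction, that this could be implemented.
    raise NotImplementedError("no such total decision procedure exists")

def diagonal(program):
    # Do the opposite of whatever the oracle predicts about the
    # program run on its own source.
    if halts(program, program):
        while True:   # loop forever if the oracle says "halts"
            pass
    else:
        return        # halt immediately if the oracle says "loops"

# Feeding `diagonal` to itself forces a contradiction either way:
# if halts(diagonal, diagonal) is True, then diagonal(diagonal) loops;
# if it's False, then diagonal(diagonal) halts.
# Either way the oracle is wrong, so `halts` cannot exist.
```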
You'd be surprised how big the Libertarian/UBI overlap is.
At the most basic level, there's a pretty core libertarian belief that the government can't or shouldn't make decisions and individuals can or should. In this framework, giving money away to individuals will lead to it being spent better than if it was used for actual welfare. So although Libertarians tend to be anti-welfare, sometimes they make an exception for UBI, because it's still a free market solution.
Some progressives like it too, of course. Means testing is a burden, and limiting what you can spend (for example) WIC on can feel a lot like the haves trying to control the lives of the have-nots.
For TREACLES though, I think there's a more pathological element at play: they plan to put everyone out of work and need a way to avoid a torches and pitchforks scenario.
I was going to say that but couldn't find the article that told me.
Hasn't Future Perfect always been a TREACLES mouthpiece?
Most self-aware rationalist.
So close to being deprogrammed. So close. It's like when a kid finds out about the Easter Bunny but somehow still clings to Santa.
He links to this (warning: so long it has a whole 'why write this' section) article on Yudkowsky being wrong, which amuses me.
This, but for AI lol.
LW equivalent of fight me irl bro