this post was submitted on 18 Mar 2024
26 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Feel like you want to sneer about something but you don't quite have a snappy post in you? Go forth and be mid!

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post; there's no quota here and the bar really isn't that high

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

50 comments
[–] gerikson@awful.systems 7 points 8 months ago (12 children)

Vernor Vinge, patron saint of the singularity and noted winner of a couple of libertarian fiction awards, has passed. HN wanted a black bar[1] but were denied. They compensated by posting a lot of bad takes.

Submission: https://news.ycombinator.com/item?id=39775304

'jart (aka Justine) has this to say about the impending Singularity (feat. Trump and Twitter!):

https://news.ycombinator.com/item?id=39777929

This hackernews wishes he had been frozen, to be thawed in the future (for some reason, no one expects the future to be Idiocracy):

https://news.ycombinator.com/item?id=39776217


[1] when a notable person in CS dies there's sometimes a black bar under the HN header

[–] self@awful.systems 7 points 8 months ago (9 children)

fucking christ. it takes a lot to fuck up my day, but a quick scroll through that thread, seeing how quickly these vultures (including one notable person who's the reason I'm ashamed to talk about my lambda calculus projects) are trying to capitalize on Vernor's legacy, is absolutely doing it

HN wanted a black bar[1] but were denied.

why in the fuck? is the famous sci-fi author with a heavy CS background not notable enough for the standards of the site whose creator is a much less notable self-help author whose CS background is failing to make a working Lisp 3 times and writing programming textbooks nobody reads?

[–] Soyweiser@awful.systems 6 points 8 months ago* (last edited 8 months ago)

The replies to somebody who aggressively (and to downvotes) pointed out some of the flaws in Jart's post are bad. Damn.

an LLM cannot be used to create a better LLM

By that logic most humans are also not intelligent.

No you dweeb, they are talking about model collapse, that thing that happens to this '90s tech.

Oh, it doesn't work? That's because IT'S NOT INTELLIGENT.

Ok, let's run this test of "real intelligence" on you. We eagerly await to see your model. Should be a piece of cake.

This is both a weird ad hom and a god-of-the-gaps style argument. (While I have some sympathy for this Peter Watts style argument, it is incredibly weak (their post history (8) is more of this very weak stuff).)

Edit: forgot to mention what I actually initially wanted to say:

Idiocracy

I still think that movie is actually quite hopeful: it shows us a world where the current consumerist society in the USA continues to exist for 500 years (that is how long he is frozen), and nobody invades and takes over when the USA becomes this automated and dumb. We should all hope the future is this hopeful (guess they fixed the climate and achieved fully automated luxury capitalism (which still sucks)) and non-violent.

[–] dgerard@awful.systems 6 points 8 months ago (1 children)

just realised that Musk's angry face in the Don Lemon interview is the same face as an angry Skibidi Toilet

[–] swlabr@awful.systems 6 points 8 months ago* (last edited 8 months ago) (1 children)

CW: Palestine-Israel.

Something I’ve observed is various members of the rats being zionist/pro-israel/anti-palestine. This doesn’t really surprise me, but that’s not what I’m commenting on.

One topic that occasionally gets discussed is Israel’s use of “AI” in warfare. I’m only going to link one source and I’m not going into it too deeply.

So what I'm wondering is: this seems to pattern-match to one of the great AI doom narratives, i.e. AGI* being given control of military assets; so where is the rationalist outrage? I searched LessWrong for mentions of Israel and the IDF but turned up empty-handed.

To be clear: this is a request for submissions. I am not rhetorically sneering at what I perceive as a hypocritical lack of outrage; to do so would require proof. That being said, it's probably clear from this comment that my mind is constructing that narrative.

*of course I am not saying that whatever the IDF is saying is AI is AGI, as AGI is not real.

[–] skillissuer@discuss.tchncs.de 6 points 8 months ago (1 children)

the old thing was the use of ai in iron dome, because generally mistaking a civilian target for a rocket is pretty hard given the differences in size and speed, and autonomous air defense is a thing that has already existed for decades. the ai part comes from predicting whether a given rocket will fall on a populated area, and pointing radar-guided interceptors at those that will
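The triage step described above (extrapolate each rocket's trajectory, engage only those predicted to hit a populated area) can be sketched as a toy model. This is purely illustrative: flat ground, 2D, no drag, and every name and number here is invented; it bears no relation to how the real system works.

```python
import math

# Hypothetical sketch of intercept triage, NOT the actual system.
# Assumes flat ground (impact at y = 0), no air resistance, 2D motion.

def predict_impact_x(x, y, vx, vy, g=9.81):
    """Predict the ground x-coordinate where a projectile at (x, y)
    with velocity (vx, vy) will land, under simple ballistics."""
    # Solve y + vy*t - 0.5*g*t**2 = 0 for the positive time-of-flight.
    t = (vy + math.sqrt(vy * vy + 2.0 * g * y)) / g
    return x + vx * t

def engage(x, y, vx, vy, populated_zones):
    """Task an interceptor only if the predicted impact point falls
    inside one of the populated (lo, hi) intervals."""
    impact = predict_impact_x(x, y, vx, vy)
    return any(lo <= impact <= hi for lo, hi in populated_zones)

# a rocket headed for a populated interval gets engaged,
# one that falls short is left alone
print(engage(0.0, 0.0, 100.0, 100.0, [(2000.0, 2100.0)]))  # True
print(engage(0.0, 0.0, 10.0, 100.0, [(2000.0, 2100.0)]))   # False
```

The point of the triage is the second function: interceptors are expensive, so anything predicted to land in an empty field is simply ignored.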

the new thing, well, nobody can tell you that you're doing things wrong if nobody knows for sure what you are doing. to even tell whether it's working or not you'd need to sit in the heads of israeli military planners and know what their exact objectives and acceptable collateral damage are

people at palantir are probably making very detailed notes
