this post was submitted on 11 Nov 2024
48 points (98.0% liked)

philosophy


I don’t know how there aren’t a myriad of problems associated with attempting to emulate the brain, especially with the end goal of destroying livelihoods and replacing one indentured servant with another. In fact, that’s what prompted this post: an advertisement for a talk hosted by my alma mater’s philosophy department asking what happens if LLMs discover phenomenological awareness.

I admit that I don’t have a ton of formal experience with philosophy, but I took one course in college that will forever be etched into my brain. Essentially, my professor explained to us the concept of a neural network and how with more computing power, researchers hope to emulate the brain and establish a consciousness baseline with which to compare a human’s subjective experience.

This didn’t use to be the case, but in a particular sector, most people’s jobs now amount to showing up to work, getting on a computer, and having whatever (completely unregulated and resource-devouring) LLM hand them answers they could find themselves, faster. And shit like Neuralink exists, and I think the next step will be to offer that with a ChatGPT integration or some dystopian shit.

Call me crazy, but I don’t think humans are as special as we think we are, and our pure arrogance wouldn’t stop us from creating another self and causing that self to suffer. Hell, we collectively decided to slaughter en masse another group of feeling beings (animals) to appease our tastebuds, a lot of us are thoroughly entrenched in our digital boxes because opting out means losing things we take for granted, and any discussion of these topics is taboo.

Data-obsessed weirdos are a genuine threat to humanity. Consciousness emulation never should have been a conversation piece in the first place without first understanding its downstream implications. Feeling like a certified Luddite these days.

[–] NuraShiny@hexbear.net 6 points 1 month ago* (last edited 1 month ago) (1 children)

"Oh no my phone keyboard knows the next word I want to type almost like it's intelligent!"

These are not intelligent programs. They have no memory of the past beyond a few previous prompts. You are giving these tech bros way too much credit. It's like snake oil. Of course Doctor Health's snake oil can cure all ailments! Please buy it now, while it's this cheap! We just need to get to the step where we tar and feather these fucks for their lies.
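For what it's worth, the "phone keyboard" comparison can be made concrete. Here's a toy sketch (my own illustration, not how any real keyboard or LLM actually works): count which word follows which in a tiny corpus, then always suggest the most frequent follower. Real predictors are vastly more sophisticated, but the underlying task, predicting the next token from what came before, is the same.

```python
# Toy next-word predictor: a bigram frequency table over a tiny corpus.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept".split()

# For each word, count every word that immediately follows it.
followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word`, or None if unseen."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" -- it follows "the" twice, "mat" only once
```

No understanding anywhere in there, just counting; the debate is whether scaling that basic idea up ever amounts to more.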

[–] Saeculum@hexbear.net 2 points 1 month ago (2 children)

LLMs almost certainly aren't going to be where it eventually comes from, but I have no doubt we'll get there some day.

[–] spicehoarder@lemm.ee 1 points 1 month ago

LLMs are the Vacuum Tubes of AI, we just need to invent the transistor.

[–] NuraShiny@hexbear.net 1 points 1 month ago (1 children)

No doubt? Okay. What makes you so sure?

[–] Saeculum@hexbear.net 1 points 1 month ago* (last edited 1 month ago) (1 children)

Nature has already shown us that it's possible, and anything nature can do, we can iterate and improve upon.

[–] NuraShiny@hexbear.net 1 points 1 month ago

Nature has not shown that AI is possible.