this post was submitted on 20 Dec 2024
636 points (99.4% liked)

[–] kate@lemmy.uhhoh.com 7 points 1 week ago

no, stupid questions!

[–] geneva_convenience@lemmy.ml -5 points 1 week ago (2 children)

Not going to lie, LLMs give pretty solid medical advice. They are trained on huge medical datasets. It works well as long as you have a common problem.

[–] Umbrias@beehaw.org 4 points 1 week ago* (last edited 1 week ago) (1 children)

they do not, and saying this sort of thing actually harms people. do not trust an llm with anything medical, ever.

llms have no conception of truth (or of anything); you are getting probabilistic bullshit, in a literal sense.
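
a toy sketch, in python, of what "probabilistic" means here: generation is just sampling the next token from a learned distribution, and nothing in that step checks truth. the vocabulary and probabilities below are made up for illustration, not taken from any real model.

```python
import random

# pretend "model": a fixed next-token distribution for one prompt.
# a real llm learns these weights from text statistics; the sampling
# step below never consults any notion of medical truth.
NEXT_TOKEN_PROBS = {
    "the usual adult dose is": {"500mg": 0.45, "50mg": 0.30, "5g": 0.25},
}

def sample_next(prompt: str) -> str:
    dist = NEXT_TOKEN_PROBS[prompt]
    tokens = list(dist.keys())
    weights = list(dist.values())
    return random.choices(tokens, weights=weights)[0]

# the same prompt can complete to a plausible answer or a dangerous
# one, purely by chance:
for _ in range(5):
    print("the usual adult dose is", sample_next("the usual adult dose is"))
```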

[–] geneva_convenience@lemmy.ml 3 points 1 week ago* (last edited 1 week ago) (1 children)

You do not just trust them. Fact-check their recommendations. Check the diagnosis and see whether the symptoms match on Google. And if it is something possibly grave, go to a doctor.

But many people do not feel like going to the doctor for every small thing they wonder about.

[–] Umbrias@beehaw.org 1 points 1 week ago (1 children)

the effort to fact check the bullshit machine is exactly the same as not using the bullshit machine in the first place.

[–] geneva_convenience@lemmy.ml 2 points 1 week ago (1 children)

No, it helps suggest the same answer faster.

One could just as well argue: who needs Google when you have books?

[–] Umbrias@beehaw.org 2 points 1 week ago* (last edited 1 week ago) (1 children)

except the "answer" is likely to be wrong, such that the same search needs to be done sans ai to verify it. you do not understand ai if you think it provides the same answers; these are probabilistic bullshit machines. Google does not provide answers (well, it didn't used to, when it was better); it catalogued places to find them. your analogy is more accurately "who needs libraries when you have books".

Do not ever use an llm for medical advice; doing so will literally harm and kill people. if you want bullshit, just make some up, you don't even need the llm for that.

i will repeat this in no uncertain terms: an llm cannot provide information, and trusting one with your healthcare in any capacity is Stupid. it is stupid, harmful, and will kill people. if you value your health at all, just use normal web searches to get info from medical websites, call nurse hotlines, use urgent care, or call doctors' offices for advice or appointments.

[–] geneva_convenience@lemmy.ml 2 points 1 week ago* (last edited 1 week ago) (1 children)

My dude, you don't need to believe all the medical advice it gives. Just use it to get a feel for the right direction, and then check whether the symptoms match. It will often suggest multiple options and rate them by likelihood.

I do not think you understand how much training these LLMs have on medical material. They can accurately diagnose almost any common disease; it is not like WebMD, which always suggests you have stage 5 cancer.

Seriously, try it once before going full anti-AI mode.

[–] Umbrias@beehaw.org 2 points 1 week ago (1 children)

You quite literally cannot trust them; the information entropy of their output is too high. I understand how much training they have on medical text; you don't understand how little that means. These models are fundamentally incapable of assessing the truth of a statement. You are using something you don't even understand to get advice it cannot reliably give, you lack the expertise to judge how accurate any given answer actually is, and all of this on a topic that directly influences your physical wellbeing!
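
To make "information entropy is too high" concrete, a toy sketch (the answer distributions are invented for illustration, not measured from any model):

```python
import math

def entropy_bits(dist: dict[str, float]) -> float:
    """Shannon entropy, in bits, of a distribution over answers."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# A reliable source concentrates probability on one answer; a
# probabilistic text generator spreads it over plausible-sounding ones.
reliable  = {"correct answer": 0.97, "wrong answer": 0.03}
llm_style = {"answer a": 0.40, "answer b": 0.30, "answer c": 0.30}

print(entropy_bits(reliable))   # ~0.19 bits: nearly deterministic
print(entropy_bits(llm_style))  # ~1.57 bits: one sample tells you little
```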

"just try it bro it's good i promise" you should actually prompt an llm about a topic you know about in detail. the amount of errors are rampant, then apply that same inaccuracy to topics you know nothing about.

my next recommendation: since you are not a healthcare professional, do not give medical advice like "use an llm", as you personally cannot verify the accuracy of llms for this role.

[–] geneva_convenience@lemmy.ml 2 points 1 week ago (3 children)

If you want to visit a doctor for every minor thing, feel welcome. So far LLMs have correctly predicted every health issue I have had and provided better and more accurate information than the doctor visit afterwards.

This does not mean they are infallible. But you can easily check what they suggest and see whether the symptoms match other websites and the doctor's description.

[–] GrammarPolice@lemmy.world 1 points 1 week ago (1 children)

Oops, you triggered the AI hate squad

[–] Psythik@lemmy.world 3 points 1 week ago

I love AI. I use the new AI Sidebar in Firefox every single day.
