this post was submitted on 15 Feb 2024
29 points (87.2% liked)

Autism


It’s called Pi and it’s a conversational AI made to be more of a personal assistant. In the bit of time I’ve used it, it’s done far better than I expected at reframing and simplifying my thoughts when I’m overwhelmed.

Obviously, talking to a real person is much better if possible, but the reality is some of us don’t have the finances to pay for therapy or other ways to cope with the anxiety/depression that so often comes with ASD. What are your thoughts on this?

[–] shootwhatsmyname@lemm.ee 9 points 7 months ago (8 children)

That’s a huge concern for me too. They do explicitly state in the Privacy Policy that your data will never be sold or shared with third parties for advertising purposes, but that only means so much. It would be nice to see a full list of the exact companies/services they use behind the scenes. Regardless, I really look forward to the day I can self-host something this powerful myself.

[–] haui_lemmy@lemmy.giftedmc.com 7 points 7 months ago (4 children)

It might be accurate at the moment (that is, they haven’t had their data breach yet or decided that selling your data is more profitable).

I would absolutely prefer something self-hosted. If it’s small, it can run on a Raspberry Pi. If it needs GPU power, one could host it for their friend group or family and recoup the cost and effort that way.

But I honestly don’t think an already-trained conversational AI serving a single person should be that demanding. We‘d need an ML specialist to confirm that, though; some of them really know what they’re talking about.
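
For anyone curious what the self-hosted route could look like in practice, here’s a minimal sketch of a local chat loop using the Hugging Face transformers library. The model name is just a small example placeholder (it is not the model Pi uses); you’d swap in whatever fits your hardware:

    # Minimal local chat loop sketch; needs: pip install transformers torch accelerate
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # example placeholder model
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

    history = []
    while True:
        user = input("you: ")
        history.append({"role": "user", "content": user})
        # build a prompt in the model's expected chat format
        prompt = tokenizer.apply_chat_template(history, tokenize=False, add_generation_prompt=True)
        inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
        output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
        reply = tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
        history.append({"role": "assistant", "content": reply})
        print("bot:", reply)

A 1B-parameter model like that will limp along on a CPU or a small GPU, so it shows the plumbing more than the quality; a genuinely conversational model needs far more memory, as discussed below.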

[–] shootwhatsmyname@lemm.ee 4 points 7 months ago (3 children)

Agreed. I’ve dabbled in it some, but I’m no expert, so maybe someone else could chime in. I just haven’t found anything that works quite as well as Pi yet, and it was really intriguing, to say the least. You can even talk to it verbally, back and forth like a phone call.

[–] TheBluePillock@lemmy.world 3 points 7 months ago

I would love to be corrected, but when I looked into it, it sounded like you'd probably want 32 GB of VRAM or better for real chat ability. You have to have enough memory to load the model, and anything not handled by your GPU takes a major performance hit. Then, you probably want to aim for a 72-billion-parameter model. That's a decently conversational level and maybe close to the one you're using (though it's possible theirs is larger; I'm just guessing). I think 34B models are comparatively more prone to hallucination and inaccuracy. It sounded like 32 GB of VRAM was roughly the entry point for the 72B models, so I stopped looking, because I can't afford that.
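
To put rough numbers on that (a back-of-envelope sketch counting only the weights; the KV cache and runtime overhead push real requirements higher):

    # Approximate memory needed just to hold a model's weights
    def weight_memory_gib(params_billion, bits_per_weight):
        return params_billion * 1e9 * bits_per_weight / 8 / 1024**3

    for params in (7, 34, 72):
        for bits in (16, 8, 4):
            print(f"{params}B @ {bits}-bit: ~{weight_memory_gib(params, bits):.0f} GiB")

    # A 72B model is ~134 GiB at 16-bit but ~34 GiB once quantized to 4-bit,
    # which lines up with 32 GB of VRAM being roughly the entry point for 72B models.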

So somebody with more experience or knowledge can hopefully correct me or give a better explanation, but just in case, maybe this is a helpful starting point for someone.

You can download models from huggingface.co and interact with them through a web UI like this one.
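
If you'd rather script the download than click through the site, here's an illustrative sketch using the huggingface_hub library; the repository and file names are only examples, not recommendations:

    from huggingface_hub import hf_hub_download

    # fetch a 4-bit quantized GGUF file that a local runner such as llama.cpp can load
    path = hf_hub_download(
        repo_id="TheBloke/Llama-2-7B-Chat-GGUF",    # example repository on huggingface.co
        filename="llama-2-7b-chat.Q4_K_M.gguf",     # example quantized weights file
    )
    print("model saved to", path)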
