this post was submitted on 14 Oct 2024
16 points (76.7% liked)

Asklemmy

43852 readers
1190 users here now

A loosely moderated place to ask open-ended questions


If your post meets the following criteria, it's welcome here!

  1. Open-ended question
  2. Not offensive: at this point, we do not have the bandwidth to moderate overtly political discussions. Assume best intent and be excellent to each other.
  3. Not a question about using or supporting Lemmy itself: for that, see the list of support communities and community-finding tools below
  4. Not ad nauseam inducing: please make sure it is a question that would be new to most members
  5. An actual topic of discussion

Looking for support?

Looking for a community?

Icon by @Double_A@discuss.tchncs.de

founded 5 years ago

I'm looking to get my first subscription to a machine-learning model. I've been using POE for a while, but I'm not sure whether paying for it would be better than paying for a GPT subscription. I almost never use them to generate images; it's mostly for help with my business and some programming.

I also want my wife to be able to use the same account when I start paying for it.

I'm not sure what the benefits of each are, or which would outweigh the other.

all 20 comments
[–] xmunk@sh.itjust.works 35 points 4 weeks ago (1 children)

Honestly, all of the generative AI subscriptions are pretty fucking steep at this point compared to just running a model locally.

[–] Bluefruit@lemmy.world 2 points 4 weeks ago

I agree with this. I'm using a 1070 Ti for image generation, and it would be more than capable of handling some LLM stuff. I've found that an AMD 7700 XT does well with 7B models on my main rig, but I'm sure you could get away with something cheaper or less powerful.

That said, the amount of text you can generate, and the context length of its answers, will depend on the model you use, and the larger the model, the more power it takes.

If you're just messing around with it, or want it to review things or answer small questions, I'd say a 1070 Ti like I'm using would be just fine. Some folks use even more budget-friendly options. If you've got a gaming machine with any semi-recent GPU, I'd say go for it. Worst case, you can pay for a subscription later if you really want.
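A rough way to sanity-check the hardware advice above is a back-of-the-envelope VRAM estimate: quantized weights take roughly `params × bits / 8` bytes, plus some headroom for the KV cache and activations. The overhead figure here is a guessed rule of thumb, not a benchmark:

```python
def vram_estimate_gb(n_params_billion: float, bits_per_weight: int = 4,
                     overhead_gb: float = 1.5) -> float:
    """Rough VRAM needed to load a quantized model.

    Weights take n_params * bits/8 bytes; overhead_gb is an assumed
    allowance for the KV cache and activations.
    """
    weight_gb = n_params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb + overhead_gb

# A 7B model at 4-bit quantization: ~3.5 GB of weights, ~5 GB total,
# which fits in the 8 GB of a 1070 Ti with room to spare.
print(round(vram_estimate_gb(7), 1))
```

By the same arithmetic, the same 7B model at 8-bit needs about twice the weight memory, which is why 4-bit quantization is the usual choice on older cards.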

[–] PerogiBoi@lemmy.ca 14 points 4 weeks ago (1 children)

Download gpt4all and you can get an open-source model that performs basically as well as any of the paid ones.

[–] SurpriZe@lemm.ee 1 points 3 weeks ago (1 children)

Thanks, I've done just that and installed it! What's the best gpt4all LLM, or which model would you recommend?

[–] PerogiBoi@lemmy.ca 1 points 3 weeks ago (1 children)

Llama is a solid choice. Or Mistral. I use Moistral, which was made for porn, but it's pretty uncensored in general. It doesn't have qualms about ethics or legality.

[–] SurpriZe@lemm.ee 1 points 3 weeks ago (1 children)

Does that mean Llama does have those? And how does that affect performance? I mean the part about "no qualms about ethics".

[–] PerogiBoi@lemmy.ca 1 points 3 weeks ago

I’m sure there’s an uncensored llama somewhere but the ones I’ve tried weren’t truly uncensored.

In terms of performance, all it means is that if I ask it something mildly sexual or inappropriate, it will answer without giving an "as an AI language model, I can't do…" speech.

[–] flashgnash@lemm.ee 10 points 4 weeks ago (1 children)

Don't get ChatGPT Plus; just get an API token and use one of the desktop apps/CLIs. It's pay-as-you-go and way cheaper, unless you're using GPT-4 all day every day or something.
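To see why pay-as-you-go often comes out cheaper than a flat $20/month subscription, here's a sketch of the break-even arithmetic. The usage numbers and the per-token rate are illustrative assumptions, not actual published prices:

```python
def monthly_api_cost(prompts_per_day: int, tokens_per_prompt: int,
                     usd_per_million_tokens: float, days: int = 30) -> float:
    """Estimated monthly pay-as-you-go bill for API usage."""
    tokens = prompts_per_day * tokens_per_prompt * days
    return tokens / 1_000_000 * usd_per_million_tokens

# Hypothetical usage: 20 prompts/day, ~2,000 tokens per exchange,
# at an assumed blended rate of $5 per million tokens.
cost = monthly_api_cost(20, 2000, 5.0)
print(f"${cost:.2f}/month vs. a $20 subscription")
```

Under those assumptions the API bill is a few dollars a month; the subscription only wins if your usage is several times heavier, which matches the "all day every day" caveat above.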

[–] B0rax@feddit.org 1 points 4 weeks ago (1 children)

Do you have an example for a desktop app that would use these tokens?

[–] flashgnash@lemm.ee 2 points 4 weeks ago

I don't, you'd have to have a Google

I use gpt-cli which is pretty good if you're ok with using a terminal https://github.com/kharvd/gpt-cli

[–] tyler@programming.dev 8 points 4 weeks ago

I canceled my ChatGPT subscription a month or two ago. It just got completely unreliable. Like someone else said, Claude is way better but they’re both disappointing at this point. I only subscribed to Claude like last week to help solve an incredibly last minute thing. Not sure I’m going to stay subscribed.

[–] PhilipTheBucket@ponder.cat 8 points 4 weeks ago (1 children)

Claude.ai is quite a bit superior to GPT in my experience. That one, I pay for, and it seems like it's worth it.

[–] SurpriZe@lemm.ee 0 points 3 weeks ago (1 children)

Thanks, but why would you say it's superior to GPT o1?

[–] PhilipTheBucket@ponder.cat 1 points 3 weeks ago

I haven't played around with GPT o1; I just checked, and I don't have access. I'm not saying it's necessarily bad without having experienced it. But OpenAI has been getting steadily worse for a while, so I'm assuming that the stuff I've interacted with is indicative of the quality of the new stuff. It's all of a piece.

[–] GeorgeGR@lemmy.world 4 points 4 weeks ago

I've run some local LLMs (3060, 12 GB VRAM), and I generate images locally daily (wouldn't pay for that), but I do pay for a ChatGPT subscription. I think it's worth it for my purposes. Responses are way faster and higher quality than any local model I've tried, plus web-search integration, image recognition, and a seamless mobile app; I use all of those features regularly. Unfortunately I've never used POE, so I can't compare, sorry.

[–] mosiacmango@lemm.ee 3 points 4 weeks ago* (last edited 4 weeks ago)

Try Ollama. No payment required.
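Ollama exposes a local HTTP API (by default on port 11434), so you can script against it without any paid token. A minimal sketch, assuming `ollama serve` is running and a model such as `llama3` has already been pulled:

```python
import json
from urllib import request

def ollama_payload(model: str, prompt: str) -> dict:
    """Build a request body for Ollama's local /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    body = json.dumps(ollama_payload(model, prompt)).encode()
    req = request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Requires a running Ollama server with the model pulled:
# print(ask_ollama("llama3", "Why is the sky blue?"))
```

The model name is whatever you pulled with `ollama pull`; everything runs on your own GPU, so the only cost is electricity.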

[–] nitefox@sh.itjust.works 1 points 3 weeks ago
[–] tempest@lemmy.ca 0 points 4 weeks ago

Maybe check out Kagi's Ultimate tier. It lets you swap between several of the different models to see which you might find useful. As a bonus, you also get Kagi Search, which can be useful.

https://help.kagi.com/kagi/ai/assistant.html

[–] Rolando@lemmy.world -1 points 4 weeks ago

FWIW I only ever used those services if they accepted a prepaid credit card. OpenAI didn't accept prepaid cards when I tried, not sure about Poe. Just something to think about.