this post was submitted on 22 Jul 2024
69 points (94.8% liked)

No Stupid Questions


Is there any computer program with AI capabilities (the generative kind seen in ChatGPT, online text-to-picture generators, etc.) that is actually standalone, i.e. able to run in a fully offline environment?

As far as I understand, the most popular AI technology right now consists of a bunch of matrix algebra, convolutions, and parallel processing of many low-precision floating-point numbers, which works because of statistics and absurdly huge datasets. So if any such program existed, how would it even have a reasonable storage size if it needs the dataset?
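
A rough back-of-the-envelope sketch of why storage isn't dominated by the dataset: a trained model only ships its learned weights, not the data it was trained on. The parameter count and byte sizes below are illustrative assumptions, not figures from this thread.

```python
# Illustrative numbers only: a trained model ships its learned weights,
# not the training dataset itself.
params = 7_000_000_000   # assume a 7-billion-parameter model
bytes_fp16 = 2           # 16-bit floats: 2 bytes per weight
bytes_q4 = 0.5           # 4-bit quantized: half a byte per weight

print(f"fp16 file size  ~ {params * bytes_fp16 / 1e9:.1f} GB")  # ~14 GB
print(f"4-bit file size ~ {params * bytes_q4 / 1e9:.1f} GB")    # ~3.5 GB
```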

[–] Rhaedas@fedia.io 7 points 3 months ago (1 children)

The language, image, and audio models that can run on a typical PC have all been broken down from larger original models. How this is done affects what the models can do and their quality, but the open-source community has come a long way in making impressive stuff. The first question is really hardware: do you have an Nvidia GPU that can support these kinds of generation? They can be run on the CPU alone, but it's painfully slow.

If so, then I would highly recommend looking into Ollama for running AI models (using WSL if you're on Windows) and ComfyUI for image generation. Don't let ComfyUI's complicated workflow scare you; start from the basics, and with plenty of YouTube help out there it will make sense. As for TTS, there's a constant stream of "new stuff" out there, but for actual local processing in "real time" (it still takes a bit) I have yet to find anything to replace my copy of Coqui TTS with Jenny as the model voice. It may take some digging and work to get that running, since it's older and no longer supported.
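
A minimal sketch of what talking to a locally running Ollama server looks like from Python. The model name and prompt are illustrative, and it assumes Ollama is installed and a model has already been pulled (e.g. with `ollama pull llama3`).

```python
# Query a local Ollama server over its default HTTP API; nothing leaves the machine.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",   # Ollama's default local endpoint
    json={"model": "llama3", "prompt": "Why is the sky blue?", "stream": False},
    timeout=300,
)
print(resp.json()["response"])
```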

[–] hendrik@palaver.p3x.de 1 points 3 months ago* (last edited 3 months ago) (1 children)

I don't think they break them down. For most models, the math requires starting at the beginning and training each model individually from the ground up.

But sure, a smaller model generally isn't as capable as a bigger one. And you can't train them indefinitely. So for a model series you'll maybe use the same dataset but feed more of it into the super big variant and not as much into the tiny one.

And there is a technique where you use a big model to generate questions and answers and use them to train a different, smaller model. That model will then learn to respond like the big one.
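
What's being described here is usually called knowledge distillation via synthetic data. A very rough sketch of the data-generation half, reusing a local Ollama server: the model name, topics, and output format are all illustrative assumptions, and the actual fine-tuning of the small model is left out.

```python
# Use a big "teacher" model to produce question/answer text, and save it as
# training data that a smaller model could later be fine-tuned on.
import json
import requests

topics = ["photosynthesis", "TCP handshakes", "sourdough starters"]  # illustrative
pairs = []

for topic in topics:
    prompt = f"Write one question about {topic}, then answer it concisely."
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3:70b", "prompt": prompt, "stream": False},
        timeout=600,
    )
    pairs.append({"topic": topic, "text": resp.json()["response"]})

# The resulting file is the kind of dataset a much smaller model would be trained on.
with open("synthetic_qa.jsonl", "w") as f:
    for p in pairs:
        f.write(json.dumps(p) + "\n")
```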

[–] Rhaedas@fedia.io 4 points 3 months ago (1 children)

The breaking down I mentioned is the quantization that forms a smaller model from the larger one. I didn't want to get technical because I don't understand the math details myself past how to use them. :)

[–] hendrik@palaver.p3x.de 1 points 3 months ago* (last edited 3 months ago)

Ah, sure. I think a good way to phrase it is to say they lower the precision. That's basically what they do: convert the high-precision numbers to lower-precision formats. That makes the computations easier/faster and the files smaller.
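
A toy illustration of that precision drop, quantizing float32 values to int8 with NumPy. Real schemes (4-bit GGUF/GPTQ and the like) are considerably more sophisticated, so treat this only as the basic idea.

```python
# Round-trip float32 weights through int8 and compare storage size and error.
import numpy as np

weights = np.random.randn(1_000_000).astype(np.float32)  # stand-in for model weights

scale = np.abs(weights).max() / 127.0            # map the value range onto int8
quantized = np.round(weights / scale).astype(np.int8)
restored = quantized.astype(np.float32) * scale  # approximate original values

print("float32 size:", weights.nbytes // 1024, "KiB")    # ~3906 KiB
print("int8 size:   ", quantized.nbytes // 1024, "KiB")  # ~976 KiB
print("max error:   ", np.abs(weights - restored).max())
```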

And it doesn't apply equally to text, audio, and images. As far as I know, quantization is mainly used with LLMs. It's also possible with image and audio models, but generally it isn't done; as far as I remember, it leads to degradation and distortions pretty quickly. There are other methods, like pruning, which are used with generative image models and bring their size down substantially.
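
An equally rough sketch of magnitude pruning, where weights near zero are simply dropped; the threshold is an arbitrary illustrative value, and real pruning pipelines also rely on sparse storage and retraining.

```python
# Zero out small-magnitude weights; sparse formats then store only what remains.
import numpy as np

weights = np.random.randn(1_000_000).astype(np.float32)

threshold = 0.5                                   # illustrative cutoff
pruned = np.where(np.abs(weights) < threshold, 0.0, weights)

kept = np.count_nonzero(pruned)
print(f"kept {kept} of {weights.size} weights ({100 * kept / weights.size:.0f}%)")
```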