This is an automated archive.
The original was posted on /r/singularity by /u/acotwo on 2024-01-11 01:46:37+00:00.
I'm looking for some guidance here, and hopefully we can have a good discussion. This whole sea of information on local AIs (or AI in general) is Greek to me, but I'm trying to learn as I go. I want to set up a local AI that works with sensitive information from PDFs for my local business in the education space. In essence, I'm trying to take information from various sources, say a book, and have the AI work with the concepts and techniques described in it (is this even possible?). A lot of this information I would prefer to keep private, which is why I want a local setup in the first place.
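From the little I've pieced together so far, the first step seems to be just getting the text out of the PDFs locally, and I'm assuming a library like pypdf would do that (the file name below is made up). A rough sketch of what I mean:

```python
# Rough sketch of step one (assuming the pypdf library; the file name is hypothetical):
# pull the raw text out of a PDF entirely on my own machine, nothing sent anywhere.
from pypdf import PdfReader

def extract_pdf_text(path: str) -> str:
    """Return the text of every page of a PDF as one string."""
    reader = PdfReader(path)
    return "\n".join(page.extract_text() or "" for page in reader.pages)

print(extract_pdf_text("course_material.pdf")[:500])  # quick preview of the first 500 characters
```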
My specs are as follows:
CPU: Intel(R) Core(TM) i9-10900KF @ 3.70 GHz
Installed RAM: 16.0 GB
System type: 64-bit operating system, x64-based processor
GPU: NVIDIA GeForce RTX 3070
Motherboard: MSI Z490-A Pro
If the preferred local AI is Llama, what else would I need to install and plug in to make it work efficiently? I'd imagine I would need some extra software installed so my PDFs or other types of data can be read (a rough sketch of what I think this might look like is below). Thanks.
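My rough understanding is that something like llama-cpp-python (or Ollama / LM Studio as alternatives) would run a quantized GGUF model on the 3070, and I would just pass the extracted PDF text in as context. This is only a sketch of that idea under my (possibly wrong) assumptions; the model and file names are hypothetical:

```python
# Sketch only, assuming llama-cpp-python for inference, pypdf for the PDF text,
# and a quantized GGUF model small enough for an RTX 3070 (8 GB VRAM).
from llama_cpp import Llama
from pypdf import PdfReader

# Extract the PDF text locally (same idea as the sketch above).
pdf_text = "\n".join(
    page.extract_text() or "" for page in PdfReader("course_material.pdf").pages
)

# Load a local quantized Llama model; n_gpu_layers=-1 offloads every layer to the GPU.
llm = Llama(
    model_path="llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical model file
    n_ctx=4096,       # context window in tokens; larger uses more memory
    n_gpu_layers=-1,
)

# Ask a question grounded in the extracted text (truncated here to fit the context window).
out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "Answer using only the provided material."},
        {"role": "user", "content": f"Material:\n{pdf_text[:8000]}\n\nQuestion: Summarize the key techniques described."},
    ],
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```

If I understand right, a 7B-8B model quantized to around 4 bits should fit in the 3070's 8 GB of VRAM, and for whole books I'd probably need to chunk the text and retrieve only the relevant parts rather than stuff everything into one prompt. Is that roughly the right direction?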