this post was submitted on 06 Oct 2024
13 points (62.3% liked)
Apple
Apparently people don’t like hearing that. xD
I use all three, Mac, Linux, and Windows, all the time. Mac is the only one where I’m OK with 8GB of RAM. I’d want at least 12 on the other two, unless you use zram swap on Linux, in which case you can get away with 8. Afaik Windows doesn’t have anything like that, so 16 is best, though 12 is workable.
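For reference, on systemd-based distros zram swap is commonly set up with zram-generator; a minimal sketch of its config (package name, defaults, and sensible sizes vary by distro — the values below are illustrative):

```ini
# /etc/systemd/zram-generator.conf
# Creates a compressed swap device backed by RAM.
[zram0]
zram-size = ram / 2          # cap the zram device at half of physical RAM
compression-algorithm = zstd # good speed/ratio trade-off
swap-priority = 100          # prefer zram over any disk-backed swap
```

After a reboot (or `systemctl daemon-reload` plus starting the generated unit), `swapon --show` should list the zram device.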
I don’t really understand why people would downvote that.
I realized recently that my Raspberry Pi 4 has just 4GB of RAM, but while syncing huge files to Storj I’ve noticed the memory doesn’t fill up at all (even with slow spinning hard drives).
I’m starting to think for most things I do CPU is more important than having tons of memory.
I also have a Raspberry Pi with 4G and it handles its load perfectly fine.
BUT lack of memory is a well-known bottleneck, so when I got a Raspberry Pi 5 with double the processing power, I also doubled the memory to keep it fed. I haven’t really found a good niche for the new beast yet, but if I’m spending money on a faster processor and a faster board, why would I limit it by cheaping out on memory?
While we know that Apple’s memory is much faster than anyone else’s, we also know the entire system is outstanding. If I’m spending so much on a system with such high throughput, why would I cripple it by cheaping out on memory to save a relatively small amount? It’s not that I really have a need, but I’m paying for a beast, so it had better be able to go beast mode.
If you’re transferring files over a socket (through SMB or SFTP, say), the receiving end usually reads into a small fixed buffer, something like 64KB. If data arrives faster than it can be pushed to disk and the buffer fills, TCP flow control simply pauses the sender. So a file transfer usually won’t use much memory.
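A minimal sketch of that pattern in Python (a hypothetical receiver, not any particular server’s code): read into a small fixed buffer and flush each chunk to disk, so memory use stays bounded no matter how big the file is.

```python
import os
import socket
import tempfile
import threading

BUF_SIZE = 64 * 1024  # small fixed buffer; flow control pauses the sender when it's full


def receive_to_file(conn, path):
    # Read at most BUF_SIZE bytes at a time and write them out immediately,
    # so memory use is O(BUF_SIZE), not O(file size).
    with open(path, "wb") as f:
        while True:
            chunk = conn.recv(BUF_SIZE)
            if not chunk:  # sender closed the connection
                break
            f.write(chunk)


# Demo over a local socket pair standing in for a network connection.
a, b = socket.socketpair()
payload = os.urandom(1_000_000)  # ~1 MB test payload

sender = threading.Thread(target=lambda: (a.sendall(payload), a.close()))
sender.start()

out = tempfile.NamedTemporaryFile(delete=False)
out.close()
receive_to_file(b, out.name)
sender.join()
```

`sendall` blocks whenever the receiver’s buffer is full, which is exactly the backpressure described above.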
There is some poorly written software that doesn’t do that, though. I ran into a WebDAV server like that while writing my own server; that’s where you can run into out-of-memory errors.
That lines up with what I know about networking, but on the software side I figured it would chew through memory quickly (especially since it’s encrypting on the fly).
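Per-chunk work like encryption doesn’t change the picture: stream processing keeps memory bounded because each chunk is transformed and discarded before the next is read. A sketch using hashing as a stand-in for encryption (stream ciphers and chunked AES modes behave the same way — this is an illustration of the pattern, not Storj’s actual code):

```python
import hashlib
import io
import os

CHUNK = 64 * 1024


def digest_stream(stream):
    # Process the stream one fixed-size chunk at a time; memory use is
    # O(CHUNK), not O(file size). An on-the-fly encryptor would update a
    # cipher context here instead of a hash context.
    h = hashlib.sha256()
    while chunk := stream.read(CHUNK):
        h.update(chunk)
    return h.hexdigest()


data = os.urandom(5 * CHUNK)
result = digest_stream(io.BytesIO(data))
```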