this post was submitted on 19 Feb 2024
98 points (85.0% liked)
Asklemmy
Why would we run out of RAM? Is there new matter being created? It's not like we're storing anything. We will keep using the same resources.
The nature of quantum interactions being probabilistic could be some resource saving mechanism in a higher order simulation.
New human instances are being created, and as our society's general education keeps going up, they demand more processing power.
As our technology advances, that has to be simulated as well. Not only things like telescopes and the LHC: the computer that's running a game world for you doesn't actually exist, so it's the supercomputer that has to run it.
Obviously, this is just a drop in the bucket for an entity that can make a fully simulated universe, but the situation quickly becomes untenable if we start creating hyper-advanced simulations as well, and we are maybe only a few decades away.
Human instances still run on the same underlying physics. No further RAM is needed.
Everything is made up of atoms/photons/etc. If every particle is tracked for all interactions, it doesn't matter how those particles are arranged, it's always the same memory.
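The point that memory scales with how many particles are tracked, not how they're arranged, can be sanity-checked with a toy state vector (the particle count and per-particle state layout are illustrative, not anything physical):

```python
import random
from array import array

N = 100_000  # particles (illustrative)
STATE = 6    # 3 position + 3 momentum components per particle

# Random arrangement vs. perfect lattice: same state vector either way.
scattered = array('d', (random.random() for _ in range(N * STATE)))
lattice = array('d', (0.0 for _ in range(N * STATE)))

bytes_scattered = len(scattered) * scattered.itemsize
bytes_lattice = len(lattice) * lattice.itemsize
print(bytes_scattered == bytes_lattice)  # True: arrangement doesn't matter
```

Rearranging the particles into a human, a telescope, or a rock changes the values stored, not the amount of storage.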
Atoms and photons wouldn't actually exist, they would be generated whenever we measure things at that level.
Obviously, there are many ways to interpret what kind of simulation it would be. A full simulation from the Big Bang is fun but doesn't make for good conversation, since it would be indistinguishable from reality.
I was thinking more of a video game like simulation, where the sim doesn't render things it doesn't need to.
That can't work unless it's a simulation made personally for you.
I don't follow. If there are others, it would render for them just as much as for me. I'm saying it wouldn't need to render at an atomic level except for the few who are actively measuring at that level.
Everything interacting is "measuring" at that level. If the quantum levels weren't being calculated correctly all the time for you, the LEDs in your smartphone would flicker. All those microscopic effects cause the macroscopic effects we observe.
If it was a simulation, there would be no need to go that far. We simulate physics without simulating the individual atoms.
None of it would be real; the microscopic effects would just be approximated unless a precise measurement tool were used, and then they would be properly simulated.
We wouldn't know the difference.
But you already said you have to go that far whenever someone is doing something where they could notice microscopic effects.
So it's not so much a simulation as a mind-reading AI that continuously reads every sentient mind in the entire universe, so as to know whether they are making a microscopic observation that needs the fine-grained result or whether an approximation can be returned.
There would be no need to go that far at all times, is what I'm saying. It's the equivalent of a game rendering distant objects in detail only when you use a scope. Why render everything at all times if it isn't being used and doesn't affect the experience? It would increase the overhead by an insane amount for little to no gain.
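The "render detail only when scoped" idea above can be sketched as lazy, on-demand computation: serve a cheap approximation by default and compute (and cache) the expensive fine-grained value only when something actually performs a precise measurement. All names and values here are hypothetical, just to show the pattern:

```python
from functools import lru_cache

def coarse_value(region: int) -> float:
    """Cheap approximation, good enough when nobody is looking closely."""
    return float(region)  # e.g. an averaged field value for the region

@lru_cache(maxsize=None)
def fine_value(region: int, atom: int) -> float:
    """Expensive per-atom detail, computed lazily and only on demand."""
    # Stand-in for a costly quantum-level calculation.
    return region + atom / 1e6

def observe(region: int, precise: bool = False, atom: int = 0) -> float:
    # Only a precise measurement forces the fine-grained path.
    return fine_value(region, atom) if precise else coarse_value(region)

print(observe(5))                        # coarse path: 5.0
print(observe(5, precise=True, atom=3))  # fine path, computed on demand
print(fine_value.cache_info().currsize)  # only 1 fine value ever computed
```

The overhead argument is exactly the cache statistic at the end: out of the whole "universe" of possible fine-grained values, only the one that was precisely observed ever gets computed.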
This is also just a thought exercise.
But how does the simulation software know when it needs to calculate that detail? If you are the only person in the simulation, it's obvious, because everything is rendered from your perspective. But if there's more than one person in the universe, an AI program has to look at the state of mind of everyone in the universe to make sure they aren't doing something where they could perceive the difference.
Am I microwaving a glass of water to make tea, or am I curious about that YouTube video where I saw how you can use a microwave to measure the speed of light? Did I just get distracted and not follow through with the measurement? Only something constantly monitoring my thoughts can know, and it has to do that for everyone, everywhere, in the entire universe.
The way I see it, it would be coupled with the tool, not the intention someone has for it. So every microwave would be rendered properly at all times, as would most electronics just by their very nature, regardless of what the person plans to do with them.
Actually, I think they could probably just approximate the microwave stuff and keep only precise electrical instruments, like oscilloscopes, fully rendered.
They only need to render things that give an exact measurement; the microwave trick has a ~3% tolerance, which is huge in the scope of things.
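For reference, the microwave trick and its rough tolerance come out of textbook arithmetic: hot spots in a microwave (turntable removed) sit half a wavelength apart, and c = frequency × wavelength. The spot spacing below is an illustrative measurement, not a quoted value:

```python
# Kitchen demo: estimate the speed of light from microwave hot spots.
frequency_hz = 2.45e9    # typical magnetron frequency, printed on the door
spot_spacing_m = 0.06    # measured distance between melted spots (illustrative)

wavelength_m = 2 * spot_spacing_m          # spots are half a wavelength apart
c_estimate = frequency_hz * wavelength_m   # approx. 2.94e8 m/s

c_true = 2.998e8
error = abs(c_estimate - c_true) / c_true
print(f"estimated c = {c_estimate:.3g} m/s, error = {error:.1%}")
```

With a ruler-and-cheese measurement of the spot spacing, a few percent of error is about the best you can hope for, which is the tolerance being invoked above.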
It seems like a lot but it's less than simulating every single atom imo.
It's more than electronics. Every piece of diffraction grating could be used to make a wave-interference measurement. Every fiber-optic line in the world, too, because bend one too much and the wave doesn't stay bound inside.
But that still doesn't get rid of the AI part, because you need something watching to know when an electronic device is created by anyone, anywhere in the universe, and to understand that the device is the kind that could be used to reveal detailed measurements.