I just...I just don't need fps and resolution that much. Godspeed to those that feel they do need it.
Hardware
All things related to technology hardware, with a focus on computing hardware.
Rules:
- Follow the Lemmy.world Rules - https://mastodon.world/about
- Be kind. No bullying, harassment, racism, sexism, etc. against other users.
- No spam, illegal content, or NSFW content.
- Please stay on topic; adjacent topics (e.g. software) are fine if they are strongly relevant to technology hardware. Another example would be business news for hardware-focused companies.
- Please try to post original sources when possible (as opposed to summaries).
- If posting an archived version of an article, please include a URL link to the original article in the body of the post.
Some other hardware communities across Lemmy:
- Augmented Reality - !augmented_reality@lemmy.world
- Gaming Laptops - !gaminglaptops@lemmy.world
- Laptops - !laptops@lemmy.world
- Linux Hardware - !linuxhardware@programming.dev
- Mechanical Keyboards - !mechanicalkeyboards@kbin.social
- Microcontrollers - !microcontrollers@lemux.minnix.dev
- Monitors - !monitors@lemm.ee
- Raspberry Pi - !raspberry_pi@programming.dev
- Retro Computing - !retrocomputing@lemmy.sdf.org
- Single Board Computers - !sbcs@lemux.minnix.dev
- Virtual Reality - !virtualreality@lemmy.world
Icon by "icon lauk" under CC BY 3.0
No one should, video graphics haven't progressed that far. Only the lack of optimisation has.
You're missing a major audience willing to pay $2k for these cards, people wanting to run large AI language models locally.
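For the local-LLM crowd, VRAM is the whole ballgame. As a rough, hedged sketch (the 20% overhead factor is an assumption to cover KV cache and activations, not a measured figure):

```python
def vram_gb(params_billion: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Back-of-envelope VRAM estimate: weights only, plus ~20% headroom."""
    return params_billion * bytes_per_param * overhead

# A 70B-parameter model quantized to 4 bits (0.5 bytes per parameter):
print(round(vram_gb(70, 0.5), 1))  # 42.0 GB -- more than a single 32 GB card holds
```

Which is exactly why people eye these cards: even quantized, big models chew through consumer VRAM fast.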
What if I want a ton of VRAM for blender
That's another good point, and another audience. There are people who want these cards for uses other than gaming.
I'm staying on 1440p deliberately. My 3080 is still perfectly fine for a few more years, at least current console gen.
You're not wrong. I just recently upgraded my whole machine going from a 3090 to a 4090 on 1440p and basically can't tell the difference.
VR enthusiasts can put it to use. The higher end headsets have resolutions of over 5000 x 5000 pixels per eye.
You are basically rendering the entire game twice, once for each eye, and at that resolution you're pushing more than twenty times as many pixels as a typical 1080p frame.
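The pixel comparison above is quick arithmetic, using the 5000 x 5000 per-eye figure quoted for high-end headsets:

```python
eye = 5000 * 5000               # pixels per eye on a high-end headset
hd = 1920 * 1080                # pixels in one 1080p frame
per_eye_ratio = eye / hd        # one eye vs. a 1080p frame
both_eyes_ratio = 2 * eye / hd  # rendering both eyes vs. a 1080p frame
print(round(per_eye_ratio, 1), round(both_eyes_ratio, 1))  # 12.1 24.1
```

So a single eye is already ~12x a 1080p frame, and the full stereo render is ~24x, before any supersampling.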
The prices are high, but what really is shocking are the power consumption figures. The 5090 is 575W(!!), while the 5080 is 360W, 5070Ti is 300W, and the 5070 is 250W.
If you are getting one of these, factor in the cost of a better PSU and your electric bill too. We're getting closer and closer to the limit of power from a US electrical socket.
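On the electric-bill point, here's a hedged estimate; the 3 hours/day of full load and the $0.16/kWh rate are assumptions for illustration, not national averages:

```python
def annual_cost(watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Yearly electricity cost of a load running at the given wattage."""
    return watts / 1000 * hours_per_day * 365 * usd_per_kwh

# A 575 W card at full load 3 h/day, assuming $0.16/kWh:
print(round(annual_cost(575, 3, 0.16), 2))  # 100.74
```

Call it roughly a hundred dollars a year under those assumptions, and that's the GPU alone, before the rest of the system.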
Going to need to run a separate PSU on a different branch circuit at this rate.
1000W PSU pulls max 8.3A on a 120v circuit.
Residential circuits in the USA are 15-20A; very rarely are they 10A, though I've seen some super old ones or split 20A breakers in the wild.
A single duplex outlet must be rated to the same amperage as the breaker in order to be code, so with a 5090 PC you're around half capacity of what you'd normally find, worst case. Nice big monitors take about an amp each, and other peripherals are negligible.
You could easily pop a breaker if you've got a bunch of other stuff on the same circuit, but that's true for anything.
I think the power draw on a 5090 is crazy, crazy high don't get me wrong, but let's be reasonable here - electricity costs yes, but we're not getting close to the limits of a circuit/receptacle (yet).
Actually, the National Electric Code (NEC) limits continuous loads on a 15 Aac receptacle to 12 Aac, and on a 20 Aac receptacle to 16 Aac, IIRC, because breakers are sized at 125% of the continuous load (conversely, 1/125% = 80%, i.e. continuous loads should be at most 80% of the breaker rating).
So with a 15 Aac outlet and a 1000 Wac load at a minimum 95% power factor, you're drawing 8.8 Aac, which is ~73% of the outlet's usable capacity (8.8/12). On a 20 Aac outlet, 8.8 Aac is ~55% of capacity (8.8/16).
Nonetheless, you're totally right. Unlike with electric car chargers, we're not approaching the limits of the technology.
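The amperage figures in this thread can be reproduced directly; the 95% power factor is the assumption stated above, and the 80% continuous-load rule is the NEC sizing convention:

```python
def amps(watts: float, volts: float = 120.0, power_factor: float = 0.95) -> float:
    """Current drawn by a load on a US 120 V circuit at a given power factor."""
    return watts / (volts * power_factor)

def continuous_limit(breaker_amps: float) -> float:
    # NEC convention: continuous loads capped at 80% of the breaker rating
    return breaker_amps * 0.8

draw = amps(1000)                                 # a 1000 W PSU at full tilt
print(round(draw, 2))                             # 8.77 A
print(round(draw / continuous_limit(15) * 100))   # 73 (% of a 15 A circuit's limit)
print(round(draw / continuous_limit(20) * 100))   # 55 (% of a 20 A circuit's limit)
```

So even a maxed-out 1000 W build leaves headroom on a dedicated circuit; the problem is everything else sharing it.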
It's clear what must be done - all US household sockets must be changed to 220V. Sure, it'll be a notable expense, but it's for the health of the gaming industry.
Don't be silly.
Just move your PC to your laundry room and plug it into the 240V dryer outlet.
575W TDP is an absolute fucking joke.
Welp, looks like I'll start looking at AMD and Intel instead. Nvidia is pricing itself at a premium that's impossible to justify against competitors.
There will be people that buy it. Professionals that can actually use the hardware and can justify it via things like business tax benefits, and those with enough money to waste that it doesn't matter.
For everyone else, competitors are going to be much better options. Especially with Intel's very fast progression into the dedicated card game with Arc and generational improvements.
Just using this thread as a reminder the new Intel Arc B580 is showing 4060 performance for only $250
5000 series cards are made for idiots
5000 series cards are made for professionals and idiots
I’m here to represent the professionals ∩ idiots. We exist too.
Although seeing those prices is reminding me my mobile 3070 has been perfectly usable
Some people don't care about spending $2000 for whatever. I mean, I'm not one of those people but they probably exist.
I'm probably one of those people. I don't have kids, I don't care much about fun things like vacations, fancy food, or yearly commodity electronics like phones or leased cars, and I'm lucky enough to not have any college debt left.
A Scrooge McDuck vault of unused money isn't going to do anything useful when I'm 6 feet underground, so I might as well spend a bit more (within reason*) on one of the few things that I do get enjoyment out of.
* Specifically: doing research on what I want; waiting for high-end parts to go on sale; never buying marked-up AIB partner GPUs; and only actually upgrading things every 5~6 years after I've gotten good value out of my last frivolous purchase.
From google:
The RTX 4090 was released as the first model of the series on October 12, 2022, launched for $1,599 US, and the 16GB RTX 4080 was released on November 16, 2022 for $1,199 US.
So they dropped the 80 series in price by $200 while increasing the 5090 by $400.
Pretty smart honestly. Those who have to have the best are willing to spend more and I’m happy the 80 series is more affordable.
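The generation-over-generation deltas quoted above tally out directly from the launch MSRPs in this thread:

```python
# Launch MSRPs (USD) as cited in this thread
msrp = {"4080": 1199, "4090": 1599, "5080": 999, "5090": 1999}

print(msrp["5080"] - msrp["4080"])  # -200: the 80-class dropped $200
print(msrp["5090"] - msrp["4090"])  # +400: the 90-class rose $400
```

Classic price-ladder anchoring: raise the halo product, trim the tier below it, and the whole lineup looks more reasonable.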
I’m happy the 80 series is more affordable
I'd hardly call $1200 affordable.
It’s $999 now which is more affordable than $1200
The 2k USD price is surely only in order to make the cheaper cards appear reasonably priced.
And this is BEFORE the tariffs!
But you see because of the tariffs the American gamers will just default to American GPUs, duh.
I bought my 4080 Super recently and hope it lasts me a good 12+ years like my old card did. These prices are insane!
I really wanted a 512bit-bussy GPU for a decade+ ... but perhaps never is just as good.
I don't know a lot about computers, but I do know a fair amount about bussy. $2000 for 512 is a steal!
My question is will the 5080 perform half as fast as the 5090. Or is it going to be like the 4080 vs 4090 again where the 4080 was like 80% the price for 60% the performance?
You know it's the latter... and that even those numbers are probably optimistic.
Why do people buy this stuff? It only takes like a year before it falls in price as the next one comes along. Gotta get that last 2FPS, I guess.
It seems people realized this, and the old cards aren't even properly falling in price anymore, even on the used market.
Yeah I'm never willing to afford the best. I usually build a new computer with second best parts. With these prices my next computer will be with third best stuff I guess.
Yeah sure, the 5090 will be $2k the same way the 3080 went for $800... I watched them peak at $3,500 (seriously, I screenshotted it, but the screenshot got lost when I gave up the salt).
The 4090 is sitting at 2,400 ($2,500) right now over here; I can 100% assure you the 5090 will cost more than that when it gets here.
They claim the 5070 gives 4090 performance for $549. That lower end of the 50-series lineup looks nice.
That "4090 performance" relies on the new frame generation, where only a quarter of all displayed frames are actually rendered. It's borderline false advertising.
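To put the "quarter of frames rendered" point in numbers, a hedged sketch (the 240 fps figure is an arbitrary example, not a benchmark result):

```python
def rendered_fps(displayed_fps: float, generated_per_rendered: int) -> float:
    """Frames the GPU actually renders when N extra frames are
    generated for every rendered one."""
    return displayed_fps / (1 + generated_per_rendered)

# A headline "240 fps" with 3 generated frames per rendered frame:
print(rendered_fps(240, 3))  # 60.0 natively rendered frames per second
```

Input latency tracks the rendered rate, not the displayed one, which is why comparing frame-generated numbers against native performance is so contentious.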