this post was submitted on 31 Aug 2023
11 points (100.0% liked)

PC Gaming


AMD's Radeon boss has talked about the RDNA 3 GPU power efficiency, 12VHPWR on Radeon RX 7000 GPUs & ray tracing capabilities.

The interview is very detailed and we'd encourage you to read the full thing over at Club386, but some interesting comments were made about a few aspects of the RDNA 3 "Radeon RX 7000" GPU family and what we can expect in the coming generation.

Back when AMD was launching its RDNA 3 GPU architecture, the company promised a monumental +54% increase in power efficiency over RDNA 2 through the use of chiplets and other changes. However, the launch delivered only modest gains in efficiency, while NVIDIA took efficiency to a whole new level with its Ada GPU architecture. Scott says AMD believes in offering good performance per watt across its GPU lineup, and that it matters more on the notebook front. So far, AMD has only brought its non-chiplet Navi 33 to laptops.

[–] wahming@monyet.cc 3 points 1 year ago (2 children)

Does your GPU really contribute that much to your electric bill? Idk, I haven't done the math myself

[–] wolfshadowheart@kbin.social 8 points 1 year ago (2 children)

It's pretty sizeable if you're running it 24/7 without checks.

A common recent example might be someone interested in running Stable Diffusion locally. Running the program overnight draws ~300-400 watts for 12 hours. For comparison, an electric heater can draw up to 1,200 watts, and those are known to noticeably raise the electric bill if left unchecked (e.g. running overnight instead of on a timer).

For gaming 20 hours a week? Probably not too much. For gaming 20 hours a week plus running AI overnight a few times a week (another 40 hours)? It's noticeable for sure.
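To put rough numbers on it (assuming ~350 W under load and a US-ish $0.15/kWh, both purely illustrative figures, not measurements):

```python
# Rough monthly cost of GPU-heavy use, assuming ~350 W under load
# and $0.15/kWh -- both illustrative numbers, not measurements.
WATTS = 350
PRICE_PER_KWH = 0.15

def monthly_cost(hours_per_week, watts=WATTS, price=PRICE_PER_KWH):
    """USD per month (~4.33 weeks) for the given weekly run time."""
    kwh = watts / 1000 * hours_per_week * 4.33
    return kwh * price

print(f"Gaming 20 h/wk:      ${monthly_cost(20):.2f}")
print(f"Gaming + AI 60 h/wk: ${monthly_cost(60):.2f}")
```

A few dollars a month for gaming alone, roughly tripling once the overnight AI hours are added.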

However, there's also the factor of ambient heating, so there's technically some offset if you need heat anyway... And it's not something to ignore: I've had a room with bad insulation go from a consistent 68F up to 85F even with a window open, hotter still in the summer of course.

Overall, yes but also no. Like most things, it really comes down to use case and consistency. NVIDIA GPU in a media server? Higher energy costs than something like Intel Quick Sync for limited realistic gains, but a somewhat noticeable cost increase. Gaming GPU idling high all the time while you just browse and watch videos? Definitely more expensive than a laptop, but with proper checks, like putting the gaming computer in eco mode, they're more comparable.

[–] wahming@monyet.cc 5 points 1 year ago (1 children)

Righto, thanks for the detailed reply.

The CEO might not be far wrong in that case; the average user probably doesn't run their GPU long enough to notice efficiency gains. And given that their preferred market is people with money to burn, it makes sense they'd target improved performance over efficiency.

[–] wolfshadowheart@kbin.social 3 points 1 year ago

I'm inclined to agree, although I do think energy efficiency is environmentally important and the solution shouldn't be to throw more power at the hardware. For that reason I do appreciate some middle ground between the two.

Realistically, my friend's 7900 XTX and my 3080 draw about the same power under load, but he has 24GB of VRAM where I do not. To get that much VRAM from NVIDIA you need an extra ~150 watts for a 4090 or 3090. Regardless of performance elsewhere, that's pretty sizeable, so it would be a shame to lose that advantage to something like a 30GB VRAM card from AMD pushing 450 watts.

[–] arefx@lemmy.ml 0 points 1 year ago* (last edited 1 year ago) (1 children)

Most people aren't running their computer 24/7 doing crazy tasks, though. I almost never turn my PC off, but if I'm not using it, it's generally in sleep mode. Sometimes I leave it on and running, like if I know I'll be out for a few hours but will be right back on the PC where I left off when I return home. Generally, though, it's asleep, and I can't imagine it's using much electricity at that point.

[–] wolfshadowheart@kbin.social 2 points 1 year ago (1 children)

It's all relative. To add some context to your description:

As mentioned, under load my PC (5800X3D + 3080 10GB) draws between 350 and 575 watts (depending on whether my monitors are plugged into my UPS and on GPU power draw; some programs draw more than others).

Idling my PC draws about 175 watts.

In sleep my PC draws about 68-80 watts.

Like the NVIDIA GPU server example: even though it's not a lot of power, compared to more efficient computers doing the same task it's exorbitant.

You're right that most people with a GPU won't even be running something under load for a few hours - if they don't do rendering and they don't use AI then gaming is the only thing left that can really put a GPU to use.

So then efficiency becomes about deciding how to optimize those tasks. If AMD can deliver performance comparable to NVIDIA at 100 watts less, that's the difference between a PC at idle and a PC in sleep. That's too sizeable to ignore, even if you just leave the PC on 24/7 as a gaming PC plus ready-to-use web browser. Similarly, if I'm putting my GPU to use at all, it seems reasonable to consider long-term cost efficiency. It's weird to think about since we don't push it much, but 20 hours a week of gaming even 5 years ago vs. today is a huge power difference. Just look at the 1080 Ti, a beast back then that still holds up today: it draws only ~300 watts under load, and the 980 Ti around 250 watts.

In terms of performance, 450W for even the 3090, let alone the 4090, absolutely blows these out of the water, but in terms of long-term idle they are also, relatively, much more expensive.

All in all, most people aren't putting their PC under load 24/7, but most people also aren't only turning it on as needed. While it's true that they're not consistently drawing 300+ watts all the time, they are still likely idling (on, just not actively used) at higher levels than previous generations. My idle is quite close to the 980 Ti under load, which is pretty insane.
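Using my rough figures above (~175 W idle, ~75 W as a sleep midpoint) and an assumed $0.15/kWh, the idle-vs-sleep gap over a year of always-on use looks like this:

```python
# Yearly cost difference between idling 24/7 and sleeping 24/7.
# ~175 W idle and ~75 W sleep are the rough figures from above;
# $0.15/kWh is an assumed rate.
PRICE_PER_KWH = 0.15
HOURS_PER_YEAR = 24 * 365

def yearly_cost(watts):
    """USD per year at constant draw of `watts`."""
    return watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH

idle, sleep = yearly_cost(175), yearly_cost(75)
print(f"Idle 24/7:  ${idle:.0f}/yr")
print(f"Sleep 24/7: ${sleep:.0f}/yr")
print(f"Difference: ${idle - sleep:.0f}/yr")
```

Over a hundred dollars a year for the same machine doing nothing, just depending on whether it idles or sleeps.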

[–] geosoco@kbin.social 2 points 1 year ago

Yeah, the specific setup and cards matter too. There are still issues with idle power on newer AMD cards in multi-monitor configs, where they draw >50W doing nothing, and some configs draw 100+W idling. It's been an issue since the 7000-series cards released, I believe. There have been a few driver updates that helped for some people, according to various Reddit threads, but not everyone. I think the NVIDIA cards by comparison only pull about 8-20W idling.

This isn't major for most users who sleep their PC most of the day, but it can also add up over time.

[–] geosoco@kbin.social 5 points 1 year ago* (last edited 1 year ago) (1 children)

As with everything, "it depends" on many aspects. In the US electricity is relatively cheap in many places (~$0.12-0.15 per kWh), and high-end cards running at high settings can pull ~300W. Averaging 3 hours a day, that's roughly 11-13 cents a day just on the GPU alone at high-end settings. Not a huge deal, but in places where electricity is 2-3x the price it could be more of an issue (or at surge times for folks in Texas).
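The kWh arithmetic is straightforward (the wattage and rates are of course just ballpark figures):

```python
# GPU-only daily cost: watts -> kWh -> dollars.
# ~300 W for 3 hours/day at $0.12-0.15/kWh, all ballpark figures.
def daily_cost(watts, hours, price_per_kwh):
    """USD per day for `hours` at constant draw of `watts`."""
    return watts / 1000 * hours * price_per_kwh

low = daily_cost(300, 3, 0.12)   # 0.9 kWh at $0.12
high = daily_cost(300, 3, 0.15)  # 0.9 kWh at $0.15
print(f"${low:.3f}-${high:.3f} per day on the GPU alone")
```

Scale the rate up 2-3x for expensive markets and the same habit lands closer to 25-40 cents a day.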

But with the new ATX 3.0 and 600W power cables, we could see double those costs for high-end cards in the next few years, pushing full PC power to ~900+ watts.

Efficiency also affects more than just electricity costs. Less efficient chips mean more heat and more massive coolers; many of the higher-end cards today have coolers so thick and heavy that we now have anti-sag braces. It also potentially means more fan noise, or requires more expensive coolers and more electronics to regulate that power in stages.

[–] wahming@monyet.cc 3 points 1 year ago

Yeah, I guess electricity is pretty cheap in my corner of Europe (about a quarter of that), so I don't really notice.

> Efficiency also affects more than just electricity costs. Less efficient chips mean more heat and more massive coolers; many of the higher-end cards today have coolers so thick and heavy that we now have anti-sag braces. It also potentially means more fan noise, or requires more expensive coolers.

Good points. I guess the average customer for their new cards is probably willing to put up with those issues to get the latest and greatest performance, or at least that's what the CEO is counting on.