this post was submitted on 16 May 2024
69 points (96.0% liked)

Firefox

17952 readers
290 users here now

A place to discuss the news and latest developments on the open-source browser Firefox

founded 4 years ago
all 35 comments
[–] sgibson5150@slrpnk.net 11 points 6 months ago* (last edited 6 months ago) (1 children)

I set this up today on my work laptop with an (internal) RTX 3060. According to the status indicators on the Adjust Video Image Settings page in the Nvidia control panel, super resolution is working in Chrome v124.x and v125.x but not at all in Firefox v126.0. My eyes tell me the same thing. I was able to play a 480p YT stream in Chrome and it looked surprisingly good on my external 1440p monitor. In FF it looks like ass. I may set up a secondary profile in FF just to make sure I haven't changed some config setting over the years that would prevent it from working right. Will update if I find anything interesting.

Edit: Just tried this again with YouTube in FF v126.0 with a clean profile. It does work, but only when the video is full screen (which makes sense I guess, but the behavior is different from Chrome) and I had to manually set the quality level in the Nvidia control panel. In Chrome the auto setting used level 4 (the highest level), but in FF the auto setting only used level 1.

[–] geekwithsoul@lemm.ee 4 points 6 months ago (1 children)

Weird. I’m on desktop with an RTX 3080 and both super resolution and HDR are working just fine for me in both full screen and not. Results are actually quite good for me.

From what I can see, the auto setting's default depends on the source resolution and the desired display resolution, so it varies with how and what you're watching.

You on Windows 10 or 11?

[–] sgibson5150@slrpnk.net 3 points 6 months ago (1 children)

Sorry. Should have mentioned. OS is Windows 11 Pro 23H2 22631.3593. Also, video driver is Nvidia Game Ready Driver 552.44.

[–] geekwithsoul@lemm.ee 1 points 6 months ago (1 children)

Interesting - I’m running the same driver version but on latest version of Windows 10 Pro. In FF, under about:config, is gfx.webrender.enabled or gfx.webrender.all set to true? If not, that might be part of it.

[–] sgibson5150@slrpnk.net 1 points 6 months ago* (last edited 6 months ago)

On the new clean profile I created in v126.0, there was no gfx.webrender.enabled pref, and gfx.webrender.all was set to false. Changing gfx.webrender.all to true didn't really change the behavior: the Nvidia control panel only shows super resolution active in full screen. I was watching the same test video as yesterday at the same requested resolution. I did notice that if I set the Quality back to auto, with gfx.webrender.all = true, it picked level 2 today instead of 1. 🤷‍♂️

Edit: One DDG search later https://support.mozilla.org/en-US/questions/1445419
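For anyone who wants to pin these prefs so they survive profile resets, they can go in a `user.js` file in the Firefox profile folder. This is just a sketch using the pref names mentioned in the comments above; as noted, gfx.webrender.enabled may not exist at all on newer builds, so verify in about:config first:

```js
// user.js — place in the Firefox profile folder; read at startup.
// Pref names are the ones discussed above; check about:config before relying on them.
user_pref("gfx.webrender.all", true);      // force WebRender on all hardware configs
user_pref("gfx.webrender.enabled", true);  // legacy toggle; absent in newer builds
```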

[–] frankgrimeszz@lemmy.world 8 points 6 months ago (1 children)

I don’t think they thought this through.

[–] Carighan@lemmy.world 22 points 6 months ago (2 children)

Why not? They're one of the last browsers to add support, so I'd say they did think it through.

[–] taladar@sh.itjust.works 18 points 6 months ago (1 children)

One of the last browsers out of the two that exist (ignoring those that don't really develop any of those features themselves)?

[–] Turun@feddit.de 1 points 6 months ago (1 children)
[–] taladar@sh.itjust.works 0 points 6 months ago (1 children)

Safari is also just one of the forks of the KHTML/WebKit/Blink codebase Chrome is based on. Admittedly, they probably implement some of this stuff themselves too, since the common ancestor version is quite a long time ago now.

[–] Turun@feddit.de 1 points 6 months ago

They don't incorporate Chromium changes into Safari, so it should be considered separate.

[–] frankgrimeszz@lemmy.world 9 points 6 months ago (1 children)

It’s a vendor specific feature, as opposed to something any graphics chip can use. It’s kinda like… endorsing a closed source driver feature.

[–] Carighan@lemmy.world 2 points 6 months ago

Meh, with games we want them to work independent of which type of controller we use, but display each driver's specific button graphics as needed. I see no difference here. Do I want dynamic upscaling and auto-HDR for all graphics cards? Sure! Do I still want it optimized for each type of graphics card unless the hardware makers can - unlikely - present a unified API? Of course I do.

[–] Johanno@feddit.de 2 points 6 months ago (3 children)

I don't get the hate.

I mean, how many Firefox users can even use this? It requires a new GPU and a compatible monitor.

And where is the use case? Videos in the browser? Those are usually heavily compressed anyway; upscaling won't help much there.

So it's a cool feature for the 10 people who can use it.

[–] Midnitte@beehaw.org 16 points 6 months ago (3 children)

I mean, how many Firefox users can even use this? It requires a new GPU and a compatible monitor.

Isn't that exactly why the hate?

Mozilla should focus on adding features everyone can use, not gimmicks from Nvidia that require you to buy their GPU and their approved monitors. Plus, considering Nvidia's history with Linux, which is a popular OS for Firefox users...

AMD doesn't require that shit for, say, FreeSync or FSR.

[–] PolarisFx@lemmy.dbzer0.com 4 points 6 months ago

Nvidia, Wayland issues aside, is still the superior card 9 times out of 10. This isn't a gimmick to get people to buy Nvidia; most of us will buy it anyway. During my last purchase I pondered whether to get an AMD card and move over to Wayland, or an Nvidia card that would let me locally generate images of whatever I wanted. It was a pretty easy decision: I'll stick with X11 and Nvidia until the end. Stuff like this is just a bonus.

[–] PipedLinkBot@feddit.rocks 1 points 6 months ago

Here is an alternative Piped link(s):

Nvidia's history

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[–] Johanno@feddit.de -1 points 6 months ago

True. That is a good argument

[–] swayevenly@lemm.ee 11 points 6 months ago (2 children)

You think only 10 people have an RTX GPU?

Also super resolution and HDR are separate checkboxes.

[–] Johanno@feddit.de -3 points 6 months ago (1 children)

Obviously I was exaggerating. But maybe only 10 people have an RTX card, use Firefox, and even want to use the feature.

[–] lud@lemm.ee 1 points 6 months ago* (last edited 6 months ago) (1 children)

Well, Nvidia is the biggest graphics card manufacturer by far, so it's quite likely that any individual Firefox user has an RTX card.

I do too, but unfortunately it's an old RTX card which doesn't support this feature (if I recall correctly anyways)

[–] swayevenly@lemm.ee 1 points 6 months ago (1 children)
[–] lud@lemm.ee 1 points 6 months ago

That's nice. Support for my card (2080 Ti) seems to have been added last October.

[–] breakingcups@lemmy.world 2 points 6 months ago

Well, you've pointed out one of the issues. Was it really worth dedicating an engineer's limited time to?

[–] breakingcups@lemmy.world 2 points 6 months ago (2 children)

Meh, most upscaling is pretty bad and doesn't really add anything.

[–] sgibson5150@slrpnk.net 4 points 6 months ago

From my testing today I found that this actually works pretty well (though not in FF, haha). See my top-level comment in this thread.

[–] underscores@lemmy.dbzer0.com 4 points 6 months ago* (last edited 6 months ago) (1 children)

The problem with AI upscaling is that it does add something. It fills in the details with things that could plausibly be there, regardless of whether they actually are. It's especially dangerous if it's used for something like security footage, where it'll do stuff like make up a face based on a few pixels.

[–] NewNewAccount@lemmy.world 5 points 6 months ago (1 children)

Not a major issue when I’m chilling watching my YouTubes.

[–] underscores@lemmy.dbzer0.com 3 points 6 months ago (2 children)

Depending on the context it's probably not that bad, but there are plenty of details in YouTube videos that people pay attention to, like in news, history, tutorials, educational content, and so on. Even in a fictional story, it could add nonsense that people assume is part of the actual show.

[–] Ferk@programming.dev 4 points 6 months ago* (last edited 6 months ago)

If the original footage is so bad that "nonsense that people assume is part of the actual show" "could plausibly be there", then the problem is not with the AI... it wouldn't be the first time I'm confused by the artifacts in a low quality video.

[–] lud@lemm.ee 3 points 6 months ago (1 children)

At worst, upscaling will make the video worse. It won't add aliens or some shit to your videos.

[–] underscores@lemmy.dbzer0.com 2 points 6 months ago* (last edited 6 months ago) (1 children)

AI upscaling fills things in based on training from other images/videos. So it probably won't be an alien, but small details common in similar-looking training videos will also show up in the upscaled output. If an extra flower shows up in a field of grass it's usually not a big deal, but for some things, like faces or symbols, small details can really change the way people interpret them.
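For contrast, here is a toy sketch of why classical (non-AI) interpolation can't invent detail the way a learned upscaler can: every bilinear output pixel is a weighted average of its input neighbours, so output values can never leave the input's range. (`bilinear_upscale` is an illustrative helper written for this example, not anything from a browser or driver.)

```python
import numpy as np

def bilinear_upscale(img, factor):
    """Upscale a 2-D array by an integer factor using bilinear interpolation."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]  # vertical interpolation weights
    wx = (xs - x0)[None, :]  # horizontal interpolation weights
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

img = np.array([[0.0, 1.0],
                [1.0, 0.0]])
up = bilinear_upscale(img, 4)
# Every output pixel is a convex combination of input pixels, so the
# result stays inside the input's value range -- no invented detail.
print(up.min() >= img.min() and up.max() <= img.max())  # True
```

A learned upscaler has no such constraint: it is free to output values and textures that appear nowhere in the input, which is exactly where the "plausible but made-up detail" concern comes from.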

[–] lud@lemm.ee 1 points 6 months ago

I highly doubt the upscaling could do anything close to adding faces or something.

The best it could do with a blurry face-like thing is probably just to make it look sharper.