this post was submitted on 07 Sep 2023
432 points (93.9% liked)

PC Master Race


lol. Has anyone found ways to optimize Starfield for their PC, like reducing stuttering, FPS drops, etc.?

[–] gamermanh@lemmy.dbzer0.com 159 points 1 year ago (4 children)

I'll believe they actually optimized their PC port when their PC ports start showing some signs of effort at being actual PC ports

No FOV slider after FO76 had one is, to me, a great sign of how little Bethesda actually cares about the platform that keeps them popular (thanks to mods)

[–] givesomefucks@lemmy.world 15 points 1 year ago (4 children)

They don't want to put in the work for the biggest benefit of PC gaming: scalability.

I don't think any PC should be able to run a game well at max settings on release. Leave room for it to still look good five years from now.

And have the bare-bones settings work on shitty systems that do need upgrading.

Bethesda just wants to make games that run on a very small slice of PCs, and who can blame them when they can't even release a stable game on a single console? They're just not good at

[–] verysoft@kbin.social 53 points 1 year ago* (last edited 1 year ago) (3 children)

I don’t think any PC should be able to run a game well at max settings on release. Leave room for it to still look good five years from now.

This is the mentality they want you to have. And it's a shit one. PCs should be able to run any game well when it comes out.

[–] givesomefucks@lemmy.world 23 points 1 year ago (3 children)

Waaaaay too many people want that endorphin hit from setting everything to ultra.

Even if Medium looked exactly the same.

[–] whileloop@lemmy.world 11 points 1 year ago

I'm gonna make a game where the graphics are basically the same at all presets, but with different filters. Crank the bloom and add a sepia filter, call it pro max or something.

[–] PwnTra1n@lemmy.world 6 points 1 year ago

“But this one goes to eleven”

[–] greenskye@lemm.ee 1 points 1 year ago

The industry kind of did it to themselves. We had a really long period where 1080p was the default resolution and games didn't really try to push graphics at all. Things kind of plateaued post-Crysis for about 10 years before I felt like any game looked significantly better than Crysis did.

So a lot of people have gotten used to being able to hit ultra settings on day 1, because for their entire gaming life that's been possible.

[–] YeetPics@mander.xyz 0 points 1 year ago

On a 10 year old potato

[–] schmidtster@lemmy.world 5 points 1 year ago (2 children)

If you've got a 5-year-old PC, sorry, you shouldn't expect to play on max, let alone anything over medium.

People need to temper their expectations about what a PC can actually do over time.

[–] verysoft@kbin.social 17 points 1 year ago* (last edited 1 year ago) (1 children)

We are talking about modern hardware; nobody expects a 5-year-old PC to run maxed-out games anywhere near as well as the latest hardware should. People are just more and more willing to bend over and accept whatever shit is given to them. There's no reason Starfield couldn't be running better; Bethesda certainly had the capability to make it so.

[–] schmidtster@lemmy.world -1 points 1 year ago* (last edited 1 year ago) (1 children)

Read the comment chain again, because you missed the person's original point…

They talk about old and modern hardware; you can't just ignore half their point.

[–] verysoft@kbin.social 5 points 1 year ago* (last edited 1 year ago) (2 children)

I think you are imagining modern hardware to just mean a 4090. By modern hardware I mean current-generation GPUs/CPUs, and yes, those should be able to run games at max settings. The performance of this generation's low-to-mid-range parts overlaps with the last generation's mid-to-high end (and so on), so talking purely in years doesn't really translate.

People holding on to their 1080 Tis obviously shouldn't be expecting max settings, but people with 3080s, 4070s, 6800 XTs (even 2080 Tis/3070s) should absolutely be able to play on max settings. Especially in games like Starfield that are not exactly cutting edge; there are plenty of older games that had a lot of work put into performance from the start, and they look and run a lot better than this.

[–] Asafum@feddit.nl 2 points 1 year ago* (last edited 1 year ago) (1 children)

I have an i9-9900K and a 4070 Ti and can play it butter smooth at max settings in 4K with 100% render scale. The CPU is definitely starting to show its age, but I haven't had any complaints about Starfield's performance.

That said, I can't fucking stand consoles. I get that companies would be stupid not to sell to as many people as possible, but I'm so sick and tired of seeing good games handicapped because they need to run on a rotten potato Xbox from 10 years ago or whatever...

[–] verysoft@kbin.social 1 points 1 year ago* (last edited 1 year ago) (1 children)

Like 40-45 fps? I've seen a couple of people say this now, but every outlet's benchmarks I've seen contradict it. I don't consider 40 fps smooth at all, but I guess consoles even have to suffer 30 fps in some cases, so a lot of people are okay with it.

Consoles dictate a lot of triple-A games; that's where the biggest profit is, and it's why PC is an afterthought like it was here.

[–] Asafum@feddit.nl 1 points 1 year ago (2 children)

I actually never pulled up an FPS meter, since it has been so smooth I never felt the need to check. I'll see when I get home later what it actually is in Neon or somewhere "busy."

[–] Asafum@feddit.nl 1 points 1 year ago (1 children)

I'll never understand why developers add stuff that makes the game look so much worse...

Looking at you, chromatic aberration, motion blur, film grain, vignette...

The first thing I do with a new game is check the graphics settings and nuke all that extra garbage lol

[–] verysoft@kbin.social 2 points 1 year ago

Yup, sure, add it, but at least disable it all by default. Motion blur does make low FPS look better if you can put up with the blur (I can't); it's used heavily on consoles for exactly that reason.

[–] schmidtster@lemmy.world -3 points 1 year ago (1 children)

Modern literally means the most recent release, and games should be pushing those to the limits on max settings. I semi-agree that even the next release of GPUs should be able to get more out of the game, i.e. design the game for the future.

If you're expecting a 2080 to run a game on max, what limits are we pushing with every new gen? You'd be hampering yourself and leaving a bunch of dead weight on modern and semi-modern GPUs.

[–] verysoft@kbin.social 5 points 1 year ago* (last edited 1 year ago)

Modern literally means the most recent release.

Which, as I explained, would mean the 4000-series/7000-series GPUs and the 13th-gen/Zen 4 CPUs, but the worst part from one of these generations is not better than the best of the previous generation, so it's not as cut and dried as 'modern/old'.

Starfield is pushing no limits; that's the point. It's just built like shit, so it runs like it. I could maybe be swayed on the matter if it were absolutely groundbreaking, but it isn't. It's Fallout 4 in space with less stuff going on at any one time.

[–] Jakeroxs@sh.itjust.works 3 points 1 year ago (1 children)

Remember when "could it run Crysis" was a meaningful question? Now everyone acts like max settings should run on 5-year-old GPUs and complains about devs instead.

We're on PCs, guys; there's a shitload of variables as to why a game might run poorly on your machine, and there's absolutely no way a game dev can (or should) account for all of them. If you want a standard gaming experience with preset settings that run fine, get a console.

[–] kmkz_ninja@lemmy.world 1 points 1 year ago (1 children)

"Could it run Crysis" was a pro for your computer, but it was also always a bit of a dog on the fact that the game was largely unplayable for a lot of people.

[–] Jakeroxs@sh.itjust.works 2 points 1 year ago (1 children)

It was because it pushed the boundaries of what was possible on the hardware of the time. That didn't make it unoptimized or a bad game, but that concept seems to be lost on a lot of people.

[–] Linker95@lemmy.world 1 points 1 year ago

Are you seriously suggesting that Starfield pushes any boundary? The game still uses the godforsaken "boxes" from Oblivion, and every slice of its world pales in comparison, in both size and quality, to basically every modern open world of comparable budget.

[–] FMT99@lemmy.world 15 points 1 year ago

I'm not super interested in how Starfield will play 5 years from now. I didn't even play Skyrim past its 5th release.

But seriously, modders have shown they're up to the task of upgrading the engine and visuals over time.

[–] qarbone@lemmy.world 4 points 1 year ago (1 children)

I like that your last sentence is open-ended, like a Mad Lib.

[–] givesomefucks@lemmy.world 3 points 1 year ago

The list of things Bethesda isn't good at is a little too long to post at this point.

They've been coasting on their name for a long time, even though very few of the staff from the good old days are still around.

The bugs used to be because they were overambitious, and they'd eventually fix them. Now the bugs are there because they know modders will eventually fix them for free.

[–] robotrash@lemmy.robotra.sh 3 points 1 year ago (1 children)

How would you expect them to develop a game targeted at hardware they can't test on because it doesn't exist yet? The latest PC hardware should be able to run max or near-max settings at release.

[–] givesomefucks@lemmy.world 3 points 1 year ago (1 children)

I don't think you understand how all those settings in the options menu work...

Or anything that I said in my comment.

It's been explained to other people already.

[–] pancakes@sh.itjust.works 3 points 1 year ago

People would rather be angry than know how things work.

[–] Tathas@programming.dev 11 points 1 year ago

Yeah. But at least you can use a console command (~ tilde opens the console, as usual) to change FOV. The default values are 85 for first person and 70 for third person, and the range is 70-120.

SetINISetting "fFPWorldFOV:Camera" "85"  
SetINISetting "fTPWorldFOV:Camera" "70"  

When you're happy with what you got, issue

SaveIniFiles  

Or you can just edit the INI file directly.
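
For reference, a minimal sketch of that direct edit. The file name and section header are assumptions based on the usual Bethesda convention, where a console name like "fFPWorldFOV:Camera" maps to the setting fFPWorldFOV under a [Camera] header:

; StarfieldCustom.ini (assumed location: Documents\My Games\Starfield)
[Camera]
; same settings the console commands above write out; valid range 70-120
fFPWorldFOV=85
fTPWorldFOV=70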

[–] freeman@lemmy.pub 7 points 1 year ago* (last edited 1 year ago)

The game, without using DLSS mods, runs at 30 fps with some stutters on my system using the optimized settings from Hardware Unboxed (linked) on a 4060 Ti. If I install the DLSS frame-generation mods I immediately get 60-90 fps in all areas; that alone should tell you everything you need to know.

Here's the rub: I'm not an FPS whore. At 30 fps it's generally a fine experience for this game, assuming you use a gamepad/Xbox controller. With KB+M it gets really jittery and there's input lag. The game was clearly playtested only with a gamepad; a mouse reacts much faster when looking around, so the low FPS the game is tuned for becomes harder to digest.

I have also tested on my 1650 Ti Max-Q and a 1070. Both the 4060 Ti and the 1070 were on an eGPU.

My system has an 11th-gen i7-1165G7 and 16 GB of DDR4 RAM. I play at 1080p in all cases.

For the 1650 Ti and the 1070 the game runs fine IF I do the following:

  • I set the preset to Low (expected) and THEN turn the render scale back up to maximum/100% and/or disable FSR entirely
  • I set indirect shadows to something above medium (which lets the textures load normally; otherwise they're blurry)

Even on the 4060 Ti, it says it's using 100% of the GPU, but it's only pulling like 75-100 W, when it would normally pull 150-200 W under load, easy.
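
If you want to check that utilization-versus-power gap on your own NVIDIA card, nvidia-smi (which ships with the driver) can log both once a second while the game runs; this is just one way to invoke it:

nvidia-smi --query-gpu=utilization.gpu,power.draw --format=csv -l 1

Keep in mind 100% "utilization" only means the GPU was busy during the sample window; low power draw alongside it suggests the work isn't actually saturating the shader cores, which fits the optimization complaints here.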

TL;DR: this game isn't optimized, at least for NVIDIA cards. They should acknowledge that.

[–] Lev_Astov@lemmy.world 3 points 1 year ago (1 children)

I can at least change the FOV with an ini file edit, but there's no way to adjust the horrible 2.4 gamma lighting curve they have... It's so washed out on my display, it's crazy!