It’s BS though. People with TOTL hardware are having issues. Those systems don’t underperform because the game is advanced or anything like that – the game underperforms because it is a new release that is poorly optimized. It’s also expected because it’s on a senior citizen of a game engine that likely needs a few other nudges.
Todd Howard forgets that PC users see this shit all the time, and it’s pretty obvious with this one. Hoping to see talk of optimization in a coming patch instead.
Edit: a good example – not hitting 60fps in New Atlantis, but concurrently, CPU usage in the 50s and GPU usage in the 70s. That’s a sign of poor optimization.
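The reasoning above (below the fps target while neither the CPU nor the GPU is saturated, so the bottleneck is likely in the engine rather than the hardware) can be sketched as a toy classifier. The threshold and category names here are my own assumptions, not anything from a real profiling tool:

```python
# Toy sketch of the bottleneck reasoning above. The 90% "saturated"
# threshold and the category labels are assumptions for illustration.

def likely_bottleneck(fps, target_fps, cpu_util, gpu_util, saturated=0.90):
    """Roughly classify a bottleneck from utilization fractions (0.0-1.0)."""
    if fps >= target_fps:
        return "none"            # hitting target, nothing to diagnose
    if gpu_util >= saturated:
        return "gpu-bound"       # GPU maxed out: hardware limit
    if cpu_util >= saturated:
        return "cpu-bound"       # CPU maxed out: hardware limit
    return "software/engine"     # below target but hardware is idling

# The New Atlantis numbers: under 60fps, CPU ~55%, GPU ~75%
print(likely_bottleneck(fps=55, target_fps=60, cpu_util=0.55, gpu_util=0.75))
# → software/engine
```

Obviously real frame-time analysis is more involved (per-core load, frame pacing, stalls), but the basic logic is that: if nothing is saturated and you're still slow, the software is leaving performance on the table.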
I’m starting to think that maybe, just maybe, brute forcing a 26 year old engine that makes Skyrim have a stroke if you try to play above 30fps isn’t a good idea
What game engine is 26 years old other than the Unreal engine?
Edit: stepped on some toes i guess lmfao
Gamebryo, the base of the Creation Engine Bethesda used for this
Ah okay. Thank you for the actual answer
…like not launching with DLSS. What a weird oversight.
AMD is the official sponsor. That’s the one thing that wasn’t a surprise.
It’s not an oversight, they were paid to not include DLSS.
While I’m no fan of paid sponsorships holding back good games, this is untrue.
Neither Nvidia nor AMD blocks their partner devs from supporting competing tech in their games. They just won’t help them get it working, and obviously the other side won’t either, since that dev is sponsored. There are some games out there that support both, some of them even partnered.
So yes, it’s bullshit. But it’s not “literally paid” bullshit. Bethesda could have gone the extra mile, and didn’t.
AMD blocks partners from implementing DLSS. You’re probably right that it’s not paid bullshit as the payout isn’t monetary. But it’s still being blocked due to the partnership.
This is hardly the first game to do this. Jedi Survivor and RE4 have the same problem: AMD sponsored, FSR2 only. The work required to implement FSR2 or DLSS is basically the same (both need the engine’s motion vectors). That’s why DLSS mods were immediately available.
Since FSR2 was released, not a single AMD sponsored game has had DLSS added. Even games built on engines like Unreal, where all the dev has to do is include the plugin.
Literally not the case here, as evidenced by public communications.
I expected this once everyone kept buying into Nvidia’s DLSS.
Nvidia and DLSS will be required to get titles to run decently.
Minimal game optimization will be done on majority of future game titles.
Fml
Minimal game optimization will be done on majority of future game titles.
That’s more optimisation than we get now
Y’all are surprised the boss of a AAA studio suggested you buy hardware from companies he has a deeply vested interest in?
It’s all one big circle jerk of companies and anyone buying “cutting edge” gets what they deserve.
You’re the product in more ways than one
You’re literally the consumer in this instance. The game is the product. The computer is the product.
Why upgrade when I will just pick it up on the PS7, 10 years from now, along with the Skyrim bundle.
I have an i9 13900K and a Radeon 7900 XTX with 64GB RAM, and I had to refund it on Steam because it kept crashing to desktop every few minutes. Sometimes I wouldn’t even get past the Bethesda intro logo before crashing. Very frustrating experience to say the least.
I have an i7-10700K/32GB RAM/3080 Ti – playing the game at 4K with all settings maxed (without motion blur ofc), and with almost 80hrs into the game I have yet to hit a single crash or performance issue.
Only realized people were having issues when I saw posts and performance mods popping up.
I mean, the game definitely runs like shit but if you keep crashing that sounds like a you problem. My 7600x/6700XT/32GB DDR5 build hasn’t crashed once in 15 hours of playtime and I’ve heard a ton of complaints about the game but barely any about crashing.
Oh, only a 7900xtx? lol
I haven’t played Starfield yet, but many of the recent headliner releases have been performance hogs. It’s not unreasonable to expect people to either play at lower settings or upgrade if they want to run the best possible setup. That’s why there are performance sliders in most games. When you need a 3080 to run minimum settings, that’s when you start running into trouble (👀ksp 2)
Man, that’s why armored core blew me away. Completed the whole game, at launch, maximum settings and I don’t recall a single frame drop. 3060, with very mediocre other hardware. I know there’s a lot to be said about map sizes and instanced missions, but with as fantastic as that game looks and plays…
Same happened with Doom Eternal. The graphics were a show stopper when the game came out and the game didn’t even stutter. It’s so well optimized that I’m told you can even play it with integrated graphics.
It’s almost like having a giant open world comes with some massive drawbacks. I’m pretty fatigued over open world games tho so that may just be me.
Frankly, open world sucks. I played Far Cry 2 sometime last year because one of my friends spoke so highly of it, and I spent more time driving around than actually shooting anything. It served no purpose other than wasting the player’s time. Missions were rather basic too. And nothing in the reviews of more modern examples suggests anything has changed.
I have a 3060Ti and play most games on max settings. There is the occasional game that explodes if I do that but otherwise GPU power is out ahead of decently optimized games (probably because gaming is now no longer the driving factor for GPU performance).
Not that I’ll be buying it anytime soon but if the hardware specifications I’ve read are true, no graphics card is worth €500+ to play a game. This is bonkers.
If he’s telling us this, does that mean we get to bill him for the upgrade?
Starfield also requires an SSD, a first for a modern triple-A PC game.
I recall the same being said about Cyberpunk 2077, and I’m not sure that was the first either.
Cyberpunk doesn’t require an SSD; it had “SSD recommended” under its storage requirements, but not required. Starfield lists it as a requirement.
Cyberpunk also has a “HDD mode” in its options.
Because you load every time you walk through a door.
I stand corrected.
To be fair, Cyberpunk 2077 came out at the peak of Covid GPU scarcity. I was still gaming on a GTX 1080 at its release, and the only way I could have a decent experience was running it at 50% resolution scale with 100% sharpening.
BG3 has the same too.
Just upgrade your PC 4head
Its on Game Pass, Todd. If it doesn’t run well I’ll just not play Skyrim-Space Edition.
My partner who is interested has a PS5 and an older PC. If her PC doesn’t run it, she’ll probably just keep playing Stardew Valley. Honestly it’s not like anyone is going to really be talking about Starfield in a month or two except ridiculous ship builds on social media.
I bought a new PC just to play Starfield (and BG3 with less issues).
It looks alright overall. But it’s pretty crazy that even 30xx cards can’t run it well (I had a 1070 though).
Wish my computer weren’t dead, so I could at least try to play it. Although my 2070 wouldn’t have survived. It runs nice on my Series X, but I hate playing this type of game with a controller.
My 1060 would probably burst into flames at 640x480 then.
Ridiculous statement. I’ve got an RX 7900 XTX and a Ryzen 7 7700X with 64 gigs of RAM @ 5600MHz, and the fucking game barely ever hits 144fps. Usually it’s sitting around 100–110fps, which is playable for sure, but literally every other game I’ve played on it has had no problem staying nailed at 144fps. This is at low-medium settings BTW (for Starfield).
Ridiculous statement. 100-110fps is far above playable. Do people forget how Witcher, Crysis and others ran on release?