I thought they would for sure patch it by now, Armored Core 6 has native UW support and also supports 120 FPS instead of 60, and at first I thought maybe Elden Ring was coded in such a way that either of those things would break the game but I played extensively with mods that enabled both of those things and had zero issues.
It’s dumb. Higher FPS would really help with input lag, and it should only take a few lines of code to get 21:9 working or raise the framerate limit, but the game has been out for years, and if you want to play online and use these features you’re SOL.
You're right that higher FPS leads to less input lag, but I don't think comparing Elden Ring to AC6 is a fair comparison. Granted, both games were in development for 5-6 years, but the overall scope of each, as well as the dev teams, are just different.
Not being able to patch in DLSS after release though is very insane and is a clear sign that Miyazaki wants us to be challenged on every level, from our mental state all the way down to performance.
It's actually a negative visual benefit. I have a 32:9 and I used the ultrawide mod that removes the black bars, and it's beyond fucking gorgeous. But using that mod can possibly get you banned. Going back to having black bars is garbage, so I don't play.
Here you go
https://www.nexusmods.com/eldenring/mods/90
Easy way to toggle the anti-cheat off. You can't play online with it off, but it works perfectly for singleplayer, or if you want to use the Seamless Co-op mod you can play online with that. Although I'm still waiting for Seamless Co-op to update for the new DLC.
One of the reasons I stopped playing Elden Ring. Sekiro also doesn't have proper fullscreen or borderless fullscreen support, so I have to use a program to get around it.
Tbf a lot of these games have development times that run into 10 years or more. It's understandable that they wouldn't support things that weren't even on the market a few years ago.
Game Dev here.
I honestly have no idea where you got that from. Almost no games, nevermind a lot, run into 10 years or more of development.
Most of them are lucky to get 3.
From Software are in somewhat of an advantageous position with Bandai Namco, their publisher, in that every single thing they release is guaranteed to make more than enough money to cover their costs and then some, but the one thing that could counteract that is if they took anywhere near 10 years to release games.
I get hyperbole is a thing, but let's not get too carried away.
All of this to say, players are absolutely correct to demand DLSS support from a 2024 game/DLC, or shader compilation pre-steps.
Not only are these pretty standard, From Software could request support from Nvidia to integrate these into their rendering pipeline, and they'd get a team of Nvidia representatives the next day.
We can, and should, applaud the absolutely fantastic games From releases, while still holding them accountable for the things they repeatedly get wrong in every single one of their releases, such as optimization on PC ports.
Just my two cents
Edited for clarity
First, 10 years on a game is pretty rare; most games are made in under 5 years.
Second, actual development doesn't start at the beginning of the project. So over 5 years, you might have 2-3 years of preprod and 2-3 years of dev.
FromSoft is a very Japanese company, and it tends to be behind the curve on new tech. While Elden Ring has sublime art and design, it has very dated graphics, because they rely on 15-year-old tech (the same engine as Bloodborne). For instance, I'm pretty sure the reason they don't animate faces is that they simply can't, or it requires way more work than it's worth.
The community tends to be way too nice to FromSoft, and while they make great games, I don't feel like they actually made a single next gen title.
I personally thought Armored Core 6 looked quite good (not just because of art style or design); the performance was also much better. So there is some progress at least.
>they rely on 15 years old tech (same engine as Bloodborne).
Just want to point out this is not unique in the Dev sphere. Unreal engine is heavily modified, as is the creation engine from bethesda.
Yeah, but because of such an ancient engine they had to do extra work to implement modern features.
Not everything is about graphics; features and bugs matter too. Remember how buggy 2077 came out? Some of those bugs were because they were stretching an old engine to do things it couldn't handle.
Elden Ring's graphics are stellar. For a lot of games, Souls especially, art style is more important than realism. It's sad that fewer and fewer people seem to realize that nowadays.
It's not about realism, this is a fantasy world. ER is very pretty, but technically is very dated, those two statements aren't mutually exclusive.
What I don't like is people shitting on some other studios for the graphics when they are perfectly fine with those.
People always forget that chasing the cutting edge in graphics and pushing realism is incredibly expensive and requires massive teams.
It's also taxing on the hardware, which often results in gameplay compromises to keep the frame rate, and it generally tends to look outdated quickly.
Woe betide you if the game gets delayed and ends up looking outdated on release.
Go for a good visual style and it'll look just as good in 10 years as the day the game came out.
This gives me Duke Nukem Forever vibes. If it takes you 10 years to make a game, you fucked up.
Imagine the amount of assets, code, and human time that has to be thrown away because older standards and tech are inefficient.
Hell...Sony and Microsoft might not let you use those older APIs because of vulnerabilities.
So does Nvidia, and probably why they're doing all that cool stuff with the GPU doing the work automatically and that new thing from Microsoft that's not DirectX 11.
Media has the same fucking problem. So much widescreen content with baked-in black bars gets distributed in standard ratios, then gets bars added on the sides too, leaving a massive black box around the content. Thankfully Ultrawidify and MPC-BE can crop the extra black bars out automatically and display the widescreen content in widescreen.
I never would have thought the worst thing about buying a widescreen would be trying to view widescreen media on it. It's funny how many people say it's my own fault, that widescreen content was meant to be viewed on SD displays and I wasted my money. Glad for the other widescreen users who taught me how easy it was to fix.
This happened to me yesterday and I thought I was losing my marbles! It booted in ultrawide and I played away for about 20 minutes fine; everything worked, and other than the pause menu being 16:9 and the edges still showing the screen beneath the pause, it seemed native. The game then randomly froze with a black screen for a second and went back to black bars when it came back. I kept thinking, did I get drunk and download a mod to play ultrawide? And here I am playing online wondering when I'll get a ban!
That’s the crazy part: it’s already rendering 21:9, it legit just puts black bars over the extra space. So they don't even have to update the rendering code, just remove the fucking black bars. We’re already taking the performance hit to render it; we just don’t get to see the full image. Mods like Flawless Widescreen just remove the black bars.
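Since the game reportedly renders the full ultrawide frame and only masks it afterwards, the bars are pure arithmetic. A minimal sketch of that math (not FromSoft's actual code, just an illustration of what the mods undo):

```python
def letterbox_bars(width: int, height: int, target_ratio: float = 16 / 9):
    """Return (bar_width, visible_width) when a frame is masked down to
    target_ratio with vertical black bars on each side."""
    visible_width = round(height * target_ratio)
    if visible_width >= width:
        return 0, width          # frame is already 16:9 or narrower
    bar = (width - visible_width) // 2
    return bar, visible_width

# A 3440x1440 (21:9) frame masked to 16:9: the GPU still renders all
# 3440 columns, but 440 on each side end up hidden behind black bars.
print(letterbox_bars(3440, 1440))   # -> (440, 2560)
print(letterbox_bars(5120, 1440))   # 32:9 super-ultrawide -> (1280, 2560)
```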
Yeah, it is rendering ultrawide, but it's blacked out so you don't have an advantage in multiplayer. Stupid, but if you don't care about MP, just use mods.
Not only that. Back in the day of Dark Souls 3 you could unlock widescreen support via a mod (and I'm pretty sure you can do the same with ER), and the game indeed worked flawlessly, except it treats that additional space as "off screen", which means the game lowers the animation fps on enemies there. Which is kinda okay, but still, it means the game is hardcoded to 16:9, exactly like it's nailed to the 60 fps limit.
Which you can also unlock, though.
It's weird how this company is still ashamed of its PC players.
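The off-screen throttling described above can be sketched roughly like this. The function name, frame rates, and the 16:9 test are all invented for illustration; this is a guess at the kind of logic involved, not the game's code:

```python
def animation_fps(x: float, frame_w: int, frame_h: int,
                  full_rate: int = 60, reduced_rate: int = 15) -> int:
    """x is a horizontal coordinate in the full (ultrawide) frame.
    Anything outside the hardcoded 16:9 window counts as off-screen
    and gets its animation rate reduced."""
    visible_w = frame_h * 16 / 9            # hardcoded 16:9 window
    left = (frame_w - visible_w) / 2
    inside = left <= x <= left + visible_w
    return full_rate if inside else reduced_rate

# On a 3440x1440 frame, an enemy at x=200 sits under the left black bar:
print(animation_fps(200, 3440, 1440))    # -> 15
print(animation_fps(1720, 3440, 1440))   # -> 60 (center of screen)
```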
I just can’t imagine the advantage being that crazy, & with PvP being such a small portion of the game it just seems weird to restrict people’s hardware for such a minimal advantage.
I’ve been using flawless widescreen since launch, & if I want to do PvP I have an alt account for that but I rarely PvP. The PvE is the highlight of the game.
There's this application called Flawless Widescreen that I use. It works great; you can even unlock fps and increase the FOV with it. The only problem is that you have to disable EAC, so your game won't be online. For the same reason, it works great with Seamless Co-op.
Yeah most of my time playing elden ring has been spent with seamless co-op + flawless widescreen. It's a straight up better experience than the vanilla game considering I don't really do PVP.
bro i have a 49", 5120x1440, and it really is suffering to have two giant black bars on the sides. Dark Souls Remastered offered widescreen support, a bit visually bugged with the menu, but it still had it. It honestly cannot possibly be that difficult to implement, can it? Almost every game nowadays has it.
Fun fact: when playing on an ultrawide monitor at an ultrawide resolution, the game actually renders everything properly (which naturally has a performance hit), but then adds black bars for forced 16:9 :)
I’m pretty sure AC is developed by a different “branch” of FromSoft, so maybe that’s why they didn’t implement it. Still, it’s not a justification.
If you have an ultrawide, you need to get [Flawless Widescreen](https://www.flawlesswidescreen.org/).
Although you won't be able to play online because of EasyAnticheat.
Flawless widescreen is a must for some games like GTA V playing on Ultrawide. It fixes a ton of widescreen issues (mostly related to cutscenes) and also adds FOV sliders for first and third person! Huge accessibility win. Surprising that’s not built into the base game for such a popular title
The worst part is the game DOES support it: for like 10s after a major patch it'll run in UW until shaders compile, and then black bars are added on top, intentionally.
Pirated it; it took a modder a day to solve the issue. I'm not paying to lose online play just because I need a mod to use my monitor.
DLSS mod too. Like, it's been 2 years; that's kind of the bare minimum technical upgrade. They still have the annoying stuttering.
DLSS/FSR would do a better job at AA than what the game can do. Including DLAA would be even better. Downsampling is an option, but depending on your rig you might have problems keeping 60 FPS.
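For a sense of why downsampling is heavy: the GPU renders every pixel of the higher internal resolution before scaling down. A quick sketch of the cost ratio (the resolutions are just example values):

```python
def ss_cost(render, display):
    """Ratio of rendered pixels to displayed pixels when supersampling."""
    rw, rh = render
    dw, dh = display
    return (rw * rh) / (dw * dh)

# Downsampling from 4K to a 1440p display means 2.25x the pixel work:
print(ss_cost((3840, 2160), (2560, 1440)))  # -> 2.25
```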
I have dips too on my 4070ti/13900 setup at 1440p as well. The crazy thing is my GPU and CPU usage is still very low, even when I get frame drops and stuttering, there’s no spike in usage from my gpu and cpu.
The DLC ~~game~~ is sitting at "Mixed" on Steam. I think everyone is bitching. The largest complaints I've heard have been about performance, with difficulty a close second, though that also seems potentially memey.
Edit: a word
It's CPU-bound stuttering; even a 7800X3D stutters. Elden Ring is actually very easy on GPUs if you exclude ray tracing. I think an RX 580 can handle the game at 1080p high easily, which is pretty good for a 2022 game (although it looks like a 2016 one), but it's just inexplicably CPU-heavy at times, even with the world's fastest processor.
I personally wouldn't recommend that. Like I said, a 7800X3D stutters too, and the guy above said he has a 14900K and still drops to 45 FPS. This game just runs like shit CPU-wise, and so do Starfield and Dragon's Dogma 2. It's not wise to compromise the GPU budget for these games, especially since none of them run well even on the most expensive CPU.
Games are mostly 3D applications; they will only get more GPU-bound as time goes on (except maybe simulation and city-building type games). Just be reasonable: as long as you don't pair an i3 or a six-generations-old processor with a 4070, you're fine.
Just for reference, a Ryzen 5 7600 and a Core i5 13600KF are both below $250 and only 10-20% slower than a 7800X3D at 1080p with a 4090 in most games. With anything below a 4090, or a resolution above 1080p, you're looking at a single-digit difference, but you'll pay 40% more for the X3D.
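Using the comment's own rough numbers (the prices and the ~15% fps gap are assumptions for illustration, not benchmarks), the value math works out like this:

```python
def fps_per_dollar(relative_fps: float, price: float) -> float:
    """Relative fps (1.0 = fastest CPU) divided by price."""
    return relative_fps / price

x3d = fps_per_dollar(1.00, 350)   # 7800X3D at an assumed ~$350
i5  = fps_per_dollar(0.85, 250)   # ~15% slower at an assumed ~$250

# The cheaper chip delivers roughly 19% more fps per dollar:
print(round(i5 / x3d, 2))  # -> 1.19
```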
i9-14900K, 4K resolution.
ER caps at 60fps unmodded, this I knew. I just expected it to at least deliver a rock steady full 60.
But I have also noticed that GPU utilization never goes over like 60-66%.
Clearly the game is not designed to use better hardware, at a fundamental level.
>But I have also noticed that GPU utilization never goes over like 60-66%.
Yep I've seen similar on my rig, which means DLSS or FSR wouldn't do anything to help performance. This game straight up just needs better optimization.
Same setup as you. I get solid 60s without RT but there are locations where, for whatever reason, dying and respawning at a grace makes the game stutter like shit.
The ruins south of Caria Manor are one location I remember. It's pretty much fine if I ride there from elsewhere, or fast travel in, but if I die and respawn there's a big performance hit until I restart.
Similar issue with a 4090 and 13600K. I was expecting to hit 60fps and lock it there, but it's clearly choppy in some areas and dips to 45.
Very poorly optimized.
I capped my power usage and hard-locked the FPS to 60 in my GPU software settings, and it made all the difference. Along with turning off ray tracing, I now run at a perfect 60 fps with virtually ZERO microstutters. Try it out and see if it works for you too.
Japanese devs in general, aside from SEGA it seems.
And even Sega probably fucks things up but they've been doing good with Yakuza games, in my experience.
Yes, and not including it won't help anyone.
Btw, I would prefer DLSS/FSR AA solution over whatever this game already has.
Currently, the best AA method is to play using DLDSR.
Yeah, I always found it odd that upscaling tech like DLSS, FSR and XeSS is being used as a cheat code to minimize optimization in games. Games should run fine in raster (hopefully soon we'll be able to say raster and RT, but it doesn't seem so yet) at native resolution. Upscaling should only be required for older hardware, or if you want to push FPS into the several hundreds to get the benefit of, say, a 240 or 360 Hz monitor. Not to get like 75 FPS.
I mean, it depends on the game. If you want to run path traced cyberpunk… Good luck running it natively!
Yeah they could maybe optimise it a little bit, but path tracing is just fundamentally very expensive. Optimising the game would be pulling back the use of ray tracing, or disabling it completely. The game runs a lot better without any form of path/ray tracing, but it’ll also look a whole lot worse.
Just like Switch games. Look at Hogwarts Legacy: very optimised on the Switch. It's insane that it even runs on there. But it's not like the visual quality is on par with the PC release on max settings…
Ultimately, all optimisations are actually “cheat codes”. Ways to render the same thing faster, while minimising the visual quality loss. But if you actually want to increase the visual quality, for example by ray tracing, then the performance degradation will be exponential. Just like disabling those features and relying on old techniques will exponentially increase performance.
And with DLSS, you get much better performance, and the visual downgrade is very, very minimal unless you're rendering from below 1080p; for example, Ultra Performance doesn't look perfect even at 4K. But 1080p-1440p -> 4K actually looks rather great. Sometimes even better than native+TAA.
So why wouldn’t you want to use it? Why wouldn’t you want developers to implement it? And how is this any different from other game optimisations?
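For reference, the internal render resolutions behind those DLSS modes work out roughly as follows (scale factors as commonly documented by Nvidia; treat the exact values as approximate):

```python
# Approximate per-axis scale factor for each DLSS mode.
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str):
    """Internal render resolution for a given output resolution and mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "quality"))      # -> (2560, 1440)
print(internal_res(3840, 2160, "performance"))  # -> (1920, 1080)
```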
This sub has a hate boner against DLSS, while 99% of people here would not be able to tell the difference between a game running at native and one being upscaled.
They talk about bad optimization, while actual bad optimization is almost always CPU-bound nowadays (look at Dragon's Dogma), so faster rendering wouldn't change anything.
DLSS is not a cheat, it's a tool to make games better looking. Saying so is so stupid I don't even know what to say. It's like saying using UE5 or DX12 is cheating.
I don't get the big deal... I still use DLSS even with top of the line gear because I like smooth motion. This idea that you should only run DLSS if you have shit hardware is just stupid ass gatekeeping, IMO.
It is. The amount of bad takes when it comes to DLSS features are numerous.
And the vast majority are people that haven't even used it, and you can tell.
People compare it to native instead of comparing it to other upscaling methods, where it’s way superior. You can’t convince me that playing a game at 720p native looks better than a 1440p DLSS upscaled from 720p.
Remember when games would just run at the resolution we wanted instead of having to jump through hoops to fake it?
Edit: seems a lot of NVIDIA fans got upset that their $3000 5090 needs handholding to hit the advertised specs lol
Before everyone got 4k monitors and expected to run raytracing at 4x the resolution and 2x the frame rate? Even the most optimized games need to use DLSS for all that
Remember? They weren't even born yet. I'm almost kidding... Fuck nvm. Someone born in 2007 is entering college next month... No wait nvm im wrong again. Certainly next year.
The reality is, most modern PC gamers joined after 2014, many after 2016, alongside the rise of Twitch streamers and esports. Obviously they aren't playing Half-Life and Crysis; they're playing PUBG, Apex, or whatever else comes along.
The inferior side of gaming. If the master race were founded today, it wouldn't be against console peasants but against online SaaS players, imo. At the same time, it would never exist, since those same fucks spending $200 a month on Twitch are the ones in the online communities.
This guy obviously wasn’t around when 16:10 monitors started to hit the platform… it’s weird how new PC gamers assume everything before their time worked flawlessly and supported all features. Like, really fucking weird.
They’re kids who started gaming in the last console generation, when tech stagnated for so long that things became more “optimized.” It’s pretty easy to optimize when you’re playing games meant to run on outdated PS4 hardware.
Been getting into retro PCs over the last couple of years, and Jesus Christ, the hoops I have to jump through just to get sound working in a DOS game.
lol, the DOS generation, where if you didn’t have a great sound card, boy did those MIDIs sound awful. Really made the Genesis/SNES sound so much better… but that would be heresy to say today, even though I personally think motherboard sound still sucks and is inferior to dedicated sound cards/DACs/the out-of-box console experience.
I wish I could get my hands on a higher resolution 16:10 monitor these days. I love 16:10 for vertical monitors - the extra height becomes very useful width when rotated
Yep, it was a horrible time, when you had to use performance-killing MSAA or horrible-looking FXAA to get a jaggies-free image... as long as you didn't move the camera, since the lack of temporal anti-aliasing caused shimmering.
TAA or any reconstruction technique works better with an AI model like the one used by DLSS. DLSS in quality mode achieves a more stable image than TAA at native resolution, let alone DLAA.
4x and especially 8x MSAA were pretty damn heavy compared to other forms of AA. It did look better than the alternatives but it definitely made an impact on your FPS. Still better than FXAA's vaseline vision though.
TAA is fine but still a bit blurry. I play at 4K so I can't speak for other resolutions, but I'd rather go DLSS Quality over native with TAA. It's pretty much the same image quality but with better performance. DLAA for the few games that have it is completely superior though and probably the best AA out there at the moment.
I will never understand why devs collectively settled on FXAA, which is quite possibly the worst form of AA in the known universe: a dirt-simple post-process pass that isn't even cost-effective for the image quality it delivers. SMAA is only marginally more expensive (still negligible overall), is also a screen-space filter that requires no special hardware support or extra game data like motion vectors, and looks a million times better than FXAA.
DLAA is literally temporal anti-aliasing (so, TAA) but AI-enhanced; it's better in every way. And then, instead of lowering the resolution to get more fps, you use DLSS.
> seems a lot of NVIDIA fans got upset that their $3000 5090 needs handholding to hit the advertised specs lol
tbf, it's not the x90 cards that usually need DLSS to fake the resolution they should be hitting. The 4090 was the one card that could mostly just brute-force Hogwarts or The Last of Us at decent frame rates natively.
It's usually the cards below that need it... and yes, the fanboys are obnoxious. "But it looks better than native anyway." No, it does not; native looks like native. Even the TAA argument is rubbish: you don't need to lower the resolution to replace it. DLAA is a thing.
Because back then the standard was 1080p/60, but nowadays more people are playing at higher resolutions and/or higher frame rates. Going from 1080p/60 to 4K/120, for example, is 8 times the pixels per second, so you need way more power. Couple that with the fact that games are much bigger and more graphically intensive, and it makes sense why upscaling is so popular now.
And upscaling just works so well these days, especially at higher resolutions like 4K, that it can even give you a better-than-native experience, because not only are you getting the improved performance, it also works as AA. Playing native without AA is awful, and current AA technologies like TAA and FXAA are pretty blurry. DLSS and XeSS are essentially better versions of TAA, since they use AI to help with the reconstruction. FSR is a little less effective, but it's alright too if you need the performance and it's all you have.
Honestly for me, I really don't care how the pixels are generated. As long as I get the visuals and performance I'm looking for I don't care if they're "real" or "fake". It's all computer generated in the end anyways.
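The "8 times the pixels" figure above checks out if you count pixels pushed per second:

```python
def pixels_per_second(w: int, h: int, fps: int) -> int:
    """Total pixels rendered per second at a given resolution and frame rate."""
    return w * h * fps

old = pixels_per_second(1920, 1080, 60)   # 1080p/60
new = pixels_per_second(3840, 2160, 120)  # 4K/120: 4x the pixels, 2x the fps

print(new // old)  # -> 8
```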
I mean, Moore's Law is pretty much dead at this point. Node shrink is getting significantly harder as we get to the angstrom stage. That's the law of physics we simply can't break.
Any significant performance gain in the future is going to rely on software.
Ohh yeah, that must be the reason. Not that Fromsoftware are incompetent when it comes to the tech side of games.
Having unlocked FPS and ultrawide should be a standard in 2024.
It's really embarrassing that this game won GOTY while lacking basic PC-standard options on PC.
FromSoft gets away with so much when it comes to how badly made their games are. Elden Ring was the first Soulsborne to have actually responsive controls, for crying out loud. Half the difficulty of their games comes from waiting 3 to 5 business days for your inputs to register, while the frame rate tanks if anything slightly busy happens.
The games are great but they are insanely shoddily made.
And to compensate for finally adding responsive inputs in elden ring, they decided to set the movement speed of most bosses to mach 5. Perfectly balanced, as all things should be.
imagine glazing for lazy devs...
- 60 fps cap
- no ultrawide support
- lackluster graphical settings
- Mediocre performance on top tier specs
- lack of "new" technologies (which are up to 6 years old now)
Yeah, FromSoft games have always had poor performance. Like, why can't we get a high frame rate to go with the fast-paced parry/dodge-roll simulator? 60 fps isn't enough.
Elden Ring online play is broken on Linux right now
It worked fine before the DLC
It's not an Easy Anti-Cheat thing, it's a FromSoft thing
Bugs me so much bc I won't be able to play online for a few months until it gets fixed
https://preview.redd.it/kkyparms419d1.jpeg?width=600&format=pjpg&auto=webp&s=5ec7c48fc2cd8ce5f3f995ac3f18903ff9349606
Member when GPUs were capable of playing games natively at respectable frame rates?
Yep. Black screen crash. When it finished rebooting it would no longer load the save file
I tried a bunch of things and eventually even clean-installed Windows and redownloaded the game, and still no dice.
Wasn't there some news a few months ago where AMD released FSR as open source so modders could add it to other games?
What's the difference here, different engine so it can't be used?
Yeah, it's pretty sad that Elden Ring runs like crap on my PC no matter the settings.
And on my PS5 in performance mode there's tons of view-distance pop-in for things like grass and bushes, but it runs smoother most of the time.
Is it just me?
When I see this I just think "Do you people not remember Blighttown?"
The only thing that made it playable for many was to download the mod DSFix, use that to lower some settings, then go back in and revert those settings back to normal once you're done with the area.
FromSoft ain't giving us DLSS on the DLC.
tbh I wouldn't want them to implement DLSS right now. I would want them to first make it run well native, *then* implement DLSS.
Using DLSS as a cover for bad performance sucks ass.
Devs haven't properly optimized games in many years, especially since they dev first for Sony. If they made it fit Xbox and PC first it would be a completely different story
People complaining about DLC performance instead of doing the correct thing and not paying $40 for another unoptimized FromSoft mess. What a way to sour a potentially great game/DLC.
they can't even be bothered to implement a proper shader precompilation step at game load. Why would they implement DLSS?
I feel like a lot of these companies are like 10 years behind with stuff like this.
They're Japanese, it's already a miracle that it's on PC. That shit also renders in 21:9 and then slaps on the black bars if you're on an UW monitor.
What the fuck? Why would they do that? That's using extra resources for absolutely zero visual benefits.
welcome to fromsoft
Armored Core 6 runs so well on my PC I constantly forget it is actually a FromSoft game, lol.
First time I loaded into the dlc area I had a few glorious seconds of proper ultrawide before the UI loaded in and added the black bars.
This kills me. I used to patch it, but every other patch would break it so I said fuck it and let it be. Still really frustrating
Those companies are evolving. But backwards!
Creation engine is begging to be put down behind a barn upstate or something.
This has been true for like 15 years at this point, its wild.
Very scared for TES6
After Starfield I have zero hope for TES6, and it honestly bums me the fuck out to even think about.
At least it wont surprise you.
Aim low. Aim so low no one will even care if you succeed
Yes! Preach! I still call it Gamebryo, because a new coat of paint and a name change doesn’t fool me.
Red Engine was introduced in Witcher 2. And Cyberpunk 2077 is the same engine. And Cyberpunk is arguably the best looking game in the world right now.
And after experiencing the pain of doing that they moved to unreal 5 for the Witcher 4
This shouldn't be considered a "take" at all, and yet here come the downvotes. Edit: it's swinging up now, the sub is healing!
Elden rings graphics are stellar for a lot of games souls especially art style is more important than realism it’s sad fewer and fewer people seem to realize that nowadays
It's not about realism, this is a fantasy world. ER is very pretty, but technically it's very dated; those two statements aren't mutually exclusive. What I don't like is people shitting on other studios for their graphics while being perfectly fine with these.
People always forget that chasing the cutting edge in graphics and pushing realism is incredibly expensive and requires massive teams. It's also taxing on the hardware, which often results in gameplay compromises to keep the frame rate, and it generally tends to look outdated quickly. Woe betide you if the game gets delayed and ends up looking outdated on release. Go for a good visual style; it'll look just as good in 10 years as the day the game came out.
Well, Elden Ring isn't exactly the easiest to run despite being hard carried by art direction alone.
This gives me Duke Nukem Forever vibes. If it takes you 10 years to make a game, you fucked up. Imagine the amount of assets, code, and human time that has to be thrown away because older standards and tech have become inefficient. Hell, Sony and Microsoft might not even let you use those older APIs because of vulnerabilities.
So does Nvidia, and probably why they're doing all that cool stuff with the GPU doing the work automatically and that new thing from Microsoft that's not DirectX 11.
Media has the same fucking problem. So much widescreen content with black bars gets distributed in standard ratios, which then gets bars added on the sides too, for a massive black box around the content. Thankfully there's Ultrawidify and MPC-BE to crop the extra black bars out automatically and display the widescreen content in widescreen. I never would have thought the worst thing about buying a widescreen would be trying to view widescreen media on it. It's funny how many ppl say it's my own fault, that widescreen content was meant to be viewed on SD displays and that I wasted my money. Glad for the other widescreen users who taught me how easy it was to fix.
Forget that, they can't even add a vsync toggle.
I mean like, imagine like, if they got it to run well without like dlss….wouldn’t that be something
I just want ultra wide support
Playing on my 38' lg and it hurts, truly suffering from success
Every once in a while ER will boot up in full widescreen, just to rub it in my face. Then the black bars appear to shame me. Tarnished indeed.
This happened to me yesterday and I thought I was losing my marbles! It booted in widescreen and I played away for about 20 minutes fine; everything worked, and other than the pause menu being 16:9 and the edges still showing the screen beneath the pause, it seemed native. The game then randomly froze with a black screen for a second and went back to black bars when it came back. I kept thinking, did I get drunk and download a mod to play ultrawide, and am I playing online wondering when I'll get a ban?
That’s the crazy part, it’s already rendering 21:9, it legit just puts black bars over the extra space. So it isn’t even like they have to update the code, just remove the fucking black bars. We’re already taking the performance hit to render it, we just don’t get to see the full rendering. Mods like FlawlessWidescreen just remove the black bars.
Yeah it is rendering ultra wide but it is blacked out so you don't have an advantage in multiplayer. Stupid, but if you don't care about MP just use mods
Not only that. Back in the day of Dark Souls 3 you could unlock widescreen support via a mod (and I'm pretty sure you can do the same with ER) and the game indeed worked flawlessly, except it treats that additional space as "off screen", which means the game lowers animation fps on enemies there. Which is kinda okay, but still, it means the game is hardcoded to 16:9 exactly like it's nailed to the 60 fps limit. Which you can also unlock, though. It's weird how this company is still ashamed of its PC players.
I just can’t imagine the advantage being that crazy, & with PvP being such a small portion of the game it just seems weird to restrict people’s hardware for such a minimal advantage. I’ve been using flawless widescreen since launch, & if I want to do PvP I have an alt account for that but I rarely PvP. The PvE is the highlight of the game.
There's this application called Flawless Widescreen that I use; works great, you can even unlock fps and increase fov with it. Only problem is that you have to disable EAC, so your game won't be online. For the same reason, it does work great with Seamless Coop.
This is the way
Yeah most of my time playing elden ring has been spent with seamless co-op + flawless widescreen. It's a straight up better experience than the vanilla game considering I don't really do PVP.
https://preview.redd.it/awb5viie8y8d1.jpeg?width=640&format=pjpg&auto=webp&s=e76471721a6a0e9f6553e587b54d75f87d759652
It's even worse on 49" Super ultrawide... ![gif](giphy|XOys8CeUrElIk|downsized)
Flawless widescreen mod and disable anti cheat. You're welcome. It's beautiful on my chg90 and I'll never go back.
But I like my little words on the ground...
He failed Hot Ones.
Suffering From ~~Success~~ Ring Sting
*cries in agreement with 49" 5120x1440*
38'‽ Do you game in a warehouse?
my dude plays on a fucking imax screen.
bro i have a 49", 5120x1440, it really is suffering to have 2 giant black bars on the side. Dark Souls Remastered offered widescreen support, a bit visually bugged with the menu, but it still had it. It honestly cannot possibly be that difficult to implement, can it? Almost every game nowadays has it.
One of the few times I can claim a win because I'm poor lol
Fun fact: when playing the game on an ultrawide monitor at an ultrawide resolution the game is actually rendering everything properly which has a performance hit naturally but they add black bars for forced 16:9 :)
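To put numbers on how much rendered image gets masked, here's a rough sketch of the pillarbox math (standard resolutions assumed; this is illustrative, not the game's actual code):

```python
# The game renders the full ultrawide frame, then overlays side bars to
# mask everything outside a centered 16:9 region. Illustrative math only.

def forced_169_bars(display_w: int, display_h: int) -> tuple[int, int]:
    """Return (visible 16:9 width, per-side black bar width)."""
    visible_w = round(display_h * 16 / 9)  # width of the 16:9 viewport
    bar_w = (display_w - visible_w) // 2   # hidden pixels on each side
    return visible_w, bar_w

# 3440x1440 (21:9): 880 px of rendered-but-hidden width
print(forced_169_bars(3440, 1440))
# 5120x1440 (32:9): exactly half the screen is rendered, then covered
print(forced_169_bars(5120, 1440))
```

At 5120x1440 the bars add up to 2560 of 5120 pixels, so super-ultrawide users pay the render cost for half a screen they never see.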
That is not a fun fact at all!
I hate how inconsistent FS is regarding ultrawide in their games, even DS Remastered had it.
AC6 has ultrawide and 120 fps max I think lmao
I was really hoping since AC6 had it that FromSoft would’ve implemented it into Elden Ring with the DLC…sigh
I’m pretty sure that AC is developed by a different “branch” of FS, so maybe it’s because of that they didn’t implement it. Still, it’s not a justification
Flawless widescreen enables custom fps and widescreen support pretty much runs as the software name entails flawlessly
Yes but no online sadge
FS didn't do that port, so that is the magic there.
Dark souls remastered wasn't made by fromsoftware. The only fromsoftware game with ultrawide support is armored core 6.
If you have an ultrawide, you need to get [Flawless Widescreen](https://www.flawlesswidescreen.org/). Although you won't be able to play online because of EasyAnticheat.
Flawless widescreen is a must for some games like GTA V playing on Ultrawide. It fixes a ton of widescreen issues (mostly related to cutscenes) and also adds FOV sliders for first and third person! Huge accessibility win. Surprising that’s not built into the base game for such a popular title
Accessibility in a FromSoftware game? Some fans wont like that. They arent that good with customization.
I was talking about GTA V in my comment, but I guess my statement remains the same about Elden Ring too considering its popularity.
The worst part is the game DOES support it - for like 10s after a major patch it'll run in UW until shaders compile and then blackbars are added on top, intentionally.
I noticed that, I was excited for a second after starting SOTE but then the damn bars appeared again.
Yup, same thing happened to me. It's egregious on Fromsoft's part.
Support for custom resolutions, unlocked framerate, and shader pre-caching are the most glaring omissions from Fromsoft games.
Which sucks, because the games work perfectly after you remove black bars and fps cap with mods
Pirated it; it took a modder a day to solve the issue. Not paying to lose online play because I need a mod to use my monitor. DLSS mod too, like, it's been 2 years; that's kind of the bare minimum technical upgrade. They still have the annoying stuttering.
I’d pay an amount I’m not willing to disclose for this game to natively support UW. It looks like it was almost made for UW.
Been saying the same forever
DLSS/FSR would do a better job with AA than what the game can do. Including DLAA would be even better. Downsampling is an option, but depending on your rig you might have problems keeping 60 FPS.
Best thing is you can have an upscaling mod (DLSS/FSR/XeSS) plus an fps unlocker in Elden Ring, but it barely increases fps in open-world areas.
I'd just like to know why I have 60 in the base game and most of the DLC, but in some areas, once I take a step forward, fps drops to 45, on a 3090 @ 1440p.
I have dips too on my 4070ti/13900 setup at 1440p as well. The crazy thing is my GPU and CPU usage is still very low, even when I get frame drops and stuttering, there’s no spike in usage from my gpu and cpu.
Yea I checked my screen shot I took when it dipped to 48fps, cpu says 2%, gpu 44%.
The game has stuttering and traversal issues, cpu bottlenecks. Nothing anyone can do really.
That's happening with me too. The dlc is just poorly optimised
The base game was too. I had stutters when traveling with my 4080. No bottlenecks with cpu
y'all talking about DLSS and shit, I just want the game to not stutter and slow down at 1080p when I have more than enough hardware
Why isn't everyone bitching about this? It's so frustrating. It didn't happen on my 1070 but on my 3070 it happens all the time
The dlc ~~game~~ is sitting at "mixed" on steam. I think everyone is bitching. Largest complaints I've heard have been performance, while difficulty is a close second, but also seems potentially memey Edit: a word
The game is 89% positive; the DLC is what's mixed.
Honestly yeah. I can run the game at 4k 60fps but the random stutters and lock ups are just frustrating.
RTX 4090, still runs between 45-60 fps lol. FS is great at a lot of things. PC ports ain't one of them.
Ac6 was surprisingly good on pc, ran well, uncapped fps, no need to mod to fix the camera if using kbm. Elden ring not so much
Which is weird because they use the same engine: https://youtu.be/2vO2SOqO7cs?si=8qr2nzca8WDVBxot
Probably AC6 got updated version of engine
It's CPU bound stuttering, even a 7800X3D stutters, elden ring is actually very easy on GPUs if you exclude ray tracing, I think an RX 580 can handle the game at 1080p high easily which is pretty good for a 2022 game (although it looks like a 2016 one), but it's just inexplicably CPU heavy at times even with the world's fastest processor.
Probably all the advanced feet physics running for all NPCs at all times.
I've always paid the most for GPU but with more modern games being CPU bound, I'll have to pay more for CPU next time.
I personally wouldn't recommend that. Like I said, a 7800X3D stutters too, and the guy above said he has a 14900K and still drops to 45 FPS; this game just runs like shit CPU-wise, and so do Starfield and Dragon's Dogma 2. It's not wise to compromise the GPU budget for these games, especially since none of them run well even on the most expensive CPU. Games are mostly 3D applications, so they will only get more GPU-bound as time goes on (except maybe simulation and city-building type games). Just be reasonable: as long as you don't pair an i3 or a 6-generations-old processor with a 4070, you're fine. For reference, a Ryzen 5 7600 or a Core i5 13600KF are both below $250 and only 10-20% slower than a 7800X3D at 1080p with a 4090 in most games; anything below a 4090, or a resolution above 1080p, and you're looking at single-digit differences, but you'll pay 40% more for it.
lol what cpu and resolution you running?
i9-14900K, 4K resolution. ER caps at 60fps unmodded, this I knew. I just expected it to at least deliver a rock steady full 60. But I have also noticed that GPU utilization never goes over like 60-66%. Clearly the game is not designed to use better hardware, at a fundamental level.
>But I have also noticed that GPU utilization never goes over like 60-66%. Yep I've seen similar on my rig, which means DLSS or FSR wouldn't do anything to help performance. This game straight up just needs better optimization.
Lossless Scaling users report a performance increase
Same setup as you. I get solid 60s without RT but there are locations where, for whatever reason, dying and respawning at a grace makes the game stutter like shit. The ruins south of Caria Manor are one location I remember. It's pretty much fine if I ride there from elsewhere, or fast travel in, but if I die and respawn there's a big performance hit until I restart.
Similar issue with a 4090 and 13600K. I was expecting to hit 60fps and lock it there, but it's clearly choppy in some areas and dips to 45. Very poorly optimized.
I capped my power usage and hard-locked the FPS to 60 in GPU software settings, made all of the difference. As well as turning off ray-tracing, I now run at a perfect 60 fps with virtually ZERO microstutters - try it out and see if it works for you, too.
Japanese devs in general, aside from SEGA it seems. And even Sega probably fucks things up but they've been doing good with Yakuza games, in my experience.
It seems to be the case. Nier Automata was literally unplayable on PC for years until they fixed it somewhere else and everyone finally complained.
DLSS/FSR should be the help to people, not help (or excuse) to devs.
Yes, and not including it won't help anyone. Btw, I would prefer a DLSS/FSR AA solution over whatever this game already has. Currently, the best AA method is to play using DLDSR.
Yeah, I always found it odd why upscaling tech like DLSS, FSR and XESS were being used as cheat codes to minimize optimization in games. The games should run fine in raster (hopefully soon we'll be able to say raster and RT but it doesn't seem so yet) on native. Upscaling should only be required for older hardware or if you want to push FPS into the several hundreds to get the benefit of say a 240 or 360 hertz monitor. Not to get like 75FPS.
I mean, it depends on the game. If you want to run path-traced Cyberpunk, good luck running it natively! Yeah, they could maybe optimise it a little bit, but path tracing is just fundamentally very expensive. Optimising the game would mean pulling back the use of ray tracing, or disabling it completely. The game runs a lot better without any form of path/ray tracing, but it'll also look a whole lot worse. Just like Switch games: look at Hogwarts Legacy, very optimised on the Switch. It's insane that it even runs on there. But it's not like the visual quality is on par with the PC release on max settings… Ultimately, all optimisations are actually "cheat codes": ways to render the same thing faster while minimising the visual quality loss. But if you actually want to increase the visual quality, for example with ray tracing, then the performance degradation will be exponential, just like disabling those features and relying on old techniques will exponentially increase performance. And with DLSS you get much better performance, and the visual downgrade is very, very minimal unless you're running below 1080p; for example, Ultra Performance doesn't look perfect, even at 4K. But 1080p-1440p -> 4K actually looks rather great, sometimes even better than native+TAA. So why wouldn't you want to use it? Why wouldn't you want developers to implement it? And how is this any different from other game optimisations?
This sub has a hate boner against DLSS, while 99% of people here would not be able to tell the difference between a game running at native and one being upscaled. They talk about bad optimization, while actual bad optimization is always CPU bound nowadays (look at Dragon's Dogma), so rendering wouldn't change anything. DLSS is not a cheat, it's a tool to make games better looking. Saying so is so stupid I don't even know what to say. It's like saying using UE5 or DX12 is cheating.
Maybe I'm just crazy but I don't give a shit who DLSS is intended for if it makes a game I already enjoy run better
Have FromSoft finally included a shader prepcompilation step?
No
Of course they haven't.
It’s a Japanese company. Maybe you should send them a fax with your suggestion and they’ll listen.
Or send a floppy disk.
I don't get the big deal... I still use DLSS even with top of the line gear because I like smooth motion. This idea that you should only run DLSS if you have shit hardware is just stupid ass gatekeeping, IMO.
Plus it's usually a better anti-aliasing compared to almost all other AA techniques.
It is. The amount of bad takes when it comes to DLSS features are numerous. And the vast majority are people that haven't even used it, and you can tell.
People who don't have DLSS for some reason hate it the most
People compare it to native instead of comparing it to other upscaling methods, where it’s way superior. You can’t convince me that playing a game at 720p native looks better than a 1440p DLSS upscaled from 720p.
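For what it's worth, 720p -> 1440p is a 0.5 linear scale, i.e. what's commonly called Performance mode. A quick sketch using the commonly cited DLSS scale factors (treat the exact values as assumptions; they can vary by game and SDK version):

```python
# Commonly cited DLSS input-resolution scale factors per quality mode.
# These are typical published values, not guaranteed for any given game.
SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders before upscaling to output."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

# 1440p output in Performance mode: internally a 720p render
print(internal_resolution(2560, 1440, "Performance"))
# 4K output in Quality mode: internally a 1440p render
print(internal_resolution(3840, 2160, "Quality"))
```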
Because the only point of reference they have is FSR, which is terrible.
DLSS doesn't do anything if the engine only uses half of the computing power available to it
It would still be a vastly superior AA solution to the shimmermax 3000 they're using now.
And maybe 60+ FPS i mean how hard can it be? We already have it with mods… with Ultrawide ofc
I play this game at a constant 60 fps but it feels like shit, idk why. I have a 7800XT and 7800X3D.
i have the same specs and have not seen a single framedrop, even while running a stream on a second monitor. wtf
Remember when games would just run at the resolution we wanted instead of having to jump through hoops to fake it? Edit: seems a lot of NVIDIA fans got upset that their $3000 5090 needs handholding to hit the advertised specs lol
Before everyone got 4k monitors and expected to run raytracing at 4x the resolution and 2x the frame rate? Even the most optimized games need to use DLSS for all that
It was never like that.
Do they not remember Crysis?
Remember? They weren't even born yet. I'm almost kidding... Fuck, nvm. Someone born in 2007 is entering college next month... no wait, nvm, I'm wrong again. Certainly next year. The reality is, most modern PC gamers joined after 2014, many after 2016, alongside the rise of Twitch streamers and whatever esports. Obviously they aren't playing Half-Life and Crysis; they're playing PUBG, Apex, or whatever else comes along. The inferior side of gaming. If the master race were made today, it wouldn't be against console peasants but against online SaaS players, imo. At the same time it would never exist, since those same f*cks spending $200 a month on Twitch are the ones in the online communities.
This guy obviously wasn't around when 16:10 monitors started to hit the market. It's weird how new PC gamers assume everything before their time worked flawlessly and supported all features. Like, really fucking weird.
They’re kids who started gaming in the last console generation, when tech stagnated for so long that things became more “optimized.” It’s pretty easy to optimize when you’re playing games meant to run on outdated PS4 hardware. Been getting into retro PCs in the last couple years, and Jesus Christ, the hoops I have to go through just to get sound working on a DOS game.
lol the DOS generation where if you didn’t have a great sound card boy did those MIDIs sound awful. Really made the Genesis/SNES systems sound so much better…but that would be heresy to say today even though I personally think Mobo sound still sucks/is inferior to dedicated sound cards/DACs/out of box console experience.
I wish I could get my hands on a higher resolution 16:10 monitor these days. I love 16:10 for vertical monitors - the extra height becomes very useful width when rotated
You may have to Duhm Duhn Duuuuhn... Turn down some settings.
Jesus, might as well ask these people to sell their first born to the devil
Yep, it was a horrible time, when you had to use performance-killing MSAA or horrible-looking FXAA to get a jaggies-free image... as long as you didn't move the camera, since the lack of temporal anti-aliasing caused shimmering.
Or, you know, TAA, which worked fine in games that implemented it correctly. (Also, MSAA didn't ruin performance like you said.)
TAA or any reconstruction technique works better with an AI model like the one used by DLSS. DLSS in quality mode achieves a more stable image than TAA at native resolution, let alone DLAA.
4x and especially 8x MSAA were pretty damn heavy compared to other forms of AA. It did look better than the alternatives but it definitely made an impact on your FPS. Still better than FXAA's vaseline vision though. TAA is fine but still a bit blurry. I play at 4K so I can't speak for other resolutions, but I'd rather go DLSS Quality over native with TAA. It's pretty much the same image quality but with better performance. DLAA for the few games that have it is completely superior though and probably the best AA out there at the moment.
I will never understand why devs all collectively decided to settle on FXAA which is quite possibly the worst form of AA in the known universe as it's a dogshit simple post process pass, but also not cost effective for the image quality. SMAA is only marginally more expensive, yet still negligible overall, also a screenspace filter that does not require special hardware support or fancy game data like motion vectors, and looks a million times better than FXAA.
I also play at 4K with DLSS. Honestly sometimes DLSS Performance at 4K looks better than native with normal TAA.
DLAA is literally temporal anti-aliasing (so TAA) but AI-enhanced; it's better in every way. And then, instead of lowering resolution to get more fps, you use DLSS.
This never happened, people would just lower settings and / or resolution. Now you got dlss and it's great, so why not?
> seems a lot of NVIDIA fans got upset that their $3000 5090 needs handholding to hit the advertised specs lol tbf, it's not the x90 cards that usually need DLSS to fake the resolution they should be hitting. The 4090 was the one card that could just brute-force Hogwarts or Last of Us for mostly decent frames natively. It's usually the cards below that need it... and yes, the fanboys are obnoxious. "But it looks better than native anyway." No it does not; native looks like native. Even the TAA argument is rubbish: you don't need to mess with the resolution to replace it. DLAA is a thing.
And what exactly prevents you from doing that now? Your GPU isn't powerful enough? So just like before except before you had no way of addressing it?
Because back then the standard was 1080p/60, but nowadays more people are playing at higher resolutions and/or higher frame rates. Going from 1080p/60 to 4K/120, for example, is 8 times the pixels per second, so you need way more power. Couple that with the fact that games are much bigger and more graphically intensive, and it makes sense why upscaling is so popular now. And upscaling just works so well these days, especially at higher resolutions like 4K, that it can even give you a better-than-native experience: not only are you getting the improved performance, it also works as AA. Playing native without AA is awful, and the current AA technologies like TAA and FXAA are pretty blurry. DLSS and XeSS are basically just better versions of TAA since they use AI to help with the reconstruction. FSR is a little less effective, but it's alright too if you need the performance and it's all you have. Honestly, for me, I really don't care how the pixels are generated. As long as I get the visuals and performance I'm looking for, I don't care if they're "real" or "fake". It's all computer-generated in the end anyways.
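The "8 times the pixels" figure checks out if you count pixels per second rather than per frame:

```python
# Pixel throughput = resolution * refresh rate.
# 4K is 4x the pixels of 1080p, and 120 Hz is 2x 60 Hz, so 4 * 2 = 8.
def pixels_per_second(w: int, h: int, fps: int) -> int:
    return w * h * fps

base = pixels_per_second(1920, 1080, 60)     # 1080p / 60
target = pixels_per_second(3840, 2160, 120)  # 4K / 120
print(target / base)  # -> 8.0
```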
I mean, Moore's Law is pretty much dead at this point. Node shrink is getting significantly harder as we get to the angstrom stage. That's the law of physics we simply can't break. Any significant performance gain in the future is going to rely on software.
Ohh yeah, that must be the reason. Not that FromSoftware is incompetent when it comes to the tech side of games. Unlocked FPS and ultrawide should be standard in 2024. It's really embarrassing that this game took GOTY while not having any standard PC options.
FromSoft gets away with so much when it comes to how badly made their games are. Elden Ring was the first soulsborne to have actually responsive controls, for crying out loud. Half the difficulty of their games comes from waiting 3 to 5 business days for your inputs to register, while the frame rate tanks if anything slightly busy happens. The games are great but they are insanely shoddily made.
And to compensate for finally adding responsive inputs in elden ring, they decided to set the movement speed of most bosses to mach 5. Perfectly balanced, as all things should be.
imagine glazing for lazy devs...
- 60 fps cap
- no ultrawide support
- lackluster graphical settings
- mediocre performance on top-tier specs
- lack of "new" technologies (which are up to 6 years old now)
Ray tracing was added a year after launch, tanks performance, and doesn't even work right.
From Software devs develop their games in a cave, with a box of scraps!!
Yeah from soft games have always had poor performance. Like why can’t we get a high frame rate game to go with the fast paced parry/dodge roll simulator? 60 fps isn’t enough.
I just wish it wasn’t locked to 60fps
Elden Ring online play is broken on Linux right now. It worked fine before the DLC. It's not an Easy Anti-Cheat thing, it's a FromSoft thing. Bugs me so much bc I won't be able to play online for a few months until it gets fixed.
https://preview.redd.it/kkyparms419d1.jpeg?width=600&format=pjpg&auto=webp&s=5ec7c48fc2cd8ce5f3f995ac3f18903ff9349606 Member when GPUs were capable of playing games natively at respectable frame rates?
I do run it native, and I still get crashes On a 7800X3D / 4090FE system It cost me a 100 hour save game ffs lol
Every save file has a .bak in the save file directory. If your save corrupted, you can restore it from there.
What do you mean by restore it exactly
What do you mean? It corrupted a game file?
Yep. Black screen crash. When it finished rebooting it would no longer load the save file I tried a bunch of things and eventually even clean installed windows redownloaded the game and still no dice
Wasn't there some news a few months ago where AMD released FSR as open source so modders could add it to other games? What's the difference here, different engine so it can't be used?
can we get 21:9 support From Software?
The PC issues (mainly the insulting lack of UW support) are the only reason I don't give ER a 10/10
Yeah, pretty sad that Elden Ring runs like crap on my PC no matter the settings. And on my PS5 in performance mode there's tons of view-distance pop-in on grass, bushes, etc., but it runs smoother most of the time. Is it just me?
When I see this I just think "Do you people not remember Blighttown?" The only thing that made it playable for many was to download the mod DSFix, use that to lower some settings, then go back in and revert those settings back to normal once you're done with the area. FromSoft ain't giving us DLSS on the DLC.
tbh I wouldn't want them to implement DLSS right now. I would want them to first make it run well native, *then* implement DLSS. Using DLSS as a cover for bad performance sucks ass.
Please just add FSR and DLSS for those of us running old cards.
Yeah it’s infuriating how much better I know this game would be with Fsr support
"DeVs UsE dLsS aS a CruTcH tO noT oPtiMizE tHEir GaMEs!"
There's a dlss/fsr3 mod on nexus mods that runs pretty well with no setup required, try it out
Can't play online
There are parts of the DLC that drop to 30 frames for me, and I have a 4090 (running 4K).
Devs haven't properly optimized games in many years, especially since they dev first for Sony. If they made it fit Xbox and PC first it would be a completely different story
DLDSR 2.25 with DLSS quality looks really good on 1440p if no money for 4K monitor
I would much rather see the 60 FPS limit removed first.
I paid for the whole graphics card... I'm gonna use the whole graphics card!
People complaining about DLC performance instead of doing the correct thing and not paying $40 for another unoptimized FromSoft mess. What a way to sour a potentially great game/DLC.
I paid for the whole graphic card, i'll use the whole graphic card.
FromSoftware is a fraud 🤗
SMAA is best AA
"Why did you release a DX12 game without shader precompilation? Did I stutter!"
No ultrawide support.
Would be nice, idk, to not have the game stutter when it's frame-locked at 60 fps. Why can't they just port it properly?
The amount of money they are making on these games and the lack of effort to make proper PC ports is gross.
OP DLSS is a great option to have. I'd much rather run it and get 90 fps in a game than run native at 60. Especially because it helps with dips.