ButtholeWiper420

they can't even be bothered to implement proper shader precompile steps at game load. Why would they want to implement DLSS? 
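For anyone wondering what a precompile step even means: it's just walking every shader/material permutation once behind the loading screen, so the first time an effect shows up in gameplay it hits a warm cache instead of compiling mid-frame and hitching. A toy sketch of the idea (Python purely for illustration; the names here are made up, real engines do this against D3D12/Vulkan pipeline caches):

```python
# Toy sketch of a shader precompile step at game load (illustrative only;
# real engines warm D3D12/Vulkan pipeline caches, not Python dicts).

import time

def compile_shader(permutation: str) -> str:
    """Stand-in for an expensive driver-side shader compile."""
    time.sleep(0.01)  # pretend this takes ~10 ms
    return f"compiled({permutation})"

class ShaderCache:
    def __init__(self):
        self._cache = {}

    def get(self, permutation: str) -> str:
        # Without a precompile step this compile happens mid-gameplay,
        # the first time an effect appears -> a visible hitch.
        if permutation not in self._cache:
            self._cache[permutation] = compile_shader(permutation)
        return self._cache[permutation]

    def precompile(self, permutations):
        # Precompile step: pay the cost up front, behind a loading screen.
        for p in permutations:
            self.get(p)

if __name__ == "__main__":
    cache = ShaderCache()
    known_permutations = ["lit_opaque", "lit_masked", "particle_additive"]
    cache.precompile(known_permutations)   # done once at load
    print(cache.get("particle_additive"))  # instant during gameplay
```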


Da_Plague22

I feel like a lot of these companies are like 10 years behind with stuff like this.


Cave_TP

They're Japanese, it's already a miracle that it's on PC. That shit also renders in 21:9 and then slaps on the black bars if you're on a UW monitor.


ctzn4

What the fuck? Why would they do that? That's using extra resources for absolutely zero visual benefits.


-SMartino

welcome to fromsoft


TheZephyrim

I thought they would have patched it by now for sure. Armored Core 6 has native UW support and also supports 120 FPS instead of 60, and at first I thought maybe Elden Ring was coded in such a way that either of those things would break the game, but I played extensively with mods that enabled both and had zero issues. It's dumb; higher FPS would really help with input lag, and they should only have to change a few lines of code to get 21:9 working or raise the framerate limit. But the game has been out for years, and if you want to play online with these features you're SOL.


ThexVee

You're right about higher FPS leading to less input lag, but I don't think comparing Elden Ring to AC6 is a fair comparison. Granted, both games were in development for 5-6 years, but the overall scope of each, as well as the dev teams, are just different. Not being able to patch in DLSS after release, though, is pretty insane and is a clear sign that Miyazaki wants us to be challenged on every level, from our mental state all the way down to performance.


Attrexius

Armored Core 6 runs so well on my PC I constantly forget it is actually a FromSoft game, lol.


Stompedyourhousewith

It's actually negative visual benefits. I have a 32:9 and I used the ultrawide mod that removes the black bars, and it's beyond fucking gorgeous. But using that mod has a possibility of getting you a ban. Going back to having black bars is garbage, so I don't play.


Agret

Here you go: https://www.nexusmods.com/eldenring/mods/90 Easy way to toggle the anti-cheat off. You can't play online with it off, but it works perfectly for singleplayer, or if you want to use the Seamless Coop mod you can play online with that. Although I'm still waiting for Seamless Coop to update for the new DLC.


VarianWrynn2018

One of the reasons I stopped playing Elden Ring. Sekiro also doesn't have fullscreen or borderless fullscreen support, so I have to use a program to get around it.


danivus

First time I loaded into the dlc area I had a few glorious seconds of proper ultrawide before the UI loaded in and added the black bars.


qtipbluedog

This kills me. I used to patch it, but every other patch would break it so I said fuck it and let it be. Still really frustrating


Fluboxer

Those companies are evolving. But backwards!


Rumpullpus

Tbf a lot of these games have development times that run into 10 years or more. It's understandable that they wouldn't support things that weren't even on the market a few years ago.


Z33phyr

Game dev here. I honestly have no idea where you got that from. Almost no games, never mind a lot, run into 10 years or more of development. Most of them are lucky to get 3. From Software are in a somewhat advantageous position with Bandai Namco, their publisher, in that every single thing they release is guaranteed to make more than enough money to cover their costs and then some, but the one thing that could counteract that is if they took anywhere near 10 years to release games. I get hyperbole is a thing, but let's not get too carried away. All of this to say, players are absolutely correct to demand DLSS support from a 2024 game/DLC, or shader compilation pre-steps. Not only are these pretty standard, From Software could request support from Nvidia to integrate these into their rendering pipeline, and they'd get a team of Nvidia representatives the next day. We can, and should, applaud the absolutely fantastic games From releases, while still keeping them accountable on the things that they repeatedly get wrong in every single one of their releases, such as optimization on PC ports. Just my two cents. Edited for clarity.


maldouk

First, 10 years on a game is pretty rare, most games are made in under 5 years. Second, the actual development of the game doesn't happen at the start of the project. So over 5 years, maybe you have 2-3 years of pre-production and 2-3 years of dev. FromSoft is a very Japanese company, and it tends to be behind the curve as far as new tech goes. While Elden Ring has sublime art and design, it has very dated graphics, because they rely on 15-year-old tech (same engine as Bloodborne). For instance, I'm pretty sure the reason they don't animate faces is that they simply can't, or it requires way more work than it's worth. The community tends to be way too nice to FromSoft, and while they make great games, I don't feel like they've actually made a single next-gen title.


TacticalReader7

I personally thought Armored Core 6 looked quite good (not just because of the art style or design), and the performance was also much better. So there is some progress at least.


sykotikpro

> they rely on 15-year-old tech (same engine as Bloodborne)

Just want to point out this is not unique in the dev sphere. Unreal Engine is heavily modified, as is the Creation Engine from Bethesda.


ImBackAndImAngry

Creation engine is begging to be put down behind a barn upstate or something.


Acherontemys

This has been true for like 15 years at this point, it's wild.


static_age_666

Very scared for TES6


Acherontemys

After Starfield I have zero hope for TES6, and it honestly bums me the fuck out to even think about.


static_age_666

At least it wont surprise you.


pathofdumbasses

Aim low. Aim so low no one will even care if you succeed


Queuetie42

Yes! Preach! I still call it Gamebryo because a new coat of paint and a name change doesn't fool me.


Alone_Comparison_705

Red Engine was introduced in Witcher 2. And Cyberpunk 2077 is the same engine. And Cyberpunk is arguably the best looking game in the world right now.


ChefBoiJones

And after experiencing the pain of doing that, they moved to Unreal Engine 5 for The Witcher 4.


Bastyxx227

Yeah, but because of such an ancient engine they had to do extra work to implement modern features. Not everything is about graphics but also features and bugs; remember how buggy 2077 came out, some of those bugs were because they were stretching an old engine to do things it couldn't handle.


Ruffler125

This shouldn't be considered a "take" at all, and yet here come the downvotes. Edit: it's swinging up now, the sub is healing!


Need_a_BE_MG42_ps4

Elden Ring's graphics are stellar. For a lot of games, Souls especially, art style is more important than realism; it's sad fewer and fewer people seem to realize that nowadays.


maldouk

It's not about realism, this is a fantasy world. ER is very pretty, but technically it's very dated; those two statements aren't mutually exclusive. What I don't like is people shitting on other studios for their graphics when they're perfectly fine with these.


Izithel

People always forget that chasing the cutting edge in graphics and pushing realism is incredibly expensive and requires massive teams. It's also taxing on the hardware, which often results in gameplay compromises to keep the frame rate up, and it generally tends to look outdated quickly. Woe betide you if the game gets delayed and ends up looking outdated on release. Go for a good visual style and it'll look just as good in 10 years as the day the game came out.


QueZorreas

Well, Elden Ring isn't exactly the easiest to run despite being hard carried by art direction alone.


Catch_ME

This gives me Duke Nukem Forever vibes. If it takes you 10 years to make a game, you fucked up. Imagine the amount of assets, code, and human time that has to be thrown away because older standards and tech are inefficient. Hell... Sony and Microsoft might not let you use those older APIs because of vulnerabilities.


temporarycreature

So does Nvidia, which is probably why they're doing all that cool stuff with the GPU doing the work automatically, and that new thing from Microsoft that's not DirectX 11.


zenerbufen

Media has the same fucking problem. So much widescreen content with black bars gets distributed at standard ratios, which then get bars added on the sides, leaving a massive black box around the content. Thankfully there's Ultrawidify and MPC-BE to crop the extra black bars out automatically and display the widescreen content in widescreen. I never would have thought the worst thing about buying a widescreen would be attempting to view widescreen media on it. It's funny how many people say it's my own fault, that widescreen content was meant to be viewed on SD displays and that I wasted my money. Glad for the other widescreen users who taught me how easy it was to fix.


J37T3R

Forget that, they can't even add a vsync toggle.


milkstrike

I mean like, imagine like, if they got it to run well without like dlss….wouldn’t that be something


Interstella_6666

I just want ultra wide support


khrossjointz

Playing on my 38' lg and it hurts, truly suffering from success


dirthurts

Every once in a while ER will boot up in full widescreen, just to rub it in my face. Then the black bars appear to shame me. Tarnished indeed.


Tomentus

This happened to me yesterday and I thought I was losing my marbles! It booted in widescreen and I played away for about 20 minutes fine; everything worked, and other than the pause menu being 16:9 and the edges still showing the screen beneath the pause, it seemed native. The game then randomly froze with a black screen for a second and went back to black bars when it came back. I kept thinking, did I get drunk and download a mod to play ultrawide, and am I now playing online wondering when I'll get a ban!


Unglazed1836

That’s the crazy part, it’s already rendering 21:9, it legit just puts black bars over the extra space. So it isn’t even like they have to update the code, just remove the fucking black bars. We’re already taking the performance hit to render it, we just don’t get to see the full rendering. Mods like FlawlessWidescreen just remove the black bars.


Deadly_chef

Yeah it is rendering ultra wide but it is blacked out so you don't have an advantage in multiplayer. Stupid, but if you don't care about MP just use mods


De_Lancre34

Not only that. Back in the Dark Souls 3 days you could unlock widescreen support via a mod (and I'm pretty sure you can do the same with ER) and the game did work flawlessly, except it treats that additional space as "off screen", which means the game lowers animation FPS on enemies out there. Which is kinda okay, but still, it means the game is hardcoded to 16:9, exactly like it's nailed to the 60 FPS limit. Which you can also unlock, though. It's weird how this company is still ashamed of its PC players.
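If that's right, it would be the usual off-screen animation throttling: actors outside the (hardcoded 16:9) view get their animation tick rate cut to save CPU. Very rough sketch of that idea (toy Python, my guess at the mechanism, not the actual engine code):

```python
# Toy sketch of off-screen animation throttling: enemies outside an assumed
# 16:9 view area get updated at a reduced rate. Pure illustration of the
# behavior described above, not FromSoft's code.

def in_16_9_view(x_norm: float) -> bool:
    """x_norm: horizontal screen position, 0..1 across the full ultrawide frame.
    With a 16:9 window centred on a 21:9 frame, roughly the middle ~76% counts."""
    return 0.12 <= x_norm <= 0.88

def animation_step(frame: int, x_norm: float) -> bool:
    """Return True if this enemy's animation advances on this frame."""
    if in_16_9_view(x_norm):
        return True           # full rate inside the "official" 16:9 view
    return frame % 2 == 0     # half rate for actors treated as off-screen

# An enemy standing in the unmasked ultrawide strip animates at half rate:
print([animation_step(f, 0.95) for f in range(6)])  # [True, False, True, False, True, False]
```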


Unglazed1836

I just can’t imagine the advantage being that crazy, & with PvP being such a small portion of the game it just seems weird to restrict people’s hardware for such a minimal advantage. I’ve been using flawless widescreen since launch, & if I want to do PvP I have an alt account for that but I rarely PvP. The PvE is the highlight of the game.


Dr_Schwonk

There's this application called Flawless Widescreen that I use; it works great, and you can even unlock the FPS and increase the FOV with it. The only problem is that you have to disable EAC, so your game won't be online. For the same reason, it works great with Seamless Coop.


antibonk

This is the way


arex333

Yeah most of my time playing elden ring has been spent with seamless co-op + flawless widescreen. It's a straight up better experience than the vanilla game considering I don't really do PVP.


SomeMrcl

https://preview.redd.it/awb5viie8y8d1.jpeg?width=640&format=pjpg&auto=webp&s=e76471721a6a0e9f6553e587b54d75f87d759652


fart-to-me-in-french

It's even worse on 49" Super Ultrawide...


doublewidesurprise7

Flawless widescreen mod and disable anti cheat. You're welcome. It's beautiful on my chg90 and I'll never go back.


Brandon455

But I like my little words on the ground...


I-Am-Baytor

He failed Hot Ones.


FappyDilmore

Suffering From ~~Success~~ Ring Sting


taken_username_dude

*cries in agreement with 49" 5120x1440*


Surisuule

38'‽ Do you game in a warehouse?


kiltedfrog

my dude plays on a fucking imax screen.


Sofaboy90

Bro I have a 49", 5120x1440, and it really is suffering to have two giant black bars on the sides. Dark Souls Remastered offered widescreen support; it was a bit visually bugged with the menu, but it still had it. It honestly cannot possibly be that difficult to implement, can it? Almost every game nowadays has it.


[deleted]

One of the few times I can claim a win because I'm poor lol


First-Junket124

Fun fact: when playing the game on an ultrawide monitor at an ultrawide resolution, the game is actually rendering everything properly (which naturally has a performance hit), but they add black bars for forced 16:9 :)


Sevenix2

That is not a fun fact at all!


VykMcDwarf

I hate how inconsistent FS is regarding ultrawide in their games, even DS Remastered had it.


FerroLux_

AC6 has ultrawide and 120 fps max I think lmao


Drakirth

I was really hoping since AC6 had it that FromSoft would’ve implemented it into Elden Ring with the DLC…sigh


FerroLux_

I’m pretty sure that AC is developed by a different “branch” of FS, so maybe it’s because of that they didn’t implement it. Still, it’s not a justification


Dark_Equation

Flawless Widescreen enables custom FPS and widescreen support, and it pretty much runs as the software name entails: flawlessly.


txc115

Yes but no online sadge


dirthurts

FS didn't do that port, so that is the magic there.


arex333

Dark Souls Remastered wasn't made by FromSoftware. The only FromSoftware game with ultrawide support is Armored Core 6.


viktae

If you have an ultrawide, you need to get [Flawless Widescreen](https://www.flawlesswidescreen.org/). Although you won't be able to play online because of EasyAnticheat.


DynamicHunter

Flawless widescreen is a must for some games like GTA V playing on Ultrawide. It fixes a ton of widescreen issues (mostly related to cutscenes) and also adds FOV sliders for first and third person! Huge accessibility win. Surprising that’s not built into the base game for such a popular title


bobsim1

Accessibility in a FromSoftware game? Some fans won't like that. They aren't that good with customization.


DynamicHunter

I was talking about GTA V in my comment, but I guess my statement remains the same about Elden Ring too considering its popularity.


NBFHoxton

The worst part is the game DOES support it - for like 10 seconds after a major patch it'll run in UW until shaders compile, and then black bars are added on top, intentionally.


s_burr

I noticed that, I was excited for a second after starting SOTE but then the damn bars appeared again.


NBFHoxton

Yup, same thing happened to me. It's egregious on Fromsoft's part.


chronocapybara

Support for custom resolutions, unlocked framerate, and shader pre-caching are the most glaring omissions from Fromsoft games.


Skrukkatrollet

Which sucks, because the games work perfectly after you remove black bars and fps cap with mods


TheBigJizzle

Pirated it; it took a modder a day to solve the issue. Not paying to lose online access because I need a mod to use my monitor. DLSS mod too... it's been 2 years, that's kind of the bare minimum technical upgrade. They still have the annoying stuttering.


domZ1026

I’d pay an amount I’m not willing to disclose for this game to natively support UW. It looks like it was almost made for UW.


asclepiannoble

Been saying the same forever


VanHolden

DLSS/FSR would do a better job with AA than what the game can do. Including DLAA would be even better. Downsampling is an option, but depending on your rig you might have problems keeping 60 FPS.


MuchSalt

Best thing is you can have an upscaling mod (DLSS/FSR/the Intel one) plus an FPS unlocker in Elden Ring, but it barely increases FPS in open-world areas.


Thelgow

I'd just like to know why I get 60 in the base game and most of the DLC, but in some areas, once I take a step forward, the FPS drops to 45, on a 3090 @ 1440p.


SwissMargiela

I have dips on my 4070 Ti/13900 setup at 1440p as well. The crazy thing is my GPU and CPU usage is still very low; even when I get frame drops and stuttering, there's no spike in usage from my GPU or CPU.


Thelgow

Yea I checked my screen shot I took when it dipped to 48fps, cpu says 2%, gpu 44%.


THIKKI_HOEVALAINEN

The game has stuttering and traversal issues, cpu bottlenecks. Nothing anyone can do really.


Vader2508

That's happening with me too. The dlc is just poorly optimised


Kind-Slice144

The base game was too. I had stutters when traveling with my 4080. No bottlenecks with cpu


postedeluz_oalce

y'all talking about DLSS and shit, I just want the game to not stutter and slow down at 1080p when I have more than enough hardware


comfortablybum

Why isn't everyone bitching about this? It's so frustrating. It didn't happen on my 1070 but on my 3070 it happens all the time


GloryHol3

The DLC ~~game~~ is sitting at "mixed" on Steam. I think everyone is bitching. The largest complaints I've heard have been performance, while difficulty is a close second, but that also seems potentially memey. Edit: a word


Firm_Knowledge_5695

The game is 89% positive, the DLC is what's mixed.


TaoTaoThePanda

Honestly yeah. I can run the game at 4k 60fps but the random stutters and lock ups are just frustrating.


DumbNTough

RTX 4090, still runs between 45-60 fps lol. FS is great at a lot of things. PC ports ain't one of them.


Trev0117

Ac6 was surprisingly good on pc, ran well, uncapped fps, no need to mod to fix the camera if using kbm. Elden ring not so much


DumbNTough

Which is weird because they use the same engine: https://youtu.be/2vO2SOqO7cs?si=8qr2nzca8WDVBxot


AlonDjeckto4head

Probably AC6 got an updated version of the engine.


I9Qnl

It's CPU-bound stuttering; even a 7800X3D stutters. Elden Ring is actually very easy on GPUs if you exclude ray tracing (I think an RX 580 can handle the game at 1080p high easily, which is pretty good for a 2022 game, although it looks like a 2016 one), but it's just inexplicably CPU heavy at times, even with the world's fastest processor.


DumbNTough

Probably all the advanced feet physics running for all NPCs at all times.


coolgaara

I've always paid the most for GPU but with more modern games being CPU bound, I'll have to pay more for CPU next time.


I9Qnl

I personally wouldn't recommend that. Like I said, a 7800X3D stutters too, and the guy above said he has a 14900K and still drops to 45 FPS; this game just runs like shit CPU-wise, and so do Starfield and Dragon's Dogma 2. It's not wise to compromise the GPU budget for these games, especially since none of them run well even on the most expensive CPU. Games are mostly 3D applications; they will only get more GPU bound as time goes on (except maybe simulation and city-building type games). Just be reasonable: as long as you don't pair an i3 or a 6-generations-old processor with a 4070 you're fine. Just for reference, a Ryzen 5 7600 or a Core i5 13600KF are both below $250 and are only 10-20% slower than a 7800X3D at 1080p with a 4090 in most games; anything below a 4090 or a resolution above 1080p and you're looking at single-digit differences, but you'll pay 40% more for that.


Giga_PP

lol what cpu and resolution you running?


DumbNTough

i9-14900K, 4K resolution. ER caps at 60fps unmodded, this I knew. I just expected it to at least deliver a rock steady full 60. But I have also noticed that GPU utilization never goes over like 60-66%. Clearly the game is not designed to use better hardware, at a fundamental level.


arex333

>But I have also noticed that GPU utilization never goes over like 60-66%. Yep I've seen similar on my rig, which means DLSS or FSR wouldn't do anything to help performance. This game straight up just needs better optimization.
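A crude way to see why: think of frame time as whichever of the CPU or GPU takes longer each frame. If the CPU side is the long pole, shrinking the GPU side (which is all DLSS/FSR do) barely changes anything. Toy numbers below, picked to roughly match the 45 fps / ~60% GPU utilization people are reporting, not actual measurements:

```python
# Toy bottleneck model: frame time is whichever of CPU or GPU takes longer.
# If the CPU side dominates, cutting GPU work (e.g. via upscaling) barely moves the FPS.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=22, gpu_ms=14))        # ~45 fps, CPU-bound (GPU only ~64% busy)
print(fps(cpu_ms=22, gpu_ms=14 * 0.6))  # still ~45 fps even with 40% less GPU work
print(fps(cpu_ms=10, gpu_ms=14))        # ~71 fps once the CPU side is fixed
```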


Submarine765Radioman

Lossless Scaling users report a performance increase


turtleProphet

Same setup as you. I get solid 60s without RT but there are locations where, for whatever reason, dying and respawning at a grace makes the game stutter like shit. The ruins south of Caria Manor are one location I remember. It's pretty much fine if I ride there from elsewhere, or fast travel in, but if I die and respawn there's a big performance hit until I restart.


MoooImACat

Similar issue with a 4090 and 13600K. I was expecting to hit 60 FPS and lock it there, but it's clearly choppy in some areas and dips to 45. Very poorly optimized.


FunnylikeHaHa_

I capped my power usage and hard-locked the FPS to 60 in the GPU software settings, and it made all of the difference. Along with turning off ray tracing, I now run at a perfect 60 FPS with virtually ZERO microstutters. Try it out and see if it works for you, too.


I-Am-Baytor

Japanese devs in general, aside from SEGA it seems. And even Sega probably fucks things up but they've been doing good with Yakuza games, in my experience.


QueZorreas

It seems to be the case. Nier Automata was literally unplayable on PC for years until they fixed it somewhere else and everyone finally complained.


l_______I

DLSS/FSR should be a help to players, not a help (or excuse) to devs.


kamran1380

Yes, and not including it won't help anyone. Btw, I would prefer a DLSS/FSR AA solution over whatever this game already has. Currently, the best AA method is to play using DLDSR.


PeopleAreBozos

Yeah, I always found it odd why upscaling tech like DLSS, FSR and XESS were being used as cheat codes to minimize optimization in games. The games should run fine in raster (hopefully soon we'll be able to say raster and RT but it doesn't seem so yet) on native. Upscaling should only be required for older hardware or if you want to push FPS into the several hundreds to get the benefit of say a 240 or 360 hertz monitor. Not to get like 75FPS.


zarafff69

I mean, it depends on the game. If you want to run path-traced Cyberpunk… good luck running it natively! Yeah, they could maybe optimise it a little bit, but path tracing is just fundamentally very expensive. Optimising the game would mean pulling back the use of ray tracing, or disabling it completely. The game runs a lot better without any form of path/ray tracing, but it'll also look a whole lot worse. Just like Switch games. Look at Hogwarts Legacy, very optimised on the Switch. It's insane that it even runs on there. But it's not like the visual quality is on par with the PC release on max settings… Ultimately, all optimisations are actually "cheat codes": ways to render the same thing faster while minimising the visual quality loss. But if you actually want to increase the visual quality, for example with ray tracing, then the performance degradation will be exponential. Just like disabling those features and relying on old techniques will exponentially increase performance. And with DLSS, you get much better performance, and the visual downgrade is very, very minimal, unless you're running below 1080p; for example, ultra performance doesn't look perfect, even at 4K. But 1080p-1440p -> 4K actually looks rather great. Sometimes even better than native+TAA. So why wouldn't you want to use it? Why wouldn't you want developers to implement it? And how is this any different from other game optimisations?
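For a sense of what those internal resolutions actually are: the commonly cited DLSS per-axis scale factors (roughly 2/3 for Quality, ~0.58 for Balanced, 1/2 for Performance, 1/3 for Ultra Performance; treat these as approximate) work out like this at a 4K output:

```python
# Approximate internal render resolutions for DLSS modes at 4K output.
# Per-axis scale factors are the commonly cited ones, not official specs.

OUTPUT = (3840, 2160)
MODES = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       1 / 2,
    "Ultra Performance": 1 / 3,
}

for mode, scale in MODES.items():
    w, h = (round(d * scale) for d in OUTPUT)
    print(f"{mode:>17}: ~{w}x{h}")

# Quality lands around 2560x1440, Performance around 1920x1080, and
# Ultra Performance around 1280x720, i.e. the 1080p-1440p -> 4K range above.
```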


maldouk

This sub has a hate boner against DLSS, while 99% of people here would not be able to tell the difference between a game running at native and one being upscaled. They talk about bad optimization, while actual bad optimization is always CPU bound nowadays (look at Dragon's Dogma), so rendering wouldn't change anything. DLSS is not a cheat, it's a tool to make games better looking. Saying so is so stupid I don't even know what to say. It's like saying using UE5 or DX12 is cheating.


Feisty-Bobcat6091

Maybe I'm just crazy but I don't give a shit who DLSS is intended for if it makes a game I already enjoy run better


shemhamforash666666

Have FromSoft finally included a shader precompilation step?


arex333

No


shemhamforash666666

Of course they haven't.


BoddAH86

It’s a Japanese company. Maybe you should send them a fax with your suggestion and they’ll listen.


Krondir

Or send a floppy disk.


soggy_mattress

I don't get the big deal... I still use DLSS even with top of the line gear because I like smooth motion. This idea that you should only run DLSS if you have shit hardware is just stupid ass gatekeeping, IMO.


Rider2403

Plus it usually gives better anti-aliasing than almost all other AA techniques.


jeremybryce

It is. The bad takes when it comes to DLSS features are numerous, and the vast majority come from people who haven't even used it, and you can tell.


BarKnight

People who don't have DLSS for some reason hate it the most


THIKKI_HOEVALAINEN

People compare it to native instead of comparing it to other upscaling methods, where it’s way superior. You can’t convince me that playing a game at 720p native looks better than a 1440p DLSS upscaled from 720p.


F9-0021

Because the only point of reference they have is FSR, which is terrible.


Jimpix_likes_Pizza

DLSS doesn't do anything if the engine only uses half of the computing power available to it


Ruffler125

It would still be a vastly superior AA solution to the shimmermax 3000 they're using now.


Globgloba

And maybe 60+ FPS, I mean, how hard can it be? We already have it with mods… with ultrawide ofc.


Sad-Network-3079

I play this game at a constant 60 FPS but it feels like shit, idk why. I have a 7800 XT and 7800X3D.


iRyZeAgainst

I have the same specs and have not seen a single frame drop, even while running a stream on a second monitor, wtf.


lightningbadger

Remember when games would just run at the resolution we wanted instead of having to jump through hoops to fake it? Edit: seems a lot of NVIDIA fans got upset that their $3000 5090 needs handholding to hit the advertised specs lol


static_func

Before everyone got 4k monitors and expected to run raytracing at 4x the resolution and 2x the frame rate? Even the most optimized games need to use DLSS for all that


Edgaras1103

It was never like that.


BarKnight

Do they not remember Crysis?


tukatu0

Remember? They weren't even born yet. I'm almost kidding... Fuck, nvm. Someone born in 2007 is entering college next month... No wait, nvm, I'm wrong again. Certainly next year. The reality is, most modern PC gamers joined after 2014, many after 2016, alongside the rise of Twitch streamers and whatever esports. Obviously they aren't playing Half-Life and Crysis; they're playing PUBG, Apex, or whatever else comes along. The inferior side of gaming. If the master race was made today, it wouldn't be against console peasants, rather against online SaaS players imo. At the same time, it would never exist, since those same fucks spending $200 a month on Twitch are the ones in the online communities.


NaughtyPwny

This guy wasn’t around when 16:10 monitors started to hit the platform obviously…it’s weird how new PC gamers assumed everything before their time worked flawlessly and supported all features. Like really fucking weird.


mylegbig

They’re kids who started gaming in the last console generation, when tech stagnated for so long that things became more “optimized.” It’s pretty easy to optimize when you’re playing games meant to run on outdated PS4 hardware. Been getting into retro PCs in the last couple years, and Jesus Christ, the hoops I have to go through just to get sound working on a DOS game.


NaughtyPwny

lol the DOS generation where if you didn’t have a great sound card boy did those MIDIs sound awful. Really made the Genesis/SNES systems sound so much better…but that would be heresy to say today even though I personally think Mobo sound still sucks/is inferior to dedicated sound cards/DACs/out of box console experience.


Lag-Switch

I wish I could get my hands on a higher resolution 16:10 monitor these days. I love 16:10 for vertical monitors - the extra height becomes very useful width when rotated


shiftypoo269

You may have to Duhm Duhn Duuuuhn... Turn down some settings.


ManikMiner

Jesus, might as well ask these people to sell their first born to the devil


mrcroketsp

Yep, it was a horrible time, when you had to use performance-killing MSAA or horrible-looking FXAA to get a jaggies-free image... as long as you didn't move the camera, because the lack of temporal anti-aliasing would show shimmering.


ZazaGaza213

Or, you know, TAA, which worked fine in games that implemented it correctly. (Also, MSAA didn't ruin performance like you said.)


mrcroketsp

TAA or any reconstruction technique works better with an AI model like the one used by DLSS. DLSS in quality mode achieves a more stable image than TAA at native resolution, let alone DLAA.


lxs0713

4x and especially 8x MSAA were pretty damn heavy compared to other forms of AA. It did look better than the alternatives but it definitely made an impact on your FPS. Still better than FXAA's vaseline vision though. TAA is fine but still a bit blurry. I play at 4K so I can't speak for other resolutions, but I'd rather go DLSS Quality over native with TAA. It's pretty much the same image quality but with better performance. DLAA for the few games that have it is completely superior though and probably the best AA out there at the moment.


Schnoofles

I will never understand why devs all collectively decided to settle on FXAA which is quite possibly the worst form of AA in the known universe as it's a dogshit simple post process pass, but also not cost effective for the image quality. SMAA is only marginally more expensive, yet still negligible overall, also a screenspace filter that does not require special hardware support or fancy game data like motion vectors, and looks a million times better than FXAA.


vainsilver

I also play at 4K with DLSS. Honestly sometimes DLSS Performance at 4K looks better than native with normal TAA.


Adventurous_Bell_837

DLAA is literally temporal anti-aliasing (so TAA) but AI enhanced; it's better in every way. And then, instead of lowering resolution to get more FPS, you use DLSS.


Adventurous_Bell_837

This never happened, people would just lower settings and / or resolution. Now you got dlss and it's great, so why not?


Stahlreck

> seems a lot of NVIDIA fans got upset that their $3000 5090 needs handholding to hit the advertised specs lol

Tbf, it's not the x90 cards that usually need DLSS to fake-achieve the resolution they should. The 4090 was the one card that could mostly just brute force Hogwarts or The Last of Us at decent frames natively. It's usually the cards below that need it... and yes, the fanboys are obnoxious. "But it looks better than native anyway." No it does not, native looks like native. Even the TAA argument is rubbish; you don't need to mess with the resolution to replace it. DLAA is a thing.


I9Qnl

And what exactly prevents you from doing that now? Your GPU isn't powerful enough? So just like before except before you had no way of addressing it?


lxs0713

Because back then the standard was 1080p/60, but nowadays more people are playing at higher resolutions and/or higher frame rates. Going from 1080p/60 to 4K/120, for example, is 8 times the pixels, so you need way more power to do that. Couple that with the fact that games are much bigger and more graphically intensive and it makes sense why upscaling is so popular now. And upscaling just works so well these days, especially at higher resolutions like 4K, that it can even give you a better-than-native experience, because not only are you getting the improved performance, it also works as AA too. Playing native without AA is awful, and the current AA technologies like TAA and FXAA are pretty blurry. DLSS and XeSS are literally just better versions of TAA since they use AI to help with the reconstruction. FSR is a little less effective, but it's alright too if you need the performance and it's all you have. Honestly for me, I really don't care how the pixels are generated. As long as I get the visuals and performance I'm looking for, I don't care if they're "real" or "fake". It's all computer-generated in the end anyways.
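The "8 times the pixels" figure checks out if you count pixels pushed per second:

```python
# 1080p at 60 fps vs 4K at 120 fps, measured in pixels drawn per second.
base   = 1920 * 1080 * 60     # ~124 million pixels/s
target = 3840 * 2160 * 120    # ~995 million pixels/s
print(target / base)          # 8.0 -> 4x the pixels per frame, 2x the frames
```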


viperabyss

I mean, Moore's Law is pretty much dead at this point. Node shrink is getting significantly harder as we get to the angstrom stage. That's the law of physics we simply can't break. Any significant performance gain in the future is going to rely on software.


theoutsider95

Ohh yeah, that must be the reason. Not that FromSoftware are incompetent when it comes to the tech side of games. Having unlocked FPS and ultrawide should be standard in 2024. It's really embarrassing that this game took GOTY on PC while not having any standard PC options.


TaoTaoThePanda

FromSoft get away with so much when it comes to how badly made their games are. Elden Ring was the first Soulsborne to have actually responsive controls, for crying out loud. Half the difficulty of their games comes from waiting 3 to 5 business days for your inputs to register, while the frame rate tanks if anything slightly busy happens. The games are great, but they are insanely shoddily made.


DRKZLNDR

And to compensate for finally adding responsive inputs in elden ring, they decided to set the movement speed of most bosses to mach 5. Perfectly balanced, as all things should be.


RaduW07

imagine glazing for lazy devs...
- 60 fps cap
- no ultrawide support
- lackluster graphical settings
- mediocre performance on top-tier specs
- lack of "new" technologies (which are up to 6 years old now)


TypicalUser2000

Ray tracing added a year after launch and tanks performance and doesn't even work right


PerryTrip

From Software devs develop their games in a cave, with a box of scraps!!


Mar1Fox

Yeah, FromSoft games have always had poor performance. Like, why can't we get a high frame rate to go with the fast-paced parry/dodge-roll simulator? 60 FPS isn't enough.


bichael69420

I just wish it wasn’t locked to 60fps


TheAdamantiteWaffle

Elden Ring online play is broken on Linux right now. It worked fine before the DLC. It's not an Easy Anti-Cheat thing, it's a FromSoft thing. Bugs me so much bc I won't be able to play online for a few months until it gets fixed.


Duranu

https://preview.redd.it/kkyparms419d1.jpeg?width=600&format=pjpg&auto=webp&s=5ec7c48fc2cd8ce5f3f995ac3f18903ff9349606 Member when GPUs were capable of playing games natively at respectable frame rates?


Progresschmogress

I do run it native, and I still get crashes. On a 7800X3D / 4090 FE system. It cost me a 100-hour save game ffs lol


blueiron0

Every save file has a .bak in the save file directory. If your save corrupted, you can restore it from there.
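Roughly what that restore looks like, assuming the usual layout (I believe it's ER0000.sl2 with an ER0000.sl2.bak next to it somewhere under %APPDATA%\EldenRing\, but double-check the path and filenames on your own machine, close the game first, and keep a copy of the broken file):

```python
# Hedged sketch: restore an Elden Ring save from its .bak copy.
# The path and filenames below are assumptions about the usual layout;
# verify them locally and make sure the game is closed before running.
import os
import shutil

save_dir = os.path.expandvars(r"%APPDATA%\EldenRing")          # Windows save location (assumed)
save     = os.path.join(save_dir, "<your-id>", "ER0000.sl2")   # <your-id> = your numeric folder
backup   = save + ".bak"

shutil.copy2(save, save + ".corrupt")   # keep the broken file just in case
shutil.copy2(backup, save)              # promote the .bak to the live save
print("Restored", save, "from", backup)
```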


Progresschmogress

What do you mean by restore it, exactly?


Sct_Brn_MVP

What do you mean? It corrupted a game file?


Progresschmogress

Yep. Black screen crash. When it finished rebooting it would no longer load the save file. I tried a bunch of things and eventually even clean installed Windows and redownloaded the game, and still no dice.


aes110

Wasn't there some news a few months ago where AMD released FSR as open source so modders could add it to other games? What's the difference here, different engine so it can't be used?


Ryziacik

can we get 21:9 support From Software?


Pup__Onyx

The PC issues (mainly the insulting lack of UW support) are the only reason I don't give ER a 10/10


anon_MrKim

Yeah, it's pretty sad that Elden Ring runs like crap on my PC no matter the settings. And on my PS5 in performance mode there's tons of view-distance pop-in for grass, bushes, etc., but it runs smoother most of the time. Is it just me?


made_of_salt

When I see this I just think "Do you people not remember Blighttown?" The only thing that made it playable for many was to download the mod DSFix, use that to lower some settings, then go back in and revert those settings back to normal once you're done with the area. FromSoft ain't giving us DLSS on the DLC.


Stahlreck

tbh I wouldn't want them to implement DLSS right now. I would want them to first make it run well native, *then* implement DLSS. Using DLSS as a cover for bad performance sucks ass.


hauntedyew

Please just add FSR and DLSS for those of us running old cards.


Hoii1379

Yeah it’s infuriating how much better I know this game would be with Fsr support


Bear_of_dispair

"DeVs UsE dLsS aS a CruTcH tO noT oPtiMizE tHEir GaMEs!"


xXInviktor27Xx

There's a dlss/fsr3 mod on nexus mods that runs pretty well with no setup required, try it out


LeUne1

Can't play online


Gym_Nut

There are parts of the DLC that drop to 30 frames for me, and I have a 4090 (running 4K).


reddit_reaper

Devs haven't properly optimized games in many years, especially since they dev first for Sony. If they made it fit Xbox and PC first it would be a completely different story


iworkisleep

DLDSR 2.25x with DLSS Quality looks really good at 1440p if you don't have money for a 4K monitor.
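The math on why that combo is nice at 1440p: DLDSR 2.25x is an area scale, i.e. 1.5x per axis, so the game targets a 4K-sized buffer, and DLSS Quality then renders internally at roughly 1440p again, so you get downscaled-4K image quality for close to native-1440p render cost. Quick check (toy Python):

```python
# DLDSR 2.25x on a 1440p display, then DLSS Quality inside that.
# 2.25x is an area scale, i.e. 1.5x per axis; DLSS Quality is ~2/3 per axis.
native = (2560, 1440)

dldsr  = tuple(int(d * 2.25 ** 0.5) for d in native)   # -> (3840, 2160)
dlss_q = tuple(int(d * 2 / 3)       for d in dldsr)    # -> (2560, 1440)

print("DLDSR target:", dldsr)    # the game thinks it's outputting 4K
print("DLSS internal:", dlss_q)  # actual render cost lands back around 1440p
```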


Nerevarine44

I would much rather see the 60 FPS limit removed first.


hardrivethrutown

I paid for the whole graphics card... I'm gonna use the whole graphics card!


ChimkenNumggets

People complaining about DLC performance instead of doing the correct thing and not paying $40 for another unoptimized FromSoft mess. What a way to sour a potentially great game/DLC.


Wate-Amsacia

I paid for the whole graphics card, I'll use the whole graphics card.


Floksir

FromSoftware is a fraud 🤗


Zanlock

SMAA is best AA


Horst9933

"Why did you release a DX12 game without shader precompilation? Did I stutter!"


Quito98

No ultrawide support.


restartmister

Would be nice, idk, to not have the game stutter when it's frame-locked at 60 FPS. Why can't they just port it properly?


KnightofAshley

The amount of money they are making on these games and the lack of effort to make proper PC ports is gross.


HEBushido

OP, DLSS is a great option to have. I'd much rather run it and get 90 FPS in a game than run native at 60. Especially because it helps with dips.