
Digital Foundry - Dying Light 2 (PS5 vs Xbox Series X/S)

ethomaz

Banned
I mentioned the tearing in the opening cut scene which the video lists as the only time this behavior was observed in the opening 3hrs.
Well, DF didn't write that.
Are you looking for what DF wrote or what you believe? I posted a summary of what DF wrote in the article.
 

ethomaz

Banned
That is what is stated in video in the OP, which was published by DF.
You can read here... that is where the summary I posted came from.


While the wording could mean that all versions have screen tearing and the Series S is just a little more prone to it... they only specifically mentioned it for the Series S.
 

DaGwaphics

Member
You can read here... that is where the summary I posted came from.


I've seen the article; the video clarifies the specific tearing they were talking about. I'm sure there are other instances, maybe in gameplay as well. But that doesn't erase the fact that they clearly state the framerate is rock solid throughout the first 3hrs. This game is clearly a near lock in all the modes outside of resolution mode. There are no "not stable" versions. LOL

"slightly less" does not equate to "not" in this case.
 

GymWolf

Gold Member
You missed the real war mate.


GREEN RATS BEWARE!
More interesting than a 1080p war between console peasants?

You can't be series...
 

ZywyPL

Gold Member
It was never like that. A dev delivered a game, coded "to the metal", and that was that.
Now we get a choice, and instead of feeling like a choice, in some cases it feels like a punch in the face.
The naming of these options is also confusing. I made a whole thread about that...

That "coding to the metal" was always 30FPS though, and it still holds true to this day. But if you want 60, let alone 120, big sacrifices have to be made. Unless you're actually playing on a 1080p TV/monitor, then you're getting all the benefits.
 

metaverse

Member
Not supporting devs who can't give us at least 1440p 60fps this gen.

Optimize your games.
Confused by this statement. Obviously current-gen consoles aren't powerful enough to run their game at the graphical settings they've chosen at 1440p 60FPS. If you really want full control over the experience, cough up the money and build a capable PC.
 

Reallink

Member
If S is a 100% locked 1080p/30 (suggesting untapped overhead), surely SX with more than 3x the resources should be able to muster more than 1080p/60. On consoles, devs should at minimum be targeting 40fps in a 120Hz container at the highest res possible and using FSR to 4K.
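The 40fps-in-a-120Hz-container point rests on frame pacing: a frame rate only paces evenly when the display refresh rate is an integer multiple of it, so every frame is held for a whole number of refreshes. A minimal sketch of that arithmetic (the function name is illustrative, not from any console SDK):

```python
def refreshes_per_frame(fps: int, refresh_hz: int):
    """Return how many refresh cycles each frame is shown for,
    or None if the cadence is uneven (judder)."""
    if fps <= 0 or refresh_hz % fps != 0:
        return None
    return refresh_hz // fps

# 40fps paces evenly in a 120Hz container (3 refreshes per frame)...
print(refreshes_per_frame(40, 120))  # -> 3
# ...but not on a 60Hz display, where frames would alternate between
# 1 and 2 refreshes each.
print(refreshes_per_frame(40, 60))   # -> None
```

This is why 40fps modes like Ratchet & Clank's require a 120Hz output: at 60Hz the uneven cadence would read as stutter.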
 

ethomaz

Banned
It was never like that. A dev delivered a game, coded "to the metal", and that was that.
Now we get a choice, and instead of feeling like a choice, in some cases it feels like a punch in the face.
The naming of these options is also confusing. I made a whole thread about that...
Options look good on paper, but you get compromises because there is only a limited amount of time and $$$.
Devs should focus on a single mode and make it THE ONE!

But we will continue getting sub-par experiences across the board... not the first time, nor the last.
 

DaGwaphics

Member
If S is a 100% locked 1080p/30 (suggesting untapped overhead), surely SX with more than 3x the resources should be able to muster more than 1080p/60.

A lot of other variables. Clearly they were going for locked targets. Maybe there isn't enough GPU/CPU to go 90 or 120. 🤷‍♂️
 

ethomaz

Banned
If S is a 100% locked 1080p/30 (suggesting untapped overhead), surely SX with more than 3x the resources should be able to muster more than 1080p/60. On consoles, devs should at minimum be targeting 40fps in a 120Hz container at the highest res possible and using FSR to 4K.
It is not 100% locked at 30fps on the S.
That said, you have 1080p/30 on S and 1080p/60 on X... the Performance modes are similar.
The S lacks the Quality and Resolution modes for obvious reasons.
 
Well, if you watch the video, the Quality 30fps mode should be the default.
It is the only way the game looks somewhat interesting to play.

That said, the perfect mode would be a combination of Quality and Resolution.
He's right. Does anyone really want to play games at 30 fps anymore on PS5/Series X after getting a taste of 60 fps? Dividing benefits between different modes is choosing between a rock and a hard place. Look at what Metro Exodus, Far Cry 6, Doom Eternal, Ratchet, Spider-Man, Returnal and Demon's Souls were able to accomplish. Those games were all able to get at least 1440p/60 fps, some even having ray tracing (Metro having RT GI, even). They achieved this by using DRS for the most part. That is what all devs should be aiming for. The precedent is set, so it's just bad decision making and being out of touch with what gamers want not to have that mode included too.

That being said, it's very clear that these two consoles are not the powerful machines we were led to believe. They're simply not up to the task of achieving 60 fps at high resolutions and high/ultra settings. The only time we see anything close to that is with Sony first-party devs. Metro Exodus pulled off an amazing feat, but they had to cut tessellation. Far Cry 6 looks blurry despite being 1800p (DRS) and it has severe screen tearing. Spider-Man's 1440p/60/RT mode is massively scaled back from its Fidelity mode. Control with RT reflections is a shitty 30 fps with terrible input lag, etc. Ratchet and Demon's Souls were able to pull it off well, though even if you compare Ratchet's 1440p RT mode to its 30 fps Fidelity mode, it's clear there are some concessions.

1080p should never be the ONLY resolution available at 60 fps! We've moved beyond that now, and it was Sony and MS who pushed for 4K more than anyone. With 4K TVs, 1080p just doesn't hold up. If you're going to have a 60 fps mode, the lowest res should be 1440p, and devs can make whatever sacrifices they need to hit that resolution. Use DRS if you must, or better yet use some form of temporal upscaling like Returnal (1080p internal res somehow looks almost native 4K in that game after upscaling).
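For what it's worth, the DRS the post describes boils down to a simple feedback loop on frame time: render fewer pixels when over the frame budget, more when there is headroom. A hedged sketch, with made-up thresholds and step size rather than any engine's real tuning:

```python
TARGET_MS = 1000.0 / 60.0        # ~16.67ms frame budget for 60fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # illustrative resolution-scale bounds

def adjust_scale(scale: float, frame_ms: float, step: float = 0.05) -> float:
    """Nudge the resolution scale toward the frame-time budget."""
    if frame_ms > TARGET_MS:          # over budget: drop resolution
        scale -= step
    elif frame_ms < TARGET_MS * 0.9:  # comfortable headroom: raise it
        scale += step
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Simulated frame times: the controller sheds resolution under load,
# then claws it back as the scene gets cheaper.
scale = 1.0
for ms in [18.0, 18.0, 17.0, 16.0, 14.0]:
    scale = adjust_scale(scale, ms)
print(round(scale, 2))  # -> 0.9
```

Real implementations react within a frame and scale one axis at a time, but the control idea is the same.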
 
So, did anyone see my comments calling out that there was probably going to be an issue with Dying Light 2 on next gen? I said they're hiding gameplay footage and "I hope that's not because it's only 1080p/60".
 

b0uncyfr0

Member
Been following keenly. They can do 1080p with ray tracing at 30fps, and more than 1800p at 30fps, but performance mode is only 1080p/60. Nah, that's bullshit.

I bet the Series X could pull off 1440p (or close)/60 easily if it was optimized enough. I was very interested, now not so much. Forcing me to choose between 30fps (which I bought a NG console to avoid) and the ancient 1080p/60 route is not helping.

Techland, please do better. We have 4K TVs now; 1080p won't cut it. Also look into Ratchet and Clank's 40fps option: it's a great compromise for 120Hz users. The talk about an unlocked VRR option on the X also at least sounds interesting.
 

rofif

Gold Member
That "coding to the metal" was always 30FPS though, and it still holds true to this day. But if you want 60, let alone 120, big sacrifices have to be made. Unless you're actually playing on a 1080p TV/monitor, then you're getting all the benefits.
Then make it 60 and make it the definitive only mode.
Don't introduce "hey, but there is this ray tracing mode that's 30"... "so of course nobody will choose 30fps, but at least you see what you are missing out on when playing at 60fps".
 

ethomaz

Banned
Then make it 60 and make it the definitive only mode.
Don't introduce "hey, but there is this ray tracing mode that's 30"... "so of course nobody will choose 30fps, but at least you see what you are missing out on when playing at 60fps".
And how do you think they will sell a game that looks last-gen because it is 60fps?
The lack of WOW factor this generation is mostly because some devs are trying to target 60fps, which is really demanding.

I believe choices and higher framerates should be a PC thing... not a console thing.
 
1944p and 1800p is a night and day difference. Only tree shadows could bridge that gap.
Are you serious? 1944p > 1800p is definitely not a "night and day difference", dude. There is a difference, but it's small, and I've got a keen eye for that sort of thing. The higher up you go in res, the smaller the differences are. 1440p > 1080p is more of a "night and day difference".
 

Hunnybun

Member
He's right. Does anyone really want to play games at 30 fps anymore on PS5/Series X after getting a taste of 60 fps? Dividing benefits between different modes is choosing between a rock and a hard place. Look at what Metro Exodus, Far Cry 6, Doom Eternal, Ratchet, Spider-Man, Returnal and Demon's Souls were able to accomplish. Those games were all able to get at least 1440p/60 fps, some even having ray tracing (Metro having RT GI, even). They achieved this by using DRS for the most part. That is what all devs should be aiming for. The precedent is set, so it's just bad decision making and being out of touch with what gamers want not to have that mode included too.

That being said, it's very clear that these two consoles are not the powerful machines we were led to believe. They're simply not up to the task of achieving 60 fps at high resolutions and high/ultra settings. The only time we see anything close to that is with Sony first-party devs. Metro Exodus pulled off an amazing feat, but they had to cut tessellation. Far Cry 6 looks blurry despite being 1800p (DRS) and it has severe screen tearing. Spider-Man's 1440p/60/RT mode is massively scaled back from its Fidelity mode. Control with RT reflections is a shitty 30 fps with terrible input lag, etc. Ratchet and Demon's Souls were able to pull it off well, though even if you compare Ratchet's 1440p RT mode to its 30 fps Fidelity mode, it's clear there are some concessions.

1080p should never be the ONLY resolution available at 60 fps! We've moved beyond that now, and it was Sony and MS who pushed for 4K more than anyone. With 4K TVs, 1080p just doesn't hold up. If you're going to have a 60 fps mode, the lowest res should be 1440p, and devs can make whatever sacrifices they need to hit that resolution. Use DRS if you must, or better yet use some form of temporal upscaling like Returnal (1080p internal res somehow looks almost native 4K in that game after upscaling).

I think this is a gross exaggeration. It's true that the consoles aren't powerful enough to brute-force excellent performance regardless of the work put in, but from all the comparisons I've seen they perform around a 2080 level or so, which is as expected.

So that's good enough for roughly last-gen visuals at 1800p/60 or something like that.

What we've actually got with Ratchet, a game designed for the systems, is a similar resolution/framerate (1800p with scaling) but with a large increase in fidelity on top. In other words, a huge difference from just brute-forcing old games. Which is to say, we just need to be patient.
 

Hunnybun

Member
Are you serious? 1944p > 1800p is definitely not a "night and day difference", dude. There is a difference, but it's small, and I've got a keen eye for that sort of thing. The higher up you go in res, the smaller the differences are. 1440p > 1080p is more of a "night and day difference".

No I'm not serious.
 

Riky

My little VRR pleasure pearl goes vrrrooommm.
Any bullet points on the findings?

Not much difference, framerates are basically locked and he counts the resolution difference at 12% in favour of Series X.
It's probably more interesting for PC owners, as that version has a lot more options.
 
I have no issue playing 30fps games even after experiencing 60fps... in fact, some games I prefer to play in 30fps modes because they look way better... from the video, this one is an obvious choice for me.
What games? Certainly not shooters, combat/action or any other genre that requires precision. If you say you can with those games, then congrats, you are part of the 1% who doesn't notice or mind the drop in fluidity and accuracy.

At 60 fps, it's down to physics. Not only do our eyes perceive the image more clearly, but the responsiveness of the controls is faster.
 

ethomaz

Banned
What games? Certainly not shooters, combat/action or any other genre that requires precision. If you say you can with those games, then congrats, you are part of the 1% who doesn't notice or mind the drop in fluidity and accuracy.

At 60 fps, it's down to physics. Not only do our eyes perceive the image more clearly, but the responsiveness of the controls is faster.
Why not shooters? The best shooter gameplay I ever played is 30fps... it is called Destiny... and its gameplay is better than any 60fps shooter on PS4.

In the past I thought I would hate playing racing games at 30fps... so I expected to find DriveClub unplayable... well, that is not how it turned out... while DriveClub has several things I don't like, the framerate was fine and I enjoyed it.

When you keep switching games, there is a period of a few minutes for the eye to adapt to the new framerate... after that it becomes the same... maybe you need to give your eyes more time to adapt.
 

winjer

Member
I was expecting more than 1080p 60fps. The game doesn't look good enough to justify such a low resolution on these machines.
It should have at least something close to 1440p 60fps.
 

DaGwaphics

Member
I don't get the reaction over 1080p/60fps. Returnal and Guardians of The Galaxy also use 1080p to play at 60fps and I saw no one complaining.

Oh, people were complaining, especially since the 60fps was hardly a lock there. However, GoTG was a visually more stunning game than DL, IMO. Completely enjoyed that one at 30fps on XSS.
 