
Marvel's Spider-Man Remastered - PC Features & Specs Revealed

01011001

Banned
I think that's where I feel I'm a lucky PC gamer, because I play on a 60Hz big-screen TV. I only ever 'need' 60fps, I don't play at unlocked framerates, so PC parts can last me years. Like I wouldn't be surprised if I'm still getting 60fps in the vast majority of games 5 or 6 years from now, with my 3700x and 1080ti.:messenger_sunglasses:

yo with that 1080ti you can even play stuff like Watch Dogs Legion at 1080p~40fps with raytracing! not even kidding lol, that's not far off from the very dynamic res 30fps RT mode on consoles that can also drop close to 1080p at times

I usually also only target ~60fps in non-competitive titles, but if possible I try to push for at least 90fps, that's like the sweet spot for me in terms of fluidity vs graphics

edit: which makes me think... I wonder if Spider-Man will let you enable RT on GTX 10 series cards! THAT would be VERY interesting to say the least! sadly fewer and fewer games let you do that
 
Last edited:

Stuart360

Member
yo with that 1080ti you can even play stuff like Watch Dogs Legion at 1080p~40fps with raytracing! not even kidding lol, that's not far off from the very dynamic res 30fps RT mode on consoles that can also drop close to 1080p at times

I usually also only target ~60fps in non-competitive titles, but if possible I try to push for at least 90fps, that's like the sweet spot for me in terms of fluidity vs graphics

edit: which makes me think... I wonder if Spider-Man will let you enable RT on GTX 10 series cards! THAT would be VERY interesting to say the least! sadly fewer and fewer games let you do that
Wow, does Watch Dogs Legion let you use ray tracing on GTX cards? I didn't realize that. I may have to download the game and give it a try. I don't actually care about ray tracing (which is why I upgraded to a 1080ti from a 980ti last week lol, instead of the RTX cards, which were more expensive and less powerful), but it would be cool to try it. Crysis Remastered with RT ran at 60fps with room to spare at 1080p on my 1080ti when I tested it the other day.
 

01011001

Banned
Wow, does Watch Dogs Legion let you use ray tracing on GTX cards? I didn't realize that. I may have to download the game and give it a try. I don't actually care about ray tracing (which is why I upgraded to a 1080ti from a 980ti last week lol, instead of the RTX cards, which were more expensive and less powerful), but it would be cool to try it. Crysis Remastered with RT ran at 60fps with room to spare at 1080p on my 1080ti when I tested it the other day.

yeah a bunch of games support RTX features on GTX10 cards. it's kinda random which ones do tho.
Watch Dogs is the most surprising when it comes to performance as it is really not far off from the consoles.

especially considering I'm talking RT set to ultra here! and the RT reflections on console are way worse than PC ultra. if the game had an option to use console-grade RT settings, I bet you would get even closer to 60fps! RT reflections on console run with checkerboarding and a lower draw distance than PC ultra

Quake 2 RTX also runs on GTX 10 cards btw. but you'll have to run that at either 720p30 or 480p60 most likely, as it is really taxing being 100% ray traced

I think Metro Exodus actually gets close to, or even hits, 60fps at 1080p with RT on. but the OG version, not the RT-only version, I think


edit: oh, another good one is Control. if you ONLY activate transparent RT reflections you will easily be running that at 60fps at reasonably high resolutions too. and given that the transparent reflections are the most impactful to the overall look of the game IMO, that is an option I would actually use if I had a 1080ti. just a tip ;)
 
Last edited:

Stuart360

Member
yeah a bunch of games support RTX features on GTX10 cards. it's kinda random which ones do tho.
Watch Dogs is the most surprising when it comes to performance as it is really not far off from the consoles.

especially considering I'm talking RT set to ultra here! and the RT reflections on console are way worse than PC ultra. if the game had an option to use console-grade RT settings, I bet you would get even closer to 60fps! RT reflections on console run with checkerboarding and a lower draw distance than PC ultra

Quake 2 RTX also runs on GTX 10 cards btw. but you'll have to run that at either 720p30 or 480p60 most likely, as it is really taxing being 100% ray traced

I think Metro Exodus actually gets close to, or even hits, 60fps at 1080p with RT on. but the OG version, not the RT-only version, I think
Thanks for the info. I own Metro Exodus, and Quake 2 RTX as Steam was giving it away at one point (or is it still free, can't remember).
I'll give them a shot for the lols.
 

01011001

Banned
Thanks for the info. I own Metro Exodus, and Quake 2 RTX as Steam was giving it away at one point (or is it still free, can't remember).
I'll give them a shot for the lols.

yeah I edited my comment. if you have Control, or plan to play it, try transparent RT reflections. that runs extremely fast even on lower cards than the 1080ti. so for a 1080ti that is an option genuinely worth turning on IMO, not just for the fun of trying it out.

transparent reflections will cover surfaces like windows, monitors and some whiteboards.

because these surfaces are almost always 100% smooth with barely any roughness, they aren't taxing at all: smooth surfaces need fewer rays and don't scatter them as much, which saves performance like hell.

so if you plan on playing or replaying Control on that 1080ti, turn on Transparent Reflections, you will most likely still hit 1080p60 or even 1440p60 with that

edit: here it is running on my old PC using a GTX1070

control5jpjl5.png


this room mostly has smooth reflections, so I'm hitting 1080p/35fps here on a GTX1070... yeah... crazy how these old cards can still power through stuff like this
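To put rough numbers on the "smooth surfaces are cheap" point above, here's a hedged toy model (not Remedy's actual heuristic, just an illustration with made-up constants) of how a reflection ray budget grows as roughness widens the scattering lobe:

```python
# Toy cost model: reflection samples needed vs. surface roughness.
# Mirror-like surfaces (roughness ~0) resolve with about one ray per
# pixel; rough surfaces scatter rays across a wide lobe and need many
# samples to converge without noise. Constants are illustrative only.

def samples_needed(roughness: float, base: int = 1, k: int = 64) -> int:
    """Rough estimate: sample count grows with the lobe's spread,
    which widens roughly with roughness squared (GGX-like behaviour)."""
    return base + round(k * roughness ** 2)

for r in (0.0, 0.1, 0.3, 0.6, 1.0):
    print(f"roughness {r:.1f} -> ~{samples_needed(r)} rays/pixel")

# roughness 0.0 -> ~1 ray/pixel   (windows, monitors: near free)
# roughness 1.0 -> ~65 rays/pixel (matte surfaces: very expensive)
```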
 
Last edited:

Stuart360

Member
yeah I edited my comment. if you have Control, or plan to play it, try transparent RT reflections. that runs extremely fast even on lower cards than the 1080ti. so for a 1080ti that is an option genuinely worth turning on IMO, not just for the fun of trying it out.

transparent reflections will cover surfaces like windows, monitors and some whiteboards.

because these surfaces are almost always 100% smooth with barely any roughness, they aren't taxing at all: smooth surfaces need fewer rays and don't scatter them as much, which saves performance like hell.

so if you plan on playing or replaying Control on that 1080ti, turn on Transparent Reflections, you will most likely still hit 1080p60 or even 1440p60 with that
Thanks, I also own Control so I'll give it a shot.
I honestly never knew about RT working on 10 series cards. Shows how much I care about RT lol. Still, it will be cool to try them out.
 

01011001

Banned
Thanks, I also own Control so I'll give it a shot.
I honestly never knew about RT working on 10 series cards. Shows how much I care about RT lol. Still, it will be cool to try them out.

I added an image showing it running on my old 1070 lol, so your 1080ti should totally run at 1080p60 with transparent reflections

edit: here are some more
control2kzkrj.png

control135jg0.png


47FPS in that last one! giving the PS5 a run for its money LOL


I really hope Spider-Man will support GTX10 cards, I really wanna see that!
 
Last edited:

yamaci17

Member
01011001

with stock 3000 cl15 it is very bad
3466 cl14 + tight timings bring it one notch above a 3700x with stock 3000 cl15 (not kidding), to the point where it was smooth sailing all around. getting a locked 60 in those areas is near impossible, but frames with the RAM OC are high enough that VRR saved the day for me. though in the end I was more bound by the GPU, especially when I targeted 4K + DLSS performance


although a 3700x with 3466 cl14 + tight timings would also take the ball and run away. this game is hugely CCX-latency/DRAM bound, and benefits greatly from heavy RAM OC. tho yes, my 2700x with heavily tuned RAM outperforms most 3700xs in the wild lol. this is why the performance I get in Death Stranding is quite frustrating

you can't see it but RT is enabled in this comparison. I know this is not the same region as yours, but CapFrameX also deemed this place heavily CPU bound and uses it in his test suite.
tho the settings are not completely ultra; rather, they're CPU-bound-tailored settings per him:


other regions can usually lock to 60. actually, more than the CPU, I struggled to get the GPU in line for a locked 60. in this video at 1620p DLSS balanced, with RT GI and reflections, I was able to get a GPU-bound 55-60 FPS. tho later I decided to use 4K + DLSS performance; it uses higher quality LODs and the game looks much better, but it has a serious performance cost on the GPU.

 
Last edited:

yamaci17

Member
You know the annoying thing for me is that I have always used Intel CPUs, always, until I switched to AMD with the 2700x, and now a 3700x, simply because of AMD's price-to-performance compared to Intel. The annoying thing though with all my Intel CPUs: they would always get to high-90s percent usage on the CPU cores before they would start bottlenecking my GPUs, often at 99%. With AMD though, the cores get to high-70s/low-80s percent usage and then start bottlenecking the GPU.
Is that like an AMD thing?
it probably has to do with SMT / how CPU usage is calculated. Afterburner is not the most reliable thing when it comes to reporting CPU usage, especially with high thread counts

when I disable SMT, I can actually see the threads that are maxing out

tho I have no idea how this not-so-complex, basic scenery location maxes out the CPU at a mere 77 fps. no idea really. I wonder if Director's Cut is broken or something. I will try to do a cross-comparison with the base version. I remember getting upwards of 100+ frames in the base version in the starting Capital Knot City. yesterday I went back there in my current save, and frames were in the region of the 60s. I wonder if this problem I'm having is something to do with my save file / something bugging out / broken. I will have to observe. tho VRR saves the day and I'm okay with what I get for now.

h262VQo.png
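To illustrate the SMT/averaging point: most overlays report CPU usage averaged across all hardware threads, so on a 16-thread chip like the 3700x one pegged main thread barely moves the total. A quick sketch with hypothetical per-thread numbers:

```python
# Why overall CPU % can read ~70-80% while the game is already CPU-bound:
# overlays typically average usage across all hardware threads, so one
# saturated main thread is diluted by lighter ones. Numbers hypothetical.

threads = [100, 100, 95, 90, 85, 85, 80, 80,   # busy game/render threads
           75, 70, 65, 60, 55, 50, 45, 0]      # lighter workers + idle

overall = sum(threads) / len(threads)
print(f"overall usage:  {overall:.0f}%")       # ~71% -> looks "not maxed"
print(f"busiest thread: {max(threads)}%")      # 100% -> the real bottleneck
```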
 
it's mostly that area that's an issue, yes. I usually run DF's optimised settings + RT with DLSS and am usually above 60fps, often in the 70s, but that fucking part of the city is ridiculous.
I haven't gotten good information on this, but do you know if Cyberpunk without RT runs better on AMD or Nvidia cards? (I obviously know Nvidia destroys in ray tracing)
 
I don't think you were given bad info to be honest. Fact is, outside of a very small selection of PC games (Cyberpunk being one of them, and the worst CPU-wise), the vast majority of PC games will easily run at 100+fps if you have a good CPU and GPU. And if you're like me, and are happy with a locked 60fps (or play on a 60Hz screen), then CPUs way lower-end than a 3700x will hit 60fps in pretty much every game, alongside a decent GPU.
Cyberpunk is just one of those very rare games that is super demanding on both the CPU and GPU.
Even in the most well-optimized games you will still always get at least a couple more frames unless your GPU is truly maxed out
 

PaintTinJr

Member
it probably has to do with SMT / how CPU usage is calculated. Afterburner is not the most reliable thing when it comes to reporting CPU usage, especially with high thread counts

when I disable SMT, I can actually see the threads that are maxing out

tho I have no idea how this not-so-complex, basic scenery location maxes out the CPU at a mere 77 fps. no idea really. I wonder if Director's Cut is broken or something. I will try to do a cross-comparison with the base version. I remember getting upwards of 100+ frames in the base version in the starting Capital Knot City. yesterday I went back there in my current save, and frames were in the region of the 60s. I wonder if this problem I'm having is something to do with my save file / something bugging out / broken. I will have to observe. tho VRR saves the day and I'm okay with what I get for now.

h262VQo.png
It is hard to say, but the scene might be more complex than you are giving it credit for.
For a start, the draw distance is huge, with only partial mid-range scenery being occluded by the near scenery. You've also got twice the number of high-polygon model assets in the foreground than you'd have with even just an NPC - since you are carrying a human model at the main character's fidelity - one that also has very complex inverse kinematics at work with every step you take.

The draw distance might be thrashing the (coarse) CPU visibility algorithm as things in the distance pop in and out of visibility with small camera changes. It also seems like user-constructed items - a zip line and a bridge, which will have more taxing CPU logic - are nearby, and it looks like just behind the first mound there is a settlement in a valleyed section below, maybe with NPCs and story elements being prepared for interaction, with potentially lots of 3D assets and unique shaders needing to be loaded in advance. Or it could be Timefall related, with the game preparing for a Timefall event coming in the next few minutes. It could easily be a bug, but the game has lots of background stuff going on and clearly uses clipmap-type distance rendering, so it has lots of data streaming to do, and it may have overlapping areas where it has to conservatively leave more stuff in memory than needed, to be sure everything is updated coherently, which would impact the CPU more IMO.

If you turn around and walk 10-20m in the opposite direction, how does the frame-rate change?
 

yamaci17

Member
That looks like the with-RT benchmark. I meant a benchmark without RT, so the rasterization settings
it's without RT
shocking, I know, but ultra raster Cyberpunk is pretty heavy. even a 3090 won't get a locked 60 there. my GTX 1080 was getting 40-45 fps at 1080p, go figure :D
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That looks like the with-RT benchmark. I meant a benchmark without RT, so the rasterization settings
That's without RT.
With RT a 6900XT does like 10fps.

Cyberpunk is bandwidth-happy at higher resolutions.
AMD's Infinity Cache just ain't enough to keep up.
 

Guilty_AI

Member
it's without RT
shocking, I know, but ultra raster Cyberpunk is pretty heavy. even a 3090 won't get a locked 60 there. my GTX 1080 was getting 40-45 fps at 1080p, go figure :D
CP2077 (without RT) has a lot of settings that are pointless to use on ultra.

Volumetric fog res and cloud res on ultra instead of high can eat up around 10% of performance, and any visual gain is barely noticeable.

SSR and AO are even worse, especially SSR. Not just in Cyberpunk; they're just not very good lighting solutions in general anymore, especially compared with RT.
Set those to medium; putting them on high (or ultra for SSR) is sacrificing 15% performance for the equivalent of high-res turds.

Doing this I could increase the average benchmark performance on my GTX 1660S from 41 fps to 52 fps.
I can play the game at a locked 40 fps, 1080p with AMD sharpening on, with most drops into the 30-35 range occurring when running through highly crowded places or in specific missions with too many enemies/explosions (like the Maelstrom base mission from act 1). Not to mention I'm even using mods, which I believe are decreasing performance through CPU usage from what I've noticed.
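The jump from 41 to 52 fps lines up with how those savings stack: they compound on frame time rather than adding up on fps. A quick sanity check, assuming a fully GPU-bound scene and roughly independent costs:

```python
# GPU-bound frame-time math: a setting that "costs 10%" inflates frame
# time by ~10%, so dropping it shrinks frame time by that factor.
# Assumes the savings are independent and the GPU is the bottleneck.

base_fps = 41.0                    # ultra benchmark on the GTX 1660S
savings = [0.10, 0.15]             # volumetrics ~10%, SSR/AO ~15%

frame_time = 1000.0 / base_fps     # ms per frame
for s in savings:
    frame_time *= (1.0 - s)

print(f"estimated fps: {1000.0 / frame_time:.1f}")  # ~53.6, close to the 52 measured
```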
 
Last edited:

yamaci17

Member
CP2077 (without RT) has a lot of settings that are pointless to use on ultra.

Volumetric fog res and cloud res on ultra instead of high can eat up around 10% of performance, and any visual gain is barely noticeable.

SSR and AO are even worse, especially SSR. Not just in Cyberpunk; they're just not very good lighting solutions in general anymore, especially compared with RT.
Set those to medium; putting them on high (or ultra for SSR) is sacrificing 15% performance for the equivalent of high-res turds.

Doing this I could increase the average benchmark performance on my GTX 1660S from 41 fps to 52 fps.
I can play the game at a locked 40 fps, 1080p with AMD sharpening on, with most drops into the 30-35 range occurring when running through highly crowded places or in specific missions with too many enemies/explosions (like the Maelstrom base mission from act 1). Not to mention I'm even using mods, which I believe are decreasing performance through CPU usage from what I've noticed.
of course, I'm not saying anything against that, I too played with a mix of med-high settings. tho I still ended up locking fps to 40. hitting 99% GPU usage in this game creates very nasty input lag. I can live with that kind of lag in TPS games on a gamepad to some extent, but in an FPS game with mouse/keyboard I couldn't stomach it. with optimized settings and a 40 fps lock, I managed to keep my GPU consistently under 90% usage, which let me experience the game lag-free. this is a subjective experience, but the lag at 99% usage quite literally hurts my wrist. at this point I feel like half the problem became psychological: it feels like I'm dragging my hand/wrist through mud and my wrist takes immense pain, like I'm doing 3x more work for the same screen sweep. once I lock to 40 and the lag is gone, everything feels snappy, smooth and easy to move

this is why I appreciate consoles in some respects: they always leave 10-15% headroom to avoid that lag, and also to consistently get high 1% lows. both factors are fundamental parts of what makes a game feel really smooth. this is also why the VRR modes in Spider-Man and Ratchet & Clank offer up to 15-25% more frames. I don't know whether the GPU-bound lag occurs on them or not however, since I don't have a PS5. that is up to Digital Foundry and co to test out.

even with RT and stuff, I still had to play with a 40 fps cap in the end (I was getting 45-50 fps GPU-bound with 4K DLSS performance, RT GI at medium and RT reflections + RT shadows with optimized raster settings on the 3070. 4K + DLSS performance is very demanding, actually almost more demanding than native 1440p. but it's worth it, really; 4K DLSS performance looks a bit better than native 1440p in terms of image quality). I wish the game had a Reflex implementation. you can inject Reflex into the game with Special K, but I discovered that later on.
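What the 40fps cap is doing mechanically: pacing frame submission so the GPU never saturates and buffers frames ahead of display. A minimal sketch of the idea, assuming simple sleep-based pacing and a fake workload (real limiters like RTSS wait far more precisely):

```python
# Bare-bones frame limiter sketch: pace the loop to a fixed budget so
# the GPU gets idle time each frame instead of queuing work at 99% usage.
# render_frame() is just a stand-in for actual game/GPU work.
import time

TARGET_FPS = 40
FRAME_BUDGET = 1.0 / TARGET_FPS          # 25 ms per frame

def render_frame():
    time.sleep(0.018)                    # pretend GPU work takes ~18 ms

for _ in range(120):                     # a few seconds' worth of frames
    start = time.perf_counter()
    render_frame()
    # Sleep off the rest of the budget: this idle slice is the headroom
    # that keeps the driver from buffering frames ahead (the lag source).
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```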
 
Last edited:

Guilty_AI

Member
of course, I'm not saying anything against that, I too played with a mix of med-high settings. tho I still ended up locking fps to 40. hitting 99% GPU usage in this game creates very nasty input lag. I can live with that kind of lag in TPS games on a gamepad to some extent, but in an FPS game with mouse/keyboard I couldn't stomach it. with optimized settings and a 40 fps lock, I managed to keep my GPU consistently under 90% usage, which let me experience the game lag-free. this is a subjective experience, but the lag at 99% usage quite literally hurts my wrist. at this point I feel like half the problem became psychological: it feels like I'm dragging my hand/wrist through mud and my wrist takes immense pain, like I'm doing 3x more work for the same screen sweep. once I lock to 40 and the lag is gone, everything feels snappy, smooth and easy to move

this is why I appreciate consoles in some respects: they always leave 10-15% headroom to avoid that lag, and also to consistently get high 1% lows. both factors are fundamental parts of what makes a game feel really smooth. this is also why the VRR modes in Spider-Man and Ratchet & Clank offer up to 15-25% more frames. I don't know whether the GPU-bound lag occurs on them or not however, since I don't have a PS5. that is up to Digital Foundry and co to test out.

even with RT and stuff, I still had to play with a 40 fps cap in the end (I was getting 45-50 fps GPU-bound with 4K DLSS performance, RT GI at medium and RT reflections + RT shadows with optimized raster settings on the 3070. 4K + DLSS performance is very demanding, actually almost more demanding than native 1440p. but it's worth it, really; 4K DLSS performance looks a bit better than native 1440p in terms of image quality). I wish the game had a Reflex implementation. you can inject Reflex into the game with Special K, but I discovered that later on.
I also noticed the input lag in the game. I managed to fix it by playing in borderless windowed with v-sync off, without having to decrease settings (or limit fps to 40).
The only disadvantage of that approach is that dynamic resolution doesn't work in borderless windowed, so I couldn't solve the fps drops in heated moments.
 
Last edited:

yamaci17

Member
I also noticed the input lag in the game. I managed to fix it by playing in borderless windowed with v-sync off, without having to decrease settings (or limit fps to 40).
The only disadvantage of that approach is that dynamic resolution doesn't work in borderless windowed, so I couldn't solve the fps drops in heated moments.
I have mostly given up on dynamic resolution features on PC. half the games that have one implement it in a broken state. that is why I leave myself completely to the warm embrace of VRR. Guardians of the Galaxy's dynamic res was outright broken, it wasn't working. AC Valhalla's dynamic resolution feature only downgrades the resolution to a floor of 85%; it doesn't go below that

and then games like RDR 2, Elden Ring and many more have no implementation at all

proper dynamic resolution seems to be a very hard thing to come across on PC. tho I have one good example: Halo Infinite. its dynamic res feature really works great, exactly like on console. I remember in the campaign I used a dynamic res target of 60 FPS at 4K. it really looked great; I took 10-15 screenshots at random, and some were 1400p, some 1512p, some 1700p etc. it provided a smooth experience

if more games had properly working dynamic res like Halo Infinite, I would use it more. if I can reliably get upwards of 60 frames I don't care about it, actually. but when frames drop below 60, I would prefer dynamic resolution to lower the resolution instead. so I either target a config that reliably gets me upwards of 70 frames, or I just settle for VRR overall
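For the curious, a properly working DRS implementation like the one described for Halo Infinite boils down to a small feedback loop: measure the GPU frame time, nudge the render scale toward the target, and clamp the range (Valhalla's 85% floor is exactly such a clamp). A minimal sketch with illustrative constants:

```python
# Minimal dynamic-resolution feedback loop: scale the internal render
# resolution so GPU frame time converges on the target (here 60 fps).
# Constants (gain, clamps) are illustrative; real engines smooth over
# several frames and quantize the scale steps.

TARGET_MS = 1000.0 / 60.0            # 16.7 ms budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.50, 1.00    # a Valhalla-style floor would be 0.85

def update_scale(scale: float, gpu_ms: float, gain: float = 0.5) -> float:
    # Frame cost scales roughly with pixel count, i.e. with scale^2,
    # so the correction is applied through a square root.
    error = TARGET_MS / gpu_ms
    new_scale = scale * error ** (0.5 * gain)
    return max(MIN_SCALE, min(MAX_SCALE, new_scale))

scale = 1.0
for gpu_ms in (22.0, 20.5, 18.0, 17.0, 16.5):   # a demanding stretch
    scale = update_scale(scale, gpu_ms)
    print(f"gpu {gpu_ms:.1f} ms -> render scale {scale:.2f} "
          f"({int(2160 * scale)}p at 4K output)")
```

Run against that synthetic demanding stretch, the scale settles around 0.87 (~1870p at 4K output), which is the same behaviour as the mix of 1400p-1700p screenshots described above.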
 
Last edited:

PaintTinJr

Member
That's without RT.
With RT a 6900XT does like 10fps.

Cyberpunk is bandwidth-happy at higher resolutions.
AMD's Infinity Cache just ain't enough to keep up.
Looking at those bench numbers and having a quick look at the specs of a 3080, 3090ti and 6900XT on techpowerup, it looks like the non-RT performance scales with the FP32 FLOPs capability and L2 cache size, as though the PC version of the game doesn't use half float FLOPs for the 6900XT to get a half float boost.

Performing like that in non-RT does follow the view the game was a victim of feature creep and needed longer and an overhaul - leaving last-gen behind - to be performant.
 

Guilty_AI

Member
Looking at those bench numbers and having a quick look at the specs of a 3080, 3090ti and 6900XT on techpowerup, it looks like the non-RT performance scales with the FP32 FLOPs capability and L2 cache size, as though the PC version of the game doesn't use half float FLOPs for the 6900XT to get a half float boost.

Performing like that in non-RT does follow the view the game was a victim of feature creep and needed longer and an overhaul - leaving last-gen behind - to be performant.
In their defense tho, it's amazing the game can perform as well as it does given how dense and detailed the environments are, in an open-world game no less. I wouldn't expect a game like this to run on a GTX 1050 Ti.
 

PaintTinJr

Member
In their defense tho, it's amazing the game can perform as well as it does given how dense and detailed the environments are, in an open-world game no less. I wouldn't expect a game like this to run on a GTX 1050 Ti.
Yeah, not without them at least doubling the efficiency over what they have now, not without the user going down to 720p, and not without a decent CPU/chipset/RAM to couple with that card. In many ways the game they've released really needs PS4 Pro/X1X-level hardware to deliver an aesthetic close to the developer's vision IMO.
 

Guilty_AI

Member
Yeah, not without them at least doubling the efficiency over what they have now, not without the user going down to 720p, and not without a decent CPU/chipset/RAM to couple with that card. In many ways the game they've released really needs PS4 Pro/X1X-level hardware to deliver an aesthetic close to the developer's vision IMO.
Doesn't really need to go that far, though it's true 4GB of RAM would probably be troublesome to work with.
But with 8GB of RAM, and even using a 3rd-gen i7, it's possible to run it at 30 fps on a 1050 Ti with a mix of medium-low settings, using Ultra Quality FSR.




I personally find this quite impressive, though I guess this is more or less PS4 Pro/X1X-level hardware.
 
Last edited:

PaintTinJr

Member
Doesn't really need to go that far, though it's true 4GB of RAM would probably be troublesome to work with.
But with 8GB of RAM, and even using a 3rd-gen i7, it's possible to run it at 30 fps on a 1050 Ti with a mix of medium-low settings, using Ultra Quality FSR.




I personally find this quite impressive, though I guess this is more or less PS4 Pro/X1X-level hardware.

On balance, the extra memory (RAM+VRAM versus the consoles' unified pool), the boost clock, and it being Nvidia vs AMD (on Pro/X1X) probably do place the 2.5 TFLOPs 1050 Ti we're seeing in the video closer to par with those consoles than the raw tech specs would suggest.

With FSR in the video, what is the native resolution? Is it feeding in 1080p and outputting higher? Or is 1080p the FSR output, improved from a lower resolution? Given the fidelity looks visually similar to Batman: Arkham Knight on base PS4 (running at 30fps) from what I remember, I'm guessing it's the latter.
 

Guilty_AI

Member
On balance, the extra memory (RAM+VRAM versus the consoles' unified pool), the boost clock, and it being Nvidia vs AMD (on Pro/X1X) probably do place the 2.5 TFLOPs 1050 Ti we're seeing in the video closer to par with those consoles than the raw tech specs would suggest.

With FSR in the video, what is the native resolution? Is it feeding in 1080p and outputting higher? Or is 1080p the FSR output, improved from a lower resolution? Given the fidelity looks visually similar to Batman: Arkham Knight on base PS4 (running at 30fps) from what I remember, I'm guessing it's the latter.
It's the latter; he shows at the beginning that he's running 1080p with FSR enabled.
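For reference, FSR 1.0's scale factors are applied per axis (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x, per AMD's published numbers), so the internal resolution behind that 1080p output works out as:

```python
# FSR 1.0 internal render resolution for a given output resolution.
# Per-axis scale factors are AMD's published values for FSR 1.0.

FSR_MODES = {"Ultra Quality": 1.3, "Quality": 1.5,
             "Balanced": 1.7, "Performance": 2.0}

out_w, out_h = 1920, 1080
for mode, factor in FSR_MODES.items():
    w, h = round(out_w / factor), round(out_h / factor)
    print(f"{mode:<13} -> {w}x{h} internal")

# Ultra Quality -> 1477x831 internal: roughly what the 1050 Ti in the
# video is actually pushing before upscaling to 1080p.
```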
 

Tqaulity

Member
Ok Guys, lots of talking in circles and avoiding the facts. Let me try to move things forward a bit...

First a couple of facts:
  1. Next-gen console GPU perf has been within a range of PC GPUs between an RTX 2070 and RTX 3070. This is because every game engine is different, some are more/less optimized for AMD hardware, and some are more/less optimized for consoles. In the worst cases, the consoles are right around RX 5700XT/2070 but in best cases (AMD favoring workloads) we're seeing perf approach RTX2080ti/RTX 3070 levels
  2. Everyone always tries to compare PC vs consoles by looking at PC versions of games that were straight ports to consoles (not optimized for consoles). But the better way to actually compare the console perf differences with all of the delta in HW, OS, APIs, and SW is to look at games that have been designed with consoles in mind and then ported back to PC. In practice we have precious few of those to date but it's not a coincidence that many of those games are the ones people label as a "poor" or "weird" port.
    1. Death Stranding - built on an engine designed to harness the PS4 hardware, with optimizations for unified RAM, additional ACE engines etc. This isn't just a weird PC port but a console engine port to PC, which isn't trivial. While people love to dismiss it, this is probably the BEST example we have of console to PC perf since it's the only game available that has native versions on last gen, current gen, and PC. This shows the potential upper limit of console perf when a game/engine is optimized for the platform. BTW the perf delta here isn't to say that the PC HW is somehow less powerful but that porting that console game over is suboptimal (i.e. the SW is suboptimal and not the HW).
    2. Other Sony PC ports such as Days Gone, Horizon, and God of War cannot be used to draw meaningful conclusions since there are no PS5-native ports of those. Yet you can look at PC performance and still see that "equivalent" PC GPUs tend to perform considerably worse than their console counterparts. For example, in Horizon Zero Dawn on PC, it takes a GTX 1050 or something above an R9 390 GPU to match base PS4 settings at 1080p/30fps! (Link) A PS4 is supposed to be ~a 7850, yet a 1050 is evenly matched with a 7950. An R9 390, several generations ahead of a PS4, is generally nearly 2x faster than a 7950! (Link)
    3. The Spider-Man PC port is again a fine example of a console-optimized game and the difficulty of replicating that perf on a PC. The min/rec chart reflects this in spades, as it takes more PC HW grunt to overcome the more efficient and optimized console hardware. It'll be interesting to see the perf comparison since again we have native PS4/PS5 and PC versions to compare. Don't be surprised if the PlayStation consoles punch well above their weight in this title

Also, lots of circular questions around examples of PS5 performing closer to a RTX 3070 aside from Death Stranding and Horizon. Some mention of a Call of Duty game. Ok here you go:

Call of Duty Cold War: PS5 is within 8% of a 3070 according to Digital Foundry's test
6zkYrZj.png


Assassin's Creed Valhalla: PS5 is only about 12-15% below a 2080Ti (which is close to a 3070):
B5tMKK4.png


Both of those are games that ran better on AMD than Nvidia, so it makes sense the consoles run better. There aren't a lot of examples yet, but trust that you'll see more as more true console games are ported to PC and developers get more time with the new hardware.
Ok had to follow up on this now that Spider-Man Remastered PC benchmarks are coming out and we can get a clearer picture of what happens when you take a game engine developed and optimized for fixed HW like a console and attempt to port it to PC. Considering the API differences with DX12 and the less efficient data streaming on a PC, the result is that it takes much more PC HW grunt to effectively match PS5 settings. Case in point:

XySdr5KptktFSG3jWS2ERe-1200-80.png.webp

At 1440p without RT enabled (i.e. PS5 Performance mode), the PS5 delivers a rock solid 60fps with virtually no drops. Sure, the settings here on PS5 are mostly "high" and not quite ultra but the difference with ultra on PC is mostly in shadow detail that is barely noticeable during gameplay. To get a 99% 60fps performance (again average here is not very relevant since console consistency is the key to the experience) on PC, you will need something much higher than a RX 6600 or RTX 2060 (something on the order of 50% faster). PS5 perf is probably right around a 6700XT (slightly lower than the 6750XT shown here) or RTX 3060Ti (not shown on this chart).

bg6UPRAVwRYqrQ9JSer4Kf-1200-80.png.webp

With DXR we can get a mostly apples to apples comparison to the PS5 "Performance RT mode". Again PS5 settings here are mostly at "high" level but the visual difference is minor and the performance is a virtually locked 60fps. Looking at PC using IGTI as is the case on PS5, we see that RTX 3070 just barely hits 60fps 99% which is offering a similar experience and level of consistency to PS5. Remember that PS5 is frame capped to 60fps in most cases but can run unlocked now in VRR modes. In that case, PS5 version can reach above 100fps in Performance RT mode with an average closer to mid 70s to 80fps (Link). Again, way above the perf of the closest technical equivalent in an RX 6600 XT and a virtual match for a RX 6750XT/RTX 3070.


SCfY4gsNfnd5pJNWn98mNf-1200-80.png.webp

For this case, we're looking at the PS5's fidelity mode which is native 4K/30fps without VRR and native 4K/40fps+ with VRR. In fact, with VRR enabled, PS5 fidelity mode generally stays around the 50fps range but the 99% fps is 40. Thus, matching that overall level of performance on PC will take a RTX3070/RX6750XT at a minimum. Again, slight difference in settings, but as a ballpark it's pretty clear that something much lower than a 3070/6750 (i.e. a 2070 or 5700XT) won't really be able to hit PS5 level.


Conclusions:

  • This thread posed the question of identifying examples of when the PS5 can match or approach an RTX3070 level of PC performance. I submitted that we've seen this realized in the "best case" scenarios and the best case really is a console developed engine being ported back to PC. That includes Decima (Horizon, Death Stranding) and now Spiderman (with Uncharted Comparison coming soon). With Spiderman we have one of the best examples of truly comparing the REAL WORLD performance between console and PC.
  • To be clear, nobody is saying that the GPU HW in a PS5 is as capable as an RTX3070 when looking at it in isolation. But as has been reiterated many times, actual game performance relies on much more than just the metal making up a single component.
  • The total system performance in a PS5 in best case scenarios (engines that are optimized for AMD HW and utilizes the console advantages such as faster data throughput and unified memory) has been seen to approach/match a RTX 3070 in like for like comparisons including in Spiderman remastered
  • The consistency of performance on console is really key and most PC vs Console comparisons completely miss this. These new gen consoles have done a great job of delivering locked framerates and high stable performance which you don't typically see on PC with unlocked frame rates. Matching that stability of experience on PC typically takes higher than expected HW when compared to console anyway
  • True this comparison I made isn't 100% like for like since the PC isn't running at identical settings to PS5. But my point is to answer the question "what will it take on PC to match the overall experience" including visuals and performance. Some slightly cleaner shadows and AO and higher resolution reflections will hardly be noticed when actually playing and not standing still to inspect certain scenes.
  • Also, remember that the console performance is entirely limited by the frame caps that are standard in that space. If a game can't be locked to 60fps for example, then the dev may have to lock to 30fps even though the game could be running with an average of 50-60fps. That's a 50% or greater delta in performance that is just left on the table due to the frame cap. We've seen this recently with Ratchet & Clank on PS5 with the VRR update. The fidelity mode that was previously locked to 30fps was running with an average of around 50fps in most scenarios when unlocked (an almost 70% increase in performance). It would have been grossly incorrect to take the Ratchet & Clank fidelity performance at 30fps as a direct indication of what the PS5 is capable of. When comparing to PC systems running unlocked, you have to consider the frame cap in order to accurately compare performance.
  • jFKgHfj.jpg
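The frame-cap arithmetic in that last bullet checks out; here it is spelled out, using the Ratchet & Clank VRR numbers cited above:

```python
# Headroom hidden by a frame cap: a 30 fps locked mode that turns out
# to average ~50 fps when unlocked was leaving this much on the table.

capped_fps, unlocked_avg = 30.0, 50.0
headroom = (unlocked_avg / capped_fps - 1) * 100
print(f"performance hidden by the cap: ~{headroom:.0f}%")  # ~67%, i.e. "almost 70%"
```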
 

yamaci17

Member
if the difference is minor, just use console-equivalent settings
if those benchmarks are done with higher settings than what the PS5 has, it bears no value to the points you're trying to make
if you believe those settings offer minimal improvement to IQ, then you must also believe and accept that a 3070 user can use those settings to, once again, achieve more performance than the PS5
if you don't match settings, it is a pointless thing to discuss. I am myself a 3070 owner and I never use maxed ultra settings. I always try to use optimized, console-like settings, for more performance per buck.

"With DXR we can get a mostly apples to apples comparison to the PS5 "Performance RT mode""

you can't. max RT settings need approximately 35-40% more hardware grunt than the normal RT settings the PS5 most likely uses, per PCGH benchmarks. you can claim they look very similar or not (they do not), but there's a huge performance discrepancy between the high and medium ray tracing presets.

once DF shares their "ps5" equivalent settings, I will try to make an actual apples to apples comparison for you with my 3070. don't you worry...

final point: DLSS quality even at 1440p will most likely look better than the game's native TAA. this alone will bring even a 2060 Super up to pace with a PS5 at PS5-equivalent settings. at least at 4K, it is CONFIRMED that DLSS quality looks better than temporal injection and FSR 2.0.


so, an NVIDIA RTX GPU gives you an anti-aliasing method that actually looks better than native 4K, and with that performance boost, you will easily surpass the PS5's performance profile even with those max settings

now, imagine using "optimized" console equivalent settings and adding DLSS on top of that. how will things shape up then?

as I said, if you really want an apples-to-apples comparison, use optimized console-equivalent settings
 
Last edited:

Tqaulity

Member
if the difference is minor, just use console-equivalent settings
if those benchmarks are done with higher settings than what the PS5 has, it bears no value to the points you're trying to make
if you believe those settings offer minimal improvement to IQ, then you must also believe and accept that a 3070 user can use those settings to, once again, achieve more performance than the PS5
if you don't match settings, it is a pointless thing to discuss. I am myself a 3070 owner and I never use maxed ultra settings. I always try to use optimized, console-like settings, for more performance per buck.

"With DXR we can get a mostly apples to apples comparison to the PS5 "Performance RT mode""

you can't. max RT settings need approximately 35-40% more hardware grunt than the normal RT settings the PS5 most likely uses, per PCGH benchmarks. you can claim they look very similar or not (they do not), but there's a huge performance discrepancy between the high and medium ray tracing presets.

once DF shares their "ps5" equivalent settings, I will try to make an actual apples to apples comparison for you with my 3070. don't you worry...

final point: DLSS quality even at 1440p will most likely look better than the game's native TAA. this alone will bring even a 2060 Super up to pace with a PS5 at PS5-equivalent settings. at least at 4K, it is CONFIRMED that DLSS quality looks better than temporal injection and FSR 2.0.


so, an NVIDIA RTX GPU gives you an anti-aliasing method that actually looks better than native 4K, and with that performance boost, you will easily surpass the PS5's performance profile even with those max settings

now, imagine using "optimized" console equivalent settings and adding DLSS on top of that. how will things shape up then?

as I said, if you really want an apples-to-apples comparison, use optimized console-equivalent settings
Ok appreciate the response. Yes it'll be interesting to get "console equivalent settings" and we can follow up when that is available. I included a non-RT case because I'm aware that the RT settings have huge performance implications.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
if the difference is minor, just use console-equivalent settings
if those benchmarks are done with higher settings than what the PS5 has, it bears no value to the points you're trying to make
if you believe those settings offer minimal improvement to IQ, then you must also believe and accept that a 3070 user can use those settings to, once again, achieve more performance than the PS5
if you don't match settings, it is a pointless thing to discuss. I am myself a 3070 owner and I never use maxed ultra settings. I always try to use optimized, console-like settings, for more performance per buck.

"With DXR we can get a mostly apples to apples comparison to the PS5 "Performance RT mode""

you can't. max RT settings need approximately 35-40% more hardware grunt than the normal RT settings the PS5 most likely uses, per PCGH benchmarks. you can claim they look very similar or not (they do not), but there's a huge performance discrepancy between the high and medium ray tracing presets.

once DF shares their "ps5" equivalent settings, I will try to make an actual apples to apples comparison for you with my 3070. don't you worry...

final point: DLSS quality even at 1440p will most likely look better than the game's native TAA. this alone will bring even a 2060 Super up to pace with a PS5 at PS5-equivalent settings. at least at 4K, it is CONFIRMED that DLSS quality looks better than temporal injection and FSR 2.0.


so, an NVIDIA RTX GPU gives you an anti-aliasing method that actually looks better than native 4K, and with that performance boost, you will easily surpass the PS5's performance profile even with those max settings

now, imagine using "optimized" console equivalent settings and adding DLSS on top of that. how will things shape up then?

as I said, if you really want an apples-to-apples comparison, use optimized console-equivalent settings
I don't think you understand what 99th percentile means.
 

octiny

Banned
Ok had to follow up on this now that Spider-Man Remastered PC benchmarks are coming out and we can get a clearer picture of what happens when you take a game engine developed and optimized for fixed HW like a console and attempt to port it to PC. Considering the API differences with DX12 and the less efficient data streaming on a PC, the result is that it takes much more PC HW grunt to effectively match PS5 settings. Case in point:

XySdr5KptktFSG3jWS2ERe-1200-80.png.webp

At 1440p without RT enabled (i.e. PS5 Performance mode), the PS5 delivers a rock solid 60fps with virtually no drops. Sure, the settings here on PS5 are mostly "high" and not quite ultra but the difference with ultra on PC is mostly in shadow detail that is barely noticeable during gameplay. To get a 99% 60fps performance (again average here is not very relevant since console consistency is the key to the experience) on PC, you will need something much higher than a RX 6600 or RTX 2060 (something on the order of 50% faster). PS5 perf is probably right around a 6700XT (slightly lower than the 6750XT shown here) or RTX 3060Ti (not shown on this chart).

bg6UPRAVwRYqrQ9JSer4Kf-1200-80.png.webp

With DXR we can get a mostly apples to apples comparison to the PS5 "Performance RT mode". Again PS5 settings here are mostly at "high" level but the visual difference is minor and the performance is a virtually locked 60fps. Looking at PC using IGTI as is the case on PS5, we see that RTX 3070 just barely hits 60fps 99% which is offering a similar experience and level of consistency to PS5. Remember that PS5 is frame capped to 60fps in most cases but can run unlocked now in VRR modes. In that case, PS5 version can reach above 100fps in Performance RT mode with an average closer to mid 70s to 80fps (Link). Again, way above the perf of the closest technical equivalent in an RX 6600 XT and a virtual match for a RX 6750XT/RTX 3070.


SCfY4gsNfnd5pJNWn98mNf-1200-80.png.webp

For this case, we're looking at the PS5's fidelity mode which is native 4K/30fps without VRR and native 4K/40fps+ with VRR. In fact, with VRR enabled, PS5 fidelity mode generally stays around the 50fps range but the 99% fps is 40. Thus, matching that overall level of performance on PC will take a RTX3070/RX6750XT at a minimum. Again, slight difference in settings, but as a ballpark it's pretty clear that something much lower than a 3070/6750 (i.e. a 2070 or 5700XT) won't really be able to hit PS5 level.


Conclusions:

  • This thread posed the question of identifying examples of when the PS5 can match or approach an RTX3070 level of PC performance. I submitted that we've seen this realized in the "best case" scenarios and the best case really is a console developed engine being ported back to PC. That includes Decima (Horizon, Death Stranding) and now Spiderman (with Uncharted Comparison coming soon). With Spiderman we have one of the best examples of truly comparing the REAL WORLD performance between console and PC.
  • To be clear, nobody is saying that the GPU HW in a PS5 is as capable as an RTX3070 when looking at it in isolation. But as has been reiterated many times, actual game performance relies on much more than just the metal making up a single component.
  • The total system performance in a PS5 in best case scenarios (engines that are optimized for AMD HW and utilizes the console advantages such as faster data throughput and unified memory) has been seen to approach/match a RTX 3070 in like for like comparisons including in Spiderman remastered
  • The consistency of performance on console is really key and most PC vs Console comparisons completely miss this. These new gen consoles have done a great job of delivering locked framerates and high stable performance which you don't typically see on PC with unlocked frame rates. Matching that stability of experience on PC typically takes higher than expected HW when compared to console anyway
  • True this comparison I made isn't 100% like for like since the PC isn't running at identical settings to PS5. But my point is to answer the question "what will it take on PC to match the overall experience" including visuals and performance. Some slightly cleaner shadows and AO and higher resolution reflections will hardly be noticed when actually playing and not standing still to inspect certain scenes.
  • Also, remember that the console performance is entirely limited by the frame caps that are standard in that space. If a game can't be locked to 60fps for example, then the dev may have to lock to 30fps even though the game could be running with an average of 50-60fps. That's a 50% or greater delta in performance that is just left on the table due to the frame cap. We've seen this recently with Ratchet & Clank on PS5 with the VRR update. The fidelity mode that was previously locked to 30fps was running with an average of around 50fps in most scenarios when unlocked (an almost 70% increase in performance). It would have been grossly incorrect to take the Ratchet & Clank fidelity performance at 30fps as a direct indication of what the PS5 is capable of. When comparing to PC systems running unlocked, you have to consider the frame cap in order to accurately compare performance.
  • jFKgHfj.jpg

Take those numbers & summary w/ a grain of salt. Multiple patches have been & still are being pushed through before the actual launch. It'd be foolish to come to any conclusion at all.

"With DXR we can get a mostly apples to apples comparison to the PS5 "Performance RT mode". Again PS5 settings here are mostly at "high" level but the visual difference is minor and the performance is a virtually locked 60fps."

I take it you didn't look at how RT fidelity mode actually looks & compares to the PC version? It's a night & day difference. Nothing "minor" about it, let alone all the other improvements. That's not even getting into the even more drastic RT downgrades in performance RT mode vs fidelity mode on PS5. Attempting to compare them as if they are 1:1 while discounting all the PC improvements is a travesty & a disservice. The PS5 is nowhere near equivalent to a 3070 in this game if they used actual equivalent settings, nor is it now, as we can clearly see they aren't even close w/ regards to settings.

drgMLpb.jpg


Fidelity mode on PS5 below via 01011001 (which has higher RT resolution/settings than the Performance RT 60 FPS mode per DF)

1828c82219399-screensytema.png

1828c824f3078-screensbxcbg.png


I won't get into any of the other stuff you said, as I'm not going to waste any more time on it since it's pointless at the moment.

Will wait for the DF comparison (they are waiting for the final pre-release patches), gamegpu.com & a few of my favorite YT'ers w/ optimal rigs to do real benchmarks & comparisons so we can gauge performance correctly.
 
Last edited:

yamaci17

Member
I don't think you understand what 99th percentile means.
me?

Ok appreciate the response. Yes it'll be interesting to get "console equivalent settings" and we can follow up when that is available. I included a non-RT case because I'm aware that the RT settings have huge performance implications.
that's the problem: even without RT, the PS5 is most likely not maxing out raster settings. it will use optimized raster and RT settings together. so it may also be misleading to assume that the PS5 uses maximum raster settings in its non-RT mode (I'm fairly sure it will use very high settings indeed, but it may not be maxed out). do note that this game includes HBAO+, which is more performance-intensive than most traditional AO methods, and if the "max settings" benchmark is done with settings like that added over the PS5 version, that really isn't a fair comparison
for example, Metro Exodus Enhanced Edition has tons of extra stuff like HairWorks, tessellation and RT reflections over the console versions, which usually skews comparisons in favor of the consoles.
 
Last edited:

Jaybe

Member
Reviews of the PC port look phenomenal. Awesome to see. This will sell a ton over the launch month.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not

Hahaha sorry, fucking NeoGAF Gold keeps updating pages while I'm on them so I hit the wrong quote button.
Carry on.

Stealth GAF gold brag

Ok had to follow up on this now that Spider-Man Remastered PC benchmarks are coming out and we can get a clearer picture of what happens when you take a game engine developed and optimized for fixed HW like a console and attempt to port it to PC. Considering the API differences with DX12 and the less efficient data streaming on a PC, the result is that it takes much more PC HW grunt to effectively match PS5 settings. Case in point:

XySdr5KptktFSG3jWS2ERe-1200-80.png.webp

At 1440p without RT enabled (i.e. PS5 Performance mode), the PS5 delivers a rock solid 60fps with virtually no drops. Sure, the settings here on PS5 are mostly "high" and not quite ultra but the difference with ultra on PC is mostly in shadow detail that is barely noticeable during gameplay. To get a 99% 60fps performance (again average here is not very relevant since console consistency is the key to the experience) on PC, you will need something much higher than a RX 6600 or RTX 2060 (something on the order of 50% faster). PS5 perf is probably right around a 6700XT (slightly lower than the 6750XT shown here) or RTX 3060Ti (not shown on this chart).

bg6UPRAVwRYqrQ9JSer4Kf-1200-80.png.webp

With DXR we can get a mostly apples to apples comparison to the PS5 "Performance RT mode". Again PS5 settings here are mostly at "high" level but the visual difference is minor and the performance is a virtually locked 60fps. Looking at PC using IGTI as is the case on PS5, we see that RTX 3070 just barely hits 60fps 99% which is offering a similar experience and level of consistency to PS5. Remember that PS5 is frame capped to 60fps in most cases but can run unlocked now in VRR modes. In that case, PS5 version can reach above 100fps in Performance RT mode with an average closer to mid 70s to 80fps (Link). Again, way above the perf of the closest technical equivalent in an RX 6600 XT and a virtual match for a RX 6750XT/RTX 3070.


SCfY4gsNfnd5pJNWn98mNf-1200-80.png.webp

For this case, we're looking at the PS5's fidelity mode which is native 4K/30fps without VRR and native 4K/40fps+ with VRR. In fact, with VRR enabled, PS5 fidelity mode generally stays around the 50fps range but the 99% fps is 40. Thus, matching that overall level of performance on PC will take a RTX3070/RX6750XT at a minimum. Again, slight difference in settings, but as a ballpark it's pretty clear that something much lower than a 3070/6750 (i.e. a 2070 or 5700XT) won't really be able to hit PS5 level.


Conclusions:

  • This thread posed the question of identifying examples of when the PS5 can match or approach an RTX3070 level of PC performance. I submitted that we've seen this realized in the "best case" scenarios and the best case really is a console developed engine being ported back to PC. That includes Decima (Horizon, Death Stranding) and now Spiderman (with Uncharted Comparison coming soon). With Spiderman we have one of the best examples of truly comparing the REAL WORLD performance between console and PC.
  • To be clear, nobody is saying that the GPU HW in a PS5 is as capable as an RTX3070 when looking at it in isolation. But as has been reiterated many times, actual game performance relies on much more than just the metal making up a single component.
  • The total system performance in a PS5 in best case scenarios (engines that are optimized for AMD HW and utilizes the console advantages such as faster data throughput and unified memory) has been seen to approach/match a RTX 3070 in like for like comparisons including in Spiderman remastered
  • The consistency of performance on console is really key and most PC vs Console comparisons completely miss this. These new gen consoles have done a great job of delivering locked framerates and high stable performance which you don't typically see on PC with unlocked frame rates. Matching that stability of experience on PC typically takes higher than expected HW when compared to console anyway
  • True this comparison I made isn't 100% like for like since the PC isn't running at identical settings to PS5. But my point is to answer the question "what will it take on PC to match the overall experience" including visuals and performance. Some slightly cleaner shadows and AO and higher resolution reflections will hardly be noticed when actually playing and not standing still to inspect certain scenes.
  • Also, remember that the console performance is entirely limited by the frame caps that are standard in that space. If a game can't be locked to 60fps for example, then the dev may have to lock to 30fps even though the game could be running with an average of 50-60fps. That's a 50% or greater delta in performance that is just left on the table due to the frame cap. We've seen this recently with Ratchet & Clank on PS5 with the VRR update. The fidelity mode that was previously locked to 30fps was running with an average of around 50fps in most scenarios when unlocked (an almost 70% increase in performance). It would have been grossly incorrect to take the Ratchet & Clank fidelity performance at 30fps as a direct indication of what the PS5 is capable of. When comparing to PC systems running unlocked, you have to consider the frame cap in order to accurately compare performance.
  • jFKgHfj.jpg
You know what 99th percentile is, right?
 

yamaci17

Member
Ok had to follow up on this now that Spider-Man Remastered PC benchmarks are coming out and we can get a clearer picture of what happens when you take a game engine developed and optimized for fixed HW like a console and attempt to port it to PC. Considering the API differences with DX12 and the less efficient data streaming on a PC, the result is that it takes much more PC HW grunt to effectively match PS5 settings. Case in point:

XySdr5KptktFSG3jWS2ERe-1200-80.png.webp

At 1440p without RT enabled (i.e. PS5 Performance mode), the PS5 delivers a rock solid 60fps with virtually no drops. Sure, the settings here on PS5 are mostly "high" and not quite ultra but the difference with ultra on PC is mostly in shadow detail that is barely noticeable during gameplay. To get a 99% 60fps performance (again average here is not very relevant since console consistency is the key to the experience) on PC, you will need something much higher than a RX 6600 or RTX 2060 (something on the order of 50% faster). PS5 perf is probably right around a 6700XT (slightly lower than the 6750XT shown here) or RTX 3060Ti (not shown on this chart).

bg6UPRAVwRYqrQ9JSer4Kf-1200-80.png.webp

With DXR we can get a mostly apples to apples comparison to the PS5 "Performance RT mode". Again PS5 settings here are mostly at "high" level but the visual difference is minor and the performance is a virtually locked 60fps. Looking at PC using IGTI as is the case on PS5, we see that RTX 3070 just barely hits 60fps 99% which is offering a similar experience and level of consistency to PS5. Remember that PS5 is frame capped to 60fps in most cases but can run unlocked now in VRR modes. In that case, PS5 version can reach above 100fps in Performance RT mode with an average closer to mid 70s to 80fps (Link). Again, way above the perf of the closest technical equivalent in an RX 6600 XT and a virtual match for a RX 6750XT/RTX 3070.


[benchmark chart: 4K]

For this case, we're looking at the PS5's fidelity mode, which is native 4K/30fps without VRR and native 4K/40fps+ with VRR. In fact, with VRR enabled, PS5 fidelity mode generally stays around the 50fps range, with a 99th-percentile figure of 40fps. Matching that overall level of performance on PC will take an RTX 3070/RX 6750 XT at a minimum. Again, there are slight differences in settings, but as a ballpark it's pretty clear that anything much lower than a 3070/6750 (i.e. a 2070 or 5700 XT) won't really be able to hit PS5 level.


Conclusions:

  • This thread posed the question of identifying examples of when the PS5 can match or approach an RTX 3070 level of PC performance. I submitted that we've seen this realized in "best case" scenarios, and the best case really is a console-developed engine being ported back to PC. That includes Decima (Horizon, Death Stranding) and now Spiderman (with an Uncharted comparison coming soon). With Spiderman we have one of the best examples of truly comparing the REAL WORLD performance between console and PC.
  • To be clear, nobody is saying that the GPU HW in a PS5 is as capable as an RTX 3070 when looked at in isolation. But as has been reiterated many times, actual game performance relies on much more than just the metal making up a single component.
  • The total system performance of a PS5 in best-case scenarios (engines that are optimized for AMD HW and utilize the console's advantages, such as faster data throughput and unified memory) has been seen to approach/match an RTX 3070 in like-for-like comparisons, including in Spiderman Remastered.
  • The consistency of performance on console is really key, and most PC vs console comparisons completely miss this. These new-gen consoles have done a great job of delivering locked framerates and high, stable performance, which you don't typically see on PC with unlocked frame rates. Matching that stability of experience on PC typically takes higher-spec HW than the raw comparison would suggest.
  • True, this comparison I made isn't 100% like for like, since the PC isn't running at identical settings to the PS5. But my point is to answer the question "what will it take on PC to match the overall experience", including both visuals and performance. Slightly cleaner shadows and AO and higher-resolution reflections will hardly be noticed when actually playing rather than standing still to inspect certain scenes.
  • Also, remember that console performance is ultimately limited by the frame caps that are standard in that space. If a game can't be locked to 60fps, for example, then the dev may have to lock to 30fps even though the game could be running at an average of 50-60fps. That's a 50% or greater performance delta left on the table due to the frame cap. We've seen this recently with Ratchet & Clank on PS5 with the VRR update: the fidelity mode that was previously locked to 30fps ran at an average of around 50fps in most scenarios when unlocked (50/30 ≈ 1.67, an almost 70% increase). It would have been grossly incorrect to take the Ratchet & Clank fidelity performance at 30fps as a direct indication of what the PS5 is capable of. When comparing against PC systems running unlocked, you have to account for the frame cap to compare performance accurately.
Do you have 1% low (99th-percentile) values for the uncapped VRR mode on PS5 for Spiderman?

Unlocked framerates will be more varied overall: a GPU that is averaging 73fps with 1% lows of 45fps may still lock to a rock-solid 60fps across the board.

For example, I can get an unlocked 78fps in F5 with 1% lows at 70 and 0.1% lows at 64 (not too bad):



[screenshot: unlocked framerate stats]


When I lock the game to 60, I get a perfect 60 across the board.

[screenshot: locked 60fps stats]



Mind you, I use Xbox Series X equivalent (performance mode) settings at 4K. The only difference is that my night shadows are enabled (but that doesn't affect performance here, the scene not being at... night).

With optimized settings, I believe you'd have enough headroom to hit similarly high 1% lows.

You said yourself that the only reason PS5/Xbox have perfect cap accuracy is that they have some headroom (which I've accepted and explained many times on this forum). That's also why those benchmarks are somewhat misleading: there's a huge performance difference between max settings and moderate settings in this game, just as in many other games.
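To make that concrete, here's a toy simulation of the cap effect, using synthetic frametimes shaped roughly like the F5 numbers above (illustrative only, not real capture data): a 60fps cap just means no frame presents sooner than 16.67ms, so when the uncapped lows already sit above 60, the locked result is flat.

```python
# Toy simulation: synthetic frametime trace averaging ~78fps with lows
# in the high 60s (numbers loosely modeled on the capture above).
import numpy as np

rng = np.random.default_rng(42)
frametimes_ms = np.clip(rng.normal(12.8, 0.7, 10_000), 11.0, 15.6)

def report(ft_ms, label):
    print(f"{label}: avg {1000.0 / ft_ms.mean():.0f} fps | "
          f"1% low {1000.0 / np.percentile(ft_ms, 99):.0f} fps | "
          f"0.1% low {1000.0 / np.percentile(ft_ms, 99.9):.0f} fps")

report(frametimes_ms, "unlocked")

# A 60fps cap means no frame may present sooner than 1000/60 ms. Every
# frame here already renders faster than that, so the capped trace is a
# perfectly flat 16.67ms: a locked 60 across the board.
capped_ms = np.maximum(frametimes_ms, 1000.0 / 60.0)
report(capped_ms, "locked 60")
```

(And on real hardware the cap also lowers GPU load, which frees headroom and tends to help the true lows even further, something this sim can't show.)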
 
Last edited:

StreetsofBeige

Gold Member
Take those numbers & summary w/ a grain of salt. Multiple patches have been & still are being pushed through before the actual launch. It'd be foolish to come to any conclusion at all.

"With DXR we can get a mostly apples to apples comparison to the PS5 "Performance RT mode". Again PS5 settings here are mostly at "high" level but the visual difference is minor and the performance is a virtually locked 60fps."

I take it you didn't look at how RT fidelity mode actually looks & compares to the PC version? It's a night-&-day difference. Nothing "minor" about it, let alone all the other improvements. That's not even getting into the even more drastic RT downgrades w/ Performance RT mode vs fidelity mode on PS5. Attempting to compare them as if they were 1:1 while discounting all the PC improvements is a travesty & a disservice. The PS5 would be nowhere near equivalent to a 3070 in this game if they used actually equivalent settings, and as we can clearly see, the settings aren't even close.

[screenshot: PC version]


Fidelity mode on PS5 below, via 01011001 (which has higher RT resolution/settings than the Performance RT 60fps mode, per DF)

[PS5 fidelity mode screenshots]


I won't get into any of the other stuff you said, as I'm not going to waste any more time on it; it's pointless at the moment.

Will wait for the DF comparison (they are waiting for the final pre-release patches), gamegpu.com & a few of my favorite YT'ers w/ optimal rigs to do real benchmarks & comparisons so we can gauge performance correctly.
That's some high quality RT.

My stance on RT on consoles stands. Maybe a good PC with a great GPU can do RT well, but on consoles it's a waste.

[screenshots]
 

Tqaulity

Member
Take those numbers & summary w/ a grain of salt. Multiple patches have been & still are being pushed through before the actual launch. It'd be foolish to come to any conclusion at all.
Cool. Perfectly valid point and I'll fall back. Don't have the time or energy to argue here. We'll have more data and patch updates soon.
 
Game Ready Drivers are here - Check your GeForce Experience.
[image]



Interesting, the driver notes say the game will also use Nvidia HBAO+.
I would have thought RTAO or at least GTAO would have made it in, but I'll take HBAO+ over regular SSAO.
I thought you said you were waiting for a sale? Can you check if you can get a locked 120fps on a 5600X?
 

S0ULZB0URNE

Member
Ok had to follow up on this now that Spiderman Remastered PC benchmarks are coming out and we can get a clearer picture of what happens when you take a game engine developed and optimized for fixed HW like a console and attempt to port it back to PC. […]
[GIF]
 