
Marvel's Spider-Man Remastered (PC & Steam Deck) | Review Thread

yamaci17

Member
It’s weird the dude above said it’s a 1700x then
on paper, he is correct. reality is often disappointing. console APIs will extract more performance out of similar hardware most of the time. the GPU side of things seems to have improved, but the CPU side of things is still underwhelming

don't worry tho, i have a crap ryzen 2700x that lets me easily emulate a 1700x. i'm pretty sure it won't hit a locked 60 fps with ray tracing
tho it's also a moot point: a 5600 is dirt cheap and destroys the then-overpriced 1700x/2700x and even the 3700x. yes, the 5600 decimates a 2700x. it would probably... erase the 1700x out of existence. even if you have a b350 board, nowadays you can actually upgrade to zen 3 lol

and no, the 3700x will never surpass a 5600. the IPC difference is too big, and 12 threads are still plenty to survive the entire generation. don't get me wrong, but it is pointless to fixate on the 2700x/3700x/3600. those are relic CPUs. no one in their right mind (aside from crazy people like me) would match a 3070 with a 2700x/1700x. at WORST you can pair it with something like a 3700x/3600, but even then, I WOULD say a 3070 deserves a Zen 3 CPU at minimum.






Recommended settings are almost always for 60fps unless otherwise stated (which it isn't). Sure, that doesn't mean 60fps at max settings. If it were for 30fps, the minimum and recommended CPUs would be very similar.

i know, i'm just pointing out that in almost every 8th gen game the ps4 could hit a rock-solid 30 in, an r5 1600 managed to bring that upwards of 60 frames. why should spiderman be the exception to the rule, especially considering it utilizes multithreading gracefully and directx12?
 
Last edited:

yamaci17

Member
Adding a variable like a different cpu in what's supposed to be a GPU benchmark makes the bench invalid, it's what alex battaglia does a lot
nope, quite the opposite. it makes sure the GPU is well fed.

on consoles, games are designed around extracting the GPU's maximum potential, especially with dynamic resolution. they target 4k but drop the resolution whenever the GPU is pressured. if the majority of games were CPU bound, you wouldn't see resolution drops.
hence, you need to be free of CPU-boundness when testing similar GPUs on desktop, and to make sure of that, you gotta throw something good at it. as you can see, the IGN gamer constantly runs into bottlenecks with his 2700x on his 2070. it is clear that the CPUs in consoles punch above their weight, otherwise the 2700x would not drop frames in that manner. it happens in many games, and lots of devs say that plenty of work that costs CPU time on desktop is practically free on consoles. so a 1700x or 2700x being theoretically equal to the ps5 doesn't mean a thing. the thing can practically output near-unlimited drawcalls thanks to its close-to-the-metal API and unified architecture. drawcall bottlenecks on desktop have been a pain for gamers since the 2010s.
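to make that concrete, here's a rough sketch of what a dynamic resolution heuristic looks like (the numbers and names are made up for illustration, not insomniac's actual code):

```cpp
#include <algorithm>

// rough sketch of a dynamic resolution heuristic (illustrative only)
struct DynamicRes {
    double targetMs    = 16.6;  // 60 fps frame budget
    float  renderScale = 1.0f;  // 1.0 = native target res, lower = fewer pixels

    void update(double gpuFrameTimeMs) {
        // GPU missed its budget -> shed pixels; GPU has headroom -> claw
        // resolution back. note the CPU cost per frame barely changes,
        // which is why DRS only rescues GPU-bound frames.
        if (gpuFrameTimeMs > targetMs)
            renderScale -= 0.05f;
        else if (gpuFrameTimeMs < targetMs * 0.85)
            renderScale += 0.02f;
        renderScale = std::clamp(renderScale, 0.5f, 1.0f);
    }
};
```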
 
Last edited:
I don't think you understand what I'm saying. If you're allowing the pc to remain non-cpu-bottlenecked but still allowing the ps5 to be cpu bottlenecked, then how is that fair? Remember, this isn't a full rig benchmark, this is supposed to be purely a gpu benchmark. I used the example earlier of pairing different cpus with pc gpus in what's supposed to be a gpu benchmark to show how ridiculous it is to expect the same results.
 

yamaci17

Member
I don't think you understand what I'm saying. If you're allowing the pc to remain non-cpu-bottlenecked but still allowing the ps5 to be cpu bottlenecked, then how is that fair? Remember, this isn't a full rig benchmark, this is supposed to be purely a gpu benchmark. I used the example earlier of pairing different cpus with pc gpus in what's supposed to be a gpu benchmark to show how ridiculous it is to expect the same results.
the ps5 is not cpu bottlenecked in its own controlled environment with gpu-bound settings / gpu-bound resolution targets

how do you know the PS5 is cpu bottlenecked? do you have gpu/cpu metrics at your disposal?

if performance was not being limited by THE GPU, the resolution WOULD NOT drop. in a game where performance is limited BY the GPU, there are no CPU bottlenecks; quite literally, the GPU is the bottleneck, the main performance limiter. if a game DROPS resolution, it drops it because the GPU is not up to the task at the given resolution. hence, it is GPU limited. hence, it is GPU bound. hence, IT IS NOT BOTTLENECKED BY THE CPU.
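the logic boils down to something like this (hypothetical timings, just to show the reasoning):

```cpp
#include <cstdio>

// whichever side needs more time per frame is the bottleneck
int main() {
    double cpuMs = 9.0;   // time the CPU takes to build and submit a frame
    double gpuMs = 14.5;  // time the GPU takes to render it

    double frameMs = (cpuMs > gpuMs) ? cpuMs : gpuMs;  // slower side wins
    std::printf("%s-bound, %.1f fps\n",
                gpuMs >= cpuMs ? "GPU" : "CPU", 1000.0 / frameMs);
    // dropping resolution shrinks gpuMs but leaves cpuMs untouched, so a
    // game that recovers 60 fps by dropping res was GPU limited, full stop
    return 0;
}
```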
 
Last edited:
I don't think you can guarantee there are no cpu bottlenecks
 

yamaci17

Member
I don't think you can guarantee there are no cpu bottlenecks
we can, if a game uses aggressive resolution scaling to lock to 60

do you think the ps5 in its vrr uncapped native 4k rt mode is being limited by the CPU at a mere 40-50 frames? it's not. it is heavily GPU bound.

don't worry, i will do console-equivalent 4k native rt benchmarks with my 1700x-equivalent CPU parameters on my 3070.
 

GametimeUK

Member
I just got my order confirmation email for my Steam Deck. I'm going to enjoy playing this game on my LG C9 OLED for the main missions and the Steam Deck for exploration and side quests/collectibles. Woo.
 
do you think the ps5 in its vrr uncapped native 4k rt mode is being limited by the CPU at a mere 40-50 frames? it's not. it is heavily GPU bound.
It’s in the performance rt mode where there is more dynamic res scaling
 

rofif

Can’t Git Gud
512gb, UK, 31st July 2021. :)

If you want to check your progress, have you seen the steam deck email calculator on reddit?
I only ordered this March…. So am ded.
Lol just checked. 6% there!
14 days were processed out of 250 days of orders lol
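for what it's worth, the math behind that percentage is trivial (a sketch using the numbers above):

```cpp
#include <cstdio>

// queue progress = days of orders processed / days of orders in the queue
int main() {
    double processedDays  = 14.0;   // days of orders valve has worked through
    double totalQueueDays = 250.0;  // days of orders in the whole queue
    std::printf("%.1f%% of the queue processed\n",
                100.0 * processedDays / totalQueueDays);  // ~5.6%, i.e. ~6%
    return 0;
}
```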
 
Last edited:

StreetsofBeige

Gold Member


Indiana Jones Reaction GIF


I've Platinumed the console version, but I'll definitely grab this in a Steam or EGS sale. Whoever gets it to $25 first.

LOL. They didn't have correct spelling in that tweet. Also, if you skim the comments, a shit load of responses came up on the day they announced it for PC. Insomniac didn't even have the balls to respond back.
 

Midn1ght

Member
I only ordered this March…. So am ded.
Lol just checked. 6% there!
14 days were processed out of 250 days of orders lol
What model did you order? Had a look on reddit and it seems like the 64gb is slower than the other two. Also, a lot of people are making huge jumps (40-50%) out of nowhere after being stuck in single digits for a long time, so it seems like Valve is definitely increasing production.

I ordered a 64GB EU early March and I'm at 10%. Valve confirmed we'll get it this year so it's all good.
 

rofif

Can’t Git Gud
What model did you order? Had a look on reddit and it seems like the 64gb is slower than the other two. Also, a lot of people are making huge jumps (40-50%) out of nowhere after being stuck in single digits for a long time, so it seems like Valve is definitely increasing production.

I ordered a 64GB EU early March and I'm at 10%. Valve confirmed we'll get it this year so it's all good.
256
 

ClosBSAS

Member
Lol, cult of the lamb is number one on steam best sellers now. spiderman won't sell more than GOW. I am now sure of it.
 
even if you have a b350 board, nowadays you can actually upgrade to zen 3 lol
Cheers for bringing this to my attention, I wasn't aware you could now stick a 5000 series in b350 boards.
 

yamaci17

Member
Cheers for bringing this to my attention, I wasn't aware you could now stick a 5000 series in b350 boards.
yup, my friend upgraded from a 1700x to a 5600x on his b350 tomahawk

the difference was astounding: up to 2 times the performance in a lot of games. we used to play heroes of the storm together, and the 1700x would drop to the 90s in 5v5 fights; now it doesn't budge below 200 lol (not kidding).



and usually you get a 40-70% uplift over zen/zen+. look at the hitman section, literally a 70% improvement.

zen 3 sometimes destroys zen 2 as well in certain situations, like flight simulator;



here, zen 3 casually murders zen 2 by 50-60% in flight simulator

as i said, it is really pointless to fixate on zen/zen+/zen 2.

it is important to note that nvidia drivers use 20% more CPU cycles for scheduling. this is the case with all CPUs, so you really gotta feed those rtx 3000 series cards with a minimum of something like a 5600x, or just live with the consequences if you choose not to.
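to put that 20% in perspective, here's a back-of-envelope (the frame cost is purely illustrative):

```cpp
#include <cstdio>

// how a 20% driver scheduling overhead moves a CPU-bound frame rate cap
int main() {
    double cpuMsPerFrame = 10.0;                  // hypothetical CPU frame cost
    double withOverhead  = cpuMsPerFrame * 1.20;  // +20% driver scheduling cost
    std::printf("CPU-bound cap: %.0f fps -> %.0f fps\n",
                1000.0 / cpuMsPerFrame, 1000.0 / withOverhead);  // 100 -> 83
    return 0;
}
```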
 
Last edited:

That's amazing. I have a 3600 at the moment but no GPU 🤣 (did have a 3070 but sold it); I bought a Legion 5 Pro 3070 laptop last year though. Might pick up the next-gen 4060 Ti/4070 or the amd equivalent next year if they play well at 4k. If not, I'll upgrade the CPU to the 5000 series too, so that's handy to know. Thanks again!
 
Last edited:

rodrigolfp

Haptic Gamepads 4 Life
95% of the time I didn't even look at the reflections when I was playing Spider-Man and Miles Morales on PS5. Was just enjoying the game like a normal person.
Then you wasted money on the reflections. They could have sold the product without them, and for cheaper.
when you spend 3 grand on a monster rig it becomes compulsory to stare and nitpick at reflections and stuff
They paid for the reflections, so there's nothing strange about staring at them. If you don't, it's your loss/waste of money.
 
Last edited:

RoadHazard

Gold Member
95% of the time I didn't even look at the reflections when I was playing Spider-Man and Miles Morales on PS5. Was just enjoying the game like a normal person.

Yeah, I checked out the reflections for a few minutes because it was the first RT game I played (MM on PS5, the original I played on PS4 Pro), but then I just played the game. Having proper reflections does add something when you're swinging through the city VS static cubemaps, so I played in RT Performance mode, but the relatively low reflection quality never bothered me one bit. It's something that only really becomes noticeable when you're right up against a shiny building and looking for it, it doesn't matter during regular gameplay.
 

JCK75

Member
All I know is I'm crazy excited this came out on Steam. All I need now is for Ghost of Tsushima to come to Steam, and my Steam Deck will have made my PS4 Pro pointless.
 

01011001

Banned
95% of the time I didn't even look at the reflections when I was playing Spider-Man and Miles Morales on PS5. Was just enjoying the game like a normal person.

95% of the time you also don't look at the shadows cast by the buildings around you, but they are there...

the thing with effects like these is that they're supposed to just be there without you even thinking about them.

because the moment one stands out, it's either something you've never seen before, a "wow" moment... or something that sticks out and looks off.

back in the PS2 days, reflections never stood out to me. they usually looked serviceable and weren't offensive.
techniques used back then, like render-to-texture, planar reflections, or simply hand-placed low-detail assets beneath a transparent texture, were good-looking enough, didn't stand out in any negative way, and were often just a nice addition to a game's graphical makeup.
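for reference, the old-school planar trick is dead simple: mirror the camera across the reflective plane and render the scene into a texture. here's a rough sketch (Vec3 and the helpers are placeholders, not any particular engine's API):

```cpp
// minimal vector helpers for the sketch
struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b)    { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// reflect a point across the plane through planePoint with unit normal n
Vec3 mirrorAcrossPlane(Vec3 p, Vec3 planePoint, Vec3 n) {
    float d = dot(sub(p, planePoint), n);  // signed distance to the plane
    return sub(p, scale(n, 2.0f * d));     // step back twice that distance
}
// render the scene from mirrorAcrossPlane(cameraPos, ...) into a texture,
// then sample it on the floor: the reflection stays stable no matter what
// is or isn't on screen, which is exactly what SSR can't guarantee
```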

but as games got more and more detailed, developers started looking for solutions that would let them push super-high-detail graphics and still have a cheap and simple way to throw in some reflective surfaces.
basically a way to not even think about how to implement reflections and just, almost literally, click a toggle and be done.

and that's where Screen Space Reflections come in.
and that's when it started to become a problem for me.
because unlike the PS2/GC/Xbox-era style of reflections, which were often less detailed and not 100% correct, this new screen-space solution introduced a gigantic number of scenarios where the whole illusion just breaks apart in the ugliest ways possible.

while the low-detail or low-resolution reflections of the PS2 era weren't amazing looking, they all at least had one thing in common: they don't completely break apart simply because the player turns the camera a bit or an object sits on screen in the wrong position.

so since then, with the push for more and more detail, we've needed a better solution for detailed and realistic reflections: basically, to get back to a point where reflections don't look out of place, but also don't look like glitched shit the way Screen Space Reflections often (almost always) do.
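if you want to see why SSR breaks, here's roughly what the technique does under the hood (a simplified sketch with made-up buffer names, not any engine's real pass):

```cpp
// SSR: march a reflected ray through the depth buffer in screen space
// and reuse whatever on-screen pixel it hits
struct Hit { bool found; float u, v; };

Hit marchScreenSpaceRay(float u, float v,           // start pixel (0..1)
                        float du, float dv,         // ray step in screen uv
                        float depth, float ddepth,  // ray depth + step
                        const float* depthBuffer, int w, int h,
                        int maxSteps) {
    for (int i = 0; i < maxSteps; ++i) {
        u += du; v += dv; depth += ddepth;
        // this is where the illusion dies: the moment the ray leaves
        // the screen, the thing it should reflect simply doesn't exist
        if (u < 0.f || u > 1.f || v < 0.f || v > 1.f)
            return {false, 0.f, 0.f};
        int px = int(u * (w - 1)), py = int(v * (h - 1));
        if (depth >= depthBuffer[py * w + px])
            return {true, u, v};  // hit: reuse that on-screen pixel's color
    }
    return {false, 0.f, 0.f};     // miss: fall back to a cubemap, or glitch
}
```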


and to see for yourself just how far reflections in games have fallen from the PS2/GameCube/Xbox era to today, just compare how excellently Super Mario Sunshine's first level handles reflections, then look at Altissia in Final Fantasy XV... with sailboats basically creating real-time graphical glitches wherever they go due to the use of Screen Space Reflections.
 
Last edited:

Stuart360

Member
I couldn't give two shits about ray tracing (which is why i upgraded to a 1080ti over an RTX card), but this game is one of the only times RT would be worthwhile, due to the sheer number of windows in the game.
It will be interesting to see if the 10-series cards can actually do RT in this, like in other games such as Control.
 

Pedro Motta

Member
I couldn't give two shits about ray tracing (which is why i upgraded to a 1080ti over an RTX card), but this game is one of the only times RT would be worthwhile, due to the sheer number of windows in the game.
It will be interesting to see if the 10-series cards can actually do RT in this, like in other games such as Control.
Maybe it supports it, like some games before, but most likely it will tank performance.
 

01011001

Banned
I couldn't give two shits about ray tracing (which is why i upgraded to a 1080ti over an RTX card), but this game is one of the only times RT would be worthwhile, due to the sheer number of windows in the game.
It will be interesting to see if the 10-series cards can actually do RT in this, like in other games such as Control.

I think I've seen a video where someone tests it on a 1070 and the RT settings are greyed out.

I wonder if that actually means they aren't supported, or if maybe the guy didn't have up-to-date drivers.

I would love to see it run with RT at console-equivalent settings on a 1080ti
 