
[NxGamer] Spider-Man Remastered PC vs PS5 vs Steam Deck vs PS4 vs 750Ti - Technical Review & Comparison

Mr Moose

Member
No idea honestly, even native 1080 with ray tracing is not kind to 6600xt with maxed out Ray Tracing.
WycTizv.jpg



However "high" RT is more forgiving for it

ykyVAqc.jpg
I might just go with the Asus 3060 Phoenix instead.
Got a 550W Corsair PSU so I can't get anything too much better lol.
 

01011001

Banned
"even when the higher VRAM card has slower raster and RT performance..." also add that the higher VRAM card has slower bandwidth, slower texture fillrate, and a narrower memory bus. I'm sure this will create enough controversy that Nixxes may come out with an explanation. I really look forward to their reasoning for falling back to 5.2 GB of VRAM usage when the game reaches 6.4 GB, and then using 4.5 GB of ordinary shared memory, causing 50-70% slowdowns. I wonder if it's an actual intended solution on their part. It still seems ridiculous to me that the game falls back to 5.2 GB after reaching 6.4 GB and substitutes 4.5 GB of normal RAM. If it needs a 9.8 GB budget, it could still use 7.4 GB of actual VRAM and take another 2 GB from RAM. Instead it falls back to 5.2 GB, potentially leaving an almost empty buffer of 3 GB. They should understand that it is not logical to sacrifice 50% of performance for this solution. Just let frames drop to 3-5 and inform the user that their card is not suited to those textures. That would prevent that video from existing entirely.
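The budget arithmetic in that complaint can be sketched numerically. All figures come from the post itself, the 8 GB card size is an assumption, and none of this is Nixxes' actual allocator logic:

```python
# Illustration only: figures are from the post above; the 8 GB card
# size is an assumption, and this is not the game's actual allocator.
TOTAL_VRAM = 8.0   # GB, assumed card size
NEEDED = 9.8       # GB, total budget claimed in the post
USABLE = 7.4       # GB of VRAM the post says could safely be used

def observed_split(vram_fallback=5.2, shared=4.5):
    """What the post reports: the game retreats to 5.2 GB of VRAM
    and pushes 4.5 GB into slow shared system memory."""
    return {"vram": vram_fallback, "ram": shared,
            "idle_vram": round(TOTAL_VRAM - vram_fallback, 1)}

def proposed_split():
    """The post's alternative: fill VRAM to 7.4 GB and spill only
    the remainder of the 9.8 GB budget into system RAM."""
    return {"vram": USABLE, "ram": round(NEEDED - USABLE, 1),
            "idle_vram": round(TOTAL_VRAM - USABLE, 1)}

print(observed_split())  # ~2.8 GB of VRAM sits idle while 4.5 GB lives in RAM
print(proposed_split())  # only ~2.4 GB spills to RAM, ~0.6 GB idle
```

Either way some data ends up in system RAM; the difference is how much of it crosses the slow path every frame.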

Some people theorized that it leaves VRAM headroom for sudden turns. I actually have a rebuttal to that: I FILLED that empty VRAM buffer with Twitch Studio, Chrome, Spotify and Discord. These four programs together filled the buffer, and THE GAME still performed exactly as it did before. This proves that that portion of VRAM is never touched by the game's engine. I just find this design horrible, however you put it. I hope Alex can raise this with Nixxes so we can get it rectified.

it's IMO pretty clear that the VRAM behaviour is a remnant of the PS5 version.
the game is most likely built to use the memory scrubbers and fast SSD to stream data in and out on the fly. it could be a way for Insomniac to get used to the PS5 hardware by implementing features they want to use in the future in early titles like this that don't even necessarily need them.

the fact that it doesn't even come close to utilising the full VRAM pool is fucking weird tho even with this explanation
 
Last edited:

Loxus

Member
explain how higher VRAM cards have better performance than lower VRAM cards then, even when the higher VRAM card has slower raster and RT performance...

also explain how performance stabilises once you lower texture settings.

these things can only be explained with the game running out of VRAM for no apparent reason
If you really understood how video games work, you would know two things when it comes to VRAM.

If VRAM is full, the game crashes to desktop or the game starts to use the CPU's RAM.

As long as the game doesn't use all the VRAM, performance is the same. Whether it's using 4GB or 7.9GB, fps is going to be the same as long as it doesn't require more VRAM than what the GPU has.
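The two outcomes described here (crash versus spilling into system RAM) can be modelled with a toy allocator. `ToyAllocator` and `OutOfMemory` are made-up names for illustration, not any real engine API:

```python
class OutOfMemory(Exception):
    """Stands in for the crash-to-desktop case described above."""

class ToyAllocator:
    """Toy model, not any real engine: an allocation that no longer
    fits in VRAM either raises (crash) or spills into system RAM."""
    def __init__(self, vram_gb, spill_to_ram=False):
        self.vram_free = vram_gb
        self.spill_to_ram = spill_to_ram
        self.in_ram = 0.0

    def alloc(self, size_gb):
        if size_gb <= self.vram_free:
            self.vram_free -= size_gb
            return "vram"   # fits: performance unaffected
        if self.spill_to_ram:
            self.in_ram += size_gb
            return "ram"    # fits, but every access is much slower
        raise OutOfMemory("more video memory requested than available")

gpu = ToyAllocator(vram_gb=8.0, spill_to_ram=True)
print(gpu.alloc(6.5))  # "vram" -- 6.5 of 8 GB used, full speed
print(gpu.alloc(2.0))  # "ram"  -- doesn't fit in the remaining 1.5 GB
```

With `spill_to_ram=False` the second call raises instead, which matches the crash message Insomniac's error dialog describes.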

Insomniac confirmed the game crashes if it uses more VRAM than the GPU has.
We know Spider-Man stays in the 6-7GB range, so it's not VRAM starved.

It's been proven by the studio that did the port and others that it's CPU bound. Yet you ignore all this and think it's VRAM starved, because you're looking for excuses over how well the PS5 is performing.
 

01011001

Banned
If you really understood how video games work, you would know two things when it comes to VRAM.

If VRAM is full, the game crashes to desktop or the game starts to use the CPU's RAM.

As long as the game doesn't use all the VRAM, performance is the same. Whether it's using 4GB or 7.9GB, fps is going to be the same as long as it doesn't require more VRAM than what the GPU has.

Insomniac confirmed the game crashes if it uses more VRAM than the GPU has.
We know Spider-Man stays in the 6-7GB range, so it's not VRAM starved.

It's been proven by the studio that did the port and others that it's CPU bound. Yet you ignore all this and think it's VRAM starved, because you're looking for excuses over how well the PS5 is performing.

explain how SLOWER cards with more VRAM run the game better than FASTER cards with less VRAM.

I'll wait.
 

yamaci17

Member
I might just go with the Asus 3060 Phoenix instead.
Got a 550W Corsair PSU so I can't get anything too much better lol.

Yeah, it seems like NVIDIA will push 240W through their pathetic 4060. There are also rumors it will now be 8 GB on a 128-bit bus. I wouldn't be surprised if they also unveil a 10 GB 4070 with a 280W TDP at this point. They've completely lost themselves. Now there are rumored 8 GB 3060s, and a 3060 Ti with 8 GB of GDDR6X. It all becomes a big joke when you consider how controversial even this game is about VRAM. What good will 8 GB of GDDR6X do a 3060 Ti? You need GDDR6X for very high resolutions, yet the card doesn't have enough VRAM to get there. It's just too funny to me.
 

Loxus

Member
That's without very high textures though.
These are how the cards perform in general without RT.
hGgwStb.jpg


I don't get how you guys think having more VRAM = better performance. If that was the case, we would have 40GB cards by now.
 
Last edited:

yamaci17

Member
exactly, that's the issue, the textures below very high are also below PS5 quality.

so only very high texture tests are of interest here
actually, most of the high textures do hold up, to the point that most of the community isn't upset about them.
when you compare RDR 2's high and ultra textures, Spider-Man's high textures are actually a blessing XD at least they look respectable.
 

01011001

Banned
These are how the cards perform in general.
hGgwStb.jpg


I don't get how you guys think having more VRAM = better performance. If that was the case, we would have 40GB cards by now.

high textures (which is what's used in the graph you showed) are below PS5 quality.

these are not of interest.
as soon as you use the highest texture settings the faster cards perform worse than the slower cards with more vram, even tho these slower cards also have lower bandwidth and the vram pool of either type of cards never gets fully utilised.

the port has major issues here and is not a good PC port, very simple
 
Last edited:

Loxus

Member
high textures (which is what's used in the graph you showed) are below PS5 quality.

these are not of interest.
as soon as you use the highest texture settings the faster cards perform worse than the slower cards with more vram, even tho these slower cards also have lower bandwidth and the vram pool of either type of cards never gets fully utilised.

the port has major issues here and is not a good PC port, very simple
What does that graph have to do with PS5?
That graph shows a comparison of the other cards and shows that VRAM size doesn't affect performance.

The only thing VRAM does is store files.
The bigger the files, the more VRAM is needed.

How fast you can access those files, is what determines performance.
 
Last edited:

01011001

Banned
What does that graph have to do with PS5?
That graph shows a comparison of the other cards and shows that VRAM size doesn't affect performance.

The only thing VRAM does is store files.
The bigger the files, the more VRAM is needed.

How fast you can access those files, is what determines performance.

you do understand that higher res textures need more vram right?

I am confused how you can't grasp this very simple state of affairs.

if you use very high textures the cards with less vram perform worse even tho they are faster cards and their vram pool never gets fully used... this is not hard to understand...

an 8GB card will not use even close to the full 8GB and will perform worse than a slower 12GB card
 

Loxus

Member
you do understand that higher res textures need more vram right?

I am confused how you can't grasp this very simple state of affairs.

if you use very high textures the cards with less vram perform worse even tho they are faster cards and their vram pool never gets fully used... this is not hard to understand...

an 8GB card will not use even close to the full 8GB and will perform worse than a slower 12GB card
The quality of the textures and how big they are is what determines how much VRAM is needed.

Again, how fast you can access those files is what determines performance. Which is why performance drops if the game uses the CPU RAM, because RAM bandwidth is much slower than GDDR bandwidth.

Spider-Man crashes as soon as it becomes VRAM starved. It doesn't use the CPU's RAM.

Spider-Man is a PS4 game, the VRAM requirements aren't that high, and the PS4 only had a bandwidth of 176 GB/s.
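A back-of-envelope calculation shows why the bandwidth argument matters so much. The 176 GB/s PS4 figure comes from the post; the GDDR6 and DDR4 numbers are assumed typical peaks, not measurements:

```python
# Back-of-envelope only: the 176 GB/s PS4 figure is from the post above;
# the other peak-bandwidth numbers are assumed typical, not measured.
BANDWIDTH_GBPS = {
    "GDDR6 (e.g. 3060 Ti)": 448.0,
    "PS4 GDDR5": 176.0,
    "DDR4-3200 dual channel": 51.2,
}

WORKING_SET_GB = 4.0  # hypothetical per-frame asset traffic, purely illustrative

for name, bw in BANDWIDTH_GBPS.items():
    ms = WORKING_SET_GB / bw * 1000.0
    print(f"{name}: {ms:.1f} ms to move {WORKING_SET_GB} GB")
```

A 60 fps frame budget is about 16.7 ms, so traffic that GDDR6 absorbs in roughly 9 ms would take nearly 80 ms through system RAM; that order-of-magnitude gap is why any spill into RAM shows up immediately in frame times.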
 

SlimySnake

Flashless at the Golden Globes
I think the 3070 vs 2080 Ti results here should put to rest any concerns over the VRAM being the bottleneck. Both are virtually identical in theoretical compute. 2080 Ti has 11 GB vs the 8 GB 3070.

The 3070 offers slightly better performance despite having the same amount of VRAM as the 2070. This is about the CPU. You can read the Nixxes interview. VRAM almost never comes up as a bottleneck. The performance increases going up from the 3080 to 3090 Ti are also in line with their compute performance. Going from 10 GB to 24 GB isn't increasing the performance by 2.4x or vice versa.

4K-RT.png
 
Last edited:

Md Ray

Member
if you can explain why you found my other post "funny", maybe I can answer your question then
Did that trigger you?

Couldn't care less if you answered my question. I went home and tested it myself on my 3070.
 
Last edited:
I think the 3070 vs 2080 Ti results here should put to rest any concerns over the VRAM being the bottleneck. Both are virtually identical in theoretical compute. 2080 Ti has 11 GB vs the 8 GB 3070.

The 3070 offers slightly better performance despite having the same amount of VRAM as the 2070. This is about the CPU. You can read the Nixxes interview. VRAM almost never comes up as a bottleneck. The performance increases going up from the 3080 to 3090 Ti are also in line with their compute performance. Going from 10 GB to 24 GB isn't increasing the performance by 2.4x or vice versa.

4K-RT.png
Now post very high textures benchmarks.
 
this is a sub par PC port that clearly needs some serious work with VRAM usage being fucking weird.

is it really so hard to agree on that? properly made PC games never will get into issues with VRAM on 8GB cards like this.
you can play Cyberpunk with max texture settings and have no issues like these. that game will use about 7.5GB of vram, sometimes a bit more, sometimes a bit less, and has way higher object and detail density.

the fact that the likes of DF and NXG still call this a good PC port is ridiculous.. it's serviceable, that's it... serviceable... not great, not good... it's playable, but it clearly underperforms and has CPU and VRAM issues.
This is why DF's word can't be taken as gospel. You already knew it was suspect with how much they were downplaying the frame time spikes and frame pacing issues on mid and low end cards.
 

SatansReverence

Hipster Princess
Like I said above, if it was VRAM starved or running out of VRAM, it would crash to desktop.

Even Insomniac Games confirmed this.
Video Memory Crash
"The game has crashed due to using more video memory than currently available on this PC. Please try lowering your graphics settings, lowering your resolution, and closing any unnecessary background applications before running the game again."


Performance is only lost when the game starts using RAM, which is not the case with Spider-Man.

Can we move past this VRAM-starved nonsense?
You really are fucking clueless and pathetically desperate to defend your precious plastic box.
 

Loxus

Member
You really are fucking clueless and pathetically desperate to defend your precious plastic box.
Explain to me how the size of the VRAM affects performance. The only thing the size of the VRAM affects is texture quality.

Bandwidth is what affects performance, which is why we hear the term bandwidth starved.

I've never heard the term VRAM starved.

Spider-man crashes when it runs out of VRAM.

Spider-man uses only 6-7GB out of 8GB, so it's not running out of VRAM.

The 3070 with 8GB of VRAM still outperforms the 2080 Ti with 11GB and still doesn't run out of VRAM maxed out at 4K.

If you had gone to school more often, you would have basic understanding skills.
 

Loxus

Member
It's been explained to you multiple fucking times in this thread.

Now STFU and accept that you're wrong already.
Those guys don't bring hard evidence.
I showed you confirmation from Insomniac: Spider-Man crashes when it runs out of VRAM.


Digital Foundry and Nixxes, the studio that did the port, say it's CPU bound because the CPU handles RT BVH management.

You guys purposely ignore official information because you can except the benchmark results.
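To see why BVH management lands on the CPU at all, here is a toy sketch (in no way Nixxes' or Insomniac's actual code) of the per-frame refit work: every moving object's bounding box must be recomputed and merged before the GPU can trace rays against the scene, and that work scales with object count:

```python
# Toy sketch, not Nixxes'/Insomniac's code: refitting a BVH means
# recomputing and merging axis-aligned bounding boxes for every moving
# object each frame -- O(n) CPU work before the GPU can trace rays.
from dataclasses import dataclass

@dataclass
class AABB:
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner

def merge(a, b):
    """Smallest box containing both inputs."""
    return AABB(tuple(map(min, a.lo, b.lo)), tuple(map(max, a.hi, b.hi)))

def refit(leaf_boxes):
    """Bottom-up refit: pairwise-merge until one root box remains."""
    boxes = list(leaf_boxes)
    while len(boxes) > 1:
        boxes = [merge(boxes[i], boxes[i + 1]) if i + 1 < len(boxes) else boxes[i]
                 for i in range(0, len(boxes), 2)]
    return boxes[0]

# e.g. a thousand moving cars/pedestrians in an open-world frame
leaves = [AABB((float(i), 0.0, 0.0), (i + 1.0, 1.0, 1.0)) for i in range(1000)]
root = refit(leaves)
print(root.lo, root.hi)  # root box spans all 1000 leaves
```

In a dense open world with heavy traffic and crowds, this per-frame cost is one plausible reason the port's bottleneck shows up on the CPU rather than the GPU.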
 

01011001

Banned
it crashes when it runs out of VRAM.

you say that over and over and I don't see how that is in any way relevant.

maybe that's actually the reason why it runs like shit. the engine is so unoptimised for PC that they had to make sure to never run out of VRAM, this could explain why it doesn't even come close to utilising the full VRAM pool of any GPU.

so it has a hard limit on how much it can use and then just removes and loads assets based on that artificial limit.

so "it crashes when it runs out of VRAM" doesn't mean what I think you think it means my guy... it makes it an even worse port than one would think, because that's not how a well made PC game should ever work. it's almost a confession by them that the port sucks
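That guessed-at behaviour could be modelled as a soft cap with least-recently-used eviction. The 65% cap fraction is chosen only because it reproduces the 5.2 GB figure from earlier in the thread, and `SoftCapCache` is entirely hypothetical:

```python
# Speculative model of the "artificial limit" guessed at above; the 65%
# cap is a guess chosen to reproduce the 5.2 GB figure from earlier in
# the thread. SoftCapCache is entirely hypothetical.
from collections import OrderedDict

class SoftCapCache:
    def __init__(self, vram_gb=8.0, cap_fraction=0.65):
        self.cap = vram_gb * cap_fraction   # 5.2 GB on an 8 GB card
        self.used = 0.0
        self.assets = OrderedDict()         # name -> size, oldest first

    def load(self, name, size_gb):
        # Evict oldest assets until the new one fits under the cap,
        # never touching the headroom between the cap and physical VRAM.
        while self.used + size_gb > self.cap and self.assets:
            _, old_size = self.assets.popitem(last=False)
            self.used -= old_size
        self.assets[name] = size_gb
        self.used += size_gb

cache = SoftCapCache()
for i in range(10):
    cache.load(f"texture_{i}", 1.0)
print(cache.used)  # never exceeds 5.2, even though 8 GB is physically there
```

A cap like this would explain both observations in the thread: usage never approaches the full pool, and filling the untouched headroom with other programs changes nothing.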
 
Last edited:

Loxus

Member
you say that over and over and I don't see how that is in any way relevant.

maybe that's actually the reason why it runs like shit. the engine is so unoptimised for PC that they had to make sure to never run out of VRAM, this could explain why it doesn't even come close to utilising the full VRAM pool of any GPU.

so it has a hard limit on how much it can use and then just removes and loads assets based on that artificial limit.

so "it crashes when it runs out of VRAM" doesn't mean what I think you think it means my guy... it makes it an even worse port than one would think, because that's not how a well made PC game should work
It runs that way because of how RT is done on the CPU. Digital Foundry explains this in the video.

Are you saying DF and Nixxes don't know what they are talking about?
 

01011001

Banned
It runs that way because of how RT is done on the CPU. Digital Foundry explains this in the video.

Are you saying DF and Nixxes don't know what they are talking about?

Nixxes has reasons to downplay issues they couldn't get rid of during the porting process, or do you think they'll go out there and tell their potential customers "hey we couldn't make this work right btw!"

explain why faster GPUs run worse than slower GPUs... explain... just do that...
as soon as you have textures set to very high suddenly performance drops significantly while VRAM is not fully utilised. and weirdly as soon as you do that the slower GPU with more memory has less issues 🤔

how fucking hard is it to look at the fucking numbers and come to that conclusion?

VRAM is not utilised
slower cards with more VRAM suddenly run faster

explain how that is a sign of a good PC port, spoiler it's not, it's exactly the opposite.
 
Last edited:

Zathalus

Member
And this has proven them wrong.
FDuS6tk.jpg
ta21LJT.jpg

now show very high + RT
Already done, and quite clearly 8GB of VRAM is not enough at 4K.


The 2080 Ti outperforming a 3070 by over 20%. The 3060 faster than a 2070 Super. Heck, the 2060 12GB is almost as fast as a 2070 Super.

Obvious what the problem is.
 

Loxus

Member
show very high + RT... all you do is show RT with lower settings or very high without RT...

show Very High + RT :) if you can't give us those tests you have no argument
Dude, the 3070 with 8GB performs fine with max textures at 4K, beating the 2080 Ti with 11GB. Very High RT isn't going to change that.
 