"even when the higher VRAM card has slower raster and RT performance..." Also add: even when the higher VRAM card has slower bandwidth, slower texture fillrate, and a narrower memory bus. I'm sure this will create enough controversy that Nixxes may come out and offer an explanation. I really look forward to their reasoning for falling back to 5.2 GB of VRAM usage when the game reaches 6.4 GB, and then using 4.5 GB of regular shared memory, causing 50-70% slowdowns. I wonder if it's an actual intended, logical solution they made. It still seems ridiculously stupid to me that the game falls back to 5.2 GB after reaching 6.4 GB and uses 4.5 GB of normal RAM as a substitute. If it needs a 9.8 GB budget, it could still use 7.4 GB of actual VRAM and then take another 2 GB from RAM. Instead, it falls back to 5.2 GB, potentially leaving an almost empty buffer of 3 GB. They should understand that it is not logical to sacrifice 50% of performance for this solution. Just make it so that frames drop to 3-5 and inform the user that their card is not suited to those textures. That would have prevented that video from being made at all.
Some people theorized that it leaves VRAM free for sudden turns. I actually have a rebuttal to that: I FILLED that empty VRAM buffer with Twitch Studio, Chrome, Spotify and Discord. Those 4 programs together filled the buffer, and THE GAME still performed exactly as it did before. This proves that that portion of VRAM is never touched by the game's engine. I just find this design horrible, however you put it. I hope Alex can communicate with Nixxes about this. Then we can rectify certain inequities.
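To make the complaint above concrete, here is a minimal Python sketch of the two budgeting strategies being contrasted, using the GB figures from the comment. The function names and the exact thresholds are invented for illustration; nobody outside Nixxes knows the engine's real heuristic.

```python
# Hypothetical sketch of two VRAM budgeting strategies (all values in GB).
# Numbers come from the comment above; the logic itself is a guess.

def observed_fallback(required, fallback=5.2, soft_cap=6.4):
    """What the port appears to do: once the working set exceeds a soft
    cap, drop VRAM usage to a fixed fallback level and push the rest to
    shared system RAM, leaving much of the VRAM buffer idle."""
    if required <= soft_cap:
        return required, 0.0           # (VRAM used, RAM spilled)
    return fallback, required - fallback

def proposed_budget(required, vram_total=8.0, headroom=0.6):
    """What the commenter suggests: fill VRAM up to a small headroom
    margin first, and only spill the remainder to system RAM."""
    usable = vram_total - headroom     # e.g. 7.4 GB of an 8 GB card
    if required <= usable:
        return required, 0.0
    return usable, required - usable

# With a 9.8 GB working set:
vram_a, ram_a = observed_fallback(9.8)  # ~5.2 in VRAM, ~4.6 spilled to RAM
vram_b, ram_b = proposed_budget(9.8)    # ~7.4 in VRAM, only ~2.4 spilled
```

The difference in spilled gigabytes is what the comment's 50-70% slowdown claim hinges on, since system RAM traffic goes over the much slower PCIe bus.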
"If you really understood how video games work, you would know two things about VRAM."
Then explain how higher VRAM cards get better performance than lower VRAM cards, even when the higher VRAM card has slower raster and RT performance...
also explain how performance stabilises once you lower texture settings.
these things can only be explained by the game running out of VRAM for no apparent reason
If you really understood how video games work, you would know two things about VRAM.
If VRAM is full, the game crashes to desktop or the game starts to use the CPU's RAM.
As long as the game doesn't use all the VRAM, performance is the same. It can be using 4GB or 7.9GB, fps is going to be the same as long as it doesn't require more VRAM than what the GPU has.
Insomniac confirmed the game crashes if it uses more VRAM than the GPU has.
We know Spider-Man stays in the 6-7GB range, so it's not VRAM starved.
It's been proven by the studio that did the port, and by others, that it's CPU bound. Yet you ignore all this and insist it's VRAM starved, because you're looking for excuses for how well the PS5 is performing.
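The claim a few comments up (performance is flat until the game spills into system RAM) can be illustrated with rough bandwidth arithmetic. The figures below are assumptions, not measurements: ~448 GB/s for a 3070-class card's GDDR6 and ~32 GB/s for a PCIe 4.0 x16 link; real engines overlap transfers, so this is a worst-case ceiling, not a prediction.

```python
# Why spilling to system RAM hurts: average bandwidth collapses when
# even a small fraction of the working set lives behind the PCIe bus.
# Peak figures below are assumed round numbers, not benchmarks.

def effective_bandwidth(resident_frac, vram_bw=448.0, pcie_bw=32.0):
    """Harmonic-mean style estimate (GB/s) of average access bandwidth
    when only `resident_frac` of the working set is resident in VRAM."""
    spilled = 1.0 - resident_frac
    return 1.0 / (resident_frac / vram_bw + spilled / pcie_bw)

print(effective_bandwidth(1.0))  # everything in VRAM: full speed
print(effective_bandwidth(0.8))  # just 20% spilled: well under a third
```

This is consistent with both sides of the argument: as long as nothing spills, VRAM size genuinely doesn't matter, but the moment it does, the slowdown is dramatic.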
I might just go with the Asus 3060 Phoenix instead.
Got a 550W Corsair PSU so I can't get anything too much better lol.
"actually, most of the high textures do hold up, to a point most of the community is not disgruntled towards it"
exactly, that's the issue, the textures below very high are also below PS5 quality.
so only very high texture tests are of interest here
These are how the cards perform in general.
I don't get how you guys think having more VRAM = better performance. If that was the case, we would have 40GB cards by now.
"What does that graph have to do with PS5?"
High textures (which is what is used in the graph you showed) are below PS5 quality.
these are not of interest.
As soon as you use the highest texture setting, the faster cards perform worse than the slower cards with more VRAM, even though those slower cards also have lower bandwidth, and the VRAM pool of either type of card never gets fully utilised.
the port has major issues here and is not a good PC port, very simple
What does that graph have to do with PS5?
That graph shows a comparison of the other cards and shows that VRAM size doesn't affect performance.
The only thing VRAM does is store files.
The bigger the files, the more VRAM is needed.
How fast you can access those files, is what determines performance.
"The quality of the textures and how big they are is what determines how much VRAM is needed."
You do understand that higher res textures need more VRAM, right?
I am confused how you can't grasp this very simple state of affairs.
if you use very high textures the cards with less vram perform worse even tho they are faster cards and their vram pool never gets fully used... this is not hard to understand...
an 8GB card will not use even close to the full 8GB and will perform worse than a slower 12GB card
"Now post very high textures benchmarks."
I think the 3070 vs 2080 Ti results here should put to rest any concerns over the VRAM being the bottleneck. Both are virtually identical in theoretical compute. The 2080 Ti has 11 GB vs the 3070's 8 GB.
The 3070 offers slightly better performance despite having the same amount of VRAM as the 2070. This is about the CPU. You can read the Nixxes interview; VRAM almost never comes up as a bottleneck. The performance increases going from the 3080 up to the 3090 Ti are also in line with their compute performance. Going from 10 GB to 24 GB isn't increasing performance by 2.4x, or vice versa.
"This is why DF's word can't be taken as gospel. You already knew it was suspect with how much they were downplaying the frame time spikes and frame pacing issues on mid and low end cards."
this is a sub-par PC port that clearly needs some serious work, with VRAM usage being fucking weird.
is it really so hard to agree on that? Properly made PC games never run into VRAM issues like this on 8GB cards.
you can play Cyberpunk with max texture settings and have no issues like these. That game will use about 7.5GB of VRAM, sometimes a bit more, sometimes a bit less, and has way higher object and detail density.
the fact that the likes of DF and NXG still call this a good PC port is ridiculous.. it's serviceable, that's it... serviceable... not great, not good... it's playable, but it clearly underperforms and has CPU and VRAM issues.
"You really are fucking clueless and pathetically desperate to defend your precious plastic box."
Like I said above, if it was VRAM starved or running out of VRAM, it would crash to desktop.
Even Insomniac Games confirmed this.
Video Memory Crash
"The game has crashed due to using more video memory than currently available on this PC. Please try lowering your graphics settings, lowering your resolution, and closing any unnecessary background applications before running the game again."
Performance is only lost when the game starts using RAM, which is not the case with Spider-Man.
Can we move past this VRAM starved nonsense.
"Explain to me how the size of the VRAM affects performance. The only thing the size of the VRAM affects is texture quality."
You really are fucking clueless and pathetically desperate to defend your precious plastic box.
"Those guys don't bring hard evidence."
It's been explained to you multiple fucking times in this thread.
Now STFU and accept that you're wrong already.
It crashes when it runs out of VRAM.
"It runs that way because of how RT is done on the CPU. Digital Foundry explains this in the video."
you say that over and over and I don't see how that is in any way relevant.
maybe that's actually the reason why it runs like shit. The engine is so unoptimised for PC that they had to make sure it never runs out of VRAM, and this could explain why it doesn't even come close to utilising the full VRAM pool of any GPU.
so it has a hard limit on what percentage of VRAM it can use, and then just removes and loads assets based on that artificial limit.
so "it crashes when it runs out of VRAM" doesn't mean what I think you think it means, my guy... it makes it an even worse port than one would think, because that's not how a well made PC game should work.
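The "artificial limit" the comment above theorizes can be sketched as a self-imposed cap with least-recently-used eviction. Everything here is invented for illustration (class name, the 0.65 cap fraction, the LRU policy); it just happens that 65% of an 8 GB card lands on the 5.2 GB figure reported earlier in the thread.

```python
# Sketch of a hypothetical self-capping VRAM budget with LRU eviction,
# the behaviour the comment theorises. All names and numbers invented.
from collections import OrderedDict

class VramBudget:
    def __init__(self, vram_total_gb, cap_fraction=0.65):
        self.limit = vram_total_gb * cap_fraction  # hard self-imposed cap
        self.used = 0.0
        self.assets = OrderedDict()                # name -> size, LRU order

    def load(self, name, size_gb):
        # Evict least-recently-used assets until the new one fits,
        # even though the card's remaining VRAM may be sitting empty.
        while self.used + size_gb > self.limit and self.assets:
            _, evicted_size = self.assets.popitem(last=False)
            self.used -= evicted_size
        self.assets[name] = size_gb
        self.used += size_gb

budget = VramBudget(vram_total_gb=8.0)  # caps itself at ~5.2 GB
```

Under this model the game never crashes from exhaustion, which would reconcile both users' observations: no crash, but constant churn and an untouched slab of VRAM.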
It runs that way because of how RT is done on the CPU. Digital Foundry explains this in the video.
Are you saying DF and Nixxes don't know what they are talking about?
"Already done, quite clearly 8GB VRAM is not enough at 4k."
now show very high + RT
"Why do you guys keep using that German article? There are many other benchmarks that are more accurate than that."
Already done, quite clearly 8GB VRAM is not enough at 4k.
The 2080 Ti outperforming a 3070 by over 20%. The 3060 faster than a 2070 Super. Heck, the 2060 12GB is almost as fast as a 2070 Super.
Obvious what the problem is.
"Dude, the 3070 with 8GB performs fine with max textures at 4K, beating the 2080 Ti with 11GB. Very High RT isn't going to change that."
show very high + RT... all you do is show RT with lower settings or very high without RT...
show Very High + RT. If you can't give us those tests, you have no argument.