
[NxGamer] Spider-Man Remastered PC vs PS5 vs Steam Deck vs PS4 vs 750Ti - Technical Review & Comparison

01011001

Banned
Why do you guys keep using that German article? There are many other benchmarks that are more accurate than that.

Look, we have an omniscient being among us; this god-like figure knows which benchmark is more accurate simply by looking at bars!

ALL HAIL OUR BENCHMARK OVERLORD!

 

01011001

Banned
Dude, the 3070 with 8GB performs fine with max textures at 4K, beating the 2080 Ti with 11GB. Very High RT isn't going to change that.

show very high + RT... you literally have no argument without that.

very high textures = PS5 quality, anything else is irrelevant
 

Zathalus

Member
Why do you guys keep using that German article? There are many other benchmarks that are more accurate than that.
It's the only one that has RT plus very high textures. No point in linking benchmarks that don't meet those criteria when trying to determine VRAM limitations at 4K.

Plus pcgameshardware.de is extremely reputable.
 

Mr1999

Member
It is sad to see how overpriced PC hardware has become.
It's not that bad. I was able to get a 3070 Ti at MSRP back in March. I have seen little price drop for the 12700K, though. You can sometimes get really good deals on PC hardware; it's raining hardware at the moment and it's great. However, I would never overpay hundreds for hardware either.
 
They didn't do 4K benchmarks with RT enabled, but the 3070 with 8GB still performs a bit better than the 2080 Ti with 11GB at 4K with RT disabled.

Even the 2070 with 8GB performs a bit better than the 3060 with 12GB.
[benchmark chart]
These are from Techspot, yes? I.e. HUB? I think their benchmark was just a jog through a few streets (NOT swinging-based, so the game isn't contending with fast movement). Maybe VRAM isn't much of a constraint in this scenario, as textures don't need to be streamed and swapped in and out of VRAM that fast, and thus the relative performance between cards here reflects their computational power more directly.
 
Alex is completely fucking hopeless, but NX Gamer is very close to getting these things right. I saw the 6600 XT for just $250 the other day. If these fools don't buy it, I might go and grab it myself. But my 8-core/16-thread CPU runs at 4.9-5.1GHz, so it won't be an accurate comparison.
The problem is that it's not about trying to match the PS5, it's about showing what the game potentially has to offer on a decent PC.

That being said, DF used to have a "comparable" build early in the PS4 days (if I remember correctly) and they would pull it out in comparisons (alongside the top PC config). It's always interesting to know; the information you get from both configurations has its use, but showing only the mid-range or console-equivalent PC would not be representative.

Imagine if the game had a PS4 version: you would be forced to test the PC version on an old CPU/GPU combo just for the sake of it, and that is not what the platform has to offer.
 

Loxus

Member
Maybe VRAM isn't much of a constraint in this scenario as textures don't need to be streamed and swapped in and out of VRAM that fast.
Bandwidth is what you're talking about.

[GPU memory bandwidth chart]


The more bandwidth the GPU has, the faster textures can be swapped in and out and accessed by the GPU.

The size of the VRAM determines how much data can be stored in that high-bandwidth memory.
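To put rough numbers on that (a quick back-of-envelope sketch; the bandwidths are spec-sheet figures and the 60fps target is just an example):

```python
# Rough per-frame data budget at a given memory bandwidth. Spec-sheet
# bandwidths; the PCIe figure is the path any spilled data has to take.
def per_frame_gb(bandwidth_gb_s: float, fps: float = 60.0) -> float:
    """GB the bus can theoretically move in a single frame."""
    return bandwidth_gb_s / fps

for name, bw in [("3070 GDDR6", 448.0),
                 ("2080 Ti GDDR6", 616.0),
                 ("PCIe 4.0 x16 (system RAM path)", 32.0)]:
    print(f"{name:31s} ~{per_frame_gb(bw):5.2f} GB per frame at 60fps")
```

Either card can touch several GB of data per frame out of VRAM; anything that has to come from system RAM instead is an order of magnitude slower, which is why capacity and bandwidth are separate questions.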
 
They spent the entire video comparing it to the PS5, though.
Yes, you can get this when you play the PC version of the game... and that when you play the PS5.

Having a similar PC configuration is only interesting for people who want to shop for an equivalent PC, which I don't think is how people go about it when they buy a PC.

He could have tested it on a 1080p 240hz monitor as well.

I have a PC hooked up to my TV, right beside the PS5. When I look at a game released on both like this (or Elden Ring), I could not care less about some arbitrary PS5-equivalent configuration; my PC eats the PS5 for dinner, so I want to know what benefits the extra horsepower will give me. If anything substantial is offered, I will buy the game on PC (which is what my kids use to play Switch games as well).
 

Loxus

Member
It's the only one that has RT plus very high textures. No point in linking benchmarks that don't meet those criteria when trying to determine VRAM limitations at 4K.

Plus pcgameshardware.de is extremely reputable.
The 3070 and 2080 Ti perform nearly identically in compute, yet that extra 3GB of VRAM doesn't give the 2080 Ti an edge.




These videos are 10 times more reputable than that article.
 

Zathalus

Member
The 3070 and 2080 Ti perform nearly identically in compute, yet that extra 3GB of VRAM doesn't give the 2080 Ti an edge.




These videos are 10 times more reputable than that article.

... seriously?

The reason that the 2080ti is outperforming the 3070 at 4k is because of VRAM limitations. At lower resolutions the cards are almost identical. Or in any game where VRAM bottlenecks are not a thing.

They are not claiming that the 2080ti is a more powerful GPU, just that it is outperforming the 3070 at 4k. Same thing is happening with the 2060 12GB and 2070 Super at 4k.

It's also funny when you link Digital Foundry, when Alex himself often refers to pcgameshardware.de.
 

Loxus

Member
... seriously?

The reason that the 2080ti is outperforming the 3070 at 4k is because of VRAM limitations. At lower resolutions the cards are almost identical. Or in any game where VRAM bottlenecks are not a thing.

They are not claiming that the 2080ti is a more powerful GPU, just that it is outperforming the 3070 at 4k. Same thing is happening with the 2060 12GB and 2070 Super at 4k.

It's also funny when you link Digital Foundry, when Alex himself often refers to pcgameshardware.de.
Nope, DF says it's bandwidth speed, not the size of the VRAM.
[screenshot]
 

Loxus

Member
Not exactly sure what a screenshot from a completely unrelated game has to do with anything?
All games use the same graphics pipeline.
8 games show the size of the VRAM doesn't determine performance, it's bandwidth.

But keep believing some random German article. I'm done with this topic.
 

Zathalus

Member
All games use the same graphics pipeline.
8 games show the size of the VRAM doesn't determine performance, it's bandwidth.

But keep believing some random German article. I'm done with this topic.
Oh my God, you cannot be serious? It's obvious you are not a PC gamer if you have not read about VRAM limitations impacting performance.

Obviously the size of VRAM does not matter if you stay within the total amount, but performance will suffer as soon as you exceed it.

And lol, random German article. Yeah buddy, keep trying to downplay one of the most reputable German tech sites because you don't like the conclusions.
 

yamaci17

Member
Oh my God, you cannot be serious? It's obvious you are not a PC gamer if you have not read about VRAM limitations impacting performance.

Obviously the size of VRAM does not matter if you stay within the total amount, but performance will suffer as soon as you exceed it.

And lol, random German article. Yeah buddy, keep trying to downplay one of the most reputable German tech sites because you don't like the conclusions.
Guru3D also found that the 3070 was getting a WORSE framerate average than a... 3060 at 4K resolution. This is not a joke. Sadly, the person is unable to understand this.


As I said, this is not even behaviour exclusive to Spiderman. If it were, it wouldn't happen in various other games. But it does. If anyone thinks RE Village is designed for PS5 the way they think Spiderman is, think again.

[RE Village benchmark chart]



See, the game did not "crash". It did not get destroyed. It simply halved the framerate. The 3070, instead of getting a 78 fps average, now gets a 43 fps average because it is stalled out by RAM-to-VRAM transactions.

And some games will simply deceive you. Here are four outcomes of running out of VRAM:
1) Crashing (very rare)
2) Halving the framerate by compensating for the lack of VRAM with normal RAM (what happens in RE Village and Spiderman; see the sketch after this list)
3) Rendering almost grinding to a halt, near 3-6 FPS, where the GPU is completely choked
4) Deceiving the user by silently using horribly ugly, super-low-res textures, in the hope that they won't notice and will think they're running textures fine
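A minimal sketch of case 2 (all numbers are invented for illustration: ballpark bus speeds and a made-up 5 GB/frame of memory traffic), showing why spilling even a small slice of traffic to system RAM can halve the framerate:

```python
# Toy model of VRAM overflow (case 2 above): a fraction of each frame's
# memory traffic is forced over PCIe instead of the on-card bus.
# All numbers are illustrative assumptions, not measurements.
VRAM_BW = 448.0  # GB/s, e.g. a 3070's memory bus
PCIE_BW = 16.0   # GB/s, realistic PCIe throughput for spilled data

def memory_limited_fps(traffic_gb: float, spill_frac: float) -> float:
    """FPS ceiling when spill_frac of per-frame traffic crosses PCIe."""
    local_s = traffic_gb * (1 - spill_frac) / VRAM_BW
    spill_s = traffic_gb * spill_frac / PCIE_BW
    return 1.0 / (local_s + spill_s)

print(memory_limited_fps(5.0, 0.00))  # everything resident: 89.6 fps
print(memory_limited_fps(5.0, 0.04))  # spill just 4%: ~43 fps, halved
```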

[video, timestamped 7:40]


Look how "ultra" textures look considerably worse than "medium" textures.

Practically speaking, stop chasing ultra textures with a 3070. It cannot handle them. Even in this game that no one cares about.
 

Loxus

Member
Oh my God, you cannot be serious? It's obvious you are not a PC gamer if you have not read about VRAM limitations impacting performance.

Obviously the size of VRAM does not matter if you stay within the total amount, but performance will suffer as soon as you exceed it.

And lol, random German article. Yeah buddy, keep trying to downplay one of the most reputable German tech sites because you don't like the conclusions.
That's the thing, Spider-man doesn't exceed it.
It stays within 6-7GB of 8GB, so how is performance being impacted?
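For what it's worth, anyone can watch the usage numbers themselves on an NVIDIA card; nvidia-smi exposes them from the command line (a minimal polling sketch, noting that this reports allocation across the whole GPU, not the game's internal budget):

```python
# Poll VRAM usage once per second via nvidia-smi (NVIDIA cards only).
# This is GPU-wide allocation, not the game's internal VRAM budget.
import subprocess
import time

def vram_mb() -> tuple[int, int]:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = out.strip().splitlines()[0].split(", ")
    return int(used), int(total)

for _ in range(30):  # sample for ~30 seconds while the game runs
    used, total = vram_mb()
    print(f"{used} / {total} MiB")
    time.sleep(1)
```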
 

Mr Moose

Member
All games use the same graphics pipeline.
8 games show the size of the VRAM doesn't determine performance, it's bandwidth.

But keep believing some random German article. I'm done with this topic.
Let's say there's some artificial limit set on the VRAM, about 80%.
When using RT and very high textures (which the PS5 uses), that artificial limit gets hit: about 6.4GB of an 8GB card.
Hmm, kinda like a GTX 970 with the shit 3.5GB + 512MB VRAM; when that passed 3.5GB, it tanked performance.

Still, I think with RT and everything else, the PS5 is performing well (normally at a 2060 with RT, right?).
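If such a cap exists, the logic would look something like this (a guess at the shape of it; the 0.8 fraction comes from the post above, and this is not the game's actual allocator):

```python
# Hypothetical engine-side VRAM budget as described above. The 0.8 cap
# and the sizes are assumptions, not the game's real code.
def plan_residency(total_vram_gb: float, requested_gb: float,
                   budget_frac: float = 0.8) -> dict:
    budget = total_vram_gb * budget_frac
    spilled = max(0.0, requested_gb - budget)
    # Anything over the budget gets demoted to system RAM or lower mips,
    # which is where a GTX 970-style performance cliff would come from.
    return {"budget_gb": budget,
            "resident_gb": min(requested_gb, budget),
            "spilled_gb": spilled}

print(plan_residency(8.0, 7.0))   # 8GB card: 6.4GB cap, 0.6GB spills
print(plan_residency(11.0, 7.0))  # 11GB card: the same load fits easily
```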
 

yamaci17

Member
Let's say there's some artificial limit set on the VRAM, about 80%.
When using RT and very high textures (which the PS5 uses), that artificial limit gets hit: about 6.4GB of an 8GB card.
Hmm, kinda like a GTX 970 with the shit 3.5GB + 512MB VRAM; when that passed 3.5GB, it tanked performance.

Still, I think with RT and everything else, the PS5 is performing well (normally at a 2060 with RT, right?).
The game is similar to Far Cry when it comes to ray tracing: it is more raster-heavy than RT-heavy. NVIDIA's RT only shines when you really put it up against heavy ray tracing, and this game actually has a very lightweight RT implementation. There are very few games where a 2080 Ti or 3070 can do native 4K with ray tracing; Far Cry 6 and the like are among them. It is understandable why they made the RT light. The high RT reflections are very low quality; while they are really a huge improvement over stock cubemaps and SSR, it does not change the fact that they are nothing to hugely boast about.


Here in DL2, where RTGI and RTAO come in, the 6700 XT bails out, being slower than the 3060/3060 Ti and all competing cards. We're looking at the 3060 being 15% faster than the 6700 XT. Do note that the 6700 XT is a rasterization beast, yet it is so slow at ray tracing in this title that it gets beaten by a puny 3060, which really lacks rasterization punch.
[Dying Light 2 RT benchmark chart]



Then comes Far Cry 6 with its lightweight RT implementation. Here we see the 6700 XT nearly evening out with a 3060 Ti, leaving the 3060 in the dust by 21%.

[Far Cry 6 RT benchmark chart]




In other terms, the complexity of the ray tracing is a huge factor in NV/AMD ray tracing performance. This game being light will easily let the PS5 perform the way it performs against NV GPUs in rasterization. Actually, you can practically use Very High RT geometry and still get great performance results. If the PS5 had a theoretical Very High geometry preset, it would most likely perform even lower compared to equivalent cards.

Assuming the PS5 is 15% below the 6700 XT, it is actually very normal to see how it fares against the 2070. The 2070, despite having good RT performance, is way behind the 6700 XT in Far Cry 6.

GOTG, for example, is a heavier RT title:

[Guardians of the Galaxy RT benchmark chart]


So now the 3060 again outperforms the 6700 XT.


More points for the argument:

[F1 benchmark chart]


28% faster at 4K, 20% faster at 1440p

[Metro Exodus benchmark chart]


68% (holy) faster at 4K, 55% faster at 1440p

As you can see, RT performance parity between NV and AMD architectures varies hugely with the implementation (it is not specifically favoring AMD; it is just not stressing NV hardware). NV hardware is designed for even more complex RT scenarios where you throw in GI, shadows and reflections; the AMD implementation is actually very competent at less complex and straightforward implementations, like simply having only RT reflections or RT shadows.
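(For reference, the percentages in this post are plain fps ratios; a trivial helper, with placeholder fps values rather than the actual chart numbers:)

```python
# How the "X% faster" figures above are computed. The fps values are
# placeholders, not the chart data.
def pct_faster(fps_a: float, fps_b: float) -> float:
    return (fps_a / fps_b - 1.0) * 100.0

print(f"{pct_faster(64.0, 50.0):.0f}% faster")  # 64 vs 50 fps -> 28% faster
```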
 

Loxus

Member
Let's say there's some artificial limit set on the VRAM, about 80%.
When using RT and very high textures (which the PS5 uses), that artificial limit gets hit: about 6.4GB of an 8GB card.
Hmm, kinda like a GTX 970 with the shit 3.5GB + 512MB VRAM; when that passed 3.5GB, it tanked performance.

Still, I think with RT and everything else, the PS5 is performing well (normally at a 2060 with RT, right?).
That would have made sense if it wasn't going above 7GB; I've even seen it hit 7.5GB usage.
I don't think there is an artificial limit on the VRAM.
[VRAM usage screenshot]


Even at 4K, it still remains under 7GB.
[4K VRAM usage screenshot]


Imo, in most cases it's bandwidth that acts as the VRAM limit.

But like I said, I'm done with this topic.
These guys are cherry-picking, and you would also realize they are from a certain crowd.
 
That would have made sense if it wasn't going above 7GB; I've even seen it hit 7.5GB usage.
I don't think there is an artificial limit on the VRAM.
[VRAM usage screenshot]


Even at 4K, it still remains under 7GB.
[4K VRAM usage screenshot]


Imo, in most cases it's bandwidth that acts as the VRAM limit.

But like I said, I'm done with this topic.
These guys are cherry-picking, and you would also realize they are from a certain crowd.
Dude unironically posting RT off benchmarks. What a troll.
 

SlimySnake

Flashless at the Golden Globes
Having a similar PC configuration is only interesting for people who want to shop for an equivalent PC, which I don't think is how people go about it when they buy a PC.
Yes, that's true, but Alex purposefully tries to do comparisons between the PS5 GPU and PC GPUs. He's been doing that since launch: AC Valhalla, Hitman 3, Watch Dogs Legion, Control. He MAKES those comparisons, and if the point of the comparison is to figure out the processing power of the GPU, then pairing up the PC GPUs with a 12-core/24-thread 5.2GHz $599 monster isn't the right way to go about it. Especially when we've known from day one that RT has a CPU cost.

He's the reason we all say that the RTX 2060 is basically what the PS5 and XSX are equal to in ray-traced games, even though he's been pairing them up with a $599 CPU. Who owns those CPUs on the PC market? A fraction of a fraction. No one spends that much on a CPU, not even the people who paid $700-800 for 2080s and 3080s at launch.

His comparisons were always BS, and even though he always presented them as academic exercises, now we know that his testing methodology was incorrect, because he failed to account for the CPU bottlenecking the PS5 GPU in his 'GPU' comparisons going back two years.
 

SlimySnake

Flashless at the Golden Globes
Still, I think with RT and everything else, the PS5 is performing well (normally at a 2060 with RT, right?).
Well, that varies from game to game. In Doom RT, the PS5 and XSX destroy the 2060, offering double the performance. In Control, the corridor of doom has them outperforming a 2070, but Alex dismisses that because the consoles are using half-res reflections. In the Matrix demo, the PS5, XSX and AMD cards are able to keep up with the RTX cards and perform close to a 2080. Alex kept referring to the PS5 and XSX running it at 1080p, but if you pay close attention, he only talks about that for the cutscenes during the chase sequence, which have black bars and run at nearly 1300p, dropping at times due to DRS. The open world is at 1620p the one time he was able to count the pixels. I spent the last 10 months thinking the PS5 was running at 1080p thanks to him. Even VG Tech had its pixel counts at 1440p.

Basically, if a game's RT was developed with next-gen consoles in mind, the PS5 and XSX will do way better than in games ported from PC with Nvidia's RTX implementation. Even games like Far Cry, which had AMD-sponsored RT, performed way better than other RT games that initially did their development on Nvidia's GPUs.

I think the fact that both Ratchet and Miles are able to push high-quality reflections at 40-50 fps and native 4K is proof that these consoles are a lot more powerful than DF gave them credit for. They just need devs to write code for AMD's RT implementation instead of downporting from their RTX solution.
 
Yes, that's true, but Alex purposefully tries to do comparisons between the PS5 GPU and PC GPUs. He's been doing that since launch: AC Valhalla, Hitman 3, Watch Dogs Legion, Control. He MAKES those comparisons, and if the point of the comparison is to figure out the processing power of the GPU, then pairing up the PC GPUs with a 12-core/24-thread 5.2GHz $599 monster isn't the right way to go about it. Especially when we've known from day one that RT has a CPU cost.

He's the reason we all say that the RTX 2060 is basically what the PS5 and XSX are equal to in ray-traced games, even though he's been pairing them up with a $599 CPU. Who owns those CPUs on the PC market? A fraction of a fraction. No one spends that much on a CPU, not even the people who paid $700-800 for 2080s and 3080s at launch.

His comparisons were always BS, and even though he always presented them as academic exercises, now we know that his testing methodology was incorrect, because he failed to account for the CPU bottlenecking the PS5 GPU in his 'GPU' comparisons going back two years.
It was always straightforward: the PS5 doesn't have a 10900K, let alone a 12900K, in it, so why are you pairing the PC with one if you're trying to do a like-for-like benchmark?
 
Well, that varies from game to game. In Doom RT, the PS5 and XSX destroy the 2060, offering double the performance. In Control, the corridor of doom has them outperforming a 2070, but Alex dismisses that because the consoles are using half-res reflections. In the Matrix demo, the PS5, XSX and AMD cards are able to keep up with the RTX cards and perform close to a 2080. Alex kept referring to the PS5 and XSX running it at 1080p, but if you pay close attention, he only talks about that for the cutscenes during the chase sequence, which have black bars and run at nearly 1300p, dropping at times due to DRS. The open world is at 1620p the one time he was able to count the pixels. I spent the last 10 months thinking the PS5 was running at 1080p thanks to him. Even VG Tech had its pixel counts at 1440p.

Basically, if a game's RT was developed with next-gen consoles in mind, the PS5 and XSX will do way better than in games ported from PC with Nvidia's RTX implementation. Even games like Far Cry, which had AMD-sponsored RT, performed way better than other RT games that initially did their development on Nvidia's GPUs.

I think the fact that both Ratchet and Miles are able to push high-quality reflections at 40-50 fps and native 4K is proof that these consoles are a lot more powerful than DF gave them credit for. They just need devs to write code for AMD's RT implementation instead of downporting from their RTX solution.
Ratchet and Clank has a slightly higher average fps than Spiderman; I've seen it hit the mid-50s fairly often in the fidelity mode.
 

yamaci17

Member
It was always straightforward: the PS5 doesn't have a 10900K, let alone a 12900K, in it, so why are you pairing the PC with one if you're trying to do a like-for-like benchmark?
Why are you comparing it to a PC which doesn't have a 10GB budget available for the game like the PS5 does?

I will forever bug and haunt you with this. You've brought this upon yourself with your constant 12900K/10900K crusade, when I've shown you a PS5-equivalent CPU is perfectly capable of locking to 60 and even getting upwards of 70 frames.
 
Why are you comparing it to a PC which doesn't have a 10GB budget available for the game like the PS5 does?

I will forever bug and haunt you with this. You've brought this upon yourself with your constant 12900K/10900K crusade, when I've shown you a PS5-equivalent CPU is perfectly capable of locking to 60 and even getting upwards of 70 frames.
I was just adding to the guy's original point.
 

Rubim

Member
it would still be heavy on the CPU on console as well... you're trying to give the PC a CPU benefit but not the console, which isn't equal
It's not.

In this particular case we don't even have to take my word for it.

"It's even worse for us because we also have the added overhead of the abstraction layer to DX12 and the DXR abstraction layer, which is obviously very lean on the Sony side. So even if you have a more powerful CPU than on the PlayStation 5, you might still end up with a lower frame-rate."

This is a PS5 game, ported to PC.
The game being heavy on the CPU on PC does not make it CPU-heavy on PS5 as well.

There are additional steps that you have to do on PC when you port a game that was designed for only a single platform.
 

DenchDeckard

Moderated wildly
All games use the same graphics pipeline.
8 games show the size of the VRAM doesn't determine performance, it's bandwidth.

But keep believing some random German article. I'm done with this topic.
I've only just read this last page... but are you claiming a game can't be VRAM-starved at high resolutions?

Have you ever been a PC gamer? Because that's crazy if you are stating that. Have you ever experienced VRAM stalls where you get severe hitches because you've tapped out your VRAM allocation? I can assure you it's a very real thing.
 

yamaci17

Member
Full of dishonesty and unrelated takes from "slimysnake".

No, DF using a 3700X would not change the performance profile of the games where they found the 2060/2070 to be equivalent to a PS5. Those GPUs would not even be bottlenecked with a 2700X; my tests confirm it. You're just preaching to the choir.

"Alex dismisses that because consoles are using half res reflections."

This is dumbness at its peak. If it uses half-res reflections, then it is not equivalent. Also, the PS5 beating the 2060 by two times is meaningless when they use half-res reflections; they're quite literally there to save performance. Them not being on PC is enough to dismiss Doom Eternal from like-for-like comparisons.

The fact that in Spiderman the 2080 Ti shoots 35-40% above a PS5 is proof that the PS5 is doing nothing magic or special for ray tracing. I could have a 2080 Ti in my rig, with a 3700X, and my rig would still outperform the PS5 by that 35-40%. The 2080 Ti/3070 also outperform the consoles in Metro Exodus EE with like-for-like settings by 50%. RTX cards will beat the consoles by different percentages whether a game is heavily raster-based or has complex RT calculations. Spiderman does not even have any complex RT calculations at PS5-equivalent RT settings; they're quite literally using texture maps in the reflections. Pushing much higher quality RT reflections with a 2070 is extremely easy at 1440p.

All of these results would be the exact same if they used a 3700X. Going from a 3700X to a 12900K won't change a single framerate with a heavily GPU-bound RTX 2070/2080; my 2700X/3070 tests confirm this. Even with my CPU, I'm able to get 35% more frames at native 4K with ray tracing than the PS5, and 50% more frames than the PS5/SX in Metro Exodus EE with equivalent RT settings. The reason Metro Exodus EE sees a higher framerate bump with RTX cards is that it actually has a solid RT implementation, whereas Spiderman's RT implementation is purposefully light to allow high resolutions and good framerates alongside it.

You bring up FC6, then you completely dismiss the fact that the 6700 XT also matches a 3060 Ti in the RT department there. This does not happen in Cyberpunk, DL2 or Metro Exodus, the exact same way it does not happen between RTX cards and the PS5.

A game that favors raster more than RT will perform great on RDNA2. It is not behaviour exclusive to the PS5, and all of this is true even if you use a 3700X to pair these GPUs with.

What we see here with NXG has nothing to do with his 2700X; his 2070 is getting an extreme VRAM bottleneck.
 

Loxus

Member
I've only just read this last page... but are you claiming a game can't be VRAM-starved at high resolutions?

Have you ever been a PC gamer? Because that's crazy if you are stating that. Have you ever experienced VRAM stalls where you get severe hitches because you've tapped out your VRAM allocation? I can assure you it's a very real thing.


Edit:
The above also shows RT doesn't dramatically increase VRAM usage.

Take a look at this.
The 3070 and the 2080Ti are basically equal in compute performance.
[GPU spec comparison chart]

The only difference: the 3070 has 8GB @ 448.0 GB/s and the 2080 Ti has 11GB @ 616.0 GB/s.

Here we see the 2080 Ti utilizing more VRAM than the 3070, just like in Spider-Man, yet there is no performance loss on the 3070.
[benchmark screenshot]


 

Rubim

Member
It seems there was indeed an issue with the VRAM budget.


  • Made changes to address performance degradation when raytracing is enabled.
  • Changed VRAM budgets to allow for more video memory usage.
 
Will you both now admit you were WRONG? Or will you brush this under the rug?

https://gamerant.com/spider-man-remastered-pc-update-october-2022/

That would have made sense if it wasn't going above 7GB; I've even seen it hit 7.5GB usage.
I don't think there is an artificial limit on the VRAM.
[VRAM usage screenshot]


Even at 4K, it still remains under 7GB.
[4K VRAM usage screenshot]


Imo, in most cases it's bandwidth that acts as the VRAM limit.

But like I said, I'm done with this topic.
These guys are cherry-picking, and you would also realize they are from a certain crowd.

Er... did you even watch the video? I literally call this out and explain it, with even a chapter titled Memory within the video. I show and discuss the VRAM issues, the low textures, the lower mips, and how it is worse than the PS5 and the bigger 16GB of the RX 6800. Your post is nothing but confirmation of what my video covers.

I then discuss the VRAM-to-system-RAM issue, being bandwidth and data bound, and state with the 750Ti what I said last gen: 2GB was not going to cut it, and 8GB will not now.

I get frustrated when people attack facts with no logic. Your argument is, "Well, if the GPU had more VRAM than it does, it would be performing better!"
Well yeah, of course. This argument (which it clearly is not) is like saying that if my Fiesta had a Ferrari engine, it would be able to beat a Porsche. I see this very pigeonholed logic a great deal in comments, and it misses the point of these tests and how tests should be. The fact is you cannot buy a 2070 or 3070 with anything other than 8GB, so this game, in this mode, on this card, performs as shown. All stated clearly in the video. You are arguing the same old "it is not a fair test"; this is not about that, nor should it ever be. This is about what and where the PS5 is performing. I do not see you and others here arguing that DF using a 12900K with an RX 580 is madness and completely removed from what a real system would be, do you? My rig here is a real example of what exists out there and is around the same level as the consoles' target.

Even that aside, you are skipping the other modes, with the flat Performance Mode (no RT) having clear GPU-bound points that still show a deficit to the PS5 when not CPU-constrained.


I mean, just read this comment: the GPU that is performing worse here has nothing to do with the PS5 performing better???!!!!????

The GPU can be, and often is, memory bound, but NOT 100% of the time, and it is not the only reason; as noted above, the 3060 with more VRAM is not suddenly leaping ahead in performance.
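His "memory bound, but not 100% of the time" point boils down to a frame costing whichever bottleneck is slower; a crude model (all numbers invented for illustration):

```python
# Crude bottleneck model of the point above: frame time is set by the
# slower of compute and memory traffic, plus any spill stall. More VRAM
# only removes the spill term; it cannot shrink the compute term.
def frame_time_ms(compute_ms: float, traffic_gb: float,
                  bandwidth_gb_s: float, spill_ms: float = 0.0) -> float:
    memory_ms = traffic_gb / bandwidth_gb_s * 1000.0
    return max(compute_ms, memory_ms) + spill_ms

print(frame_time_ms(16.0, 4.0, 448.0))                # fits in VRAM: 16.0 ms
print(frame_time_ms(16.0, 4.0, 448.0, spill_ms=8.0))  # spilling: 24.0 ms
# A card with more VRAM but the same compute still sits at ~16 ms.
```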
 

Zathalus

Member
  • Made changes to address performance degradation when raytracing is enabled.
Like I've been saying, the CPU handles RT BVH management, which was the reason for the performance drop. They most likely reduced the CPU's involvement in RT. VRAM was never the issue.

It's literally in the patch notes:

- Changed VRAM budgets to allow for more video memory usage.
 
  • Made changes to address performance degradation when raytracing is enabled.
Like I've been saying, the CPU handles RT BVH management, which was the reason for the performance drop. They most likely reduced the CPU's involvement in RT. VRAM was never the issue.


Read the damn patch notes, it's not yet ILLEGAL to read.
And put down your ego for just one sec. My goodness.

Patch Notes:
  • Changed VRAM budgets to allow for more video memory usage.
 
Last edited:

Loxus

Member
It's literally in the patch notes:

- Changed VRAM budgets to allow for more video memory usage.
Read the damn patch notes, it's not yet ILLEGAL to read.
And put down your ego for just one sec. My goodness.

Patch Notes:
  • Changed VRAM budgets to allow for more video memory usage.
Where does it say the VRAM budget degraded performance?

You guys are reaching too much.
Let it die already.

All evidence says RT is responsible for the performance loss compared to other games.
It's in the patch notes, for Pete's sake.

  • Made changes to address performance degradation when raytracing is enabled.
 

01011001

Banned
Where does it say the VRAM budget degraded performance?
The 3070 with 8GB of VRAM still outperforms the 2080 Ti with 11GB and still doesn't run out of VRAM at max 4K.

How the fuck are you still arguing this bullshit?
The game CLEARLY didn't use the VRAM pool optimally, which led to bad performance. There's VERY STRONG evidence for this, and you are still here with this shit trying to argue against clear evidence.
 

Zathalus

Member
Where does it say the VRAM budget degraded performance?

You guys are reaching too much.
Let it die already.

All evidence says RT is responsible for the performance loss compared to other games.
It's in the patch notes, for Pete's sake.

  • Made changes to address performance degradation when raytracing is enabled.
Or both were an issue? Otherwise, why fix it?
 
Where does it say the VRAM budget degraded performance?

You guys are reaching too much.
Let it die already.

All evidence says RT is responsible for the performance loss compared to other games.
It's in the patch notes, for Pete's sake.

  • Made changes to address performance degradation when raytracing is enabled.

OMFG, you're joking, no?

 

damiank

Member
I think the most noteworthy part of all is performance with RT on. AMD's solution kind of sucks at RT, as we've seen time and again, yet the framerate with RT on is equivalent to the RTX cards in this case. That feels quite impressive; if you were to make an RT comparison in Spiderman with a 6600 XT, which is around the same level of GPU power, it'd be almost sad for the desktop GPU.

I've always found it odd how PC-to-console comparisons are always done with Nvidia GPUs, forgetting AMD exists, even though it'd be far more accurate in many cases.

The 6650 XT is the closer match to the PS5, I'd say; the 6700 XT is more Series X level.

Yeah, that's why I suggested they downclock the 6700 XT GPU to get to 10.2 TFLOPs.

Simplest you can get is RX 6700

https://www.amd.com/en/products/graphics/amd-radeon-rx-6700

Basically PS5, just clocked higher
 

SlimySnake

Flashless at the Golden Globes
Simplest you can get is RX 6700

https://www.amd.com/en/products/graphics/amd-radeon-rx-6700

Basically PS5, just clocked higher
Interesting. The RAM bandwidth is way higher than the 6600 XT's too, so it's a way better comparison.

Yes, that would be perfect for comparisons, especially with Miles and Uncharted now releasing on PC. Both have unlocked framerates on the PS5, so a 1:1 comparison can be done without vsync muddying the waters.
 