
Digital Foundry - Playstation 5 Pro specs analysis, also new information

Mr.Phoenix

Member
It isn't as simple as that. This isn't 2005.
He's right though. Consoles do dictate the baseline. And even worse, consoles do not even represent the lowest possible spec platform. While PCs represent the absolute high end, they also represent the absolute low end.

Not a single developer out there is prioritizing their game/engine for something that requires specs clearly above what a PS5 (hell, even an XSS) can handle; not only do the majority of the games out there sell more on consoles, but you would be alienating a significant chunk of your PC market.

Contrary to what or how people on forums like these may sound, the majority of the PC master race is actually running hardware that looks a lot like a PS5.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
let me put it this way: There's no way any game from this generation requires 20 or 24 GB of VRAM
You aren't wrong but Cyberpunk can reach up to 18GB of VRAM usage at 4K with path tracing and DLSS3 (which increases VRAM requirements).

High-end GPUs are generally meant to be forward-thinking. There's a reason the GTX 1080 Ti had 11GB way back in 2017 and it's still enough today. Same reason the 4090 has 24GB. It will still be plenty even in 5 years, so the fact that no game targets that much VRAM is kind of a self-fulfilling prophecy: the whole point of putting so much VRAM on these cards is so they stay useful years into the future.
 

Gaiff

SBI’s Resident Gaslighter
He's right though. Consoles do dictate the baseline. And even worse, consoles do not even represent the lowest possible spec platform. While PCs represent the absolute high end, they also represent the absolute low end.

Not a single developer out there is prioritizing their game/engine for something that requires specs clearly above what a PS5 (hell, even an XSS) can handle; not only do the majority of the games out there sell more on consoles, but you would be alienating a significant chunk of your PC market.

Contrary to what or how people on forums like these may sound, the majority of the PC master race is actually running hardware that looks a lot like a PS5.
I said it isn't that simple, not that he's incorrect. Things like budget and dev time matter more than the specs of the machines when it comes to what a developer wants to accomplish. So it really isn't just a matter of "devs target consoles and that's that." There are a myriad of considerations when developing a game. I'd argue exclusives are the only games that really target a specific set of specs. Everything else casts a large net to catch as many fish as possible, which is why they scale all the way down from low-spec PCs, to consoles in the middle, and finally relatively powerful high-end PCs at the top.

Take HFW for instance. This game barely scales above PS5, but games like Alan Wake 2 scale appreciably above their console counterparts. Most PS5 ports aren't that much better on PC besides the obvious fps increase. This isn't the case for 3rd-party titles.
 
Last edited:
With the added 1.2 GB on the PS5 Pro, it's more probable that VRAM becomes a problem for people with 12 GB of VRAM on PC trying to achieve the same performance as the PS5 Pro, which will have 13.7 GB.
 

ChiefDada

Gold Member
I don’t even know why y’all listening to DF when it comes to leaks and specs.

Alex is the only one in the crew who actually knows how these specs translate to performance but he’s a PCMR and hardly cares about the PS5 Pro. DF is good to get game benchmarks and performance profiles. I tend to ignore the rest of their stuff, especially when it comes to hardware because their knowledge in that area is sorely lacking.

Well considering the fact that he:

1. Refused to believe i/o could impact visual fidelity until counter commentary from an ex-Sony dev forced him to backtrack and throw a temper tantrum on Resetera

2. Claimed Series X would have better raytracing than PS5 without reservation simply because it had a bigger GPU.

3. Continues to think a large part of the 2-4x RT improvement of PS5 Pro over PS5 can be explained by the 67% CU increase.

(The list goes on...)

I would say he's just as clueless as his colleagues.


Still only 16GB of RAM, won't that be a bottleneck if PC GPUs come with 16/24GB VRAM as standard?

#PrettyCoolRight
 

Gaiff

SBI’s Resident Gaslighter
With the added 1.2 GB on the PS5 Pro, it's more probable that VRAM becomes a problem for people with 12 GB of VRAM on PC trying to achieve the same performance as the PS5 Pro, which will have 13.7 GB.
Why would it be a problem? The PS5 still needs to work.

And no one on PC wants to target console performance. They'd just get a console if that's what they wanted.
 
Last edited:
Yep, obvious to some and literally by design. Why didn't they improve the CPU? Because they didn't want to! You have DF saying it's the best they can do whilst at the same time saying it helps compatibility lol. It's not the best they can do at all, it's literally what they wanted to do. For compatibility sure, but also because they don't want a divergence between the two SKUs beyond a very specific targeted area. The Pro is very carefully designed with boundaries to control developers and ensure the standard machine isn't left behind.
Is that really necessary? It's not like developers would ever make a game exclusively for the Pro; it won't sell enough to make it worth it.
 

Gaiff

SBI’s Resident Gaslighter
Sure, but base PS5 won't be asked to execute the additional RT workloads. So it's a potential issue for PC, not base PS5.
Again, this doesn't make sense. The Pro will have 13.7GB of VRAM to work with but will still need to do things that the PC does with regular RAM. How often does a 10GB GPU struggle to achieve what the PS5 does? Pretty much never. Add 1.2GB to the Pro and 2GB to the PC: why would the PC start struggling when a 12GB card stacks up better against the PS5 Pro than a 10GB card does against the regular PS5?
 
Last edited:

leizzra

Member
Same reason the 4090 has 24GB.
Well it’s not only a gaming GPU (or maybe it should be that it can play games too ;) ). Those 24 GB of RAM are quite easily filled when working on 4K textures sets. This is where they were needed the most and it’s a shame that they haven’t increased it from 3090.
 

Zathalus

Member
'Teeth' Leadbetter trying to piss on PS5 Pro again in the DF direct....what a sap. Ruling out 40fps modes now, calling it a niche and insignificant product. Totally fixated on the small CPU bump as his answer to anything to do with it, and pretending PSSR and the GPU changes don't exist. Oliver at least did his best to contradict without upsetting the bald one.
They were asked whether a game that is limited to 30fps by the CPU on PS5 can hit 40fps on the Pro. The answer is no (duh), unless the game is already quite close to 40fps. As for why he didn't bring up PSSR or the GPU... the question was directly about the CPU.

He also claimed 120Hz LFC modes and the Pro will be niche (not insignificant), which they absolutely are. How many games actually have that mode, and what percentage of PS5s sold do you think are actually going to be the Pro model?
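For a rough sense of why a hard CPU-limited 30fps game can't reach 40fps from the Pro's clock bump alone, here's a quick frame-time sketch. It assumes the ~3.85 GHz "high CPU frequency" mode versus the base 3.5 GHz from the leaks; treat those exact numbers as leak-dependent.

```python
# Frame-time math for the 30 fps -> 40 fps question (leak-dependent figures as noted above).
base_frame_ms = 1000 / 30      # ~33.3 ms per frame when CPU-limited to 30 fps
target_frame_ms = 1000 / 40    # 25 ms per frame needed for 40 fps
speedup_needed = base_frame_ms / target_frame_ms - 1   # ~0.33 -> ~33% more CPU throughput
leaked_cpu_bump = 3.85 / 3.5 - 1                        # ~0.10 -> ~10% clock increase (per the leaks)
print(f"needed: {speedup_needed:.0%}, leaked CPU bump: {leaked_cpu_bump:.0%}")
# Only games already close to 40 fps on the base CPU would clear the bar.
```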
 
Why would it be a problem? The PS5 still needs to work.

And no one on PC wants to target console performance. They'd just get a console if that's what they wanted.

Yes that's why I said to achieve the same results.

It could happen that you have to scale down a "PC preset" or resolution because you exceed 12 GB.

I'm just guessing but in theory:

The "LOW/MEDIUM" requirements won't change but the "HIGH/VERY HIGH" requirements could

I don't know if I explained myself
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Yes that's why I said to achieve the same results.

It could happen that you have to scale down a "PC preset" or resolution because you exceed 12 GB.

I'm just guessing but in theory:

The "LOW/MEDIUM" requirements won't change but the "HIGH/VERY HIGH" requirements could

I don't know if I explained myself
I know what you mean but this doesn't make sense.

A 10GB GPU today doesn't struggle to reach parity with the regular PS5. Add 1.2GB to the PS5 but 2GB to the PC, and the PS5 goes from having 25% more VRAM to 14% more. The gap actually shrank, so the idea that the 12GB card will struggle is illogical. The 10GB? Sure. The 12GB has enough VRAM to keep up just fine.
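A minimal sketch of that back-of-envelope math, using the figures floating around this thread (12.5 GB usable on the base PS5 per the DF leak, 13.7 GB on the Pro) and treating the console's game-available memory as if it were all VRAM, which is the worst case for the PC card:

```python
# Console vs PC memory headroom, using the thread's figures (assumptions noted above).
comparisons = [("base PS5 vs 10GB card", 12.5, 10), ("PS5 Pro vs 12GB card", 13.7, 12)]
for label, console_gb, pc_gb in comparisons:
    print(f"{label}: console has {(console_gb / pc_gb - 1):.0%} more memory")
# base PS5 vs 10GB card: 25% more; PS5 Pro vs 12GB card: 14% more -> the gap shrinks.
```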
 
Last edited:

SlimySnake

Flashless at the Golden Globes
With the added 1.2 GB on the PS5 Pro, it's more probable that VRAM becomes a problem for people with 12 GB of VRAM on PC trying to achieve the same performance as the PS5 Pro, which will have 13.7 GB.
PS5 isn't using 12GB for VRAM lmao.

Here is KZSF at the PS4 launch. They had 5GB of the 8GB allocated for games. Of that 5GB, only 3GB was for VRAM. The rest was CPU and system related tasks.

[image: Killzone Shadow Fall memory allocation slide]


PS5 is probably topping out at 10 GB now that CPUs are doing far more than they did last gen with RT, higher NPC counts, and other simulations.
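To make the split concrete, here's a rough, partly hypothetical budget using the numbers in this thread: 16 GB of GDDR6 total, roughly 3.5 GB reserved for the OS (which lines up with the ~12.5 GB game allocation cited later in the thread), and this post's estimate of roughly 10 GB peaking as VRAM. The CPU/system figure is just the remainder, not a documented number.

```python
# Hypothetical PS5 memory budget (GB); only the 16 GB total and the ~12.5 GB game
# allocation are leaked/known figures, the rest is this post's estimate plus a remainder.
budget_gb = {
    "os_reserved": 3.5,            # 16 - 12.5 game allocation
    "cpu_and_system_tasks": 2.5,   # remainder (assumption)
    "gpu_vram_equivalent": 10.0,   # this post's estimate
}
assert sum(budget_gb.values()) == 16.0
print(budget_gb)
```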

Here is a breakdown.

[image: memory allocation breakdown]


 
Last edited:
I know what you mean but this doesn't make sense.

A 10GB GPU today doesn't struggle to reach parity with the regular PS5. Add 1.2GB to the PS5 but 2GB to the PC, and the PS5 goes from having 25% more VRAM to 14% more. The gap actually shrank, so the idea that the 12GB card will struggle is illogical. The 10GB? Sure. The 12GB has enough VRAM to keep up just fine.

I trust you on this; you know way more than I do about PC gaming.

It was a thought experiment
 

Gaiff

SBI’s Resident Gaslighter
Well considering the fact that he:

1. Refused to believe i/o could impact visual fidelity until counter commentary from an ex-Sony dev forced him to backtrack and throw a temper tantrum on Resetera

2. Claimed Series X would have better raytracing than PS5 without reservation simply because it had a bigger GPU.

3. Continues to think a large part of the 2-4x RT improvement of PS5 Pro over PS5 can be explained by the 67% CU increase.

(The list goes on...)

I would say he's just as clueless as his colleagues.




#PrettyCoolRight
That's him being dismissive of consoles. If these were PC parts, he'd actually make sense. Have you seen him during the PS5 Pro talks? He just goes, "Whatever." He barely tolerates the Xbox because some of the features trickle down to PC and there is a level of cross-platform development. I don't even think he makes a genuine effort to understand the console environment unless it's to dunk on them with better PCs.

He also gladly says time and again (proudly at that) that he doesn't play on consoles and just uses them for his DF videos. His PC coverage and insight are really good because he actually cares about PCs and has a genuine interest in them. He obviously has a disdain for consoles.

Case in point: DirectStorage. He was hyping it up a lot for PC despite the fact that the PS5 had been doing something similar since 2021. Rift Apart is arguably the best example of asset streaming and decompression on the fly, and Alex didn't care one bit for it until it hit the PC with DirectStorage. Then it turns out DirectStorage on PC is still a bit shit and wasn't even needed in Rift Apart.
 
Last edited:

Ashamam

Member
It's not like developers would ever make a game exclusively for the Pro; it won't sell enough to make it worth it.
The point isn't that they would make a game exclusive to the Pro (they can't, as Sony requires a standard version), it's that they would compromise the standard version because they prioritised the Pro version, as the hardware was effectively a totally different target (and they weren't able to back-port to the standard well).

As to not selling enough, 80/20 is often flung around for the prior generation. The reality is the correct figure is 60/40 for the period the Pro was actually sold. Or at least something along those lines depending on the overall sales curve over the generation. 40% of sales is not insignificant in anyone's book.

I'd argue 20% wouldn't be either, but that doesn't take into account the fact that the Pro wasn't available for the first half of the generation, so of course the standard will sell disproportionately more across the whole generation.
 
Last edited:

ChiefDada

Gold Member
PS5 isn't using 12GB for VRAM lmao.

Here is KZSF at the PS4 launch. They had 5GB of the 8GB allocated for games. Of that 5GB, only 3GB was for VRAM. The rest was CPU and system related tasks.

[image: Killzone Shadow Fall memory allocation slide]


PS5 is probably topping out at 10 GB now that CPUs are doing far more than they did last gen with RT, higher NPC counts, and other simulations.

Here is a breakdown.

[image: memory allocation breakdown]



That's not the best way to analyze it. We know base PS5 can use up to 12.5GB or so of VRAM at any given moment. The question is how much data can be fed to VRAM and then the GPU over a specified period of time. That is where the PS5's I/O becomes important and where precaching will presumably be done on the PC side. Which is why you can have a scenario where a PC GPU may need to hold 16GB of VRAM data whereas the PS5 can hold 12.5GB for the same workload and swap 3.5GB in/out only when the data is needed in the next half second. In such a scenario, 12.5GB of available VRAM on PC wouldn't suffice.

*The example is hypothetical and only used to illustrate one reason why console vs PC VRAM requirements aren't comparable 1:1; a rough throughput check is sketched below.
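As a sanity check on the hypothetical above (swapping ~3.5 GB of assets into memory within half a second), here's the required sustained throughput compared against publicly quoted bandwidths. The PCIe 3.0 figure is a typical drive ceiling, not any specific card, and the PS5 numbers are the Road to PS5 figures (5.5 GB/s raw, ~8-9 GB/s typical with compression).

```python
# Throughput needed for the hypothetical swap above, vs. publicly quoted bandwidths.
swap_gb, window_s = 3.5, 0.5
required_gbps = swap_gb / window_s                  # 7.0 GB/s sustained
ps5_raw_gbps, ps5_typ_compressed_gbps = 5.5, 8.5    # Road to PS5 figures (8-9 GB/s typical)
pcie3_nvme_gbps = 3.5                               # typical PCIe 3.0 NVMe ceiling
print(required_gbps, required_gbps <= ps5_typ_compressed_gbps, required_gbps <= pcie3_nvme_gbps)
# -> 7.0, True, False: feasible through the PS5's compressed I/O path, but a typical
#    PC setup would have to pre-cache that data in RAM/VRAM instead.
```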
 

Perrott

Gold Member
They should just go ahead and have their Cerny ASMR talk on the specs. Weren't the PS5 specs revealed in March of the same year it launched? We're kinda overdue by now.
They seemingly have a PlayStation Showcase lined up and, although last year's turned out to be an underwhelming showing due to various first-party titles facing setbacks behind the scenes (TLOU Online, Twisted Metal, Deviation, PixelOpus, etc.), they still went and teased the PlayStation Portal, among other hardware news.

This being the release year of the PS5 Pro, I'm expecting that they'd feature the enhanced console in some way - whether full unveil or a soft announcement, we'll have to see. What we do know though is that PlayStation Studios have had the development kits since September, over half a year ago, so the teams due for a game announcement definitely had the time to prepare and optimize their various materials in order to showcase the PS5 Pro in its best light.

So, if there are games ready, I'd say chances are high that we might get a PlayStation Meeting-style presentation, Cerny and all.
 

SlimySnake

Flashless at the Golden Globes
The question is how much data can be fed to VRAM and then the GPU over a specified period of time.
No. The discussion was about the size of the VRAM on current-gen PC GPUs. The SSD streaming stuff you are talking about has nothing to do with what we are talking about here.
 
They were asked whether a game that is limited to 30fps by the CPU on PS5 can hit 40fps on the Pro. The answer is no (duh), unless the game is already quite close to 40fps. As for why he didn't bring up PSSR or the GPU... the question was directly about the CPU.

He also claimed 120Hz LFC modes and the Pro will be niche (not insignificant), which they absolutely are. How many games actually have that mode, and what percentage of PS5s sold do you think are actually going to be the Pro model?
A 1440p or 4K game that goes down to 1080p with upscaling to 4K will likely gain a significant framerate boost. Lowering resolution doesn't just free up the GPU but also the CPU, especially where ray tracing is involved.
 
They were asked whether a game that is limited to 30fps by the CPU on PS5 can hit 40fps on the Pro. The answer is no (duh), unless the game is already quite close to 40fps. As for why he didn't bring up PSSR or the GPU... the question was directly about the CPU.

He also claimed 120Hz LFC modes and the Pro will be niche (not insignificant), which they absolutely are. How many games actually have that mode, and what percentage of PS5s sold do you think are actually going to be the Pro model?
Who do you think selects the questions? They do! They are probably receiving hundreds of those. By selecting the questions, they are able to control the narrative and push their agenda: the PS5 Pro CPU sucks and the GPU improvements are not worth talking about. Rich has been doing this for years and he knows what he is doing.
 

Tqaulity

Member
With everyone and their mothers now getting the spec sheets, I'm surprised we're still waiting on concrete answers for GPU clock frequency. Everyone is quoting the new CPU clock profile but nothing for GPU. This suggests to me GPU clocks aren't final.
Exactly! Yet everyone is so keen to quote this 33.5 TFLOPS figure, which itself was NEVER mentioned in the docs and was calculated from a machine learning section with an FP16 figure of 67 TFLOPS. How do you get a TFLOPS figure with no set clock speed? (If Sony knew enough to specify the TFLOPS, then they could have easily specified the clock speed as well, right?) Why would you quote GPU shader compute perf from a sub-bullet in a machine learning section? Doesn't it also raise some flags that computing the clock speed from the known 60 CU shader count results in a lower clock than the base PS5? Never in the history of gaming hardware has there been a decrease in clock speed for any kind of HW revision (released post-launch). Not to mention the confusing, contradictory, and ambiguous “45% rendering” increase figure 🤷🏾‍♂️

So much here doesn’t add up but folks are eating it up 😆. I can’t wait to see the reaction when Sony actually does announce the official final specs of the machine.
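For what it's worth, here's a back-of-envelope sketch of how the 33.5 number implies a clock below the base PS5. It assumes the leaked 60 CU count, 64 shader ALUs per CU, and RDNA 3-style dual-issue FP32; the dual-issue part is an assumption, not something stated in the docs.

```python
# Deriving the implied GPU clock from the 33.5 TFLOPS figure (assumptions noted above).
cus, alus_per_cu = 60, 64
flops_per_cycle = cus * alus_per_cu * 2 * 2   # FMA (x2) and dual-issue FP32 (x2) = 15360
implied_clock_ghz = 33.5e12 / flops_per_cycle / 1e9   # ~2.18 GHz
base_ps5_clock_ghz = 2.23                             # public base PS5 spec
print(f"implied: {implied_clock_ghz:.2f} GHz vs base PS5 {base_ps5_clock_ghz} GHz")
# ~2.18 GHz < 2.23 GHz, which is exactly the oddity being pointed out; without the
# dual-issue assumption the same math gives an implausible ~4.4 GHz.
```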
 
Last edited:

ChiefDada

Gold Member
Exactly! Yet everyone is so keen to quote this 33.5 TFLOPS figure, which itself was NEVER mentioned in the docs and was calculated from a machine learning section with an FP16 figure of 67 TFLOPS. How do you get a TFLOPS figure with no set clock speed? (If Sony knew enough to specify the TFLOPS, then they could have easily specified the clock speed as well, right?) Why would you quote GPU shader compute perf from a sub-bullet in a machine learning section? Doesn't it also raise some flags that computing the clock speed from the known 60 CU shader count results in a lower clock than the base PS5? Never in the history of gaming hardware has there been a decrease in clock speed for any kind of HW revision (released post-launch). Not to mention the confusing, contradictory, and ambiguous “45% rendering” increase figure 🤷🏾‍♂️

So much here doesn’t add up but folks are eating it up 😆. I can’t wait to see the reaction when Sony actually does announce the official final specs of the machine.

Precisely. All of these TF numbers (16, 33, 67) seem to be referencing the relative performance of the AI architecture at different floating-point precisions (FP16, FP32, FP64).
 

ChiefDada

Gold Member
No. The discussion was about the size of the vram on current Gen pc gpus. The ssd streaming stuff you are talking about has nothing to do with what we are talking about here.

It has everything to do with what you're talking about. Otherwise, why is the 10GB 3080 performing so poorly here?

 

SlimySnake

Flashless at the Golden Globes
It has everything to do with what you're talking about. Otherwise, why is the 10GB 3080 performing so poorly here?


because its a poor port. we have seen this time and time again on PC. TLOU1 was literally unplayable when it launched with medium settings looking like ps2 quality textures. means nothing other than devs being clueless.

besides, look at the GPU usage. its at 95-100%. the game even at 1080p is GPU bound. classic shitty ps first party port where the base PS5 outperforms a 3070. here i guess it outperforms the 3080. we saw this with Uncharted, Spiderman, gow, and i guess now ratchet.

look at 99% of PC games that dont have these issues. I have been gaming on a 3080 since 2022 and while the vram limit is very real, it only factors in when I push settings way above the PS5 or when devs release shitty ports that are fixed in a few weeks like hogwarts, tlou, and RE4 last year. hell, RT adds like a 1 gig or 1.5 GB to the vram and i ran cyberpunk PT at roughly 30 fps at 4k dlss performance.

From what i understand the RT in Ratchet's highest preset is better than the PS5 and uses shadows and AO as well so its not like its PS5 settings anyway.
 

sachos

Member
If PS6 were to release in 2026 then the same thing will happen where it can kinda do PT but not really.
Yup, this is my biggest bet against MS spearheading the "next generation" with a 2026 release; it's too early to get next-level AI/RT, which gives Sony the chance to wait a little longer to achieve it.
 

Gaiff

SBI’s Resident Gaslighter
because its a poor port. we have seen this time and time again on PC. TLOU1 was literally unplayable when it launched with medium settings looking like ps2 quality textures. means nothing other than devs being clueless.

besides, look at the GPU usage. its at 95-100%. the game even at 1080p is GPU bound. classic shitty ps first party port where the base PS5 outperforms a 3070. here i guess it outperforms the 3080. we saw this with Uncharted, Spiderman, gow, and i guess now ratchet.

look at 99% of PC games that dont have these issues. I have been gaming on a 3080 since 2022 and while the vram limit is very real, it only factors in when I push settings way above the PS5 or when devs release shitty ports that are fixed in a few weeks like hogwarts, tlou, and RE4 last year. hell, RT adds like a 1 gig or 1.5 GB to the vram and i ran cyberpunk PT at roughly 30 fps at 4k dlss performance.

From what i understand the RT in Ratchet's highest preset is better than the PS5 and uses shadows and AO as well so its not like its PS5 settings anyway.
The port is fine for the most part. The PS5 does outperform a 3070 but not a 2080 Ti and that's because the latter has an extra 3GB of VRAM to play with.

Also, even at 1080p, all those RT effects add a substantial amount of pressure on the VRAM. The PS5 only uses reflection at around High in its Performance Mode.

It has everything to do with what you're talking about. Otherwise, why is the 10GB 3080 performing so poorly here?


Because you're running the game at much higher settings but lower resolution. You'd get better frame time stability running at PS5 settings and 1440p+RT reflections than 1080p max settings+all RT effects. The latter actually uses more VRAM.
 

SlimySnake

Flashless at the Golden Globes
The port is fine for the most part. The PS5 does outperform a 3070 but not a 2080 Ti and that's because the latter has an extra 3GB of VRAM to play with.

Also, even at 1080p, all those RT effects add a substantial amount of pressure on the VRAM. The PS5 only uses reflection at around High in its Performance Mode.
It would still be at 1080p 60 fps. I watched it again and saw some GPU usage drops to 85% for a split second, and that's your stutter while the game loads from system RAM. But the game is otherwise still running at 1080p 60 fps while the GPU is at 99%. That's basically a GPU bottleneck, not a VRAM thing. The 12GB version will smooth out the stutters but won't make it perform 2x better like it should.
Because you're running the game at much higher settings but lower resolution. You'd get better frame time stability running at PS5 settings and 1440p+RT reflections than 1080p max settings+all RT effects. The latter actually uses more VRAM.
I have documented many issues with the 3080's VRAM limit, but that's also because I game at 4K 60 DLSS Quality with maxed-out settings most of the time. The PS5 settles for 1080p 60 fps (with some games dipping to 720p) in most of these games, with no RT and severely scaled-back graphics settings, each of which has an impact on VRAM, minor or major.

I think the only games that are likely using 10+ GB of VRAM for graphics are the PS3 remakes like Demon's Souls and TLOU Part 1, because they are effectively running the same PS3 game underneath, so very little VRAM would be reserved for CPU or system tasks. But even they are using 3D audio, and we've seen from the GG slides that 600 MB was reserved for sound alone. Who knows how much 3D audio is reserving on the PS5.

Even then, I can now get TLOU1 running maxed out at ultra textures without any kind of stuttering issues at 4K DLSS Quality. I am now exclusively GPU bound with some drops to the 50s, and I have their new texture streaming setting set to fastest. So those launch issues where my 3080 was dropping to 2 fps for 20-30 seconds just because I turned the camera around too fast have now been resolved, and the game runs perfectly fine at 4K DLSS Quality on a 10 GB 3080.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
BTW, the latest Tom Warren tweet/article confirms that the OG PS5 had a 12.5 GB VRAM limit, something DF revealed a couple of years ago, but most here dismissed it because, well, reasons.

I have my issues with DF, but let's not dismiss everything they report just because they might have a slant or preference.
 

ChiefDada

Gold Member
because its a poor port. we have seen this time and time again on PC. TLOU1 was literally unplayable when it launched with medium settings looking like ps2 quality textures. means nothing other than devs being clueless.

besides, look at the GPU usage. its at 95-100%. the game even at 1080p is GPU bound. classic shitty ps first party port where the base PS5 outperforms a 3070. here i guess it outperforms the 3080. we saw this with Uncharted, Spiderman, gow, and i guess now ratchet.

We are starting to see a recurring theme where Sony 1st party ports that flex PS5 memory management capabilities get labeled as "bad PC ports". Funny, because I never heard anyone complain about Nixxes' quality or work ethic until this gen.

I have been gaming on a 3080 since 2022 and while the vram limit is very real, it only factors in when I push settings way above the PS5 or when devs release shitty ports that are fixed in a few weeks like hogwarts, tlou, and RE4 last year. hell, RT adds like a 1 gig or 1.5 GB to the vram and i ran cyberpunk PT at roughly 30 fps at 4k dlss performance.

Friendly bet with absolutely nothing on the line - there will be zero games where a 3080 can match PS5 Pro settings.

Edit: zero RT games.
 
Last edited:

Loxus

Member
BTW, the latest Tom Warren tweet/article confirms that the OG PS5 had a 12.5 GB VRAM limit, something DF revealed a couple of years ago, but most here dismissed it because, well, reasons.

I have my issues with DF, but let's not dismiss everything they report just because they might have a slant or preference.
Not really a confirmation. That Verge article is literally a copy and paste from both Tom Henderson's and Digital Foundry's PS5 Pro articles with nothing new.

He most likely did that article for clicks.
The number of people under his Twitter post who are now seeing this PS5 Pro info for the first time is amazing.
 

SlimySnake

Flashless at the Golden Globes
Not really a confirmation. That Verge article is literally a copy and paste from both Tom Henderson's and Digital Foundry's PS5 Pro articles with nothing new.

He most likely did that article for clicks.
The number of people under his Twitter post who are now seeing this PS5 Pro info for the first time is amazing.
It's a confirmation because it's from the same Sony docs that were leaked. DF had a source, but we now have an official Sony document confirming DF's source.
 

Gaiff

SBI’s Resident Gaslighter
It would still be at 1080p 60 fps. I watched it again and saw some GPU usage drops to 85% for a split second, and that's your stutter while the game loads from system RAM. But the game is otherwise still running at 1080p 60 fps while the GPU is at 99%. That's basically a GPU bottleneck, not a VRAM thing. The 12GB version will smooth out the stutters but won't make it perform 2x better like it should.
The 3080 seldom performs twice as well as the PS5. It's more often around 80% faster. As for the 3070, well, it only has 8GB of VRAM. You can't really label Rift Apart a bad port because some PC parts it wasn't made for aren't up to snuff when it comes to VRAM capacity. Hell, the 2080 Ti can actually straight-up beat the 3080 in this game, so this isn't a case of a bad port. This is a case of the developers leveraging the architecture of the PS5.

If every card performed terribly, we could have a discussion, but get 11GB or more, and suddenly, most of your issues go away. 10GB is cutting dangerously close to what the PS5 uses for its VRAM. I wouldn’t be one bit surprised if it can go a bit higher and make 10GB GPUs have some problems in exclusive games (though probably not in 3rd-party ones).

12GB should give the PC parts enough headroom to match the Pro in almost any given scenario, even in exclusive titles with RT.
 

Loxus

Member
Imagine if this was all just a controlled leak.



lol
It's amazing that Sony had the PS5 specs locked up in Fort Knox, with a few GitHub leaks here and there, and we had to wait for Road to PS5 for the specs.

But the PS5 Pro specs are just randomly leaking. One would think Sony would learn from the past and tighten up leaks. Which could be true, 'cause so far the PS5 Pro hasn't been spotted on GitHub.

Maybe this is really one big controlled leak, which started with MLiD. Anything before the MLiD leak could probably be legit leaks, though.

BTW, this is just my speculation in regards to the person I'm replying to and not to be taken seriously.
 

Loxus

Member
It's a confirmation because it's from the same Sony docs that were leaked. DF had a source, but we now have an official Sony document confirming DF's source.
The day Tom Warren has that document is the same day we all will have it. Sending something that confidential to him is the same as just putting it out there for all of us to have.


I do believe the PS5 Pro got a bump in the amount of memory available for games, though.

How this could be done: either Sony increased it to 18GB and we just don't know (like with the PS4 doubling to 8GB versus the 4GB in the leaks), or they did like with the PS4 Pro and added more SSD cache to the SSD controller, as shown below.
[image]


But until another source other than Digital Foundry says the PS5 has 12.5GB available for games, I'm not believing it. In fact, I'm not believing any Sony console spec leak from Digital Foundry. Those guys have a clear agenda against Sony consoles and have a bad track record with PS5 performance, especially back in 2019-2020.
 