
Digital Foundry - Playstation 5 Pro specs analysis, also new information

Kacho

Member
Not sure what you are on about, games will not run slower on PS5 Pro.

You enjoyed last-gen / cross-generation games at faster framerates because, well, developers were aiming lower. Many games will still deliver 30 FPS and beyond, and some, with something like frame generation, might get “visually” close to the range you want where they could not before…

Some devs will make games that will run at 30/40 FPS on any console. Maybe, but if it is a game like GTA VI you will buy it, enjoy it, and be glad you got it early next year rather than later.

Again, you know as well as I do that the market is not there for much, much bigger and more expensive consoles, and getting upset at physics will not help because Moore’s Law is, well… really not what it used to be 😂. Again, I refer to what I posted earlier in the thread and in other threads where we had super high expectations for the Pro console.
This entire post is damage control. My point stands.
 
probably, yes. ive mostly stayed away from FF7 Rebirth threads and most games im not interested in, but have strong feelings about. But i do get the urge to shit on them every now and then so i wont make any promises.

i dont know which devs you have heard speaking positively about these specs though. they are all under embargo. DF is literally pointing out several games that have been CPU bound this gen, including literally the next two big games coming out in 2 days. So if you want your $600 purchase bottlenecked by the CPU without me criticizing it, im afraid thats not going to happen.
 

Bojji

Member
You didn't watch the youtube video, did you? The CPU is covered there, and the general consensus was: ok, makes sense, a bit disappointing, but whatever. Then they move on to all the different things the Pro does that could be really impactful.

They literally said that CPU bottlenecked games will remain CPU bottlenecked and that adding RT to games may make them more CPU intensive.

They talk about Gotham Knights, BG3 and GTA6 as examples of CPU limited games. They would have talked about more but can't, that means DD2 probably (under NDA).
 

Panajev2001a

GAF's Pleasant Genius
This entire post is damage control. My point stands.
😂, I am sorry, but I do not think it does, no matter how emotional one may feel at the thought of a 30 FPS game. Check what I have said about PS5 Pro for a while now, or keep thinking that releasing console iterations faster is the secret and that semiconductor tech is still progressing as fast as it used to… or call it damage control and keep staying outraged 🤷‍♂️.
 

SlimySnake

Flashless at the Golden Globes
But no one argued against that. What they mean by upscaling techniques, "freeing" resources is that upscaling to a target resolution always costs less than the target resolution itself, thus freeing resources.

Nah, that’s what was being said. Go back and read the replies i originally replied to, which were themselves replies to me saying the CPU upgrade was not enough. The implication that PSSR would somehow help with the CPU bottleneck by freeing up resources is what i was arguing against.

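A toy frame-time model makes both sides of this argument concrete: upscaling frees GPU time, but a CPU-bound frame can't go any faster than the CPU allows. The numbers below are purely illustrative, not measurements from any real game:

```python
# Toy model: frame rate is limited by the slower of CPU and GPU work per frame
# (assumes CPU and GPU work largely overlap, which is a simplification).

def frame_rate(cpu_ms, gpu_ms):
    """Achievable FPS given per-frame CPU and GPU costs in milliseconds."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 25.0            # hypothetical CPU-bound game logic: 25 ms -> 40 fps ceiling
gpu_native_ms = 33.0     # hypothetical native-res render cost
gpu_upscaled_ms = 18.0   # lower internal res + upscale pass

print(frame_rate(cpu_ms, gpu_native_ms))    # ~30 fps, GPU-limited
print(frame_rate(cpu_ms, gpu_upscaled_ms))  # 40 fps: upscaling freed GPU time,
                                            # but the CPU ceiling is now the wall
```

So upscaling does "free resources", but only GPU resources; it cannot lift a CPU bottleneck.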
 

yamaci17

Member
I'm not really sure what some of you are arguing about. If you talk to developers they are pretty happy with this damn thing.
they will be happy because at least they're not being forced to optimize extreme CPU bound code to hit 30 fps on 1.6 ghz jaguar cores

but they will have no trouble targeting the very same 30 fps on zen 2 cores as well. which is why some people are getting worked up. if ps5 pro focused on CPU upgrade while keeping GPU more or less similar or with slight upgrade + big upgrade on upscaling, it would've been better for high framerate enjoyers

i dont really care about playing games at 30 fps or 60 fps so i dont actually care what ps5 pro ends up with. but it is fun to participate in discussions regardless. if you have to ask me though, I'd prefer more balanced builds rather than xbox one x-like builds where the focus is on resolution and graphics (despite having built a PC with the xbox one x mindset myself; i'm just an odd person overall).

If you told them you built a ryzen 3600 + rtx 4070 rig, they would probably be like "but that cpu will hold that gpu back, why didnt you get something decent and modern that can accompany the 4070 properly". but if sony does it, cerny is a genius, ps5 is not cpu limited at all, rules are different, spiderman runs at this framerate, 3rd parties suck, gta 6 sucks anyways, etc. etc.

it is like someone with 3600 and 4070 getting 80+ fps in spiderman and saying game is optimized and their rig is fine. and when they heavily get bottlenecked in jedi survivor, they blame the developer. when in reality, they could've solved the bottleneck by pairing that 4070 with at least a ryzen 7600. that is the core of the problem. sometimes you gotta give the GPU the CPU it deserves. otherwise you're just limiting the build to specific resolution/framerate parameters. which is okay by itself. you can still get great mileage out of that GPU. but you still squander potential for high framerate experiences

with a 3600 and a 4070, you won't be CPU limited at 4k in a vast majority of titles especially the ones that are released before 2021. more so if you just push ultra settings and 4070 can take it too. but then you try jedi survivor, hogwarts legacy, dragons dogma and quickly realize this CPU is simply not meant for 4070 unless you specifically gimp 4070 and push extreme graphical settings that target 30 FPS. all that because the CPU can't keep up. where is the sense in that? why sacrifice 4k/optimized settings/dlss quality 60 fps experience and go for native 4k, unoptimized ultra ray tracing settings just so that you can saturate GPU at 30 fps target? it is what Cerny is trying to do here by keeping the same CPU and giving the GPU a great ray tracing and mild raster improvements. it is the exact same logic they had with xbox one x and ps4 pro. it is a mistaken approach. but they keep doing it. because people have no trouble with 30 fps indeed on consoles
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
They literally said that CPU bottlenecked games will remain CPU bottlenecked and that adding RT to games may make them more CPU intensive.

They talk about Gotham Knights, BG3 and GTA6 as examples of CPU limited games. They would have talked about more but can't, that means DD2 probably (under NDA).
… and then there is a plethora of games where the CPU cost of adding RT on a console will not be more than what is available and thanks to the new GPU they will be able to do just that.
 

GermanZepp

Member
it started with "gta 6 will have the same game logic as rdr 2/gta 5 so it should run at 60 fps"

now it is going to "if they cant hit 60 fps its all their fault because it is the same as death stranding/forbidden west and those games hit 60 fps on ps5!1"

they already prepare for the inevitable fact of GTA 6 running at 30 fps even on ps5 pro.

if it runs at 60 fps "i told ya so"
if it does not run at 60 fps "it has no rights to run at 30 fps, i told ya so!"

(my personal opinion is that rockstar will destroy all other studios as they always have, and the game will run at 30 fps and push the console CPU to its limits. geometric density, npc counts and draw distance are going to be insane. people just remember things fondly. gta 5 was above most other games, but its draw distance, geometric density and NPC density were actually lackluster - lackluster in a way that was still beyond other games)
Oh I really don't care if it runs at 30 or 60fps, I'm gonna enjoy it nevertheless.

But the technology and the things happening on display in Rdr2 are unmatched imo, so the comparisons always get me half laughing and half triggered over the nonsense.
 
Last edited:
They literally said that CPU bottlenecked games will remain CPU bottlenecked and that adding RT to games may make them more CPU intensive.

They talk about Gotham Knights, BG3 and GTA6 as examples of CPU limited games. They would have talked about more but can't, that means DD2 probably (under NDA).
BG3 is the most CPU heavy game we have and it still has a 60fps mode; one level has trouble but VRR works in most cases. Gotham Knights is years old (likely GPU limited) and GTA6 is not released yet. They just needed to create a new negative narrative (because it's not a new Xbox). But this is complete BS and FUD.
 

Fafalada

Fafracer forever
It's very limited to not consider what the CPU actually does in conjunction with more memory/GPU bandwidth.
4 decades of console history doing the same thing with CPUs, and internet still acts like it's a new shocking development every time it happens.

TBH - the more irksome part is that there are no 'CPU intensive' games out there atm - just games that utilise the CPU badly (Star Citizen doesn't count because by the time it comes to consoles we'll be burying PS7 in the ground, and besides, it's debatable calling it a game in the first place).

I mean literally we've had 2+ decades of GTA games staying the same when it comes to world interaction, and we still accept that always running at 30fps(or below) is because 'but but CPU workload'.
 

Gaiff

SBI’s Resident Gaslighter
… and then there is a plethora of games where the CPU cost of adding RT on a console will not be more than what is available and thanks to the new GPU they will be able to do just that.
If they can somehow offload tasks typically reserved for the CPU on the GPU, sure. If not, then it won't make a difference.
 

ChiefDada

Gold Member
Nah, that’s what was being said. Go back and read the replies i originally replied to, which were themselves replies to me saying the CPU upgrade was not enough. The implication that PSSR would somehow help with the CPU bottleneck by freeing up resources is what i was arguing against.


Think of pristine image quality as the constant; for these reasons, FSR2 and lesser upscalers are off the table. I.e., your only 2 choices to run the game are AI upscaling or native rendering, which for the sake of this discussion provide the same great image quality. Which of the 2 options leaves developers with more resources to play with, assuming excellent image quality is mandatory?
 

SKYF@ll

Member
PS5 Pro: +45% GPU, +10% CPU, +28% RAM bandwidth, PSSR using AI cores (HW) with FSR2 (SW) disabled, +1.2 GB RAM, RT x2-x4

The frame rate will definitely increase.
If the CPU becomes a bottleneck, just reduce the number of NPCs and AI vehicles.
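For rough intuition on those percentages, here is a sketch assuming fps scales linearly with whichever resource is the bottleneck (optimistic for the GPU, roughly right for a pure CPU clock bump):

```python
# Illustrative scaling estimate for the PS5 Pro spec bumps quoted above.
# Assumption: fps scales linearly with the limiting resource.

base_fps = 30.0
gpu_gain, cpu_gain = 1.45, 1.10   # +45% GPU, +10% CPU clock

print(round(base_fps * gpu_gain, 1))  # 43.5 -> noticeable, if purely GPU-limited
print(round(base_fps * cpu_gain, 1))  # 33.0 -> marginal, if purely CPU-limited
```

Which is exactly why the thread keeps splitting: GPU-bound games could see a real uplift, while CPU-bound ones barely move.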
 

Perrott

Gold Member
Where’s the supposed intense CPU logic?

There’s no heavy destruction or physics.

It’s an open world story driven game.
You're really arguing that Red Dead is not CPU intensive? A game in which all non-hostile NPCs follow different sets of routines throughout the entire day? A game that has the most advanced physics & locomotion engine in the industry? A game with more dynamic physics objects and fine-tuned destruction, environment interactivity and smarter AI behaviors than literally all Sony open-worlds combined?

Sorry, but you're making yourself look stupid by trying to downplay Rockstar out of all studios. The only ones capable of reaching their level of fidelity and technical ambition across all fronts are Naughty Dog, and only because they make linear games with fixed time of day settings.
 
Last edited:
Everyone that is unhappy: it’s on 6nm, and there is literally no way of getting more out of it. Could they have waited a year and gone with 4nm? Maybe, but they made the call, and it’s up to Microsoft to beat them.
 

Crayon

Member
I could definitely see some games staying stuck at 30fps but not all of them. Performance modes being pushed down to very low internal resolutions provide a hint there. Some of the 30fps games probably don't go to 60 because it's asking too much of the GPU.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
4 decades of console history doing the same thing with CPUs, and internet still acts like it's a new shocking development every time it happens.

TBH - the more irksome part is that there are no 'CPU intensive' games out there atm - just games that utilise the CPU badly (Star Citizen doesn't count because by the time it comes to consoles we'll be burying PS7 in the ground, and besides, it's debatable calling it a game in the first place).

I mean literally we've had 2+ decades of GTA games staying the same when it comes to world interaction, and we still accept that always running at 30fps(or below) is because 'but but CPU workload'.
I mean 2 decades of GTA games is like 2 games lol.
 

Bojji

Member
… and then there is a plethora of games where the CPU cost of adding RT on a console will not be more than what is available and thanks to the new GPU they will be able to do just that.

Every game becomes more CPU heavy with RT; the CPU needs to do more calculations. In some games this impact is small and in some it's big.

BG3 is the most CPU heavy game we have and it still has a 60fps mode; one level has trouble but VRR works in most cases. Gotham Knights is years old (likely GPU limited) and GTA6 is not released yet. They just needed to create a new negative narrative (because it's not a new Xbox). But this is complete BS and FUD.

We know for sure that GK is CPU limited, and BG3 drops to this framerate (they added a few FPS in a patch):

[BG3 framerate chart]


Big CPU upgrade is needed for 60FPS:

[CPU comparison chart]
 

Gaiff

SBI’s Resident Gaslighter
BG3 is the most CPU heavy game we have and it still has a 60fps mode; one level has trouble but VRR works in most cases. Gotham Knights is years old (likely GPU limited) and GTA6 is not released yet. They just needed to create a new negative narrative (because it's not a new Xbox). But this is complete BS and FUD.
You’re seriously downplaying "one level". It’s most of Act 3 and that’s around 40% of the entire game.
 
they will be happy because at least they're not being forced to optimize extreme CPU bound code to hit 30 fps on 1.6 ghz jaguar cores

but they will have no trouble targeting the very same 30 fps on zen 2 cores as well. which is why some people are getting worked up. if ps5 pro focused on CPU upgrade while keeping GPU more or less similar or with slight upgrade + big upgrade on upscaling, it would've been better for high framerate enjoyers

i dont really care about playing games at 30 fps or 60 fps so i dont actually care what ps5 pro ends up with. but it is fun to participate in discussions regardless. if you have to ask me though, I'd prefer more balanced builds rather than xbox one x-like builds where the focus is on resolution and graphics (despite having built a PC with the xbox one x mindset myself; i'm just an odd person overall).

If you told them you built a ryzen 3600 + rtx 4070 rig, they would probably be like "but that cpu will hold that gpu back, why didnt you get something decent and modern that can accompany the 4070 properly". but if sony does it, cerny is a genius, ps5 is not cpu limited at all, rules are different, spiderman runs at this framerate, 3rd parties suck, gta 6 sucks anyways, etc. etc.

it is like someone with 3600 and 4070 getting 80+ fps in spiderman and saying game is optimized and their rig is fine. and when they heavily get bottlenecked in jedi survivor, they blame the developer. when in reality, they could've solved the bottleneck by pairing that 4070 with at least a ryzen 7600. that is the core of the problem. sometimes you gotta give the GPU the CPU it deserves. otherwise you're just limiting the build to specific resolution/framerate parameters. which is okay by itself. you can still get great mileage out of that GPU. but you still squander potential for high framerate experiences

with a 3600 and a 4070, you won't be CPU limited at 4k in a vast majority of titles especially the ones that are released before 2021. more so if you just push ultra settings and 4070 can take it too. but then you try jedi survivor, hogwarts legacy, dragons dogma and quickly realize this CPU is simply not meant for 4070 unless you specifically gimp 4070 and push extreme graphical settings that target 30 FPS. all that because the CPU can't keep up. where is the sense in that? why sacrifice 4k/optimized settings/dlss quality 60 fps experience and go for native 4k, unoptimized ultra ray tracing settings just so that you can saturate GPU at 30 fps target? it is what Cerny is trying to do here by keeping the same CPU and giving the GPU a great ray tracing and mild raster improvements. it is the exact same logic they had with xbox one x and ps4 pro. it is a mistaken approach. but they keep doing it. because people have no trouble with 30 fps indeed on consoles

God damn it, stop spitting facts, it's pissing me off because it destroys my hopes. This is the sad truth and it's god damn infuriating the way they treat consoles. The funny thing is console gamers are not ok with 30 fps any more than PC gamers are, we just have no friggin choice except to buy a PC. Ok, I take it back about consoles being just as unhappy as PC about 30... there are far too many casuals that accept 30 on console, but again, what other choice is there?

This is exactly what they did with the Pro/X and I knew it would happen. I really do think many more gamers have changed their minds on 30 vs 60 this gen though... look at the Dragons Dogma 2 debacle. Maybe this strategy will backfire this time.

I'm still getting a Pro though, so I guess that makes me a hypocrite. 30 fps sucks on an OLED in a fast paced game or any game that requires fast, precise inputs
 

Bojji

Member
You’re seriously downplaying "one level". It’s most of Act 3 and that’s around 40% of the entire game.

Yeah this drop from 50-60 FPS to below 30 is quite fucking horrifying. It's like "last part of the game was shit" but only talking about FPS.
 

Gp1

Member
More like under 1000$


Pro won't be 500$...

Exactly and not even that.
I already have half of the build with an older GPU that doesn't come close to current gen raw power, but serves me well. I can even wait for a "5060" without any major hassle.
Between that and a PS5 Pro with those specs? No brainer.

But it sure looks like the bottleneck butthurt is hard on some guys here :D
 
Last edited:
You're really arguing that Red Dead is not CPU intensive? A game in which all non-hostile NPCs follow different sets of routines throughout the entire day? A game that has the most advanced physics & locomotion engine in the industry? A game with more dynamic physics objects and fine-tuned destruction, environment interactivity and smarter AI behaviors than literally all Sony open-worlds combined?

Explain how any of that is cpu intensive. Most of that is just attention to detail and design-intensive work, but it’s not magically stressing the hardware more than other top games of that era

They should have spent that budget on half decent gunplay mechanics instead
 

DaGwaphics

Member
Same zen 2 cpu. 10% clock speed boost.

[spec comparison screenshot]


Im out.

I honestly feel like a cache increase would have been worth more than the clock speed bump (look at the G models of Zen 2 in comparison to the high-cache models). It could have been transformative in ways the 10% won't be, IMO.
 

Fafalada

Fafracer forever
I mean 2 decades of GTA games is like 2 games lol.
We got 7 3D ones in this time-frame (9 if you want to count the RDR games as GTA on horses - although I kind of treat them as different) - but anyway.
R* output has only really slowed down in the last decade; it was pretty good until 2013 or so (don't forget they also used to make non-GTA games during that period - but I guess once GTA5 became a forever hit, why bother with anything else).
 

yamaci17

Member
God damn it, stop spitting facts, it's pissing me off because it destroys my hopes. This is the sad truth and it's god damn infuriating the way they treat consoles. The funny thing is console gamers are not ok with 30 fps any more than PC gamers are, we just have no friggin choice except to buy a PC. Ok, I take it back about consoles being just as unhappy as PC about 30... there are far too many casuals that accept 30 on console, but again, what other choice is there?

This is exactly what they did with the Pro/X and I knew it would happen. I really do think many more gamers have changed their minds on 30 vs 60 this gen though... look at the Dragons Dogma 2 debacle. Maybe this strategy will backfire this time.

I'm still getting a Pro though, so I guess that makes me a hypocrite. 30 fps sucks on an OLED in a fast paced game or any game that requires fast, precise inputs
for it to backfire, dragons dogma 2 needs to fail in sales over its performance. it wont happen. and it is not exclusive to the console userbase either. elden ring had absurdly bad performance on PC at launch yet it was a great game, so it got extremely positive reviews on Steam

hogwarts legacy is another case. that game literally cannot cope with 16 gb ram, which is probably what more than half the game's players have. Most of my friends had 16 gb ram while playing that game. I asked them "guys, don't you have problems in hogsmeade" and they reply "i dont care, lmao game is mad fun". a friend streams his game on Discord: in hogwarts, the game stutters every 5 seconds due to extreme data swapping with 16 GB RAM. and here he is, putting 50 hrs into the game, 100%'ing it, leaving a positive review, and saying "what a great game, our dream of visiting hogwarts in a game has finally been fulfilled!"

in a world where devs can get away with extremely stuttery experiences on PC, they will get away with rocky 30 fps on consoles lol

my casual friend was playing baldur's gate 3 with a ryzen 2600, and I ask her "how goes it in act 3, I heard it is rough there". she replies "idk what is wrong with it, I'm playing the game for the 3rd time". when people have fun, they can see past <30 fps, stutters and all that stuff. and they're sadly the majority of the people who actually buy these games. we're just an echo chamber. we should even be glad that they at least had the decency of putting a zen 2 CPU in there. nothing stopped them from putting a 3 ghz zen cpu there
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
We got 7 3d ones in this time-frame (9 if you want to count RDR games as GTA on horses - although I kind of treat them as different) - but anyway.
R* output has only really slowed down in last decade, it was pretty good until 2013 or so (don't forget they also used to make non-GTA games during that period - but I guess once GTA5 became a forever hit - why bother with anything else).
I was counting the mainline ones. SA, IV, and V. Didn’t know exactly if SA was 20 years old or not.
 
You’re seriously downplaying "one level". It’s most of Act 3 and that’s around 40% of the entire game.
Maybe, but AFAIK it's still the only really CPU limited game we have. And it runs even worse on Xbox (with its faster CPU). What does that tell us?

But even there the game would run 10% better on PS5 Pro. One game out of ~3000.
 
Glad they touched upon the lower clocks, which go against Cerny's previous philosophy. Strange that he didn't opt for fewer CUs clocked higher.
Higher clocks means more heat, which means bigger heat sink, which increases the weight, which increases the shipment costs.
 

Fafalada

Fafracer forever
It could have been transformative in ways that the 10% won't be, IMO.
Console CPUs have gone for decades with far more limited caches (or cut-down versions of customised PC parts) than their desktop counterparts. It matters a whole lot less on consoles than it does on PCs - the more fine-grained control of memory and code-paths helps. It is fair to say software has gotten less optimal and does need more cache - but then caches have grown on consoles as well.
Basically I'm not arguing that more cache wouldn't be nice - but I wouldn't put my hand in the fire that it would beat the clock increase at all. E.g. on PS4 Pro, CPU performance scaled almost entirely linearly with the boosted clocks - a perfect 30% (and those CPUs had much worse cache/memory subsystems to contend with), while the GPU almost never hit its 2.3x increase outside of specialised scenarios.
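For reference, that clock figure follows directly from the actual Jaguar frequencies (1.6 GHz on PS4, 2.13 GHz on PS4 Pro), which land slightly above the 30% cited:

```python
# PS4 -> PS4 Pro Jaguar CPU clock uplift, computed from the shipped frequencies.

ps4_clock, ps4_pro_clock = 1.6, 2.13   # GHz
uplift = ps4_pro_clock / ps4_clock - 1
print(f"{uplift:.0%}")                 # 33% -> matching the near-linear fps scaling
```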

I was counting the mainline ones. SA, IV, and V. Didn’t know exactly if SA was 20 years old or not.
Well, I did say 20+ years ;) I counted starting from III. And the two non-mainline entries are still the same GTA formula in every respect that mattered - plenty of 'mainline' team people were involved in those too.
 
Backcompat info! All games can benefit from PSSR if developers do the work to patch it in.
Hardware upscalers like DLSS and PSSR are completely dependent on geometry and distance from the camera. Modders have built DLSS and software FSR2 into DX12 titles by piggybacking off the motion vectors that existing motion blur already uses.

We don't know, but Sony CAN build in PSSR automation through their usual API channels.

Running everything at 1080p/60 internally and then using automated proprietary hardware upscaling to output a clean 1440p/60 is absolutely possible. I'm floored by the knee-jerk misinformation in some of the above posts
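The pixel arithmetic behind that 1080p-internal to 1440p-output example: the GPU renders just over half the output pixels and the upscaler synthesizes the rest.

```python
# Pixel counts for a 1080p internal render upscaled to a 1440p output.

internal = 1920 * 1080   # 2,073,600 pixels rendered
output = 2560 * 1440     # 3,686,400 pixels displayed
print(internal / output) # 0.5625 -> ~56% of output pixels are actually rendered
```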
 
Last edited:
Exactly and not even that.
I already have half of the build with an older GPU that doesn't come close to current gen raw power, but serves me well. I can even wait for a "5060" without any major hassle.
Between that and a PS5 Pro with those specs? No brainer.

But it sure looks like the bottleneck butthurt is hard on some guys here :D

I'm sure everyone will sell their PS5 and move to the same PC you have

PS5 Pro will sell 0 units....

Happy now?
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
If they can somehow offload tasks typically reserved for the CPU on the GPU, sure. If not, then it won't make a difference.
I know, and they may be able to offload and/or have spare CPU capacity. There is a difference between locking at 30 FPS and “cannot run at anything above 30”. The game could run at 38 FPS, and without LFC the dev may decide to sacrifice those 8 FPS and invest them in more RT resources instead.
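A small sketch of why an unlocked 38 FPS usually ships as a 30 FPS cap on a fixed 60 Hz display, assuming no VRR/LFC and that even frame pacing requires each frame to occupy a whole number of refresh intervals:

```python
import math

def locked_fps(sustainable_fps, refresh_hz=60):
    """Highest evenly paced frame rate at or below what the game can sustain
    on a fixed-refresh display (no VRR)."""
    refreshes_per_frame = math.ceil(refresh_hz / sustainable_fps)
    return refresh_hz / refreshes_per_frame

print(locked_fps(38))  # 30.0 -> the spare ~8 fps can be reinvested in RT instead
print(locked_fps(60))  # 60.0
```

So "could run at 38" and "locked at 30" are not a contradiction; the cap is a pacing decision, not a hard performance ceiling.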
 

yamaci17

Member
Not as afraid as PS5 owners seeing this:


That 45% upgrade won't help...
at some point I also feel bad for the ps5 userbase actually. it seems like a scam. PSSR will allow insanely better image quality output than fsr 2 if it can fit somewhere between xess and dlss, so it means that ps5 pro will be able to have better quality per pixel (the argument that i've been using for dlss for a while now). my friend who bought a ps5 as his first console is actually fuming if pssr does not end up on ps5 (and it most likely won't...)

while the 45% upgrade won't help that much, we can say that perceived image quality will probably see a huge upgrade. this never happened between ps4 and ps4 pro; ps4 pro just had fancy checkerboarding, but that was to make 4k workable. pssr on ps5 pro and fsr 2 on ps5 will create a weird situation. ps4 still forced developers to optimize competent TAA so that ps4 pro could checkerboard its way in. now most devs will slap fsr 2 + pssr on the respective consoles and we will have the same weird disparity we're having on pc between amd and nvidia cards

i'm glad even people who paid 300 bucks for a 2060 back in 2019 do not have these kind of issues though. :)

pssr should've been there on ps5, supported by hardware at launch. it is interesting really. it seems like ps5 was not designed with future proof upscaling in mind. it is definitely not a worthy successor to the PS4 imo.
 
Last edited:

Chiggs

Member
If I can find one in stock without any hassle, I'll pick one up. If not, oh well...I won't be missing out on much.
 