
Digital Foundry: Stray PS5 vs PS4/PC: A Superb 4K 60FPS Rendition on PS5 - But What’s Up With PC?

Skifi28

Member
What part of "limited budget" don't you understand?
"Limited budget" doesn't say much; it's just an easy excuse to use on forums without knowing their actual budget, how they divided it among platforms, or what perhaps went wrong. If your budget is that strict, then don't release a PS4 version at all. But you want the extra sales without making the extra effort? (Is an increased resolution on better hardware really so straining on a game's budget?) Maybe it's a bug at release that slipped past QA, or they're already working on a fix for the first update; we don't really know. I just don't like the excuses. First it's Sony's fault, then a budget we don't know much about, and then something else.

Personally I'm of the opinion that developer studios should be held accountable (in a civil manner) for issues. That's how issues often get solved; just accepting things as they are won't do much.
 
Last edited:
Stutters are one of the reasons; other than that:

- with a Series X there was no need for a gaming GPU
- GPU mining was no longer worth it (the GPU was on 24/7)
- I sold it for twice the price I bought it for after over a year of use; this was in February, before the mining crash

PC versions more often than not aren't up to par, something is fucked, and right now most console games have 60 FPS modes, so gaming on them is quite good.

When devs decide to go 30 FPS only again, I will just buy Lovelace or RDNA3, go back to PCMR, and hope stutters are fixed by then :messenger_grinning_sweat:
You'd think that by the time consoles stop offering 60 and 120 FPS modes they would have Pro models.
 
Not all games/engines scale in a linear fashion. The biggest issue with the Uncharted remasters was not using a dynamic resolution up to 4K instead of a "safe" 1440p; that would have made a big difference.
Shouldn't need dynamic res; they literally slapped the PS4 version on with no optimization and basically had the PS5 brute-force it. That's why it has disappointing results.
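For reference, dynamic resolution in most engines just means nudging the render-target scale against a GPU frame-time budget each frame; a minimal sketch of the idea is below (the numbers, struct and function names are illustrative only, not any particular engine's API):

```cpp
// Minimal sketch of frame-time-driven dynamic resolution (illustrative only,
// not any particular engine's API). Budget: 16.6 ms for 60 fps, 33.3 ms for 30 fps.
#include <algorithm>
#include <cstdio>

struct DynamicRes {
    float scale    = 1.0f;   // fraction of max resolution per axis
    float minScale = 0.70f;  // e.g. never drop far below ~1440p when the max is 4K
    float targetMs = 16.6f;  // GPU frame-time budget

    void update(float gpuFrameMs) {
        // Nudge the render scale toward whatever keeps the GPU inside budget.
        float error = gpuFrameMs / targetMs;            // >1 means over budget
        scale = std::clamp(scale / error, minScale, 1.0f);
    }
};

int main() {
    DynamicRes dr;
    const int maxW = 3840, maxH = 2160;                 // 4K output target
    float sampleFrames[] = {14.0f, 18.5f, 22.0f, 15.0f};
    for (float ms : sampleFrames) {
        dr.update(ms);
        std::printf("GPU %.1f ms -> render at %dx%d\n",
                    ms, int(maxW * dr.scale), int(maxH * dr.scale));
    }
}
```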
 

assurdum

Banned
"Limited budget" doesn't say much; it's just an easy excuse to use on forums without knowing their actual budget, how they divided it among platforms, or what perhaps went wrong. If your budget is that strict, then don't release a PS4 version at all. But you want the extra sales without making the extra effort? (Is an increased resolution on better hardware really so straining on a game's budget?) Maybe it's a bug at release that slipped past QA, or they're already working on a fix for the first update; we don't really know. I just don't like the excuses. First it's Sony's fault, then a budget we don't know much about, and then something else.

Personally I'm of the opinion that developer studios should be held accountable (in a civil manner) for issues. That's how issues often get solved; just accepting things as they are won't do much.
It's not an excuse. Do you know the size of the studio and their resources? From what I read it was really minuscule, and now some of you expect higher res than 1080p when UE4 isn't exactly that easy to scale above it on PS4 Pro. No offence, but at least try to be a bit more informed about stuff like this before complaining about their laziness.
Shouldn't need dynamic res; they literally slapped the PS4 version on with no optimization and basically had the PS5 brute-force it. That's why it has disappointing results.
OK, let's face the cruel reality: how many UE4 games do you know of that run at a higher resolution than 1080p on PS4 Pro? Because to my knowledge they are a very restricted selection of games.
 
Last edited:

Erebus

Member
What part of the limited budget of an indie studio is complicated to grasp?

You think if it was that easily feasible, they wouldn't have done it? It probably required too much investment for them. Keep in mind the PS4 Pro has very limited bandwidth. And honestly I don't know of many UE4 games with a higher res than 1080p on PS4 Pro. UE4 has always been fucking expensive for the Pro hardware.
I don't understand why you're trying so hard to defend an obvious oversight on the developers' behalf. You're telling me that a game that runs at an almost stable 1080p30 on base PS4 can't run at a higher resolution on PS4 Pro, keeping everything else equal? That's ludicrous.
 
Last edited:

Skifi28

Member
It's not an excuse. Do you know the size of the studio and their resources? From what I read it was really minuscule, and now some of you expect higher res than 1080p when UE4 isn't exactly that easy to scale above it on PS4 Pro. No offence, but at least try to be a bit more informed about stuff like this before complaining.
I don't know and you don't either; that was the point I was making. You just throw words around like "Sony's fault" or "indie" or "budget" or "UE4" without any context on what might have actually happened. You want me to be informed, but you don't really have any information yourself other than "leave Britney alone".

I'm also not sure about the offence part? Who took offence? I sure didn't, and neither am I offending anyone. I'm just trying to have a rational discussion about what happened and what could be improved. I don't see why I can't point out technical issues with a game release. Again, this is how things get fixed, by pointing them out. If you're satisfied with the technical aspect of the game, more power to you, go and enjoy it.
 
Last edited:
It's not an excuse. Do you know the size of the studio and their resources? From what I read it was really minuscule, and now some of you expect higher res than 1080p when UE4 isn't exactly that easy to scale above it on PS4 Pro. No offence, but at least try to be a bit more informed about stuff like this before complaining about their laziness.

OK, let's face the cruel reality: how many UE4 games do you know of that run at a higher resolution than 1080p on PS4 Pro? Because to my knowledge they are a very restricted selection of games.
I wasn't talking about the PS4 Pro.
 

assurdum

Banned
I don't understand why you're trying so hard to defend an obvious oversight on the developers' behalf. You're telling me that a game that runs at an almost stable 1080p30 on base PS4 can't run at a higher resolution on PS4 Pro, keeping everything else equal?
I'm not trying to defend anything. I said UE4 doesn't scale easily in resolution on PS4 Pro because it has very limited bandwidth. Unlike some of you, I'm not raising my voice without checking what the typical performance limitation of UE4 on PS4 Pro was, which, surprise, is a 1080p framebuffer. I don't find anything shocking about it considering the size of the studio.
 
Last edited:

yamaci17

Member
I'm a console gamer and was planning to build my first PC for both 3D modeling and animation, as well as a secondary gaming platform. I was looking into the 4090/7900 XT (likely the 7900 XT at this point). Are there really all these problems on PC? I was supposed to have the Halo Series X from last year, but my order conveniently got canceled and I got screwed.
UE4 has problems, no one can deny that, and there are no fixes in sight. Next gen will be full of heavily demanding UE5 titles. I would hold off on building a PC right now and get a console instead. The next-gen Witcher, next-gen Tomb Raider and many more blockbuster titles will be made in UE5. I don't know how long they can ignore this issue, but we shall see.
 

yamaci17

Member
Sorry, my mistake. Anyway, 1080p is definitely too much to handle on base PS4 with UE4; that's probably the reason. 900p is the typical framebuffer choice on PS4.
What do you mean? Even RDR 2 holds a consistent, rock-solid 1080p on base PS4. This game could've run at 1080p 50-60 fps if not for the pathetic Jaguar CPU.



Stray is the case of The Witcher 3 all over again. The PS4 GPU was capable of running The Witcher 3 at 1080p/60 fps, since equivalent hardware such as the 1050 Ti can run it like that with a competent CPU. (I'm giving the 1050 Ti example because it performs exactly like the PS4 in RDR 2: 1080p at 34-38 frames, a tad above the PS4, but the PS4 also has headroom to lock to a rock-solid 30 fps, so it checks out.) The 1050 Ti is the perfectly matched GPU, with 2.1 TFLOPS of computational power against the PS4's 1.8 TFLOPS; the lack of async and low-level API features practically eliminates the 16-25% advantage a 1050 Ti would have over the PS4 in most cases, therefore they're a perfect match. This is why it was really childish of DF to match a PS4 to a 750 Ti (especially considering the 750 Ti is a mere 1.3 TFLOPS GPU: the PS4 is 38% stronger computationally, add 25-30% for the async and low-level performance advantage, and you easily get 70-80%. That easily explains why the 750 Ti can only match the PS4's performance at 720p/800p in recent titles that max out the PS4, such as RDR 2, Detroit: Become Human and AC Valhalla, although the 750 Ti in most cases will be able to match the Xbox One's performance).

I would even say they could've gone for 1440p/30 on PS4 and 4K/30 on PS4 Pro. It seems like they didn't care much; this game is very easy on GPUs.

Stray isn't even maxing out the PS4's GPU; it is most likely bound to 30 fps due to the anemic CPU. Do you really think The Witcher 3 from 2015 fills the GPU of a PS4? It's unlikely.
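To sanity-check those percentages, here is a quick back-of-the-envelope calculation using the TFLOPS figures quoted above (the 25-30% async/low-level uplift is the post's estimate, not a measured number):

```cpp
// Back-of-the-envelope check of the TFLOPS ratios quoted above.
// The 25-30% async/low-level-API uplift is the poster's estimate, not a measurement.
#include <cstdio>

int main() {
    const double ps4    = 1.8;  // PS4 GPU compute, TFLOPS (as quoted)
    const double ti1050 = 2.1;  // GTX 1050 Ti
    const double ti750  = 1.3;  // GTX 750 Ti

    // Raw compute advantages
    std::printf("1050 Ti over PS4: %.0f%%\n", (ti1050 / ps4 - 1.0) * 100.0); // ~17%
    std::printf("PS4 over 750 Ti:  %.0f%%\n", (ps4 / ti750 - 1.0) * 100.0);  // ~38%

    // Layer the claimed 25-30% console uplift on top of the 750 Ti gap
    std::printf("PS4 over 750 Ti with uplift: %.0f%% to %.0f%%\n",
                (ps4 * 1.25 / ti750 - 1.0) * 100.0,   // ~73%
                (ps4 * 1.30 / ti750 - 1.0) * 100.0);  // ~80%
}
```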
 
Last edited:
UE4 has problems, no one can deny that, and there are no fixes in sight. Next gen will be full of heavily demanding UE5 titles. I would hold off on building a PC right now and get a console instead. The next-gen Witcher, next-gen Tomb Raider and many more blockbuster titles will be made in UE5. I don't know how long they can ignore this issue, but we shall see.
I'm still planning on getting a PS5, and I should have had a Series X right now, but alas.
 

assurdum

Banned
What do you mean? Even RDR 2 holds a consistent, rock-solid 1080p on base PS4. This game could've run at 1080p 50-60 fps if not for the pathetic Jaguar CPU.



Stray is the case of The Witcher 3 all over again. The PS4 GPU was capable of running The Witcher 3 at 1080p/60 fps, since equivalent hardware such as the 1050 Ti can run it like that with a competent CPU. (I'm giving the 1050 Ti example because it performs exactly like the PS4 in RDR 2: 1080p at 34-38 frames, a tad above the PS4, but the PS4 also has headroom to lock to a rock-solid 30 fps, so it checks out.) The 1050 Ti is the perfectly matched GPU, with 2.1 TFLOPS of computational power against the PS4's 1.8 TFLOPS; the lack of async and low-level API features practically eliminates the 16-25% advantage a 1050 Ti would have over the PS4 in most cases, therefore they're a perfect match. This is why it was really childish of DF to match a PS4 to a 750 Ti.

I would even say they could've gone for 1440p/30 on PS4 and 4K/30 on PS4 Pro. It seems like they didn't care much; this game is very easy on GPUs.

Stray isn't even maxing out the PS4's GPU; it is most likely bound to 30 fps due to the anemic CPU. Do you really think The Witcher 3 from 2015 fills the GPU of a PS4? It's unlikely.

RDR2 was 900p on PS4, eh. Anyway, UE4 hardly ever reached a higher resolution than 1080p in multiplats on Pro because it was extremely expensive for such hardware, and I don't follow the logic of comparing games with totally different engines.
 
Last edited:

yamaci17

Member
RDR2 was 900p on PS4, eh. Anyway, UE4 hardly ever reached a higher resolution than 1080p in multiplats on Pro, so I don't follow the logic of mentioning games with totally different engines.
"As expected, pixel-counts on PlayStation 4 resolve at the de facto standard 1080p, while the Xbox One S - quantifiably the lowest quality experience - delivers just 864p."

RDR 2 on PS4 has frame drops here and there, especially in towns, but it is not even GPU related. This is why they don't employ a dynamic resolution solution to reduce the resolution; it would be pointless. The anemic CPU was dropping frames in towns. The PS4 had a very decent GPU for its time; it is just being held back by the anemic CPU.
 

assurdum

Banned
"As expected, pixel-counts on PlayStation 4 resolve at the de facto standard 1080p, while the Xbox One S - quantifiably the lowest quality experience - delivers just 864p."

RDR 2 on PS4 has frame drops here and there, especially in towns, but it is not even GPU related. This is why they don't employ a dynamic resolution solution to reduce the resolution; it would be pointless. The anemic CPU was dropping frames in towns. The PS4 had a very decent GPU for its time; it is just being held back by the anemic CPU.
Again, what the hell does RDR2 have to do with UE4 on PS4 Pro? It's a completely different engine. Go check what the most common resolution for UE4 on PS4 Pro was. My mistake about the RDR2 resolution.
 
Last edited:

yamaci17

Member
Again, what the hell does RDR2 have to do with UE4 on PS4 Pro? It's a completely different engine. Go check what the most common resolution for UE4 on PS4 Pro was.
"Anyway UE4 hardly reached higher resolution than 1080p on multiplat on pro so I don't follow the logic to mention games with totally different engines."

This makes no sense. I'm sure there are tons of multiplat UE4 games that run well above 1080p on PS4 Pro. This game is an outlier.

You're really downplaying the base PS4 and, by extension, the PS4 Pro.
 

assurdum

Banned
"Anyway UE4 hardly reached higher resolution than 1080p on multiplat on pro so I don't follow the logic to mention games with totally different engines."

this makes no sense. i'm sure there are tons of multiplat ue4 games that run well above 1080p on ps4 pro. this game is an outlier.

you're really downplaying base ps4 and by extension ps4 pro
They are not tons but a very small selection of games. Go check some DF videos about UE4: 1080p was the typical output on Pro, 1440p on One X. And I'm not downplaying anything; UE4 was hellishly expensive for the past generation.
 
Last edited:

yamaci17

Member
They are not tons but a very small selection of games. Go check some DF videos about UE4: 1080p was the typical output on Pro, 1440p on One X. And I'm not downplaying anything; UE4 was hellishly expensive for the past generation.
I don't know, man,



The game almost runs at 1440p/60 fps on a 6-year-old GTX 1060.

You really mean to tell me that UE4 taps out the PS4 Pro at 1080p/30 fps?
 

DeepEnigma

Gold Member
This shows that a 1080p 30 fps game on base PS4 can scale up to 4K 60 on PS5 on top of higher settings, despite what is popularly said. This just makes the Uncharted remasters look even more pathetic.
Donald Trump GIF by reactionseditor
 

assurdum

Banned
I don't know, man,



The game almost runs at 1440p/60 fps on a 6-year-old GTX 1060.

You really mean to tell me that UE4 taps out the PS4 Pro at 1080p/30 fps?

Have you seen what resolution UE4 typically runs at on Pro? And the GTX 1060 has a much faster GPU frequency than the PS4 Pro, from what I've seen on the net. I deleted the part about the bandwidth, I was wrong, but the frequency is noticeably higher on the Nvidia GPU. The PS4 Pro GPU is very slow.
 
Last edited:
What do you mean? Even RDR 2 holds a consistent, rock-solid 1080p on base PS4. This game could've run at 1080p 50-60 fps if not for the pathetic Jaguar CPU.



Stray is the case of The Witcher 3 all over again. The PS4 GPU was capable of running The Witcher 3 at 1080p/60 fps, since equivalent hardware such as the 1050 Ti can run it like that with a competent CPU. (I'm giving the 1050 Ti example because it performs exactly like the PS4 in RDR 2: 1080p at 34-38 frames, a tad above the PS4, but the PS4 also has headroom to lock to a rock-solid 30 fps, so it checks out.) The 1050 Ti is the perfectly matched GPU, with 2.1 TFLOPS of computational power against the PS4's 1.8 TFLOPS; the lack of async and low-level API features practically eliminates the 16-25% advantage a 1050 Ti would have over the PS4 in most cases, therefore they're a perfect match. This is why it was really childish of DF to match a PS4 to a 750 Ti (especially considering the 750 Ti is a mere 1.3 TFLOPS GPU: the PS4 is 38% stronger computationally, add 25-30% for the async and low-level performance advantage, and you easily get 70-80%. That easily explains why the 750 Ti can only match the PS4's performance at 720p/800p in recent titles that max out the PS4, such as RDR 2, Detroit: Become Human and AC Valhalla, although the 750 Ti in most cases will be able to match the Xbox One's performance).

I would even say they could've gone for 1440p/30 on PS4 and 4K/30 on PS4 Pro. It seems like they didn't care much; this game is very easy on GPUs.

Stray isn't even maxing out the PS4's GPU; it is most likely bound to 30 fps due to the anemic CPU. Do you really think The Witcher 3 from 2015 fills the GPU of a PS4? It's unlikely.

A 1050 Ti 4 GB is absolutely a tier above the PS4.

The Witcher 3 could not run at 60 fps on PS4, you are talking crazy; that's why it has fps dips in the bog with all the alpha effects. That is not CPU related. Even a simple Igni sign can still cause dips. BUT I actually could run it at 60 fps on a 1050 Ti, albeit at low settings but with max textures, which is better than the PS4 settings. I think maybe I had crowd density at high/max as well.

The 750 Ti, yes, is worse than the PS4.
 
Last edited:
I don't know, man,



The game almost runs at 1440p/60 fps on a 6-year-old GTX 1060.

You really mean to tell me that UE4 taps out the PS4 Pro at 1080p/30 fps?

The Pro should probably be at 1440p-1600p/30 fps or something like that. The Pro is weaker than a 1060, and that's a bigger gap than 1050 Ti > PS4.
 

assurdum

Banned
The Pro should probably be at 1440p-1600p/30 fps or something like that. The Pro is weaker than a 1060, and that's a bigger gap than 1050 Ti > PS4.
1440p-1600p seems like a really high expectation for the PS4 Pro. It's not that flexible a machine in terms of resolution scalability; only CBR or some sort of resolution reconstruction helps it avoid its bottlenecks. The only UE4 game I know of with an extremely high resolution is the FF7 remake, but I wouldn't expect that to be the normal standard res for UE4 on the Pro.
 
Last edited:
1440p-1600p seems like a really high expectation for the PS4 Pro. It's not that flexible a machine in terms of resolution scalability; only CBR or reconstruction helps it avoid its bottlenecks.
Plenty of games go from a native 1080p on base PS4 to 1600p on Pro. Certainly 1440p is easy to achieve if base PS4 is 1080p.

Far Cry 5 is 1080p on base, 1600p non-checkerboard on Pro.

Uncharted 4 is 1080p on base, 1440p on Pro.
 

assurdum

Banned
Plenty of games go from a native 1080p on base PS4 to 1600p on Pro. Certainly 1440p is easy to achieve if base PS4 is 1080p.

Far Cry 5 is 1080p on base, 1600p non-checkerboard on Pro.

Uncharted 4 is 1080p on base, 1440p on Pro.
Where have you seen that plenty of games are native 1600p on Pro, out of curiosity? Because I had one, and it was quite rare. Ubisoft games use DRS, not a fixed resolution. Most games, if not all, that are higher than 1440p on Pro are CBR, reconstructed or DRS.
 
Last edited:

yamaci17

Member
A 1050 Ti 4 GB is absolutely a tier above the PS4.

The Witcher 3 could not run at 60 fps on PS4, you are talking crazy; that's why it has fps dips in the bog with all the alpha effects. That is not CPU related. Even a simple Igni sign can still cause dips. BUT I actually could run it at 60 fps on a 1050 Ti, albeit at low settings but with max textures, which is better than the PS4 settings. I think maybe I had crowd density at high/max as well.

The 750 Ti, yes, is worse than the PS4.
It will depend on the port. As I said, you could run The Witcher 3 at 1080p/60 on a 1050 Ti. The PS4 GPU also had grunt. RDR2/God of War on PC demand a 2-2.5 times more powerful GPU for similar framerates; this alone proves that The Witcher 3 barely taps into the PS4's actual GPU grunt. I simply refuse to believe The Witcher 3 used the PS4 GPU to its maximum potential. The fps dips I mentioned were in Novigrad; I don't know about the bog fps drop, however.

What I know is that the 1050 Ti is capable of 1080p/33-40 fps in recent games that max out a PS4 at a 1080p/30 fps target.



It's still good performance, but nowhere near what you would call "one tier above". Back in 2013-2016, you could run lots of lighter games at 1080p/60 fps with a 1050 Ti and a better CPU. As I said, this is why games like God of War and RDR2 went full ham on graphics while their gameplay still resembles something that came out of 2011: they had to push graphics to fill the GPU while keeping the complexity/gameplay mechanics/physics to a minimum to maintain a solid framerate on the Jaguar CPUs.

The Witcher 3 neither had the graphics of RDR 2/GoW nor any more advanced physics than them (at best I would say they're on par).

I literally had an R7 260X. It is well below what the PS4 is capable of in terms of graphical compute power, and with that GPU I managed to run The Witcher 3 at 1080p/40-45 fps. Nowadays it runs RDR 2 at 800p/30 fps at the lowest settings, due to being a very ancient GCN GPU limited to 2 GB of VRAM. It is clear that The Witcher 3 and similar games from 2013-2016 couldn't tap into the PS4's total graphical power, partly because they had to limit the framerate to 30 due to the CPU.
 
Last edited:
It will depend on the port. As I said, you could run The Witcher 3 at 1080p/60 on a 1050 Ti. The PS4 GPU also had grunt. RDR2/God of War on PC demand a 2-2.5 times more powerful GPU for similar framerates; this alone proves that The Witcher 3 barely taps into the PS4's actual GPU grunt. I simply refuse to believe The Witcher 3 used the PS4 GPU to its maximum potential. The fps dips I mentioned were in Novigrad; I don't know about the bog fps drop, however.

What I know is that the 1050 Ti is capable of 1080p/33-40 fps in recent games that max out a PS4 at a 1080p/30 fps target.



It's still good performance, but nowhere near what you would call "one tier above". Back in 2013-2016, you could run lots of lighter games at 1080p/60 fps with a 1050 Ti and a better CPU. As I said, this is why games like God of War and RDR2 went full ham on graphics while their gameplay still resembles something that came out of 2011: they had to push graphics to fill the GPU while keeping the complexity/gameplay mechanics/physics to a minimum to maintain a solid framerate on the Jaguar CPUs.

The Witcher 3 neither had the graphics of RDR 2/GoW nor any more advanced physics than them (at best I would say they're on par).

I literally had an R7 260X. It is well below what the PS4 is capable of in terms of graphical compute power, and with that GPU I managed to run The Witcher 3 at 1080p/40-45 fps. Nowadays it runs RDR 2 at 800p/30 fps at the lowest settings, due to being a very ancient GCN GPU limited to 2 GB of VRAM. It is clear that The Witcher 3 and similar games from 2013-2016 couldn't tap into the PS4's total graphical power, partly because they had to limit the framerate to 30 due to the CPU.

We can agree that Red Dead 2 is a little bit unoptimized on PC, right?

Also, as you said, the PS4 has that async compute advantage, so it's safe to say Red Dead was a little more suited to the PS4 architecture than Pascal, despite even little Pascal being faster.

Regarding God of War, I would say the same about Pascal's weaker async compute, but also driver optimization must have been lacking at that point.

The 1050 Ti is definitely a tier above. Not hugely so, I would say, but still: more flops on a more efficient architecture that focuses on overall rasterization and not just compute, and more VRAM than the PS4.
 
Also, here is Red Dead 2 running on a 1050 Ti at low settings with max textures, getting between 40 and 50 fps. Perfect for FreeSync. Plus you can disable the extremely blurry TAA and inject some SMAA 1x. I would definitely call that a tier above the PS4, yamaci17, and keep in mind the PS4 isn't locked to 30 either, so even without FreeSync/G-Sync that's a better, faster experience on the 1050 Ti.

 
Last edited:

gypsygib

Member
Why are game devs so scared of compiling shaders at the start of the game? Monster Hunter Rise does it every once in a while and it works fine.
Right?! If they're worried about ruining people's first impression with a delay at startup, at least give an option in the settings to compile at startup.
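For what it's worth, what's being asked for is basically enumerating the known shader/PSO permutations up front and compiling them behind a loading screen or an opt-in settings toggle, so the hitch happens once instead of mid-gameplay. A minimal sketch of that pattern (every type and function name here is a hypothetical stand-in, not UE4's or any engine's real API):

```cpp
// Illustrative sketch of up-front shader/PSO warming behind a loading screen.
// All types and functions here are hypothetical stand-ins, not a real engine API.
#include <cstdio>
#include <string>
#include <vector>

struct PipelineDesc { std::string name; };            // stand-in for a PSO description
struct Pipeline {};                                    // stand-in for a compiled state object

Pipeline CompilePipeline(const PipelineDesc& d) {      // the expensive driver-side compile
    std::printf("compiling %s...\n", d.name.c_str());
    return {};
}

void WarmPipelines(const std::vector<PipelineDesc>& descs,
                   void (*onProgress)(float)) {
    std::vector<Pipeline> cache;
    cache.reserve(descs.size());
    for (size_t i = 0; i < descs.size(); ++i) {
        cache.push_back(CompilePipeline(descs[i]));    // hitch happens here, not mid-gameplay
        onProgress(float(i + 1) / descs.size());       // drive the loading bar
    }
}

int main() {
    std::vector<PipelineDesc> known = {{"opaque_lit"}, {"skinned_lit"}, {"particles_additive"}};
    WarmPipelines(known, [](float p) { std::printf("  %.0f%%\n", p * 100.f); });
}
```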
 
Last edited:

Mr Moose

Member
Also, here is Red Dead 2 running on a 1050 Ti at low settings with max textures, getting between 40 and 50 fps. Perfect for FreeSync. Plus you can disable the extremely blurry TAA and inject some SMAA 1x. I would definitely call that a tier above the PS4, yamaci17, and keep in mind the PS4 isn't locked to 30 either, so even without FreeSync/G-Sync that's a better, faster experience on the 1050 Ti.


An overclocked 1050 Ti and a CPU that costs as much as the PS4 at launch? :pie_thinking:
 

Mr Moose

Member
Don't move the goalposts.

You don't need a 5900X to max out that 1050 Ti. A 2600K would do.
Moving the goalposts of what? I never mentioned anything on this subject before; I was just asking a question.
What settings is the PS4 using? Low? Does the 1050 Ti need to be overclocked? How much did that overclock give in performance?
 
Moving the goalposts of what? I never mentioned anything on this subject before; I was just asking a question.
What settings is the PS4 using? Low? Does the 1050 Ti need to be overclocked? How much did that overclock give in performance?
What does it being overclocked or not have to do with it? The 1050 Ti is a very low-wattage card, so overclocking is free performance, and Pascal overclocked very well.

The goalpost is whether the 1050 Ti is a significant step above the PS4 or not. Not how much it costs (it was $130-140), not "hey look, it has an extremely, unnecessarily powerful CPU attached to it on that particular PC".
 
Last edited:

Mr Moose

Member
What does it being overclocked or not have to do with it? The 1050 Ti is a very low-wattage card, so overclocking is free performance, and Pascal overclocked very well.

The goalpost is whether the 1050 Ti is a significant step above the PS4 or not. Not how much it costs (it was $130-140), not "hey look, it has an extremely, unnecessarily powerful CPU attached to it on that particular PC".
I see, so you have to overclock it. Thanks.
 

Lysandros

Member
There is no question that the 1050 Ti is noticeably more powerful than the PS4, with the exception of async compute capabilities and bandwidth. The difference should be pretty slim though, maybe around 15%, and the 1050 Ti being coupled with a proper PC CPU certainly helps.
 
Last edited:
I see, so you have to overclock it. Thanks.
That was really weird, and you dodged the question. No, the 1050 Ti doesn't have to be overclocked to spank the base PS4; I didn't overclock my old 1050 Ti to get The Witcher 3 running at 60 fps, as I mentioned above.

Seriously, what is with this weirdness from you? They're just electronics.
 
Last edited:

Mr Moose

Member
That was really weird, and you dodged the question. No, the 1050 Ti doesn't have to be overclocked to spank the base PS4; I didn't overclock my old 1050 Ti to get The Witcher 3 running at 60 fps, as I mentioned above.

Seriously, what is with this weirdness from you? They're just electronics.
I dodged your question? You didn't even attempt to answer mine, you just said "why would overclocking matter?" Overclocking gives better performance.
What settings is the PS4 using? Low? Does the 1050 Ti need to be overclocked? How much did that overclock give in performance?
 
There is no question that the 1050 Ti is noticeably more powerful than the PS4, with the exception of async compute capabilities and bandwidth. The difference should be pretty slim though, maybe around 15%, and the 1050 Ti being coupled with a proper PC CPU certainly helps.
That's the right assessment overall. It's basically 15 to 20 percent better than the PS4, BUT having non-gimped CPUs on PC can unlock the full potential of the 1050 Ti.

Although regarding bandwidth, it doesn't really have less than the PS4, because Pascal has much better color compression (Maxwell already had that advantage over the PS4). Plus the PS4 has to share bandwidth with the CPU.
 
I dodged your question? You didn't even attempt to answer mine, you just said "why would overclocking matter?" Overclocking gives better performance.
The PS4 uses high textures and a mix of medium and low settings, IIRC. But the point is you get better textures on PC, which is the biggest difference, and higher fps.

I don't know why you are fixated on the overclock. You can do that on PC, whether you have a caveman attachment to your game device of choice or not.
 
Last edited:

Lysandros

Member
Although regarding bandwidth, it doesn't really have less than the PS4, because Pascal has much better color compression (Maxwell already had that advantage over the PS4). Plus the PS4 has to share bandwidth with the CPU.
Yeah, I took these into account. I saw the VRAM bandwidth situation as pretty equal; I only mentioned it because I didn't think the PS4 was lesser in that area.
 

Mr Moose

Member
The PS4 uses high textures and a mix of medium and low settings, IIRC. But the point is you get better textures on PC, which is the biggest difference, and higher fps.

I don't know why you are fixated on the overclock. You can do that on PC, whether you have a caveman attachment to your game device of choice or not.
I was interested in the base performance of the GPU, not overclocked. I wanted to know how much of a performance boost the overclock would've given. Tests against consoles should be at similar settings to see exactly how much of an improvement over them it is in fps, like DF try to do in some of their videos.
I don't like overclocking things; my CPU is using a stock cooler and my GPU is dog shit and old, lol.
 
I was interested in the base performance of the GPU, not overclocked. I wanted to know how much of a performance boost the overclock would've given. Tests against consoles should be at similar settings to see exactly how much of an improvement over them it is in fps, like DF try to do in some of their videos.
I don't like overclocking things; my CPU is using a stock cooler and my GPU is dog shit and old, lol.
I can understand what you're saying; sorry I got a bit rude. I don't overclock either. In fact I undervolt my GPU, although that didn't fix the loudness of my 3060, unfortunately.

I really can't say how much extra performance the 1050 Ti gets from overclocking, but it is more than overclocking will get you nowadays.
 