
RedGamingTech - More PS5 "very unverified" info: Zen 3, SmartShift, Geometry Engine

Well, he says things that fans of the PlayStation platforms want to hear. His sources are Twitter and 4chan. Every "rumor" he has spoken of regarding the consoles started on those platforms. From RDNA 3 to the last-minute spec bump on PS5 to this, all of it can be found being discussed on those platforms before he puts out a video on it. I used to watch his videos pretty consistently, as I am a PlayStation fan myself, until I started noticing this. To be clear though, I'm also an Xbox fan for what it's worth; for transparency, I have owned a PS1, PS2, PS3, PS4, OG Xbox, Xbox 360, and X1X.

To give him credit, he does seem to have a source (not an insider; it could be a certain person on Twitter he follows) when it comes to AMD's PC GPUs. He leverages that info to gain credibility. Unfortunately, he has figured out something a lot of these YouTube tech guys have noticed: make a PS5 speculation video and you are going to get a ton of views. When it's all said and done, I could have pegged this guy wrong; only time will tell.

Edit: My opinion obviously.

His source is Matt Hargett, a software engineer who worked on the PS5.
 

[Sigma]

Member
All unnecessary info with needless drama. We have all the info we need about the PS5 already. Everyone knows that.....
ps5.png
+
PS5-SSD.png
PS5-SSD2.png
+
Play-Station-PS5-2020.png
=
thegoat.png
 
what exactly would the unified cache bring to the table tho?

For real, these CPUs are more than capable of 60fps; the features Zen 3 brings to the table on PC are mostly aimed at reaching parity with Intel for super-high-refresh gaming.

That's what Matt Hargett said. The unified cache is needed for high-refresh gaming. Of course performance is affected by everything in the pipeline, but that's one less bottleneck in the overall efficiency.
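To put rough numbers on the high-refresh point: the frame budget shrinks fast as the refresh target rises, so any fixed per-frame CPU cost (cache-miss stalls, for example) eats a proportionally bigger share. A quick back-of-the-envelope sketch; the 1 ms stall figure is an invented placeholder, not a PS5 measurement:

```python
# Frame-time budgets at common target framerates.
for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")

# Illustrative only: a fixed ~1 ms of CPU stall time per frame is a much
# bigger slice of the budget at 120 fps than at 30 fps.
stall_ms = 1.0
for fps in (30, 60, 120):
    share = stall_ms * fps / 1000 * 100  # stall as % of the frame budget
    print(f"{fps:>3} fps -> stalls are {share:.0f}% of the budget")
```

So the same stall that is 3% of a 30fps budget becomes 12% at 120fps, which is why cache latency matters more the higher the refresh target.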
 

MarkMe2525

Gold Member
His source is Matt Hargett, a software engineer who worked on the PS5.
Like I stated beforehand, he gets his info from Twitter, where Matt is active. The doubt comes from the fact that if he (Matt) had access to proprietary information that was indeed secret, he would be bound by an NDA.

Paul takes this info and extrapolates from there.

Update: I'll take it a step further to explain my viewpoint.
Every couple of weeks I'll visit misterxmedia to see what crazy shit they are cooking up. It's a guilty pleasure of mine. There is a poster named misterc that does something similar. He takes posts from real engineers working in the field and legit hardware documentation, then in turn extrapolates to come to ridiculous conclusions. The people there (because of confirmation bias) look at it, and because he is deriving his conclusions from legitimate sources, believe it wholesale.
 


geordiemp

Member

The point is poorly worded; SmartShift is 2 ms apparently.

And the quote says the GPU downclocks when the CPU is under heavy load, to help out.

It does not say the GPU downclocks when the GPU is under heavy load, which would mean losing potential processing power.

You know that the CPU and GPU are never fully loaded in the same millisecond of a frame of gaming. That's why Sony did it.

Most posters never think in nanoseconds or even milliseconds.
 

rnlval

Member
The point is poorly worded; SmartShift is 2 ms apparently.

And the quote says the GPU downclocks when the CPU is under heavy load, to help out.

It does not say the GPU downclocks when the GPU is under heavy load, which would mean losing potential processing power.

You know that the CPU and GPU are never fully loaded in the same millisecond of a frame of gaming. That's why Sony did it.

Most posters never think in nanoseconds or even milliseconds.
You missed "GPU sits at max frequency (2.23GHz) 95% of the time". 95% of 2.23 GHz would be 2.1185 GHz.

2.1 GHz is a known worst case.

My argument was 97% to 98% of max frequency (2.23GHz). LOL
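For what it's worth, the arithmetic behind the percentages being argued here, using the quoted 2.23 GHz max:

```python
MAX_GHZ = 2.23  # PS5's quoted max GPU clock

# 95% *of the clock* (as opposed to 95% of the *time* at max clock)
print(f"95% of {MAX_GHZ} GHz = {0.95 * MAX_GHZ:.4f} GHz")  # 2.1185 GHz

# The ~2.1 GHz known worst case as a share of max
print(f"2.1 GHz = {2.1 / MAX_GHZ * 100:.1f}% of max")

# The 97-98% range, as clock values
for pct in (0.97, 0.98):
    print(f"{pct:.0%} of max = {pct * MAX_GHZ:.4f} GHz")
```

Note the two different "95%" claims floating around the thread: 95% of the *time* at max clock versus a clock that is 95% *of* max. They are not the same thing.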
 

MarkMe2525

Gold Member
Who RedGamingTech?

The guy who got literally everything he leaked absolutely spot-on so far?
Granted, I didn't take the time to look at every claim he has made, but every claim I have investigated was posted on Twitter or 4chan before he "leaked" it (mostly Twitter). Of course, this does leave open the possibility that he may be a primary source on some info, but I have yet to find a situation where that was the case.

I hate to bang on about this guy; I have no grudge against him or anything. Does anyone have any info on "leaks" where he was indeed the primary source?
 

mckmas8808

Banned
XSX has 8 MB of CPU L3 cache in two clusters, 4 MB + 4 MB.

Okay, so what's the big deal with the PS5's solution? Seems both went the same route.

This has the potential to greatly reduce latency (and thus improve CPU performance) compared to the two-cluster (2x4MB) approach. A big part of Zen 3's IPC increase is due to the unified L3 cache.

Okay, forget my last post above then. I understand the difference now.
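For intuition on why one 8MB pool can beat 2x4MB, here's a toy expected-latency model. All the cycle counts and hit rates are invented placeholders, not measured Zen figures; it just shows how remote-slice accesses in a split design drag the average up:

```python
# Toy model: expected access cost (cycles) once a load misses L1/L2.
# All numbers are invented placeholders, not measured Zen latencies.
LOCAL_L3  = 40    # hit in an L3 slice the core can reach directly
CROSS_CCX = 110   # split design: line lives in the other CCX's slice
MEMORY    = 300   # full miss, go to DRAM

def avg_cycles(l3_hit_rate, remote_fraction, unified):
    """Expected cycles per L1/L2 miss.

    l3_hit_rate:     fraction of misses that hit somewhere in the 8 MB of L3
    remote_fraction: of those hits, fraction sitting in the other 4 MB slice
    unified:         True = one 8 MB pool, all of it at local latency
    """
    if unified:
        hit_cost = LOCAL_L3
    else:
        hit_cost = (1 - remote_fraction) * LOCAL_L3 + remote_fraction * CROSS_CCX
    return l3_hit_rate * hit_cost + (1 - l3_hit_rate) * MEMORY

print(f"split 2x4MB: {avg_cycles(0.6, 0.5, unified=False):.0f} cycles")
print(f"unified 8MB: {avg_cycles(0.6, 0.5, unified=True):.0f} cycles")
```

With these made-up numbers the split layout averages 165 cycles against 144 for the unified pool; the real-world gap depends entirely on how often a workload's data straddles the two slices.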
 

LordOfChaos

Member
So Zen 3 reviews are out. What's the next plausible NDA that Sony would be waiting on to mention this? If they still have hardware advantages up their sleeves, it seems like time.
 
Granted, I didn't take the time to look at every claim he has made, but every claim I have investigated was posted on Twitter or 4chan before he "leaked" it (mostly Twitter). Of course, this does leave open the possibility that he may be a primary source on some info, but I have yet to find a situation where that was the case.

I hate to bang on about this guy; I have no grudge against him or anything. Does anyone have any info on "leaks" where he was indeed the primary source?
A couple of quick examples.

Radeon VII. Most people were under the impression Vega 20 would not be made for desktop and would remain a datacentre exclusive in the form of the Instinct MI50 and MI60. He came out with its existence a day or two before the CES presentation in which it debuted.

Infinity Cache. He absolutely nailed the 128MB of "Infinity Cache" that a lot of people were skeptical of with RDNA2, literally months before it launched.

Those are two big ones. There have been a number of smaller things along the way as well.
 
So Zen 3 reviews are out. What's the next plausible NDA that Sony would be waiting on to mention this? If they still have hardware advantages up their sleeves, it seems like time.

These are the same guys who made a video on how to share used games on the console. If they had any tangible advantage here, we would know.
 
So Zen 3 reviews are out. What's the next plausible NDA that Sony would be waiting on to mention this? If they still have hardware advantages up their sleeves, it seems like time.

If I were Sony I would release God of War: Ragnarok* footage before discussing (or opening for discussion / ending the NDA on) any of this stuff.

*Or any of their first-party games already using the Geometry Engine.

Even if none of these are true, I still think it's better to do a deep dive with proper game footage to prove their point.
 

longdi

Banned
Sounds like BS; this guy is riding on his Big Navi strikes.

Either it is the same L3 setup as Renoir, or what's the point of a 'special' 8MB L3? It is too small, too slow, and pointless with a 4-core CCX.

Read AnandTech's review of Zen 3 to see how AMD got that performance uplift; unified L3 is but a small factor. 🤷‍♀️
 

MarkMe2525

Gold Member
To give him credit, he does seem to have a source (not an insider; it could be a certain person on Twitter he follows) when it comes to AMD's PC GPUs. He leverages that info to gain credibility. Unfortunately, he has figured out something a lot of these YouTube tech guys have noticed: make a PS5 speculation video and you are going to get a ton of views. When it's all said and done, I could have pegged this guy wrong; only time will tell.

Edit: My opinion obviously.
A couple of quick examples.

Radeon VII. Most people were under the impression Vega 20 would not be made for desktop and would remain a datacentre exclusive in the form of the Instinct MI50 and MI60. He came out with its existence a day or two before the CES presentation in which it debuted.

Infinity Cache. He absolutely nailed the 128MB of "Infinity Cache" that a lot of people were skeptical of with RDNA2, literally months before it launched.

Those are two big ones. There have been a number of smaller things along the way as well.
I'm sorry for not being more specific, as it matters in this context. I do believe he has a source in the AMD PC space, as I quoted above. Do you know of any console-specific leaks?
 

duhmetree

Member
You missed "GPU sits at max frequency (2.23GHz) 95% of the time". 95% of 2.23 GHz would be 2.1185 GHz.

2.1 GHz is a known worst case.

My argument was 97% to 98% of max frequency (2.23GHz). LOL
95% of the time the GPU is under max load, as in 95% of the time it's at 2.23GHz.

When the CPU is under heavy load, the GPU downclocks to 2.11GHz for a frame to help the CPU.

This is not from his 'trusted' sources, so make of it what you will.
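The behaviour described, where the GPU drops to a lower bin for a frame when the CPU spikes, can be sketched as a per-frame budget check. This is a guess at the general shape of SmartShift, not Sony's actual algorithm, and every number here is invented:

```python
# Toy sketch of a SmartShift-style clock decision. All numbers invented.
GPU_MAX_GHZ  = 2.23   # quoted max GPU clock
GPU_DOWN_GHZ = 2.11   # reduced bin mentioned in the leak
CPU_HEAVY_W  = 40     # invented threshold for a "heavy" CPU frame, in watts

def gpu_clock_for_frame(cpu_watts: float) -> float:
    """Pick next frame's GPU clock from the CPU's power draw."""
    if cpu_watts > CPU_HEAVY_W:
        return GPU_DOWN_GHZ   # shift some power budget to the CPU this frame
    return GPU_MAX_GHZ        # otherwise sit at max

# Simulated per-frame CPU power: the GPU only drops on the two heavy frames.
frames = [20, 25, 45, 22, 50, 18]
print([gpu_clock_for_frame(w) for w in frames])
```

The key property the leak claims is the granularity: the decision is made per frame (or faster), so a single CPU spike only costs the GPU clock for that frame rather than forcing a sustained downclock.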
 

Rob_27

Member
'I'm gonna be honest with you, that smells like pure gasoline'..'They've done studies, you know. 60% of the time, it works every time'.
 

Lysandros

Member
95% of the time the GPU is under max load, as in 95% of the time it's at 2.23GHz.

When the CPU is under heavy load, the GPU downclocks to 2.11GHz for a frame to help the CPU.

This is not from his 'trusted' sources, so make of it what you will.
Small correction: it's not '2.11GHz' specifically, it's in the 2.1xx GHz range (hundreds of MHz), so it can be 2.190 GHz, 2.180 GHz, etc.
 

RaySoft

Member
If all this is true, Sony has some major "secret sauce" on their hands. It actually sounds really plausible as well.
Seems I was right on the money when I speculated about the primitive shaders in an older post, which I'm too lazy to dig up :-/

Edit: found it..
 
I'm sorry for not being more specific as it matters in this context. I do believe he has a source in the AMD pc space as I quoted above. Do you know of any console specific leaks?
To be fair, I don't recall any console-specific examples, so you are right to be skeptical.

However, Paul is no joke when it comes to this stuff. He clearly has excellent sources.
 
I sure hope some of the people here get paid to speculate as much as the YouTubers. Otherwise, it's kinda crazy to devote so much time to playing the guessing game and then arguing why you are right and the other person is wrong. On the flip side, it is pretty entertaining to see both sides throwing shit at each other's glass houses.
 

Vae_Victis

Banned
How many of RGT's bigger scoops panned out, out of curiosity?
In regards to PC stuff, a lot. He 100% has legit sources at AMD, and possibly at Nvidia.

As for consoles, he has been keeping things a lot more vague and cautious, which I think is a reasonable approach if he isn't 100% sure himself of the things he reports on. But I don't think he has had even a single real "blunder" there either (as in, saying he's really sure of something that then turns out to be completely wrong).
 

Calverz

Member
This guy needs to give it up. Was he also the same guy who said the og xbox one had an extra gpu in the power brick?
 

CobraAB

Member
Dat Cerny.

The guy knows his shit.

Hell, he is old school. He developed the arcade machine Marble Madness for Atari back in 1984.

And I played it too!
 
A unified cache pool for the CPU is easy to verify, per se. We'll see soon enough.

That will make or break a lot of that leak.
 

Allandor

Member
Man, they really prove they have no clue what they are writing.
The 8MB L3 is the same the mobile 4x00 CPUs had. Yes, unified is a bit better; that's why the Zen 2 CPUs in the Ryzen 4x00 line do not really lose that much performance vs. having a much bigger L3 like the Ryzen 3x00 CPUs. But it is basically still the same core, and the same on both consoles.
 
RGT is a clout chaser.
He is trying to build his channel and make some $$ off it, so he pushes fanboy stuff to get those clicks.
The reality is he will get more clicks from a PS5 speculation video than from one talking about a new motherboard.
We know from a legit Sony engineer that the PS5 doesn't have machine learning.
We know that MS added 4- and 8-bit int on their own to increase ML abilities, and AMD has not shown the same on their own RDNA 2 cards.
So if he is pushing that the PS5 has 4- and 8-bit int, then he is clowning.
We know from the same Sony engineer that the PS5 wasn't full RDNA 2, and that was confirmed by MS saying the XSX/S were the only consoles that were full RDNA 2.

Why people refuse to believe what an actual Sony engineer says, but swallow the RGT leaks, is beyond me.

Now I am sure Sony made their own tweaks to the GPU and CPU, and these may eventually prove to be better than what MS did, but at this point we have nothing to show for it, so we just wait until it all leaks out, as it will.
 

RoadHazard

Gold Member
This guy needs to give it up. Was he also the same guy who said the og xbox one had an extra gpu in the power brick?

You're talking about MisterXMedia? No, that's a very different person, lol. This guy knows his stuff (that doesn't mean he has trustworthy sources though; I don't have an opinion on that).
 
Man, they really prove they have no clue what they are writing.
The 8MB L3 is the same the mobile 4x00 CPUs had. Yes, unified is a bit better; that's why the Zen 2 CPUs in the Ryzen 4x00 line do not really lose that much performance vs. having a much bigger L3 like the Ryzen 3x00 CPUs. But it is basically still the same core, and the same on both consoles.
It's going to be more important at higher framerates: 60fps, but most importantly 120fps (and notably in PSVR2 games). That said, I don't think that's true.
 

MrS

Banned
Unless it comes from Sony, I don't believe it. Too many liars, warriors and FUD-spreaders bullshittin' rn.
 