
Unreal Engine 5.0 OUT NOW

Just to double down.
On consoles, yeah, it's going to be a hassle using Lumen; we've got to rely on TSR to help these consoles last.
But on PC, devs have options.
RTXGI and other GI implementations within Unreal will produce results that beat Lumen.
Lumen's reflections aren't that good; we've already seen better reflections in real time. It's just about gauging whether it's worth using or not.
This specific demo doesn't really give any real indication of what UE5 games will look or run like.
I've been messing with UE4 and UE5, trying all the mixes of implementations to get the best results at the best performance. Relying solely on Lumen is "kinda" a mistake; if you are developing on PC, Unreal is easily extensible for really good results.
I can't wait for the first AAA game on Unreal 5, it's a shame it's taking this long. We are one and a half years into the gen already, I don't think there's even a single Unreal 5 game with a release date.
 

winjer

Gold Member
I can't wait for the first AAA game on Unreal 5, it's a shame it's taking this long. We are one and a half years into the gen already, I don't think there's even a single Unreal 5 game with a release date.

Consider that UE5 was only released last week. Before that there was only a beta.
 

winjer

Gold Member
That doesn't change the fact it was released one and a half years after a new gen; with AAA games taking years and years to release, it seems way too late.

Unreal Engines were never released at the same time as consoles.
The PS4 and Xbox One were released in 2013. UE4 was released in 2014. Games came even later.
The Xbox 360 was released in 2005. The first games with UE3 were released in 2006, one of them was Gears of War from Epic.
 

ethomaz

Banned
AAA companies have been developing for it for over a year
They were developing in UE4... most AAA devs are choosing to migrate to UE5 now.
Kingdom Hearts 4 is a recent example.
Square Enix will migrate to UE5.

From an interview a few days ago:

"The full game will be made with Unreal Engine 5, and the quality of lighting and details will be several levels higher."

Because right now the trailer and the game are using UE4.
 

Lethal01

Member
They were developing in UE4... most AAA devs are choosing to migrate to UE5 now.
Kingdom Hearts 4 is a recent example.
Square Enix will migrate to UE5.

From an interview a few days ago:

"The full game will be made with Unreal Engine 5, and the quality of lighting and details will be several levels higher."

Because right now the trailer and the game are using UE4.

That's one example; others have been using early builds of UE5 for ages.
And of course it's even more common with smaller developers.

Wukong and Hellblade would be examples, I think.
 

ethomaz

Banned
That's one example; others have been using early builds of UE5 for ages.
And of course it's even more common with smaller developers.

Wukong and Hellblade would be examples, I think.
Hellblade 2 doesn't even have a launch date, and the demo they showed was using UE4.
They said they "will" use UE5... they have probably migrated to UE5 by now, but it has only been a few months.

Black Myth: Wukong confirmed they started to migrate to UE5 in September 2021... they will release in 2023.
Before that they were using UE 4.24 >> UE 4.26 >> UE 5.

I mean, no game has been developed on Unreal Engine 5 for a long period... the migrations are all very recent.
Unreal Engine 5 only became available in Early Access in mid-2021; before that, devs couldn't even use it in demos.

It is not even possible to have been using it for over a year... because it has actually been in devs' hands for only around 10 months.
 

SlimySnake

Flashless at the Golden Globes
Plus the sample enables Hardware Raytracing automatically on GPUs that support it (RDNA1 doesn't obviously) so a framerate comparison is a bit nonsensical as the visual quality between these cards is not the same. Unlike previous UE4 games, Hardware RT doesn't tank framerate anymore in UE5 but performs pretty similar. However, it can look drastically different in places.
That shouldn't matter when we look at the 20 and 30 series and the 6000 series cards. They all support ray tracing, and the comparison is 100% valid. The only cards that should be discounted are the 5000 series RDNA 1.0 cards and 10 series Nvidia GPUs.

As for the demo being CPU heavy, I think all games are going to be CPU heavy going forward. We haven't seen many next-gen-only games announced, but the one big game that was announced was Avatar, and they specifically talked about how they had destruction, animal stampedes resulting from destruction and enemy attacks, and a lot of physics-based interaction. All CPU-heavy tasks.

If the new traffic simulation and pedestrian simulation is what is causing CPU bottlenecks, then I expect that in every single open-world game, even ones that aren't using UE5.
 

Lethal01

Member
It is not even possible to have been using it for over a year... because it has actually been in devs' hands for only around 10 months.

It's been publicly available for that time, sure, but big companies have had access to it before that. Anyway, just saying it's not like devs were all specifically waiting for the official release.
 

kraspkibble

Permabanned.
I have never made a game in my life, nor do I intend to, but it would be cool to fuck around in this. Is it easy to build shit for fun, or do I need to spend a lot of time learning?
 

ethomaz

Banned
It's been publicly available for that time, sure, but big companies have had access to it before that. Anyway, just saying it's not like devs were all specifically waiting for the official release.
Can you source that?
I don't believe any developer had access to UE5 before the Early Access.
 

winjer

Gold Member
It's been publicly available for that time, sure, but big companies have had access to it before that. Anyway, just saying it's not like devs were all specifically waiting for the official release.

But that was only a preview build.
A few companies have taken this time to experiment and test it, but making and publishing a game on what is essentially an alpha of a game engine is not a good idea.
 

ethomaz

Banned
UE5 has been in developers' hands for quite a while.
That is unrelated to what I asked.

I just checked: Unreal Engine 5 became available to developers (including AAA) in May 2021 with the Early Access.

You could not even migrate your UE4 project to UE5 before the Early Access. Epic developing it doesn't mean developers could use it.

I could not find any developers that started to use UE5 before the Early Access.

So pushing the narrative that games have been developed on UE5 for years is bullshit.
 
I have never made a game in my life, nor do I intend to, but it would be cool to fuck around in this. Is it easy to build shit for fun, or do I need to spend a lot of time learning?
You need to spend time learning, but there is a ton of free resources available to teach you. Something like GameMaker Studio is more beginner-friendly, but it's also better suited to simple 2D games. I say just download Unreal, jump in and mess around, hit some tutorials for the basics, set a very small goal for yourself, then grow your knowledge from there.
 

ethomaz

Banned
No.

Indies will probably find better deals with Unity… that is the engine with market dominance.
CDPR's words just confirmed what I said.

New-generation tech, which is what UE5 is, increases dev time and development cost.

The trade-off you get is better quality and more photorealistic graphics.
 

PaintTinJr

Member
Just to double down.
On consoles, yeah, it's going to be a hassle using Lumen; we've got to rely on TSR to help these consoles last.
But on PC, devs have options.
RTXGI and other GI implementations within Unreal will produce results that beat Lumen.
Lumen's reflections aren't that good; we've already seen better reflections in real time. It's just about gauging whether it's worth using or not.
This specific demo doesn't really give any real indication of what UE5 games will look or run like.
I've been messing with UE4 and UE5, trying all the mixes of implementations to get the best results at the best performance. Relying solely on Lumen is "kinda" a mistake; if you are developing on PC, Unreal is easily extensible for really good results.
Lumen includes HW RT, so IMHO this comment misrepresents what "Lumen" actually is.

The SW aspect of Lumen is to provide an efficient approximation of RT beyond the foreground, so comparing SW Lumen results - on shiny surfaces with less subdivision - against HW RT in the foreground feels disingenuous. And in regard to the consoles, we've already seen very good foreground reflection results in Miles Morales in quality mode - even if the rest of the visuals could be described as last-gen.

No matter how powerful the current-gen GPUs get - mid or high tier - the amount of foreground you can build BVHs for and do HW RT on in real time is still going to be the thin end of the wedge. And if you're doing something with kitbashing - like Valley of the Ancient - where reflections are nonexistent but you need coherent and volumetric GI, SW Lumen is going to be the correct solution IMHO, unless a superior solution emerges.

TSR is needed to improve the signal-to-noise ratio of SW Lumen, so using it seems good IMO, not a negative.
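For anyone who wants to experiment with these trade-offs themselves, the GI and reflection paths being debated are switchable per project via console variables. A minimal sketch of a DefaultEngine.ini fragment, assuming UE5's stock cvar names as documented by Epic (the exact values are assumptions; verify them against your engine version):

```ini
; Hedged sketch: cvar names as documented for UE5; values are assumptions.
[/Script/Engine.RendererSettings]
r.DynamicGlobalIlluminationMethod=1  ; 1 = Lumen GI (0 = none, 2 = screen space)
r.ReflectionMethod=1                 ; 1 = Lumen reflections (2 = SSR)
r.Lumen.HardwareRayTracing=1         ; let Lumen use HW RT where the GPU supports it
r.AntiAliasingMethod=4               ; 4 = Temporal Super Resolution (TSR)
```

Flipping these per build makes it easy to A/B software Lumen against hardware RT (or a third-party GI plugin such as RTXGI) on the same scene.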
 

winjer

Gold Member
A user at Guru3D compiled the demo using UE5.1 and Shader Model 6.
This has improved the frame rate while using TSR, by a significant margin. The frame rate is now comparable to the demo using DLSS, at the same resolution scale.
It doesn't fix everything with the demo, but it's a step in the right direction.

Here is the link:


https://1fichier.com/?ddbwcm7y015bp2oqx56h
 

ethomaz

Banned


- They talk about Unreal Engine 5 and how it is not ready yet... the performance is not there, but it will be in the future.
- Talks about the ridiculous shader compilation issue (it is not how the PC Master Race, having invested several $$$, wants to play games).
- Talks about how manufacturers will solve the parallelization-cores-vs-single-core issue to get more performance.
- Talks about the PS5 having 6.5 cores available to developers while the Series X has 7 cores available to devs.

I did not watch everything.
 

Loxus

Member


- They talk about Unreal Engine 5 and how it is not ready yet... the performance is not there, but it will be in the future.
- Talks about the ridiculous shader compilation issue (it is not how the PC Master Race, having invested several $$$, wants to play games).
- Talks about how manufacturers will solve the parallelization-cores-vs-single-core issue to get more performance.
- Talks about the PS5 having 6.5 cores available to developers while the Series X has 7 cores available to devs.

I did not watch everything.

I wonder what they mean by 6.5 cores. Can half a core be dedicated, or do they mean 6 cores and 1 thread?
 

ethomaz

Banned
I wonder what they mean by 6.5 cores. Can half a core be dedicated, or do they mean 6 cores and 1 thread?
Most likely 6 physical cores and 5 virtual cores are allocated to the game, and 2 physical cores and 3 virtual cores to the system and other stuff.
I don't know how that is done, but that is the only way to do it in Windows, I believe; but they use a custom FreeBSD.

I wonder if the devs can disable SMT like on the Series consoles? If that happened, it would be 6:2.
 

CamHostage

Member
Can you source that?
I don't believe any developer had access to UE5 before the Early Access.

So, The Coalition got their build in November of 2020 (probably a similar Early Access build to what was released in May of 2021), and they were modeling some work in October 2020 in UE4 in preparation for what they were expecting from early UE5, based on the May 2020 showcase and conversations with Epic. But they are a spearhead team for Xbox (their whole Alpha Point project was to provide feedback to MS and Epic about using and optimizing UE5 for the Xbox Series consoles). They're a tip-of-the-spear team (they collaborated with Epic again on the Matrix Awakens project); it doesn't make any sense that anybody would have actually had access before they did. Maybe a select few others got brought into the early-Early Access elite community around that same late-2020 period... maybe?

Source (sorry I don't have the timecode but they talk about it in there.)

And then GSC Game World tweeted in August of 2021 that they were using UE5, but that was after Early Access released, and I would imagine none of the actual demos or footage of STALKER 2 was ever actually from even a UE5 Early Access build. (The first "In-Engine Gameplay Teaser" came out in December of 2020, and they've been making the game for years before that.)

Black Myth and everybody else who has shown anything which will ultimately be considered a UE5 game has, as you have said, been working in UE 4.2X and experimenting with / transitioning to UE5, but I don't think anybody has done a showing of "We have built our game entirely in UE5, and here it is..." I was hoping there would be something that was an actual game inside that UE5 Release Day event, but the best we got was an announcement of a distant-future, not-even-started Tomb Raider. Tons of games made from here on will use UE5, because there are few reasons not to, and many UE4 projects will move into UE5 as well, assuming the conversion goes well and compatibility is clean. (UE5 only dropped a few components of UE4, and it's generally an upgrade rather than a rewrite, but there are some key differences that could keep a developer on UE4; also, upgrading just for the sake of it wouldn't be the smartest move for a crunched developer trying to ship a game...) Still, we've only got Matrix Awakens (albeit an actual thing we can play on our consoles, even though it doesn't have much "game" to it) to demonstrate UE5 yet.
 

Loxus

Member
Most likely 6 physical cores and 5 virtual cores are allocated to the game, and 2 physical cores and 3 virtual cores to the system and other stuff.
I don't know how that is done, but that is the only way to do it in Windows, I believe; but they use a custom FreeBSD.

I wonder if the devs can disable SMT like on the Series consoles? If that happened, it would be 6:2.
This is from RedGamingTech.
Playstation 5 | RDNA 2 GPU Features, OS Reserves, Geometry Engine | EXCLUSIVE
"For the CPU reserves, I was told that there’s 1 core dedicated to OS functionality (so one physical core, with two threads), and this, of course, leaves 7 cores (14 threads) available for games."

I have an easier time believing RedGamingTech over Digital Foundry when it comes to these kinds of things.
 

hlm666

Member
This is from RedGamingTech.
Playstation 5 | RDNA 2 GPU Features, OS Reserves, Geometry Engine | EXCLUSIVE
"For the CPU reserves, I was told that there’s 1 core dedicated to OS functionality (so one physical core, with two threads), and this, of course, leaves 7 cores (14 threads) available for games."

I have an easier time believing RedGamingTech over Digital Foundry when it comes to these kinds of things.
Well, the logical answer would be that they split a core the same way they did it on the PS4, right? DF knows more about how the consoles work than RTG.

 

Loxus

Member
Well, the logical answer would be that they split a core the same way they did it on the PS4, right? DF knows more about how the consoles work than RTG.

Digital Foundry to this day doesn't know how SmartShift works on the PS5, so that's a no.
And let's not forget about them claiming the PS5 doesn't support RT even though Cerny said it does in a Wired article.

RGT is known for leaking PC hardware specs.
DF is only known for analyzing the frame rate and resolution of games.
 

winjer

Gold Member
RGT is known for leaking PC hardware specs.

RTG is known for spamming rumors until something sticks.
Then he makes a video bragging about being right one time or another, ignoring all the times he missed.
I was subscribed to his channel for a while, but the more I watched, the more I came to realize he is a terrible source of information.
On most PC hardware forums he is no longer well regarded, because most people have already caught on to him.

He does have knowledge of hardware above the average gamer, but he is not a professional or someone with deep insight into tech.
There are much better people in the hardware media, with much better knowledge about hardware. And with real contacts in the industry.
A good example to contrast with RTG is Tech Tech Potato. It's run by Dr. Ian Cutress, and he has a ton of interviews with people from Intel, AMD, nVidia, TensTorrent, Qualcomm, etc.
And he has several analyses of hardware that are accurate and reliable.

There are many good sources for hardware analysis, but RTG is far from being one of them.
He does have a decent ability to fool people with less knowledge of hardware into thinking he knows more than he really does.
 
RTG is known for spamming rumors until something sticks.
Then he makes a video bragging about being right one time or another, ignoring all the times he missed.
I was subscribed to his channel for a while, but the more I watched, the more I came to realize he is a terrible source of information.
On most PC hardware forums he is no longer well regarded, because most people have already caught on to him.

He does have knowledge of hardware above the average gamer, but he is not a professional or someone with deep insight into tech.
There are much better people in the hardware media, with much better knowledge about hardware. And with real contacts in the industry.
A good example to contrast with RTG is Tech Tech Potato. It's run by Dr. Ian Cutress, and he has a ton of interviews with people from Intel, AMD, nVidia, TensTorrent, Qualcomm, etc.
And he has several analyses of hardware that are accurate and reliable.

There are many good sources for hardware analysis, but RTG is far from being one of them.
He does have a decent ability to fool people with less knowledge of hardware into thinking he knows more than he really does.
If it doesn't support his narrative, he won't ever listen, regardless of your factual post. Look back at the other unreal threads, and you'll realize you wasted time typing that out.
 

PaintTinJr

Member


- They talk about Unreal Engine 5 and how it is not ready yet... the performance is not there, but it will be in the future.
- Talks about the ridiculous shader compilation issue (it is not how the PC Master Race, having invested several $$$, wants to play games).
- Talks about how manufacturers will solve the parallelization-cores-vs-single-core issue to get more performance.
- Talks about the PS5 having 6.5 cores available to developers while the Series X has 7 cores available to devs.

I did not watch everything.

Given that DF had the inside track on how much of an SPU (14%?) a social feature needed on the PS3 - I think it was voice chat, IIRC, hence the lack of cross-game chat - the 6.5 and 7 core numbers are almost certainly correct IMO. Especially since, stripped of context, they have enabled Richard to paint the PS5 as less powerful while failing to note whether PS5 developers are using most of the cores with SMT disabled - just as it was reported soon after the reveal that XsX developers were using the CPU cores to get the higher clocks for the streamed Velocity Architecture workloads, followed by a discussion of how much of the PS5 CPU was needed for the IO complex transfers, IIRC.
 

ethomaz

Banned
Given that DF had the inside track on how much of an SPU (14%?) a social feature needed on the PS3 - I think it was voice chat, IIRC, hence the lack of cross-game chat - the 6.5 and 7 core numbers are almost certainly correct IMO. Especially since, stripped of context, they have enabled Richard to paint the PS5 as less powerful while failing to note whether PS5 developers are using most of the cores with SMT disabled - just as it was reported soon after the reveal that XsX developers were using the CPU cores to get the higher clocks for the streamed Velocity Architecture workloads, followed by a discussion of how much of the PS5 CPU was needed for the IO complex transfers, IIRC.
While it is probably true, it makes things even weirder.
The Series X has the clock and core-count (or half-core) advantage, but when games are CPU-bound they run better on PS5, while only when the CPU is not holding it back can the Series X show its GPU advantage.
 

winjer

Gold Member
While it is probably true, it makes things even weirder.
The Series X has the clock and core-count (or half-core) advantage, but when games are CPU-bound they run better on PS5, while only when the CPU is not holding it back can the Series X show its GPU advantage.

Could be an issue with Microsoft APIs causing some extra overhead.
 

PaintTinJr

Member
While it is probably true, it makes things even weirder.
The Series X has the clock and core-count (or half-core) advantage, but when games are CPU-bound they run better on PS5, while only when the CPU is not holding it back can the Series X show its GPU advantage.
I didn't spend much time looking at the Matrix demo on my friend's XsX, but I get the impression that the collisions caused bigger issues for the XsX frame rate than the PS5's. So I'd be inclined to think that the PS5 is probably using fewer cores with SMT disabled within the 6.5 made available to devs, which improves availability for handling random events - like physics collisions, if doing them on the CPU - compared to doing that type of processing on cores in 1-way mode.

I would guess that PlayStation are keeping one 2-way core back for running a hypervisored dedicated game server on each user's console, to enable dynamic hosting that minimises hop counts between all connected users in games, and keeping another thread of another core back for any future needs that might arise. Xbox probably do the same for serving, but haven't kept a thread back, because they'd just release another console in the series if a new feature required better hardware. And because of the disparity in CPU overhead between the IO complex and the Velocity Architecture, plus the difference in console audio solutions - the PS5 having a dedicated Tempest engine - the XsX is, I suspect, actually doing far more work with its CPU than the PS5, even with an extra thread and a higher clock, and possibly even using the additional CUs for physics on the GPU, yet still being less responsive.
 

N1tr0sOx1d3

Given another chance
Out of interest, do you think games can differentiate themselves from each other when using the same engine?
For example, with UE3 back in the day, most games looked so similar you could tell a game was made on the Unreal engine. I wonder if the same would be true today with UE5?
 
Out of interest, do you think games can differentiate themselves from each other when using the same engine?
For example, with UE3 back in the day, most games looked so similar you could tell a game was made on the Unreal engine. I wonder if the same would be true today with UE5?
Every game engine has its quirks and features.
 

Lethal01

Member
Out of interest, do you think games can differentiate themselves from each other when using the same engine?
For example, with UE3 back in the day, most games looked so similar you could tell a game was made on the Unreal engine. I wonder if the same would be true today with UE5?



The only limit is the creators themselves; they are as free with Unreal as they are with any other engine or something built from the ground up, at least when it comes to visuals.

The engine has little to do with it. Yes, if creators choose to use the default settings and download stock assets, there will be games that look similar, but those kinds of devs would have ended up with something generic-looking no matter what.
 

CamHostage

Member
Out of interest, do you think games can differentiate themselves from each other when using the same engine?
For example, with UE3 back in the day, most games looked so similar you could tell a game was made on the Unreal engine. I wonder if the same would be true today with UE5?

With how much lighting has evolved past the "tricks" of those times, and how complex a sculpted object in a scene can be these days, I think we're past the days where everything in UE3 tended to look the same because of the limitations of the techniques. It's a much deeper simulation of reality now. It seems unlikely you'd look at one metal material, or one world lit by a GI sun source, and recognize it as "too UE5-ish".

However, we are already at the point where, thanks to the proliferation of Unreal Marketplace assets and the free Quixel megascan assets, there is and will continue to be visible replication of items across worlds. You can tell right now, when looking at a lot of these "20 INCREDIBLE Unreal Engine 5 Fan-made Graphics Demos", that they're mostly made of popular assets that somebody else made and they just paid for (or got free) and stuck into the scene with glossy lighting. Sometimes when games are announced by indies, you look at them and go, "that's Medieval Village, that's ArchViz Bedroom 1, that's Modular Sci-Fi, those are 3D Cars Sportscar 1 vehicles..."

You may get deja vu from seeing the same rock over and over again in different games.



(*MetaHumans might be similar for a while too. You can vary a character created with MetaHuman quite a bit, but you're still operating under one common sense of anatomy and movement, and that's difficult to overcome. It'll be interesting to see whether people will recognize an Unreal game from a Unity game based just on whether it uses MetaHumans versus a Ziva Character Virtual Human.)
 

SlimySnake

Flashless at the Golden Globes
Is the demo crashing for people here? I got a 3080 last week, and while I can run it at native 4K 45 fps, it crashes all the time. I don't remember it crashing on my 2080.

Downloaded the latest compiled demo winjer linked above, and same thing. Crashed within minutes. All I was doing was driving around really fast.
 

PaintTinJr

Member
Is the demo crashing for people here? I got a 3080 last week, and while I can run it at native 4K 45 fps, it crashes all the time. I don't remember it crashing on my 2080.

Downloaded the latest compiled demo winjer linked above, and same thing. Crashed within minutes. All I was doing was driving around really fast.
I haven't, but try increasing your virtual memory in Windows. I had crashes with the Early Access build despite having 12GB of VRAM and 32GB of RAM; the virtual memory was set too low, and I had to increase it to be bigger than my VRAM size.
 

winjer

Gold Member
Is the demo crashing for people here? I got a 3080 last week, and while I can run it at native 4K 45 fps, it crashes all the time. I don't remember it crashing on my 2080.

Downloaded the latest compiled demo winjer linked above, and same thing. Crashed within minutes. All I was doing was driving around really fast.

Curiously, you are the second person saying they have crashes on the RTX 3000 series, but not on the RTX 2000 series.
The other person I saw said he had 3 PCs with 3000 series cards, all crashing with these demos. But he said he was seeing this with the DLSS version.
Do you also have these crashes with the non-DLSS demos?
 

assurdum

Banned
Well the logical answer would be the same way they did it on ps4 to split a core right? DF know more about how the consoles work than RTG.

DF spread a lot of misinformation in the past when they talked about PS5 hardware, however. They are still convinced, and argue, that SmartShift "bottlenecks" CPU/GPU performance in some way, which is really annoying to hear today. It's hard to take them seriously about anything relative to the PS5, because every time they talk about some aspect - look at the coincidence - it's just to compare with the XSX hardware, which "apparently" comes out better suited, imo.
 

hlm666

Member
DF spread a lot of misinformation in the past when they talked about PS5 hardware, however. They are still convinced, and argue, that SmartShift "bottlenecks" CPU/GPU performance in some way, which is really annoying to hear today. It's hard to take them seriously about anything relative to the PS5, because every time they talk about some aspect - look at the coincidence - it's just to compare with the XSX hardware, which comes out better suited, imo.
That RTG dude is just as tragic as MLiD; he just takes Twitter "leakers" and makes video after video swinging from one position to another, so of course once a hardware release happens, one of his hundreds of videos will have something close to reality. Case in point: he has some video where he talks about a source telling him DLSS 3 is coming before Lovelace, then goes on to say that he thinks DLSS 3 will launch after Lovelace. Can you see the fortune-teller shit wrong with that? He has covered his bases for both possible scenarios: if it comes before, his "source" is right and he must have legit sources; if it comes later, he is right, so he obviously knows his shit..............

DF get shit wrong, especially when they are playing the speculation game before hardware releases, but channels like RTG are a whole other tier of the fantasy hardware league and a complete waste of time.
 

Loxus

Member
RTG is known for spamming rumors until something sticks.
Then he makes a video bragging about being right one time or another, ignoring all the times he missed.
I was subscribed to his channel for a while, but the more I watched, the more I came to realize he is a terrible source of information.
On most PC hardware forums he is no longer well regarded, because most people have already caught on to him.

He does have knowledge of hardware above the average gamer, but he is not a professional or someone with deep insight into tech.
There are much better people in the hardware media, with much better knowledge about hardware. And with real contacts in the industry.
A good example to contrast with RTG is Tech Tech Potato. It's run by Dr. Ian Cutress, and he has a ton of interviews with people from Intel, AMD, nVidia, TensTorrent, Qualcomm, etc.
And he has several analyses of hardware that are accurate and reliable.

There are many good sources for hardware analysis, but RTG is far from being one of them.
He does have a decent ability to fool people with less knowledge of hardware into thinking he knows more than he really does.
I never said he is 100% always right about PC leaks. I said that's what he is known for: leaking PC specs.
DF never leaked one thing about the PS5 specs, or console hardware in general.

RGT had a whole thread here about the PS5 Pro.

Look at Digital Foundry's history with PS5 specs and you would know why not to believe them.

One should ask why the OS would even need more than one Zen 2 core while playing a game.

This isn't like the PS4 OS, which had to utilize two Jaguar cores.

Jaguar was a weak CPU with one thread per core, while Zen 2 is much more powerful and has two threads per core.

Either way it doesn't matter, as PS5 performance is still good and sometimes matches or outperforms the competition.
 