
Matt weighs in on PS5 I/O, PS5 vs XSX and what it means for PC.

kuncol02

Banned
Do you really think Sony and Cerny would build a whole console around the idea that you could use fewer CUs with more clock more efficiently?
They didn't build the console around that. It's designed around the ultra-fast SSD, and fewer CUs is a compromise they had to make to achieve that.

Microsoft has built the best of what is available to us now.
Because the PS5 contains technology from the next century? Neither built a console with the best technology available now. Both made big compromises to stay within budget.
 
From all the chatter regarding SSDs, it sounds to me like these new SSDs will allow:

1) Much more detailed game worlds, since the SSDs can stream in very high-quality assets (and lots of them) very quickly, which should lead to game worlds with hundreds of high-quality objects on screen at any given time (see the rough numbers sketched below).

2) Super fast loading times.

3) Easier development, since developers won't have to duplicate assets.

4) New types of games, since the data will be readily available to read from the SSD at any given time.

5) Incredibly responsive UIs and system functions.

Did I get all of the above right?
Is there anything I may have missed?

*Not the most technical-minded person, but this is what I gather from reading and watching what devs have to say about the SSDs in these new consoles*
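To put rough numbers on point 1, here's a quick back-of-the-envelope sketch. The 5.5 GB/s and 2.4 GB/s raw throughputs are the publicly quoted figures; everything else is just arithmetic, not a claim about real games:

```python
# Back-of-the-envelope: how much data a console SSD could stream per rendered
# frame at its quoted *raw* throughput. Real budgets would be lower (other
# work shares the bus) or higher (compression), so treat these as ballpark.

def per_frame_budget_mb(throughput_gb_s: float, fps: int) -> float:
    """Streaming budget per frame in MB, assuming the full raw throughput."""
    return throughput_gb_s * 1024 / fps

for name, gb_s in [("PS5 raw (5.5 GB/s)", 5.5), ("XSX raw (2.4 GB/s)", 2.4)]:
    for fps in (30, 60):
        print(f"{name} at {fps} fps: ~{per_frame_budget_mb(gb_s, fps):.0f} MB/frame")
```

So at 60 fps you're looking at very roughly 94 MB of fresh asset data per frame on PS5 and 41 MB on XSX before compression, which is where all the "stream per frame" talk comes from.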
 

killatopak

Gold Member
well, XSX is not going to run PS4 games.
And I wouldn't trust a Naughty Dog engineer anyway. Of course they'd try to downplay the XSX.
PS4-gen, not PS4 games. That includes XBO games, which will be the majority of what the XSX gets for the first two years. Even first-party games aren't an exception.
 

chilichote

Member
They didn't build the console around that. It's designed around the ultra-fast SSD, and fewer CUs is a compromise they had to make to achieve that.
Both made big compromises to stay within budget.

Of course it is; it's designed for efficiency, including the SSD.
 

Men_in_Boxes

Snake Oil Salesman
I often think that if we could harness the energy created in this thread, which contains no new info, we might uncover a renewable energy source many times more efficient and greener than solar, wind, or hydro power. We are so close.
 
That was the interview with the XSX architects, and that description was their description of the ML enhancements you talk about. There has been no hint of additional parallel HW, not even the slightest AMD RDNA-related patent, that supports your scenario beyond a "Why not? How do you know? Maybe it is there, who's to say?" argument.

Not to mention the irony of talking about fanboy goggles and then posting another wall of text to claim yet another secret trump card that lets the XSX win in all scenarios... now it is secret extra HW, which must be there because otherwise Sony has a win that we cannot magically close the gap on or hand-wave away.

So you know every bit of RDNA2's standardized specifications now? I mean, you've gone patent-hunting, presumably, by these remarks. So you must know everything standardized in the RDNA2 spec. How about sharing that with the legions of other people here and abroad who still don't know what those particular features are?

Ah, you can't? Why? Wait, no, never mind, don't answer that; I've had enough of a chuckle as-is.

You can probably guess what I'd say next, given your posting history with this type of stuff and your own admission of bias. I don't hide behind walls of text; I simply know enough and have enough fun speculating to put those thoughts into words, that's all. You're fantasizing about instances where I have somehow pulled trump cards for one platform out of thin air to give it an advantage, when I'm literally doing the same thing many did when looking at the GPU specs and things like TFs: the paper specs don't tell you everything.

Now, though, it's apparently an affront because it's being used to be cautiously optimistic yet critical of certain performance claims for PS5, even though I have done the same quite a bit with different parts of XSX's architecture, such as the "split" RAM memory pool and the SSD I/O (I never claimed any optimizations would make it equal to or better than PS5's; being an unbiased gamer who'd like both systems to be within reach of each other, to the benefit of 3rd-party developers, is what drives that outlook in me). I've also definitely been a bit of a debbie-downer with regard to Lockhart, though not for the outdated (and demonstrably false) talking points some others use; my concerns have been about the logistics of manufacturing as related to cost, the production ratio between it and XSX, and how misreading demand could impact supply.

So that's a hard miss from me, bud :LOL:

You took that 10x multiplier remark far too literally; it's not a claim. The entire point I was making is that we don't know, and we need proper testing to see just how much effect the difference between the PS5 and XSX SSDs could have on something like VRAM. The point was that we can't put a number on it, so saying that they can both do it is silly.

Hey, you're the one who gave a ridiculous figure in the first place. We're trying to discuss the technical merits of these consoles in reasonable ways, so throwing around a super-suspect figure with no way to back it up is going to make some of us curious, that's all.

We can't put a number on it, yes, but we can make some rational guesses based on what has been provided, by looking at similar implementations in other fields (even historical ones), and by using some critical analysis. Having a somewhat rich understanding of how most of this tech works (and its role in relation to other components in an overall architecture) also goes a long way.


There's no point to this if you are misreading what I say this badly. IN GENERAL, we are seeing developers say that the PS5 storage architecture is more impressive than the XSX's, and that it's paired with an SSD that's simply faster. You are free to have a hunch that most developers speaking on it are uninformed, paid, or biased, but the point is that we have more reason to believe the PS5 SSD is far above the XSX's than to believe they are near the same level. Why aren't you basing your speculation on what's most likely to be true?

In general, we are seeing what gets put out by those in the media who have a certain angle in terms of the narrative or messaging they want to convey. Both Sony and MS butter up the media/press for desired coverage of their platforms and products, but Sony has a recent history of doing this in rather more notable form across multiple divisions, even with media platforms they own themselves. I didn't want to bring that aspect of the coverage up, but it's a hidden truth that doesn't get discussed much, partly for obvious reasons.

Anyway, aside from that, it's just like what I and others speculated would happen: some developers will prefer one platform and others the other. However, it helps to look at what connections those making these kinds of claims have to the platform they favor, because there is always some aspect of business and backend gaming politics to this type of thing. There's always a bit of embellishment for the sake of generating good PR; that goes for both Microsoft and Sony, AND for developers speaking in favor of their platforms.

Again, you're taking otherwise rational points of contention on my part and exaggerating them into speculative claims that were never stated or hinted at. We go by the paper specs, and we can see which system has the more robust SSD I/O system. However, it shouldn't be controversial to state that paper specs don't mean everything (we don't even have all the SSD I/O hardware specs for either system yet), and that the system that looks weaker on paper in this area may punch above its weight in practice, as that's already an afforded optimistic assumption for the PS5's GPU.

You guys trying to twist that into implying XvA customizations/optimizations will suddenly make it leapfrog Sony's SSD I/O are being so disingenuous it makes my smarmy smile hurt.

I'm not sure some of you here can actually be serious.

Do you really think Sony and Cerny would build a whole console around the idea that you could use fewer CUs at higher clocks more efficiently, when some even think it would be more expensive than the competition, if they could have had the same thing as the XSeX for less cost?

I think if the PS5, with its underlying technology, also had 12 TF, we would be hearing completely different things from the developers.

Exciting times are ahead^^

You can only work with what your financing allows, and that comes down to what the big wigs decide. From the outset, that limited them to a 36 CU design, though there were at least talks of a 48 CU design, going by the hypothetical example Cerny mentioned (there's not much other reason to use a 48 CU GPU as your hypothetical).

PS4 and PS4 Pro BC also limited their GPU design to 36 CUs, though, again, if they were considering a 48 CU design, they maybe could've gone for a larger chip had the corporate heads decided it was worth the cost.

What you're asking here is no different from asking why they'd go with 14 Gbps GDDR6 chips instead of 16 Gbps GDDR6 or even HBM2 because, hey, "the vision must be realized!". The truth is, financial and economic realities ALWAYS win out in deciding which way system designs go for mass-market products like game consoles; there's only so low you can go with bulk pricing in the millions, and only so much you can lose on BOM relative to MSRP before you've screwed yourself over for the long term, etc.
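To make the memory-chip example concrete, peak GDDR6 bandwidth is just per-pin data rate times bus width. A quick sketch (the 320-bit/256-bit buses and 14 Gbps parts are the published console figures; the 16 Gbps line is the hypothetical upgrade being asked about):

```python
# Peak bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte.
def gddr6_bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(gddr6_bandwidth_gb_s(14, 320))  # 560.0 -> XSX's fast pool (published)
print(gddr6_bandwidth_gb_s(14, 256))  # 448.0 -> PS5 (published)
print(gddr6_bandwidth_gb_s(16, 256))  # 512.0 -> the hypothetical 16 Gbps option
```

Those few extra GB/s are exactly the kind of line item that gets weighed against BOM cost.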

I'm sure Cerny could've designed a quantum processor system with 32 GB of HBM2E and 20 TF of power if he were building for a very specific market segment, but that wouldn't sit well with the top brass signing off on these budgets and R&D resources. That'd be like some high-level arcade machine 'ish. Sony isn't making modern-day arcade boards.
 
I agree with every point you made.

I just want to amend that Cerny didn't say it was hard to saturate higher CU counts, just that it was easier to saturate lower ones.

Now this is a more reasonable conclusion. It's always easier to saturate a smaller cluster of CUs (or their Nvidia equivalents) than a larger one. A lot of that depends on the frontend, too, and thankfully AMD made giant gains there with RDNA and especially RDNA2. Comparatively speaking, the frontend on early GCN architectures, like those in the PS4 and Xbox One, was not very good.

The real-world effect would probably be that developers, especially third-party ones, will be able to squeeze more performance out of the Series X as the gen goes on.

It will always come down to what the design needs are. From what I can see so far, it will be easier to squeeze the most performance out of Sony's platform, since a lot of the hardware there is "automating" the process to ease things for developers. However, MS's platform might have more "creative" potential in the programming implementations that can be applied, including some taking advantage of quirks in their architecture like the "split" RAM pool. But these will require more time and effort to utilize effectively, since much less of it is "automated" through hardware for devs.
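On that "split" RAM quirk specifically: XSX's published figures are 10 GB at 560 GB/s and 6 GB at 336 GB/s, and the effective bandwidth a game sees depends on how much traffic lands in the slow pool. A toy model (the traffic mixes are invented for illustration):

```python
# Toy model of XSX's asymmetric memory pools. Published figures: 10 GB at
# 560 GB/s (GPU-optimal) and 6 GB at 336 GB/s. The traffic mixes below are
# hypothetical; this just shows why data placement is a real optimization task.

def effective_bandwidth_gb_s(slow_fraction: float) -> float:
    """Blend by time-per-byte (harmonic weighting), not a simple average."""
    fast, slow = 560.0, 336.0
    time_per_gb = (1 - slow_fraction) / fast + slow_fraction / slow
    return 1 / time_per_gb

for frac in (0.0, 0.1, 0.3):
    print(f"{frac:.0%} of traffic in slow pool -> ~{effective_bandwidth_gb_s(frac):.0f} GB/s")
```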

What I personally think will happen is that third parties will peak the 10.28 TF PS5 around mid-gen, while they'd peak XSX's 12 TF a year or two later. This is in terms of GPU.

Don't worry, PS5 fans: the same principle will probably apply to the SSD as PC inevitably catches up to XSX speeds and, later on, PS5 speeds, which means there's no need to purposely gimp XSX and PS5 third-party games just to make them work on a larger number of devices.

I'd like to think that with the architectural improvements, both systems can get near their theoretical peaks. Will they hit exactly those peaks? Probably not. But the closer, the better. We can already see some of the customizations and optimizations Sony's done to try to ensure they can on their end, and I'm sure we'll learn more about MS's answers to that in the coming weeks.

The thing facing SSDs on PCs right now isn't really a lack of hardware; there are already drives from companies like Sabrent that are technically faster than what MS has (a commercial drive) and what Sony has (non-commercial). The issue is more to do with the underlying file systems and certain aspects of PC designs (not aesthetics; more how components are arranged on motherboards, etc.) that present some bottlenecks.

That's one of the reasons MS (and other companies, like Nvidia) are doing what they are, and one reason I think MS went with a more software-oriented approach to resolving those bottlenecks: to cut down on the amount of potentially expensive proprietary hardware PC owners would need to buy. They're probably looking to motherboard and CPU/GPU manufacturers to standardize parts of their hardware to be compliant with their API stacks, and making sure the software side on MS's end is scalable and powerful enough not to get in the way of what the hardware vendors produce.

It's a pretty different approach from Sony's, but it will also allow them to scale and optimize it for specific market implementations like Xbox, Surface, Azure, etc. And since a lot of it is software-driven, it can be updated relatively easily over time.
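As a crude illustration of the file-system point: on PC, per-request software overhead dominates when assets come in as thousands of tiny reads, which is the bottleneck a DirectStorage-style batched API goes after. A runnable toy (this times ordinary OS reads on whatever drive you run it on; it is not DirectStorage itself, just the idea behind it):

```python
import os
import tempfile
import time

# Read the same 64 MB file as many 4 KiB requests vs. a few 4 MiB requests.
# The data moved is identical; the difference is per-request software overhead.
SIZE = 64 * 1024 * 1024
path = os.path.join(tempfile.gettempdir(), "io_overhead_demo.bin")
with open(path, "wb") as f:
    f.write(os.urandom(SIZE))

def timed_read(chunk_size: int) -> float:
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(chunk_size):
            pass
    return time.perf_counter() - start

print(f"4 KiB requests: {timed_read(4 * 1024):.3f} s")
print(f"4 MiB requests: {timed_read(4 * 1024 * 1024):.3f} s")
os.remove(path)
```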
 

killatopak

Gold Member
It's a pretty different approach from Sony's, but it will also allow them to scale and optimize it for specific market implementations like Xbox, Surface, Azure, etc. And since a lot of it is software-driven, it can be updated relatively easily over time.
This is going to be something to look forward to in the future. There's a lot of potential in Xbox tech crossing over to their other ventures.

We're not entirely sure which method is more cost-effective, but I can safely say that Xbox's method has a much higher chance of being adopted by the market, which may prove to be an advantage in the long run.
 

jimbojim

Banned
Can you please stop with this console warring nonsense? Also, your username checks out. lol

What is so smart about it? Is that why devs already have issues with the PS5 at the beginning of the gen? They have to throttle the CPU to make sure the GPU can run at a sustained clock:

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core.


Throttling the CPU isn't a huge problem at the beginning of the gen, because games are still designed with Jaguar in mind, but soon enough, when next-gen-only games arrive, devs won't be able to just throttle the CPU to keep a sustained GPU clock. There will be games that need 100% of the CPU clock.

By that logic, I would say the PS5 is one of the worst architectures.

Why
Do
You
Spread
The
FUD?

Why
Are
You
Doing
This??


Why didn't you quote the whole text instead of part of it, taking it out of context?

The CPU throttling is related to cross-gen games designed with Jaguar in mind:

There's likely more to discover about how boost will influence game design. Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense as most game engines right now are architected with the low performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.
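For what it's worth, the boost scheme Cerny describes is easy to model as a fixed power budget shared between CPU and GPU. A toy sketch, assuming the common first-order rule that dynamic power scales roughly with the cube of frequency; the wattage numbers are invented, and only the 2.23 GHz and 3.5 GHz caps are the published clocks:

```python
# Toy model of PS5-style variable frequency: a fixed total power budget is
# split between CPU and GPU, and each unit's clock follows its power share.
# Assumes power ~ f^3 (rough first-order rule); all wattages are made up.

TOTAL_W = 200.0
GPU_CAP_GHZ, GPU_FULL_W = 2.23, 160.0  # clock cap, watts needed to hold it
CPU_CAP_GHZ, CPU_FULL_W = 3.5, 60.0

def clock_ghz(cap_ghz: float, full_w: float, watts: float) -> float:
    return cap_ghz * min(1.0, (watts / full_w) ** (1 / 3))

for cpu_w in (60.0, 50.0, 40.0):  # progressively "throttling back the CPU"
    gpu_w = TOTAL_W - cpu_w
    print(f"CPU {clock_ghz(CPU_CAP_GHZ, CPU_FULL_W, cpu_w):.2f} GHz | "
          f"GPU {clock_ghz(GPU_CAP_GHZ, GPU_FULL_W, gpu_w):.2f} GHz")
```

With these made-up numbers, pulling the CPU from 60 W down to 40 W is what lets the GPU hold its full 2.23 GHz, which is exactly the trade the quoted developers describe making.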

 
This is going to be something to look forward to in the future. There's a lot of potential in Xbox tech crossing over to their other ventures.

We're not entirely sure which method is more cost-effective, but I can safely say that Xbox's method has a much higher chance of being adopted by the market, which may prove to be an advantage in the long run.

I have my own hunch that Sony's is probably the cheaper one, at least in terms of some component costs like the NAND modules, and they're also not using proprietary software standards like BCPack, so there are R&D costs saved on that end.

Indeed, though, MS's solution is targeted at market-sector saturation. There are some aspects of PS5's setup that Sony could leverage in other products, like certain audio systems (if they still had the VAIO line, they could've leveraged the SSD I/O there as well, aimed at productivity use), but overall it's a smaller range of markets unless they license the tech out to other partners. Historically, Sony hasn't done that unless it's a tech they were only a partner in from the start, like Cell and Blu-ray.

They do benefit from the potentially massive gaming-market saturation with PS5, if previous performance is anything to go by, though.
 
Why
Do
You
Spread
The
FUD?

Why
Are
You
Doing
This??


Why didn't you quote the whole text instead of part of it, taking it out of context?

The CPU throttling is related to cross-gen games designed with Jaguar in mind:




Yo be like

 

Exodia

Banned
I think PS5 can stream data per frame like the UE5 demo, and PS5-only games from Sony that do this will look a gen above anything on XSX.

There will be 2 classes of game visuals next gen.

Do you think 20 TF would have been able to run that UE5 demo at that detail? No.

Are you seriously trying to say the difference between PS5 exclusive games and XSX games will be like RDR vs. RDR2, TLOU vs. TLOU2, GOW3 vs. Gears 5?
Or are you just trolling?

 

Exodia

Banned
They said they will release an early version in early 2021, but the final, full-featured version in late 2021.

That's not how Unreal Engine/Epic releases work. The entire engine, with its full feature set, will be released, including the source code and the exact PS5 demo.
 

THE:MILKMAN

Member
That's not how Unreal Engine/Epic releases work. The entire engine, with its full feature set, will be released, including the source code and the exact PS5 demo.

Direct from Epic's site:

Epic said:
Unreal Engine 5 will be available in preview in early 2021, and in full release late in 2021, supporting next-generation consoles, current-generation consoles, PC, Mac, iOS, and Android.

 
You are talking about a theoretical maximum (where all the CUs are simultaneously running at full load).
Any GPU that does that for a prolonged time is forced to throttle down.
There is no such thing as maximum sustained teraflops.

Stop with this stupidity, FFS.

At all times you can access up to 12.15 TFLOPS.

This isn't hard. You are talking about utilization.

TFLOPS have nothing to do with utilization.
 

FranXico

Member
At all times you can access up to 12.15 TFLOPS.

This isn't hard. You are talking about utilization.

TFLOPS have nothing to do with utilization.
The teraflops number any manufacturer gives you is a maximum based on full utilization. If the frequency is fixed, you need full utilization to reach that number of floating-point operations. If the frequency is variable, you need both max frequency and full utilization to reach it.

It's not hard to understand the concept of a theoretical maximum.

It isn't some "magic power" one can just "tap into at any time". The actual computation done on the GPU depends on the workload.
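For reference, the headline figure is just arithmetic over the shader array: peak FP32 FLOPS = CUs × 64 lanes per CU × 2 ops per cycle (an FMA counts as two) × clock. Using the published CU counts and clocks:

```python
# Peak FP32 TFLOPS for an RDNA-style GPU:
# CUs * 64 shader lanes per CU * 2 FLOPs per lane per cycle (FMA) * clock (GHz).
def peak_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"XSX: {peak_tflops(52, 1.825):.2f} TF")  # 12.15 TF at a fixed clock
print(f"PS5: {peak_tflops(36, 2.23):.2f} TF")   # 10.28 TF at the max boost clock
```

Both numbers assume every lane retires an FMA every cycle, which is exactly the full-utilization ceiling being described above.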
 

Ps5ProFoSho

Member
Someone got butthurt the other day and I almost got banned for being a Mr. Meany Pants. This is from 2017 and I'll just leave it here.
 
Yes, I understand, but the upcoming RDNA 2 GPU die sizes are rumoured to be quite large. I can't wait to see the PS5 breakdown; Cerny mentioned a hefty-sized GPU.

You're definitely right about that. I remember Cerny said the PS5's CUs were around 40% larger than the Pro's. Not to mention there's that I/O within the chip.

I would definitely expect the chip to be large, but smaller than the XSX's.
 
The PlayStation 5 is, without a shadow of a doubt, not only the fastest but also the smartest and most innovative platform ever built.

The "best architecture in history" is not a vain claim. PS5 will redefine how powerful consoles are measured, and many will be totally confused.

PS5 was built under a different and unique philosophy; it's not supposed to be judged by the same old criteria. It will completely revolutionize the game-development environment, letting creators express their creations to the fullest. Thanks to speed being its core philosophy, which is far more demanding and necessary these days, devs will finally be free, no longer forced to put in limitations and use old gamey tricks, as the SSD and super-customized I/O allow them to stream the whole scenario with all the detail properly loaded in the blink of an eye.
 

CurtBizzy

Member
You're definitely right about that. I remember Cerny said the PS5's CUs were around 40% larger than the Pro's. Not to mention there's that I/O within the chip.

I would definitely expect the chip to be large, but smaller than the XSX's.
He mentioned the CUs being 62 percent larger; that's why I'm surprised at the XSX die size.
 

CurtBizzy

Member
Thanks for the correction.

Yeah, those two chips are going to be pretty big.
The XSX die size is identical to the Xbox One X's; shouldn't the XSX be larger if each CU is 62 percent bigger? Does Mark Cerny's statement apply to the XSX? I know he stressed that the PS5 has a custom GPU and is fairly large. The upcoming RDNA 2 GPU die sizes are expected to be huge, almost twice the size of the XSX's.
 
The XSX die size is identical to the Xbox One X's; shouldn't the XSX be larger if each CU is 62 percent bigger? Does Mark Cerny's statement apply to the XSX? I know he stressed that the PS5 has a custom GPU and is fairly large. The upcoming RDNA 2 GPU die sizes are expected to be huge, almost twice the size of the XSX's.

Honestly, if we are just looking at the GPUs, the XSX's should be bigger than the PS5's.

But what surprises me is that you said the XSX APU is the same size as the One X's. I would expect it to be bigger.
 

LordOfChaos

Member
The XSX die size is identical to the Xbox One X's; shouldn't the XSX be larger if each CU is 62 percent bigger? Does Mark Cerny's statement apply to the XSX? I know he stressed that the PS5 has a custom GPU and is fairly large. The upcoming RDNA 2 GPU die sizes are expected to be huge, almost twice the size of the XSX's.

There's probably a bunch of uncore stuff (everything between the GPU CUs and CPU cores) that doesn't need to scale up to the same degree, such that 7nm brought the total APU size down to even.
 

Gamerguy84

Member
I keep forgetting we have never seen an RDNA 2 card in action. Both machines are extremely powerful. The quoted bit is from Tom's Hardware and is GCN vs RDNA 1.

How much do these changes matter when it comes to actual performance and efficiency? It's perhaps best illustrated by looking at the Radeon VII, AMD's last GCN GPU, and comparing it with the RX 5700 XT. Radeon VII has 60 CUs, 3840 GPU cores, 16GB of HBM2 memory with 1 TBps of bandwidth, a GPU clock speed of up to 1750 MHz, and a peak performance rating of 13.8 TFLOPS. The RX 5700 XT has 40 CUs, 2560 GPU cores, 8GB of GDDR6 memory with 448 GBps of bandwidth, and clocks at up to 1905 MHz with peak performance of 9.75 TFLOPS.

On paper, Radeon VII looks like it should come out with an easy victory. In practice, across a dozen games that we've tested, the RX 5700 XT is slightly faster at 1080p gaming and slightly slower at 1440p. Only at 4K is the Radeon VII able to manage a 7% lead, helped no doubt by its memory bandwidth.
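Running the quick math on those quoted specs shows how little the paper number transfers across architectures:

```python
# Paper compute vs. measured result, using the figures quoted above.
vii_tf, xt_tf = 13.8, 9.75
print(f"Radeon VII paper advantage: {vii_tf / xt_tf - 1:.0%}")  # ~42%
# Measured (per the quote): roughly even at 1080p/1440p, VII ~7% ahead at 4K.
print(f"Implied RDNA perf per paper TFLOP at 4K: {(vii_tf / xt_tf) / 1.07:.2f}x GCN's")
```

A 42% paper deficit shrinking to roughly even-to-7% is why straight TFLOPS comparisons between GCN and RDNA (let alone RDNA 2) don't tell you much.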
 

longdi

Banned
The XSX die size is identical to the Xbox One X's; shouldn't the XSX be larger if each CU is 62 percent bigger? Does Mark Cerny's statement apply to the XSX? I know he stressed that the PS5 has a custom GPU and is fairly large. The upcoming RDNA 2 GPU die sizes are expected to be huge, almost twice the size of the XSX's.

He did label it 'RDNA2 CU' vs 'PS4 CU', not 'PS5 CU' vs 'PS4 CU'. Is the Series X using RDNA2? :goog_hugging_face:

Besides, the PS4 CU is different from the Xbox One X/PS4 Pro CU too.
 

CurtBizzy

Member
He did label it 'RDNA2 CU' vs 'PS4 CU', not 'PS5 CU' vs 'PS4 CU'. Is the Series X using RDNA2? :goog_hugging_face:

Besides, the PS4 CU is different from the Xbox One X/PS4 Pro CU too.
I understand your point, but even Digital Foundry was surprised by the die size. We will have to wait and see.
 

longdi

Banned
At all times you can access up to 12.15 TFLOPS.

This isn't hard. You are talking about utilization.

TFLOPS have nothing to do with utilization.

Exactly!

In an imaginary future game, say developers need a sustained 3.4 GHz of AVX2* + 12 TF of compute + 2.4 GB/s of I/O. Series X, according to MS, can always deliver that.

Can the PS5 feed this game the constant power required?

*Zen 2 runs AVX2, in the worst case, at the base clock, meaning it only loses its boost clocks. There are no separate, lowered AVX2 clocks like on Intel.
To be fair, since neither console made claims of sustained CPU load (except maybe Sony, who kinda hinted their thermal solution will eradicate thermal throttling), IMO 3.4 GHz is a fair number.
 

longdi

Banned
I understand your point, but even Digital Foundry was surprised by the die size. We will have to wait and see.

In the best case, Sony made hardware customisations to RDNA2, increasing the size and perf/CU beyond what AMD will launch in their new GPUs, beyond the Series X CU.

Perhaps this customisation is what resulted in a 2.23 GHz clock that can sustain its performance 95% of the time.

This is good for gamers; I am happy, hats off to Mark Cerny, and I apologise to the astroturfers.

But PS4 and PS4 Pro did not have such changes, and it's labeled 'RDNA2', so I'm skeptical.

DF needs to ask tougher questions next round. :goog_hugging_face:
 

Tripolygon

Banned
Sorry, I deal with independently verifiable facts, not misinformation.

Every person who was on the Nanite team or took part in its development was hired in 2019 or early 2020. Period.

The work on the I/O also started in 2019.

Again, I deal in independently verifiable facts, not misinformation like you.
First, Epic was not in financial trouble; they had big backers in Tencent, which owns a 40% stake in the company, and various other companies like Disney.

Umm, Brian Karis left Naughty Dog and went back to Epic in 2014. He is the technical director of graphics at Epic. Nanite was based on years of research he had been doing.

Lumen started development before this gen started. It is the continuation of the SVOGI technology that Epic wanted to launch with Unreal Engine 4 but removed due to how expensive it was to run.

PS5 I/O development started, or at least its conceptual phase was finished, by 2016, as that was when the patent was filed.

The virtualized geometry and SVOGI tech in Unreal Engine 5 have been areas of research at Sony since the PS4 launched. One of the first console games to use similar GI tech was The Tomorrow Children. For the record, Microsoft has been researching this area of technology as well; they have lots of papers about it.

It is fascinating to watch you turn years of research, collaboration and development into mere months of happenstance talks. Consoles are no longer developed in isolation like they used to be in previous generations. PS4 was a turning point for Sony. This is not to say Microsoft does not collaborate, but in this instance Tim Sweeney singles out Sony as a close partner in this area.
Sweeney: We’ve been working super-closely with Sony for quite a long time on the storage architecture and other elements. It’s been our primary focus. But Unreal Engine 5 will be on all next-generation platforms, and so will Fortnite.
 

Exodia

Banned
First, Epic was not in financial trouble; they had big backers in Tencent, which owns a 40% stake in the company, and various other companies like Disney.

Umm, Brian Karis left Naughty Dog and went back to Epic in 2014. He is the technical director of graphics at Epic. Nanite was based on years of research he had been doing.

Lumen started development before this gen started. It is the continuation of the SVOGI technology that Epic wanted to launch with Unreal Engine 4 but removed due to how expensive it was to run.

PS5 I/O development started, or at least its conceptual phase was finished, by 2016, as that was when the patent was filed.

The virtualized geometry and SVOGI tech in Unreal Engine 5 have been areas of research at Sony since the PS4 launched. One of the first console games to use similar GI tech was The Tomorrow Children. For the record, Microsoft has been researching this area of technology as well; they have lots of papers about it.

It is fascinating to watch you turn years of research, collaboration and development into mere months of happenstance talks. Consoles are no longer developed in isolation like they used to be in previous generations. PS4 was a turning point for Sony. This is not to say Microsoft does not collaborate, but in this instance Tim Sweeney singles out Sony as a close partner in this area.

How about you actually read what I posted concerning the development of Lumen and Nanite rather than spreading misinformation?
It's always the ones who don't use or follow UE development who try, and spectacularly fail, to explain to you what's going on with the engine. Unbelievable.
 
At all times you can always access up to 12.15 tflops.

This isnt hard. You are talking utilization.

Tflops have nothing to do with utilization.
Why do you think async compute has been used so extensively this gen? Because current CU occupancy was only around 40%. Async compute allows higher occupancy, maybe 45% or so.
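A toy scheduler model of that idea (the occupancy numbers are invented to match the rough 40%/45% figures above, not measured):

```python
import random

# Toy model: each cycle the graphics queue fills only part of the ALU slots
# (memory stalls, geometry-bound phases, etc.); an async compute queue can be
# issued into whatever is left. All figures are illustrative, not measured.
random.seed(0)
CYCLES, SLOTS = 100_000, 100

gfx_used = async_used = 0
for _ in range(CYCLES):
    gfx = random.randint(30, 50)                     # graphics rarely fills the machine
    extra = min(random.randint(0, 10), SLOTS - gfx)  # async fills some of the gaps
    gfx_used += gfx
    async_used += extra

total = CYCLES * SLOTS
print(f"occupancy, graphics only    : {gfx_used / total:.0%}")
print(f"occupancy, graphics + async : {(gfx_used + async_used) / total:.0%}")
```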
 

geordiemp

Member
Are you seriously trying to say the difference between PS5 exclusive games and XSX games will be like RDR vs. RDR2, TLOU vs. TLOU2, GOW3 vs. Gears 5?
Or are you just trolling?

I was responding to a troll with a troll. However, a Microsoft dev who worked on back compat is now at HoloLens:



I don't believe XSX will be streaming data per frame (milliseconds) at all; it will be X seconds of potential gameplay. That will make for a potential asset-quality difference between games that stream data per frame and those that stream in gameplay chunks.
 

Tripolygon

Banned
How about you actually read what I posted concerning the development of Lumen and Nanite rather than spreading misinformation?
It's always the ones who don't use or follow UE development who try, and spectacularly fail, to explain to you what's going on with the engine. Unbelievable.
I read what you wrote. It is pure conjecture and folly. Some people getting hired in 2017 to 2019 does not mean the work had not started before then. You do the same thing I see lots of people do: find disparate information, put it together, and form conclusions that are just outright wrong. You know how software and game development starts? A few core people start the project, and the team keeps growing until the final year it ships.

You started your premise on:

Remember that Epic Games had money issues till Fortnite BR blew up at the end of 2017
This is categorically false. Epic had tons of cash going back to 2012, when Tencent bought a 40% stake in the company for over 300 million.

Lumen is not Lightmass-based GI; it is SVOGI. Unreal Engine 4 switched from SVOGI to Lightmass-based GI, and that was way back in 2014.

This is Lightmass:
Most games approximate global illumination by pre-calculating it through a system called light maps, generated offline and essentially 'baked' into the scene via textures. With this, a scene has global illumination, but lights cannot move and the lighting and objects affected by it are completely static. The lighting is essentially attached to the surface of the objects in the game scene. In addition to this, this lightmap only affects diffuse lighting, so specular lighting - reflections like those found on metals, water and other shiny materials - have to be done in a different manner, through cube maps or screen-space reflections.

This is what Lumen is doing:
"Lumen uses ray tracing to solve indirect lighting, but not triangle ray tracing," explains Daniel Wright, technical director of graphics at Epic. "Lumen traces rays against a scene representation consisting of signed distance fields, voxels and height fields. As a result, it requires no special ray tracing hardware."

To achieve fully dynamic real-time GI, Lumen has a specific hierarchy. "Lumen uses a combination of different techniques to efficiently trace rays," continues Wright. "Screen-space traces handle tiny details, mesh signed distance field traces handle medium-scale light transfer and voxel traces handle large scale light transfer."

Lumen uses a combination of techniques, then: to cover bounce lighting from larger objects and surfaces, it does not trace triangles but uses voxels instead, which are boxy representations of the scene's geometry. For medium-sized objects, Lumen traces against signed distance fields, best described as another slightly simplified version of the scene geometry. And finally, the smallest details in the scene are traced in screen-space, much like screen-space global illumination.
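Pieced together from Wright's description, the dispatch logic is essentially a fallback chain across three scene representations. A sketch of that hierarchy (all names, thresholds and stub tracers here are invented stand-ins, not Epic's code):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Ray:
    length: float  # how far this GI ray travels, in metres

MID_RANGE = 2.0  # invented cutoff between SDF and voxel tracing

# Stub tracers standing in for the three representations Wright names.
def trace_screen_space(ray: Ray) -> Optional[str]:
    return "screen-space hit" if ray.length < 0.1 else None  # tiny details

def trace_mesh_sdf(ray: Ray) -> Optional[str]:
    return "mesh SDF hit" if ray.length < MID_RANGE else None  # medium scale

def trace_voxel_scene(ray: Ray) -> Optional[str]:
    return "voxel hit"  # boxy whole-scene representation, large-scale fallback

def trace_gi_ray(ray: Ray) -> Optional[str]:
    hit = trace_screen_space(ray)     # 1. cheapest, finest detail
    if hit is None:
        hit = trace_mesh_sdf(ray)     # 2. simplified per-mesh geometry
    if hit is None:
        hit = trace_voxel_scene(ray)  # 3. coarsest, catches everything else
    return hit

for length in (0.05, 1.0, 10.0):
    print(f"ray of {length} m -> {trace_gi_ray(Ray(length))}")
```

The point of the chain is that no stage ever traces actual triangles, which is why it needs no ray-tracing hardware.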

You're trying to minimize Tim Sweeney's statement on how they collaborated with Sony:
1) The discussions with Sony/MS about what's needed for next-gen graphics and storage architecture started ~4 years ago. This is simply Epic Games, like other developers, telling Sony what they wanted. This is nothing special. When Epic got confirmation from MS and Sony that they would use SSDs, they began development on Nanite. Development WAS on PC and had NOTHING to do with the PS5 till 2019, when the specs, devkits and API shipped. How is it you people cannot understand?

2) The demo project by the special projects team was started in late 2019. This is why it says "we've been working with Sony over the past months".

You're still spreading the idea that the demo runs on a laptop, with lots of details that you pulled out of nowhere. This is how you catch a liar: they start adding embellishments and details that were not present in the original speculation.
Nanite has already been proven to work on a laptop at 1440p at 40 fps using a (most likely SATA) SSD that's probably plugged in through USB. This is without an onboard NVMe SSD, hardware decompression, a custom-designed heat sink to sustain throughput, or DirectStorage to eliminate CPU and GPU bottlenecks, and with two sets of memory bottlenecks.

Where do you think development of Nanite happened? On devkits? Devkits weren't here 3, 4, or 5 years ago. They only came last year. It happened on PCs!

Once they finally got the devkits last year, they worked on improving their I/O, motivated by Sony's fast SSD specs, as they wanted to take full advantage of them.

Since Epic ALWAYS showcases a demo on the new PlayStation every next gen, when that time came, Q4 2019, that was when the collaboration on 'Lumen in the Land of Nanite' started.
And you apparently know more details about when the Epic and Sony collaboration started than Tim Sweeney, the CEO and a graphics programmer of the company. Talk about spreading misinformation.
Where do you think development of Nanite happened? On devkits? Devkits weren't here 3, 4, or 5 years ago. They only came last year. It happened on PCs!

Once they finally got the devkits last year, they worked on improving their I/O, motivated by Sony's fast SSD specs, as they wanted to take full advantage of them.
Games are developed on PCs regardless of devkits. And you even know when Epic got finalized devkits too.
 

geordiemp

Member
I read what you wrote. It is pure conjecture and folly. Some people getting hired in 2017 to 2019 does not mean the work had not started before then. You do the same thing I see lots of people do: find disparate information, put it together, and form conclusions that are just outright wrong.

You started your premise on:


This is categorically false. Epic had tons of cash going back to 2012, when Tencent bought a 40% stake in the company for over 300 million.

Lumen is not Lightmass-based GI; it is SVOGI. Unreal Engine 4 switched from SVOGI to Lightmass-based GI, and that was way back in 2014.

This is Lightmass:


This is what Lumen is doing:


You're trying to minimize Tim Sweeney's statement on how they collaborated with Sony.

You're still spreading the idea that the demo runs on a laptop, with lots of details that you pulled out of nowhere.

And you apparently know more details about when the Epic and Sony collaboration started than Tim Sweeney, the CEO and a graphics programmer of the company. Talk about spreading misinformation.

Games are developed on PCs regardless of devkits. And you even know when Epic got finalized devkits too.

Yeah, I gave up. The events of hiring more engineers in 2018, starting that demo in earnest later, and talking deeply with Epic for 4 years about graphics and SSD streaming are not mutually exclusive.

I believe what Tim Sweeney said over the Xbox Discord chatter.
 

ZywyPL

Banned
He mentioned the CUs being 62 percent larger; that's why I'm surprised at the XSX die size.

62% more transistors, which themselves are about half the size of the 14nm ones found in the PS4/Pro. The same rule applies to the CPU and the I/O controllers; that's why the XSX APU, despite being way more powerful, is still the same size as the 1X's. So the PS5 GPU will actually be even smaller than the one found in the PS4 Slim.
 

pasterpl

Member
PS4-gen, not PS4 games. That includes XBO games, which will be the majority of what the XSX gets for the first two years. Even first-party games aren't an exception.

The majority of games within the first 12-24 months, for both consoles, will be 3rd-party cross-gen games. Obviously Sony will release some 1st-party next-gen exclusives, and there will be some 3rd-party next-gen-only titles, but the vast majority will be cross-gen 3rd-party games.

I am also wondering if the rumoured $500M Halo Infinite budget is so high because they are working on two versions of the game in parallel (one for this gen and a second that's next-gen only).
 

kuncol02

Banned
I have my own hunch that Sony's is probably the cheaper one
If AMD didn't upgrade RDNA2 clock speeds a lot, then I wouldn't be surprised if the opposite were true. That clock is really at the end of the power curve for current RDNA cards and would lower yields significantly. We'll see more when RDNA2 cards finally hit the market. In addition, if the final PS5 is anything like the devkit, it will be much more expensive to manufacture than the Xbox.
 

Ascend

Member
I understand your point, but even Digital Foundry was surprised by the die size. We will have to wait and see.
It is an impressive die size, and a testament to the advances AMD has made. At 360 mm², you can guesstimate the size of the GPU. A Ryzen chiplet is about 75 mm²; that leaves about 285 mm² for the GPU of the XSX.

We have the 5700 XT with 40 CUs at 251 mm². The XSX has 30% more CUs, but only a 13% larger size. So AMD must have made some great advances here. So yes, the die size is surprising. And this is one of the reasons people think AMD will finally be able to compete with Nvidia, but that's another topic for another thread.
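Extending that guesstimate one more step (the 56 physical CUs on the XSX die, with 52 enabled, is the published figure; the per-CU split ignores uncore, so treat it as rough):

```python
# Rough silicon cost per CU, ignoring memory controllers and other uncore.
# 5700 XT (Navi 10): 40 CUs in ~251 mm^2 total.
# XSX GPU portion: ~285 mm^2 estimated above, 56 physical CUs on die.
print(f"Navi 10: {251 / 40:.1f} mm^2 per CU")  # ~6.3
print(f"XSX    : {285 / 56:.1f} mm^2 per CU")  # ~5.1, despite the RDNA 2 additions
```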
 

Exodia

Banned
I read what you wrote. It is pure conjecture and folly. Some people getting hired in 2017 to 2019 does not mean the work had not started before then. You do the same thing I see lots of people do: find disparate information, put it together, and form conclusions that are just outright wrong. You know how software and game development starts? A few core people start the project, and the team keeps growing until the final year it ships.

You have absolutely no clue what you are talking about.
Brian already detailed the members of the Nanite team and those who worked on developing it. Everyone on this team was hired in 2019/2020.



You started your premise on:
This is categorically false. Epic had tons of cash going back to 2012, when Tencent bought a 40% stake in the company for over 300 million.

If you are not a part of the UE community, you don't know the struggle for available engineering resources to develop new features. Epic has been very open about it; Epic engineers literally complained they didn't have enough available resources. But of course you are oblivious to that, since you are not a part of the community and didn't follow the engine's development. So why are you speaking on something you lack any knowledge of? Google searches just expose your lack of knowledge on the subject; they mean nothing.

Lumen is not Lightmass-based GI; it is SVOGI. Unreal Engine 4 switched from SVOGI to Lightmass-based GI, and that was way back in 2014.
This is Lightmass
This is what Lumen is doing

I never said Lumen was Lightmass. Again, your lack of knowledge is showing. This stems from the fact that you don't use Unreal and are not a part of the community.
All you do is look up definitions. If you had actually read my post, you would have seen I said, "Now I should note that Lumen is a combination of pre-existing features into one, with improvements of course."

In a previous thread I already explained IN DETAIL what Lumen consists of.


Notice that SSGI was developed IN 2019. Ray tracing, which is separate, was also developed in 2019. We community members knew. Why?

BECAUSE EPIC GAMES TOLD US THEY HIRED A BUNCH OF PEOPLE JUST TO DEVELOP IT.

A specific team was hired to work on improving Lightmass, another separate team was hired to develop ray tracing, and another for the replacement for Lightmass ("there are plans to replace existing Lightmass underway"). They literally talked about it in the livestreams and on their forums. You are not a part of the community; that's why you clearly have no clue.

And you apparently know more details about when the Epic and Sony collaboration started than Tim Sweeney, the CEO and a graphics programmer of the company. Talk about spreading misinformation.

I have more details because I'm an Unreal Engine user and a community member who has followed the development of the engine since 4.0, watched all the livestreams, and routinely follows and posts on the forum. Project managers and Epic developers routinely divulge internal details on the streams and the forum, some of which I have posted. This includes who they are hiring, what's a priority for development, and what isn't being developed anymore. Heck, Daniel Wright literally told us he wasn't working on dynamic GI anymore. LITERALLY. This was the biggest discussion point in the community. When they restarted development, they told us. Of course you didn't know that. How could you?
 