
HW| Does Xbox Series X really have 12 teraflops?

onQ123

Member
I know exactly what it is, trust me. We saw the results in Doom Eternal pretty clearly, and the comments by id, a pretty respected developer, were very clear also.
Now we're getting 4K 60fps with in-game RT, so some smart hardware choices were made.
As long as you know. & yes, it's smart design, just like the PS4 Pro's hardware enhancements for checkerboard rendering were at the time.

It was attacked but now look at the whole industry 😂
 

Godfavor

Member
I think that it's not the console but the API that is the culprit.

With DX11, where more of the behavior was fixed by the API, devs didn't need to do a lot of optimization to get games running well without issues; the API was standardized, so experienced devs knew exactly what to do to get the most out of it.

As DX12 Ultimate is closer to the metal, devs need more time to optimize their engines and code for it. DX12 can deliver higher performance than DX11, but it needs a ton of work from the devs. Most devs won't spend the extra time optimizing when they could release a game with some hiccups and 10% lower performance; if it's "good enough" in the pre-release testing period, it won't really matter to them.

Optimizing a game engine "to the metal" isn't something that can be done in the final stage before release (which is when devs normally tune performance and fix bugs); the game has to be made to run well from the start with DX12 in mind.

Taking into account the diversity of PC hardware (AMD, Nvidia, Intel) AND the Xbox Series X and S on top of that, it's really no wonder Xbox and PC are in bad shape relying on a single API for everything.
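
To make the "closer to the metal" point concrete, here's a minimal sketch (my own illustration, not from any shipping engine; the command list and texture are assumed to exist already) of the explicit resource-state bookkeeping D3D12 puts on the developer, which the D3D11 driver used to handle automatically:

#include <d3d12.h>

// Hypothetical helper: before a texture that was just copied into can be
// sampled in a pixel shader, D3D12 requires the app to transition its state
// explicitly. D3D11 tracked and resolved these hazards behind the scenes.
void TransitionForSampling(ID3D12GraphicsCommandList* cmdList, ID3D12Resource* texture)
{
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = texture;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_COPY_DEST;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    cmdList->ResourceBarrier(1, &barrier);
}

Get one of these (or the fences, descriptor heaps, memory residency, etc.) wrong and the game stutters or crashes instead of the driver quietly fixing it - that's where the extra optimization time goes.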
 

onQ123

Member
I think that it's not the console but the API that is the culprit.

With DX11, where more of the behavior was fixed by the API, devs didn't need to do a lot of optimization to get games running well without issues; the API was standardized, so experienced devs knew exactly what to do to get the most out of it.

As DX12 Ultimate is closer to the metal, devs need more time to optimize their engines and code for it. DX12 can deliver higher performance than DX11, but it needs a ton of work from the devs. Most devs won't spend the extra time optimizing when they could release a game with some hiccups and 10% lower performance; if it's "good enough" in the pre-release testing period, it won't really matter to them.

Optimizing a game engine "to the metal" isn't something that can be done in the final stage before release (which is when devs normally tune performance and fix bugs); the game has to be made to run well from the start with DX12 in mind.

Taking into account the diversity of PC hardware (AMD, Nvidia, Intel) AND the Xbox Series X and S on top of that, it's really no wonder Xbox and PC are in bad shape relying on a single API for everything.
In the PS4 vs Xbox One days people swore that once Xbox One started using DX12 it was going to catch up to PS4; now people are blaming DX12 for Xbox Series X not flexing its muscles against PS5.
 
Last edited:

Godfavor

Member
In the PS4 vs Xbox One days people swore that once Xbox One started using DX12 it was going to catch up to PS4; now people are blaming DX12 for Xbox Series X not flexing its muscles against PS5.
There was a huge difference in raw power in favor of the PS4 vs the Xbox One; there was no way Xbox could match it.
 

Fafalada

Fafracer forever
Devs still haven't been able to really grasp DirectX 12, so I don't expect them to fully utilize the given HW/API. On PlayStation, you simply have no other choice. It's more of a Windows vs Apple software mentality.
That doesn't really line up with the 'need to optimize Xbox more' though. The slightly lower level API on PS actually makes it more work to do the same things there - and while that 'may' open additional optimization opportunities - it doesn't come with *less* effort to do them, more the opposite.

it doesn't make sense that a 3080 needs more work or performs worse than the 3060
3060 is specifically designed to perform worse than 3080, so they can charge less for it.
In the case of the two consoles, neither is 'designed' to be faster/slower than the other - they simply optimized for roughly the same constraints (power, cost, yields) differently.
 

DenchDeckard

Moderated wildly
How many belly flops?

 
Last edited:

M1chl

Currently Gif and Meme Champion
That doesn't really line up with the 'need to optimize Xbox more' though. The slightly lower level API on PS actually makes it more work to do the same things there - and while that 'may' open additional optimization opportunities - it doesn't come with *less* effort to do them, more the opposite.


3060 is specifically designed to perform worse than 3080, so they can charge less for it.
In the case of the two consoles, neither is 'designed' to be faster/slower than the other - they simply optimized for roughly the same constraints (power, cost, yields) differently.
I am not sure I'm getting you. Pretty much all the DX12 ports run pretty badly; it's not so much "optimize more on Xbox", just optimizing at all would be enough. And by that I mean calling the actual SDK calls which were added for this console. Which is encouraged, but not required - it compiles PC code just fine.

Also, the PlayStation API really does not allow you to call APIs which are no longer there, since they've been removed for the specific SKU (PS4, PS4 Pro, PS5, ...). This is in line with what Apple is doing with their Metal API, and that's why you can run Resi 8 pretty well on a low-powered device like a MacBook.

Optimizing for the target HW is one thing, but another is to specifically use the subset of APIs designed for that HW.
 

Fafalada

Fafracer forever
Good stuff. As a non-gaming developer, I use Visual Studio 2022 every day. I've always been a huge fan of that IDE, but for me, VS 2022 has been a buggy mess.
VS still has that cadence of good/bad releases, doesn't it? I vaguely recall 2019 was the 'good' one, 2017 sucked, etc. - but it's been a while since I've been allowed near code. Wasn't sure if the pattern still holds.
 

Lysandros

Member
but there's also the potential that people expecting more simply misunderstood the paper specs and only focused on a couple of metrics, failing to understand the other important factors in gaming hardware performance, etc.
I definitely think this is the one. I was always at odds with the "PS5 is performing beyond its specs" or "XSX is performing below its specs" narrative. A truly objective look at those famous "paper specs", together with some basic knowledge of the architectures, shows an undeniably balanced picture. When the base hardware metrics are so evenly distributed, even a slight difference in efficiency (be it hardware- or software-based) can make the difference.
 
Last edited:

DeepEnigma

Gold Member
As long as you know. & yes, it's smart design, just like the PS4 Pro's hardware enhancements for checkerboard rendering were at the time.

It was attacked but now look at the whole industry 😂
A forward thinker, that Cerny guy.

I also remember the backhanded compliments and downplaying DF (especially Alex/Dictator) did at the time... only to fast-forward to the current year with them wanting more of it to optimize performance for their ray-traced Holy Grail.
 
Last edited:

Fafalada

Fafracer forever
I am not sure I'm getting you. Pretty much all the DX12 ports run pretty badly; it's not so much "optimize more on Xbox", just optimizing at all would be enough. And by that I mean calling the actual SDK calls which were added for this console. Which is encouraged, but not required - it compiles PC code just fine.
I mean that kinda goes with what I was saying - GNM APIs are a lot less forgiving (though there were ways to hang yourself by mixing PS4/PS4 Pro calls sometimes, it's not exactly a common case).
As for optimization, I was strictly speaking about GPU execution (although even for the CPU it can take a while to get real benefits) - the last time I was hands-on, the GNM path just took way longer than DX12 to get into decent shape. Yes - by the end we were able to do things that are flat-out impossible on DX12/Vulkan or anything similar, but it was a lot of work getting there.

I never got to see this up close on the new consoles - but I imagine the same is true for raytracing now; PS versions likely have access to things that can just do more (though I doubt multiplats are using that much).
 

Alebrije

Member
Teraflops have been more of a marketing tool than a real game changer the moment you compare PS5 vs Xbox Series X.

The same way 4K, ray tracing, 8K... it's all on the console boxes.

For example, once you know how full RT works on a PC vs what those consoles offer, you clearly see it's just a marketing tool, since their RT is pretty lame and demands a lot of resources.
 
Last edited:

Skifi28

Member
A forward thinker, that Cerny guy.

I also remember the backhanded compliments and downplaying DF (especially Alex/Dictator) did at the time... only to fast-forward to the current year with them wanting more of it to optimize performance for their ray-traced Holy Grail.
Yeah, it's quite fascinating how things turned out in retrospect. From making fun of "faux k" to celebrating fake frames. I mean sure, CB was a little rough around the edges, but it was still pretty decent as a first-generation reconstruction method and is even used quite effectively today.
 

Crayon

Member
In the end, Cerny and the other developers who said the same thing were absolutely right, but back then it was only a reason for ridicule, and even outlets like DF helped with it (there were even doubts about hardware RT on PS5... look at it now in Ghostwire Tokyo, better than on Xbox xDD). This was 2020 on the forums... now it's time to backpedal, since the PS5 has shut people up.

PS5-FUD.jpg


:pie_thinking:

Where did this list come from? Is it a compilation of FUD after the fact, or...
 

M1chl

Currently Gif and Meme Champion
I mean that kinda goes with what I was saying - GNM APIs are a lot less forgiving (though there were ways to hang yourself by mixing PS4/PS4 Pro calls sometimes, it's not exactly a common case).
As for optimization, I was strictly speaking about GPU execution (although even for the CPU it can take a while to get real benefits) - the last time I was hands-on, the GNM path just took way longer than DX12 to get into decent shape. Yes - by the end we were able to do things that are flat-out impossible on DX12/Vulkan or anything similar, but it was a lot of work getting there.

I never got to see this up close on the new consoles - but I imagine the same is true for raytracing now; PS versions likely have access to things that can just do more (though I doubt multiplats are using that much).
True, I haven't said it is easier. I'm just saying that it's the only way to actually ship something - it's not like you do your stuff on PC, Ctrl+C/Ctrl+V, it compiles, done, let's ship it. Also, I'd say difficulty is a relative concept; for example, there was a bit of a language war in this thread. I can do Rust/C++ just fine, but a big C# project scares the shit out of me, despite people saying those two are "hard". I guess not for me. Nowadays I'm struggling pretty badly with Swift - I'm new to it and have written less than 50k lines, so its intricacies aren't known to me yet.

MS libraries in general are pretty messy; with PS you spend more time developing, with MS you spend more time debugging.
 

Bigfroth

Member
I heard that a snag devs complain about is the weird split memory it has, meaning a portion of the RAM runs at a different speed, which causes headaches for some.
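
For reference, the two speeds come from how the GDDR6 chips are wired rather than from two kinds of RAM. A rough back-of-envelope sketch (assuming the widely reported Series X figures: 14 Gbps GDDR6, a 320-bit bus across all ten chips for the 10 GB "GPU optimal" region, and an effective 192-bit path for the remaining 6 GB) reproduces the numbers usually quoted:

#include <cstdio>

// Peak bandwidth = (bus width in bytes) x (per-pin data rate in Gbps)
double BandwidthGBps(int busBits, double gbpsPerPin)
{
    return busBits / 8.0 * gbpsPerPin;
}

int main()
{
    std::printf("10 GB region: %.0f GB/s\n", BandwidthGBps(320, 14.0)); // ~560 GB/s
    std::printf(" 6 GB region: %.0f GB/s\n", BandwidthGBps(192, 14.0)); // ~336 GB/s
    return 0;
}

So the "slow" 6 GB is still quick by PC standards, but anything placed there gets noticeably less bandwidth than the 10 GB pool, which is the headache being described.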
 

M1chl

Currently Gif and Meme Champion
I also feel the need to expand on "split memory". It isn't really split memory, because it's the same chips, just with fewer wires to part of them, which means predictable latency; with a classic PC-like build you're never sure if the data you crunched in RAM is ready to be deployed to VRAM, so you waste cycles. This can be a curse or a blessing. The strength of a split memory pool is that you're not wasting "slots", so you can have 16x AF on a thrift-store PC with basically no hit to FPS, while even today's consoles can't really do that. AF depends heavily on feedback about the camera's position relative to the plane the texture sits on, and that's something you can't really afford in a unified pool - well, you can, but other things will suffer. It's more of a balancing act than PC architecture is. I'm running into difficulties even on my current dev/port machine, an M2 Max with 64GB of memory (so the high end of M2-equipped devices), and even there you have to deal with the scheduler and all that kind of annoying shit. I find it difficult personally. Both the CPU and GPU are really crazy fast, so it sucks that the memory is like that.

Sadly this has a wild implication: physics and all that cool stuff get dialed back, because you can't afford them once you litter your unified RAM with nice textures. GRANTED, the new SSD implementations on the new consoles (and to a degree on PC with RTX IO... inshallah it will happen soon) help with this limitation quite a bit. But not even NVMe SSDs come close to RAM speeds and latency.
 

Topher

Gold Member
VS still has that cadence of good/bad releases, doesn't it? I vaguely recall 2019 was the 'good' one, 2017 sucked, etc. - but it's been a while since I've been allowed near code. Wasn't sure if the pattern still holds.

Seems that way. 2022 has been a shitshow.
 
While I think Cerny is right to a degree that it's not all about flops, something just doesn't add up about the Xbox Series X. It's almost as if a chunk of the performance is being lost to something running in the background. It does feel over-engineered compared to the PS5, with that split memory.
 

Crayon

Member
I mean, measuring clocks, having some program like GPUZ to tell how many CUs are there, etc.

I'm getting out of my lane here, but I THINK it starts with how many floating-point ops a compute unit can do per clock, then the clock speed and number of CUs are factored in. That's why people call it a theoretical maximum.
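
That's the gist of it. A quick sketch of the usual FP32 arithmetic (assuming RDNA 2's 64 shader ALUs per CU and 2 ops per clock from fused multiply-add, with the publicly stated CU counts and clocks):

#include <cstdio>

// Theoretical peak FP32 throughput: CUs x 64 ALUs x 2 ops (FMA) x clock (GHz)
double TeraFlops(int computeUnits, double clockGHz)
{
    return computeUnits * 64 * 2 * clockGHz / 1000.0;
}

int main()
{
    std::printf("Series X: %.2f TF\n", TeraFlops(52, 1.825)); // ~12.15 TF
    std::printf("PS5:      %.2f TF\n", TeraFlops(36, 2.23));  // ~10.28 TF
    return 0;
}

It's a theoretical maximum because it assumes every ALU issues an FMA every single cycle, which real workloads never sustain.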
 
Last edited:

Crayon

Member
It seems RDNA2 performs much better at higher clocks, and MS made a mistake by choosing a conservative 1.8GHz clock speed

The PS5 also has that deterministic throttling. That clock speed was considered pretty high at the time they revealed it. The idea was to allow the thing to be boosting more of the time. Idk if it really adds up to much performance. I suspect it just let them save money on the PSU and cooler because they wouldn't need as much overhead for power or cooling. Lol, remember this was supposed to make it HORRIBLY DIFFICULT TO DEVELOP FOR.
 

SlimySnake

Flashless at the Golden Globes
Slightly unrelated picture of current development


Hogwarts, TLOU, and Star Wars ALL got PC patches this week where they reduced the high vram usage, fixed CPU performance and added better performance for midrange machines.

At the end of the day, the devs are shipping broken messes on PC and some consoles because the sales split for PC is roughly 15% and 20% on Xbox consoles. They are clearly not putting their best resources towards these platforms.
 

01011001

Banned
Hogwarts, TLOU, and Star Wars ALL got PC patches this week where they reduced the high vram usage, fixed CPU performance and added better performance for midrange machines.

At the end of the day, the devs are shipping broken messes on PC and some consoles because the sales split for PC is roughly 15% and 20% on Xbox consoles. They are clearly not putting their best resources towards these platforms.

Developers/Publishers need to be put into a situation where releases like these have more serious consequences.

I am still convinced of my proposal from another thread:

-No Day 0/1/2 patches, gold master sent in for certification = launch version

-No patches allowed until 2 months after launch

-Massive certification fees for patches bigger than 2GB

Just like Nintendo had to distance its NES from the Atari 2600 days due to the masses of low-quality software, digital storefronts should try to distance themselves from the broken "release now, fix later" releases of today.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
With the amount of negative Xbox threads on here this week, I'm starting to wonder if Sony is paying people to start new negative threads at this point.

The economy is in trouble, inflation is a bitch, so don't blame us Sony fans for making a bit of money on the side. For all the work we're doing on this forum for Sony, I'd say we're underpaid and deserve more. We Sony Ponies should set up a union and stop creating pro-PS5 threads until we get a 10% raise and substantial health benefits.
 

SlimySnake

Flashless at the Golden Globes
In the end, Cerny and the other developers who said the same thing were absolutely right, but back then it was only a reason for ridicule, and even outlets like DF helped with it (there were even doubts about hardware RT on PS5... look at it now in Ghostwire Tokyo, better than on Xbox xDD). This was 2020 on the forums... now it's time to backpedal, since the PS5 has shut people up.

PS5-FUD.jpg


:pie_thinking:
Lol I remember most of these. The FUD game was absolutely nuts this time around, with so many Xbox insiders spreading a lot of this shit.

That said, what I find interesting is that some of these leaks were right or half right. Either it's a broken clock being right twice a day, or some rumors are true while nutjob fanboys pile onto the rest. The RDNA1 debacle was exactly that. Sony IS missing some features like hardware VRS support, but it has the two things that matter the most: the perf-per-watt gain of RDNA 2.0 that allowed them to take the GPU to insanely high clocks, and the all-important hardware RT cores. But because that one thing was true, it became CERNY LYING, SONY PANIC BOOSTING CLOCKS! Someone call the cops!

So yeah, not everything is true, but not everything is false or concern trolling either. Other things that turned out to be true from this list:

- Spider-Man: Miles Morales was indeed running on PS4 and was a standalone DLC-sized expansion.
- PS5 doesn't have hardware VRS.
- Sony did run into production issues.
- The BOM did increase, leading them to release two SKUs $100 apart.
- DualSense battery life is indeed shorter than the DS4's, especially if the game uses haptics like Astro Bot does.
- Sony did end up investing $200 million just a few months after Tim Sweeney's comments.
- Phil did say he felt good about his conference. It's just that now we know he doesn't look for actual video games when coming up with his conferences, so yeah.
- Sony did moneyhat A LOT of third-party games, but obviously not because they were afraid, as Ghostwire performing better on PS5 shows.

I think fucking Dusk Golem was the worst with his BS RE8 predictions. I trusted him because he got everything else right about the game. His reasoning was even more insane... to balance out the bad PR Xbox was getting. Like gtfo.
 

DenchDeckard

Moderated wildly
Hogwarts, TLOU, and Star Wars ALL got PC patches this week where they reduced the high vram usage, fixed CPU performance and added better performance for midrange machines.

At the end of the day, the devs are shipping broken messes on PC and some consoles because the sales split for PC is roughly 15% and 20% on Xbox consoles. They are clearly not putting their best resources towards these platforms.

Yup, Xbox and PC are just an afterthought at the minute, and it needs to change.

If devs are happy to take customers' money on day one, they should make sure those versions are up to scratch.

It's the worst thing about the industry this generation imo.
 