
Next Gen Consoles Will Be Absolutely Insane.

//DEVIL//

Member
Will we finally achieve the dream?
4K 60
Mid settings textures lol


Honestly speaking, if the PS6 is at 3080 level, then color me impressed.

The 3080 as it is is around twice as powerful as the 2080, give or take a few percent.

The 4090 is twice as powerful as the 3080.

That means in theory (with the PS5 sitting around 2080 level) the 4090 is 4x more powerful than a PS5.

For that kind of power, the GPU is the size of a house. No matter how much technological advancement we reach, we will never be able to get something this powerful in a console, even 5 years from now, unless AMD manages to go to a 3nm node, which I don't see happening. I don't see anyone willing to sell a kidney to afford such an expensive node.

The future is grim lol
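
A quick back-of-the-envelope version of that chain, treating the PS5 as roughly 2080-level (that equivalence is the post's assumption, not a benchmark):

Code:
# Rough scaling chain from the post; PS5 ~ RTX 2080 is assumed, not measured.
ps5 = 1.0                  # baseline: PS5, taken as ~2080-level
gpu_3080 = 2.0 * ps5       # "3080 is around twice a 2080"
gpu_4090 = 2.0 * gpu_3080  # "4090 is twice as powerful as the 3080"
print(f"4090 vs PS5: ~{gpu_4090 / ps5:.0f}x")  # -> ~4x, the post's conclusion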
 
Last edited:
They also need to address how to mitigate the performance tanking of games with:

- High foliage: grass, tree leaves, the displacement of grass when walking through it, wind blowing against the leaves, sunlight highlighting the leaves, and raindrops hitting the leaves, grass, and flowers
- Water: waves, raindrops, proper displacement of water when walking through it, and a proper splash when jumping into water
- Hair: hair strands looking natural and organic without looking like glued patches on a skull or dried linguine/spaghetti, able to fold naturally when the character moves or the wind blows through it. Animal fur should also look natural and velvety

When you try to 'simulate' all this with real-time physics it absolutely crushes performance. Perhaps the ML cores or NPU can help with this; I am not sure.
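
To see why, here is a minimal toy sketch (not engine code, and the constants are made up): one damped spring per grass blade, stepped on the CPU. The cost grows linearly with blade count, so a dense field alone can eat a 16 ms frame budget before anything else runs.

Code:
import time

def step_blades(blades, wind=0.3, dt=1/60, stiffness=40.0, damping=4.0):
    # Each blade is [bend_angle, angular_velocity]; integrate one frame.
    for b in blades:
        accel = wind - stiffness * b[0] - damping * b[1]
        b[1] += accel * dt
        b[0] += b[1] * dt

for count in (10_000, 100_000, 1_000_000):
    blades = [[0.0, 0.0] for _ in range(count)]
    t0 = time.perf_counter()
    step_blades(blades)
    ms = (time.perf_counter() - t0) * 1000
    print(f"{count:>9} blades: {ms:7.1f} ms per step (frame budget ~16 ms)")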


Facial Expression and General Walking and Running:
Characters still have this stiff mannequin look when talking. I call it the sasori no jutsu effect: lifeless dolls with chakra strings attached to them. Imagine a game looking indistinguishable from a cinematic movie, with cinematic motion.


Instead of games looking like plastic dolls, characters would look lifelike, with human imperfections: skin pores, skin oil, sweat, scars, skin discoloration, and a full range of facial expressions utilizing facial muscles.

Better transition animations for walking and running: let's face it, most games lack natural, cinematic movement and look stiff. The Uncharted series and The Last of Us are the exceptions. Most games also lack proper transition animations from moving right to left and left to right, from running to walking, and from walking to running. Characters should also 'huff and puff' and have heavy breathing animations, but I guess that would take away time and attention from progressing through the game.
 

sankt-Antonio

:^)--?-<
Mid settings textures lol

Honestly speaking, if the PS6 is at 3080 level, then color me impressed.

The 3080 as it is is around twice as powerful as the 2080, give or take a few percent.

The 4090 is twice as powerful as the 3080.

That means in theory (with the PS5 sitting around 2080 level) the 4090 is 4x more powerful than a PS5.

For that kind of power, the GPU is the size of a house. No matter how much technological advancement we reach, we will never be able to get something this powerful in a console, even 5 years from now, unless AMD manages to go to a 3nm node, which I don't see happening. I don't see anyone willing to sell a kidney to afford such an expensive node.

The future is grim lol
To be honest, I expect (as a consumer) that the PS5 Pro next year will have at least 2x the CPU/GPU power of the PS5. If they can't make that happen, why even bother with consoles going forward? Maybe that's also a reason they are all hopping on the PC train: if a console can't be a substantial upgrade over its predecessor, how do you market it to people?
 

PeteBull

Member
At the end of the day, what makes a game fun is not TFLOPs; it is the developer.


Of course, but when the hardware is severely lacking you can't blame the devs for games looking like utter garbage; it's simply the hardware being super underpowered. To not look far: the Switch recently got a NieR: Automata port which dips to 25fps at 1080p docked (portable is even lower res, 720p iirc, which is obviously expected) even after many cuts of all kinds (lighting, geometry, textures, even particle effects, and grass went from the 3D that was present even on the old, weak-as-hell Xbox One S to 2D). The Switch in docked mode is still about 3x weaker than the OG Xbox One, which was weak even back at its 2013 launch.
 
Last edited:

Sosokrates

Report me if I continue to console war
Mid settings textures lol

Honestly speaking, if the PS6 is at 3080 level, then color me impressed.

The 3080 as it is is around twice as powerful as the 2080, give or take a few percent.

The 4090 is twice as powerful as the 3080.

That means in theory (with the PS5 sitting around 2080 level) the 4090 is 4x more powerful than a PS5.

For that kind of power, the GPU is the size of a house. No matter how much technological advancement we reach, we will never be able to get something this powerful in a console, even 5 years from now, unless AMD manages to go to a 3nm node, which I don't see happening. I don't see anyone willing to sell a kidney to afford such an expensive node.

The future is grim lol
The PS6/XSX2 GPU will be a little bit more powerful than a 4090.

They will probably use a sub-nm process.
 

tusharngf

Member
Mid settings textures lol

Honestly speaking, if the PS6 is at 3080 level, then color me impressed.

The 3080 as it is is around twice as powerful as the 2080, give or take a few percent.

The 4090 is twice as powerful as the 3080.

That means in theory (with the PS5 sitting around 2080 level) the 4090 is 4x more powerful than a PS5.

For that kind of power, the GPU is the size of a house. No matter how much technological advancement we reach, we will never be able to get something this powerful in a console, even 5 years from now, unless AMD manages to go to a 3nm node, which I don't see happening. I don't see anyone willing to sell a kidney to afford such an expensive node.

The future is grim lol
3080 level seems achievable. They need to increase their TDP limit. A $600 console could do 3080-level performance easily in the next 2-3 years.
 

Haggard

Banned
Mid settings textures lol

Honestly speaking, if the PS6 is at 3080 level, then color me impressed.

The 3080 as it is is around twice as powerful as the 2080, give or take a few percent.

The 4090 is twice as powerful as the 3080.

That means in theory (with the PS5 sitting around 2080 level) the 4090 is 4x more powerful than a PS5.

For that kind of power, the GPU is the size of a house. No matter how much technological advancement we reach, we will never be able to get something this powerful in a console, even 5 years from now, unless AMD manages to go to a 3nm node, which I don't see happening. I don't see anyone willing to sell a kidney to afford such an expensive node.

The future is grim lol
You should pay some more attention to the roadmaps of the foundries alone.
In 2025 TSMC is planning to roll out its N2 process, which on its own is supposed to cut power draw by over 50% compared to N4. Something at the power level of a current 4090 will be ~200W by 2025-2026 and will easily fit into a console form factor. Then we still have another ~2 years until the next console's hardware has to be finalized, and we haven't even talked about the GPU manufacturers' design improvements yet.
4090+ power for the next-gen consoles isn't an issue from an architectural standpoint. The question is how hardware prices will develop from here on.
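
Napkin math behind that ~200W figure, assuming a ~450W board power for a 4090-class part today (the ">50%" cut is the roadmap claim above, not a measurement):

Code:
# Project board power for 4090-class performance on a node that cuts
# power draw by half at the same performance (per the N2 claim above).
board_power_today_w = 450          # assumed 4090-class board power
node_power_cut = 0.50              # ">50%" power reduction, N2 vs N4
projected_w = board_power_today_w * (1 - node_power_cut)
print(f"~{projected_w:.0f} W")     # ~225 W, i.e. the ~200 W ballpark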

They will probably use a sub-nm process.
Very doubtful, considering what we're actually fabricating at right now and how problematic the physics becomes past real 2nm node sizes with our classic technologies.
But the nm values we currently see on foundry roadmaps are basically just marketing terms anyway, and a something-something-picometer node might be a good product name. There is lots of room for improvement before we actually hit the physical limits of our classic semiconductor tech.
 
Last edited:

Sosokrates

Report me if I continue to console war
Uh, not gonna happen. Read up on the physics behind this if you're interested. But the nm values we currently see on foundry roadmaps are basically just marketing terms anyway. There is lots of room for improvement before we actually hit the physical limits of our classic semiconductor tech.
That's what I thought: I figured when you go sub-nm you get quantum tunneling, but the folks that make these chips are already laying out roadmaps and seem confident they will get there.




https://www.tomshardware.com/news/i...ntil-2036-from-nanometers-to-the-angstrom-era


https://www.google.com/url?sa=t&sou...breakthrough&usg=AOvVaw0yC-Tta96nGWgtJpErDNsn


https://www.google.com/url?sa=t&sou...0033768.html&usg=AOvVaw2SIEu7aiFiEYDleaSr3Q4r
 

Haggard

Banned
Even according to those sources, actual sub-nm processes require a complete change in architecture and are expected sometime in the 2030s. Until then we'll get below 1nm in name only.
However, I'll happily eat crow if someone comes around the corner with new materials/solutions to this.
 
Last edited:

Sosokrates

Report me if I continue to console war
Haggard

But sub-nm is probably too late for next gen.
N2 might be sufficient for next gen like you say, maybe some enhanced N2.



https://www.google.com/amp/s/www.techpowerup.com/295925/tsmc-announces-the-n3-finflex-n3e-and-n2-nodes-and-3dfabric?amp
N2 Technology - TSMC's N2 technology represents another remarkable advancement over N3, with 10-15% speed improvement at the same power, or 25-30% power reduction at the same speed, ushering in a new era of Efficient Performance. N2 will feature nanosheet transistor architecture to deliver a full-node improvement in performance and power efficiency to enable next-generation product innovations from TSMC customers. The N2 technology platform includes a high-performance variant in addition to the mobile compute baseline version, as well as comprehensive chiplet integration solutions. N2 is scheduled to begin production in 2025.

I'm not sure N2 will be good enough to do 4x PS5/XSX at about 200W; maybe 3x. We don't know what the logic area reduction will be from N3 to N2.

I think even if that is the case they will get 4x raster performance out of next gen; they will probably use an enhanced N2 and maybe a bit more power.

I would not be surprised if next gen comes in 2030. We are nearly 3 years into this gen and it feels like it's barely started; the PS5 and XSX basically feel like a PS4 Pro 2 and an X1X 2.
 
Last edited:

//DEVIL//

Member
You should pay some more attention to the roadmaps of the foundries alone.
In 2025 TSMC is planning to roll out its N2 process, which on its own is supposed to cut power draw by over 50% compared to N4. Something at the power level of a current 4090 will be ~200W by 2025-2026 and will easily fit into a console form factor. Then we still have another ~2 years until the next console's hardware has to be finalized, and we haven't even talked about the GPU manufacturers' design improvements yet.
4090+ power for the next-gen consoles isn't an issue from an architectural standpoint. The question is how hardware prices will develop from here on.


Very doubtful, considering what we're actually fabricating at right now and how problematic the physics becomes past real 2nm node sizes with our classic technologies.
But the nm values we currently see on foundry roadmaps are basically just marketing terms anyway, and a something-something-picometer node might be a good product name. There is lots of room for improvement before we actually hit the physical limits of our classic semiconductor tech.
State-of-the-art CPU tech and new consoles don't go in the same sentence.
We'll see anyway.
 
Last edited:

Haggard

Banned
State-of-the-art CPU tech and new consoles don't go in the same sentence.
I described the situation in 2026 according to the current roadmaps.
That will already be old stuff by the time the next consoles get their specs finalized.
 
Last edited:

//DEVIL//

Member
3080 level seems achievable. They need to increase their TDP limit. A $600 console could do 3080-level performance easily in the next 2-3 years.
Assuming they do in 2 to 3 years, they are going to match a GPU that is 5 years old by then (the 3080 has been out for 2 years; add the 2 to 3 years you mentioned and that's 5 years).

By the time this APU exists, we will be hitting the new 5000 series GPUs (which are actually on a lower node too).

Imagine comparing a 3080 to a 5000 series GPU... when the 4090 is already twice as powerful as a 3080.


That means that even 3 years from now, that imaginary APU in the next-gen console or PS5 Pro or whatever still won't be able to do Unreal Engine 5 at 4K 60, since the 3080 can't.

Honestly, the state of consoles in terms of power is a joke, and so far behind in terms of hardware it's not funny.

But then again, it's $500, so no complaints. And the NVMe read speed is super fast and awesome.
 

tusharngf

Member
Assuming they do in 2 to 3 years, they are going to match a GPU that is 5 years old by then (the 3080 has been out for 2 years; add the 2 to 3 years you mentioned and that's 5 years).

By the time this APU exists, we will be hitting the new 5000 series GPUs (which are actually on a lower node too).

Imagine comparing a 3080 to a 5000 series GPU... when the 4090 is already twice as powerful as a 3080.


That means that even 3 years from now, that imaginary APU in the next-gen console or PS5 Pro or whatever still won't be able to do Unreal Engine 5 at 4K 60, since the 3080 can't.

Honestly, the state of consoles in terms of power is a joke, and so far behind in terms of hardware it's not funny.

But then again, it's $500, so no complaints. And the NVMe read speed is super fast and awesome.
I remember the 1080 Ti being the top dog in 2016-2018. I think current-gen consoles are closer to that level. Who knows, we might even get a 3070-level Pro console in the next 2 years. RDNA 3's top GPU will reach the 90-100 TF range, so we can at least expect a Pro console at the 20-25 TF level.
 
Mid settings textures lol


Honestly speaking, if the PS6 is at 3080 level, then color me impressed.

The 3080 as it is is around twice as powerful as the 2080, give or take a few percent.

Not really; in gaming you're rarely running FP32 100% of the time. In real-world gaming scenarios the 3080 is about 40-50% more capable than a 2080.

The 4090 is twice as powerful as the 3080.

It might be on paper, but again, in gaming scenarios it won't actually be that in practice. It'll be lower.

That means in theory (with the PS5 sitting around 2080 level) the 4090 is 4x more powerful than a PS5.

Well, I've seen some reports putting the 4090 around 82 TF. Assuming that in real-world gaming scenarios you get about 60% of that, it's effectively 49.2 TF, which is about 5x PS5. But not all parts of the graphics pipeline are going to be driven by mesh-shading routines; fixed function will still matter a lot depending on the engine and the design of the game in question, and the difference between the PS5 and a 4090 in that respect won't be anywhere near the 500% TF surplus the 4090 has over the PS5.
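
Spelled out, that arithmetic looks like this (the 60% real-world utilization factor is the reply's assumption, not a measurement):

Code:
peak_tf_4090 = 82.0            # reported FP32 peak from above
utilization = 0.60             # assumed fraction achieved in real games
ps5_tf = 10.28                 # PS5 peak FP32
effective_tf = peak_tf_4090 * utilization
print(f"{effective_tf:.1f} TF effective, ~{effective_tf / ps5_tf:.1f}x PS5")
# -> 49.2 TF, about 5x PS5 on paper; fixed-function work narrows that gap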

For that kind of power, the GPU is the size of a house. No matter how much technological advancement we reach, we will never be able to get something this powerful in a console, even 5 years from now, unless AMD manages to go to a 3nm node, which I don't see happening. I don't see anyone willing to sell a kidney to afford such an expensive node.

The future is grim lol

Realistically, 10th-gen consoles will probably aim for no more than 50 TF, and honestly I don't think you need much more than that for gaming. AAA dev budgets, time, and manpower are the bigger bottlenecks at this point. I can't see them having more than 10,240 shader cores, and that's probably pushing it even at, say, 2nm (not all of the logic scales linearly, and really the node names are more for marketing than anything; the memory controllers, for example, will definitely not be at 2nm on the chip).

Honestly, I would be severely disappointed if the only things 10th-gen brings are more powerful consoles and higher resolution. I'm hoping there's a bigger push for VR and AR to go mainstream, and that means some cheap headset as standard. I think it'll finally be time when we reach 10th-gen, which probably isn't going to launch until 2028 anyway.

Haggard

But sub-nm is probably too late for next gen.
N2 might be sufficient for next gen like you say, maybe some enhanced N2.




https://www.google.com/amp/s/www.techpowerup.com/295925/tsmc-announces-the-n3-finflex-n3e-and-n2-nodes-and-3dfabric?amp


I'm not sure N2 will be good enough to do 4x PS5/XSX at about 200W; maybe 3x. We don't know what the logic area reduction will be from N3 to N2.

I think even if that is the case they will get 4x raster performance out of next gen; they will probably use an enhanced N2 and maybe a bit more power.

I would not be surprised if next gen comes in 2030. We are nearly 3 years into this gen and it feels like it's barely started; the PS5 and XSX basically feel like a PS4 Pro 2 and an X1X 2.

You all have to remember, node shrinks aren't the only thing that will save on power consumption. Strictly counting on them is like strictly counting on Moore's Law in the modern day; it won't get you very far.

Chipletization, better and smarter data-processing sub-systems, packaging configurations for chiplets, interconnects (i.e. UCIe), possibly some form of PIM (Processing-In-Memory) technology, deeper PNM (Processing-Near-Memory) designs (think PS5's SSD I/O sub-system, but more intricate), etc. are going to combine in a console design, alongside the shift to smaller nodes, to bring better performance at manageable power consumption.

There will always be tradeoffs; don't expect GPUs obscenely larger than 10,240 shader cores, for example, because the smaller nodes cost more per mm² than the bigger ones. But it'll all somehow work out.
 
Last edited:

sachos

Member
How could path tracing affect gameplay design? It's really just used for lighting.
I was thinking about having more dynamic objects in the scene without fear of them looking out of place against baked lighting; if everything is path traced, then everything can move in real time and still look good. Maybe you could design gameplay around realistic shadows or reflections.
 

BreakOut

Member
Mid settings textures lol

Honestly speaking, if the PS6 is at 3080 level, then color me impressed.

The 3080 as it is is around twice as powerful as the 2080, give or take a few percent.

The 4090 is twice as powerful as the 3080.

That means in theory (with the PS5 sitting around 2080 level) the 4090 is 4x more powerful than a PS5.

For that kind of power, the GPU is the size of a house. No matter how much technological advancement we reach, we will never be able to get something this powerful in a console, even 5 years from now, unless AMD manages to go to a 3nm node, which I don't see happening. I don't see anyone willing to sell a kidney to afford such an expensive node.

The future is grim lol
I have a feeling that as 8K becomes more of a thing and RT becomes desired, we are gonna see 4K 60 without RT. I'm one of the people that don't really care about RT; it's nice when implemented, but never worth the frame rate to me.
 

PeteBull

Member
I have a feeling that as 8K becomes more of a thing and RT becomes desired, we are gonna see 4K 60 without RT. I'm one of the people that don't really care about RT; it's nice when implemented, but never worth the frame rate to me.
Imagine: in 6-7 years RT won't be like it is now, a massive hit to fps/resolution, but a minimal one. Back in the day anti-aliasing was super costly too, and even anisotropic filtering; nowadays those are close to free (worst-case scenario a few % lower performance, usually not even a 2% fps loss).
 
Last edited:

Kurt

Member
To be fair, the PS5 and Xbox Series have barely scratched the surface of what they can do. We have only had a handful of current-gen games, none of which are really doing much new from a visual or design standpoint.
Could be, but we are already 2 years from launch... so I have some big concerns that it won't be.
 

ACESHIGH

Banned
The 3070 has the advantage of being really fast in ray tracing (around 30-60% faster depending on the title: if a game has complex ray tracing it is going to be near 50-60%; if a game has more simplistic ray tracing, it will be around 30%, which is almost as much as the 3070's lead over the PS5 in rasterization).

And we know that eventually the PS5/XSX will get special raster hacks/optimizations that will get more performance out of their hardware, but that's another topic (see: the GTX 970 being 2-2.5x faster than the PS4 in 2013-2016 games, but once next gen started it was eventually only 1.5-1.7x faster than the PS4; this has to do with both console-side optimizations and a lack of GPU-side optimizations for the older arch).

Ray tracing... is cool, when you can use it. Up until now, there have been no troubles. DX12 demands that 10-15% of the available VRAM budget be left to background applications; that gives us about 6.5 GB of usable VRAM budget. It was a fine budget for adding ray tracing to PS4-oriented games, but next-gen textures and games will demand an undisputed 10 GB memory buffer in the future.
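
That budget arithmetic, spelled out (the 10-15% reserve is this post's figure; the exact reserve varies per game and driver, and the post rounds down to ~6.5 GB):

Code:
total_vram_gb = 8.0
for reserve in (0.10, 0.15):   # slice left for OS/background apps
    usable = total_vram_gb * (1 - reserve)
    print(f"{reserve:.0%} reserved -> {usable:.1f} GB usable")
# 6.8-7.2 GB on paper; real titles often land nearer the ~6.5 GB cited above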

In those cases, you will run into situations where the 3070 is unable to run ray tracing alongside next-gen textures. Sometimes even sacrificing a lot on textures won't yield proper results, so that advantage will go to waste at some point. And if you can't utilize ray tracing in such cases, the 3070 just becomes a GPU that is barely 25-35% faster than the consoles, and even that advantage can diminish quickly if Ampere is left behind by NV/devs or the consoles suddenly get more performance out of their hardware (it has happened before, and it will happen again).

So yeah, sorry, the 3070 is not a good example. It actually made PC gaming look really bad when it ran into all kinds of VRAM-related troubles in Spider-Man Remastered. You had to sacrifice and compromise on textures to run ray tracing at 4K/upscaled (which is what the PS5 is capable of, thanks to having enough memory buffer to fit both the ray tracing calculations and high-quality assets/4K LODs together). I myself have a 3070. This GPU deserved a 10-12 GB VRAM buffer; 8 GB is too tiny for what it is capable of.

For the case of complex ray tracing: an 8 GB buffer is already not enough for Cyberpunk, even at 1440p with high textures, in certain locations of the map. This is a freaking last-gen based game with last-gen textures. How do you think the 3070 will be able to upgrade textures and keep complex ray tracing at the same time?

If such is the case, what is the point of having that big ray tracing performance advantage at all? It was only relevant in last-gen based ray tracing games. I had the pleasure of playing and experiencing them, but the card will shit the bed once actual next-gen games start to hit.

The PS5 is meant for the future. If we're going to compare the 3070 and the PS5, then we have to see what the future holds for both pieces of hardware. I just think the 3070 won't be able to capitalize on its ray tracing advantage in the future.

This is GTX 770 vs PS4 all over again. Go see what people thought of the 2 GB 770 versus the PS4. Whenever you said something about its tiny VRAM buffer, people were all like "more VRAM won't make the PS4 faster lolol". In the end, the decrepit GPU cannot even match the PS4 in true next-gen (for last-gen) games like God of War, RDR2, and many more. Now ask the same people whether they would prefer a 770 or a PS4 and they would ridicule the 770 for how decrepit a GPU it is. It is what it is. That GPU too had enormous power over the PS4; thanks to its tiny VRAM buffer, it couldn't even shine. The 4 GB 770 fares better, but it was a rare GPU. And Nvidia did not make the same mistake this time (no 16 GB 3070 in sight); they're literally skimping on VRAM on a supposed "4080"-tier card by giving it 12 GB (in reality it is a 4070).

I'm not going to say the 3070 will fall into the same situation (that was 2 GB VRAM against a 4 GB budget, versus 6.5 GB VRAM against a 10 GB budget now); the 3070 won't be decrepit like the 770, but it won't be able to enjoy its premiums aside from DLSS.

Yep. Built a PC with a GTX 760 in December 2013 to match the next-gen consoles. After all, the consoles had a modified 7790 and 7850, so I bought a slightly more powerful GPU to account for console optimization.

It was great through 2017, but then it started to show its age, first due to its low VRAM buffer (it already couldn't match console textures in 2015's Batman: Arkham Knight), and later the Kepler arch started aging like milk in some games; Doom 2016 was a massive offender.

I should have gone with an R9 280 instead. That GPU still matches console performance to this day. I can't complain, because I never had a 7th-gen console and the GTX 760 absolutely crushed 7th-gen PC ports, so I spent a lot of time playing those before I got to 8th-gen games.

I sincerely hope I am wrong, but I think the RTX 20 GPUs are going to experience something similar to Kepler back then, with the caveat that DLSS 2.0 will be there to help in some games.

That's why I am still playing the waiting game before building a new PC. My RX 580 should be enough for 1080p Series S-style settings at least next year.

All those massive benchmark numbers based on 8th-generation games don't mean much to me. I need at least 10 to 20 AAA next-gen-only games running on PC with DirectStorage to understand what kind of PC I should be building. Cross-gen is taking too long. 2023 should be the first proper next-gen year.
 

yamaci17

Member
Yep. Built a PC with a GTX 760 in December 2013 to match the next-gen consoles. After all, the consoles had a modified 7790 and 7850, so I bought a slightly more powerful GPU to account for console optimization.

It was great through 2017, but then it started to show its age, first due to its low VRAM buffer (it already couldn't match console textures in 2015's Batman: Arkham Knight), and later the Kepler arch started aging like milk in some games; Doom 2016 was a massive offender.

I should have gone with an R9 280 instead. That GPU still matches console performance to this day. I can't complain, because I never had a 7th-gen console and the GTX 760 absolutely crushed 7th-gen PC ports, so I spent a lot of time playing those before I got to 8th-gen games.

I sincerely hope I am wrong, but I think the RTX 20 GPUs are going to experience something similar to Kepler back then, with the caveat that DLSS 2.0 will be there to help in some games.

That's why I am still playing the waiting game before building a new PC. My RX 580 should be enough for 1080p Series S-style settings at least next year.

All those massive benchmark numbers based on 8th-generation games don't mean much to me. I need at least 10 to 20 AAA next-gen-only games running on PC with DirectStorage to understand what kind of PC I should be building. Cross-gen is taking too long. 2023 should be the first proper next-gen year.

Not matching console textures has already started for 8 GB cards. However, this time the lower settings look more graceful than they did back then: back then you either ran PS4 textures or you ran shit textures, whereas nowadays med/high textures seem okay. The Series S will play a part in this too (can't have super shit-looking textures on a next-gen console, after all).

Also, 8 GB (6.4-7.5 GB usable depending on the engine) against 10 GB (PS5), versus 2 GB against 4 GB (PS4), puts the odds a bit more in Ampere/Turing's favour. It shouldn't be as dramatic as what happened with Kepler, but it will still happen to some degree.

In terms of uarch, it's a complete mystery; no one can know how Ampere and Turing will age. Kepler really seemed hugely gimped by design, though, considering it does not even support the DirectX 11.1 feature set. At least Turing and Ampere have the DirectX 12.2 feature set, so you can say their potential is not even being used yet, whereas there was no potential with Kepler to begin with.

I suspect Ampere and Turing will age similarly to Pascal: capable, pushing games for a long time. On textures, sacrifices will have to be made; a 6.5-7 GB budget won't be able to match the 10 GB the consoles are going to allocate. The only remaining question is how the in-between textures will look. If they look decent at 1440p, these cards should get by for a long time.
 

Raven77

Member
By 2026, at this rate, developers will still be trying to crush games into the One X architecture and wondering why games look so much better on PC.

It's criminal that there are titles slated for a 2024 release which will support the VCR Xbox One.

This right here. Future power is pointless when they WILL keep making games for 10-year-old systems.

I have purchased at least one new system at launch every generation for 25 years. This current generation has ruined it for me. I won't be buying the PS6 or the Xbox 1 X Series 2 S One or whatever completely moronic naming convention they come up with for the next one to utterly confuse every single person on the planet.

I'll buy both of them 3 years after launch, and all the worthwhile games of the previous 3 years at a huge discount.
 

Tqaulity

Member
Honestly speaking, if the PS6 is at 3080 level, then color me impressed.
3080 level seems achievable. They need to increase their TDP limit. A $600 console could do 3080-level performance easily in the next 2-3 years.
Some low expectations around here, and not a great sense of how computer hardware evolves.

You guys do realize that AMD's current top-end RDNA 2 GPU (6900 XT) is ~23 TFLOPs and roughly matches an RTX 3090! (News flash: AMD and Nvidia TFLOP ratings are not 1:1.) So 2x the current console GPUs will be around that mark (i.e. 20-25 TFLOPs). That makes sense, because the real-world perf delta from PS5 to 3090 is only around 2x (despite what the specs and TFLOPs say). This also lines up with the rumored mid-gen refresh perf target of +2x raster. So 6900 XT/3090 is the minimum for the "Pro" consoles in ~2024. The notion that an actual PS6 (years after that) would only be at 3080 level is one of the most ridiculous things I think I've ever seen on this forum.
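
A rough sketch of the cross-vendor normalization implied there, using only the two equivalences in the paragraph above (paper TFLOPs; real games vary):

Code:
rdna2_tf = 23.0    # 6900 XT peak FP32
ampere_tf = 36.0   # RTX 3090 peak FP32 (Ampere's dual-issue FP32 inflates this)
scale = ampere_tf / rdna2_tf
print(f"~{scale:.2f} Ampere TF per RDNA 2 TF at similar real-world perf")
# So ~2x the PS5 GPU (20-25 RDNA TF) lands around 3090-class, as argued above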

Furthermore, RDNA 3's mid-tier 7600/7700 card is rumored to match the 6900 XT within a few months. The top-end RDNA 3 cards will roughly triple the perf of the current 6900 XT flagship, which is a huge jump, similar to what we're seeing on the Nvidia side. Now, the 7600/7700 won't quite have the form factor and thermals to fit in a console in 2022... but by 2024? More than doable.

PS6 perf will target something higher than an RTX 4090 today. You can count on that!
 

Xmengrey

Member
All that power and nothing to take advantage of it because of the cross-gen/Series S obsession.

"Gaming for everyone" at the expense of progress.
To be honest, we are hitting a wall graphically: you can't push fidelity too high lest you cripple framerates, and you can't exclude too many people or else you lose money. Even graphical progress has a price.

Making AAA games is only getting more expensive, so companies are trying to offset the costs.
This is why Microsoft and Sony are offloading all their exclusives onto PC, whereas previously (since at least the 7th gen for Microsoft, and always for PlayStation) they locked their games behind their consoles.

It's why Sony and MS are becoming more service-oriented. I honestly suspect that even if we get next-gen consoles (a PS6 and a next-gen Xbox), they will at best be mid-range PC parts of the time that can be put in an APU, with dev tools that make development faster and easier (and thus cheaper), and a focus on scalability.
They will rely less on console hardware to make money, with the hardware only there for those who want it: if you want the highest graphics, get a PC; if you want high graphics on a console, you've got it; if you just want to game and don't care about graphics, you've got the lower-end options, or the cloud if you don't want hardware at all.


Not so sure both consoles will have an S variant, but I suspect they will, for more options; the S variant will be at least as strong as the PS5 in raw power, but with 2026-2027 tech.
 

Xmengrey

Member
The PS6/XSX2 GPU will be a little bit more powerful than a 4090.

They will probably use a sub-nm process.
They will probably only put in what keeps these consoles under $600, and the hardware will only be what the most popular games, like a next-gen COD, demand, for easier and cheaper development.
They will probably try to have as many 9th-gen/10th-gen cross-gen titles as possible, I suspect throughout the entire generation, as games get more expensive to produce.
Next-gen tools will focus on scalability, up and down to at least the Xbox Series S.
Next-gen consoles will be like new PCs, iterations on the old, with a focus on higher framerates and more resolution support.

So next gen will be up to 8K/60 gaming,
current gen up to 4K.
The console budget will probably mostly go to CPU and GPU upgrades; it will probably stay GDDR6, with maybe an SSD upgrade.
 
Last edited:

Tqaulity

Member
With the recent announcement of the RTX 4090 cards, I am absolutely blown away at how fast and how big the leaps in technology are when it comes to GPUs and tech in general. I know there are two threads on the 4090 already, but I wanted to make one that specifically covers next-gen systems, and by next gen I am talking about the PS6 and the next Xbox.

IMPORTANT THINGS TO NOTE - I know TFLOPs is not a perfect metric for explaining how powerful a leap in GPU power is, but it's the metric that's easiest to digest and gives you some sort of ballpark. There are many other factors that should also be taken into account when you are talking about overall GPU performance increases.

Also, there are other factors in how powerful next-gen hardware will be besides the GPU. The CPU and RAM/memory/SSD will be other major factors, but this is strictly a GPU thread, so we are only talking about GPU increases here.


First and foremost, let's lay everything out on the table here to see what we are looking at from the last two generations of consoles.

The PS4 and Xbox One launched in 2013.

The PS4 was sitting at a measly 1.84 TFLOPs.
The Xbox One was sitting at an even lower 1.31 TFLOPs.

The PS4 Pro later launched in 2016 and was sitting at 4.2 TFLOPs.
The Xbox One X launched a year later in 2017 with 6.2 TFLOPs.

The PS5 and Xbox Series X launched in 2020.

The PS5 is sitting at 10.28 TFLOPs (10.3 if we round up, so let's just stick with 10.3).
The Xbox Series X is at 12.3 TFLOPs (2 TFLOPs higher than the PS5), a bit higher than the RTX 2080 Super, which sits at 11.1 TFLOPs. A 2080 Ti, however, is 13.5 TFLOPs, which makes it superior to both the PS5 and the Xbox Series X when talking about the GPU specifically.

Now let's look at the 3000 series and the 4000 series, since we have the official numbers now.

The 3090 Ti is sitting at 40 TFLOPs, which is about 4x more than the PS5 and Xbox Series X, and launched earlier this year in March 2022.

The 3090 is sitting at 36 TFLOPs, almost a 4x increase over the PS5 and Xbox Series X, and launched on September 17th, 2020, two months before the PS5 and Xbox Series X. The consoles were already locked in to utilize the power of GPUs equivalent to the 2000 series, so one generation behind.

The 3080 is sitting at 34.1 TFLOPs, which again is a massive increase compared to the PS5 and Xbox Series X.

Another difference with these GPUs is the amount and type of RAM, but let's be honest, RAM is no longer a big problem for console systems. The last time we had major RAM/memory issues was the PS3 and 360 era; we are well past that, and I highly doubt we will ever have that problem again.

The PS4 and Xbox One's biggest issue was the CPU: a super outdated, crappy Jaguar x86.

With how weak and pathetic the PS4 and Xbox One look these days, the wizards over at Naughty Dog and Sony Santa Monica still have the best-looking games to date. The Last of Us Part II and God of War Ragnarök are very good examples of that.



We have seen absolutely nothing yet of what the PS5 and Series X can truly do, until we get those true current-gen goodies.

Anyways, back to the GPUs.

We now have the specs for the 4090, and it's releasing next month, on October 12th:

The 4090 has 84.3 TFLOPs, more than double the 3090 Ti's 40.3 TFLOPs.

Fast forward to 2024 and we will be getting the 5000 series, which will probably once again make the 4090 look like a little baby; the trend seems to be doubling the performance of the previous generation of GPUs. According to Nvidia, the 4090 is 2-4 times more powerful than the 3090 Ti depending on the application/game/software.

I am expecting the 5000 series to finally break the 100 TFLOPs barrier; I mean, that's guaranteed at this point. Don't forget there will also be a 4090 Ti, and honestly that will probably be very close to 100 TFLOPs already.

IMPORTANT THINGS TO NOTE - In 2026 it will be the 6000 series. When are the new consoles coming? I have no clue, but I am expecting them around 2026-2028, with 2028 at the latest and 2026 at the earliest. If the new consoles come out in 2026, they will most likely be utilizing, and be equal to, the power of the 5000 series in terms of GPUs specifically; if they release in 2028, then the 6000 series. But then again, there might be some crazy curveball; I am just going off how the last 10 years went. The PS6 already entered R&D/concepting in 2021. The PS5 started development in 2015, about 1.5 years after the launch of the PS4.
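
For reference, the post's own numbers in one place, each expressed as a multiple of the PS5 (quoted figures only, no new data):

Code:
parts = [("PS4", 2013, 1.84), ("Xbox One", 2013, 1.31),
         ("PS4 Pro", 2016, 4.2), ("Xbox One X", 2017, 6.2),
         ("PS5", 2020, 10.28), ("Series X", 2020, 12.3),
         ("RTX 3080", 2020, 34.1), ("RTX 3090", 2020, 36.0),
         ("3090 Ti", 2022, 40.0), ("RTX 4090", 2022, 84.3)]
for name, year, tf in parts:
    print(f"{name:10} ({year}): {tf:5.2f} TF  = {tf / 10.28:4.1f}x PS5")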

What are you all expecting for the power of the GPUs in next-gen systems?
Agree with your overall assessment, but it's not fruitful to compare Nvidia TFLOPs to the AMD-based consoles. The more interesting thing here is that, looking at the rumors and leaks for RDNA 3, we may see a >3x jump in overall throughput from the 6950 XT to the 7900 XT! The 7900 XT is looking like it could be >70 TFLOPs, which can be compared directly to the AMD-based consoles. So we'll be jumping from ~10 TFLOPs in consoles to ~70 TFLOPs in the top-of-the-line card in 2023. That's a 7x perf gap from top-end PC to console, which is something we almost never see until maybe the very end of a console generation. If that trend were to continue, then by the next PC GPU generation (i.e. RDNA 4, Ada Next) we could easily be seeing well over 200 TFLOPs. So by the time we get to a PS6/Xbox Series Next, we'll be looking at GPU performance based on at least that generation, and we could definitely see consoles with greater than 100 TFLOPs of GPU performance. That's a 10x increase gen-on-gen coming from a PS5, for example. To your point, OP, that would be insane, but it's entirely possible in the next 5 years or so.

Here is a pretty good breakdown of how the leap in PC graphics performance we're seeing with Ada and RDNA 3 will positively impact the consoles, starting with the mid-gen refreshes. We could expect to see RTX 3090-level perf at a minimum, based on current AMD trends and roadmaps. Pretty cool!

PS5 and Xbox Series X Refreshes Are Coming...BE AFRAID! BE VERY AFRAID!!
 

PeteBull

Member
Agree with your overall assessment, but it's not fruitful to compare Nvidia TFLOPs to the AMD-based consoles. The more interesting thing here is that, looking at the rumors and leaks for RDNA 3, we may see a >3x jump in overall throughput from the 6950 XT to the 7900 XT! The 7900 XT is looking like it could be >70 TFLOPs, which can be compared directly to the AMD-based consoles. So we'll be jumping from ~10 TFLOPs in consoles to ~70 TFLOPs in the top-of-the-line card in 2023. That's a 7x perf gap from top-end PC to console, which is something we almost never see until maybe the very end of a console generation. If that trend were to continue, then by the next PC GPU generation (i.e. RDNA 4, Ada Next) we could easily be seeing well over 200 TFLOPs. So by the time we get to a PS6/Xbox Series Next, we'll be looking at GPU performance based on at least that generation, and we could definitely see consoles with greater than 100 TFLOPs of GPU performance. That's a 10x increase gen-on-gen coming from a PS5, for example. To your point, OP, that would be insane, but it's entirely possible in the next 5 years or so.

Here is a pretty good breakdown of how the leap in PC graphics performance we're seeing with Ada and RDNA 3 will positively impact the consoles, starting with the mid-gen refreshes. We could expect to see RTX 3090-level perf at a minimum, based on current AMD trends and roadmaps. Pretty cool!

PS5 and Xbox Series X Refreshes Are Coming...BE AFRAID! BE VERY AFRAID!!
Just glad I didn't bite on your standard PS5, aka the PS5 pleb version ;D
 
OP is making the wrong assumption that consoles won't still have cut-down CPUs/GPUs; we will be on the Xbox 6 by the time they reach 40 TFLOPs.

Also, neither TLOU2 nor GOW looks that impressive for the "diminishing returns" argument. Edited stills can look good, but still not good enough to avoid the uncanny look, and in gameplay it's not even close. We still have not reached Monsters, Inc. levels of game graphics in 2022; we are nowhere in the vicinity of diminishing returns.
 

ChiefDada

Gold Member
People are still focused on TFLOPs like it's 2003, when in reality hardware/software tech that prioritizes memory/bandwidth efficiency should be the primary focus. I would often squabble with VFX Veteran, but he was absolutely right about RT-based lighting being the final hurdle for real-time graphics; I think most would agree we've reached satisfactory levels for materials and geometry. When you look at the research on why real-time lighting is lagging, you will see that memory is the issue, not compute. The TFLOP/memory-bandwidth increases have been so lopsided that they render the TFLOP increase useless and/or inefficient for memory-latency-sensitive applications such as RT. That is why new memory-management paradigms such as the PS5's I/O integration will pay off once cross-gen is officially behind us.
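
One way to see that lopsidedness: paper compute versus memory bandwidth per TFLOP (bandwidth figures are public spec-sheet numbers; treat the ratio as a rough indicator only):

Code:
parts = [("PS5", 10.28, 448), ("RTX 3080", 34.1, 760), ("RTX 4090", 84.3, 1008)]
for name, tf, gbps in parts:   # tf = peak FP32 TFLOPs, gbps = memory bandwidth GB/s
    print(f"{name:9}: {gbps / tf:5.1f} GB/s per TFLOP")
# Compute grew ~8x over PS5 while bandwidth grew ~2.2x, so bandwidth- and
# latency-bound work like RT sees far less of the paper TFLOP gain.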

Honestly speaking, if the PS6 is at 3080 level, then color me impressed.

The PS5 is already competing with the 3070 in RT workloads... the PS6 GPU will blow the 3080 out of the water. In fact, mid-range 30-series and 6000-series cards are proving to be not-so-great long-term investments for this gen.
 

//DEVIL//

Member
People are still focused on TFLOPs like it's 2003, when in reality hardware/software tech that prioritizes memory/bandwidth efficiency should be the primary focus. I would often squabble with VFX Veteran, but he was absolutely right about RT-based lighting being the final hurdle for real-time graphics; I think most would agree we've reached satisfactory levels for materials and geometry. When you look at the research on why real-time lighting is lagging, you will see that memory is the issue, not compute. The TFLOP/memory-bandwidth increases have been so lopsided that they render the TFLOP increase useless and/or inefficient for memory-latency-sensitive applications such as RT. That is why new memory-management paradigms such as the PS5's I/O integration will pay off once cross-gen is officially behind us.



The PS5 is already competing with the 3070 in RT workloads... the PS6 GPU will blow the 3080 out of the water. In fact, mid-range 30-series and 6000-series cards are proving to be not-so-great long-term investments for this gen.
People like to dream. Lol at the PS5 being compared to a 3070 when it's at 2070 level.
 
With a tremendous backlog of consoles, handhelds, and games, I don't think I'll bother buying any more systems other than a Super Switch or Switch 2.
 

ChiefDada

Gold Member
People like to dream. Lol at the PS5 being compared to a 3070 when it's at 2070 level.

In memory-intensive workloads it can outclass the 3070. We've seen this in Spider-Man Remastered (RT + Ultra textures), we'll see it again in Miles Morales, and it will be even more pronounced in future current-gen titles where memory swapping will be much more volatile... it is what it is.
 

//DEVIL//

Member
In memory-intensive workloads it can outclass the 3070. We've seen this in Spider-Man Remastered (RT + Ultra textures), we'll see it again in Miles Morales, and it will be even more pronounced in future current-gen titles where memory swapping will be much more volatile... it is what it is.
A port being optimized on PlayStation by a Sony first-party studio, and not as well optimized on PC, doesn't make the PS5 better hardware.

The PS5 is not even close to a 3070's fart.
 

Haggard

Banned
In memory-intensive workloads it can outclass the 3070. We've seen this in Spider-Man Remastered (RT + Ultra textures),
Lol... all we've seen is that the PS5 is not as far behind as one would think... until you compare things like the RT resolution and range, among other things, and realize that the PS5 is producing potato quality in comparison in order not to grind to a slideshow... The moment RT is involved, the PS5 barely matches a 2060.
 
Last edited:

StreetsofBeige

Gold Member
A new console will come out in about 5 years (that would make this a 7-year life span). Assuming there's no mid-gen refresh, us console gamers will be playing on 10-12 TF boxes while PC gamers in 2027 will have GPUs that probably do 100 TF+ easily. A 4090's spec sheet already says 80 TF of FP32. Crazy.

Unless Sony and MS bump up the specs a lot for the PS6 and Series Y, PC will have a gigantic power advantage right off the bat next gen.
 

ChiefDada

Gold Member
A port being optimized on PlayStation by a Sony first-party studio, and not as well optimized on PC, doesn't make the PS5 better hardware.

Well yeah, there's only so much you can do to optimize when you have two platforms with divergent philosophies for memory management. And that's the point: the "better hardware" debate becomes useless in situations where games are designed to take advantage of a specific platform's unique features.

Anyways, I can tell where this discussion is going, so I'll stop here. And what better way to close than with this treasure of a statement:

The PS5 is not even close to a 3070's fart.
 

ChiefDada

Gold Member
Lol... all we've seen is that the PS5 is not as far behind as one would think...

Yes, hence why the terms "outclass", as I have used, and "punching above its weight" would be appropriate here.

Until you compare things like the RT resolution and range, among other things, and realize that the PS5 is producing potato quality in comparison in order not to grind to a slideshow... The moment RT is involved, the PS5 barely matches a 2060.

I do agree the PS5 hardware doesn't prioritize RT as much as I feel will be required by the end of the generation, which is why I expect a mid-gen refresh.
 

//DEVIL//

Member
Well yeah, there's only so much you can do to optimize when you have two platforms with divergent philosophies for memory management. And that's the point: the "better hardware" debate becomes useless in situations where games are designed to take advantage of a specific platform's unique features.

Anyways, I can tell where this discussion is going, so I'll stop here. And what better way to close than with this treasure of a statement:
That speaks to how badly the dev team behind the port did, not how good the PS5 hardware is, because in 99% of games the 3070 will smoke the PS5. This is not really even something to discuss or debate. I remember Gamers Nexus comparing some PS5 games to a GTX 1070, which goes back to the same point: it speaks badly of the devs porting the game to PS5.
 

Ronin_7

Banned
Don't focus that much on the future; live in the present and play the games currently available that already look great.

Plus, the future is GaaS live-service trash, so I hope you're ready for shitty games full of MTX and battle passes.
 

yamaci17

Member
In memory-intensive workloads it can outclass the 3070. We've seen this in Spider-Man Remastered (RT + Ultra textures), we'll see it again in Miles Morales, and it will be even more pronounced in future current-gen titles where memory swapping will be much more volatile... it is what it is.
Memory swapping happens because the 3070 does not have the same budget the PS5 gives to video games in Spider-Man.

If you keep settings within your own budget (high textures instead of ultra textures), the 3070 will still outperform the PS5 by a large margin. Memory swapping is not an INTENDED or WANTED behaviour; THAT is the crucial part. You cannot design games around memory swapping. Memory swapping has been a thing since 2013. It has always butchered a card's performance, and it has always happened whenever you breached your VRAM budget, which is what happened in Spider-Man. It is not special.

You actively nerf and destroy your card's performance by forcing it to memory swap. When the card memory swaps, it loses such a huge portion of its actual performance that a comparison becomes moot (similar to how a comparison between a 2 GB GTX 960 and a PS4 at ultra textures is moot: at ultra, PS4-equivalent textures in RDR2 slow the 960 to a crawl. That does not mean the PS4 is punching above the 2 GB 960; it just means you're gimping the 2 GB 960's performance. The 4 GB 960 performs the way you would expect a "960" chip to, and punches above the PS4. This example alone should explain the situation.)

The PS5 is not punching above its weight. It simply has more weight, in terms of total VRAM budget, compared to 8 GB cards.

VRAM budget ≠ performance.

You simply sacrifice huge amounts of performance if you are hell-bent on using super-high-quality textures on a limited VRAM budget. This held true for older GPUs, and it will hold true for 8 GB GPUs. It's simple math.

Memory swapping is not, and never will be, a standard in the PC gaming environment. Even a mere 3060 has 12 GB of memory, which is enough budget to cover the needs of all future-gen console ports. The standard is instead lowering texture quality to fit your own memory budget. Simply breaching the budget and making the card lose huge amounts of performance is not a proper basis for saying "the PS5 is outperforming the 3070 in memory-intensive applications". You're wrong in your judgment there: memory-intensive does not mean high memory usage; it means memory-bandwidth-intensive, which is not the case here.
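
As a toy model of that breach-the-budget effect (numbers purely illustrative, not measurements): performance stays roughly flat while assets fit in VRAM, then collapses once the card starts swapping over the PCIe bus.

Code:
def effective_fps(base_fps, needed_gb, usable_vram_gb, swap_penalty=0.35):
    if needed_gb <= usable_vram_gb:
        return base_fps                    # within budget: full speed
    overflow = needed_gb - usable_vram_gb  # every GB over budget hurts badly
    return base_fps / (1 + overflow / swap_penalty)

for needed in (5.0, 6.5, 7.0, 8.5, 10.0):
    print(f"{needed:4.1f} GB needed -> {effective_fps(90, needed, 7.0):5.1f} fps")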

You consciously make the game consume super-high amounts of memory by breaching your allocated VRAM.

Say you upgraded RDR2's textures so the game used 16 GB of VRAM; it would slow the 3070 to a crawl due to not having a big enough buffer.

This has nothing to do with anything punching above its weight. It's a misconception spread by certain "reviewers".

Please stop this misinformation from spreading, and understand the actual meaning behind such incidents.
 
Last edited: