
PS5 Tech Demo Detailed, Called 'Hugely Impressive'

diffusionx

Gold Member
A rising tide lifts all ships.

I'm a console gamer AND a PC gamer, and am balanced enough to say the next gen of consoles will dramatically raise the bar for new titles available to PC gamers too. It's not like PC gamers have been living the 'next gen' life for the past two years with PS5 owners only now about to join in; anyone presenting it that way is being disingenuous. Yes, multi-platform titles can have superior FPS on a high-end PC, but to see a difference in the graphical bells and whistles from a One X or a Pro you really have to reach.

In about 13 months you'll see what next gen REALLY looks like, and it's not PC gaming circa 2017.

This might be true. But at the same time, I remember playing BF3 at 1080p/60fps on my PC after I got a 660 Ti, which was not a high-end GPU by any means. Then over a year later I got Battlefield 4 on PS4. It was 900p and 40-50fps at best. And it didn't look any better than BF3 on PC; it actually looked worse. So it really didn't feel like I was getting next gen on my new console in 2013. Maybe this gen just sucked and the PS5 will be that awesome. I'm skeptical though, especially when Sony's big selling point is an SSD.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Just a thought. On PC you have slow 5,400 RPM HDDs (somewhat worse than what you can find in consoles), regular HDDs, fast HDDs, and ultra-fast SSDs, etc. Do you honestly think that loading The Witcher 3's map in 10 seconds, compared to 70 seconds on PS4, isn't "incorporated"?
You're talking about loading-time benefits, but SSDs can be used as a cache in next-generation games and will also prevent devs from having to duplicate assets.
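To make the asset-duplication point concrete, here's a toy back-of-the-envelope model (all figures below are illustrative assumptions, not measurements of any real game). On an HDD, every random seek costs on the order of 10 ms, so developers pack a copy of shared assets next to each level chunk to keep reads sequential; on an SSD, seeks are so cheap that a single copy can be fetched on demand:

```python
# Toy model of HDD vs SSD asset layout. All numbers are rough,
# illustrative assumptions.
HDD_SEEK_S, SSD_SEEK_S = 0.010, 0.0001    # per-seek latency (s)
HDD_READ_GBPS, SSD_READ_GBPS = 0.15, 2.5  # sustained read (GB/s)

def load_time(asset_gb, n_references, seek_s, read_gbps, duplicated):
    # Duplicated assets sit next to the level data -> one sequential read;
    # a single shared copy forces one seek per reference.
    seeks = 1 if duplicated else n_references
    return seeks * seek_s + asset_gb / read_gbps

# 0.5 GB of shared assets referenced from 200 places in a level:
print("HDD, duplicated copies:", load_time(0.5, 200, HDD_SEEK_S, HDD_READ_GBPS, True))
print("HDD, single copy:      ", load_time(0.5, 200, HDD_SEEK_S, HDD_READ_GBPS, False))
print("SSD, single copy:      ", load_time(0.5, 200, SSD_SEEK_S, SSD_READ_GBPS, False))
```

Under these assumptions the HDD only keeps up when assets are duplicated, while the SSD is fastest with a single copy, which is the duplication saving being described.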
 

Justin9mm

Member
If we're again going to be getting 60@1080 or 30@4K, I'll be sticking with 1080p.

My TV only does 1080p at any rate, and I've read that since I sit roughly 10 feet away upgrading to a 4K TV would be a complete waste of time (although I'll be missing out on HDR).
I thought there were some 1080p HDR TVs? Maybe I'm thinking of monitors.
 

ManaByte

Member
[image]
 
I don't think you played it, but Killzone 2 looked equally good when it was released, if not better. It's on YouTube too.
[images: Killzone 2 screenshots]


You could maybe argue that the lighting and shadows in Killzone 2 looked better, but that's about it. The texture quality, volumetric effects, and detailed animations such as the hair moving in the wind weren't achieved back then. I remember playing it and thinking it looked great for a console exclusive at the time, but there's a noticeable gap between real-time game graphics and CGI.
 

Winter John

Gold Member
"it is possible that the finished results will come closer to 60 fps in the future."

Hugely impressive.

Generational leap.



 
Tech demos, right....





I grant you that Capcom's demo is something; I wonder how they pulled it off.

As for the Wizard demo, I have no doubt this is all canned, captured animation, and if you look at the best-looking PS4 games you will find that it is in the same ballpark (Horizon Zero Dawn, Uncharted 4, Spider-Man, Bloodborne, etc.; I would even still put Killzone: SF in there).

And with the last games for the CURRENT gen consoles, we can safely say we have finally reached this level of fidelity, literally almost a decade and one hardware generation later. So whatever Sony and MS are going to show when revealing their hardware, I'm not getting fooled this time around. Hell, today you can't even trust "gameplay" trailers, because you never know whether the game will be seriously downgraded just a few months later. I've learnt to just sit and wait for the final product to be released; it's simply so much better to be positively surprised than negatively disappointed.
You are being fooled: the Samaritan demo was running on top-of-the-line PC hardware on March 20th, 2012 (11 months before the PS4 was announced), and it looks pretty good, like all Unreal Engine demos, but it also has clear flaws compared to the top games this year. The version of the video you have is of such bad quality that it hides any kind of problems with the rendering (it's real time; it's not perfect).

Here is the 1080p version:


As for the 2010 Unreal Engine 3 demo: it looks great, but it uses a lot of depth-of-field effects to hide anything that could reveal its actual age (again, not a slight against Epic; they are always close to the cutting edge of technology). Otherwise, it seems very similar to games using the engine back then. Have you seen Gears of War 2 & 3? They do different things, but they certainly are as impressive.
 

ManaByte

Member
Bloody hell mate, you need to understand we're not at this level yet. What do you want from a $600 machine? To turn wine into glacier water?

It happens at every new console launch. People think the $600 box is going to make $3000 gaming rigs obsolete. People need to have a realistic expectation when they're buying a game console that goes under the TV.
 

sendit

Member
It happens at every new console launch. People think the $600 box is going to make $3000 gaming rigs obsolete. People need to have a realistic expectation when they're buying a game console that goes under the TV.

Agreed. However, you don't often see games that push current PC hardware aside from brute-forcing frame rates and resolution. That's where this perceived notion comes from: a new-generation console replaces console hardware that is ~8-10 years old, so the jump looks bigger than it really is.
 
It happens at every new console launch. People think the $600 box is going to make $3000 gaming rigs obsolete. People need to have a realistic expectation when they're buying a game console that goes under the TV.

You're right. It makes them better. Everyone wins. I understand the sentiment of wanting your purchase to last as long as possible, which is why low-cost options should still exist, but at some point half-step band-aid consoles that are more iterative than revolutionary become problematic, in that they're symptomatic of a larger issue. The PS4/Xbone generation has already gone on far too long IMO. The OG Xbox One should be derided for the awful decision that it was (overpriced, underpowered hardware which didn't target any group that *actually* exists particularly well).
 
Tech demos being tech demos. With the 360, people walked away claiming it was rendering individual blades of grass. With the GameCube, the demo Rebirth blew minds with intricate forestry. ((Yawn)) Been around the block with this since Intellivision was mocking the graphics of Atari.
PS3 did it:

What's so weird about it? Even BoTW does it on Wii U/Switch.

I mean, if the PS5 really turns out to be using a somewhat better SSD than most PCs, will that mean that most PC games will be watered down? If you think that could happen because most multiplatform games are built with X1 limitations in mind, will we now have PS5 users calling PC HDD users plebs for their storage option? Just a random thought, not trolling or anything. Recall when Activision tried to enforce 6GB of RAM in COD: Ghosts.
I'm not going to ask how old you are (you sound quite young), but there used to be an era (late 80s/early 90s) where consoles were ahead of PCs in terms of sprite acceleration, loading times (cartridge vs floppy disc) etc.

Looks like we're going back to that era, at least in regards to the storage medium.

PCs might be able to brute force it with PCIe 4.0 NVMe, unless it's more custom than that. What's concerning right now is the lack of PCIe 4.0 options in the PC space. There's only X570 and nothing from Intel.
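For a rough sense of why the storage medium matters here, a quick back-of-the-envelope comparison (throughput figures below are ballpark assumptions; real drives vary, and decompression overhead usually bites long before these limits):

```python
# Rough loading-time arithmetic for the storage tiers under discussion.
SCENE_GB = 10  # hypothetical next-gen scene/streaming budget

throughput_gbps = {          # approximate sustained read, GB/s
    "5400rpm HDD":      0.10,
    "SATA SSD":         0.50,
    "PCIe 3.0 x4 NVMe": 3.0,
    "PCIe 4.0 x4 NVMe": 5.0,
}

for device, gbps in throughput_gbps.items():
    print(f"{device:>17}: {SCENE_GB / gbps:6.1f} s")
```

By raw bandwidth alone, a PCIe 4.0 NVMe drive could plausibly brute-force whatever Sony's custom SSD does, which is the point being made above.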

SSDs haven't been incorporated into a game's development on PC though
Star Citizen is an exception. Take a look at the latest DF video.

It's the only PC game right now that deserves the "PCMR" tag. Everything else is a spruced up console port.

Tech demos, right....


Detroit Become Human characters (especially Markus) have the same quality as this demo.

Heavy Rain surpassed this tech demo:



Too bad the crown was only at ~30fps.
You probably don't remember Crysis 1 performance on 2007 PCs.

That's what consoles do: they push the graphics envelope.

Imagine if a modern Crysis equivalent pushed Intel 9900k & RTX 2080 Ti to the point that it could barely render 1080p at 30 fps.
 

One such rumor has started a fervor of hype, with a user by the name of Kleegamefan giving a first-hand account of a demo for a game that is being developed for the PlayStation 5. For the sake of transparency, it should be made perfectly clear that this is a single, alleged description from a mostly anonymous user, but it would seem that the rest of the users on the forum believe the initial report.

According to the post, the game that Kleegamefan saw the demo of was so visually stunning that it's difficult to put the quality of the graphics and physics engines into words, thanks to both the game's engine and the PlayStation 5's incredibly powerful SSD. From a number of replies, one of the most impressive feats of the game seems to have been the way that the game engine casts shadows, allegedly casting individual shadows from each leaf of a small bush, intricately expressing the movement of every branch and leaf as they swayed in the wind. This has led some commenters to speculate that these types of effects are achieved by ray-tracing, but if the claims from Kleegamefan can be believed, the effect is an incredible sight, almost indistinguishable from live footage of foliage blowing in the breeze.

Kleegamefan also stresses in their posts that the demo they saw was unfinished and that the astounding experience is far from being the shipped project. For some players obsessed with framerates and resolutions, it may be disappointing to note that the game was running somewhere between 25-30 fps, although the resolution was reportedly 2160p, and considering that this was an early demo, it is possible that the finished results will come closer to 60 fps in the future. Either way, Kleegamefan sums up the posts best by stating that it likely won't be the specs of the PlayStation 5 that will blow expectations, but the quality of the games that will capture audience attention as more information comes to light.
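As a purely illustrative aside on the per-leaf shadow claim: the ray-traced approach being speculated about boils down to casting a ray from each shaded point toward the light and testing it against the foliage. A minimal sketch, with leaves approximated as spheres (every name and number here is a made-up assumption; real renderers use BVH acceleration structures and actual meshes):

```python
import numpy as np

def in_shadow(point, light_pos, leaf_centers, leaf_radius):
    """One shadow ray: does any leaf block the path from point to light?"""
    d = light_pos - point
    dist = np.linalg.norm(d)
    d = d / dist                                  # unit ray direction
    oc = leaf_centers - point                     # (N, 3) origin -> centers
    t = oc @ d                                    # projection along the ray
    perp = oc - np.outer(t, d)                    # perpendicular offset
    hit = (np.linalg.norm(perp, axis=1) < leaf_radius) & (t > 0) & (t < dist)
    return bool(hit.any())

leaves = np.random.rand(500, 3)                   # a small bush of point-leaves
print(in_shadow(np.array([0.5, 0.5, -1.0]),
                np.array([0.5, 0.5, 10.0]), leaves, 0.02))
```

Doing this test per leaf is what makes each leaf cast its own shadow; the cost scales with rays times geometry, which is why hardware ray tracing keeps coming up in the speculation.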
If you think 30fps is disappointing then you need to fly to Pluto and start a new world because you don't belong in this one!
 
PS3 did it:

What's so weird about it? Even BoTW does it on Wii U/Switch.

Mario Sunshine did it too. Insert more examples of grass here: [ ] I don't recall Flower being used to demo the PS3 technology, given that it came out 2 years after the system's launch, but assuming it was: those 200,000 blades of grass... fantastic... but also an extremely far cry from the processing required to render an entire field, such as the background in the example you provided. I think what you may be sidestepping (or choosing to ignore) is that people see X and assume Y, which of course is one aspect (goal?) of tech demos.

The "X" I previously referred to was the latest Tiger Woods game coming out with the launch of the 360. It made a point to zoom in close on the golf ball surrounded by thousands of individual blades of grass blowing blissfully in the breeze. The associated "Y" was that people literally believed the game was simulating all that grass all the time. Omigosh the processing powarr! Do not read this as a suggestion that everyone was fooled... but people perhaps not well versed in the rendering technology and limitations of the time? You betcha.
 

Bryank75

Banned
I just want to give everyone here a heads-up.... we will all be buying RDR2 again.

You think the PC version was the ultimate one? That thing is gonna look like horse-manure against the next-gen version.
 
Sony always show some BS demo at new console launches, it’s tradition!

Killzone 2, Toy Story Graphics, Deep Down, UC4 will be 60fps and look just like this!

Never fails to fool the loyalists so why not?!
 
Killzone 2 didn't quite, but ...


This is what next gen should look like. Killzone Shadow Fall tried, but as you can see here in this CGI trailer, smoke and explosions are volumetric, while in real time they're flat sprites. The second major difference is the polycount: in CGI everything is round, while in real time you can tell the edges.
 

Meowzers

Member
It happens at every new console launch. People think the $600 box is going to make $3000 gaming rigs obsolete. People need to have a realistic expectation when they're buying a game console that goes under the TV.

Though it's pretty much consoles that dictate how games look. If all of a sudden there were no consoles and games were made for said $3000 gaming rigs, they would take a massive leap.

Like what Wu-Tang said. Consoles Rule Everything Around Me.
 

ZywyPL

Banned
This just showed up on my YT recommendations lol:




A good reminder not to get too excited indeed. That being said, I wonder what the bottleneck will be this time around? Because there's always one, and a false promise to overcome it always follows (power of the Cell, power of the Cloud, 8GB GDDR5, etc.).
 

Journey

Banned
I don't think you played it, but Killzone 2 looked equally good when it was released, if not better. It's on YouTube too.


LoL

I think the Killzone E3 2005 bullshit video might even have featured ray tracing... on a freakin' PS3!





Expectations

[image: Killzone2.jpg]

VS.

Reality

[image: Killzone2RT.jpg]
 

sendit

Member
Though it's pretty much consoles that dictate how games look. If all of a sudden there were no consoles and games were made for said $3000 gaming rigs, they would take a massive leap.

Like what Wu-Tang said. Consoles Rule Everything Around Me.

Agreed, which is why I believe an ~8-year console life cycle is way too long. It should be 4 max.
 

Kenpachii

Member
This is a concise post on the PS4 and AMD back in 2012/13; you can possibly get an idea for 2020 from it.


So what's the most powerful mobile GPU they've got that's under 100W?

SSDs haven't been incorporated into a game's development on PC though

They have; try Black Desert Online and that RTS space game. Without an SSD you will hit brick walls in the maps.
 

Kenpachii

Member
I think it will be based on one that gets revealed next year. Nothing that is out now.

That's actually interesting what you posted.

Power consumption of the 7970M is in the same ballpark as the GeForce GTX 675M. According to Dell, users can expect a 100 Watt TDP from the AMD GPU. The subsequently large heat output suits the 7970M as an option only for large laptops or DTRs (desktop replacements) that have relatively powerful cooling solutions.

Meanwhile, the original power consumption of the 7870, which that chip is based on, was only 130 watts.

Now if you look at this list:

[image: GPU power consumption comparison chart]


You can see that a 5700 sits at 162 watts, so let's say it cuts down to 130 watts. I could see this happening by dropping some clocks, and with a better-optimized architecture it might drop below 100W at lower clocks.
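For what it's worth, the downclocking math is plausible because dynamic power scales roughly with voltage squared times frequency, and voltage drops along with clocks. A quick sketch (the scaling factors are illustrative assumptions, not leaked specs):

```python
# Back-of-the-envelope dynamic power scaling: P ~ C * V^2 * f.
def scaled_power(base_watts, clock_scale, volt_scale):
    """Scale board power by clock and voltage, capacitance held constant."""
    return base_watts * (volt_scale ** 2) * clock_scale

base = 162.0  # RX 5700 board power quoted above (W)
print(round(scaled_power(base, 0.85, 0.90), 1))  # 85% clocks @ 90% V -> ~111.5 W
print(round(scaled_power(base, 0.75, 0.85), 1))  # 75% clocks @ 85% V -> ~87.8 W
```

So a modest clock drop really could land a 5700-class chip under 100W, which is the scenario being described.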
 
Mario Sunshine did it too. Insert more examples of grass here: [ ] I don't recall Flower being used to demo the PS3 technology, given that it came out 2 years after the system's launch, but assuming it was: those 200,000 blades of grass... fantastic... but also an extremely far cry from the processing required to render an entire field, such as the background in the example you provided. I think what you may be sidestepping (or choosing to ignore) is that people see X and assume Y, which of course is one aspect (goal?) of tech demos.

The "X" I previously referred to was the latest Tiger Woods game coming out with the launch of the 360. It made a point to zoom in close on the golf ball surrounded by thousands of individual blades of grass blowing blissfully in the breeze. The associated "Y" was that people literally believed the game was simulating all that grass all the time. Omigosh the processing powarr! Do not read this as a suggestion that everyone was fooled... but people perhaps not well versed in the rendering technology and limitations of the time? You betcha.
I'm not ignoring anything. Maybe the Xbox 360 didn't have the compute power to simulate that many grass blades in real time. Some people argue that MEMEXPORT provided adequate GPGPU capabilities to Xenos.

Cell SPUs vs Xenon VMX128 is no contest in this area.

GPGPU compute (the spiritual successor of Cell stream processing) is the way they do it on Wii U/Switch and these are considered rather "weak" consoles.
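To illustrate what "doing it with GPGPU compute" means in practice, here's a toy, data-parallel grass-sway update written in NumPy as a stand-in for a GPU compute kernel (every constant and name is an illustrative assumption, not from any real engine). Each blade's offset depends only on a shared wind field and its own position, so the work maps cleanly onto thousands of GPU threads:

```python
import numpy as np

def sway_offsets(blade_pos, t, wind_dir=np.array([1.0, 0.0]),
                 gust_freq=0.8, amplitude=0.05):
    """Per-blade tip offset: a phase-shifted sine driven by a wind field."""
    # Phase shift by position along the wind direction, so gusts appear
    # to travel across the field instead of moving every blade in sync.
    phase = blade_pos @ wind_dir
    sway = amplitude * np.sin(2 * np.pi * gust_freq * t + phase)
    return sway[:, None] * wind_dir   # offset vector per blade

# 200,000 blades (the Flower-era number quoted upthread) is trivial
# for this kind of embarrassingly parallel update.
blades = np.random.rand(200_000, 2) * 100.0
offsets = sway_offsets(blades, t=1.5)
print(offsets.shape)  # (200000, 2)
```

The per-blade independence is the whole trick: there is no cross-blade dependency, so even a "weak" GPU can churn through hundreds of thousands of blades per frame.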

That being said - I wonder what will be the bottleneck this time around? Because there's always one, and a false promise to overcome it always follows (power of the Cell, power of the Cloud, 8GB GDDR5 etc.).
It's not a false promise when something eventually delivers.

Cell delivered with games like Uncharted 2; it easily overcame RSX deficiencies.

GPGPU compute and CPU multi-threading also delivered with games like Uncharted 4, DOOM, Gears 5 etc.

Even PS3's Blu-ray, which was mocked to death due to cost, eventually delivered... hell, we need 2 × 50GB discs these days!

Where's the false promise there? Just because some piece of tech cannot be understood by the masses (trust me, it happens every gen), it doesn't mean it was phony.

Agreed, which is why I believe a ~8 year console life cycle is way too long. It should be 4 max.
4 years max?

Good luck with modern AAA game dev cycles taking as long as 5-7 years.
 

sendit

Member
4 years max?

Good luck with modern AAA game dev cycles taking as long as 5-7 years.

Devs are fully capable of iterating what they have done onto newer hardware while still supporting the previous hardware it was originally intended for. I wouldn't mind if they went to a model similar to how phones are iterated every year, with incremental updates completely blurring what a generation is. What makes this even easier is the use of standardized parts and not some exotic Cell CPU.
 

diffusionx

Gold Member
Devs are fully capable of iterating what they have done onto newer hardware while still supporting the previous hardware it was originally intended for. I wouldn't mind if they went to a model similar to how phones are iterated every year, with incremental updates completely blurring what a generation is. What makes this even easier is the use of standardized parts and not some exotic Cell CPU.

A hardware platform that iterated incrementally on a continual basis using standardized parts and interfaces sounds great. Maybe even open it up and let people decide what parts they want to use in their system.
 