
A Real-Time Particle Physics-based game for Xbox Series X and high-end PCs by Grant Kot

Bernkastel

Ask me about my fanboy energy!
Hi, I'm the dev of the tech demo. I've also recently started working with an artist and sound engineer (they will show up in future devlogs). I do these devlogs because I value feedback and suggestions, and they can help shape the final game. I see some concerns about the lack of gameplay, and we're working on fleshing that out, among other things. Currently the idea is an elemental-bending combat game (like Avatar) with local multiplayer/co-op (with cloud streaming you can also do it over the internet).

We might focus on the Chinese classical elements instead of the more well-known system: Wood, Fire, Earth, Metal & Water. Wood is kind of hinted at with the voxel displacement video and some of my older fibers demos. Fire, I still have lots of work to do on. Earth and Metal show up in some of the solid destruction demos, and then Water in the latest demos.

Some moves you might be able to do: collect and place blocks, shoot out your material, apply forces to your material (attract, repel, drag, vortex). Other move suggestions are welcome.

Perhaps there will also be a non-natural faction that chooses to attack via weapons like swords, spears, arrows. Maybe you won't play as them but there can be some kind of tower defense mechanic. Like Fortnite with the zombies before it became battle royale. I really want the gameplay to take advantage of the fact that the whole world can be taken apart to individual voxels or be manipulated and rearranged. So it might be a waste to play as someone who shoots arrows.

As for the console wars, I'm not interested in fueling that. I started to work on this game more seriously when I saw all the Threadrippers coming out last year, and yes, it will be able to scale up as well as scale down based on your specs. I think when I said "high-end" it got taken too seriously. When I saw the XSX specs I was like, wow, the specs are better than my PC's on paper, it will probably be super affordable, and it uses DirectX, which my game already uses, so I might as well roll that into my list of targets. As a small indie dev, I think it's probably best to focus on a single platform and get it right. Lastly, the barrier to entry for PS5 is pretty high: no publicly available graphics API specs or code, and pricey devkits.

The current demos are still DX11 and do not require any advanced features. GPU usage is ~30%, and both the CPU and GPU code can be optimized further, so I have quite a bit of room to upscale things. I do have some plans to upscale with DX12U. The DX12U feature I'm most excited about is mesh shaders (of course I'll also be using other features). With mesh shaders, instead of being limited to cubes, I can have mesh-based blocks with LOD. For example, the bricks demo could upscale from plain cubes to much more geometrically detailed rocky blocks, or do grass and leaves via geometry as opposed to transparent textures.
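Roughly, the per-block LOD pick could look something like this (purely an illustrative C++ sketch with made-up names and LOD counts, not actual engine code): each doubling of camera distance drops down to a coarser block mesh, with a plain cube as the last fallback.

```cpp
// Illustrative only: choose which block mesh LOD a mesh-shader path would emit.
// LOD 0 = detailed rocky block, higher = coarser, last = plain cube.
#include <cmath>
#include <cstdint>

struct Float3 { float x, y, z; };

constexpr uint32_t kLodCount = 4;  // assumed number of authored block LODs

// Each doubling of distance beyond fullDetailRange drops one LOD level.
uint32_t SelectBlockLod(const Float3& blockCenter, const Float3& cameraPos,
                        float fullDetailRange) {
    float dx = blockCenter.x - cameraPos.x;
    float dy = blockCenter.y - cameraPos.y;
    float dz = blockCenter.z - cameraPos.z;
    float d = std::sqrt(dx * dx + dy * dy + dz * dz);

    uint32_t lod = 0;
    while (d > fullDetailRange && lod + 1 < kLodCount) {
        d *= 0.5f;
        ++lod;
    }
    return lod;
}
```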
Glad to see a post from the dev himself. Very excited about what comes from this project.
 

LordKasual

Banned
I highly doubt we'll see physics like this used liberally in larger next-gen games, but I did notice a possible use for stuff like this:




FF7R seems to have fluid dynamics on almost every body of water, but it's really hard to tell if it's "baked" or just real-time + restrictive because you can't directly interact with it.

But for example in these sewers, in every filling/draining scene the water fills/drains dynamically based on the scene and it looks really amazing.

I think it's a really nice touch. Even in areas where water isn't flowing that heavily, you can tell they used some kind of physics engine to simulate its look.
 

HarryKS

Member
Maybe he thinks the Xbros will spend money on it if he panders enough.

The Psbros did with that demo from back in the day. The shadow thing.
 

D.Final

Banned
Physics-based animations and deformations like this will be a game changer next gen. Right now a lot of destruction and physics applied to objects is baked and animated, which requires a lot of time and effort.

I fully agree
It will be interesting
 

CatLady

Selfishly plays on Xbox Purr-ies X
There is a game in development, I'm just in the early stages. The videos aren't trailers, but devlogs. Currently most of the work has been building out the game engine. Once that is further along I will be building a game on top of it.

What you're doing looks quite interesting. Please keep updating us.
 

supernova8

Banned
There is a game in development, I'm just in the early stages. The videos aren't trailers, but devlogs. Currently most of the work has been building out the game engine. Once that is further along I will be building a game on top of it.

Are you building the engine with a specific type of game in mind or is the game not really set in stone yet?
 
There is a game in development, I'm just in the early stages. The videos aren't trailers, but devlogs. Currently most of the work has been building out the game engine. Once that is further along I will be building a game on top of it.
Cool beans - thank you for the response. I look forward to seeing more.
 

LordKasual

Banned
The more I look at this, the more I can't help but think that its application is way too compute-restricted to be used extensively in any modern console.

Unless you've got some serious optimization to do, I notice that even the sandbox you're constricting the particles to is relatively small.

And I assume this is only running simulation logic -- in an actual game, unless it's actually a sandbox game, I imagine you'd need to free up even more CPU to get everything consistent and working.


So kotsoft, what are you actually planning on using this for? Are you basing a game around this technology, or just proofing it with this tech demo for application elsewhere?
 

kotsoft

Neo Member
supernova8, the game isn't set in stone yet; I'm currently still exploring what is and isn't possible. There is currently the physics simulation aspect, but there is also the possibility of bringing chemical reactions into the mix, as well as some AI creature evolution. There will probably be a big focus on building & user-generated content, like Minecraft or Dreams.

LordKasual, the game will be based around this tech. Regarding the small region, I have been planning on taking a stab at implementing chunk-based open-world loading. The main difficulty with this is handling what happens at the seams between chunks. Also, the simulator can handle sparse scenarios; the box container can be much bigger, and particle count is the main bottleneck.
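One standard way to handle the seams, and likely the first thing I'll try (this is just a sketch with made-up names, not the engine's actual code), is to copy particles that sit within one interaction radius of a chunk border into the neighboring chunk as read-only "ghosts" each substep, so forces are computed consistently on both sides:

```cpp
// Hypothetical halo/ghost exchange between two adjacent chunks (dst on the +x
// side of src). Ghosts contribute to neighbor forces in dst but are not
// integrated there; the real particle stays owned by src.
#include <vector>

struct Particle {
    float x, y, z;
    bool ghost = false;
};

struct Chunk {
    float minX, maxX;                  // chunk bounds along x (y/z omitted)
    std::vector<Particle> particles;
};

void ExchangeGhosts(const Chunk& src, Chunk& dst, float interactionRadius) {
    for (const Particle& p : src.particles) {
        if (!p.ghost && p.x > src.maxX - interactionRadius) {
            Particle g = p;
            g.ghost = true;            // marked so dst skips integrating it
            dst.particles.push_back(g);
        }
    }
}
```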

I'm also looking into adaptive resolution simulation, which should be especially helpful for liquids. If you double the radius of the particles, you can simulate the same volume with 1/8 the number of particles. For solids/elastics, it's a bit more difficult because the splitting and merging can mess with the internal stresses.
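As a rough illustration of where the 1/8 comes from (sketch only, not the real merge code): eight fine particles of the same material collapse into one coarse particle with twice the radius, conserving total mass and momentum; splitting would just invert this.

```cpp
// Illustrative merge step for adaptive resolution. Assumes all eight fine
// particles share the same material density and radius.
#include <array>

struct SimParticle {
    float mass;
    float radius;
    float pos[3];
    float vel[3];
};

SimParticle MergeEight(const std::array<SimParticle, 8>& fine) {
    SimParticle coarse{};  // zero-initialized accumulator
    for (const SimParticle& p : fine) {
        coarse.mass += p.mass;
        for (int i = 0; i < 3; ++i) {
            coarse.pos[i] += p.mass * p.pos[i];  // mass-weighted position
            coarse.vel[i] += p.mass * p.vel[i];  // total momentum
        }
    }
    for (int i = 0; i < 3; ++i) {
        coarse.pos[i] /= coarse.mass;            // centroid
        coarse.vel[i] /= coarse.mass;            // momentum / mass
    }
    // 8x the mass at the same density means 8x the volume, i.e. 2x the radius.
    coarse.radius = fine[0].radius * 2.0f;
    return coarse;
}
```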

For enemies/life, there are two options: high instance count, small swarm-based creatures, where each particle renders the mesh for the creature, or low instance count, giant creatures made up of large numbers of particles. The giant creatures can probably be more intelligent, with neural networks, and maybe their vision could use the voxel cone tracing I use for lighting. They could have a sense of touch as well, by reading in the pressures/temperatures measured by the particles they are made up of.
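Purely as an illustration of the touch idea (made-up types, not engine code), the creature's sensory input could just be averages over the particles it is built from, which would then feed whatever neural network drives it:

```cpp
// Hypothetical "touch" sense: average the pressure/temperature of a creature's
// own particles into a small feature vector for its brain.
#include <vector>

struct BodyParticle {
    float pressure;
    float temperature;
};

struct TouchSample {
    float meanPressure = 0.0f;
    float meanTemperature = 0.0f;
};

TouchSample SenseTouch(const std::vector<BodyParticle>& body) {
    TouchSample s;
    if (body.empty()) return s;
    for (const BodyParticle& p : body) {
        s.meanPressure += p.pressure;
        s.meanTemperature += p.temperature;
    }
    s.meanPressure /= static_cast<float>(body.size());
    s.meanTemperature /= static_cast<float>(body.size());
    return s;
}
```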

I won't deny there's still a ton of work to be done, and a lot of problems that still need to be solved.
 

LordKasual

Banned
LordKasual, the game will be based around this tech. Regarding the small region, I have been planning on taking a stab at implementing chunk-based open-world loading. The main difficulty with this is handling what happens at the seams between chunks. Also, the simulator can handle sparse scenarios; the box container can be much bigger, and particle count is the main bottleneck.

I'm also looking into adaptive resolution simulation, which should be especially helpful for liquids. If you double the radius of the particles, you can simulate the same volume with 1/8 the number of particles. For solids/elastics, it's a bit more difficult because the splitting and merging can mess with the internal stresses.

This is something I've wondered whether developers have ever toyed with when it comes to large bodies of water. I think it's obvious that something like the ocean or a large lake wouldn't need the entire body of water simulated, mostly just the surface, to cut down on the number of particles, but as for how far you can stretch this I wouldn't really know. And if you compromise with particle size, it changes the style of how the water looks, I assume.

What I'm really wondering is: how can this kind of tech be optimized in a way that would allow it to be applied (and adapted) to larger scales, without solely focusing on clever/restrictive use of the particle budget?

My guess is that it would boil down to some kind of technique that would be able to "cull" simulation interactions to save compute time/density. Something similar to the way you're using DX12 to handle triangle LOD in a way that lets you draw fewer triangles than you would without the mesh shaders.

Do you know if anything like that currently exists, or is even feasible for interactions like this? I imagine it would be extremely complicated, but it's the only way I see such simulations making it into normal game situations without brute-forcing it through hardware.
 

supernova8

Banned
supernova8, the game isn't set in stone yet; I'm currently still exploring what is and isn't possible. There is currently the physics simulation aspect, but there is also the possibility of bringing chemical reactions into the mix, as well as some AI creature evolution. There will probably be a big focus on building & user-generated content, like Minecraft or Dreams.

Thanks for taking the time to respond! I'm pretty sure this is the limit of my ability to discuss game engines, so I'll bow out and just wish you luck!
 

kotsoft

Neo Member
This is something I've wondered whether developers have ever toyed with when it comes to large bodies of water. I think it's obvious that something like the ocean or a large lake wouldn't need the entire body of water simulated, mostly just the surface, to cut down on the number of particles, but as for how far you can stretch this I wouldn't really know. And if you compromise with particle size, it changes the style of how the water looks, I assume.

What I'm really wondering is: how can this kind of tech be optimized in a way that would allow it to be applied (and adapted) to larger scales, without solely focusing on clever/restrictive use of the particle budget?

My guess is that it would boil down to some kind of technique that would be able to "cull" simulation interactions to save compute time/density. Something similar to the way you're using DX12 to handle triangle LOD in a way that lets you draw fewer triangles than you would without the mesh shaders.

Do you know if anything like that currently exists, or is even feasible for interactions like this? I imagine it would be extremely complicated, but it's the only way I see such simulations making it into normal game situations without brute-forcing it through hardware.

Here is a video I made showing some adaptive resolution simulation capability. This feature is still at a very early stage and there's still lots of work to be done. I basically just hacked it together since you brought it up.

 

kotsoft

Neo Member
I'm almost finished porting over the lighting system. Here are a couple of tests showing emissive materials. In the first one you can see some of the red glow reflecting off the rocks; in the second you can see the slime refracted through the water. It can hit 60 FPS, but I still have a lot of optimization work to do to get rid of framerate dips in certain scenarios. I'll also need to model some more meshes/create mesh generators to fully take advantage of the mesh shader system.


 
Still waiting on the next gen Red Faction. Yet to see a game use destruction to such detail and effect.

This video is just fluid simulation. It's not solid if the building is made of jello.
 

reptilex

Banned
I'm almost finished porting over the lighting system. Here are a couple of tests showing emissive materials. In the first one you can see some of the red glow reflecting off the rocks; in the second you can see the slime refracted through the water. It can hit 60 FPS, but I still have a lot of optimization work to do to get rid of framerate dips in certain scenarios. I'll also need to model some more meshes/create mesh generators to fully take advantage of the mesh shader system.




Do you actually render all particles or do you have a "surface/visible particle" only system which allows for optimisation? This is amazing; actually, this looks more "next-gen" than most of the games shown for the new consoles and cards.

Also, have you looked at the PanthaRei engine and what it does with Fire?
 

kotsoft

Neo Member
Such fun tech. So, has anybody toyed with the tech demo? This would melt my sad little PC so I just have to be happy with the clips...
I'm going to be doing my best to make sure this game works with a few different cloud streaming services to increase accessibility to the game. I also plan on hosting a split-screen multiplayer demo on Parsec Arcade.

Do you actually render all particles or do you have a "surface/visible particle" only system which allows for optimisation? This is amazing; actually, this looks more "next-gen" than most of the games shown for the new consoles and cards.

Also, have you looked at the PanthaRei engine and what it does with Fire?

Thanks! Currently I render all particles and rely on early depth-stencil rejection to save rasterization cycles, but I can definitely optimize by doing a surface-only system. I would just need to make sure I don't accidentally cull any particles that should actually be visible, and avoid weird popping artifacts.
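For the surface-only idea, one possible approach (sketch only; made-up names, and the real version would reuse the sim's spatial grid instead of this brute-force neighbor loop) is to flag a particle as "surface" when it has fewer than some threshold of neighbors inside its radius, and only submit those to the renderer:

```cpp
// Hypothetical surface-particle selection: interior particles are densely
// surrounded, so particles with few neighbors are kept as renderable surface.
#include <cstddef>
#include <vector>

struct P { float x, y, z; };

static float Dist2(const P& a, const P& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

std::vector<size_t> CollectSurfaceParticles(const std::vector<P>& particles,
                                            float radius, int interiorThreshold) {
    std::vector<size_t> surface;
    const float r2 = radius * radius;
    for (size_t i = 0; i < particles.size(); ++i) {
        int neighbors = 0;
        for (size_t j = 0; j < particles.size() && neighbors < interiorThreshold; ++j) {
            if (i != j && Dist2(particles[i], particles[j]) < r2) ++neighbors;
        }
        if (neighbors < interiorThreshold) surface.push_back(i);
    }
    return surface;
}
```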

I just checked out Panta Rhei, that's probably a better way to render gases than my method, though I can still sort of render volumetric smoke etc. I'm initially going to try to do everything using this one technique, but if necessary, I am still open to bringing in other simulation and rendering methods.
 

svbarnard

Banned
This question is burning in my mind: is this what games are going to look like in the 9th generation of video game consoles? Can the Xbox Series S play something like that? I mean, if video games were to look like that on a console in the next few years, I would be so amazed!

Thing is, the Xbox Series S has just a little bit more memory bandwidth than the original Xbox One, which came out 7 years ago!!!! Can video games really look like that on the Series S? I need to know, I can't stop thinking about it!
 

CamHostage

Member
I'm going to be doing my best to make sure this game works with a few different cloud streaming services to increase accessibility to the game. I also plan on hosting a split-screen multiplayer demo on Parsec Arcade.

Heh, you know, I actually never even thought of cloud streaming being a PC solution, but I'm a laptop guy (and a cheapo at that,) so that would be something! I'm also much more a console guy, so I wish you all the luck with your hopes to bring along an Xbox product at some point.

Also, have you looked at the PanthaRei engine and what it does with Fire?

Funny, I was kind of thinking of Mr. Kot's project when the Deep Down 7 Years Later thread surfaced. That project and this one remind me of the times in that sweet spot of gaming generational shifts where developers just find some wholly new aspect to play with and build a whole game specifically around it, rather than the typical project that starts off with a character or a franchise or just the idea of having lights go down for its eventual "IN A WORLD..." epic trailer.

Not related to either this new Liquid Crystal Demo or the old Panta Rhei tech, but if you're interested in physics simulation (and are, like me, dying for some next-gen tech flexes that we sadly have been robbed of with so little GDC this year) you can check out videos of a VFX technology plugin by JangaFX called EmberGen. (*It does not "export volumetric data to game engines" at this time, so it's not for gameplay use yet, but they say they intend to.) It's tech for smoke, fire, explosions, magic, and other things that glow while swirling in the breeze.
 

kotsoft

Neo Member
This question is burning in my mind: is this what games are going to look like in the 9th generation of video game consoles? Can the Xbox Series S play something like that? I mean, if video games were to look like that on a console in the next few years, I would be so amazed!

Thing is, the Xbox Series S has just a little bit more memory bandwidth than the original Xbox One, which came out 7 years ago!!!! Can video games really look like that on the Series S? I need to know, I can't stop thinking about it!
I'm speculating based on paper specs. It should be possible to keep the same level of physics simulation, as the XSS CPU still looks very powerful. Maybe graphics settings would need to be lowered, but I still have a lot of optimization work planned. My game doesn't use much memory (no textures, no open world streaming yet, instanced meshes), so I should be able to stick to the faster subset and the memory bandwidth there looks ok (8GB @ 224 GB/s).

Heh, you know, I actually never even thought of cloud streaming being a PC solution, but I'm a laptop guy (and a cheapo at that,) so that would be something! I'm also much more a console guy, so I wish you all the luck with your hopes to bring along an Xbox product at some point.



Funny, I was kind of thinking of Mr. Kot's project when the Deep Down 7 Years Later thread surfaced. That project and this one remind me of the times in that sweet spot of gaming generational shifts where developers just find some wholly new aspect to play with and build a whole game specifically around it, rather than the typical project that starts off with a character or a franchise or just the idea of having lights go down for its eventual "IN A WORLD..." epic trailer.

Not related to either this new Liquid Crystal Demo or the old Panta Rhei tech, but if you're interested in physics simulation (and are, like me, dying for some next-gen tech flexes that we sadly have been robbed of with so little GDC this year) you can check out videos of a VFX technology plugin by JangaFX called EmberGen. (*It does not "export volumetric data to game engines" at this time, so it's not for gameplay use yet, but they say they intend to.) It's tech for smoke, fire, explosions, magic, and other things that glow while swirling in the breeze.
EmberGen is really cool. I think we will be seeing it pop up in various games. There might not be volumetric export yet but people can still export image sequences. On their site they say Bluepoint is using their tools.
 