
AMD - Primitive Shaders vs Mesh Shaders

If I had to pick between the two, Rift Apart & HFW would be in that running. I probably lean toward HFW though; the level of detail and fluidity in that game, given it's cross-gen, is impeccable. The expansion will be PS5-only, so I'm really interested to see how it utilizes that system without needing to account for PS4 like the base game does.
I have yet to play either HZD or FW. They are on my list for some point this year and I have heard the sentiment you shared from others. Sony really does have talented folks making games.
 

SlimySnake

Flashless at the Golden Globes
Lol


Other than Fortnite, what games are using Nanite and Lumen? Almost none.

By the time they do the industry will catch up, especially Sony first parties.
Why haven't they? Several first-party Sony studios have released next-gen-only games. None of them have even tried to do anything remotely close to Lumen and Nanite.

Epic doesn't make games, but they've recently shown demo after demo with Nanite integration. The first Nanite demo is literally 3 years old now. They are so far ahead of everyone, it's not even funny.

Best thing for Sony, MS and all third-party studios is to give up and pay Epic the royalties to reduce dev time and get the fucking next-gen engine that's already up and running. These 300-400 person studios don't have R&D teams like Epic does. Epic has devs making entire levels in 3 days while Sony and MS are taking 5 years using last-gen development tools.
 

Loxus

Member
Why haven't they? Several first-party Sony studios have released next-gen-only games. None of them have even tried to do anything remotely close to Lumen and Nanite.

Epic doesn't make games, but they've recently shown demo after demo with Nanite integration. The first Nanite demo is literally 3 years old now. They are so far ahead of everyone, it's not even funny.

Best thing for Sony, MS and all third-party studios is to give up and pay Epic the royalties to reduce dev time and get the fucking next-gen engine that's already up and running. These 300-400 person studios don't have R&D teams like Epic does. Epic has devs making entire levels in 3 days while Sony and MS are taking 5 years using last-gen development tools.
Unreal Engine 5 launched in April 2022; I don't think we would have seen any UE5 titles before that date.

I do think it's possible that future PlayStation Studios games will implement UE5's Nanite and Lumen into their engines, based on this: Epic reveals how it's been working with Sony for years on Unreal Engine 5

And the fact that they invested $1 billion into Epic.
Sony Goes Big On Epic Games With Whopping $1 Billion Investment

Plus other investments they made in Epic earlier.
This isn’t the only significant investment Sony has made in Epic. In 2020, Sony invested $250 million in Epic for reasons that, at least publicly, could only be defined as “a bunch of business buzzwords.” A year later, Sony poured a further $200 million in for much of the same. Today’s investment, the third to date, is more than twice the prior two—combined.
 

DaGwaphics

Member
I know. Mesh Shaders have been around since 2018 and we still don't have them being used.
On the XSX we have SFS (Sampler Feedback Streaming), which will absolutely help with memory management, especially on the Series S, yet no one has adopted it.
I don't know how much work is required at the engine level to use them, or if support at the API level alone is enough.

Last-gen and the older PC hardware that can't handle some of the new tech (at least in a way that results in performance gains, as the tech was designed to do) slows the adoption. I don't think Sony's devs are so special that they know the tech better than anyone else; they just have a very specific platform to target and can make the most of that.
 

DenchDeckard

Moderated wildly
It's always good to see how far ahead Microsoft thinks; it may not always pay off, but they keep introducing next-gen tech and features.
 

winjer

Gold Member
In this blog, AMD explains how their Next Generation Geometry (NGG) pipeline relates to Mesh Shaders.


Here is a snippet, as there is a lot of info in this blog.

Mesh Shaders were introduced to Microsoft DirectX® 12 in 2019[1] and to Vulkan as the VK_EXT_mesh_shader extension in 2022. Mesh Shaders introduce a new, compute-like geometry pipeline, that enables developers to directly send batches of vertices and primitives to the rasterizer. These batches are often referred to as meshlets and consist of a small number of vertices and a list of triangles, which reference the vertices. Conceptually these meshlets are very similar to the primitive subgroups we explored earlier in that they can also be used to represent a small part of a larger mesh, but as these meshlets are completely user-defined, they can also be used to render procedural geometry such as terrains or tessellated geometry such as subdivision surfaces. The latter will be covered in more detail in a later entry to this blog post series. In this first installment, we will focus on rendering a triangle mesh using mesh shaders.
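To make the meshlet idea above concrete, here is a rough Python sketch of an offline meshlet builder, assuming a flat triangle index buffer as input. The 64-vertex / 124-triangle defaults are common illustrative limits, not values prescribed by the blog, and the greedy packing strategy is hypothetical:

```python
def build_meshlets(indices, max_verts=64, max_prims=124):
    # Greedily pack a flat triangle index buffer into meshlets.
    # Each meshlet stores a small local vertex list plus triangles that
    # reference those vertices by *local* index -- the user-defined
    # batches a mesh shader would emit to the rasterizer.
    meshlets = []
    verts, local, prims = [], {}, []

    def flush():
        nonlocal verts, local, prims
        if prims:
            meshlets.append({"vertices": verts, "triangles": prims})
        verts, local, prims = [], {}, []

    for t in range(0, len(indices), 3):
        tri = indices[t:t + 3]
        # Vertices of this triangle not yet present in the current meshlet.
        new = [v for v in dict.fromkeys(tri) if v not in local]
        # Start a fresh meshlet if this triangle would overflow a limit.
        if len(verts) + len(new) > max_verts or len(prims) + 1 > max_prims:
            flush()
            new = list(dict.fromkeys(tri))
        for v in new:
            local[v] = len(verts)
            verts.append(v)
        prims.append(tuple(local[v] for v in tri))
    flush()
    return meshlets
```

For two triangles sharing an edge, `build_meshlets([0, 1, 2, 2, 1, 3], max_verts=4)` yields a single meshlet with four vertices and two local-index triangles; lowering `max_verts` to 3 splits them into two meshlets.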

Mesh Shaders introduce two new API shader stages:

  • An optional Task or Amplification Shader, which controls which and how many mesh shader thread groups to launch.
  • The Mesh Shader which outputs the aforementioned set of vertices and primitives to the rasterizer.
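In spirit, the optional task stage acts as a filter in front of the mesh stage: it decides which groups launch at all. A hypothetical CPU-side simulation of that two-stage flow (all names here are illustrative, not API calls):

```python
def task_stage(meshlets, is_visible):
    # The task/amplification stage controls which and how many mesh
    # shader thread groups to launch -- here, one group per meshlet
    # that survives a user-supplied visibility test.
    return [i for i, m in enumerate(meshlets) if is_visible(m)]

def mesh_stage(meshlet):
    # Each launched group emits its vertices and primitives to the
    # rasterizer; here we just return the counts it would report.
    return (len(meshlet["vertices"]), len(meshlet["triangles"]))

def dispatch_mesh(meshlets, is_visible=lambda m: True):
    # Simulated dispatch: task stage first, then one mesh stage
    # invocation per surviving meshlet.
    launched = task_stage(meshlets, is_visible)
    return [mesh_stage(meshlets[i]) for i in launched]
```

The point of the split is that culling happens per meshlet before any vertex work is done, so an off-screen meshlet costs only one visibility test.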

Mesh Shaders and NGG

Before taking a look at amplification shaders, we briefly look at how mesh shaders fit into the NGG pipeline.

As discussed in the first section, the NGG pipeline consists of two shader stages, the surface shader and the primitive shader. Primitive shaders are used to process primitive groups and are able to export both vertex attributes as well as primitive connectivity information – i.e., primitives – to the primitive assembler via the shader export. It is easy to see how this functionality can be directly used to map mesh shaders onto the NGG pipeline.

The DispatchMesh command directly specifies a three-dimensional grid of mesh shader thread groups to be launched, which in turn directly map to primitive groups. Thus no vertex de-duplication or reuse scanning is needed before launching a mesh shader thread group. We discussed previously that mesh shader launches are thus quite similar to compute shader dispatches. However, mesh shader thread groups are still launched through the geometry engine. The geometry engine in this case is still responsible for tracking and managing allocations in the shader export and managing the state of the primitive assembler (primitive mode, culling, etc.). Most importantly, as the numbers of vertices and primitives are not known before launching a mesh shader, the geometry engine receives the actual vertex and primitive counts and forwards them to the primitive assembler.

[Figure: mesh shader dataflow on RDNA graphics cards]


As the duties of the geometry engine when using mesh shaders differ vastly from those of traditional vertex shading, the geometry engine implements a special fast launch mode, which bypasses any vertex reuse checking and primitive subgroup formation stages.
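The reuse checking being bypassed can be illustrated with a toy model: in the traditional path the hardware scans the index stream and skips re-shading recently seen vertices, while in the mesh shader path each group simply shades the vertices its meshlet carries, with de-duplication done offline when the meshlets were built. A rough sketch, assuming a FIFO model of the reuse cache (real hardware reuse windows differ):

```python
def vs_invocations_with_reuse(indices, cache_size=32):
    # Traditional path: the geometry engine scans the index stream and
    # reuses recently shaded vertices (modeled here as a FIFO cache).
    cache, invocations = [], 0
    for v in indices:
        if v not in cache:
            invocations += 1
            cache.append(v)
            if len(cache) > cache_size:
                cache.pop(0)
    return invocations

def vs_invocations_mesh_shader(meshlets):
    # Fast-launch path: no reuse scan in hardware -- each thread group
    # shades exactly the vertices its meshlet carries, deduplicated
    # offline by whatever tool built the meshlets.
    return sum(len(m["vertices"]) for m in meshlets)
```

With well-built meshlets the two paths do comparable amounts of vertex work; the difference is that the mesh shader path pays no per-draw scanning cost at launch.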
 