
LocalRay - the future of RT? Will PS5 / XSX possibly use it?

pawel86ck

Banned
LocalRay uses proprietary, patented algorithms designed from the bottom up for physically accurate ray tracing in VR/AR. This new approach is based on eliminating the acceleration structures (AS) that are a core component of every ray tracing system today. This elimination cuts the expensive traversal time and avoids the repeated reconstruction of the AS after every major change in the scene. Both traversal and reconstruction, which are showstoppers for real time, are now a thing of the past.

Simply put, it requires fewer rays. This AS replacement is called DAS (dynamically aligned structures); it's proprietary to Adshir, but it runs on a conventional GPU pipeline.

We designed it from the bottom up to go after low-end devices. We have multiple deals across the console (PS5, maybe?) and mobile sectors, but we are not allowed to talk about them until the second half of the year (coincidentally, that's exactly when the next-gen consoles should be revealed).
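For context on what "eliminating the AS" means: a conventional ray tracer builds an acceleration structure such as a BVH, rebuilds or refits it whenever the scene changes, and then traverses it for every single ray. Below is a minimal Python sketch of that conventional flow, purely illustrative (a toy BVH, not Adshir's DAS and not anything from their patents):

# Purely illustrative toy example -- a classic BVH, i.e. the kind of
# acceleration structure (AS) LocalRay claims to do away with.
# This is NOT Adshir's DAS; all names here are made up for the sketch.
import math

class Sphere:
    def __init__(self, center, radius):
        self.center, self.radius = center, radius

class BVHNode:
    def __init__(self, spheres):
        # Axis-aligned bounding box of everything below this node.
        self.lo = [min(s.center[i] - s.radius for s in spheres) for i in range(3)]
        self.hi = [max(s.center[i] + s.radius for s in spheres) for i in range(3)]
        if len(spheres) <= 2:
            self.leaf, self.children = spheres, None
        else:
            # Median split on the longest axis. This build step is the
            # "reconstruction" that must be repeated whenever the scene
            # changes in a major way.
            axis = max(range(3), key=lambda i: self.hi[i] - self.lo[i])
            spheres = sorted(spheres, key=lambda s: s.center[axis])
            mid = len(spheres) // 2
            self.leaf = None
            self.children = [BVHNode(spheres[:mid]), BVHNode(spheres[mid:])]

def hit_box(node, origin, direction):
    # Standard slab test: can this ray possibly enter the node's box?
    t0, t1 = 0.0, math.inf
    for i in range(3):
        if abs(direction[i]) < 1e-12:
            if not (node.lo[i] <= origin[i] <= node.hi[i]):
                return False
            continue
        a = (node.lo[i] - origin[i]) / direction[i]
        b = (node.hi[i] - origin[i]) / direction[i]
        t0, t1 = max(t0, min(a, b)), min(t1, max(a, b))
    return t0 <= t1

def traverse(node, origin, direction):
    # Per-ray traversal: the cost paid for every single ray, which is the
    # other thing LocalRay claims to eliminate.
    if not hit_box(node, origin, direction):
        return []
    if node.leaf is not None:
        return node.leaf
    return (traverse(node.children[0], origin, direction)
            + traverse(node.children[1], origin, direction))

scene = BVHNode([Sphere((0, 0, 5), 1), Sphere((2, 0, 8), 1), Sphere((-3, 1, 6), 1)])
print(len(traverse(scene, (0, 0, 0), (0, 0, 1))))  # -> 2 candidate spheres

Every ray pays traverse(), and every big scene edit pays BVHNode() again; Adshir's pitch is that the DAS avoids both costs on an ordinary GPU pipeline.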
 
Last edited:

pawel86ck

Banned
I don't think Xbox Series X will use LocalRay since Microsoft has their own patented version of Ray Tracing.
LocalRay is software RT, so no dedicated HW is needed, and because of that MS could always use it in the future (if it provides better results, of course).
 
Last edited:

Jigsaah

Gold Member
Yup, it's just a proprietary path-finding implementation through their DirectX API, right?
I'm not too knowledgeable about the inner workings of it, but I know they don't have any patented raytracing. AMD does have their own version... but not Microsoft.
 

darkinstinct

...lacks reading comprehension.
This could explain everything. If one of them uses LocalRay (which LocalRay says they do), they would have to reserve a chunk of their CUs for this specific task. That would perfectly fit Cerny's statement that "There is ray-tracing acceleration in the GPU hardware". It's not hardware-accelerated raytracing the way Microsoft describes their solution; it's literally a form of acceleration in the hardware (the shaders, specifically). So PS5 might have 12 TF as well, but they reserve a chunk of it for raytracing and end up at 9.2 TF plus raytracing. XSX uses dedicated cores (which apparently will also appear in a newly leaked APU for Surface) and has 12 TF plus raytracing. That also explains why there was no mention of raytracing in the AMD test: it's a software solution, and "full chip" is only the part of the APU that is actually available to devs.

So a 48 CU APU running at 2 GHz, with 12 CU reserved for raytracing.
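For what it's worth, the arithmetic in that scenario does line up with the usual AMD formula (TFLOPS = CUs × 64 lanes × 2 ops per lane per clock × clock). A quick check of the hypothetical split above:

# Quick check of the hypothetical split described above, using the standard
# AMD figure of 64 shader lanes per CU and 2 FLOPs per lane per clock (FMA).
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

total     = tflops(48, 2.0)       # full chip:        12.288 TF
available = tflops(48 - 12, 2.0)  # 12 CUs reserved:    9.216 TF
print(f"{total:.3f} TF total, {available:.3f} TF left for everything else")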
 

M1chl

Currently Gif and Meme Champion
Well, I can see this running on smartphones, but I'm not entirely sure why you would want to go back to a software-based solution when raytracing is expensive even with hardware acceleration. I mean, we have smooth video playback now because it uses hardware acceleration; decoding something like 4K video from YouTube/Netflix on today's CPUs alone is something that would take a lot of resources.
 

killatopak

Gold Member
We already know that both next-gen consoles have HW raytracing, so it wouldn't make sense to use a software-based one that hogs resources.
 

BitDust

Member
IMHO RTRT is still a far cry from what it should really deliver. Every demo I look at clearly looks fake and doesn't simulate material reflections well. Full ray tracing took 40 minutes to render compared to the dynamic one at 60 fps, and the difference is pretty clear. Moreover, light absorption still seems out of reach from what I'm seeing; everything just becomes a mirror. I believe it will take from this whole next-gen console cycle to the next one to get the real deal.
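To put a number on that offline-vs-realtime gap, taking the 40 minutes and 60 fps above at face value:

# Back-of-the-envelope: per-frame compute budget, offline vs. real time,
# using the numbers cited above (40 min per frame vs. 60 fps).
offline_s  = 40 * 60    # 2400 s per frame
realtime_s = 1 / 60     # ~0.0167 s per frame
print(f"offline:  {offline_s} s/frame")
print(f"realtime: {realtime_s * 1000:.1f} ms/frame")
print(f"the offline renderer gets ~{offline_s / realtime_s:,.0f}x more time per frame")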
 

darkinstinct

...lacks reading comprehension.
We already know that both next-gen consoles have HW raytracing, so it wouldn't make sense to use a software-based one that hogs resources.

"There is ray-tracing acceleration in the GPU hardware" is not necessarily the same as "There is hardware-accelerated raytracing in the GPU". Not at all. Cerny left the door wide open for a software based solution when he could've just said "We have hardware accelerated raytracing" just like Microsoft does it. That's the common way to talk about it, so it's weird that Cerny doesn't talk about it like that. And running LocalRay on the GPU shaders would very much be raytracing acceleration in the GPU hardware.
 

pottuvoi

Banned
"There is ray-tracing acceleration in the GPU hardware" is not necessarily the same as "There is hardware-accelerated raytracing in the GPU". Not at all. Cerny left the door wide open for a software based solution when he could've just said "We have hardware accelerated raytracing" just like Microsoft does it. That's the common way to talk about it, so it's weird that Cerny doesn't talk about it like that. And running LocalRay on the GPU shaders would very much be raytracing acceleration in the GPU hardware.
Not really.
It would be like running a software rasterization pipeline on GPU hardware.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
"There is ray-tracing acceleration in the GPU hardware" is not necessarily the same as "There is hardware-accelerated raytracing in the GPU". Not at all. Cerny left the door wide open for a software based solution when he could've just said "We have hardware accelerated raytracing" just like Microsoft does it. That's the common way to talk about it, so it's weird that Cerny doesn't talk about it like that. And running LocalRay on the GPU shaders would very much be raytracing acceleration in the GPU hardware.

:LOL: are you seriously trying to stir this shit up again?
 

darkinstinct

...lacks reading comprehension.
So, yesterday brought more evidence that Sony is indeed using Adshir LocalRay. Microsoft showed games running with raytracing, something that, according to the CEO of Adshir, is not yet possible for their industry-leading console customer because the hardware is not finished (he also said they could start talking about it around July). There is one console manufacturer that says they are not ready to talk about their hardware yet: Sony. Despite being (way) ahead in development, they are now behind. Why is that?

Because they built a console without raytracing capabilities and made a last-minute decision to support it? Think about it: if Sony has an 8 TF Ariel as the baseline, they can't just run raytracing via shaders on it; performance tanks. XSX is pulling the equivalent of 25 TF with raytracing enabled. So what do you do? You create a modified design. You raise the clock speeds to get more raw performance (similar to how XSX still uses the GPU to actually calculate the lighting with raytracing), and you add hardware support for that LocalRay raytracing. Maybe another 3 or 4 TF, with its own memory pool. Something like the PS4 Pro GPU cores, but working exclusively on raytracing. And there you have it: a PS5 targeting 12 to 13 TF that does feature raytracing and, from an outsider's perspective, looks very close or even slightly ahead.

It also explains why Microsoft started talking up their raytracing performance in equivalent terms ("if you were using shaders to calculate rays") of 25 TF. Sony would be at 7 to 9.2 TF + 4 TF for raytracing; Microsoft at 12.1 TF + 13 TF for raytracing.

It also explains the cooling costs for the PS5, the devkit design, and the rumors about dual APUs/GPUs. It also explains what one insider called a difficult memory setup for the PS5 (I'd expect 12 GB for Oberon and another 8 GB for the raytracing part) and why Sony is struggling with parts pricing.

I think Oberon ends up at 32 CU @ 1.85 GHz with a PS4 Pro GPU added on top, for a total of 7.6 + 4 TF = 11.6 TF. This also fits the latest supposed PS5 performance test results, with Oberon being in the range of a 5600 XT. And it explains why Pachter says it could be $800: because to him it looks like a next-gen console with a PS4 Pro added on top.
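The arithmetic in that guess roughly holds under the standard CU formula, though it lands a bit above 11.6 TF because the actual PS4 Pro GPU (36 CUs @ 911 MHz) is ~4.2 TF rather than an even 4. To be clear, the Oberon figures here are the rumor being discussed, not confirmed specs:

# Sanity check of the speculated Oberon + PS4-Pro-style add-on above.
# Only the PS4 Pro figures (36 CUs @ 911 MHz) are real, known numbers;
# everything about Oberon is the rumor being discussed.
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

oberon  = tflops(32, 1.85)    # ~7.58 TF, the "7.6" above
ps4_pro = tflops(36, 0.911)   # ~4.20 TF, the actual PS4 Pro GPU
print(f"{oberon:.2f} + {ps4_pro:.2f} = {oberon + ps4_pro:.2f} TF combined")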
 

-kb-

Member
So, yesterday brought more evidence that Sony is indeed using Adshir LocalRay. Microsoft showed games running with raytracing, something that, according to the CEO of Adshir, is not yet possible for their industry-leading console customer because the hardware is not finished (he also said they could start talking about it around July). There is one console manufacturer that says they are not ready to talk about their hardware yet: Sony. Despite being (way) ahead in development, they are now behind. Why is that?

Because they built a console without raytracing capabilities and made a last-minute decision to support it? Think about it: if Sony has an 8 TF Ariel as the baseline, they can't just run raytracing via shaders on it; performance tanks. XSX is pulling the equivalent of 25 TF with raytracing enabled. So what do you do? You create a modified design. You raise the clock speeds to get more raw performance (similar to how XSX still uses the GPU to actually calculate the lighting with raytracing), and you add hardware support for that LocalRay raytracing. Maybe another 3 or 4 TF, with its own memory pool. Something like the PS4 Pro GPU cores, but working exclusively on raytracing. And there you have it: a PS5 targeting 12 to 13 TF that does feature raytracing and, from an outsider's perspective, looks very close or even slightly ahead.

It also explains why Microsoft started talking up their raytracing performance in equivalent terms ("if you were using shaders to calculate rays") of 25 TF. Sony would be at 7 to 9.2 TF + 4 TF for raytracing; Microsoft at 12.1 TF + 13 TF for raytracing.

It also explains the cooling costs for the PS5, the devkit design, and the rumors about dual APUs/GPUs. It also explains what one insider called a difficult memory setup for the PS5 (I'd expect 12 GB for Oberon and another 8 GB for the raytracing part) and why Sony is struggling with parts pricing.

I think Oberon ends up at 32 CU @ 1.85 GHz with a PS4 Pro GPU added on top, for a total of 7.6 + 4 TF = 11.6 TF. This also fits the latest supposed PS5 performance test results, with Oberon being in the range of a 5600 XT. And it explains why Pachter says it could be $800: because to him it looks like a next-gen console with a PS4 Pro added on top.

But we know both consoles are RDNA 2, which includes AMD's raytracing. You're also strongly assuming that Sony's raytracing solution (which I think is just RDNA 2) is weaker / not as performant as AMD's (which is exactly what Microsoft is using); it could be the exact opposite for all we know.
 

Panajev2001a

GAF's Pleasant Genius
So, yesterday brought more evidence that Sony is indeed using Adshir LocalRay. Microsoft showed games running with raytracing, something that, according to the CEO of Adshir, is not yet possible for their industry-leading console customer because the hardware is not finished (he also said they could start talking about it around July). There is one console manufacturer that says they are not ready to talk about their hardware yet: Sony. Despite being (way) ahead in development, they are now behind. Why is that?

Because they built a console without raytracing capabilities and made a last-minute decision to support it? Think about it: if Sony has an 8 TF Ariel as the baseline, they can't just run raytracing via shaders on it; performance tanks. XSX is pulling the equivalent of 25 TF with raytracing enabled. So what do you do? You create a modified design. You raise the clock speeds to get more raw performance (similar to how XSX still uses the GPU to actually calculate the lighting with raytracing), and you add hardware support for that LocalRay raytracing. Maybe another 3 or 4 TF, with its own memory pool. Something like the PS4 Pro GPU cores, but working exclusively on raytracing. And there you have it: a PS5 targeting 12 to 13 TF that does feature raytracing and, from an outsider's perspective, looks very close or even slightly ahead.

It also explains why Microsoft started talking up their raytracing performance in equivalent terms ("if you were using shaders to calculate rays") of 25 TF. Sony would be at 7 to 9.2 TF + 4 TF for raytracing; Microsoft at 12.1 TF + 13 TF for raytracing.

It also explains the cooling costs for the PS5, the devkit design, and the rumors about dual APUs/GPUs. It also explains what one insider called a difficult memory setup for the PS5 (I'd expect 12 GB for Oberon and another 8 GB for the raytracing part) and why Sony is struggling with parts pricing.

I think Oberon ends up at 32 CU @ 1.85 GHz with a PS4 Pro GPU added on top, for a total of 7.6 + 4 TF = 11.6 TF. This also fits the latest supposed PS5 performance test results, with Oberon being in the range of a 5600 XT. And it explains why Pachter says it could be $800: because to him it looks like a next-gen console with a PS4 Pro added on top.

... and you were complaining about concern trolling in the XSX 16 GB of RAM thread :LOL: ... not self-aware much?!
 
This could explain everything. If one of them uses LocalRay (which LocalRay says they do), they would have to reserve a chunk of their CUs for this specific task. That would perfectly fit Cerny's statement that "There is ray-tracing acceleration in the GPU hardware". It's not hardware-accelerated raytracing the way Microsoft describes their solution; it's literally a form of acceleration in the hardware (the shaders, specifically). So PS5 might have 12 TF as well, but they reserve a chunk of it for raytracing and end up at 9.2 TF plus raytracing. XSX uses dedicated cores (which apparently will also appear in a newly leaked APU for Surface) and has 12 TF plus raytracing. That also explains why there was no mention of raytracing in the AMD test: it's a software solution, and "full chip" is only the part of the APU that is actually available to devs.

So a 48 CU APU running at 2 GHz, with 12 CU reserved for raytracing.
This post gave me Coronavirus. Jesus.
 