
I don't remember shader compilation being a thing for 7th gen and before, what happened?

SimTourist

Member
This seems like a new thing that appeared at some point: PC bros complaining about shader compilation stutter and long-ass shader compiling sequences at startup. I don't remember this being a thing at all until the PS4/Xbox One generation. Why did it suddenly become a big problem?
 

Drew1440

Member
Was it something to do with shaders being fixed-function up until the 8th generation? I know the 360 had a unified shader model for its GPU, but the PS3 didn't and had to rely on the Cell CPU to compensate.
 
IIRC DirectX 12 changed how shader compilation worked, giving devs explicit control over when pipelines get compiled. Before that it mostly happened under the hood in the driver. That's good and can lead to improved performance if the devs have a lot of experience and really know what they're doing, but it's also kind of bad if they don't. Lots of ways to fuck up and end up with something suboptimal.
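To make that concrete, here's a toy sketch of the difference (plain Python, not real D3D code — the class names and the fake compile step are made up for illustration). Under the old model the driver compiles a pipeline the first time you draw with it, which is where the mid-gameplay hitch comes from; with explicit control the app can pay that whole cost up front during a load screen:

```python
def compile_shader(name):
    """Stand-in for the expensive driver compile that causes the hitch."""
    return f"compiled:{name}"

class LazyCache:
    """DX11-style: the driver compiles a pipeline on first use (mid-frame)."""
    def __init__(self):
        self.cache = {}
        self.compiles_during_gameplay = 0

    def draw(self, shader):
        if shader not in self.cache:          # first use -> compile -> hitch
            self.cache[shader] = compile_shader(shader)
            self.compiles_during_gameplay += 1
        return self.cache[shader]

class PrecompiledCache:
    """DX12-style: the app builds every pipeline up front at load time."""
    def __init__(self, shaders):
        self.cache = {s: compile_shader(s) for s in shaders}

    def draw(self, shader):
        return self.cache[shader]             # always a cache hit, no hitch

shaders = ["opaque", "skinned", "particles"]

lazy = LazyCache()
for s in shaders + shaders:                   # two "frames" hitting each shader
    lazy.draw(s)
print(lazy.compiles_during_gameplay)          # 3 hitches, all during gameplay

pre = PrecompiledCache(shaders)               # all cost paid before play starts
for s in shaders:
    pre.draw(s)
print(len(pre.cache))                         # 3 pipelines, zero hitches
```

The catch, as noted above, is that with explicit control the app has to know the full set of pipelines ahead of time — miss one and you're back to compiling mid-frame.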
 

Guilty_AI

Member
It was. I remember UE4 DX11 games having it. People just didn't care because shader comp stutter is innocuous most of the time and goes away after playing the game for 3 minutes or so.

Most of the stutter issues I see today aren't even shader comp related; it's just that people took to calling any and every type of stutter "shader compilation", and DF started making sure to highlight it in each and every video they made, no matter how imperceptible it would be otherwise.
 
I remember the first time I tried running my old Wii games on Dolphin and saw the shader compilation stutters, thinking that was really bad. I couldn't have imagined that official PC game releases would suffer from the same thing in the future, let alone Unreal Engine for two generations running.
 