Monday, July 22, 2019

Customizing the Lit Shader in Unity's Lightweight Render Pipeline

Unity used to have a "Surface Shader" system by which you could write a function that would specify certain material properties based on various textures, slider settings, etc., and then Unity would compile the numerous shaders needed to support various functions like shadows, deferred rendering, etc. without you having to manually write each one. It also let you write custom lighting functions for that Surface Shader framework.

Surface Shaders were part of the original Unity built-in render pipeline. But the new Scriptable Render Pipelines, namely the High Definition RP and the Lightweight RP, don't have Surface Shaders. They do have a graphical node-based shader programming facility called Shader Graph. Besides the usual problems with graphical programming paradigms (namely, that a few lines of textual code can balloon into verbose visual spaghetti, with confusing lines connecting many blocks spread out over multiple screens), Shader Graph restricts you to the particular lighting models Unity provides in the form of Master Nodes. Some have tried to work around this by doing lighting calculations in the blocks and feeding the final result into the "emission" input of a master node; however, getting lighting info into the Shader Graph is tough. Another option is to save the HLSL generated by the Shader Graph compiler and modify that; unfortunately, the compiler assigns the links between nodes arbitrary variable names that look like license plate numbers, making the generated HLSL difficult to follow.

These problems with Shader Graph lead to the sad conclusion that if you want to create your own lighting models, you need to copy and modify one of Unity's hand-crafted ubershaders, even if that means duplicating a lot of effort in different shader passes. 

If you want to implement a custom lighting model, you'll be spending a lot of time trying to understand Lighting.hlsl. It contains LightweightFragmentPBR (used by Lit shader) and LightweightFragmentBlinnPhong (used by Simple Lit shader). Lighting.hlsl imports from Common.hlsl, EntityLighting.hlsl, ImageBasedLighting.hlsl, Core.hlsl, and Shadows.hlsl.

We'll also explore dynamic vertex and texture coordinate modifications. These are easily implemented with Shader Graph, but we might as well address them while we're here.

LWRP Lit Shader Passes

Lit.shader consists of four passes: ForwardLit, ShadowCaster, DepthOnly, and Meta; they import LitForwardPass.hlsl, ShadowCasterPass.hlsl, DepthOnlyPass.hlsl, and LitMetaPass.hlsl, respectively. All four passes import from LitInput.hlsl.

Forward Lit Pass: The forward lit pass vertex shader is LitPassVertex and its fragment shader is LitPassFragment. These are defined in LitForwardPass.hlsl, which imports from Lighting.hlsl. LitPassFragment calls InitializeStandardLitSurfaceData (defined in LitInput.hlsl), InitializeInputData (defined in LitForwardPass.hlsl), MixFog (defined in Core.hlsl), and LightweightFragmentPBR (defined in Lighting.hlsl).

Shadow Caster Pass: The shadow caster pass vertex shader is ShadowPassVertex and its fragment shader is ShadowPassFragment. These are defined in ShadowCasterPass.hlsl, which imports from Core.hlsl and Shadows.hlsl. ShadowPassFragment calls SampleAlbedoAlpha and Alpha (defined in SurfaceInput.hlsl). The Alpha function contains a clip command; it looks like ShadowCasterPass uses BaseMap only to the extent that it provides alpha clipping. ShadowPassFragment returns 0, so it doesn't really do anything else interesting.

Depth Only Pass: The depth only pass vertex shader is DepthOnlyVertex, and its fragment shader is DepthOnlyFragment. These are defined in DepthOnlyPass.hlsl, which imports Core.hlsl. DepthOnlyFragment calls SampleAlbedoAlpha and Alpha, which are defined in SurfaceInput.hlsl. The Alpha function contains a clip command; it looks like DepthOnlyPass uses BaseMap only to the extent that it provides alpha clipping (as was also the case with ShadowCasterPass). DepthOnlyFragment returns 0, so it doesn't really do anything else interesting.

Meta Pass: The meta pass vertex shader is LightweightVertexMeta, and its fragment shader is LightweightFragmentMeta. These are defined in LitMetaPass.hlsl, which imports MetaInput.hlsl. LightweightVertexMeta calls MetaVertexPosition, which is defined in MetaInput.hlsl. LightweightFragmentMeta calls MetaFragment (defined in MetaInput.hlsl), InitializeStandardLitSurfaceData (defined in LitInput.hlsl), and InitializeBRDFData (defined in Lighting.hlsl).

If unity_MetaFragmentControl.x is true, MetaFragment returns a (possibly tweaked) albedo value. If unity_MetaFragmentControl.y is true, it returns an emission value. This is confusing, because the comments at the top say "y = return normal" and I don't see where it's computing or storing a normal.


Passes for Other LWRP Shaders

SimpleLit.shader consists of the same four passes: ForwardLit, ShadowCaster, DepthOnly, and Meta; they import SimpleLitForwardPass.hlsl, ShadowCasterPass.hlsl, DepthOnlyPass.hlsl, and SimpleLitMetaPass.hlsl, respectively. Note that the ShadowCaster and DepthOnly passes import the same hlsl files as the Lit shader, whereas the ForwardLit and Meta passes import hlsl files specialized for the SimpleLit shader. All four passes import from SimpleLitInput.hlsl instead of LitInput.hlsl. The SimpleLit specialized versions of these files lack the additional fields and functions needed for a full PBR model, and the forward pass invokes LightweightFragmentBlinnPhong (in Lighting.hlsl) instead of the LightweightFragmentPBR used by Lit.shader.

BakedLit.shader only has three passes: BakedLit, DepthOnly, and Meta.

Unlit.shader only has three passes: Unlit, DepthOnly, and Meta.

I won't explore the BakedLit or Unlit shaders further here; they have even less functionality. I also won't explore the ParticlesLit, ParticlesSimpleLit, ParticlesUnlit, or Terrain shaders.


Dynamic Vertex Modification

Forward Lit Pass: You could insert your vertex, normal, and tangent modification code
just before these lines in LitPassVertex (in LitForwardPass.hlsl), modifying the
positionOS, normalOS, and tangentOS fields in the "input" Attributes structure as desired:

VertexPositionInputs vertexInput = GetVertexPositionInputs(input.positionOS.xyz);
VertexNormalInputs normalInput = GetVertexNormalInputs(input.normalOS, input.tangentOS);
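
For concreteness, here's a minimal sketch of such a modification: a sine-wave displacement along the object-space normal. The _WaveAmplitude and _WaveFrequency properties are hypothetical; you'd have to declare them yourself (e.g., in the material CBUFFER in LitInput.hlsl).

```hlsl
// Hypothetical displacement inserted at the top of LitPassVertex, before the
// GetVertexPositionInputs/GetVertexNormalInputs calls. _WaveAmplitude and
// _WaveFrequency are made-up material properties, not part of the stock shader.
float wave = sin(_Time.y + input.positionOS.x * _WaveFrequency);
input.positionOS.xyz += input.normalOS * (wave * _WaveAmplitude);
// For large deformations you'd also want to adjust input.normalOS; this
// sketch leaves the normal untouched for simplicity.
```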

VertexPositionInputs and VertexNormalInputs are defined in Core.hlsl as:

struct VertexPositionInputs {
    float3 positionWS; // World space position
    float3 positionVS; // View space position
    float4 positionCS; // Homogeneous clip space position
    float4 positionNDC;// Homogeneous normalized device coordinates
};

struct VertexNormalInputs {
    real3 tangentWS;
    real3 bitangentWS;
    float3 normalWS;
};

The associated functions are defined as (note GetVertexNormalInputs is overloaded):

VertexPositionInputs GetVertexPositionInputs(float3 positionOS) {
    VertexPositionInputs input;
    input.positionWS = TransformObjectToWorld(positionOS);
    input.positionVS = TransformWorldToView(input.positionWS);
    input.positionCS = TransformWorldToHClip(input.positionWS);
   
    float4 ndc = input.positionCS * 0.5f;
    input.positionNDC.xy = float2(ndc.x, ndc.y * _ProjectionParams.x) + ndc.w;
    input.positionNDC.zw = input.positionCS.zw;
       
    return input;
}

VertexNormalInputs GetVertexNormalInputs(float3 normalOS) {
    VertexNormalInputs tbn;
    tbn.tangentWS = real3(1.0, 0.0, 0.0);
    tbn.bitangentWS = real3(0.0, 1.0, 0.0);
    tbn.normalWS = TransformObjectToWorldNormal(normalOS);
    return tbn;
}

VertexNormalInputs GetVertexNormalInputs(float3 normalOS, float4 tangentOS) {
    VertexNormalInputs tbn;

    // mikkts space compliant. only normalize when extracting normal at frag.
    real sign = tangentOS.w * GetOddNegativeScale();
    tbn.normalWS = TransformObjectToWorldNormal(normalOS);
    tbn.tangentWS = TransformObjectToWorldDir(tangentOS.xyz);
    tbn.bitangentWS = cross(tbn.normalWS, tbn.tangentWS) * sign;
    return tbn;
}

Shadow Caster Pass: There are two places you could insert your vertex and normal modification code. One possible hack would be to put your code at the top of GetShadowPositionHClip (in ShadowCasterPass.hlsl), before the lines:

float3 positionWS = TransformObjectToWorld(input.positionOS.xyz);
float3 normalWS = TransformObjectToWorldNormal(input.normalOS);

A more elegant and clearer approach would be to modify the positionOS and normalOS fields in the "input" Attributes structure just before this line in ShadowPassVertex:

output.positionCS = GetShadowPositionHClip(input);
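
As a sketch, assuming the same kind of displacement as in the forward pass (with the same hypothetical _WaveAmplitude and _WaveFrequency properties), the relevant part of ShadowPassVertex would become:

```hlsl
// Mirror the forward pass deformation here so shadows match the deformed
// geometry. _WaveAmplitude/_WaveFrequency are hypothetical properties.
float wave = sin(_Time.y + input.positionOS.x * _WaveFrequency);
input.positionOS.xyz += input.normalOS * (wave * _WaveAmplitude);
output.positionCS = GetShadowPositionHClip(input);
```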

Depth Only Pass: You could insert your vertex modification code in DepthOnlyVertex (in DepthOnlyPass.hlsl) just before the line

output.positionCS = TransformObjectToHClip(input.position.xyz);

For clarity, I'd probably write something like:

float3 modifiedPosition = SomeModificationOf(input.position.xyz);
output.positionCS = TransformObjectToHClip(modifiedPosition);

The stock depth only pass doesn't incorporate normals, which are often useful in creating interesting vertex deformations. Perhaps you could just add a "float3 normal : NORMAL;" line to the Attributes structure. I haven't tried this, though.
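
A sketch of that untested idea, assuming the stock Attributes structure in DepthOnlyPass.hlsl looks roughly like the one below:

```hlsl
// Untested sketch: add a normal semantic to DepthOnlyPass.hlsl's Attributes
// so vertex deformation code can read input.normal, as in the other passes.
struct Attributes
{
    float4 position : POSITION;
    float3 normal   : NORMAL;    // added; the stock struct lacks this field
    float2 texcoord : TEXCOORD0;
    UNITY_VERTEX_INPUT_INSTANCE_ID
};
```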

Meta Pass: You shouldn't need to touch the meta pass, since by definition meta passes are only invoked by the lightmapper, and hence are only useful on static geometry.


Dynamic Texture Coordinate Modification

Forward Lit Pass: The best place to modify texture lookups will depend on your particular goals. LitPassFragment has the line:

InitializeStandardLitSurfaceData(input.uv, surfaceData);

You could potentially modify the input.uv argument there, and that would more or less modify every texture lookup accordingly. For more fine-grained control, you could modify InitializeStandardLitSurfaceData (in LitInput.hlsl) itself, adjusting the uv argument as desired in lines like:

half4 albedoAlpha = SampleAlbedoAlpha(uv, TEXTURE2D_ARGS(_BaseMap, sampler_BaseMap));
half4 specGloss = SampleMetallicSpecGloss(uv, albedoAlpha.a);
outSurfaceData.normalTS = SampleNormal(uv, TEXTURE2D_ARGS(_BumpMap, sampler_BumpMap), _BumpScale);
outSurfaceData.occlusion = SampleOcclusion(uv);
outSurfaceData.emission = SampleEmission(uv, _EmissionColor.rgb, TEXTURE2D_ARGS(_EmissionMap, sampler_EmissionMap));

SampleOcclusion and SampleMetallicSpecGloss are defined in LitInput.hlsl. SampleAlbedoAlpha, SampleNormal, and SampleEmission are defined in SurfaceInput.hlsl.
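
For instance, a uniform scrolling-UV effect could be applied once in LitPassFragment, before the call to InitializeStandardLitSurfaceData. The _ScrollSpeed property here is made up for illustration:

```hlsl
// Hypothetical uniform UV scroll affecting every texture lookup at once.
// _ScrollSpeed would be a float2 material property you declare yourself.
float2 uv = input.uv + _Time.y * _ScrollSpeed;
InitializeStandardLitSurfaceData(uv, surfaceData);
```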

Shadow Caster Pass and Depth Only Pass: ShadowPassFragment (in ShadowCasterPass.hlsl) and DepthOnlyFragment (in DepthOnlyPass.hlsl) both contain the line:

Alpha(SampleAlbedoAlpha(input.uv, TEXTURE2D_ARGS(_BaseMap, sampler_BaseMap)).a, _BaseColor, _Cutoff);

You could modify the input.uv argument as desired. You would only need to do this if
you are using a clip test with the alpha channel and want to animate it.
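
A minimal sketch, reusing whatever hypothetical animation property (here, _ScrollSpeed) you apply in the forward pass:

```hlsl
// Only matters when alpha clipping (_ALPHATEST_ON) is active and the clipped
// pattern animates. _ScrollSpeed is a hypothetical property, not stock.
float2 uv = input.uv + _Time.y * _ScrollSpeed;
Alpha(SampleAlbedoAlpha(uv, TEXTURE2D_ARGS(_BaseMap, sampler_BaseMap)).a, _BaseColor, _Cutoff);
```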

Meta Pass: You shouldn't need to touch the meta pass, since by definition meta passes are only invoked by the lightmapper, and hence are only useful on static geometry.


Getting Ready to Customize the Lighting Model

Basically, you'll need to make your own variation of LightweightFragmentPBR, along with variations of the macros and functions it calls, as needed.

Make a new "Shader" folder inside your main Assets folder if you haven't done so already.

Via dragging and dropping, copy Packages/Lightweight RP/Shaders/Lit.shader into your new "Shader" folder. Rename your copy of Lit.shader and change the Shader "Lightweight Render Pipeline/Lit" line at the top of the file as desired; I changed them to MyLit.shader and Shader "My Custom/My Lit" -- but you can use whatever names you want.

Sadly, the copying will break a bunch of filename include references. Search on #include in MyLit.shader, and change the four instances of the line #include "LitInput.hlsl" to

#include "Packages/com.unity.render-pipelines.lightweight/Shaders/LitInput.hlsl"

Change the instance of #include "DepthOnlyPass.hlsl" to

#include "Packages/com.unity.render-pipelines.lightweight/Shaders/DepthOnlyPass.hlsl"

Change the instance of #include "ShadowCasterPass.hlsl" to

#include "Packages/com.unity.render-pipelines.lightweight/Shaders/ShadowCasterPass.hlsl"

Change the instance of #include "LitMetaPass.hlsl" to

#include "Packages/com.unity.render-pipelines.lightweight/Shaders/LitMetaPass.hlsl"
       
Via dragging and dropping, copy Packages/Lightweight RP/Shaders/LitForwardPass.hlsl into your new "Shader" folder. Rename your copy of LitForwardPass.hlsl. I changed it to MyLitForwardPass.hlsl -- but you can use whatever name you want.

In MyLit.shader, change #include "LitForwardPass.hlsl" to #include "MyLitForwardPass.hlsl"

The above assumes that you're not doing any dynamic vertex modification. If you are, you'll need to create custom MyDepthOnlyPass.hlsl and MyShadowCasterPass.hlsl files and change the various #includes appropriately. The above also assumes you're not doing any dynamic texture modifications that include alpha clip tests; these will also necessitate custom MyDepthOnlyPass.hlsl and MyShadowCasterPass.hlsl files.

At this point, you should be able to take a material with the original LWRP Lit shader, switch it to your custom shader, and it should maintain all of its various textures and parameter settings and perform identically.

There's a couple of minor tweaks you may want to make before doing some major shader hacking:

1) To keep the vertex and fragment shader naming conventions consistent, in MyLit.shader, I'd recommend changing

#pragma vertex LitPassVertex
#pragma fragment LitPassFragment

to something like

#pragma vertex MyLitPassVertex
#pragma fragment MyLitPassFragment

and renaming the corresponding functions in MyLitForwardPass.hlsl to match. (Note that you can't rename the ShadowCaster, DepthOnly, or Meta pragmas this way unless you also create custom copies of those pass files, since the stock files define the original function names.)

2) If you want to modify the custom material inspector, make a new "Editor" folder inside your main Assets folder if you haven't done so already. Copy (via drag and drop) Packages/Lightweight RP/Editor/ShaderGUI/Shaders/LitShader.cs and Packages/Lightweight RP/Editor/ShaderGUI/ShadingModels/LitGUI.cs. Change the names of the files to MyLitShader.cs and MyLitGUI.cs (or whatever else you'd like).

In MyLitShader.cs and MyLitGUI.cs, change all the instances of LitGUI to MyLitGUI. In MyLitShader.cs, change LitShader in the class declaration to MyLitShader.

At the bottom of MyLit.shader, change the CustomEditor line to read

CustomEditor "UnityEditor.Rendering.LWRP.ShaderGUI.MyLitShader"

Coming up with a custom material inspector from scratch would be extremely painful, but it's not too taxing to make minor tweaks to an already working custom inspector.


Actually Customizing the Lighting Model

I recommend leaving the original Lighting.hlsl file, as well as the files it includes, intact. You can copy structures, functions, macros, etc. from those files, paste them into your MyLitForwardPass.hlsl, change the names, and then make your modifications; it seems easiest to have everything in one spot. If you wind up developing a lot of different shaders that share various bits of custom lighting code, you might want to break those out into a separate MyLighting.hlsl include file, or something like that.

Digging into MyLitPassFragment in MyLitForwardPass.hlsl, you'll want to drill down into InitializeStandardLitSurfaceData (defined in LitInput.hlsl), InitializeInputData (defined in MyLitForwardPass.hlsl), MixFog (defined in Core.hlsl), and LightweightFragmentPBR (defined in Lighting.hlsl).

MixFog (defined in Core.hlsl) is a one-line function that calls MixFogColor. It interpolates between the color computed by the shader and unity_FogColor.rgb according to fogFactor and some defines (FOG_LINEAR, FOG_EXP, FOG_EXP2). If you want to inject more complicated fog models, this would be the place to do it; otherwise, you probably don't need to change this. (That said, I can't figure out where to turn on fog in the LWRP anyway! I'll save that for another day.)
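
As a sketch of what injecting a more complicated fog model might look like, here's a hypothetical height-attenuated variant you could call from your fragment shader in place of MixFog. The _HeightFogOffset and _HeightFogDensity properties are invented for illustration:

```hlsl
// Hypothetical height fog: full fog at low altitudes, fading out with height.
// _HeightFogOffset and _HeightFogDensity are made-up properties; MixFogColor
// and unity_FogColor are the stock Unity fog facilities.
half3 MixHeightFog(half3 fragColor, float3 positionWS, half fogFactor)
{
    half heightAtten = exp(-max(0.0, positionWS.y - _HeightFogOffset) * _HeightFogDensity);
    half3 fogged = MixFogColor(fragColor, unity_FogColor.rgb, fogFactor);
    return lerp(fragColor, fogged, heightAtten);
}
```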

InitializeStandardLitSurfaceData reads from various textures to populate a SurfaceData structure. If you want dynamic texture coordinate modifications that are particular to each texture, this would be a good place to do it. If you want to apply the same texture coordinate modification to everything, you might want to do it in the call to InitializeStandardLitSurfaceData (in MyLitPassFragment) instead, and leave the InitializeStandardLitSurfaceData function itself alone.

SurfaceData is defined in SurfaceInput.hlsl as:

struct SurfaceData {
    half3 albedo;
    half3 specular;
    half  metallic;
    half  smoothness;
    half3 normalTS;
    half3 emission;
    half  occlusion;
    half  alpha;
};

This definition is preceded by the comment "Must match Lightweight ShaderGraph master node." This is interesting, because it represents a connection point between the LWRP ShaderGraph framework and the primary hand-optimized, heavy-on-the-multicompile LWRP Lit shader. SurfaceData is quite reminiscent of the SurfaceOutput, SurfaceOutputStandard, and SurfaceOutputStandardSpecular structures of Unity's old text-based Surface Shader system, which would make InitializeStandardLitSurfaceData analogous to the Surface Shader's surface functions.

InitializeInputData populates an InputData structure (defined in Input.hlsl) containing various quantities needed for lighting calculations:

struct InputData {
    float3  positionWS;
    half3   normalWS;
    half3   viewDirectionWS;
    float4  shadowCoord;
    half    fogCoord;
    half3   vertexLighting;
    half3   bakedGI;
};


LightweightFragmentPBR

Now we're at the main guts of MyLitPassFragment, namely the call to LightweightFragmentPBR, which is found in Lighting.hlsl. It first calls InitializeBRDFData (in Lighting.hlsl), which pre-computes a bunch of values and stores them in a BRDFData struct:

struct BRDFData {
    half3 diffuse;
    half3 specular;
    half perceptualRoughness;
    half roughness;
    half roughness2;
    half grazingTerm;

    // We save some light invariant BRDF terms so we don't have to recompute
    // them in the light loop. Take a look at DirectBRDF function for detailed explaination.
    half normalizationTerm;     // roughness * 4.0 + 2.0
    half roughness2MinusOne;    // roughness² - 1.0
};

InitializeBRDFData sets the grazingTerm to saturate(smoothness + reflectivity); I confess I have no idea where that comes from, though the same expression appears in the built-in pipeline's UnityStandardBRDF.cginc.

The lighting calculation goes through four steps:

1) It adds the effect of global illumination (via a call to GlobalIllumination, found in Lighting.hlsl) and the effect of the "main light" (via a call to LightingPhysicallyBased, also found in Lighting.hlsl). In the process, it calls MixRealTimeAndBakedGI, which calls SubtractDirectMainLightFromLightmap (both defined in Lighting.hlsl); this implements an interesting but really complicated hack that I'm not going to explore further here.

2) It loops through the remaining lights, calling LightingPhysicallyBased on each one.

3) Unity's LWRP and built-in pipeline have a fairly complicated light hierarchy, in which up to four "unimportant lights" can be calculated with per-vertex lighting. It adds the results of those unimportant lights via a vertexLighting field in the inputData structure. This is assigned by the line

inputData.vertexLighting = input.fogFactorAndVertexLight.yzw;

in InitializeInputData (found in MyLitForwardPass.hlsl).

fogFactorAndVertexLight, which is part of the Varyings structure that connects the vertex shader and the fragment shader through interpolation, is assigned in MyLitPassVertex, which calls VertexLighting, which in turn calls LightingLambert (both are defined in Lighting.hlsl).

4) It adds the emissive color.


Global Illumination

The GlobalIllumination function is short enough that I'll include the code here:

half3 GlobalIllumination(BRDFData brdfData, half3 bakedGI, half occlusion, half3 normalWS, half3 viewDirectionWS) {
    half3 reflectVector = reflect(-viewDirectionWS, normalWS);
    half fresnelTerm = Pow4(1.0 - saturate(dot(normalWS, viewDirectionWS)));

    half3 indirectDiffuse = bakedGI * occlusion;
    half3 indirectSpecular = GlossyEnvironmentReflection(reflectVector, brdfData.perceptualRoughness, occlusion);

    return EnvironmentBRDF(brdfData, indirectDiffuse, indirectSpecular, fresnelTerm);
}

Here, the fresnelTerm uses dot(normalWS, viewDirectionWS), which is customary for indirect light, in contrast with direct light, which uses the dot product of the half vector with the view vector (or, equivalently, of the half vector with the light vector).

GlobalIllumination is called from this line in LightweightFragmentPBR (in Lighting.hlsl):

half3 color = GlobalIllumination(brdfData, inputData.bakedGI, occlusion, inputData.normalWS, inputData.viewDirectionWS);

In InitializeInputData (in MyLitForwardPass.hlsl), we find the line:

inputData.bakedGI = SAMPLE_GI(input.lightmapUV, input.vertexSH, inputData.normalWS);

SAMPLE_GI is a macro that invokes SampleLightmap on objects with lightmaps and SampleSHPixel (for spherical harmonic lighting) on objects without lightmaps.

GlossyEnvironmentReflection does a lookup into a cube map, choosing a mip level appropriate to the roughness.

EnvironmentBRDF is short enough that I will include it here:

half3 EnvironmentBRDF(BRDFData brdfData, half3 indirectDiffuse, half3 indirectSpecular, half fresnelTerm) {
    half3 c = indirectDiffuse * brdfData.diffuse;
    float surfaceReduction = 1.0 / (brdfData.roughness2 + 1.0);
    c += surfaceReduction * indirectSpecular * lerp(brdfData.specular, brdfData.grazingTerm, fresnelTerm);
    return c;
}

I have no idea where the surfaceReduction term comes from.

Direct Illumination

LightingPhysicallyBased (in Lighting.hlsl) has two overloaded variations:

half3 LightingPhysicallyBased(BRDFData brdfData, half3 lightColor, half3 lightDirectionWS, half lightAttenuation, half3 normalWS, half3 viewDirectionWS) {
    half NdotL = saturate(dot(normalWS, lightDirectionWS));
    half3 radiance = lightColor * (lightAttenuation * NdotL);
    return DirectBDRF(brdfData, normalWS, lightDirectionWS, viewDirectionWS) * radiance;
}

half3 LightingPhysicallyBased(BRDFData brdfData, Light light, half3 normalWS, half3 viewDirectionWS) {
    return LightingPhysicallyBased(brdfData, light.color, light.direction, light.distanceAttenuation * light.shadowAttenuation, normalWS, viewDirectionWS);
}

It looks like most of the work is done in the curiously misspelled DirectBDRF (it should be DirectBRDF), which implements an approximate Cook-Torrance model.


Where to Start Putting In a Custom Lighting Model

LightingPhysicallyBased handles direct lights; that's the easiest place to put in modified diffuse models (Oren-Nayar, Minnaert, Disney Diffuse, etc.), modified specular models, or completely weird lighting models. A good exercise would be to take the optimized-for-mobile LWRP Lit model and replace as much of it as you can with the HDRP Lit model.
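
As a concrete starting point, here's a sketch of a copy of LightingPhysicallyBased with the Lambertian NdotL swapped for a half-Lambert (wrap) diffuse term. The function name is mine; everything else follows the stock code:

```hlsl
// Sketch: place in MyLitForwardPass.hlsl and call it from your copy of
// LightweightFragmentPBR in place of LightingPhysicallyBased. The only
// change from the stock function is the wrapped diffuse term.
half3 MyLightingWrapDiffuse(BRDFData brdfData, half3 lightColor, half3 lightDirectionWS,
                            half lightAttenuation, half3 normalWS, half3 viewDirectionWS)
{
    half NdotL = dot(normalWS, lightDirectionWS);
    half wrapped = saturate(NdotL * 0.5 + 0.5); // half-Lambert: softer terminator
    half3 radiance = lightColor * (lightAttenuation * wrapped);
    return DirectBDRF(brdfData, normalWS, lightDirectionWS, viewDirectionWS) * radiance;
}
```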

Weird lighting models are less likely to make sense with any of the global illumination facilities like lightmaps, reflection probes, and light probes. But it's possible that you could partially incorporate modified diffuse and modified specular models. Unity's lightmappers are largely black boxes, so it would be difficult to calculate lightmaps under anything except the usual Lambertian lighting assumption. Similarly, Unity computes the convolved mipmaps for reflection probes using its particular specular BRDF assumptions. One could potentially explore ImageBasedLighting.hlsl and try creating convolved mipmaps for other functions, but I expect that it would be a ton of work, and few people would be able to tell the difference for most conventional specular models.

One could try replacing the use of the spherical harmonic light probes with reflection probes whose mipmaps are crafted for diffuse lighting instead of specular lighting. If you set Texture Shape in the texture inspector to Cube, a Convolution Type dropdown menu appears with the options Specular (Glossy Reflections) and Diffuse (Irradiance). The Diffuse option results in much more blurring per mipmap level, as expected. Interestingly, that option doesn't seem to be used anywhere in stock Unity.
