DISCLAIMER: This page is written by Unity users and is built around their knowledge. Some users may be more knowledgeable than others, so the information contained within may not be entirely complete or accurate.
Structures
Unity's shaders use structures to pass information down the rendering pipeline.
Application to Vertex Shader Structure (appdata)
The first structure passes raw information about the geometry being rendered to the vertex shader. To create your own appdata structure, it must conform to the format and limitations outlined below. See the example sketch and the built-in appdata structures below.
Format
Structure:
struct [Name]
{
[One or more Fields]
};
Field:
[Type] [Name] : [Tag];
Acceptable Fields
The following fields can be written in the structure in any order.
Type(s) | Name | Tag | Description | Notes |
float4 | vertex | POSITION | The position of the vertex in local space (model space). | |
float3 | normal | NORMAL | The normal of the vertex in model space. | |
float4 | texcoord | TEXCOORD0 | The first UV coordinate of the vertex. | The mesh being rendered must have at least one set of texture coordinates. The third and fourth floats in the vector represent a third UV dimension and a scale factor, and are rarely if ever used. |
float4 | texcoord1 | TEXCOORD1 | A second set of UV coordinates for the vertex. | Only two sets are supported. Always present, but often not used. |
float4 | tangent | TANGENT | The tangent vector of the vertex in model space (used for normal mapping). | All meshes have either calculated or imported tangents. |
float4 | color | COLOR | The color value of this vertex specifically. | The mesh must have colors defined, otherwise they default to <TODO>. |
<TODO: Complete this list>
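As an illustration, here is a minimal sketch of a custom appdata structure for a shader that only needs positions, normals and one UV set. The structure name appdata_custom is an arbitrary example, not a built-in name.
struct appdata_custom
{
float4 vertex : POSITION; // The vertex position in model space.
float3 normal : NORMAL; // The vertex normal in model space.
float4 texcoord : TEXCOORD0; // The first UV coordinate.
};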
Built-In
The base structure; it contains the smallest amount of data that most shaders will use.
struct appdata_base
{
float4 vertex : POSITION; // The vertex position in model space.
float3 normal : NORMAL; // The vertex normal in model space.
float4 texcoord : TEXCOORD0; // The first UV coordinate.
};
Tangents included - tangents are used to rotate the normals of normal maps when the normal maps themselves are rotated. Use this structure if you wish to intervene in that calculation and manipulate the tangents yourself. If you do not want to manipulate tangents, you may use the base structure instead, since they will be calculated anyway. <TODO: Determine if the tangents need to be transformed if they are manipulated here, or if they can be left in model space.>
struct appdata_tan
{
float4 vertex : POSITION; // The vertex position in model space.
float3 normal : NORMAL; // The vertex normal in model space.
float4 texcoord : TEXCOORD0; // The first UV coordinate.
float4 tangent : TANGENT; // The tangent vector in model space (used for normal mapping).
};
This structure contains all the fields that can be derived from a mesh about to be rendered.
struct appdata_full
{
float4 vertex : POSITION; // The vertex position in model space.
float3 normal : NORMAL; // The vertex normal in model space.
float4 texcoord : TEXCOORD0; // The first UV coordinate.
float4 texcoord1 : TEXCOORD1; // The second UV coordinate.
float4 tangent : TANGENT; // The tangent vector in model space (used for normal mapping).
float4 color : COLOR; // Per-vertex color
};
Vertex Shader to Fragment Shader Structure (v2f)
The second structure contains information generated by the vertex shader and passed to the fragment shader. The vertex shader calculates and returns these values on a per-vertex basis. An interpolator then calculates the same values on a per-pixel basis as the connected polygons are rasterized, and the interpolated values are used by the fragment shader. To create your own v2f structure, it must conform to the format and limitations outlined below. See the example sketch and the built-in v2f structures below.
Format
Structure:
struct [Name]
{
[One or more Fields]
};
Field:
[Type] [Name] : [Tag];
or
[Type] [Name];
Acceptable Fields
The following fields can be written in the structure in any order.
Type(s) | Name | Tag | Description | Notes |
float4 | pos | SV_POSITION | The position of the vertex after being transformed into projection space. | Structure must contain exactly one field tagged SV_POSITION. |
float3 | | NORMAL | The normal of the vertex after being transformed into view space. | Structure must contain exactly one field tagged with NORMAL if the subsequent surface or fragment shader uses normals. <TODO: Verify> |
float4 | uv | TEXCOORD0 | First texture coordinate, or UV. | |
float4 | | TEXCOORD1 | Second texture coordinate, or UV. | Currently only two UV coordinates are supported per vertex, but you may bypass this by defining custom fields which act as additional UV coordinates. |
float4 | | TANGENT | Tangents are used to correct normal maps when they are viewed from different angles. Normal maps have incorrect values when they are rotated without processing from tangents. | |
float4, fixed4 | diff | COLOR0 | Vertex color, interpolated across the triangle. This value could correspond to anything depending on how the fragment shader interprets it. | |
float4, fixed4 | spec | COLOR1 | Vertex color, interpolated across the triangle. This value could correspond to anything depending on how the fragment shader interprets it. | |
Any | Any | (none) | User-defined fields which can be assigned any value. | Custom fields can have any type and any name, but may not have a tag. The upper limit on the number of custom fields is not known. <TODO: Research> |
<TODO: Complete this list>
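As an illustration, here is a hedged sketch of a custom v2f structure together with the vertex and fragment functions that fill and consume it. The names v2f_custom, vert, frag and _MainTex are placeholders chosen for this example; appdata_base is defined in UnityCG.cginc (assumed to be included) and UNITY_MATRIX_MVP is Unity's built-in model-view-projection matrix.
sampler2D _MainTex;
struct v2f_custom
{
float4 pos : SV_POSITION; // Required: position in projection space.
float2 uv : TEXCOORD0; // First UV set, interpolated per pixel.
fixed4 color : COLOR0; // Per-vertex color, interpolated per pixel.
};
v2f_custom vert (appdata_base v)
{
v2f_custom o;
o.pos = mul(UNITY_MATRIX_MVP, v.vertex); // Transform from model space to projection space.
o.uv = v.texcoord.xy;
o.color = fixed4(1, 1, 1, 1); // Constant white here; a real shader would compute this per vertex.
return o;
}
fixed4 frag (v2f_custom i) : COLOR
{
return tex2D(_MainTex, i.uv) * i.color; // Use the interpolated values.
}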
Built-In
This structure is designed specifically for implementing image effects. See also vert_img.
struct v2f_img
{
float4 pos : SV_POSITION; // The vertex position in projection space.
half2 uv : TEXCOORD0; // The first UV coordinate.
};
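A hedged sketch of how v2f_img is typically paired with the built-in vert_img vertex function in an image effect, inside a Pass's CGPROGRAM block. The fragment function name frag and the color-inversion effect are arbitrary examples; _MainTex is assumed to be declared in the shader's Properties block.
#pragma vertex vert_img // Built-in vertex function that fills a v2f_img.
#pragma fragment frag
#include "UnityCG.cginc"
sampler2D _MainTex;
fixed4 frag (v2f_img i) : COLOR
{
fixed4 col = tex2D(_MainTex, i.uv); // Sample the source image.
return fixed4(1 - col.rgb, col.a); // Example effect: invert the colors.
}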
struct v2f_vertex_lit
{
float2 uv : TEXCOORD0; // The first UV coordinate.
fixed4 diff : COLOR0; // Diffuse vertex color.
fixed4 spec : COLOR1; // Specular vertex color.
};
<TODO: Complete list.>
Surface/Fragment Shader to Lighting Shader Structure (SurfaceOutput)
The third and final structure contains pixel values returned by either a surface or fragment shader. They are read as input to a lighting shader (such as Lambert, BlinnPhong or a custom lighting model) which then returns a single RGBA color value.
Format
Structure:
struct [Name]
{
[One or more Fields]
};
Field:
[Type] [Name];
Note that tags are not used in SurfaceOutput structures.
Acceptable Fields
The following fields can be written in the structure in any order.
Type(s) | Name | Description | Notes |
float3, fixed3, half3 | Albedo | The reflectance color and intensity of diffuse lighting. Diffuse lighting approximates the appearance of rough surfaces. Diffuse lighting calculations are multiplied with this value. | |
float3, fixed3, half3 | Normal | The direction the surface is facing (the surface normal). When written from a surface shader this is normally a tangent-space normal, such as one unpacked from a normal map. | |
float3, fixed3, half3 | Emission | The color and intensity of emissive lighting. The emissive color will appear even in a completely black scene with no lights. Emissive lighting appears as though light is being created from the surface itself and is generally the most apparent in the absence of light. Glow-in-the-dark objects and computer instruments are examples of surfaces which might use emissive lighting. | |
float, fixed, half | Specular | The sharpness of specular highlights. Specular lighting approximates the appearance of shiny surfaces. The range is 0-1; the built-in BlinnPhong lighting model multiplies this value by 128 to produce the specular exponent, so higher values give smaller, sharper highlights. | |
float, fixed, half | Gloss | The intensity of specular highlights. Specular lighting calculations are multiplied with this value. | |
float, fixed, half | Alpha | Used for transparency, if render states are set up for alpha blending and the lighting shader interprets it as transparency (and it does by default). | |
Any | Any | User-defined fields which can be assigned any value. | Custom fields can have any type and any name, but may not have a tag. The upper limit on the number of custom fields is not known. <TODO: Research> |
<TODO: Can different types be used other than the ones listed? Example, Albedo using float4 instead of float3.>
Built-In
Default structure; it must be used unless you have implemented your own custom lighting shader. A sketch of a custom lighting function follows the structure below.
struct SurfaceOutput
{
half3 Albedo; // Diffuse reflectance color.
half3 Normal; // Surface normal.
half3 Emission; // Emissive color.
half Specular; // Sharpness of the specular highlight (0-1).
half Gloss; // Intensity of the specular highlight.
half Alpha; // Transparency.
};
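As a hedged sketch of how a lighting function consumes this structure, here is a simple wrapped-Lambert custom lighting model. The name WrapLambert is an example (it would be selected with #pragma surface surf WrapLambert); _LightColor0 is Unity's built-in color of the current light.
half4 LightingWrapLambert (SurfaceOutput s, half3 lightDir, half atten)
{
half NdotL = dot(s.Normal, lightDir); // Basic Lambert term.
half diff = NdotL * 0.5 + 0.5; // "Wrap" the term so faces pointing away from the light are not fully black.
half4 c;
c.rgb = s.Albedo * _LightColor0.rgb * (diff * atten * 2); // The factor of 2 mirrors Unity's built-in lighting functions.
c.a = s.Alpha;
return c;
}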
Surface Shader input structure (Input)
The input structure Input generally holds any texture coordinates needed by the shader. Texture coordinates must be named "uv" followed by the texture name (or start with "uv2" to use the second texture coordinate set).
Additional values that can be put into the Input structure (a sketch follows this list):
- float3 viewDir - will contain view direction, for computing Parallax effects, rim lighting etc.
- float4 with COLOR semantic - will contain interpolated per-vertex color.
- float4 screenPos - will contain screen space position for reflection effects. Used by WetStreet shader in Dark Unity for example.
- float3 worldPos - will contain world space position.
- float3 worldRefl - will contain world reflection vector if surface shader does not write to o.Normal. See Reflect-Diffuse shader for example.
- float3 worldNormal - will contain world normal vector if surface shader does not write to o.Normal.
- float3 worldRefl; INTERNAL_DATA - will contain world reflection vector if surface shader writes to o.Normal. To get the reflection vector based on per-pixel normal map, use WorldReflectionVector (IN, o.Normal). See Reflect-Bumped shader for example.
- float3 worldNormal; INTERNAL_DATA - will contain world normal vector if surface shader writes to o.Normal. To get the normal vector based on per-pixel normal map, use WorldNormalVector (IN, o.Normal).
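A hedged sketch of an Input structure and matching surf function for a surface shader that samples a texture named _MainTex and uses viewDir for a simple rim term; the property name and the rim calculation are illustrative assumptions.
sampler2D _MainTex;
struct Input
{
float2 uv_MainTex; // UVs for _MainTex, following the "uv" + texture name convention described above.
float3 viewDir; // Filled in by Unity with the per-pixel view direction.
};
void surf (Input IN, inout SurfaceOutput o)
{
fixed4 c = tex2D(_MainTex, IN.uv_MainTex);
half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal)); // Stronger at grazing angles.
o.Albedo = c.rgb;
o.Emission = rim * 0.5; // Example use of the rim term as a faint glow.
o.Alpha = c.a;
}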
Functions
ShaderLab comes packaged with built-in, or "intrinsic", functions. Many of them are based on the intrinsic functions provided by shading languages such as Cg, GLSL and HLSL, while others are unique to ShaderLab.
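As a hedged illustration, the short helper function below uses a few of the intrinsics commonly available in Unity shaders (saturate, lerp and dot); the function name TintTowards and its weighting values are arbitrary examples.
half3 TintTowards (half3 baseColor, half3 targetColor, half amount)
{
half t = saturate(amount); // saturate() clamps a value to the 0-1 range.
half3 mixed = lerp(baseColor, targetColor, t); // lerp() blends linearly between two values.
half luma = dot(mixed, half3(0.3, 0.59, 0.11)); // dot() used here as a rough luminance estimate.
return lerp(mixed, half3(luma, luma, luma), 0.5); // Desaturate the result halfway, as a further example.
}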
<TODO: Complete this section.>
Preprocessor Directives
Preprocessor directives are special statements which tell the compiler how to handle the code. They are similar in role to tags and render states. Below is a list of the different directives; a usage sketch follows the table:
Preprocessor Directive | Option | Argument | Description | Notes |
#include | | (Path of the file name in quotation marks) | Includes code written in another file with the extension .cginc. For example, #include "UnityCG.cginc" is commonly used and contains several helper functions. You may write your own CGINC files as well. See Built-In CGINC files for a list of include files already provided by Unity. | |
#pragma | target | 2.0 or default | Compiles the shader under shader model 2. Model 2 has more limitations than 3 but is more compatible. Uses shader model 1.1 for vertex shaders. | Vertex: 128 instruction limit. Fragment: 96 instruction limit (32 texture + 64 arithmetic), 16 temporary registers and 4 texture indirections. |
3.0 | Compiles the shader under shader model 3. Model 3 is more powerful and flexible than 2 but is less compatible. | Vertex: no instruction limit. Fragment: 1024 instruction limit (512 texture + 512 arithmetic), 32 temporary registers and 4 texture indirections. It is possible to override these limits using the #pragma profileoption directive. For example, #pragma profileoption MaxTexIndirections=256 raises the texture indirections limit to 256. See #pragma profileoption for more information. Note that some shader model 3.0 features, like derivative instructions, aren't supported by vertex or fragment shaders. You can use #pragma glsl to translate to GLSL instead, which has fewer restrictions. See #pragma glsl for more information. |
surface | (Name of surface shader) | Tells the compiler which function is meant to be used as a surface shader. | When writing custom surface shaders, this option MUST be written, and it MUST be written first. |
(Name of lighting shader, minus the "Lighting" prefix. For example, "LightingCookTorrance()" would be supplied here as "CookTorrance".) | Tells the compiler which function is meant to be used as a lighting model. See Built-In lighting models for a list of lighting models already provided by Unity. You may write your own lighting models as well. | When writing custom surface shaders, this option MUST be written, and it MUST be written second. |
alphaTest:(Property Name) | Similar to the render state AlphaTest, except it only culls alpha values less than or equal to the provided value. | Both the alpha test render state and the preprocessor directive can be used together, although their interaction is unclear. Unlike AlphaTest, you may only use properties (like _Cutoff) and not constants (like 0.5). |
vertex | (Name of vertex shader) | Tells the compiler which function is meant to be used as a vertex shader. | When writing custom vertex or fragment shaders with #pragma target 3.0, the compiler MUST know which vertex shader to use. You may provide your own vertex shader, but if not you may use one of Unity's built-in vertex shaders. |
fragment | (Name of fragment shader) | Tells the compiler which function is meant to be used as a fragment shader. | When writing custom vertex or fragment shaders with #pragma target 3.0, this directive MUST be written. This option MUST be written first with any shader model. You may provide your own fragment shader, but if not you may use one of Unity's built-in fragment shaders. |
fragmentoption | <TODO: Add option here.> | <TODO: Add description here.> | <TODO: Add option notes here.> | This directive has no effect on vertex programs or programs that are compiled to non-OpenGL targets. |
<TODO: Add option here.> | <TODO: Add description here.> | <TODO: Add option notes here.> |
only_renderers | | | |
exclude_renderers | | | |
glsl | | | |
profileoption | | | |
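As a hedged example of how several of these directives are typically combined, the two skeletons below show a surface shader and a vertex/fragment shader; the function names surf, vert and frag are placeholders.
// Surface shader form (the CGPROGRAM block sits directly inside a SubShader):
CGPROGRAM
#pragma surface surf Lambert // "surf" is the surface function, "Lambert" the lighting model.
#pragma target 3.0 // Optional: compile against shader model 3.
// ... Input structure and surf function go here ...
ENDCG
// Vertex/fragment form (the CGPROGRAM block sits inside a Pass):
CGPROGRAM
#pragma vertex vert // "vert" is the vertex function.
#pragma fragment frag // "frag" is the fragment function.
#include "UnityCG.cginc" // Built-in helper functions and structures.
// ... vert and frag functions go here ...
ENDCG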