Shading

Shading links many of the components of 3D rendering together. After surface intersections have been found along viewing rays, a colour needs to be computed based on the geometry’s material at the intersection and the lights in the scene. Shading is the process that links materials and lights, solving or approximating the rendering equation. While this is where the name comes from, the term is often generalized to cover any computation of colour in a rendered image, which is not always a physically based operation.
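
For reference, the rendering equation being solved or approximated is usually written in the following hemispherical form (standard notation, not anything defined in this article):

$$L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o) + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\, L_i(\mathbf{x}, \omega_i)\, (\omega_i \cdot \mathbf{n})\, \mathrm{d}\omega_i$$

where $L_o$ is the outgoing radiance at point $\mathbf{x}$, $L_e$ is emitted radiance, $f_r$ is the material's BRDF, $L_i$ is incoming radiance, and $\mathbf{n}$ is the surface normal.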

In real-time rendering with dedicated graphics processors, shaders are small programs executed on the GPU to compute shading. A colour may be computed for the vertices of a mesh and interpolated during rasterization, or alternatively computed after rasterization at the pixel level, hence the names vertex shader and fragment shader. These shader programs are in turn executed on shader cores, the parallel processors of the GPU. In offline (non-real-time) or software rendering, the components which compute shading are also called shaders.
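
As a rough illustration of the difference, the following sketch is plain Python rather than an actual GPU shading language such as GLSL or HLSL, and every name in it is made up for this example:

```python
# Hypothetical sketch in plain Python of where shading runs relative to
# rasterization; a real GPU shader would be written in GLSL or HLSL.

def compute_lighting(normal):
    # Stand-in for any lighting model: brightness from the z component of
    # the normal, i.e. a light pointing along the view axis.
    b = max(0.0, normal[2])
    return (b, b, b)

def interpolate(values, weights):
    # Barycentric interpolation, as the rasterizer does for vertex outputs.
    # (Interpolated normals would normally be renormalized afterwards.)
    return tuple(sum(v[i] * w for v, w in zip(values, weights)) for i in range(3))

# "Vertex shader" style: light once per vertex, interpolate the colours.
def colour_per_vertex(vertex_normals, weights):
    vertex_colours = [compute_lighting(n) for n in vertex_normals]
    return interpolate(vertex_colours, weights)

# "Fragment shader" style: interpolate the normal, light once per pixel.
def colour_per_fragment(vertex_normals, weights):
    return compute_lighting(interpolate(vertex_normals, weights))
```

The two paths diverge visibly when lighting varies quickly across a triangle, which is why per-fragment shading generally looks smoother than per-vertex shading.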

While the shading process covers all possible shading operations, a monolithic shader that can compute results for every combination of light and material is normally too expensive to run and too cumbersome to maintain. Shaders are instead small dedicated programs which each handle just a few cases. When other cases occur, with different materials or lights, different shaders are used to compute the results.
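
A minimal sketch of this idea, with entirely hypothetical names, is a renderer that dispatches to a small dedicated shader per material/light pairing:

```python
# Hypothetical sketch: dispatch to small dedicated shaders rather than one
# monolithic shader that branches over every material/light combination.

def diffuse_point_light(hit, light):
    return (0.0, 0.0, 0.0)  # Lambertian response to a point light (omitted)

def glossy_point_light(hit, light):
    return (0.0, 0.0, 0.0)  # specular response to a point light (omitted)

def diffuse_area_light(hit, light):
    return (0.0, 0.0, 0.0)  # Lambertian response to an area light (omitted)

# One small shader per supported (material, light) combination.
SHADERS = {
    ("diffuse", "point"): diffuse_point_light,
    ("glossy", "point"): glossy_point_light,
    ("diffuse", "area"): diffuse_area_light,
}

def shade_hit(hit, light):
    # hit.material_kind and light.kind are assumed attributes of the
    # renderer's own intersection and light records.
    return SHADERS[(hit.material_kind, light.kind)](hit, light)
```

Supporting a new material or light type then means adding another small shader and a table entry, rather than growing a single ever-larger program.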

Diffuse Shading

Diffuse shading, which calculates the amount of directly scattered light, is the most common lighting calculation a shader will perform. A widely used model is Lambertian reflectance, which scales the light intensity by the cosine of the angle between the surface normal and the light direction to give the amount of scattered light.
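
As a minimal sketch (all names hypothetical; both vectors are assumed to be unit length, with the light direction pointing from the surface towards the light), the cosine term reduces to a clamped dot product:

```python
# Hypothetical sketch of Lambertian diffuse shading.
# normal and light_dir are unit-length 3-vectors in the same space.

def lambert_diffuse(normal, light_dir, light_intensity, albedo):
    # cos(angle) between the normal and the light direction via the dot
    # product, clamped to zero so surfaces facing away receive no light.
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(a * light_intensity * n_dot_l for a in albedo)

# Example: light arriving straight down onto an upward-facing surface.
print(lambert_diffuse((0.0, 1.0, 0.0), (0.0, 1.0, 0.0), 1.0, (0.8, 0.2, 0.2)))
# -> (0.8, 0.2, 0.2): the full albedo, since the cosine term is 1 here.
```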
