The RealSoft3D ray-tracing process - Part 4: More Shaders

By David Coombes
d.g.coombes@btopenworld.com

Foreword to Part 4

In this part you'll learn about the other shaders that contribute to the final rendered image, and get a final overview of the RealSoft3D rendering pipeline.

An overview

I'll start this part with a visual overview. Look at the order of shaders and the samples involved. The names of the shaders and samples give a good idea of what's going on.

Initial shaders

In part two our pipeline started with surface geometry. In the full pipeline there are a few shaders executed before this. The two most worth noting here are...
 

Ray Hit

Ray hit occurs when a ray cast from the camera hits a surface. At this point the surface's spatial data is initialised: coordinate position, distance from the ray source, and UV mapping coordinates. At the end of ray hit, the Ray Hit sample data is copied to the Surface sample.

The usual application of this shader in custom materials is to modify the Distance channel. For example, setting Ray Hit:Distance to -1 fools the rendering process into thinking the surface is behind the camera, so the surface is not drawn. This is the basis of clip-mapping.
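The clip-mapping trick can be sketched as follows. This is a conceptual sketch in Python, not VSL syntax; the `ray_hit_clip` function and the checker mask are illustrative names, not part of RealSoft3D.

```python
# Conceptual sketch (not VSL syntax): a ray-hit stage that clip-maps a
# surface by faking a negative distance wherever a mask is dark.

def ray_hit_clip(distance, u, v, mask):
    """Return the Distance channel after a clip-map test.

    mask(u, v) -> float in [0, 1]; below 0.5 the surface is clipped by
    pretending it lies behind the camera (Distance = -1).
    """
    if mask(u, v) < 0.5:
        return -1.0        # the renderer treats the hit as invisible
    return distance

# A checker mask: clips every other tile of a 4x4 UV grid.
checker = lambda u, v: float((int(u * 4) + int(v * 4)) % 2)

print(ray_hit_clip(10.0, 0.1, 0.1, checker))  # tile (0,0): clipped -> -1.0
print(ray_hit_clip(10.0, 0.3, 0.1, checker))  # tile (1,0): kept    -> 10.0
```

Only the returned Distance changes; the surface data itself is untouched, which is why the effect is purely a visibility cut.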

     

Scanline

Visible NURBS curves and particles are created using a scanline renderer, and the scanline shader serves the same role for scanline objects that ray hit serves for ray-traced surfaces. Custom effects on scanline geometry, such as displacement or a thickness defined by a custom channel, must be coded in this shader.

The output of this shader is surface data with the same properties as 3D geometry surfaces, copied to the Surface sample, where subsequent shaders such as surface properties can assign values to it.

Volume shaders

Volume effects such as fog are created by three volume shaders. Note that Surface:Volume Sampling must have a value of at least 1.0 for volume calculation to be invoked. Surface:Volume Sampling is an integer value defining sampling density: higher values give more accurate results at a cost in rendering time, while low values, though fast, can produce very abstract results. Volume sampling has its most noticeable effect when applied to a volume with varying properties such as colour and turbidity, or where shadows are cast through the volume.
 

Volume Properties

Volume properties defines the properties of the volume, such as density (Turbidity) and self-illumination (Illumination), much as surface properties does for the surface.

This shader is the place to create density effects by changing the value of Volume:Turbidity, either as a constant or driven by mapping coordinates, creating patchy or even checkered fog.
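Checkered fog of this kind can be sketched as below. This is a conceptual Python sketch, not VSL syntax; `volume_turbidity` is an illustrative name standing in for code that writes Volume:Turbidity.

```python
# Conceptual sketch (not VSL syntax): a volume-properties stage that
# produces checkered fog by driving Volume:Turbidity from the sample's
# position in space.

def volume_turbidity(x, y, z, base=0.2):
    """Full turbidity in one set of unit cells, clear air in the other."""
    cell = (int(x) + int(y) + int(z)) % 2
    return base if cell == 0 else 0.0

print(volume_turbidity(0.5, 0.5, 0.5))  # foggy cell -> 0.2
print(volume_turbidity(1.5, 0.5, 0.5))  # clear cell -> 0.0
```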



Volume Illumination

Volume illumination defines the degree of illumination of a point in the volume, held in Volume:Illumination. The default action of this shader produces Volume:Illumination as (Light:Illumination * Volume:Color).

Custom illumination can change this value, and setting either Light:Illumination or Volume:Color to (0,0,0) nullifies default volume illumination.
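The default rule stated above can be checked numerically. This is a plain Python sketch of the channel-by-channel product, not VSL syntax.

```python
# Volume:Illumination = Light:Illumination * Volume:Color, taken
# channel by channel (RGB).

def volume_illumination(light_rgb, color_rgb):
    return tuple(l * c for l, c in zip(light_rgb, color_rgb))

print(volume_illumination((1.0, 1.0, 1.0), (0.2, 0.4, 0.6)))
# -> (0.2, 0.4, 0.6)
print(volume_illumination((0.0, 0.0, 0.0), (0.2, 0.4, 0.6)))
# -> (0.0, 0.0, 0.0): zeroing either factor nullifies the illumination
```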



Volume Shading

By this point the volume's density and brightness have been calculated, stored in Volume:Turbidity and Volume:Illumination respectively. The final step in rendering the volume is blending it with the surfaces behind it, based on the density and the distance the ray travels through the volume. Actions here can change all the properties to create custom effects, and setting Volume:Turbidity to 0 nullifies default volume rendering.
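The blend can be pictured like this. This is a conceptual Python sketch, not the renderer's exact formula: each of n volume samples mixes some of the volume's own illumination into the colour arriving from the surface behind, weighted by the turbidity.

```python
# Conceptual sketch of blending a surface seen through a volume.

def blend_through_volume(surface_rgb, volume_rgb, turbidity, samples):
    out = list(surface_rgb)
    for _ in range(samples):          # one mix per volume sample
        out = [v * turbidity + o * (1.0 - turbidity)
               for o, v in zip(out, volume_rgb)]
    return tuple(out)

# Turbidity 0 leaves the surface untouched, which is why setting
# Volume:Turbidity to 0 nullifies default volume rendering.
print(blend_through_volume((1.0, 0.0, 0.0), (0.5, 0.5, 0.5), 0.0, 8))
# -> (1.0, 0.0, 0.0)
```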

Remember also that volume filtering plays a part: for each sample of the volume, the light is filtered based on Volume:Transparency. Volume shaders are perhaps the least used in RealSoft3D save for slight fog effects, although the potential is there for much more powerful material applications. As a small sample, here is a picture of a volume with colour, turbidity and transparency defined by a checker pattern...

A last point regarding volumes: changing the sampling rate (Surface:Volume Sampling) strongly affects the results of the other properties. For example, with Volume:Transparency set to a faint (0.95,0.95,0.95), every time a volume sample is taken the light passing through (Light:Illumination) is reduced by this much. With sampling = 3, light is reduced by (0.95,0.95,0.95) three times by the time it leaves the volume, giving a light intensity of (0.86,0.86,0.86). If sampling is increased to 10, light is filtered ten times and the final illumination is (0.6,0.6,0.6). High-quality volume effects need only very slight filtering values to produce strong effects. In the checkered volume example above, Surface:Volume Sampling = 30 and Volume:Transparency for the blue volumes is set to (0.991,0.991,1.0), producing strong blue shadows. The shadows cast by the sphere show obvious banding in the original image, not visible above where the image has been post-processed.
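The arithmetic above can be verified directly: n samples multiply the light by Volume:Transparency n times.

```python
# Repeated filtering: transparency applied once per volume sample.

def filtered_light(transparency, samples):
    return round(transparency ** samples, 2)

print(filtered_light(0.95, 3))   # -> 0.86
print(filtered_light(0.95, 10))  # -> 0.6
```

This is why high sampling rates call for transparency values very close to 1.0, as in the (0.991,0.991,1.0) example.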

Final shaders

There are two shaders introduced at the end of the pipeline; post particles and post image.
 

Post Particle

The post particle shader serves the equivalent role of surface properties for post-particle effects, though post-particles have few properties.

The main use of this shader is to define properties for post-particle disks, such as colour and fade. For example, adding random noise to Post Particle:Color renders the particle disks with a random colour per particle. Another option is to modify particle placement by changing the data in Post Particle:Coordinates. The post-particle drawing process writes to the image data (the Image sample).
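The random-colour idea can be sketched as below. This is a conceptual Python sketch, not VSL syntax; the function name and noise parameter are illustrative.

```python
# Conceptual sketch (not VSL syntax): adding random noise to
# Post Particle:Color so each disk gets its own tint.

import random

def post_particle_color(base_rgb, noise, rng):
    """Jitter each channel by up to +/- noise, clamped to [0, 1]."""
    return tuple(min(1.0, max(0.0, c + rng.uniform(-noise, noise)))
                 for c in base_rgb)

rng = random.Random(1)
print(post_particle_color((0.5, 0.5, 0.5), 0.3, rng))  # a new tint per particle
```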



Post Processing

This is the final step of image creation. The values held in the Surface sample are copied to the Image sample, with Surface:Illumination becoming Image:Color. Post image effects then transform this data.

Adding VSL code that acts on this data opens up a huge variety of possibilities in render effects.
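As one illustration of acting on Image:Color at this stage, here is a conceptual Python sketch (not VSL syntax) of a simple negative effect.

```python
# Conceptual sketch: a post-image stage transforming Image:Color after
# Surface:Illumination has been copied into it.

def post_image_invert(rgb):
    """A negative: each channel is flipped around the [0, 1] range."""
    return tuple(1.0 - c for c in rgb)

print(post_image_invert((0.2, 0.5, 1.0)))  # -> (0.8, 0.5, 0.0)
```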

The remaining shaders

There are two other shaders not shown on the diagram that have limited roles to play.
 

Material Initialisation

This is the very first shader executed. It has no access to basic scene data other than time. When writing custom VSL materials it is used to define global variable data, or to load data into custom channels.

An important aspect of material initialisation is that it is executed only once, at the beginning of rendering. When the material is accessed later through recursion, material initialisation is not executed again.
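The run-once behaviour can be pictured with a simple guard. This is a conceptual Python sketch with hypothetical names, not VSL syntax.

```python
# Conceptual sketch: one-time setup, as in material initialisation.
# The first call does the work; later (recursive) accesses reuse the
# stored values rather than running the setup again.

_material_globals = {}

def material_init():
    if "initialised" not in _material_globals:   # first call only
        _material_globals["initialised"] = True
        _material_globals["start_value"] = 0.0   # e.g. data loaded into a custom channel
    return _material_globals

material_init()
material_init()   # second call changes nothing
```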



Common Properties

It is sometimes the case that you wish to apply a global effect to a material, such as applying a checkered pattern to colour, reflectivity and volume transparency. Common properties defines properties shared by all the shaders, such as mapping coordinates or custom data, and values defined here are available in all subsequent shaders. For example, a custom channel Spot can be derived from mapping coordinates and then used both to produce bumps in surface geometry and red spots in surface properties.

The only default channels available for writing in this shader are Map Coords and Scope, though all custom channels are available.

If you use these shaders in your VSL materials, you will see that their output does not go to any of the defined samples such as Surface or Light. Instead their data is copied to the relevant channel in later shaders. A transformation of Map Coords in common properties such as "Map Coords *= (2,2,2)" will double the mapping coordinate values in all relevant targets. For example, in a material where common properties defines new coordinates, that data is copied to the relevant Map Coords channels of the following shaders: the surface properties shader accesses the Surface sample, where Surface:Map Coords is a copy of the coordinates transformed in common properties, and likewise the Map Coords channel of the Light sample is a copy of those defined in common properties. A material like this is unlikely to be used in a real-world situation, as it applies the same transformation to various different, unrelated targets.
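The copy-to-every-sample behaviour can be sketched as below. This is a conceptual Python sketch of "Map Coords *= (2,2,2)", not VSL syntax; the variable names are illustrative.

```python
# Conceptual sketch: a common-properties transform runs once, and the
# result is copied into the Map Coords channel of each later sample.

def common_properties(map_coords):
    """Double every mapping coordinate, as in Map Coords *= (2,2,2)."""
    return tuple(c * 2 for c in map_coords)

common = common_properties((0.25, 0.5, 0.0))
surface_map_coords = common   # copy seen by the Surface sample
light_map_coords = common     # copy seen by the Light sample
print(surface_map_coords)     # -> (0.5, 1.0, 0.0)
```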

What's missing?

By now you should have a very good idea of the whole RealSoft3D rendering process, except for one shader in the diagram at the beginning of this Part 4 that I have not yet mentioned. The last shader to explain, the secondary ray shader, is responsible for reflections, and it works somewhat differently from the other shaders. For that reason, Part 5 will be all about the secondary ray shader.

Page updated on Thursday, 20 March, 2003.