RealSoft3D ray-tracing process - Part 4:
By David Coombes
Foreword to Part 4
In this part you'll learn about the other shaders that contribute to the final rendered image, and get a final overview of the RealSoft3D ray-tracing process. I'll start this part with a visual overview. Look at the order of shaders and the samples involved; the names of the shaders and samples give a good idea of what's going on.
In part two our pipeline started with surface geometry. In the full pipeline there are a few shaders executed before this; the two most worth noting here are the ray hit shader and the scanline shader.

The ray hit shader is executed when a ray cast from the camera hits a surface. At this point the surface's spatial data is initialised: coordinate position, distance from the ray source, and UV mapping coordinates. At the end of this shader the Ray Hit sample data is copied to the Surface sample. The usual application of this shader in custom materials is to modify the distance channel. For example, setting Ray Hit:Distance to -1 fools the rendering process into thinking the surface is behind the camera, so the surface is not drawn. This is used in clip-mapping.
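The clip-mapping trick can be sketched in Python pseudocode. The function name and threshold below are illustrative stand-ins, not RealSoft3D API:

```python
def ray_hit_shader(distance, clip_map_value, threshold=0.5):
    """Toy model of a ray hit shader used for clip-mapping.

    Where the clip map is dark, report a distance of -1 so the
    renderer believes the surface lies behind the camera and
    skips drawing it.
    """
    if clip_map_value < threshold:
        return -1.0  # surface treated as behind the camera: not drawn
    return distance  # normal hit: keep the real distance

print(ray_hit_shader(12.5, 0.9))  # 12.5 -> surface drawn
print(ray_hit_shader(12.5, 0.1))  # -1.0 -> surface clipped away
```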
Visible NURBS curves and particles are created using a scanline renderer. The scanline shader serves the same job as the ray hit shader for scanline objects. For custom effects on scanline geometry, such as displacement or custom channel-defined thickness, the code to control this has to appear in the scanline shader. The output of this shader creates surface data with the same properties as 3D geometry surfaces, copied to the Surface sample where it can be assigned values by the subsequent shaders.
Volume effects such as fog are created by three volume shaders. Note that Surface:Volume Sampling must have a value greater than or equal to 1.0 for volume calculation to be invoked. Surface:Volume Sampling gives an integer value defining the sampling density. Higher values give more accurate results at a cost in rendering time, while low values, though fast, can produce very abstract results. Volume sampling has its most noticeable effect when applied to a volume with varying properties such as colour and turbidity, or where shadows are cast through the volume.
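As a rough model of what the sampling value does, here is a sketch assuming evenly spaced samples between the ray's entry and exit points (the details of RealSoft3D's actual stepping are not documented here):

```python
def volume_sample_points(entry_t, exit_t, sampling):
    """Ray distances at which the volume shaders would be evaluated."""
    sampling = max(1, int(sampling))  # sampling >= 1 invokes the volume code
    step = (exit_t - entry_t) / sampling
    return [entry_t + step * (i + 0.5) for i in range(sampling)]

# Three samples through a volume spanning t = 0..6:
print(volume_sample_points(0.0, 6.0, 3))  # [1.0, 3.0, 5.0]
```

More samples mean varying properties (patchy colour, shadow edges) are resolved more finely, at a proportional cost in shader evaluations.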
The volume properties shader defines the properties of the volume in terms of density (Turbidity), self-illumination (Illumination), etc., much the same as surface properties does for the surface. It is the place to create density effects, either as a constant value or defined by mapping coordinates and so on, creating patchy fog or even checkered fog by changing the value of Volume:Turbidity.
The volume illumination shader defines the degree of illumination of a point in the volume, as held in the Volume:Illumination channel. The default action of this shader sets Volume:Illumination to Light:Illumination * Volume:Color. Custom illumination can involve changing this value, and setting either Light:Illumination or Volume:Color to (0,0,0) nullifies the default volume illumination.
By this point the volume's density and brightness have been calculated, stored in Volume:Turbidity and Volume:Illumination respectively. The final step in rendering the volume is blending the volume with the surfaces behind it, based on the distance the ray travels through the volume and on its density. Actions here can change all the properties to create custom effects, and setting Volume:Turbidity to 0 nullifies the default volume blending. Remember also that volume filtering plays a part: for each sample of the volume, the light is filtered based on Volume:Transparency.
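Putting the three volume steps together, here is a toy scalar version of one ray's walk through the volume. The blending and filtering formulas are simplified stand-ins for illustration, not RealSoft3D's exact maths:

```python
def march_volume(background, volume_color, turbidity, transparency,
                 light, sampling):
    """Toy per-sample volume loop (scalar channels for brevity).

    Each sample: take the default Volume:Illumination
    (light * Volume:Color), blend toward it by Volume:Turbidity,
    then filter the light by Volume:Transparency.
    """
    result = background
    for _ in range(sampling):
        illumination = light * volume_color  # default volume illumination
        result = result * (1.0 - turbidity) + illumination * turbidity
        light *= transparency                # per-sample light filtering
    return result, light

# Turbidity 0 leaves the background untouched, but light is still filtered:
color, light = march_volume(1.0, 0.5, 0.0, 0.95, 1.0, 3)
print(color, round(light, 4))  # 1.0 0.8574
```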
Volume properties are perhaps the least used in RealSoft3D, save for slight fog effects, although the potential is there for much more powerful material applications. As a little sample, here is a picture of a volume with colour, turbidity and transparency defined by a checker pattern.
A last point regards volumes: changing the sampling rate (Surface:Volume Sampling) strongly affects the results of the other properties. For example, with Volume:Transparency set to a faint (0.95,0.95,0.95), every time a volume sample is taken the light passing through (Light:Illumination) is reduced by this much. With sampling = 3, light is reduced by (0.95,0.95,0.95) three times by the time it leaves the volume, giving a light intensity of (0.86,0.86,0.86). If sampling is increased to 10, light is filtered ten times and the end illumination is (0.6,0.6,0.6). High-quality volume effects need only very slight filtering values to produce strong results. In the checkered volume example above, Surface:Volume Sampling = 30 and Volume:Transparency for the blue volumes is set to (0.991,0.991,1.0), producing strong blue shadows. The shadows cast by the sphere show obvious banding in the original image, not visible above where the image has been post-processed.
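The arithmetic above is just repeated multiplication, which a short sketch confirms:

```python
def filtered_light(transparency, sampling):
    """Light remaining after `sampling` passes through the transparency filter."""
    return transparency ** sampling

print(round(filtered_light(0.95, 3), 2))    # 0.86, as in the text
print(round(filtered_light(0.95, 10), 2))   # 0.6
print(round(filtered_light(0.991, 30), 2))  # 0.76, the blue-volume falloff
```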
There are two shaders introduced at the end of the pipeline: post particles and post image. The post particle shader serves the equivalent role of surface properties for post-particle effects, though relatively few properties apply. The main use of this shader is to define particle properties for the post-particle disks, such as colour and fade. For example, adding random noise to Post Particle:Color means particle disks are rendered with random colours per particle. Another option is to modify particle placement by changing data in the Post Particle sample.
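The per-particle random colour idea can be sketched like this. Seeding by a particle index is my own device to keep colours stable between frames; it is not RealSoft3D API:

```python
import random

def post_particle_color(base_color, particle_id, noise=0.2):
    """Add per-particle random noise to a base colour, clamped to [0, 1]."""
    rng = random.Random(particle_id)  # one stable random stream per particle
    return tuple(min(1.0, max(0.0, c + rng.uniform(-noise, noise)))
                 for c in base_color)

# Each particle gets its own colour, reproducibly:
print(post_particle_color((0.5, 0.5, 0.5), particle_id=7))
```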
The post-particle drawing process writes to the image data (the Image sample). This is the final step of image creation. The values held in the Surface sample are copied to the Image sample, with Surface:Illumination becoming Image:Color. Post image effects then transform this data. The addition of VSL code acting upon this data creates a huge variety of possibilities in render effects.
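A minimal post-image effect in that spirit, shown as a hypothetical Python stand-in for VSL code operating on Image:Color:

```python
def post_image_invert(color):
    """Invert Image:Color after it has been copied from Surface:Illumination."""
    return tuple(1.0 - c for c in color)

print(post_image_invert((0.2, 0.5, 1.0)))  # (0.8, 0.5, 0.0)
```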
There are two other shaders not shown on the diagram that have limited roles to play. The material initialisation shader is the very first shader executed. It has no access to basic scene data other than time. It is used when writing custom VSL materials to define global variable data, or to load data into custom channels. An important aspect of material initialisation is that it is executed only once, at the beginning of rendering. When the material is accessed later through the other shaders, it is not executed again.
It is sometimes the case that you wish to apply a global effect to a material, such as applying a checkered pattern to colour, reflectivity and volume transparency. The common properties shader defines properties shared by all the shaders, such as map coordinates or custom data. Defining values here makes them available in the subsequent shaders; for example, specifying a custom value derived from mapping coordinates, and then using this value to produce both bumps in the surface geometry and red spots in the surface colour. Only a few default channels, such as Map Coords, are available for writing in this shader, though all custom channels are available.
If you use these shaders in your VSL materials, you will see the output does not go to any of the defined samples such as Surface or Light. Instead these shaders' data are copied to the relevant channel in the later shaders. A transformation of Map Coords in common properties, such as "Map Coords *= (2,2,2)", will double the mapping coordinate values in all relevant targets. The example code on the right shows a material where common properties defines new coordinates. This data is copied to the relevant Map Coords channels of the following shaders: the Map Coords channel the surface properties shader accesses is a copy of those transformed in common properties, and likewise the Map Coords channel of the Light sample is a copy of those defined in common properties. A material like this is unlikely to be used in a real-world situation, as it applies the same coordinates to several different shaders at once.
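The "Map Coords *= (2,2,2)" idea can be modelled in Python: scale the coordinates once in a common step, and every later shader that reads them sees the doubled values. The checker function is an illustrative stand-in for any mapped effect:

```python
def common_properties(map_coords):
    """Equivalent of 'Map Coords *= (2,2,2)': done once, seen by all shaders."""
    return tuple(c * 2 for c in map_coords)

def checker(map_coords):
    """A 0/1 checker pattern derived from mapping coordinates."""
    u, v, w = map_coords
    return (int(u) + int(v) + int(w)) % 2

# The surface properties and light shaders would both read the same copy:
coords = common_properties((0.6, 0.2, 0.0))
print(coords)           # (1.2, 0.4, 0.0)
print(checker(coords))  # 1
```

Doubling the coordinates doubles the frequency of any pattern driven by them, in every shader at once.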
By now you should have a very good idea of the whole RealSoft3D rendering process, except that there is one shader in the diagram beginning this Part 4 which I have not yet mentioned. The last shader to explain, the secondary ray-tracing shader, is responsible for reflections, and it works somewhat differently to the other shaders. For that reason, Part 5 will be all about the secondary ray-tracing shader.