Example 6.1: Volume Rendering vs. Path Tracer
Introduction
In this example, you develop a network that shows some differences between volume rendering and the MeVisLab Path Tracer. You will visualize the same scene with both 3D rendering techniques, using some of the modules available for path tracing.
The MeVisLab Path Tracer requires an NVIDIA graphics card with CUDA support. To check your hardware, open MeVisLab and add a SoPathTracer module to your workspace. You will see a message if your hardware does not support CUDA:
MeVisLab detected an Intel onboard graphics adapter. If you experience rendering problems, try setting the environment variables SOVIEW2D_NO_SHADERS and GVR_NO_GLSL.
Handling cudaGetDeviceCount returned 35 (CUDA driver version is insufficient for CUDA runtime version)
Steps to do
As a first step for the comparison, you create a 3D scene containing two spheres and a cube, using the already familiar volume rendering.
Volume Rendering
Create 3D objects
Add three WEMInitialize modules to your workspace (one Cube and two Icospheres) and connect each of them to a SoWEMRenderer module. Set instanceName of the WEMInitialize modules to Sphere1, Sphere2 and Cube. Set instanceName of the SoWEMRenderer modules to RenderSphere1, RenderSphere2 and RenderCube.
For RenderSphere1, set Diffuse Color to yellow and Face Alpha to 0.5. For RenderSphere2, set Diffuse Color to red and Face Alpha to 0.5. RenderCube remains as it is.
Group your modules and name the group Initialization. Your network should now look like this:
Use the Output Inspector on your SoWEMRenderer outputs and inspect the 3D renderings. You should see a yellow sphere, a red sphere and a grey cube.
Rendering
Add 2 SoGroup modules and a SoBackground to your network. Connect the modules as seen below.
If you now inspect the output of the SoGroup, you will see an orange sphere.
You did not translate the locations of your three objects; they are all located at the same place in world coordinates, which is why the overlapping translucent spheres appear as a single orange one. Open the WEMInitialize panels of your 3D objects and define the following translations and scalings:
The output of the SoGroup now shows the two spheres on top of a rectangular box.
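Why the translations separate the objects can be sketched in plain Python (this is a conceptual illustration, not the MeVisLab API; the concrete vectors are assumptions for demonstration, not the values from this example's table):

```python
# Conceptual sketch: a WEMInitialize scaling and translation place each
# unit object in world coordinates. The vectors below are illustrative
# assumptions only.
def to_world(point, scale, translation):
    return tuple(p * s + t for p, s, t in zip(point, scale, translation))

origin = (0.0, 0.0, 0.0)

# Without translations, all objects sit at the same world position,
# no matter how they are scaled:
assert to_world(origin, (1, 1, 1), (0, 0, 0)) == to_world(origin, (3, 3, 1), (0, 0, 0))

# Distinct translations separate the objects in world coordinates:
sphere1 = to_world(origin, (1, 1, 1), (-2.0, 0.0, 1.0))
sphere2 = to_world(origin, (1, 1, 1), (2.0, 0.0, 1.0))
assert sphere1 != sphere2
```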
For the viewer, you now add a SoCameraInteraction, a SoDepthPeelRenderer and a SoRenderArea module to your network and connect them.
You now have a 3D volume rendering of your three objects.
In order to distinguish between the two viewers, you now add a label to the SoRenderArea indicating that it shows the volume rendering. Add a SoMenuItem, a SoBorderMenu and a SoSeparator to your SoRenderArea.
Define the Label of the SoMenuItem as Volume Rendering and set Border Alignment to Top Right and Menu Direction to Horizontal for the SoBorderMenu.
Finally, you should group all modules belonging to your volume rendering.
Path Tracing
For the Path Tracer, you can simply re-use the 3D objects from the volume rendering. This makes it easy to compare the rendering results.
Rendering
Path Tracer modules integrate fully into MeVisLab's Open Inventor framework; therefore, the general principles and the necessary modules are not completely different. Add a SoGroup module to your workspace and connect it to your 3D objects from the SoWEMRenderer modules. A SoBackground as in the volume rendering network is not necessary, but add a SoPathTracerMaterial and connect it to the SoGroup. You can leave all settings at their defaults for now.
Add a SoPathTracerAreaLight, a SoPathTracerMesh and a SoPathTracer to a SoSeparator and connect the SoPathTracerMesh to your SoGroup. This adds your 3D objects to the Path Tracer scene.
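The grouping principle behind this reuse can be sketched in plain Python (a conceptual illustration, not the Open Inventor API):

```python
# Conceptual sketch: a group node hands each child to whichever
# renderer traverses it, which is why the same set of 3D objects can
# feed both the volume-rendering and the path-tracing branch.
class Group:
    def __init__(self, *children):
        self.children = list(children)

    def render(self, renderer):
        # Traverse the children in order, as in an Inventor scene graph.
        return [renderer(child) for child in self.children]

objects = Group("Sphere1", "Sphere2", "Cube")
volume_rendered = objects.render(lambda obj: f"volume({obj})")
path_traced = objects.render(lambda obj: f"pathtraced({obj})")
```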
Selecting the SoSeparator output already shows a preview of the same scene rendered via Path Tracing.
Add a SoCameraInteraction and a SoRenderArea to your network and connect them as seen below.
You can now use both SoRenderArea modules to visualize the differences side by side. You should also add a SoMenuItem, a SoBorderMenu and a SoSeparator to this SoRenderArea in order to have a label for the path tracing inside the viewer.
Define the Label of the SoMenuItem as Path Tracing and set Border Alignment to Top Right and Menu Direction to Horizontal for the SoBorderMenu.
Finally, group your Path Tracer modules to another group named Path Tracing.
Share the same camera
Finally, you want both viewers to share the same camera perspective so that you can see the differences directly. Add a SoPerspectiveCamera module to your workspace and connect it to both the volume rendering and the Path Tracer network. The Path Tracer network additionally needs a SoGroup; see below for connection details. You have to toggle detectCamera in both of your SoCameraInteraction modules in order to synchronize the view of both SoRenderArea viewers.
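The idea of one camera driving two viewers can be sketched in plain Python (a conceptual illustration, not the SoPerspectiveCamera API):

```python
# Conceptual sketch: both viewers hold a reference to the *same*
# camera object, so a single update changes the viewpoint of both
# render areas at once.
class Camera:
    def __init__(self, position):
        self.position = position

shared_camera = Camera((0.0, 0.0, 5.0))
volume_viewer = {"label": "Volume Rendering", "camera": shared_camera}
path_viewer = {"label": "Path Tracing", "camera": shared_camera}

# Moving the shared camera updates both viewers simultaneously:
shared_camera.position = (1.0, 2.0, 5.0)
assert volume_viewer["camera"].position == path_viewer["camera"].position
```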
You can change the maximum number of iterations on the SoPathTracer panel. The more iterations, the better the result, but the more time it takes to finalize your image.

Results
Path Tracing provides a much more realistic way to visualize the behavior of light in a scene. It simulates the scattering and absorption of light within the volume.
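The absorption part of this light transport can be sketched with the Beer-Lambert law (an idealized model, not the SoPathTracer implementation; the coefficient value is an assumption for demonstration):

```python
import math

# Illustrative sketch: light traversing a homogeneous medium is
# attenuated exponentially with the distance travelled.
def transmittance(absorption_coefficient, distance):
    return math.exp(-absorption_coefficient * distance)

# No distance travelled means no absorption yet:
assert transmittance(0.5, 0.0) == 1.0

# Longer paths through the medium absorb more light:
assert transmittance(0.5, 2.0) < transmittance(0.5, 1.0)
```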
Exercises
- Play around with different SoPathTracerMaterial settings and define different materials
- Change the maximum number of iterations in the SoPathTracer module
- Change the configuration of the SoPathTracerAreaLight module
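The iteration/noise trade-off behind the second exercise can be sketched with a toy Monte Carlo estimator (an idealized model, not the SoPathTracer algorithm; the "true" pixel brightness of 0.5 is an assumption by construction):

```python
import random

# Illustrative sketch: each path-tracer iteration contributes one
# random light sample per pixel, and the running average converges
# to the true value as iterations increase.
def render_pixel(iterations, seed=0):
    rng = random.Random(seed)  # fixed seed keeps the sketch deterministic
    total = sum(rng.random() for _ in range(iterations))  # one sample per iteration
    return total / iterations

error_few = abs(render_pixel(8) - 0.5)
error_many = abs(render_pixel(8192) - 0.5)
assert error_many < error_few  # more iterations, less noise
```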
Summary
- Path Tracer modules can be used the same way as Open Inventor modules
- A SoPerspectiveCamera can be used for multiple viewers to synchronize the camera position
- Path Tracing produces beautiful, photorealistic renderings, but can be computationally expensive