Implemented a raytracer by following the Ray Tracing in One Weekend books by Peter Shirley.
My version extends the basic tracer described in Book 1 by adding:
- Adaptive sampling: per-pixel variance estimation with an early exit once the noise falls below a threshold. This speeds up rendering in the tested scenes by a factor of 3 to 4 while producing similar levels of visual noise. Complex parts of the image can receive up to 20x more samples than simple parts of the scene.
- Spectral rendering for dispersion using a continuous blackbody spectrum, with wavelengths converted to color at the end
- GUI
- More primitives (mostly from Book 2, plus triangle intersection using the Möller-Trumbore algorithm; see the intersection sketch after this list)
- Sampling and filtering improvements as described in the book Physically Based Rendering
- OBJ loading based on this OpenGL tutorial, slightly improved to use newer C++ features
- Directional lights that only emit when hit within a configured cone angle, e.g. to simulate lasers (see the emitter sketch after this list)
- 16-bit floating point EXR support (for HDR and better color depth) using an adapted version of miniexr
- Multithreading
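
Since triangles are one of the main additions over Book 1, here is a minimal sketch of a Möller-Trumbore intersection test using glm (which the build already depends on). It illustrates the algorithm rather than reproducing this repository's exact code:

```cpp
#include <glm/glm.hpp>
#include <cmath>
#include <optional>

// Möller-Trumbore ray/triangle test (illustrative sketch).
// Returns the distance t along the ray on a hit, nothing on a miss.
std::optional<float> intersect_triangle(
    const glm::vec3& orig, const glm::vec3& dir,
    const glm::vec3& v0, const glm::vec3& v1, const glm::vec3& v2)
{
    const float eps = 1e-8f;
    glm::vec3 e1 = v1 - v0;
    glm::vec3 e2 = v2 - v0;
    glm::vec3 p  = glm::cross(dir, e2);
    float det = glm::dot(e1, p);
    if (std::abs(det) < eps) return std::nullopt;   // ray parallel to triangle
    float inv_det = 1.0f / det;
    glm::vec3 s = orig - v0;
    float u = glm::dot(s, p) * inv_det;             // first barycentric coordinate
    if (u < 0.0f || u > 1.0f) return std::nullopt;
    glm::vec3 q = glm::cross(s, e1);
    float v = glm::dot(dir, q) * inv_det;           // second barycentric coordinate
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    float t = glm::dot(e2, q) * inv_det;            // distance along the ray
    return (t > eps) ? std::optional<float>(t) : std::nullopt;
}
```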
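
Similarly, a directional emitter can be sketched as a material that only returns its emission when a ray arrives inside a fixed cone. The names `DirectionalLight`, `emitted` and `cutoff_cos` below are illustrative, not this repository's actual API:

```cpp
#include <glm/glm.hpp>

// Directional emitter sketch: light is only emitted toward rays that
// arrive within a cone around the surface normal.
struct DirectionalLight {
    glm::vec3 color;   // emitted radiance
    float cutoff_cos;  // cosine of the half-angle of the emission cone

    // ray_dir: direction of the incoming ray; normal: surface normal at the hit.
    glm::vec3 emitted(const glm::vec3& ray_dir, const glm::vec3& normal) const {
        // The ray travels toward the surface, so compare against -ray_dir.
        float cos_theta = glm::dot(glm::normalize(-ray_dir), glm::normalize(normal));
        return cos_theta >= cutoff_cos ? color : glm::vec3(0.0f);
    }
};
```

A `cutoff_cos` close to 1 (e.g. the cosine of 1°) gives a very narrow, laser-like beam.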
Install the dependencies and build:

```sh
sudo apt install libglm-dev libsfml-dev libpcg-cpp-dev
mkdir build
cd build
cmake ..
make
```

Then run the executable:

```sh
./RaytracingWeekend
```
Caustics and dispersion from a sphere light through a triangle mesh.
Dispersion is visible at the edges of the glass spheres (rainbow-colored edges)
Programmatically generated scene with concentric glass spheres
Artifacts caused by race conditions between multiple threads which try to write into the image array.
All light rays that travelled through glass are colored green for debugging purposes.
A rainbow produced by interpolating RGB looks unnatural and would not produce realistic dispersion.
Rainbow produced by simulating the intensity of different wavelengths using the Planck spectrum. The wavelengths are weighted by the sensitivity spectra of the eye's receptors to get the color we would see. This produces a realistic spectrum and much better dispersion.
Another advantage of this approach is that we can realistically simulate the color of blackbody radiation by setting the temperature of the object to the desired value.
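
For illustration, here is a minimal sketch of this spectral pipeline: Planck's law gives the radiance per wavelength, and an analytic fit of the CIE 1931 color matching functions (Wyman et al. 2013) stands in for the receptor sensitivity curves. Constants and normalization are illustrative, not necessarily those used by the renderer:

```cpp
#include <cmath>

struct XYZ { double x, y, z; };

// Planck's law: spectral radiance of a blackbody, lambda in nanometres.
double planck(double lambda_nm, double temp_k)
{
    const double h  = 6.62607015e-34;  // Planck constant [J s]
    const double c  = 2.99792458e8;    // speed of light [m/s]
    const double kB = 1.380649e-23;    // Boltzmann constant [J/K]
    double lambda = lambda_nm * 1e-9;  // nm -> m
    return (2.0 * h * c * c) / std::pow(lambda, 5.0)
         / (std::exp(h * c / (lambda * kB * temp_k)) - 1.0);
}

// Piecewise Gaussian used by the CIE fit: different widths left/right of mu.
double g(double x, double mu, double s1, double s2)
{
    double t = (x - mu) / (x < mu ? s1 : s2);
    return std::exp(-0.5 * t * t);
}

// Analytic multi-lobe fit of the CIE 1931 color matching functions.
XYZ cie(double l)
{
    return {
        1.056 * g(l, 599.8, 37.9, 31.0) + 0.362 * g(l, 442.0, 16.0, 26.7)
            - 0.065 * g(l, 501.1, 20.4, 26.2),
        0.821 * g(l, 568.8, 46.9, 40.5) + 0.286 * g(l, 530.9, 16.3, 31.1),
        1.217 * g(l, 437.0, 11.8, 36.0) + 0.681 * g(l, 459.0, 26.0, 13.8)
    };
}

// Integrate Planck radiance against the matching functions over the
// visible range to get the (unnormalized) color of a blackbody at temp_k.
XYZ blackbody_xyz(double temp_k)
{
    XYZ out{0.0, 0.0, 0.0};
    for (double l = 380.0; l <= 780.0; l += 5.0) {
        double p = planck(l, temp_k);
        XYZ w = cie(l);
        out.x += p * w.x; out.y += p * w.y; out.z += p * w.z;
    }
    return out;  // convert to RGB with a standard XYZ -> sRGB matrix
}
```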
2000K blackbody radiation
4500K blackbody radiation
8000K blackbody radiation
Image produced by changing the temperature of the light source during the render
Testing OBJ file loading
Simplified RGB dispersion model, debug image
Reference image of 3 spheres made from different materials
What happens if we apply a cross product instead of a dot product to the normals?
Debug view showing the BVH traversal cost when splitting on the longest axis. Red is expensive, blue is cheap.
These three images show the effect of adaptive sampling. The first image is rendered with a constant number of samples per pixel, the second with adaptive sampling. Although the second image uses the same total number of samples as the first, it shows less visible noise (especially in the diffuse areas around the light and in the shadows). The third image (black and white) shows which areas were sampled more often. The adaptive sampling algorithm estimates the variance of each pixel at runtime and decides whether to continue sampling.
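
Such a per-pixel early exit can be sketched with Welford's online variance estimate; the names and thresholds below are illustrative, not this repository's exact ones:

```cpp
#include <cmath>

// Online mean/variance per pixel (Welford's algorithm). Sampling stops
// early once the estimated standard error of the mean drops below a
// tolerance.
struct PixelStats {
    int n = 0;
    double mean = 0.0, m2 = 0.0;
    void add(double x) {
        ++n;
        double d = x - mean;
        mean += d / n;
        m2 += d * (x - mean);
    }
    double variance() const { return n > 1 ? m2 / (n - 1) : 0.0; }
};

bool should_continue(const PixelStats& s, int min_samples,
                     int max_samples, double tolerance)
{
    if (s.n < min_samples) return true;    // need a minimal estimate first
    if (s.n >= max_samples) return false;  // hard cap (e.g. 20x the base rate)
    double std_error = std::sqrt(s.variance() / s.n);
    return std_error > tolerance;          // keep sampling noisy pixels
}
```

The standard error sqrt(variance / n) shrinks as samples accumulate, so smooth pixels exit after a few samples while noisy pixels keep sampling up to the cap.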
- Multithreading support
- Different sampling strategies
- Replacing shared_ptr with raw pointers for the hit_records. As these are created at every ray bounce, the overhead of incrementing and decrementing the internal reference counter became quite significant; removing it gave a 10% speedup.
- Minor restructuring to take advantage of compiler optimization and reduced branching
- Faster cube intersection method adapted from the PSRaytracing repository
- A better BVH that doesn't create a copy of the scene array for each node
- thread_local RNG objects to make rendering fully parallelizable (previously a single RNG object was accessed from all threads and became the bottleneck, as it was the only serialized operation; see the RNG sketch after this list)
- Better BVH splitting using the surface area heuristic (SAH)
- Sobol sampling everywhere for faster convergence
- Importance sampling
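
The thread-local RNG pattern can be sketched as follows, assuming the pcg-cpp library that the build installs; the seeding scheme here is illustrative:

```cpp
#include <pcg_random.hpp>
#include <random>

// One PCG generator per thread: each thread seeds its own generator on
// first use, so threads never contend on shared RNG state.
inline pcg32& thread_rng()
{
    thread_local pcg_extras::seed_seq_from<std::random_device> seed_source;
    thread_local pcg32 rng(seed_source);
    return rng;
}

// Uniform double in [0, 1), drawn from the calling thread's generator.
inline double random_double()
{
    thread_local std::uniform_real_distribution<double> dist(0.0, 1.0);
    return dist(thread_rng());
}
```

Each thread lazily constructs its own generator on first use, so no synchronization is needed on the hot path.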