GPU Photon Mapping
This project is an experimental implementation of photon mapping that runs on the GPU using Metal compute shaders and ray tracing APIs. It modifies the density estimation step of the traditional photon mapping algorithm: instead of storing photons as points, photons are stored as triangles, and density estimation is performed by creating photon gather rays in a square above the gather point and intersecting those rays with the photon triangles. Turning photon points into photon triangles lets the photon map use Metal's BVH implementation, which enables photon gathering and density estimation entirely on the GPU. If a gather ray intersects a photon triangle, that photon is gathered. Compute shaders then aggregate the gathered photons into a caustic texture, which contains the photons' contribution to the indirect lighting of the scene. This texture is combined with the direct lighting texture computed by the rest of the path tracer to produce the final rendered image.
How Photon Mapping Works
Photon mapping is a form of bidirectional path tracing, and it is best suited to rendering complex lighting effects like color bleeding, indirect illumination, and caustics. In bidirectional path tracing, partial ray paths are generated starting from both the light sources and the camera. The partial camera paths and light paths are then connected to form complete paths from the camera to a light. In photon mapping, when a light path intersects scene geometry, the intersection point, the photon's color, and its incoming direction are cached in a photon map. When a camera path intersects scene geometry, the photon map is queried for light-path intersections within a specified radius around the camera path's intersection point, and the queried data is then used in rendering.
There are four rendering modes in the project.
The first is rasterization mode. This mode is physically-based(ish) and uses the Phong reflection model for shading.
The second is path tracing mode, which runs on the GPU and uses compute shaders to generate the output image. This mode works like most other path tracers and is based on Apple's ray tracing project.
The third mode is photon rasterization mode, which visualizes the photons in the photon map. A caustic is generated on the inside of the ring, caused by light reflecting off the ring onto the ground.
The fourth mode is photon mapping mode, which combines the photon map with path tracing to render caustics.
The photon map is generated by tracing the path a photon takes from the light source through the scene for a specified number of bounces. My photon map ignores the initial emission of photons into the scene and only begins recording photons after the first bounce. I did this because I wanted to use the photon map for indirect lighting only and allow the path tracer to handle the direct lighting.
After all of the photons are generated, they are turned into 'photon triangles': tiny triangles centered at each photon's position. The triangles are fed into a Metal MPSTriangleAccelerationStructure. This allows the program to gather photons by casting rays around the gather point, and enables GPU photon gathering without the need for me to implement a GPU-friendly spatial data structure.
When a ray intersects the scene, multiple 'gather rays' are generated above the intersection point, directed down onto the surface. These gather rays are intersected against the photon triangles; when an intersection occurs, the color of that photon is recorded in a large caustic texture. The colors of all the gather rays are summed together via a texture reducer, and the final color is written to a texture, which is then sent to the path tracing shading compute function, where it is added to the path traced result.
Caustic Texture Generation
Gather rays around the same gather point are rendered into pixel tiles in a large caustic texture. The size of each tile depends on the number of gather rays. Then, the pixels in each tile are summed to a final pixel value that represents the value of the caustic for the gather point. This pixel summation is done via a 'texture reducer', which essentially mipmaps the large texture. Instead of using Metal's mipmapping functionality, I implemented my own version, because I wanted to preserve all color data from the large caustic texture and worried that a standard texture could not provide the precision needed to accomplish this.
My texture reducer reads in the large caustic texture and converts each pixel to three unsigned integer values, which are stored as a Color data structure in a buffer. Then, a compute shader repeatedly adds groups of four Color structures together and writes the sums to an output buffer. Once the buffer is reduced to the desired size, another kernel function converts the Colors into the final caustic texture, performing any desired scaling or averaging on the pixels.