COS 526 - Advanced Computer Graphics
Fall 2018
Assignment 3 - Photon Mapping
Part I due Fri, Dec 14
Part II due Tue, Jan 15
In this assignment you will implement photon mapping, an algorithm for
image synthesis with global illumination. Part I of the assignment is
to implement a basic system, while part II involves implementing an
extension of your choosing, and producing a realistic rendering of
some real-world object or scene.
I. Implementing a Photon Mapper
Photon Tracing:
- Photon emission: Implement code to emit photons in random
directions from every light source in a scene. The total number of
photons emitted for each light source should be proportional to the
power of the light source (so that each photon carries approximately
equal power), and the distribution of photons should be proportional
to the power in each direction -- e.g., for spot lights
(section 2.1.1 in Jensen01).
- Photon scattering: Trace photons via reflections and
transmissions through the scene. At each ray-surface intersection,
randomly generate a secondary ray along a direction of diffuse
reflection, specular reflection, transmission, or absorption with
probability proportional to kd, ks, kt, and 1 - (kd + ks + kt), respectively
(section 2.1.2 in Jensen01). A sketch of the emission/scattering loop
appears after this list.
- Russian Roulette: At each surface intersection, terminate rays
with probability p (e.g., p=0.5) and multiply the power of
surviving rays by 1/(1-p)
(section 2.1.2 in Jensen01).
See section 8.5 of these
SIGGRAPH course notes for details.
- Photon storage: Store photon-surface intersections in a kd-tree,
retaining the position, incident direction, and power of each photon.
(section 2.1.3 in Jensen01).
You can use your code from assignment 2, or the R3Kdtree class in R3Shapes
to implement this feature.
- BRDF importance sampling: Select the directions of reflected
and transmitted rays with probabilities proportional to the Phong BRDF of
the scattering surface. See Jason
Lawrence's notes for details; a Phong-lobe sampling sketch appears after this list.
- Multiple photon maps: Implement separate photon
maps for global (L{S|D}*D) and caustic (LS+D) ray paths
(section 2.1.5 in Jensen01).
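A minimal C++ sketch of the emission, scattering, and Russian-roulette steps
described above. The types and helpers here (Vec3, Hit, IntersectScene,
SampleDiffuseDirection, SampleSpecularDirection, TransmitDirection) are
hypothetical stand-ins for whatever your own code or the provided R3 classes
supply -- this shows the control flow only, not the skeleton's actual API.

    // Photon-tracing sketch (hypothetical types; not the skeleton's API).
    #include <algorithm>
    #include <cmath>
    #include <cstdlib>
    #include <utility>
    #include <vector>

    struct Vec3 { double x, y, z; };
    struct Material { Vec3 kd, ks, kt; };   // diffuse, specular, transmissive
    struct Hit { bool valid; Vec3 position, normal; Material material; };
    struct Photon { Vec3 position, direction, power; };   // what gets stored

    // Stand-ins for routines implemented elsewhere (scene intersection and
    // direction sampling); declared here only so the sketch reads top to bottom.
    Hit IntersectScene(const Vec3& origin, const Vec3& dir);
    Vec3 SampleDiffuseDirection(const Vec3& normal);                  // cosine-weighted
    Vec3 SampleSpecularDirection(const Vec3& in, const Vec3& normal); // e.g., Phong lobe
    Vec3 TransmitDirection(const Vec3& in, const Vec3& normal);       // refraction

    double Rand01() { return (double)rand() / RAND_MAX; }
    double Avg(const Vec3& c) { return (c.x + c.y + c.z) / 3.0; }
    Vec3 Scale(const Vec3& a, const Vec3& k, double p) {              // a * k / p, per channel
        return Vec3{a.x * k.x / p, a.y * k.y / p, a.z * k.z / p};
    }

    std::vector<Photon> global_photons;   // later inserted into the kd-tree

    // Trace one photon: store it at each diffuse surface, then choose diffuse
    // reflection, specular reflection, transmission, or absorption with
    // probability proportional to kd, ks, kt, and 1 - (kd + ks + kt).
    // The probabilistic absorption is the Russian-roulette termination;
    // surviving photons are scaled by (reflectance / event probability) so
    // energy is preserved on average (Jensen, section 2.1.2).
    void TracePhoton(Vec3 origin, Vec3 dir, Vec3 power) {
        for (;;) {
            Hit hit = IntersectScene(origin, dir);
            if (!hit.valid) return;

            double pd = Avg(hit.material.kd);
            double ps = Avg(hit.material.ks);
            double pt = Avg(hit.material.kt);

            // Store photons only at surfaces with a diffuse component.
            if (pd > 0.0) global_photons.push_back(Photon{hit.position, dir, power});

            double xi = Rand01();
            if (xi < pd) {
                power = Scale(power, hit.material.kd, pd);
                dir = SampleDiffuseDirection(hit.normal);
            } else if (xi < pd + ps) {
                power = Scale(power, hit.material.ks, ps);
                dir = SampleSpecularDirection(dir, hit.normal);
            } else if (xi < pd + ps + pt) {
                power = Scale(power, hit.material.kt, pt);
                dir = TransmitDirection(dir, hit.normal);
            } else {
                return;   // absorbed
            }
            origin = hit.position;
        }
    }

    // Uniform direction on the unit sphere (isotropic point light).
    Vec3 RandomSphereDirection() {
        double z = 1.0 - 2.0 * Rand01();
        double phi = 2.0 * M_PI * Rand01();
        double r = sqrt(fmax(0.0, 1.0 - z * z));
        return Vec3{r * cos(phi), r * sin(phi), z};
    }

    // Each light emits photons in proportion to its power, and each photon
    // carries (light power / photons emitted from that light).
    void EmitPhotons(const std::vector<std::pair<Vec3, Vec3> >& lights,  // (position, power)
                     int total_photons) {
        double total_power = 0.0;
        for (size_t i = 0; i < lights.size(); i++) total_power += Avg(lights[i].second);
        for (size_t i = 0; i < lights.size(); i++) {
            int n = std::max(1, (int)(total_photons * Avg(lights[i].second) / total_power));
            Vec3 phi{lights[i].second.x / n, lights[i].second.y / n, lights[i].second.z / n};
            for (int j = 0; j < n; j++)
                TracePhoton(lights[i].first, RandomSphereDirection(), phi);
        }
    }

For spot lights, replace RandomSphereDirection with a sampler restricted to the
spot cone (and weighted by the spot falloff), as described in section 2.1.1 of
Jensen01.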
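For BRDF importance sampling, below is a minimal sketch of sampling the Phong
specular lobe using the standard inversion formulas (as in Lawrence's notes):
with exponent n and uniform random numbers u1, u2, take cos(alpha) = u1^(1/(n+1))
and phi = 2*pi*u2, then rotate the sampled direction so it is centered on the
mirror-reflection direction R. The Vec3 type and helper names are hypothetical.

    #include <cmath>
    #include <cstdlib>

    struct Vec3 { double x, y, z; };

    Vec3 Cross(const Vec3& a, const Vec3& b) {
        return Vec3{a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
    }
    Vec3 Normalize(const Vec3& v) {
        double len = sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        return Vec3{v.x / len, v.y / len, v.z / len};
    }

    // Sample a direction with pdf proportional to cos^n(alpha) about the
    // mirror-reflection direction R (Phong specular lobe).
    Vec3 SamplePhongLobe(const Vec3& R, double n) {
        double u1 = (double)rand() / RAND_MAX;
        double u2 = (double)rand() / RAND_MAX;
        double cos_a = pow(u1, 1.0 / (n + 1.0));   // alpha = acos(u1^(1/(n+1)))
        double sin_a = sqrt(fmax(0.0, 1.0 - cos_a * cos_a));
        double phi = 2.0 * M_PI * u2;

        // Orthonormal basis (U, V, R) around the mirror direction.
        Vec3 up = fabs(R.z) < 0.9 ? Vec3{0.0, 0.0, 1.0} : Vec3{1.0, 0.0, 0.0};
        Vec3 U = Normalize(Cross(up, R));
        Vec3 V = Cross(R, U);

        double x = sin_a * cos(phi), y = sin_a * sin(phi);
        return Vec3{U.x * x + V.x * y + R.x * cos_a,
                    U.y * x + V.y * y + R.y * cos_a,
                    U.z * x + V.z * y + R.z * cos_a};
    }

In practice you should reject (and resample) directions that fall below the
surface, which can happen when the lobe straddles the tangent plane.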
Rendering:
- Camera ray tracing: Generate one or more rays from the camera eye
point through each pixel. Trace them through the scene with
reflections and transmissions at surface intersections -- i.e., at
each ray-surface intersection, randomly generate a secondary ray along
a direction of diffuse reflection, specular reflection, transmission,
or absorption using importance sampling
(section 2.4 in Jensen01).
- Radiance estimation: Use the kd-tree to find the N closest
photons for each ray-surface intersection. Estimate the radiance traveling
along the ray towards the camera from the power of those photons
(section 2.3.1 in Jensen01). A sketch of this estimate appears after this list.
- Pixel integration: Trace multiple rays per pixel and
average the radiance computed for all rays to estimate the radiance to
store in the output image for each pixel. Compare the results with
different numbers of rays per pixel (N).
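To make the radiance estimate and pixel integration concrete, below is a
minimal sketch of gathering the N nearest photons and averaging several
jittered rays per pixel. The estimate is L ~= sum_p f_r(x, w_p, w) * dPhi_p / (pi * r^2),
with f_r = rho/pi for a Lambertian surface (Jensen, section 2.3.1).
FindNearestPhotons and TraceRay are hypothetical stand-ins for your kd-tree
query and camera ray tracer.

    // Radiance estimate + per-pixel averaging sketch (hypothetical helpers).
    #include <cmath>
    #include <cstdlib>
    #include <vector>

    struct Vec3 { double x, y, z; };
    struct Photon { Vec3 position, direction, power; };

    double Rand01() { return (double)rand() / RAND_MAX; }

    // Hypothetical kd-tree query: the k photons nearest to x, plus the squared
    // radius of the sphere enclosing them (stand-in for your kd-tree search).
    std::vector<Photon> FindNearestPhotons(const Vec3& x, int k, double* r_squared);

    // Hypothetical camera ray tracer returning the radiance for one ray through
    // image-plane coordinates (px, py).
    Vec3 TraceRay(double px, double py);

    // Radiance estimate at point x with normal n and diffuse albedo rho:
    //   L ~= sum_p f_r * power_p / (pi * r^2),  f_r = rho / pi (Lambertian).
    Vec3 EstimateRadiance(const Vec3& x, const Vec3& n, const Vec3& rho, int k) {
        double r2 = 0.0;
        std::vector<Photon> photons = FindNearestPhotons(x, k, &r2);
        if (photons.empty() || r2 <= 0.0) return Vec3{0.0, 0.0, 0.0};
        Vec3 sum{0.0, 0.0, 0.0};
        for (size_t i = 0; i < photons.size(); i++) {
            const Photon& p = photons[i];
            // Count only photons arriving from the front side of the surface.
            double cos_in = -(p.direction.x * n.x + p.direction.y * n.y + p.direction.z * n.z);
            if (cos_in <= 0.0) continue;
            sum.x += (rho.x / M_PI) * p.power.x;
            sum.y += (rho.y / M_PI) * p.power.y;
            sum.z += (rho.z / M_PI) * p.power.z;
        }
        double area = M_PI * r2;   // area of the disc enclosing the k photons
        return Vec3{sum.x / area, sum.y / area, sum.z / area};
    }

    // Pixel integration: average several jittered rays per pixel.
    Vec3 PixelRadiance(int ix, int iy, int rays_per_pixel) {
        Vec3 sum{0.0, 0.0, 0.0};
        for (int s = 0; s < rays_per_pixel; s++) {
            Vec3 L = TraceRay(ix + Rand01(), iy + Rand01());
            sum.x += L.x; sum.y += L.y; sum.z += L.z;
        }
        return Vec3{sum.x / rays_per_pixel, sum.y / rays_per_pixel, sum.z / rays_per_pixel};
    }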
Visualization: (helpful for debugging!)
- Photon map visualization: Visualize photons stored in your
photon map(s) -- e.g., show positions, normals, and powers of photons.
- Ray tracing visualization: Visualize ray paths traced from the
camera -- e.g., show line segments between the camera and successive surface
intersections for a random sampling of rays.
Getting started:
To get started, download cos526_photon.zip
(courtesy Tom Funkhouser).
This contains some scenes to get you started, as well as C++ code providing
the basic infrastructure for reading scenes, computing ray intersections,
etc. It also provides a simple program (scnview) for viewing scenes using
OpenGL. You will probably need to augment this program to include command
line arguments of your own to turn on and off specific features and/or
provide parameters for specific applications.
The skeleton code is able to read scenes in a simple scene file format. This format was created to
provide the features required by this assignment -- a simple scene
graph along with materials, lights, and cameras. We provide several
scenes in the input subdirectory of the zip file
that you can use to test basic functionality of your program. However,
these scenes are not enough to test all the interesting lighting
effects your program should demonstrate. So, you should design your
own scenes for testing/demonstration and include them in your writeup
file.
If you are using a programming language other than C++, feel free to use
the provided code as a starting point, and translate to your language of
choice. Also feel free to use any ray-tracing code you may already have
as reference when implementing your photon mapper.
Useful resources:
Example solutions from previous years:
- Photon Mapping Assignment Writeup, Nik
Sigatapu, COS 526, Fall 2014.
- Photon Mapping Assignment Writeup,
Maciej Halber, COS 526, Fall 2014.
- Photon Mapping Assignment Writeup,
Brian Matejek, COS 526, Fall 2014.
- Photon Mapping Assignment
Writeup, Nora Willet, COS 526, Fall 2014.
- Photon Mapping Assignment Writeup,
Abuhair Saparov, COS 526, Fall 2012.
- Photon Mapping Assignment Writeup,
Edward Zhang, COS 526, Fall 2012.
- Photon Mapping Assignment Writeup, Ohad
Fried, COS 526, Fall 2012.
Submitting Part I
Part I of this assignment is due Friday, December 14.
Please submit a single .zip file containing your code and a
writeup describing your implementation and showcasing some pretty pictures!
The Dropbox link to submit the assignment is
here.
II. Implementing Advanced Effects
Implement at least one of the following:
- Filtering: Implement disk, cone, and/or Gaussian filtering methods
for weighting photons during radiance estimation and compare the
results of different methods (section 2.3.2 in Jensen01). A cone-filter
sketch appears after this list.
- Participating media: Implement volume photon
maps that represent scattering in participating media (e.g., fog)
(sections 2.1.4 and 2.3.3 in Jensen01).
- Projection maps: Use projection maps to sample photons leaving light
sources in the directions of scene geometry (section 2.1.1 in
Jensen01).
- Something else: Extend your photon mapper to handle subsurface
scattering, motion blur, or another feature described in
Jensen01.
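For the filtering option, here is a minimal sketch of the cone filter from
Jensen01 (section 2.3.2): each photon is weighted by w_p = 1 - d_p / (k * r),
and the weighted sum is normalized by (1 - 2/(3k)) * pi * r^2 instead of
pi * r^2. The function names are hypothetical.

    #include <cmath>

    // Cone-filter weight for a photon at distance d from the estimate point,
    // where r is the radius enclosing the gathered photons and k >= 1 is the
    // filter constant:  w_p = 1 - d / (k * r).
    double ConeWeight(double d, double r, double k) {
        return 1.0 - d / (k * r);
    }

    // With the cone filter, divide the weighted photon sum by this
    // normalization instead of pi * r^2 (for k = 1 it reduces to pi * r^2 / 3).
    double ConeNormalization(double r, double k) {
        return (1.0 - 2.0 / (3.0 * k)) * M_PI * r * r;
    }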
Rendering a realistic object:
- Pick a real-world object or scene, and use your system to produce a
rendering or short animation that is as photorealistic as possible. See
this page
for inspiration. Note that most of your effort should be spent on modeling
optical phenomena, not on geometric modeling.
Submitting Part II
Part II of this assignment is due on Dean's date, Tuesday, Jan 15.
Per university policy,
no extensions of this deadline will be available.
Please submit a single .zip file containing your code and a
writeup describing your implementation and showcasing some pretty pictures!
The Dropbox link to submit the assignment is
here.
Last update
19-Nov-2018 13:57:44
smr at princeton edu