Commit ae2b9dc5 authored by TheNumbat

Release new version

Features:
    - Particle systems can now specify a maximum dt per step
    - Animation key-framing & timing system now supports objects with simulation
    - Mixture/multiple importance sampling for correct low-variance direct lighting
        - New BSDF, point light, and environment light APIs that separate sampling, evaluation, and pdf
        - Area light sampling infrastructure
        - Removed rectangle area lights; all area lights are now emissive meshes
        - Reworked PathTracer tasks 4-6, adjusted/improved instructions for the other tasks

Bug fixes:
    - Use full rgb/srgb conversion equation instead of approximation
    - Material albedo now specified in srgb (matching the displayed color)
    - ImGui input fields becoming inactive no longer apply to a newly selected object
    - Rendering animations with path tracing correctly steps simulations each frame
    - Rasterization-based renderer no longer inherits projection matrix from window
    - Scene file format no longer corrupts particle emitter enable states
    - Documentation videos no longer autoplay
    - Misc. refactoring
    - Misc. documentation website improvements
parent afa3f68f
---
layout: default
title: Dielectrics and Transmission
parent: (Task 6) Materials
grand_parent: "A3: Pathtracer"
permalink: /pathtracer/dielectrics_and_transmission
---
# Dielectrics and Transmission
## Fresnel Equations for a Dielectric
The [Fresnel Equations](https://en.wikipedia.org/wiki/Fresnel_equations) (another [link](http://hyperphysics.phy-astr.gsu.edu/hbase/phyopt/freseq.html) here) describe the amount of reflection from a surface. The description below is an approximation for dielectric materials (materials that don't conduct electricity). In this assignment you're asked to implement a glass material, which is a dielectric.
In the description below, <img src="dielectric_eq1.png" width="18"> and <img src="dielectric_eq2.png" width="15"> refer to the index of refraction of the medium containing the incoming ray and the angle of that ray to the surface normal, while <img src="dielectric_eq3.png" width="18"> and <img src="dielectric_eq4.png" width="15"> refer to the index of refraction of the new medium and the angle of the transmitted ray to the surface normal.
The Fresnel equations state that reflection from a surface is a function of the surface's index of refraction, as well as the polarization of the incoming light. Since our renderer doesn't account for polarization, we'll apply the common approximation of averaging the reflectances of the perpendicular- and parallel-polarized components:
<img src="dielectric_eq5.png" width="200">
The parallel and perpendicular terms are given by:
<img src="dielectric_eq6.png" width="200">
<img src="dielectric_eq7.png" width="200">
Therefore, for a dielectric material, the fraction of reflected light will be given by <img src="dielectric_eq8.png" width="18">, and the amount of transmitted light will be given by <img src="dielectric_eq9.png" width="50">.
Alternatively, you may compute <img src="dielectric_eq8.png" width="18"> using [Schlick's approximation](https://en.wikipedia.org/wiki/Schlick%27s_approximation).
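For reference, here is a minimal sketch of Schlick's approximation in C++. The function name and signature are illustrative, not part of Scotty3D's API:

```cpp
#include <algorithm>
#include <cmath>

// Schlick's approximation to the Fresnel reflectance of a dielectric.
// n1 is the index of refraction of the incident medium, n2 that of the new
// medium, and cos_theta the cosine of the incident angle (measured from the
// surface normal). Total internal reflection, which can occur when exiting a
// denser medium, must still be handled separately.
float schlick_fresnel(float n1, float n2, float cos_theta) {
    float r0 = (n1 - n2) / (n1 + n2);
    r0 = r0 * r0; // reflectance at normal incidence
    float c = 1.0f - std::clamp(cos_theta, 0.0f, 1.0f);
    return r0 + (1.0f - r0) * c * c * c * c * c; // R0 + (1 - R0)(1 - cos_theta)^5
}
```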
## Distribution Function for Transmitted Light
We described the BRDF for perfect specular reflection in class; however, we did not discuss the distribution function for transmitted light. Since refraction "spreads" or "condenses" a beam (unlike perfect reflection), the radiance along the ray changes across a refraction event. In this assignment, you should use Snell's Law to compute the direction of refracted rays, and use the following distribution function to compute the radiance of transmitted rays. We refer you to Pharr, Jakob, and Humphreys's book [Physically Based Rendering](http://www.pbr-book.org/) for a derivation based on Snell's Law and the relation <img src="dielectric_eq10.png" width="150">. (But you are more than welcome to attempt the derivation on your own!)
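As a sketch of the geometry, the refracted direction can be computed from Snell's Law as below. This assumes a local shading frame with the normal along +y and unit-length directions pointing away from the surface; the conventions, types, and function name are illustrative, not necessarily Scotty3D's:

```cpp
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

// Computes the refracted direction for an outgoing direction `out_dir`
// (pointing away from the surface) in a frame where the normal is (0,1,0).
// `ior` is the index of refraction of the material relative to air.
// Returns std::nullopt on total internal reflection.
std::optional<Vec3> refract_dir(Vec3 out_dir, float ior) {
    bool entering = out_dir.y > 0.0f;        // ray arrives from outside the surface
    float eta = entering ? 1.0f / ior : ior; // ratio n_i / n_t
    float cos_i = std::fabs(out_dir.y);
    float sin2_t = eta * eta * (1.0f - cos_i * cos_i); // Snell: sin(t) = eta * sin(i)
    if (sin2_t >= 1.0f) return std::nullopt;           // total internal reflection
    float cos_t = std::sqrt(1.0f - sin2_t);
    // The transmitted ray continues on the opposite side of the surface.
    return Vec3{-eta * out_dir.x, entering ? -cos_t : cos_t, -eta * out_dir.z};
}
```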
---
layout: default
title: (Task 6) Direct Lighting
permalink: /pathtracer/direct_lighting
parent: "A3: Pathtracer"
---
# (Task 6) Direct Lighting
After tasks 4 and 5, `Pathtracer::sample_direct_lighting` is no different than the indirect case: it simply samples a ray from the BSDF scattering function and traces it into the scene to gather direct lighting. In this task, you will modify the sampling algorithm by splitting samples between BSDF scatters and the surface of area lights, a procedure commonly known as _next event estimation_.
First consider why sampling lights is useful. Currently, we are only importance sampling the BSDF term of the rendering equation (in which we have included the cosine term). However, each sample we take is also multiplied by incoming radiance. If we could somehow sample the full product, our Monte Carlo estimator would exhibit far lower variance. Sampling lights is one way to importance sample incoming radiance, but there are some caveats.
- Importance sampling indirect lighting is hard. Doing so in an unbiased fashion requires more advanced integration schemes like bidirectional path tracing or Metropolis light transport. In Scotty3D, we will only add sampling for direct lighting, because we know a priori which directions it can come from: those that point at emissive objects. Although this does not importance sample the full distribution of incoming radiance, it is a better approximation.
- Specular BSDFs, such as mirror and glass, will not be improved by directly sampling lights: any ray that is not a perfect reflection or refraction has zero contribution. However, scenes using continuous distributions like the Lambertian, Blinn-Phong, or GGX BSDFs can benefit tremendously, especially in the presence of small and/or intense area lights.
### How do we do it?
It's tempting to simply compute an estimate of direct lighting from BSDF samples, another from sampling the surface of lights, and average the results. However, variance is additive: averaging two high-variance estimators does not give back a low-variance one. Instead, we can create a single new distribution that has the average PDF of the two inputs.
To do so, simply uniformly randomly choose which strategy to use before sampling from it as usual. Any given sample could then have been generated from either distribution, so the PDF at the sample is the average over each individual strategy's PDF. This is called _mixture sampling_, or more properly _single-sample multiple importance sampling_.
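Written out, if p_bsdf and p_light are the PDFs of the two strategies, the mixture PDF and the resulting single-sample estimate (recalling that our BSDF term f already includes the cosine factor) are:

```latex
% Mixture PDF for two strategies chosen with equal probability:
p_{\mathrm{mix}}(\omega) = \tfrac{1}{2}\, p_{\mathrm{bsdf}}(\omega)
                         + \tfrac{1}{2}\, p_{\mathrm{light}}(\omega)

% Monte Carlo estimate for a sample \omega drawn from either strategy:
\frac{f(\omega)\, L_i(\omega)}{p_{\mathrm{mix}}(\omega)}
```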
### Why is this helpful?
Intuitively, consider that the average of multiple PDFs is itself a PDF: it is non-negative and integrates to one. The average PDF assigns higher weight to any region that one of its components did, producing a distribution that combines the strengths of each. We can then use the improved distribution to sample and accumulate Monte Carlo estimates as usual.
Previously, if we chose a low-weight BSDF sample that just happened to hit a bright light, we would get a huge (high-variance) result after dividing (large) incoming light by the (small) PDF. Now, regardless of whether that sample came from `BSDF::scatter` or `Pathtracer::sample_area_lights`, the PDF of the sample cannot be small, because its `Pathtracer::area_lights_pdf` component is not small.
For a more rigorous explanation of multiple importance sampling, refer to [Physically Based Rendering](https://www.pbr-book.org/3ed-2018/) sections 13.10 and 14.3.
- Note: because the mixture PDF now depends on both strategies, we need to separate the concepts of sampling a distribution and evaluating its PDF. This is why we separate `BSDF::scatter`/`BSDF::pdf` and `Pathtracer::sample_area_lights`/`Pathtracer::area_lights_pdf`.
Lastly, note that when the true light distribution heavily favors only one strategy (e.g. a highly specular but still continuous BSDF, or an intense but small area light), we will end up wasting time on samples we got from the wrong strategy. Ideally, we could adaptively choose how much contribution we take from each option, which is known as _multi-sample multiple importance sampling_ (see Extra Credit).
---
Finally, let's upgrade `Pathtracer::sample_direct_lighting`. Start by reading the following functions:
- `Pathtracer::sample_area_lights` takes a world-space position, returning a world-space direction pointing towards an area light.
- `Pathtracer::area_lights_pdf` takes a world-space position and direction, returning the PDF for generating the direction at the point from the area lights in the scene.
The direct lighting procedure should now follow these steps:
- If the BSDF is discrete, we don't need to bother sampling lights: the behavior should be the same as task 4.
- Otherwise, we should randomly choose whether we get our sample from `BSDF::scatter` or `Pathtracer::sample_area_lights`. Choose between the strategies with equal probability.
- Create a new world-space ray (the "shadow ray") and call `Pathtracer::trace` to get incoming light. You should modify the ray's `time_bounds` so that it does not intersect the surface at time = 0. We are still only interested in the emissive component, so the ray depth can be zero.
- Add the estimate of incoming light, scaled by BSDF attenuation. Given a sample, we don't know whether it came from the BSDF or the light, so you should use `BSDF::evaluate`, `BSDF::pdf`, and `Pathtracer::area_lights_pdf` to compute the proper weighting, as in the sketch below. What is the PDF of our sample, given that it comes from the combined distribution?
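Putting the steps together, a simplified sketch might look like the following. Every type and helper here is a hypothetical stand-in with an invented signature (the discrete-BSDF branch from task 4 is omitted); consult the real `BSDF` and `Pathtracer` interfaces when implementing:

```cpp
#include <random>

struct Vec3 { float x, y, z; };
struct Spectrum {
    float r, g, b;
    Spectrum operator*(const Spectrum& o) const { return {r * o.r, g * o.g, b * o.b}; }
    Spectrum operator*(float s) const { return {r * s, g * s, b * s}; }
};

// Stand-ins for BSDF::scatter/evaluate/pdf, the area light sampling
// functions, and a depth-zero (emissive-only) shadow ray trace:
Vec3 bsdf_scatter(std::mt19937& rng);
Spectrum bsdf_evaluate(Vec3 dir);
float bsdf_pdf(Vec3 dir);
Vec3 sample_area_lights(std::mt19937& rng, Vec3 pos);
float area_lights_pdf(Vec3 pos, Vec3 dir);
Spectrum trace_emissive(Vec3 pos, Vec3 dir); // shadow ray; epsilon lower time bound

Spectrum direct_light_estimate(std::mt19937& rng, Vec3 pos) {
    // Choose a strategy with equal probability, then sample it as usual.
    std::uniform_real_distribution<float> coin(0.0f, 1.0f);
    Vec3 dir = (coin(rng) < 0.5f) ? bsdf_scatter(rng) : sample_area_lights(rng, pos);

    // Either strategy could have produced `dir`, so the PDF of the sample is
    // the average of the two strategies' PDFs at `dir`.
    float pdf = 0.5f * bsdf_pdf(dir) + 0.5f * area_lights_pdf(pos, dir);
    if (pdf <= 0.0f) return {0.0f, 0.0f, 0.0f};

    // Scale incoming emissive light by BSDF attenuation, divide by the PDF.
    return bsdf_evaluate(dir) * trace_emissive(pos, dir) * (1.0f / pdf);
}
```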
---
## Tips
- The converged output of all scenes should **not** change with the addition of task 6. If it does, you've done something wrong.
- We do not provide much in the way of reference images: make your own scenes demonstrating which situations area light sampling is and isn't well suited for.
- Use the ray log to visually debug what proportion of your shadow rays are being directed at lights and where they are going.
---
## Reference Results
You will now be able to render scenes featuring area lights using far fewer samples and still get good results. The effect will be particularly pronounced when small and/or intense area lights are used with Lambertian materials (e.g. see task 7, grace.exr).
Here are `cbox.dae` and `cbox_lambertian.dae` without (left) and with (right) area light sampling (32 samples, max depth = 8):
<center><img src="images/cbox_importance.png"></center>
---
## Extra Credit
- Upgrade the mixture sampling procedure to use proper multiple importance sampling. This will involve always generating both a light sample and a BSDF sample, then weighting their direct light contributions by the balance or power heuristic. You may also want to re-combine the direct and indirect sampling procedures. Refer to [Physically Based Rendering](http://www.pbr-book.org/3ed-2018/) section 14.3 for more information.
- Currently, computing the area light PDF involves checking whether the sampled ray intersects each individual emissive triangle. Improve the brute-force approach by building and querying a BVH containing only emissive triangles. You will need a traversal algorithm that returns all triangles intersected by a ray, not just the closest.
- (Advanced) Implement a more powerful integration scheme such as bidirectional path tracing or photon mapping. Refer to [Physically Based Rendering](http://www.pbr-book.org/3ed-2018/) chapter 16.
- (Advanced) Implement [specular next event estimation](http://rgl.epfl.ch/publications/Zeltner2020Specular) for the mirror and glass materials.
# (Task 7) Environment Lighting
## Walkthrough
<video width="750" height="500" controls>
<source src="videos/Task7_EnvMap.mp4" type="video/mp4">
</video>
The final task of this assignment will be to implement a new type of light source: an infinite environment light. An environment light is a light that supplies incident radiance (really, the light intensity dPhi/dOmega) from all directions on the sphere. Rather than using a predefined collection of explicit lights, an environment light is a capture of the actual incoming light from some real-world scene; rendering using environment lighting can be quite striking.
The intensity of incoming light from each direction is defined by a texture map parameterized by phi and theta, as shown below.
![envmap_figure](figures/envmap_figure.jpg)
In this task you will implement `Env_Map::sample`, `Env_Map::pdf`, and `Env_Map::evaluate` in `student/env_light.cpp`. You'll start with uniform sampling to get things working, and then move on to a more advanced implementation that uses **importance sampling** to significantly reduce variance in rendered images.
---
## Step 1: Uniformly sampling the environment map
To get things working, your first implementation of `Env_Map::sample` will be quite simple. First, check out the interface of `Env_Map` in `rays/env_light.h`. For `Env_Map`, the `image` field is a `HDR_Image`, which contains the size and pixels of the environment map. The `HDR_Image` interface may be found in `util/hdr_image.h`.
Second, implement the uniform sphere sampler in `student/samplers.cpp`. Implement `Env_Map::sample` using `uniform_sampler` to generate a direction uniformly at random. Implement `Env_Map::pdf` by returning the PDF of a uniform sphere distribution.
Lastly, in `Env_Map::evaluate`, convert the given direction to image coordinates (phi and theta) and look up the appropriate radiance value in the texture map using **bilinear interpolation**.
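The uniform sampler itself can be only a few lines. Here is a minimal sketch of uniform sphere sampling and its PDF using the standard library's RNG; Scotty3D's own sampler interface and vector type will differ:

```cpp
#include <algorithm>
#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };

constexpr float PI = 3.14159265358979f;

// Uniform direction on the unit sphere: z = cos(theta) uniform in [-1, 1],
// phi uniform in [0, 2pi).
Vec3 uniform_sphere_sample(std::mt19937& rng) {
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    float z = 1.0f - 2.0f * u01(rng);
    float r = std::sqrt(std::max(0.0f, 1.0f - z * z));
    float phi = 2.0f * PI * u01(rng);
    return Vec3{r * std::cos(phi), r * std::sin(phi), z};
}

// Every direction is equally likely, so the PDF is one over the surface
// area of the unit sphere.
float uniform_sphere_pdf() {
    return 1.0f / (4.0f * PI);
}
```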
Since high dynamic range environment maps can be large files, we have not included them in the Scotty3D repository. You can download a set of sample environment maps [here](http://15462.courses.cs.cmu.edu/fall2015content/misc/asst3_images/asst3_exr_archive.zip).
To use a particular environment map with your scene, select `layout` -> `new light` -> `environment map` -> `add`, and select your file. For more creative environment maps, check out [Poly Haven](https://polyhaven.com/).
![envmap_gui](images/envmap_gui.png)
## Step 2: Importance sampling the environment map
Much like light in the real world, most of the energy provided by an environment light source is concentrated in the directions toward bright light sources. Therefore, it makes sense to prefer sampling directions from which incoming radiance is greatest. For environment lights with large variation in incoming light intensity, good importance sampling will significantly reduce the variance of your renderer.
The basic idea of importance sampling an image is assigning a probability to each pixel based on the total radiance coming from the solid angle it subtends.
A pixel with coordinate <img src="figures/environment_eq1.png" width="45"> subtends an area <img src="figures/environment_eq2.png" width="80"> on the unit sphere (where <img src="figures/environment_eq3.png" width="20"> and <img src="figures/environment_eq4.png" width="20"> are the angles subtended by each pixel, as determined by the resolution of the texture). Thus, the flux through a pixel is proportional to <img src="figures/environment_eq5.png" width="45">. (Since we are creating a distribution, we only care about the relative flux through each pixel, not the absolute flux.)
**Summing the flux over all pixels, then normalizing so the values sum to one, yields a discrete probability distribution over pixels in which the probability of choosing a pixel is proportional to its flux.**
The question is now how to efficiently draw samples from this discrete distribution. We recommend treating the distribution as a single vector representing the whole image (row-major). In this form, it is easy to compute its CDF: the CDF at each pixel is the sum of the PDFs of all pixels up to and including it. Once you have the CDF, you can use inversion sampling to pick out a particular index and convert it to a pixel and hence a 3D direction.
The bulk of the importance sampling algorithm lives in `Samplers::Sphere::Image` in `student/samplers.cpp`. You will need to implement the constructor, the inversion sampling function, and the PDF function, which returns the value of your PDF at a particular direction. Once these methods are complete, upgrade `Env_Map::sample` and `Env_Map::pdf` to use your new `image_sampler`.
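As a sketch of the constructor and inversion sampling, using a flat luminance array and an invented type name as stand-ins for `HDR_Image`'s actual accessors:

```cpp
#include <algorithm>
#include <cmath>
#include <random>
#include <vector>

// A flux-weighted discrete distribution over the pixels of an environment
// map, stored row-major as a single vector.
struct Image_Distribution {
    std::vector<float> pdf, cdf; // one entry per pixel, row-major
    size_t w = 0, h = 0;

    Image_Distribution(const std::vector<float>& luma, size_t w_, size_t h_)
        : pdf(w_ * h_), cdf(w_ * h_), w(w_), h(h_) {
        const float pi = 3.14159265358979f;
        float total = 0.0f;
        for (size_t y = 0; y < h; y++) {
            // Weight by solid angle: use theta at the pixel center.
            float theta = pi * (float(y) + 0.5f) / float(h);
            for (size_t x = 0; x < w; x++) {
                size_t i = y * w + x;
                pdf[i] = luma[i] * std::sin(theta); // flux ~ luminance * sin(theta)
                total += pdf[i];
            }
        }
        float running = 0.0f;
        for (size_t i = 0; i < pdf.size(); i++) {
            pdf[i] /= total;  // normalize into a discrete PDF
            running += pdf[i];
            cdf[i] = running; // CDF = prefix sum of the PDF
        }
    }

    // Inversion sampling: find the first pixel whose CDF value exceeds u.
    size_t sample(std::mt19937& rng) const {
        std::uniform_real_distribution<float> u01(0.0f, 1.0f);
        auto it = std::lower_bound(cdf.begin(), cdf.end(), u01(rng));
        if (it == cdf.end()) --it; // guard against floating-point round-off
        return size_t(it - cdf.begin());
    }
};
```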
Be sure to update your `image_sampler` to scale the returned PDF according to the Jacobian that appears when converting from one sampling distribution to the other. The PDF value that corresponds to a pixel in the HDR map should be multiplied by the Jacobian below before being returned by `Samplers::Sphere::Image::pdf`.
<center><img src="figures/env_light_sampling_jacobian_diagram.png"></center>
The Jacobian for transforming the PDF from the HDR map sampling distribution to the unit sphere sampling distribution can be thought of as two separate Jacobians: the first to a rectilinear projection of the unit sphere, and the second from the rectilinear projection to the unit sphere.
The first Jacobian scales the w x h rectangle to a 2π x π rectangle, going from (dx, dy) space to (dφ, dθ) space. Since we have a distribution that integrates to 1 over (w, h), in order to obtain a distribution that still integrates to 1 over (2π, π), we must multiply by the ratio of their areas, i.e. wh / 2π². This is the first Jacobian.
Then, to go from integrating over the rectilinear projection of the unit sphere to integrating over the unit sphere itself, we must convert from (dφ, dθ) to solid angle (dω). Since dω = sin(θ) dφ dθ, keeping the distribution normalized requires dividing by sin(θ). This is the second Jacobian.
Altogether, the final Jacobian is wh / (2π² sin(θ)).
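In code, the conversion might look like this sketch, where w and h are the dimensions of the environment map and the function name is illustrative:

```cpp
#include <cmath>

// Converts the discrete per-pixel PDF into a PDF over solid angle on the
// unit sphere by applying the Jacobian wh / (2π² sin(θ)) derived above.
float sphere_pdf_from_pixel_pdf(float pixel_pdf, float w, float h, float theta) {
    const float PI = 3.14159265358979f;
    return pixel_pdf * (w * h) / (2.0f * PI * PI * std::sin(theta));
}
```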
---
### Tips
- Remember to use the coordinate system as outlined in Task 1!
- When computing the area corresponding to a pixel, use the value of theta at the pixel center.
- Compute the PDF and CDF in the constructor of `Sampler::Sphere::Image`, storing the values in the fields `_pdf` and `_cdf`. See `rays/sampler.h`.
- `Spectrum::luma()` returns the luminance (brightness) of a Spectrum. The weight assigned to a pixel should be proportional to both its luminance and the solid angle it subtends.
- For inversion sampling, use `std::lower_bound`: it's a binary search. Read about it [here](https://en.cppreference.com/w/cpp/algorithm/lower_bound).
- If you didn't use the ray log to debug area light sampling, start using it now to visualize which directions are sampled from the environment map.
---
## Reference Results
![ennis](images/ennis.png)
![uffiz](images/uffiz.png)
![grace](images/grace.png)