<!DOCTYPE html><htmllang="en-US"><head><metacharset="UTF-8"><metahttp-equiv="X-UA-Compatible"content="IE=Edge"><title>(Task 1) Camera Rays - </title><linkrel="shortcut icon"href="/favicon.ico"type="image/x-icon"><linkrel="stylesheet"href="/assets/css/just-the-docs-default.css"><script type="text/javascript"src="/assets/js/vendor/lunr.min.js"></script><script type="text/javascript"src="/assets/js/just-the-docs.js"></script><metaname="viewport"content="width=device-width, initial-scale=1"><!-- Begin Jekyll SEO tag v2.7.1 --><title>(Task 1) Camera Rays</title><metaname="generator"content="Jekyll v4.2.0"/><metaproperty="og:title"content="(Task 1) Camera Rays"/><metaproperty="og:locale"content="en_US"/><metaname="twitter:card"content="summary"/><metaproperty="twitter:title"content="(Task 1) Camera Rays"/><script type="application/ld+json">{"headline":"(Task 1) Camera Rays","@type":"WebPage","url":"/pathtracer/camera_rays","@context":"https://schema.org"}</script><!-- End Jekyll SEO tag --></head><body><svgxmlns="http://www.w3.org/2000/svg"style="display: none;"><symbolid="svg-link"viewBox="0 0 24 24"><title>Link</title><svgxmlns="http://www.w3.org/2000/svg"width="24"height="24"viewBox="0 0 24 24"fill="none"stroke="currentColor"stroke-width="2"stroke-linecap="round"stroke-linejoin="round"class="feather feather-link"><pathd="M10 13a5 5 0 0 0 7.54.54l3-3a5 5 0 0 0-7.07-7.07l-1.72 1.71"></path><pathd="M14 11a5 5 0 0 0-7.54-.54l-3 3a5 5 0 0 0 7.07 7.07l1.71-1.71"></path></svg></symbol><symbolid="svg-search"viewBox="0 0 24 24"><title>Search</title><svgxmlns="http://www.w3.org/2000/svg"width="24"height="24"viewBox="0 0 24 24"fill="none"stroke="currentColor"stroke-width="2"stroke-linecap="round"stroke-linejoin="round"class="feather feather-search"><circlecx="11"cy="11"r="8"></circle><linex1="21"y1="21"x2="16.65"y2="16.65"></line></svg></symbol><symbolid="svg-menu"viewBox="0 0 24 24"><title>Menu</title><svgxmlns="http://www.w3.org/2000/svg"width="24"height="24"viewBox="0 0 24 24"fill="none"stroke="currentColor"stroke-width="2"stroke-linecap="round"stroke-linejoin="round"class="feather feather-menu"><linex1="3"y1="12"x2="21"y2="12"></line><linex1="3"y1="6"x2="21"y2="6"></line><linex1="3"y1="18"x2="21"y2="18"></line></svg></symbol><symbolid="svg-arrow-right"viewBox="0 0 24 24"><title>Expand</title><svgxmlns="http://www.w3.org/2000/svg"width="24"height="24"viewBox="0 0 24 24"fill="none"stroke="currentColor"stroke-width="2"stroke-linecap="round"stroke-linejoin="round"class="feather feather-chevron-right"><polylinepoints="9 18 15 12 9 6"></polyline></svg></symbol><symbolid="svg-doc"viewBox="0 0 24 24"><title>Document</title><svgxmlns="http://www.w3.org/2000/svg"width="24"height="24"viewBox="0 0 24 24"fill="none"stroke="currentColor"stroke-width="2"stroke-linecap="round"stroke-linejoin="round"class="feather feather-file"><pathd="M13 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V9z"></path><polylinepoints="13 2 13 9 20 9"></polyline></svg></symbol></svg><divclass="side-bar"><divclass="site-header"><ahref="/"class="site-title lh-tight"></a><ahref="#"id="menu-button"class="site-button"><svgviewBox="0 0 24 24"class="icon"><usexlink:href="#svg-menu"></use></svg></a></div><navrole="navigation"aria-label="Main"id="site-nav"class="site-nav"><ulclass="nav-list"><liclass="nav-list-item"><ahref="/"class="nav-list-link">Home</a></li><liclass="nav-list-item"><ahref="/git/"class="nav-list-link">GitHub Setup</a></li><liclass="nav-list-item"><ahref="/build/"class="nav-list-link">Building Scotty3D</a></li><liclass="nav-list-item"><ahref="#"class="nav-list-expander"><svgviewBox="0 0 24 24"><usexlink:href="#svg-arrow-right"></use></svg></a><ahref="/guide/"class="nav-list-link">User Guide</a><ulclass="nav-list "><liclass="nav-list-item "><ahref="/guide/animate_mode/"class="nav-list-link">Animate</a></li><liclass="nav-list-item "><ahref="/guide/layout_mode/"class="nav-list-link">Layout</a></li><liclass="nav-list-item "><ahref="/guide/model_mode/"class="nav-list-link">Model</a></li><liclass="nav-list-item "><ahref="/guide/render_mode/"class="nav-list-link">Render</a></li><liclass="nav-list-item "><ahref="/guide/rigging_mode/"class="nav-list-link">Rig</a></li><liclass="nav-list-item "><ahref="/guide/simulate_mode/"class="nav-list-link">Simulate</a></li></ul></li><liclass="nav-list-item"><ahref="#"class="nav-list-expander"><svgviewBox="0 0 24 24"><usexlink:href="#svg-arrow-right"></use></svg></a><ahref="/meshedit/"class="nav-list-link">A2: MeshEdit</a><ulclass="nav-list "><liclass="nav-list-item "><ahref="/meshedit/halfedge"class="nav-list-link">Halfedge Mesh</a></li><liclass="nav-list-item "><ahref="#"class="nav-list-expander"><svgviewBox="0 0 24 24"><usexlink:href="#svg-arrow-right"></use></svg></a><ahref="/meshedit/local/"class="nav-list-link">Local Operations</a><ulclass="nav-list"><liclass="nav-list-item "><ahref="/meshedit/local/edge_flip"class="nav-list-link">Edge Flip Tutorial</a></li><liclass="nav-list-item "><ahref="/meshedit/local/bevel/"class="nav-list-link">Bevelling</a></li></ul></li><liclass="nav-list-item "><ahref="#"class="nav-list-expander"><svgviewBox="0 0 24 24"><usexlink:href="#svg-arrow-right"></use></svg></a><ahref="/meshedit/global/"class="nav-list-link">Global Operations</a><ulclass="nav-list"><liclass="nav-list-item "><ahref="/meshedit/global/catmull/"class="nav-list-link">Catmull-Clark Subdivision</a></li><liclass="nav-list-item "><ahref="/meshedit/global/remesh/"class="nav-list-link">Isotropic Remeshing</a></li><liclass="nav-list-item "><ahref="/meshedit/global/linear/"class="nav-list-link">Linear Subdivision</a></li><liclass="nav-list-item "><ahref="/meshedit/global/loop/"class="nav-list-link">Loop Subdivision</a></li><liclass="nav-list-item "><ahref="/meshedit/global/simplify/"class="nav-list-link">Simplification</a></li><liclass="nav-list-item "><ahref="/meshedit/global/triangulate/"class="nav-list-link">Triangulation</a></li></ul></li></ul></li><liclass="nav-list-item active"><ahref="#"class="nav-list-expander"><svgviewBox="0 0 24 24"><usexlink:href="#svg-arrow-right"></use></svg></a><ahref="/pathtracer/"class="nav-list-link">A3: Pathtracer</a><ulclass="nav-list "><liclass="nav-list-item active"><ahref="/pathtracer/camera_rays"class="nav-list-link active">(Task 1) Camera Rays</a></li><liclass="nav-list-item "><ahref="#"class="nav-list-expander"><svgviewBox="0 0 24 24"><usexlink:href="#svg-arrow-right"></use></svg></a><ahref="/pathtracer/intersecting_objects"class="nav-list-link">(Task 2) Intersections</a><ulclass="nav-list"><liclass="nav-list-item "><ahref="/pathtracer/ray_triangle_intersection"class="nav-list-link">Ray Triangle Intersection</a></li><liclass="nav-list-item "><ahref="/pathtracer/ray_sphere_intersection"class="nav-list-link">Ray Sphere Intersection</a></li></ul></li><liclass="nav-list-item "><ahref="/pathtracer/bounding_volume_hierarchy"class="nav-list-link">(Task 3) BVH</a></li><liclass="nav-list-item "><ahref="/pathtracer/shadow_rays"class="nav-list-link">(Task 4) Shadow Rays</a></li><liclass="nav-list-item "><ahref="/pathtracer/path_tracing"class="nav-list-link">(Task 5) Path Tracing</a></li><liclass="nav-list-item "><ahref="#"class="nav-list-expander"><svgviewBox="0 0 24 24"><usexlink:href="#svg-arrow-right"></use></svg></a><ahref="/pathtracer/materials"class="nav-list-link">(Task 6) Materials</a><ulclass="nav-list"><liclass="nav-list-item "><ahref="/pathtracer/dielectrics_and_transmission"class="nav-list-link">Dielectrics and Transmission</a></li></ul></li><liclass="nav-list-item "><ahref="#"class="nav-list-expander"><svgviewBox="0 0 24 24"><usexlink:href="#svg-arrow-right"></use></svg></a><ahref="/pathtracer/environment_lighting"class="nav-list-link">(Task 7) Environment Lighting</a><ulclass="nav-list"><liclass="nav-list-item "><ahref="/pathtracer/importance_sampling"class="nav-list-link">Environment Light Importance Sampling</a></li></ul></li><liclass="nav-list-item "><ahref="/pathtracer/visualization_of_normals"class="nav-list-link">Visualization of normals</a></li></ul></li><liclass="nav-list-item"><ahref="#"class="nav-list-expander"><svgviewBox="0 0 24 24"><usexlink:href="#svg-arrow-right"></use></svg></a><ahref="/animation/"class="nav-list-link">A4: Animation</a><ulclass="nav-list "><liclass="nav-list-item "><ahref="/animation/splines"class="nav-list-link">Splines</a></li><liclass="nav-list-item "><ahref="/animation/skeleton_kinematics"class="nav-list-link">Skeleton Kinematics</a></li><liclass="nav-list-item "><ahref="/animation/skinning"class="nav-list-link">Skinning</a></li><liclass="nav-list-item "><ahref="/animation/particles"class="nav-list-link">Particles</a></li></ul></li></ul></nav><footerclass="site-footer"> This site uses <ahref="https://github.com/pmarsceill/just-the-docs">Just the Docs</a>, a documentation theme for Jekyll. </footer></div><divclass="main"id="top"><divid="main-header"class="main-header"><divclass="search"><divclass="search-input-wrap"><inputtype="text"id="search-input"class="search-input"tabindex="0"placeholder="Search "aria-label="Search "autocomplete="off"><labelfor="search-input"class="search-label"><svgviewBox="0 0 24 24"class="search-icon"><usexlink:href="#svg-search"></use></svg></label></div><divid="search-results"class="search-results"></div></div></div><divid="main-content-wrap"class="main-content-wrap"><navaria-label="Breadcrumb"class="breadcrumb-nav"><olclass="breadcrumb-nav-list"><liclass="breadcrumb-nav-list-item"><ahref="/pathtracer/">A3: Pathtracer</a></li><liclass="breadcrumb-nav-list-item"><span>(Task 1) Camera Rays</span></li></ol></nav><divid="main-content"class="main-content"role="main"><h1id="task-1-generating-camera-rays"><ahref="#task-1-generating-camera-rays"class="anchor-heading"aria-labelledby="task-1-generating-camera-rays"><svgviewBox="0 0 16 16"aria-hidden="true"><usexlink:href="#svg-link"></use></svg></a> (Task 1) Generating Camera Rays </h1><h3id="walkthrough-video"><ahref="#walkthrough-video"class="anchor-heading"aria-labelledby="walkthrough-video"><svgviewBox="0 0 16 16"aria-hidden="true"><usexlink:href="#svg-link"></use></svg></a> Walkthrough Video </h3><iframewidth="750"height="500"src="Task1_Camera_Rays_1.mp4"frameborder="0"allowfullscreen=""></iframe><p>“Camera rays” emanate from the camera and measure the amount of scene radiance that reaches a point on the camera’s sensor plane. (Given a point on the virtual sensor plane, there is a corresponding camera ray that is traced into the scene.)</p><p>Take a look at <codeclass="language-plaintext highlighter-rouge">Pathtracer::trace_pixel</code> in <codeclass="language-plaintext highlighter-rouge">student/pathtracer.cpp</code>. The job of this function is to compute the amount of energy arriving at this pixel of the image. Conveniently, we’ve given you a function <codeclass="language-plaintext highlighter-rouge">Pathtracer::trace_ray(r)</code> that provides a measurement of incoming scene radiance along the direction given by ray <codeclass="language-plaintext highlighter-rouge">r</code>. See <codeclass="language-plaintext highlighter-rouge">lib/ray.h</code> for the interface of ray.</p><p>Here are some <ahref="https://drive.google.com/file/d/0B4d7cujZGEBqVnUtaEsxOUI4dTMtUUItOFR1alQ4bmVBbnU0/view">rough notes</a> giving more detail on how to generate camera rays.</p><p>This tutorial from <ahref="https://www.scratchapixel.com/lessons/3d-basic-rendering/ray-tracing-generating-camera-rays/generating-camera-rays">Scratchapixel</a> also provides a detailed walkthrough of what you need to do. (Note that the coordinate convention that Scratchpixel adopted is different from the one we use, and you should stick to the coordinate system from the <ahref="https://drive.google.com/file/d/0B4d7cujZGEBqVnUtaEsxOUI4dTMtUUItOFR1alQ4bmVBbnU0/view">rough notes</a> all the time.)</p><p><strong>Step 1:</strong> Given the width and height of the screen, and point in screen space, compute the corresponding coordinates of the point in normalized ([0-1]x[0-1]) screen space in <codeclass="language-plaintext highlighter-rouge">Pathtracer::trace_pixel</code>. Pass these coordinates to the camera via <codeclass="language-plaintext highlighter-rouge">Camera::generate_ray</code> in <codeclass="language-plaintext highlighter-rouge">camera.cpp</code>.</p><p><strong>Step 2:</strong> Implement <codeclass="language-plaintext highlighter-rouge">Camera::generate_ray</code>. This function should return a ray <strong>in world space</strong> that reaches the given sensor sample point. We recommend that you compute this ray in camera space (where the camera pinhole is at the origin, the camera is looking down the -Z axis, and +Y is at the top of the screen.). In <codeclass="language-plaintext highlighter-rouge">util/camera.h</code>, the <codeclass="language-plaintext highlighter-rouge">Camera</code> class stores <codeclass="language-plaintext highlighter-rouge">vert_fov</code> and <codeclass="language-plaintext highlighter-rouge">aspect_ratio</code> indicating the vertical field of view of the camera (in degrees, not radians) as well as the aspect ratio. Note that the camera maintains camera-space-to-world space transform matrix <codeclass="language-plaintext highlighter-rouge">iview</code> that will come in handy.</p><p><strong>Step 3:</strong> Your implementation of <codeclass="language-plaintext highlighter-rouge">Pathtracer::trace_pixel</code> must support super-sampling. The member <codeclass="language-plaintext highlighter-rouge">Pathtracer::n_samples</code> specifies the number of samples of scene radiance to evaluate per pixel. The starter code will hence call <codeclass="language-plaintext highlighter-rouge">Pathtracer::trace_pixel</code> one time for each sample, so your implementation of <codeclass="language-plaintext highlighter-rouge">Pathtracer::trace_pixel</code> should choose a new location within the pixel each time.</p><p>To choose a sample within the pixel, you should implement <codeclass="language-plaintext highlighter-rouge">Rect::Uniform::sample</code> (see <codeclass="language-plaintext highlighter-rouge">src/student/samplers.cpp</code>), such that it provides (random) uniformly distributed 2D points within the rectangular region specified by the origin and the member <codeclass="language-plaintext highlighter-rouge">Rect::Uniform::size</code>. Then you may then create a <codeclass="language-plaintext highlighter-rouge">Rect::Uniform</code> sampler with a one-by-one region and call <codeclass="language-plaintext highlighter-rouge">sample()</code> to obtain randomly chosen offsets within the pixel.</p><p>Once you have implemented <codeclass="language-plaintext highlighter-rouge">Pathtracer::trace_pixel</code>, <codeclass="language-plaintext highlighter-rouge">Rect::Uniform::sample</code> and <codeclass="language-plaintext highlighter-rouge">Camera::generate_ray</code>, you should have a working camera.</p><p><strong>Tip:</strong> Since it’ll be hard to know if you camera rays are correct until you implement primitive intersection, we recommend debugging your camera rays by checking what your implementation of <codeclass="language-plaintext highlighter-rouge">Camera::generate_ray</code> does with rays at the center of the screen (0.5, 0.5) and at the corners of the image.</p><p>The code can log the results of raytracing for visualization and debugging. To do so, simply call function <codeclass="language-plaintext highlighter-rouge">Pathtracer::log_ray</code> in your <codeclass="language-plaintext highlighter-rouge">Pathtracer::trace_pixel</code>. Function <codeclass="language-plaintext highlighter-rouge">Pathtracer::log_ray</code> takes in 3 arguments: the ray tat you want to log, a float that specifies the distance to log that ray up to, and a color for the ray. If you don’t pass a color, it will default to white. You should only log only a portion of the generated rays, or else the result will be hard to interpret. To do so, you can add <codeclass="language-plaintext highlighter-rouge">if(RNG::coin_flip(0.0005f)) log_ray(out, 10.0f);</code> to log 0.05% of camera rays.</p><p>Finally, you can visualize the logged rays by checking the box for Logged rays under Visualize and then <strong>starting the render</strong> (Open Render Window -> Start Render). After running the path tracer, rays will be shown as lines in visualizer. Be sure to wait for rendering to complete so you see all rays while visualizing.</p><p><imgsrc="new_results/log_rays.png"alt="logged_rays"/></p><p><strong>Step 4:</strong><codeclass="language-plaintext highlighter-rouge">Camera</code> also includes the members <codeclass="language-plaintext highlighter-rouge">aperture</code> and <codeclass="language-plaintext highlighter-rouge">focal_dist</code>. These parameters are used to simulate the effects of de-focus blur and bokeh found in real cameras. Focal distance represents the distance between the camera aperture and the plane that is perfectly in focus. To use it, you must simply scale up the sensor position from step 2 (and hence ray direction) by <codeclass="language-plaintext highlighter-rouge">focal_dist</code> instead of leaving it on the <codeclass="language-plaintext highlighter-rouge">z = -1</code> plane. You might notice that this doesn’t actually change anything about your result, since this is just scaling up a vector that is later normalized. However, now aperture comes in: by default, all rays start a single point, representing a pinhole camera. But when <codeclass="language-plaintext highlighter-rouge">aperture > 0</code>, we want to randomly choose the ray origin from an <codeclass="language-plaintext highlighter-rouge">aperture</code>x<codeclass="language-plaintext highlighter-rouge">aperture</code> square centered at the origin and facing the camera direction (-Z). Then, we use this point as the starting point of the ray while keeping its sensor position fixed (consider how that changes the ray direction). Now it’s as if the same image was taken from slightly off origin. This simulates real cameras with non-pinhole apertures: the final photo is equivalent to averaging images taken by pinhole cameras placed at every point in the aperture.</p><p>Finally, we can see that non-zero aperture makes focal distance matter: objects on the focal plane are unaffected, since where the ray hits on the sensor is the same regardless of the ray’s origin. However, rays that hit objects objects closer or farther than the focal distance will be able to “see” slightly different parts of the object based on the ray origin. Averaging over many rays within a pixel, this results in collecting colors from a region larger slightly than that pixel would cover given zero aperture, causing the object to become blurry. We are using a square aperture, so bokeh effects will reflect this.</p><p>You can test aperture/focal distance by adjusting <codeclass="language-plaintext highlighter-rouge">aperture</code> and <codeclass="language-plaintext highlighter-rouge">focal_dist</code> using the camera UI and examining logging rays. Once you have implemented primitive intersections and path tracing (tasks 3/5), you will be able to properly render <codeclass="language-plaintext highlighter-rouge">dof.dae</code>:</p><p><imgsrc="new_results/dof.png"alt="depth of field test"/></p><p><strong>Extra credit ideas:</strong></p><ul><li>Write your own camera pixel sampler (replacing Rect::Uniform) that generates samples with improved distribution. Some examples include: <ul><li>Jittered Sampling</li><li>Multi-jittered sampling</li><li>N-Rooks (Latin Hypercube) sampling</li><li>Sobol sequence sampling</li><li>Halton sequence sampling</li><li>Hammersley sequence sampling</li></ul></li></ul></div></div><divclass="search-overlay"></div></div></body></html>