Image-Based Lighting in Computational Photography
Explore the techniques and concepts of image-based lighting in computational photography: rendering synthetic objects into images, environment maps and light probes, mirrored spheres, and handling bright light sources such as the sun. Learn how captured light data is used to enhance visual realism in digital imagery.
Presentation Transcript
COMP790: Computational Photography. Image-Based Lighting. Montek Singh, Apr 17, 2019. (Credits to numerous other people on individual slides.)
The next section of slides is mainly from Derek Hoiem, and transitively from Debevec, with some from Efros and Kevin Karsch. Many other sources are noted on individual slides.
Image-based Lighting. Slide from Derek Hoiem.
How to render an object inserted into an image? Image-based lighting: capture the incoming light with a light probe, model the local scene, then ray trace, replacing the distant scene with information from the light probe. (Debevec, SIGGRAPH 1998)
Key ideas for Image-based Lighting. Environment maps: tell what light is entering at each angle within some shell.
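As a concrete illustration of the idea (not from the slides), here is a minimal numpy sketch of how a renderer might query an equirectangular environment map: convert a world-space direction into spherical angles and index the image. The function name, the y-up convention, and nearest-neighbor sampling are assumptions made for this example.

```python
import numpy as np

def sample_env_map(env, direction):
    """Look up incoming radiance for a world-space direction in an
    equirectangular (lat/long) environment map.

    env       -- HxWx3 float array of linear radiance values
    direction -- unit 3-vector (x, y, z), y assumed up
    """
    x, y, z = direction
    theta = np.arccos(np.clip(y, -1.0, 1.0))   # polar angle from +y, in [0, pi]
    phi = np.arctan2(z, x)                      # azimuth, in [-pi, pi]

    h, w, _ = env.shape
    u = (phi + np.pi) / (2 * np.pi)             # [0, 1] across the width
    v = theta / np.pi                           # [0, 1] down the height
    col = min(int(u * w), w - 1)
    row = min(int(v * h), h - 1)
    return env[row, col]
```

A ray tracer would call something like this whenever a ray leaves the local scene, so the light probe stands in for the distant environment.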
Key ideas for Image-based Lighting. Light probes: a way of capturing environment maps in real scenes.
1) Compute the sphere normal from the pixel position.
2) Compute the reflected ray direction from the sphere normal.
3) Convert to spherical coordinates (theta, phi).
4) Create the equirectangular image.
(A code sketch of these steps follows the figure below.)
Mirror ball -> equirectangular. [Figure: mirror ball image, sphere normals, reflection vectors, phi/theta of the reflection vectors, the phi/theta equirectangular domain, and the resulting equirectangular map.]
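The four steps above can be sketched in code. The following is a simplified illustration under stated assumptions (orthographic view of the ball along +z, y-up spherical convention, nearest-neighbor forward splatting); it is not Debevec's actual implementation.

```python
import numpy as np

def mirror_ball_to_equirect(ball, out_h=256, out_w=512):
    """Unwrap a mirror-ball (light probe) photo into an equirectangular map.

    ball -- HxHx3 crop of the mirrored sphere, assumed viewed orthographically along +z
    """
    h = ball.shape[0]
    equirect = np.zeros((out_h, out_w, 3), dtype=ball.dtype)

    for row in range(h):
        for col in range(h):
            # 1) Sphere normal from the pixel position
            x = 2.0 * (col + 0.5) / h - 1.0
            y = 1.0 - 2.0 * (row + 0.5) / h
            r2 = x * x + y * y
            if r2 > 1.0:
                continue                      # outside the sphere silhouette
            n = np.array([x, y, np.sqrt(1.0 - r2)])

            # 2) Reflect the viewing ray d = (0, 0, -1) about the normal
            d = np.array([0.0, 0.0, -1.0])
            refl = d - 2.0 * np.dot(d, n) * n

            # 3) Reflected direction -> spherical coordinates (theta, phi)
            theta = np.arccos(np.clip(refl[1], -1.0, 1.0))
            phi = np.arctan2(refl[2], refl[0])

            # 4) Splat the ball pixel into the equirectangular image
            u = int((phi + np.pi) / (2 * np.pi) * (out_w - 1))
            v = int(theta / np.pi * (out_h - 1))
            equirect[v, u] = ball[row, col]
    return equirect
```

Because this forward-splats ball pixels into the output, it can leave holes; a production version would instead loop over equirectangular pixels and sample the ball image with the inverse mapping, but the forward form mirrors the four numbered steps.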
One small snag: how do we deal with light sources (the sun, lamps, etc.)? They are much, much brighter than the rest of the environment. [Figure: relative brightness of scene points, ranging from 1 up to over 15,000.] Use High Dynamic Range photography!
Key ideas for Image-based Lighting. Capturing HDR images: needed so that light probes capture the full range of radiance.
LDR -> HDR by merging exposures. [Figure: several LDR exposures (Exposure 1 ... Exposure n), each spanning pixel values 0 to 255, together cover the real world's high dynamic range of roughly 10^-6 to 10^6.]
The Math. Let g(z) be the discrete inverse response function. For each pixel site i in each image j, we want: g(Z_ij) = ln Radiance_i + ln t_j. Solve the overdetermined linear system that minimizes

\sum_{i=1}^{N} \sum_{j=1}^{P} \left[ \ln \mathrm{Radiance}_i + \ln t_j - g(Z_{ij}) \right]^2 \;+\; \lambda \sum_{z=Z_{\min}+1}^{Z_{\max}-1} g''(z)^2

where the first sum is the fitting term and the second is the smoothness term.
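A small numpy sketch of solving this system, patterned on the structure of Debevec and Malik's least-squares formulation, is shown below. The hat-shaped weighting function, the default lambda, and the g(128) = 0 constraint are assumptions made for illustration; this is a sketch, not the authors' code.

```python
import numpy as np

def solve_response(Z, log_t, lam=50.0):
    """Recover the log inverse response g and log radiances from bracketed shots.

    Z     -- N x P array of pixel values (N pixel sites, P exposures), 0..255
    log_t -- length-P array of log exposure times ln(t_j)
    lam   -- weight on the smoothness (second-derivative) term
    """
    n = 256                                   # number of gray levels
    N, P = Z.shape
    # Hat-shaped weighting: trust mid-range pixel values most (an assumption)
    w = np.minimum(np.arange(n), n - 1 - np.arange(n)).astype(float)

    A = np.zeros((N * P + 1 + (n - 2), n + N))
    b = np.zeros(A.shape[0])

    # Fitting term: g(Z_ij) - ln(Radiance_i) = ln(t_j), weighted by w
    k = 0
    for i in range(N):
        for j in range(P):
            wij = w[Z[i, j]]
            A[k, Z[i, j]] = wij
            A[k, n + i] = -wij
            b[k] = wij * log_t[j]
            k += 1

    # Fix the scale of the curve: g(128) = 0
    A[k, n // 2] = 1.0
    k += 1

    # Smoothness term: lambda * g''(z) ~ 0 for interior gray levels
    for z in range(1, n - 1):
        A[k, z - 1] = lam * w[z]
        A[k, z] = -2.0 * lam * w[z]
        A[k, z + 1] = lam * w[z]
        k += 1

    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    g = x[:n]            # g(z): log exposure for pixel value z
    lnE = x[n:]          # recovered log radiances at the sampled sites
    return g, lnE
```

Once g is recovered, each exposure maps to log radiance via ln Radiance = g(Z) - ln t, and a weighted average across exposures yields the HDR radiance map.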
Real-World HDR Lighting Environments: Funston Beach, Eucalyptus Grove, Grace Cathedral, Uffizi Gallery. Lighting environments from the Light Probe Image Gallery: http://www.debevec.org/Probes/
CG Objects Illuminated by a Traditional CG Light Source
Illuminating Objects using Measurements of Real Light. The environment is assigned a glow material property in Greg Ward's RADIANCE system. [Figure: light and object.] http://radsite.lbl.gov/radiance/
Paul Debevec. A Tutorial on Image-Based Lighting. IEEE Computer Graphics and Applications, Jan/Feb 2002.
Rendering with Natural Light SIGGRAPH 98 Electronic Theater
Movie http://www.youtube.com/watch?v=EHBgkeXH9lU
We can now illuminate synthetic objects with real light: environment map, light probe, HDR, ray tracing. How do we add synthetic objects to a real scene?
Real Scene Example Goal: place synthetic objects on table
Modeling the Scene. [Figures: the real scene is modeled as a light-based model of the distant scene plus a local scene, into which the synthetic objects are placed.]
The Lighting Computation: distant scene (light-based, unknown BRDF), synthetic objects (known BRDF), local scene (estimated BRDF).
Rendering into the Scene Background Image
Differential Rendering Local scene w/o objects, illuminated by model
Rendering into the Scene Objects and Local Scene matched to Scene
Differential Rendering. Difference in the local scene: the rendering with objects minus the rendering without them.
Differential Rendering Final Result
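The pipeline on the last few slides can be summarized in one compositing step. Below is a hedged numpy sketch of differential rendering under assumed inputs (linear-radiance images and an object mask): add the difference between the local-scene renderings with and without the synthetic objects to the original photograph, and take the object pixels directly from the rendering.

```python
import numpy as np

def differential_composite(background, render_with, render_without, obj_mask):
    """Differential-rendering composite in the spirit of Debevec 1998.

    background     -- original photograph (HxWx3, linear radiance)
    render_with    -- rendering of local scene + synthetic objects
    render_without -- rendering of the local scene alone, same lighting
    obj_mask       -- HxWx1 mask, 1 where a synthetic object covers the pixel
    """
    # Outside the objects, add the change they cause (shadows, reflections,
    # interreflections) on top of the real photograph; inside the objects,
    # take the rendered pixels directly.
    delta = render_with - render_without
    final_local = background + delta
    return obj_mask * render_with + (1 - obj_mask) * final_local
```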
IMAGE-BASED LIGHTING IN FIAT LUX. Paul Debevec, Tim Hawkins, Westley Sarokin, H. P. Duiker, Christine Cheng, Tal Garfinkel, Jenny Huang. SIGGRAPH 99 Electronic Theater.
Fiat Lux http://ict.debevec.org/~debevec/FiatLux/movie/ http://ict.debevec.org/~debevec/FiatLux/technology/
What if we don't have a light probe? [Figure: zoom in on the eye, recover an environment map from the eye, insert and relight a face.] http://www1.cs.columbia.edu/CAVE/projects/world_eye/ -- Nishino and Nayar, 2004
Can Tell What You Are Looking At. [Figure: eye image and computed retinal image.]
Summary. Real scenes have complex geometries and materials that are difficult to model. We can use an environment map, captured with a light probe, as a replacement for distant lighting. We can get an HDR image by combining bracketed shots. We can then relight synthetic objects at that position using the environment map.