Google Summer of Code Blender Light Paint using Spherical Harmonics


                                         Jingyuan Huang
                           David R. Cheriton School of Computer Science
                                      University of Waterloo

                                                April 7, 2009

1     Contact Information
    • Email:
    • IRC: yukishiro
    • Website: j23huang

2     Synopsis

Inspired by Illumination Brush [1], this project aims to create an interactive tool for Blender to preview,
modify, and create image-based lighting environments specified by HDR light probe images. Artists can
change the HDR light probe by directly painting colours on the models. This tool will be integrated with the
existing image-based rendering routines in Blender in order to help artists determine the final renderings.

3     Benefits to the Blender Community

Blender already has vertex paint and texture paint tools, both of which are very effective for modifying the
appearances of models. Light paint differs from these tools in that it modifies light instead of surfaces, yet
it is similar to the existing tools in that it provides an intuitive way for artists to specify the desired
rendering effects. Using this light paint tool, artists can fine-tune rendering effects without changing
objects’ materials or locating another high dynamic range image to achieve better results.

4     Deliverables

4.1    Workflow Overview

The overall workflow can be described in the following steps (Figure 1):

  1. The user creates the model.
  2. The user chooses an HDR light probe image as the lighting environment, or starts with a blank lighting
     environment.
  3. The user previews the lighting results interactively in the light paint main area.
  4. The user modifies the lighting environment by painting colours on the model or dragging shadows.
  5. The user uses the modified lighting environment for the final rendering. Alternatively, he or she can
     use the modification as a light node and mix it with other light probe images.

                                             Figure 1: Workflow

4.2    UI Design

Similar to vertex paint and texture paint, light paint will be a new mode in the 3D viewer space. It reads the
environment map specified in Shading → World and the energy value in the Ambient Occlusion panel to compute
the spherical harmonics coefficients (Figure 2). If there is no environment map, light paint will not take
effect. If Ambient Occlusion is not enabled at the moment, the scaling factor for lighting defaults to 1.

                                Figure 2: World Panel for Environment Map

When the user switches to light paint mode for the first time after creating the model, the tool would compute
the Spherical Harmonics coefficients for the model and store them as part of the blend file. This step can take
some time to complete for complex models, so the computation should run as a threaded job so that regular UI
interaction is not affected. The progress of the computation should also be indicated. After the coefficients
are computed, the model will be rendered using these coefficients together with the light coefficients. The model
would be displayed using OpenGL (preferably GLSL, to integrate with Blender’s current 3D view display).
The selected light probe image would also be rendered as the background (Figure 3).

                      Figure 3: Interactive Rendering with Light Probe Background
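Once the per-vertex transfer coefficients and the light's SH projection are both available, the interactive shading step above reduces to a dot product per vertex and colour channel. The sketch below illustrates this; the function name and the sample values are mine, not Blender's, and a real implementation would run per channel on the GPU.

```python
# Minimal sketch of the diffuse PRT shading step: irradiance at a vertex
# is the dot product of its precomputed transfer coefficients with the
# light's SH coefficients.  Names and values here are illustrative.

def shade_vertex(transfer_coeffs, light_coeffs):
    """Irradiance at a vertex = dot(transfer, light), one colour channel."""
    return sum(t * l for t, l in zip(transfer_coeffs, light_coeffs))

# 9 coefficients = 3rd-order SH (bands l = 0, 1, 2), the usual choice
# for low-frequency diffuse lighting.
transfer = [0.5] + [0.0] * 8          # toy transfer vector
light = [1.5] + [0.0] * 8             # toy light vector
print(shade_vertex(transfer, light))  # -> 0.75
```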

There would be a paint panel in the editing buttons space. The paint panel has two tools: a diffuse brush tool
and a shadow rotation tool. For the diffuse brush tool, the panel has a colour picker (RGB), an intensity button
(exponent), and a brush size button. A pop-up property window for paint will also be created when the user
presses ‘N’. There is no extra button for the shadow drag tool, except that the user needs to use a modifier
key together with mouse motion to drag the shadow. It should be easy to use the two tools together to modify
lighting smoothly. Figure 4 shows a mockup of the design.

Furthermore, a new type of node would be created: light node. A light node can take a light probe image as
input. The node object contains information about the modifications made to the image. The modifications
can be applied directly to the model, or can be mixed with other light probe images to create a new lighting
environment (Figure 5). Light nodes are used with material nodes to create a render layer.

5     Technical Details

5.1    Spherical Harmonics

Precomputed radiance transfer using Spherical Harmonics was first introduced by Sloan et al. [2]. Spherical
Harmonic Lighting: The Gritty Details by Robin Green offers good explanations of the basic concepts [3];
however, the implementation provided in the tutorial is naive and slow, since it aims for explanation rather
than performance. A faster algorithm was proposed by Ian G. Lisle and S.-L. Tracy Huang [4], and it forms the
basis of my implementation.
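For concreteness, the first three SH bands (the nine coefficients typically used for diffuse lighting) can be written out in closed Cartesian form. This is the straightforward direct evaluation in the spirit of Green's tutorial, not the faster recurrence of Lisle and Huang; the constants are the standard real-SH normalisation factors.

```python
# Direct evaluation of the real spherical harmonics basis for l <= 2,
# in Cartesian form, at a unit direction (x, y, z).  Constants are the
# standard real-SH normalisations.

def sh_basis9(x, y, z):
    """Return [Y_0^0, Y_1^-1, Y_1^0, Y_1^1, Y_2^-2, ..., Y_2^2]."""
    return [
        0.282095,                        # Y_0^0
        0.488603 * y,                    # Y_1^-1
        0.488603 * z,                    # Y_1^0
        0.488603 * x,                    # Y_1^1
        1.092548 * x * y,                # Y_2^-2
        1.092548 * y * z,                # Y_2^-1
        0.315392 * (3.0 * z * z - 1.0),  # Y_2^0
        1.092548 * x * z,                # Y_2^1
        0.546274 * (x * x - y * y),      # Y_2^2
    ]
```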

                                 Figure 4: A mockup for light paint mode

5.2    Estimating Colour Change in Lighting Environment

The algorithm described in Illumination Brush [1] will be used to calculate lighting environment changes after
users specify colours on the scene. The new light coefficients will be calculated by solving a least-squares
problem for the linear system. The constraints described in [1] will be applied as well.
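The shape of that least-squares problem can be sketched as follows: each painted vertex contributes one linear constraint, with the vertex's transfer coefficients as the row and the painted colour as the target. This is a toy illustration of the solve only (it omits the constraints from [1]); the function name and sample system are mine.

```python
import numpy as np

# Sketch of the least-squares step: each painted vertex i gives one
# constraint  sum_k T[i][k] * c[k] = colour[i], where T[i] holds the
# vertex's precomputed transfer coefficients and c is the unknown
# light-coefficient vector.  Solve min ||T c - b||^2 for c.

def solve_light_coeffs(transfer_rows, painted_colours):
    """Return the light coefficients that best match the painted colours."""
    T = np.asarray(transfer_rows, dtype=float)    # (n_painted, n_coeffs)
    b = np.asarray(painted_colours, dtype=float)  # (n_painted,)
    c, *_ = np.linalg.lstsq(T, b, rcond=None)
    return c

# Two painted vertices, a 2-coefficient toy system with an exact solution.
T = [[1.0, 0.0],
     [0.0, 2.0]]
b = [0.5, 1.0]
print(solve_light_coeffs(T, b))  # -> approximately [0.5, 0.5]
```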

5.3    Lighting Environment Rotation

Ian G. Lisle and S.-L. Tracy Huang’s implementation [4] already provides Wigner matrices for scenes rotating
relative to the lighting environment. The same matrix can be used to rotate the lighting environment
relative to the scene. The amount of rotation is calculated based on mouse movement, which is described in
Illumination Brush [1].
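The special case of rotation about the z axis shows why SH rotation is cheap: the rotation matrix is block-diagonal, mixing only the (Y_l^-m, Y_l^m) pairs by the angle m·φ. The sketch below covers just this axial case for nine coefficients; general rotations need the full Wigner matrices of [4], and the coefficient ordering here is my own convention.

```python
import math

# Rotate a 9-coefficient real-SH vector about the z axis by angle phi.
# Only pairs (Y_l^-m, Y_l^m) mix, by a 2x2 rotation of angle m*phi;
# Y_l^0 terms are unchanged.  Ordering (illustrative convention):
# [Y_0^0, Y_1^-1, Y_1^0, Y_1^1, Y_2^-2, Y_2^-1, Y_2^0, Y_2^1, Y_2^2].

def rotate_sh_z(coeffs, phi):
    out = list(coeffs)
    # (index of Y_l^-m, index of Y_l^m, m) for each mixing pair
    for neg, pos, m in [(1, 3, 1), (5, 7, 1), (4, 8, 2)]:
        c, s = math.cos(m * phi), math.sin(m * phi)
        out[neg] = c * coeffs[neg] + s * coeffs[pos]
        out[pos] = -s * coeffs[neg] + c * coeffs[pos]
    return out
```

For example, a lobe along +x (pure Y_1^1) rotated by 90° about z becomes a lobe along +y (pure Y_1^-1).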

6     Project Schedule
    • March 23 – April 15
      Implement basic algorithms with minimal dependence on Blender. This means that I won’t use
      the existing Blender routines for HDR image loading and writing, or the ray trace routines for shadow
      intersection. My goal in this period is to quickly get something to show up on the screen and
      work out the core algorithms (how to update SH coefficients based on the selected vertices and colour,
      how to impose constraints, how to rotate the light environment based on dragging). I have already started
      this part and keep a worklog here.
    • April 16 – May 6

                                 Figure 5: A mockup for light node

  This is my yearly vacation in China. I will be in Shanghai during this time and should be able to work
  (I may be unavailable for a week or so). I will continue developing some of the algorithms that I may
  not be able to finish in the first period.
• May 7 – May 13
  I will be back in Waterloo. I will start setting up the UI controls in Blender and get familiar with
  Blender 3D view display code. The controls should be set up by the end of the period. I’d also like to
  get some suggestions on light node design and UI layout at the end of this period.
• May 14 – July 6
  This period is the main iterative cycle. Work involved in this period includes Blender integration, main
  algorithm development/integration, improvements based on previous reviews and suggestions, and
  testing. Everything in light paint mode should work by the end of this period, including using/storing
  HDR files, calculating and storing SH coefficients in blend files, selecting brush colour and intensity,
  rotating the scene, applying brush strokes to the model, previewing the probe image, etc. Google’s midterm
  review will be done at the end of this period. Preliminary code for the light node should start, especially
  for the UI controls.
• July 7 – August 1
  Midterm review shall be submitted by the beginning of the period. This period is another iteration
  cycle, including minor design changes (no big changes should happen at this point), integration (es-
  pecially with existing IBL code), testing, and documentation. The focus of this phase is to complete
  light node (within the design scope of this project), including creating node that represents the lighting
  modification, mixing the changes from different nodes, applying nodes to input probe images, as well
  as combining light nodes and material nodes for rendering etc.
• August 2 – August 9
  SIGGRAPH! I applied for student volunteer but I won’t know the results till the end of April. For
  now let’s assume that I won’t be available during this period.
• August 10 – August 21

      This period is dedicated to bug fixing and documentation. Proper Wiki pages should be added to
      explain the work flow and UI controls. Problems and issues shall also be documented.
    • August 22 – September
      Contingency time. I may choose to leave out some minor features at this stage if there is a scheduling
      problem. However, I really hope that things can be wrapped up before Aug. 22.

7     Extensions after GSoC
    • Other light types
      In this proposal I focus only on environment maps. However, with light node, it is possible to create
      a light map from multiple light sources using a light mixer node. Supporting different light types is
      crucial to make light paint and light node useful.
    • All-frequency light
      One issue with Spherical Harmonics is that it cannot represent high-frequency light, so we cannot
      preview hard cast shadows. This may not be an issue at all, since light paint just provides a
      preview for the final rendering. It may be an issue if people want to output light probes that contain
      high frequencies. Okabe et al. have already adopted Spherical Radial Basis Functions to solve this issue
      in their newer version of Illumination Brush.

8     Bio

I’m a Master’s student in the David R. Cheriton School of Computer Science at the University of Waterloo. I
completed my Bachelor’s degree in Software Engineering at the University of Waterloo as well. I’m very
comfortable with C/C++ development, and I’ve used Python quite often since 2006 (my baby Ada compiler was
written in Python). You can find my resume and some of my past projects on my website.

I used Blender for my ray tracer project in 2007 and I’ve been following the dev mailing list since then. I
understand Blender’s basic code structure; however, I haven’t been able to make real contributions to the
Blender community yet.

My thesis topic is image-based relighting. This project is partly related to (or might be a part of) my
research work. I hope that my research can bring some thoughts to the development of this project (and
maybe benefit from it as well). I will focus on my thesis topic this summer and will not take any courses.
Since this project is related to my research, I will be able to work on both as a unit instead of as two
separate projects.
References

[1] M. Okabe, Y. Matsushita, T. Igarashi, and H.-Y. Shum, “Illumination brush: interactive design of image-
    based lighting,” in SIGGRAPH ’06: ACM SIGGRAPH 2006 Research Posters, (New York, NY, USA),
    p. 141, ACM, 2006.
[2] P.-P. Sloan, J. Kautz, and J. Snyder, “Precomputed radiance transfer for real-time rendering in dynamic,
    low-frequency lighting environments,” ACM Trans. Graph., vol. 21, no. 3, pp. 527–536, 2002.
[3] R. Green, “Spherical harmonic lighting: The gritty details.”
    spherical-harmonic-lighting.html, 2003.

[4] I. G. Lisle and S.-L. T. Huang, “Algorithms for spherical harmonic lighting,” in GRAPHITE ’07: Pro-
    ceedings of the 5th international conference on Computer graphics and interactive techniques in Australia
    and Southeast Asia, (New York, NY, USA), pp. 235–238, ACM, 2007.