# Lighting and Shading

Spring 2003   Cal State San Marcos
Illumination
- The flux of light energy from light sources to objects in the scene via direct and indirect paths

Lighting
- The process of computing the luminous intensity reflected from a specified 3-D point

Shading
- The process of assigning a color to a pixel

Energy and Power of Light
- Light is a form of energy
  - Measured in Joules (J)
- Power: energy per unit time
  - Measured in Joules/sec = Watts (W)
  - Also known as radiant flux (Φ)

Point Light Source
- Total radiant flux in Watts
  - Energy emitted in unit time
- How to define angular dependence?
  - Use solid angle
- Define power per unit solid angle: radiant intensity
  - Measured in Watts per steradian (W/sr)

Light Emitted from a Surface
- Radiance (L): power per unit area per unit solid angle
  - Measured in W/(m²·sr)
  - dA is the projected area, perpendicular to the given direction
- Radiosity (B): power per unit area, measured in W/m², found by integrating the radiance over all directions:

$$B = \int_\Omega L(\theta, \phi) \cos\theta \, d\omega$$

Light Falling on a Surface
- Irradiance (E): power falling on a surface per unit area
  - Measured in W/m²
  - Depends on:
    - Distance from the light source: inverse square law, E ~ 1/r²
    - Incident direction: cosine law, E ~ n·l

The irradiance (E) is an integral over all incident directions:

$$E = \int_\Omega L_i \cos\theta_i \, d\omega_i$$

Reflection
The amount of reflected radiance is proportional to the incident irradiance:

$$L_r = \rho(\theta_r, \phi_r, \theta_i, \phi_i)\, E_i$$

BRDF
$\rho(\theta_r, \phi_r, \theta_i, \phi_i)$ is called the Bidirectional Reflectance
Distribution Function (BRDF)
- It is a surface property
- Relates energy in to energy out
- Depends on the incoming and outgoing directions

Local Illumination

$$L_r(\omega_r) = \int_\Omega \rho(\omega_r, \omega_i)\, L_i(\omega_i) \cos\theta_i \, d\omega_i$$

The Phong illumination model approximates the BRDF with a
combination of diffuse and specular components.

Energy Balance Equation
    The total light leaving a point is given by the
sum of two major terms:
      Emitted from the point
      Incoming light from other sources reflected at the
point
L( x,o ,o )  Le ( x,o ,o )   bd ( x,o ,o , , ) Li ( x, , ) cosd


Light       Emission     Sum                   BRDF      Incoming
leaving                                                     light

Incoming light reflected at the point
Rendering Equation
$$L(x', \omega') = E(x', \omega') + \int_S \rho(x')\, L(x, \omega)\, G(x, x')\, V(x, x')\, dA$$

 L is the radiance from a point on a surface in a given direction ω
 E is the emitted radiance from a point: E is non-zero only if x’ is
emissive
 V is the visibility term: 1 when the surfaces are unobstructed
along the direction ω, 0 otherwise
 G is the geometry term, which depends on the geometric
relationship between the two surfaces x and x’
It is very hard to solve this equation directly, so approximations
must be used.

Fast and Dirty Approximations
(OpenGL)
    Use red, green, and blue instead of full spectrum
        Roughly follows the eye's sensitivity
        Forego such complex surface behavior as metals
    Use finite number of point light sources instead of full hemisphere
        Integration changes to summation
        Forego such effects as soft shadows and color bleeding
    BRDF behaves independently for each color
        Treat red, green, and blue as three separate computations
        Forego such effects as iridescence and refraction
    BRDF split into three approximate effects
        Ambient: constant, non-directional, background light
        Diffuse: light reflected uniformly in all directions
        Specular: light of higher intensity in mirror-reflection direction
    Radiance L replaced by simple ``intensity'' I
        No pretense of being physically true

Approximate Intensity Equation (single light source)

$$I_o = I^e + k^a I^a + k^d I_l \cos(\theta_l) + k^s I_l W(\theta_l) S(\gamma_l)$$

- each term stands for each of red, green, and blue
- $I_l$ is the intensity of the light source (modified for distance)
- $\cos(\theta_l)$ accounts for the angle of the incoming light
- the $k$'s are between 0 and 1 and represent absorption factors
- $W(\theta_l)$ accounts for any highlight effects that depend on the incoming direction; use $\cos(\theta_l)$ or a constant if there is nothing special
- $\gamma_l$ is the mirror-reflection angle for the light: the angle between the view direction and the mirror-reflection direction
- $S(\gamma_l)$ accounts for highlights in the mirror-reflection direction
- the superscripts e, a, d, s stand for emitted, ambient, diffuse, and specular respectively
- sum over each light $l$ if there is more than one
OpenGL Lighting Model
- Local illumination model
  - Only depends on the relationship to the light source
  - Does not consider light reflected or refracted from other objects
  - Does not model shadows (they can be faked)
- A point is lit only if it can see the light source
  - It is difficult to compute visibility to light sources in complex scenes. OpenGL only tests whether the polygon faces the light source.
  - For a point P on a polygon with normal n, P is lit by the light source Q only if $(Q - P) \cdot n > 0$

Ambient Light Source
  An object is lighted by the ambient
light even if it is not visible to any light
source.
 Ambient light
   no spatial or directional
characteristics.
   The amount of ambient light incident
on each object is a constant for all
surfaces in the scene.
   An ambient light can have a color.
 The amount of ambient light that is
reflected by an object is independent
of the object's position or orientation.
 Surface properties are used to
determine how much ambient light is
reflected.

Directional Light Sources
  All of the rays from a directional
light source have a common
direction, and no point of origin.
      It is as if the light source was
infinitely far away from the
surface that it is illuminating.
      Sunlight is an example of a
directional light source.

  The direction from a surface to a
light source is important for
computing the light reflected from
the surface.
 A directional light source has a
constant direction for every
surface. A directional light source
can be colored.
Point Light Sources
A point light source emits rays in radial directions from its
source. A point light source is a fair approximation to a local
light source such as a light bulb.

The direction of the light to each point on a surface
changes when a point light source is used. Thus, a
normalized vector to the light emitter must be computed for
each point that is illuminated:
    p  l

d
p  l

Other Light Sources
- Spotlights
  - Restrict the light emitted by a point light to a cone
  - Require a color, a point, a direction, and a cutoff angle to define the cone
- Area light sources
  - Light source occupies a 2-D area (usually a polygon or disk)
- Extended light sources
  - e.g., a spherical light source

OpenGL Specifications
    Available light models in OpenGL
        Ambient lights
        Point lights
        Directional lights
        Spot lights
    Attenuation
  - Physical attenuation: $I(P) = \dfrac{I(Q)}{\|P - Q\|^2}$
  - OpenGL attenuation: $I(P) = \dfrac{I(Q)}{a + bd + cd^2}$, where $d = \|P - Q\|$
    - Defaults: a = 1, b = 0, c = 0

Example of OpenGL Light
    Setting up a simple lighting situation
GLfloat ambientIntensity[4] = {0.1, 0.1, 0.1, 1.0};
GLfloat diffuseIntensity[4] = {1.0, 0.0, 0.0, 1.0};
GLfloat position[4] = {2.0, 4.0, 5.0, 1.0};
glEnable(GL_LIGHTING); // enable lighting
glEnable(GL_LIGHT0);    // enable light 0
// set up light 0 properties
glLightfv(GL_LIGHT0, GL_AMBIENT, ambientIntensity);
glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuseIntensity);
glLightfv(GL_LIGHT0, GL_POSITION, position);

Ideal Diffuse Reflection
Ideal Diffuse Reflection: an incoming ray of light is equally likely to be
reflected in any direction over the hemisphere.
      An ideal diffuse surface is, at the microscopic level, a very rough surface.
Chalk is a good approximation to an ideal diffuse surface.
      The reflected intensity is independent of the viewing direction. The
intensity does however depend on the light source's orientation relative to
the surface.

Computing Diffuse Reflection

 Angle of incidence: The angle between the surface
normal and the incoming light ray
 Lambert's law states that the reflected energy from a
small surface area in a particular direction is proportional to
the cosine of the angle of incidence.

Diffuse Lighting Examples
 We need only consider angles from 0 to 90 degrees.
Greater angles (where the dot product is negative) are
blocked by the surface, and the reflected energy is 0.
 Below are several examples of a spherical diffuse
reflector with varying lighting angles.

Specular Reflection

  A specular reflector is necessary to model a shiny
surface, such as polished metal or a glossy car
finish.
 We see a highlight, or bright spot, on such
surfaces.
 Where this bright spot appears on the surface is a
function of where the surface is seen from. This type
of reflectance is view dependent.
 At the microscopic level a specular reflecting
surface is very smooth, and usually these
microscopic surface elements are oriented in the
same direction as the surface itself.
 Specular reflection is merely the mirror reflection of
the light source in a surface. An ideal mirror is a
purely specular reflector.

Reflection
 The incoming ray, the
surface normal, and the
reflected ray all lie in a
common plane.
 The law of reflection states
that the incident angle is equal
to the reflection angle on a
perfectly reflective surface:

$$\theta_l = \theta_r$$
Non-ideal Reflectors
  The law of reflection, however, applies only to ideal
mirror reflectors. Real materials tend to
deviate significantly from ideal reflectors. At
this point we will introduce an empirical
model that is consistent with our experience,
at least to a crude approximation.
 In general, we expect most of the reflected
light to travel in the direction of the ideal ray.
However, because of microscopic surface
variations we might expect some of the light
to be reflected just slightly offset from the
ideal reflected ray. As we move farther and
farther, in the angular sense, from the
reflected ray we expect to see less light
reflected.

Phong Illumination
  One function that approximates this fall off is called the Phong
Illumination model. This model has no physical basis, yet it is
one of the most commonly used illumination models in computer
graphics.
 The cosine term is maximum when the surface is viewed from
the mirror direction and falls off to 0 when viewed at 90 degrees
away from it. The scalar nshiny controls the rate of this fall off.

$$I_{specular} = k_s I_{light} \cos^{n_{shiny}}(\phi)$$
Effect of the nshiny coefficient
The diagram below shows how the reflectance
drops off in the Phong illumination model. For a large
value of the nshiny coefficient, the reflectance
decreases rapidly with increasing viewing angle.

Computing Phong Illumination

$$I_{specular} = k_s I_{light} (\hat{V} \cdot \hat{R})^{n_{shiny}}$$

The V vector is the unit vector in the direction of the viewer, and
the R vector is the mirror reflectance direction. The vector R can
be computed from the incoming light direction and the surface
normal:

$$\hat{R} = 2 (\hat{N} \cdot \hat{L}) \hat{N} - \hat{L}$$
Blinn & Torrance Variation
  Jim Blinn introduced another approach for computing Phong-like
illumination based on the work of Ken Torrance. His illumination
function uses the following equation:

$$I_{specular} = k_s I_{light} (\hat{N} \cdot \hat{H})^{n_{shiny}}$$
  In this equation the angle of specular dispersion is computed by
how far the surface's normal is from a vector bisecting the incoming
light direction and the viewing direction.

On your own you should consider
how this approach and the previous
one differ. OpenGL implements this
model.

Phong Examples
 The following spheres illustrate specular reflections as the
direction of the light source and the coefficient of shininess are varied.

Putting it all together
- Phong Illumination Model:

$$I_{total} = K_a I_{ambient} + I_{light} \left[ K_d (N \cdot L) + K_s (V \cdot R)^{n_{shiny}} \right]$$

Colored Lights and Surfaces

$$I_{total,\lambda} = K_a I_{ambient,\lambda} + \sum_{i=1}^{lights} I_{i,\lambda} \left[ K_d (N \cdot L) + K_s (V \cdot R)^{n_{shiny}} \right]$$

- for each light $I_i$
- for each color component $\lambda$
- reflectance coefficients $k_d$, $k_s$, and $k_a$ are scalars between 0 and 1 that may or may not vary with color
- $n_{shiny}$ is a scalar integer: 1 for a diffuse surface, 100 for metallic shiny surfaces

Where do we Illuminate?

 To this point we have discussed how to compute an illumination
model at a point on a surface. But, at which points on the surface is
the illumination model applied? Where and how often it is applied
has a noticeable effect on the result.
 Lighting can be a costly process involving the computation of and
normalizing of vectors to multiple light sources and the viewer.
 For models defined by collections of polygonal facets or triangles:
 Each facet has a common surface normal
 If the light is directional then the diffuse contribution is constant
across the facet. Why?
 If the eye is infinitely far away and the light is directional then the
specular contribution is constant across the facet. Why?

  The simplest shading method applies only one illumination
calculation for each primitive. This technique is called constant or
flat shading. It is often used on polygonal primitives.

 Drawbacks:
 the direction to the light source varies over the facet
 the direction to the eye varies over the facet
 Nonetheless, illumination is often computed for only a single point
on the facet, usually the centroid.


Even when the illumination equation is applied at
each point of the facet, the polygonal nature is still
apparent.

To overcome this limitation normals are introduced at
each vertex.
     different from the polygon normal
 for shading only (not backface culling or other
computations)
 better approximates smooth surfaces
Vertex Normals
- If vertex normals are not provided they can often be approximated by averaging the normals of the facets which share the vertex:

$$\hat{n}_v = \frac{\sum_{i=1}^{k} \vec{n}_i}{\left\| \sum_{i=1}^{k} \vec{n}_i \right\|}$$

- This only works if the polygons reasonably approximate the underlying surface.

Gouraud Shading
 The Gouraud shading method applies the illumination model at
each vertex, and the colors in the triangle's interior are linearly
interpolated from these vertex values.

 Implemented in OpenGL as Smooth Shading.
 Notice that facet artifacts are still visible.

Phong Shading
  In Phong shading (not to be confused with the Phong illumination model),
the surface normal is linearly interpolated across polygonal facets, and the
illumination model is applied at every point.
 A Phong shader assumes the same input as a Gouraud shader, which
means that it expects a normal for every vertex. The illumination model is
applied at every point on the surface being rendered, where the normal at
each point is the result of linearly interpolating the vertex normals defined
at each vertex of the triangle.

Phong shading will usually result in a very smooth appearance; however,
evidence of the polygonal model can usually be seen along silhouettes.


OpenGL Specifications
    Each light has ambient, diffuse, and specular
components.
    Each light can be a point light, directional light, or
spot light.
        Directional light is a point light positioned at infinity
    Shading Models: flat and smooth
        GL_FLAT, GL_SMOOTH
        Smooth model uses Gouraud shading

OpenGL Examples
    We have shown how to set up lights in OpenGL
    Set surface material properties
        glMaterialf(GLenum face, GLenum pname, GLfloat param)
        glMaterialfv(GLenum face,GLenum pname,GLfloat* param)
        Face can be GL_FRONT, GL_BACK, or GL_FRONT_AND_BACK
        Pname can be GL_AMBIENT, GL_DIFFUSE, GL_SPECULAR,
GL_SHININESS, GL_EMISSION …
GLfloat mat_specular[] = { 1.0, 1.0, 1.0, 1.0 };
GLfloat low_shininess[] = { 5.0 };
glMaterialfv(GL_FRONT, GL_SPECULAR, mat_specular);
glMaterialfv(GL_FRONT, GL_SHININESS,low_shininess);
 Nonzero GL_EMISSION makes an object appear to be giving off
light of that color
    Refer to OpenGL programming guide for more details

Triangle Normals
- Surface normals are the most important geometric surface characteristic used in computing illumination models. They are used in computing both the diffuse and specular components of reflection.
- On a faceted planar surface, vectors in the tangent plane can be computed from surface points as follows:

$$\vec{t}_1 = \vec{p}_1 - \vec{p}_0, \qquad \vec{t}_2 = \vec{p}_2 - \vec{p}_0$$

- The normal is always orthogonal to the tangent space at a point. Given two tangent vectors, we can compute the normal with a cross product:

$$\vec{n} = \vec{t}_1 \times \vec{t}_2$$

- This normal is perpendicular to both tangent vectors:

$$\vec{n} \cdot \vec{t}_1 = \vec{n} \cdot \vec{t}_2 = 0$$

Normals of Curved Surfaces
Not all surfaces are given as planar facets. A common example of
such a surface is a parametric surface, for which the three-space
coordinates are determined by functions of two parameters u and v:

$$S(u, v) = \begin{bmatrix} X(u, v) \\ Y(u, v) \\ Z(u, v) \end{bmatrix}$$

The tangent vectors are computed with partial derivatives and the
normal with a cross product:

$$\vec{t}_1 = \begin{bmatrix} \partial X / \partial u \\ \partial Y / \partial u \\ \partial Z / \partial u \end{bmatrix}, \qquad \vec{t}_2 = \begin{bmatrix} \partial X / \partial v \\ \partial Y / \partial v \\ \partial Z / \partial v \end{bmatrix}, \qquad \vec{n} = \vec{t}_1 \times \vec{t}_2$$
Normals of Implicit Surfaces
- Normals of implicit surfaces S are even simpler. If $S = \{ (x, y, z) \mid F(x, y, z) = 0 \}$ then

$$\vec{n} = \begin{bmatrix} \partial F / \partial x \\ \partial F / \partial y \\ \partial F / \partial z \end{bmatrix}$$

- This is often called the gradient vector.
Texture Mappings

Mapping Techniques
    Consider the problem of rendering a sphere in the
examples
      The geometry is very simple - a sphere
      But the color changes rapidly
      With the local shading model, so far, the only place to
specify color is at the vertices
      To get color details, would need thousands of polygons for
a simple shape
      Same thing goes for an orange: simple shape but
complex normal vectors
    Solution: Mapping techniques use simple geometry
modified by a mapping of some type
Textures
The concept is very simple!

Texture Mapping
    Texture mapping associates the color of a point with
the color in an image: the texture
      Each point on the sphere gets the color of the mapped pixel of
the texture
    Question to address: Which point of the texture do
we use for a given point on the surface?
    Establish a mapping from surface points to image
points
      Different mappings are common for different shapes
      We will, for now, just look at triangles (polygons)

Basic Mapping
    The texture lives in a 2D space
 Parameterize points in the texture with 2 coordinates: (s,t)
 These are just what we would call (x,y) if we were talking about
an image, but we wish to avoid confusion with the world (x,y,z)
    Define the mapping from (x,y,z) in world space to (s,t) in texture
space
 To find the color in the texture, take an (x,y,z) point on the
surface, map it into texture space, and use it to look up the
color of the texture
    With polygons:
 Specify (s,t) coordinates at vertices
 Interpolate (s,t) for other points based on given vertices

Texture Interpolation
- Specify where the vertices in world space are mapped to in texture space
- Linearly interpolate the mapping for other points in world space
  - Straight lines in world space go to straight lines in texture space

(Figure: a texture map with axes s and t, and the corresponding triangle in world space.)

Textures
    Two-dimensional texture pattern T(s, t)
    It is stored in texture memory as an n*m
array of texture elements (texels)
    Due to the nature of the rendering process, which
works on a pixel-by-pixel basis, we are more
interested in the inverse map from screen
coordinates to texture coordinates

Computing Color in Texture Mapping
    Associate texture with polygon
    Map pixel onto polygon and then into texture map
    Use weighted average of covered texture to compute color.

Basic OpenGL Texturing
    Specify texture coordinates for the polygon:
 Use glTexCoord2f(s,t) before each vertex:
   Eg: glTexCoord2f(0,0); glVertex3f(x,y,z);
    Create a texture object and fill it with texture data:
 glGenTextures(num, &indices) to get identifiers for the
objects
 glBindTexture(GL_TEXTURE_2D, identifier) to bind the
texture
   Following texture commands refer to the bound texture
      glTexParameteri(GL_TEXTURE_2D, …, …) to specify
parameters to use when applying the texture
      glTexImage2D(GL_TEXTURE_2D, ….) to specify the texture
data (the image itself)

Basic OpenGL Texturing (cont)
    Enable texturing: glEnable(GL_TEXTURE_2D)
    State how the texture will be used:
      glTexEnvf(…)
    Texturing is done after lighting

Controlling Different Parameters
    The “texels” in the texture map may be interpreted
as many different things. For example:
      As colors in RGB or RGBA format
      As grayscale intensity
      As alpha values only
    The data can be applied to the polygon in many
different ways:
      Replace: Replace the polygon color with the texture color
      Modulate: Multiply the polygon color with the texture color
or intensity
      Similar to compositing: Composite texture with base color
using operator

Texture Stuff
     Texture must be in fast memory - it is accessed for every pixel
drawn
 If you exceed it, performance will degrade horribly
 There are functions for managing texture memory
 Skilled artists can pack textures for different objects into one
map
     Texture memory is typically limited, so a range of functions are
available to manage it
     Specifying texture coordinates can be annoying, so there are
functions to automate it
     Sometimes you want to apply multiple textures to the same
point: Multitexturing is now in some hardware

Yet More Texture Stuff
    There is a texture matrix in OpenGL: apply a matrix
transformation to texture coordinates before
indexing texture
    There are “image processing” operations that can be
applied to the pixels coming out of the texture
    There are 1D and 3D textures
      Mapping works essentially the same
      3D used in visualization applications, such as visualizing
MRI or other medical data
      1D saves memory if the texture is inherently 1D, like
stripes

Procedural Texture Mapping
    Instead of looking up an image, pass the texture
coordinates to a function that computes the texture
value on the fly
      Renderman, the Pixar rendering language, does this
      Available in a limited form with vertex shaders on current
generation hardware
      Near-infinite resolution with small storage cost
    Has the disadvantage of being slow in many cases

Other Types of Mapping
    Bump-mapping computes an offset to the normal
vector at each rendered pixel
 No need to put bumps in geometry, but silhouette
looks wrong
    Displacement mapping adds an offset to the surface at
each point
 Like putting bumps on geometry, but simpler to model

    All are available in software renderers, such as RenderMan-compliant
renderers
    All these are becoming available in hardware
Bump Mapping
Textures can be used to alter the surface normal of an object. This does not change the actual
shape of the surface -- we are only shading it as if it were a different shape! This technique is
called bump mapping. The texture map is treated as a single-valued height function. The
value of the function is not actually used, just its partial derivatives. The partial derivatives tell
how to alter the true surface normal at each point on the surface to make the object appear
as if it were deformed by the height function.

Since the actual shape of the object does not change, the silhouette edge of the object will
not change. Bump Mapping also assumes that the Illumination model is applied at every pixel
(as in Phong Shading or ray tracing).

(Figures: a swirly bump map; a sphere with diffuse texture; the same sphere with diffuse texture and bump map.)
Bump Map Examples

(Figures: a bump map; a cylinder with diffuse texture map; the same cylinder with texture map and bump map.)
Displacement Mapping
We use the texture map to actually move the surface point. This is called
displacement mapping. How is this fundamentally different from bump
mapping?

The geometry must be displaced before visibility is determined.

    Textbook 16.1–16.3
    OpenGL Programming Guide, Chapters 5 and 9

Better Illumination Models
- Blinn–Torrance–Sparrow (1977): isotropic collection of planar microscopic facets
- Cook–Torrance (1982): adds a wavelength-dependent Fresnel term
- He–Torrance–Sillion–Greenberg (1991): adds polarization, statistical microstructure, self-reflectance

Very little of this work has made its way into graphics H/W.

Cook-Torrance Illumination
$$I_\lambda = k_a I_{\lambda,a} + \sum_{i=1}^{lights} I_{\lambda,i} \left[ k_d (\hat{l}_i \cdot \hat{n}) + k_s \frac{D\, G\, F_\lambda(\theta_i)}{\pi\, (\hat{v} \cdot \hat{n})} \right]$$
Iλ,a  - Ambient light intensity
ka - Ambient surface reflectance
Iλ,i - Luminous intensity of light source i
ks - percentage of light reflected specularly (notice terms sum to one)
kd - Diffuse reflectivity
li - vector to light source
n - average surface normal at point
D - the distribution of microfacet orientations
Fλ(θi) - Fresnel conductance term related to material’s index of refraction
v - vector to viewer

Microfacet Distribution Function
$$D = \frac{e^{-\left( \tan(\alpha)/m \right)^2}}{4 m^2 \cos^4(\alpha)}, \qquad \alpha = \cos^{-1}(\hat{n} \cdot \hat{h})$$

• Statistical model of the microfacet variation in normal direction
• Based on a Beckmann distribution function
• Consistent with the surface variations of rough surfaces
• m - the root-mean-square slope of the microfacets
• Small m (e.g., 0.2) : pretty smooth surface
• Large m (e.g., 0.7) : pretty rough surface

Beckmann's Distribution

Geometric Attenuation Factor
 The geometric attenuation factor G accounts for microfacet
shadowing and masking; it lies in the range from 0 (total shadowing) to 1 (no shadowing).
 There are many different ways that an incoming beam of light
can interact with the surface locally.
 The entire beam can simply reflect.

Blocked Reflection
A portion of the out-going beam can be blocked.

Blocked Beam
 A portion of the incoming beam can be blocked.

Geometric Attenuation Factor
In each case, the geometric configurations can be analyzed to compute the percentage of light that actually escapes from the surface:

$$G = 1 - \frac{l_{blocked}}{l_{facet}}$$

$$G_{masking} = \frac{2 (\hat{n} \cdot \hat{h})(\hat{n} \cdot \hat{v})}{\hat{v} \cdot \hat{h}}, \qquad G_{shadowing} = \frac{2 (\hat{n} \cdot \hat{h})(\hat{n} \cdot \hat{l})}{\hat{v} \cdot \hat{h}}$$

The geometric factor chooses the smallest amount of lost light:

$$G = \min\left( 1,\; G_{masking},\; G_{shadowing} \right)$$
Fresnel Effect

    At a very sharp angle, surfaces like a book, a wood table top,
concrete, and plastic can almost become mirrors.

Fresnel Reflection
The   Fresnel term results
from a complete analysis
of the reflection process
while considering light as
an electromagnetic wave.
The behavior of
reflection depends on how
the incoming electric field
is oriented relative to the
surface at the point where
the field makes contact.

Fresnel Reflection
- The Fresnel effect is wavelength dependent. Its behavior is determined by the index of refraction of the material.
- The Fresnel effect accounts for the color change of the specular highlight as a function of the angle of incidence of the light source, particularly on metals (conductors).
- It also explains why most surfaces approximate mirror reflectors when the light strikes them at a grazing angle.

$$F_\lambda(\theta_i) = \frac{1}{2} \frac{(g - c)^2}{(g + c)^2} \left( 1 + \frac{\left( c (g + c) - 1 \right)^2}{\left( c (g - c) + 1 \right)^2} \right)$$

$$c = \cos\theta_i = \hat{l} \cdot \hat{h}, \qquad g = \sqrt{n_\lambda^2 + c^2 - 1}$$

Reflectance of Metal

Reflectance of Dielectrics
    Non-conducting materials, e.g., glass, plastics.

Schlick Approximation
    To calculate F for every angle, we can use
measurements of F0, the value of F at normal
incidence.

F  F0  (1  cos  ) (1  F0 )    5

Index of Refraction

| Medium | Index of refraction |
| --- | --- |
| Vacuum | 1 |
| Air | 1.0003 |
| Water | 1.33 |
| Fused quartz | 1.46 |
| Glass, crown | 1.52 |
| Glass, dense flint | 1.66 |
| Diamond | 2.42 |
| Metal | 200 |
Remaining Hard Problems
- Reflective diffraction effects
  - thin films
  - feathers of a blue jay
- Anisotropy
  - brushed metals
  - materials made of pulled strands
  - satin and velvet cloths
