# Face Detection, Eigenfaces, Fisherfaces


Computer Vision – Exercise 4
Perceptual and Sensory Augmented Computing
Computer Vision WS 09/10

Face Detection, Eigenfaces, Fisherfaces
8.12.2009

Tobias Weyand
RWTH Aachen
http://www.mmp.rwth-aachen.de
weyand@umic.rwth-aachen.de
Any questions?
• Problems with the exercise?
• Feel free to e-mail me in case of problems:
  weyand@umic.rwth-aachen.de
Question 4 Postponed
• Compilation problems …
• You can use our lab to do the exercise
  Room 125 (if it’s locked, ask us)
  Just come over on Wednesday or Thursday
• OpenCV is now available in the RBI pool
  /rbi/openCV
  Matlab is in /rbi/matlabR2008a
  But webcams don’t work there
• You can submit your solution of Q4 until Thursday, 10.12.2009, 23:59
Recap: Subspace Methods

Subspace methods fall into two groups:
• Reconstructive (PCA, ICA, NMF): representation, e.g.
    x ≈ μ + a1·u1 + a2·u2 + a3·u3 + …
• Discriminative (FLD, SVM, CCA): classification, regression

B. Leibe
Slide credit: Ales Leonardis
Recap: Principal Component Analysis
• Direction u that maximizes the variance of the projected data:

    \mathrm{var}(u) = \frac{1}{N}\sum_{i=1}^{N} \left(u^T x_i - u^T \mu\right)^2 = u^T \Sigma u

  where u^T x_i is the projection of data point x_i, \mu = \frac{1}{N}\sum_{i=1}^{N} x_i
  is the mean, and \Sigma = \frac{1}{N}\sum_{i=1}^{N} (x_i - \mu)(x_i - \mu)^T is the
  covariance matrix of the data.
• The direction that maximizes the variance is the eigenvector
  associated with the largest eigenvalue of \Sigma.
Slide credit: Svetlana Lazebnik
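This maximization can be checked numerically. A minimal NumPy sketch (toy data, not the exercise code): the variance of the data projected onto the top eigenvector of the covariance matrix equals the top eigenvalue.

```python
# Sketch: the direction of maximum variance is the top eigenvector of
# the covariance matrix (toy data; not the exercise code).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 1.0],
                                          [0.0, 0.5]])   # anisotropic 2-D data

mu = X.mean(axis=0)
Sigma = (X - mu).T @ (X - mu) / len(X)           # covariance matrix (1/N convention)

eigvals, eigvecs = np.linalg.eigh(Sigma)         # eigenvalues in ascending order
u = eigvecs[:, -1]                               # eigenvector of the largest eigenvalue

# Variance of the projection equals u^T Sigma u, i.e. the top eigenvalue
proj_var = ((X - mu) @ u).var()
```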
Remember: Fitting a Gaussian
• Mean and covariance matrix of data define a Gaussian model
  [Figure: 2-D data points with a fitted Gaussian g; axes g1, g2]
Interpretation of PCA
• Compute the eigenvectors of the covariance matrix \Sigma.
• Eigenvectors: main directions of the data
• Eigenvalues: variance along the corresponding eigenvector
  [Figure: Gaussian g with principal axes \lambda_1 e_1 and \lambda_2 e_2; axes g1, g2]
• Result: coordinate transform to best represent the variance of the data
Singular Value Decomposition (SVD)
• Any m×n matrix A may be factored such that

    A = U \Sigma V^T
    [m×n] = [m×m][m×n][n×n]

• U: m×m, orthogonal matrix
  Columns of U are the eigenvectors of AA^T
• V: n×n, orthogonal matrix
  Columns are the eigenvectors of A^T A
• \Sigma: m×n, diagonal with non-negative entries (\sigma_1, \sigma_2, …, \sigma_s)
  with s = min(m, n), called the singular values
  Singular values are the square roots of the eigenvalues of both
  AA^T and A^T A. Columns of U are the corresponding eigenvectors!
  Result of the SVD algorithm: \sigma_1 ≥ \sigma_2 ≥ … ≥ \sigma_s
Slide credit: Peter Belhumeur
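These facts can be verified directly in NumPy; a small sketch on a toy matrix (sizes are illustrative):

```python
# Sketch verifying the SVD facts above on a toy matrix.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(5, 3))                      # m x n with m=5, n=3

U, s, Vt = np.linalg.svd(A)                      # A = U Sigma V^T, s descending
lam = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1] # eigenvalues of A^T A, descending
```

The singular values `s` should equal the square roots of `lam`, and the first column of `U` should be an eigenvector of `A @ A.T` with eigenvalue `s[0]**2`.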
Performing PCA with SVD
• Singular values of A are the square roots of the eigenvalues
  of both AA^T and A^T A.
  Columns of U are the corresponding eigenvectors.
• And

    \sum_{i=1}^{n} a_i a_i^T = [a_1 \cdots a_n][a_1 \cdots a_n]^T = AA^T

• Covariance matrix

    \Sigma = \frac{1}{n}\sum_{i=1}^{n} (x_i - \mu)(x_i - \mu)^T

• So, ignoring the factor 1/n: subtract the mean image \mu from
  each input image, create the data matrix, and perform
  (thin) SVD on the data matrix.
Slide credit: Peter Belhumeur
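The recipe above can be sketched as follows; sizes and variable names are illustrative (random data, not face images from the exercise):

```python
# Sketch of the recipe: subtract the mean image, stack images as
# columns, and take a thin SVD of the resulting data matrix.
import numpy as np

rng = np.random.default_rng(2)
n_pixels, n_images = 1024, 20                    # e.g. 32x32 images, 20 samples
images = rng.normal(size=(n_pixels, n_images))

mean_image = images.mean(axis=1, keepdims=True)
A = images - mean_image                          # mean-subtracted data matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False) # thin SVD: U is n_pixels x n_images
eigenfaces = U                                   # columns: principal directions
variances = s**2 / n_images                      # eigenvalues of the 1/n covariance
```

The thin SVD avoids ever forming the huge n_pixels × n_pixels matrix AA^T.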
Recap: Eigenfaces
[Figure: eigenfaces illustration]
Slide credit: Peter Belhumeur
Eigenfaces Example
• Face x in “face space” coordinates: w = U^T(x − µ)
• Reconstruction:

    x ≈ µ + w_1 u_1 + w_2 u_2 + w_3 u_3 + w_4 u_4 + …

Slide credit: Svetlana Lazebnik
Recap: Properties of PCA
• It can be shown that the mean square error between x_i and its
  reconstruction using only the first m principal eigenvectors is
  given by the expression:

    \sum_{j=1}^{N} \lambda_j - \sum_{j=1}^{m} \lambda_j = \sum_{j=m+1}^{N} \lambda_j

  [Figure: cumulative influence of eigenvectors; the first k eigenvectors
  capture 90% of the variance]
• Interpretation
  PCA minimizes reconstruction error
  PCA maximizes variance of projection
  Finds a more “natural” coordinate system for the sample data
Slide credit: Ales Leonardis
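Since the reconstruction MSE is the sum of the discarded eigenvalues, the usual rule of thumb is to pick the smallest m that keeps, say, 90% of the total variance. A sketch with made-up eigenvalues:

```python
# Sketch: choose the number of components m that keeps 90% of the
# total variance (illustrative eigenvalues, sorted descending).
import numpy as np

eigenvalues = np.array([5.0, 3.0, 1.0, 0.5, 0.3, 0.2])
cumulative = np.cumsum(eigenvalues) / eigenvalues.sum()

m = int(np.argmax(cumulative >= 0.90)) + 1       # smallest m reaching 90%
mse = eigenvalues[m:].sum()                      # expected reconstruction MSE
```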
Recap: Object Identification by Distance IN Eigenspace
• Objects are represented as coordinates in an n-dim. eigenspace.
• Example:
  3D space with points representing individual objects, or a
  manifold representing a parametric eigenspace (e.g., orientation,
  pose, illumination).
• Estimate parameters by finding the nearest neighbor (NN) in the eigenspace
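The nearest-neighbor step can be sketched in a few lines; the eigenspace coordinates below are made up for illustration:

```python
# Sketch: identify an object by its nearest neighbor in eigenspace.
import numpy as np

# Known objects, already projected into a 3-D eigenspace (one per row)
gallery = np.array([[0.9, 0.1, 0.0],
                    [0.0, 1.0, 0.2],
                    [0.2, 0.1, 0.8]])
query = np.array([0.15, 0.12, 0.75])             # projection of a new image

dists = np.linalg.norm(gallery - query, axis=1)  # Euclidean distance in eigenspace
identity = int(np.argmin(dists))                 # index of the nearest neighbor
```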
Benefits of PCA
• Nicely interpretable eigenvectors (eigenfaces)
• Very compact representation
  Faster matching
• Noise reduction
Projection and Reconstruction
• An n-pixel image x ∈ R^n can be projected to a low-dimensional
  feature space y ∈ R^m by

    y = W x

• From y ∈ R^m, the reconstruction of the point is W^T y
• The error of the reconstruction is

    \|x - W^T W x\|

Slide credit: Peter Belhumeur
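A sketch of these formulas, assuming W has orthonormal rows (here a random orthonormal basis built with QR, purely for illustration; in the eigenface setting the rows of W would be the first m eigenfaces):

```python
# Sketch: projection y = W x, reconstruction W^T y, and the
# reconstruction error ||x - W^T W x||.
import numpy as np

rng = np.random.default_rng(3)
n, m = 100, 10
W = np.linalg.qr(rng.normal(size=(n, m)))[0].T   # m x n, rows orthonormal

x = rng.normal(size=n)                           # an "image" in R^n
y = W @ x                                        # projection:     y = W x
x_rec = W.T @ y                                  # reconstruction: W^T y
error = np.linalg.norm(x - x_rec)                # ||x - W^T W x||
```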
Recap: Object Detection by Distance TO Eigenspace
• Scan a window ω over the image and classify the window as object
  or non-object as follows:
  Project the window to the subspace and reconstruct as earlier.
  Compute the distance between ω and the reconstruction (reprojection error).
  Local minima of the distance over all image locations → object locations
  Repeat at different scales.
  Possibly normalize the window intensity such that ||ω|| = 1.
Slide credit: Peter Belhumeur
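The detection loop above can be sketched as follows; the image, window size, and subspace basis are toy placeholders (a random orthonormal basis standing in for the learned eigenfaces):

```python
# Sketch: slide a window over the image, normalize it, reconstruct it
# from the subspace, and score each location by reprojection error.
import numpy as np

rng = np.random.default_rng(4)
win = 8                                          # 8x8 windows -> 64-dim vectors
U = np.linalg.qr(rng.normal(size=(win * win, 5)))[0]  # orthonormal basis, 5 dims
image = rng.normal(size=(32, 32))

h = image.shape[0] - win + 1
errors = np.zeros((h, h))
for r in range(h):
    for c in range(h):
        w = image[r:r + win, c:c + win].ravel()
        w = w / np.linalg.norm(w)                # normalize so that ||w|| = 1
        w_rec = U @ (U.T @ w)                    # project and reconstruct
        errors[r, c] = np.linalg.norm(w - w_rec) # distance TO the eigenspace

best = np.unravel_index(np.argmin(errors), errors.shape)  # candidate location
```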
Scatter Matrices
• We calculate the within-class scatter matrix as:

    S_W = \sum_{i=1}^{c} \sum_{x_k \in X_i} (x_k - \mu_i)(x_k - \mu_i)^T

• We calculate the between-class scatter matrix as:

    S_B = \sum_{i=1}^{c} N_i (\mu_i - \mu)(\mu_i - \mu)^T

Slide credit: Peter Belhumeur
Visualization
[Figure: two class clusters with scatters S_1 and S_2; S_W = S_1 + S_2,
S_B separates the class means: good separation]
Slide credit: Ales Leonardis
Recap: Fisher’s Linear Discriminant (FLD)
• Maximize the distance between classes
• Minimize the distance within a class
• Criterion:

    J(w) = \frac{w^T S_B w}{w^T S_W w}

  S_B … between-class scatter matrix
  S_W … within-class scatter matrix
• Vector w is a solution of a generalized eigenvalue problem:

    S_B w = \lambda S_W w

• Classification function: project a sample x onto w and threshold the result
  [Figure: classes 1 and 2 projected onto the discriminant direction w]
Slide credit: Ales Leonardis
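A sketch of FLD on toy two-class data: when S_W is invertible, one way to solve the generalized eigenvalue problem is as an ordinary eigenproblem of S_W^{-1} S_B (data and names are illustrative):

```python
# Sketch: FLD direction w as the top eigenvector of S_W^{-1} S_B.
import numpy as np

rng = np.random.default_rng(6)
X1 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
X2 = rng.normal(loc=[3.0, 0.0], scale=0.5, size=(50, 2))
mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)

S_W = (X1 - mu1).T @ (X1 - mu1) + (X2 - mu2).T @ (X2 - mu2)
d = (mu1 - mu2)[:, None]
S_B = d @ d.T                                    # two-class between-class scatter

eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])  # discriminant direction

p1, p2 = X1 @ w, X2 @ w                          # projections of the two classes
```

The projections p1 and p2 of the two classes end up well separated along w.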
Application: Fisherfaces
• Idea:
  Use Fisher’s linear discriminant to find class-specific linear
  projections that compensate for lighting/facial expression.
• Singularity problem
  The within-class scatter is always singular for face recognition,
  since #training images << #pixels.
  This problem is overcome by applying PCA first.
[Belhumeur et al. 1997]
Slide credit: Peter Belhumeur
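The two-stage recipe (PCA down to N − c dimensions so that S_W becomes non-singular, then FLD in the reduced space) can be sketched as follows. This follows the idea in Belhumeur et al. 1997, but with toy random data and illustrative names, not their implementation:

```python
# Sketch of the Fisherface recipe: PCA first, then FLD.
import numpy as np

rng = np.random.default_rng(7)
n_pixels, n_per_class, c = 64, 5, 2              # 10 training images, 2 classes
X = np.vstack([rng.normal(loc=0.0, size=(n_per_class, n_pixels)),
               rng.normal(loc=1.0, size=(n_per_class, n_pixels))])
labels = np.repeat(np.arange(c), n_per_class)

# Step 1: PCA down to N - c dimensions so S_W becomes non-singular
mu = X.mean(axis=0)
U = np.linalg.svd((X - mu).T, full_matrices=False)[0][:, :len(X) - c]
Y = (X - mu) @ U                                 # data in the reduced space

# Step 2: FLD in the reduced space
dim = Y.shape[1]
S_W, S_B = np.zeros((dim, dim)), np.zeros((dim, dim))
for i in range(c):
    Yi = Y[labels == i]
    mi = Yi.mean(axis=0)
    S_W += (Yi - mi).T @ (Yi - mi)
    dvec = (mi - Y.mean(axis=0))[:, None]
    S_B += len(Yi) * (dvec @ dvec.T)

evals, evecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
w_fld = np.real(evecs[:, np.argmax(np.real(evals))])
fisherface = U @ w_fld                           # discriminant back in pixel space
```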
Recap: Fisherfaces
• Example Fisherface for recognition “Glasses/NoGlasses”
  [Figure: Fisherface for the glasses/no-glasses discrimination]
[Belhumeur et al. 1997]
Slide credit: Peter Belhumeur
