# Approximate Nearest Subspace


## Approximate Nearest Subspace Search, with Applications to Pattern Recognition

Ronen Basri, Tal Hassner (Weizmann Institute); Lihi Zelnik-Manor (Caltech)
## Subspaces in Computer Vision

- Illumination, faces (Basri & Jacobs, PAMI '03)
- Objects (Nayar et al., IUW '96)
- Viewpoint, motion
- Dynamic textures (Zelnik-Manor & Irani, PAMI '06)
- …
## Nearest Subspace Search

Given a database of n subspaces of R^d, each of dimension k, and a query point: which subspace is nearest?

Sequential search costs O(ndk). Too slow! Is there a sublinear solution?
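As a baseline, the O(ndk) sequential scan is straightforward. A minimal sketch, assuming each subspace is stored as a d × k orthonormal basis (the function names and the toy data are illustrative, not from the slides):

```python
import numpy as np

def dist2_to_subspace(q, S):
    """Squared distance from point q to the linear subspace spanned by
    the orthonormal columns of S (d x k): ||q - S S^T q||^2."""
    r = q - S @ (S.T @ q)
    return float(r @ r)

def sequential_nearest_subspace(q, subspaces):
    """Brute-force O(ndk) scan over a list of orthonormal bases."""
    dists = [dist2_to_subspace(q, S) for S in subspaces]
    i = int(np.argmin(dists))
    return i, dists[i]

# toy database: n random k-dimensional subspaces of R^d
rng = np.random.default_rng(0)
d, k, n = 20, 4, 100
subspaces = [np.linalg.qr(rng.standard_normal((d, k)))[0] for _ in range(n)]
q = rng.standard_normal(d)
idx, dist2 = sequential_nearest_subspace(q, subspaces)
```

Each query touches every basis once, which is exactly the linear cost the slides want to avoid.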
## A Related Problem: Nearest Neighbor Search

Database: n points in d dimensions. Sequential search costs O(nd), but here a sublinear solution exists.

Approximate NN methods:

- Tree search (kd-trees)
- Locality Sensitive Hashing, which returns a point within (1 + ε)r of the query when the true nearest neighbor is within r

Query time: logarithmic. Preprocessing: O(dn). Fast!
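For concreteness, here is one off-the-shelf approximate point NN tool, a kd-tree with a relative-error tolerance. This is a sketch assuming SciPy is available; it is not the specific ANN library the authors used:

```python
import numpy as np
from scipy.spatial import cKDTree  # assumes SciPy is installed

rng = np.random.default_rng(1)
n, d = 10000, 8
points = rng.standard_normal((n, d))

tree = cKDTree(points)        # preprocessing: build the tree once

q = rng.standard_normal(d)
# eps > 0 permits an approximate answer: the returned neighbor is
# guaranteed to be within a (1 + eps) factor of the true nearest distance
dist, idx = tree.query(q, k=1, eps=0.5)
```

The `eps` knob trades accuracy for speed, which is the same (1 + ε) guarantee the slide attributes to LSH and tree search.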
## Is It Possible to Speed Up Nearest Subspace Search?

Existing point-based methods (LSH, tree search) cannot be applied directly to subspaces.
## Our Suggested Approach

- Reduce the subspace problem to a point problem
- Works for both linear and affine subspaces

[Plot: run time vs. database size, sequential search vs. our method.]
## Problem Definition

Let S be a subspace with dim(S) = k, and let q be a query point. Find independent mappings u = f(S) and v = g(q) such that

‖u − v‖² = α · dist²(q, S) + β,

i.e. the point distance is monotonic in (in fact, a linear function of) the original distance. Then standard point ANN can be applied to u and v.
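The whole pipeline can be sketched end to end: map every database subspace to a point, map the query to a point, and hand both to a standard point NN structure. A minimal version, assuming orthonormal bases and using a SciPy kd-tree as the point-search stage (helper names are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def subspace_to_point(S):
    d = S.shape[0]
    return (S @ S.T - np.eye(d)).ravel()   # u = Vec(S S^T - I)

def query_to_point(q):
    return np.outer(q, q).ravel()          # v = Vec(q q^T)

rng = np.random.default_rng(2)
d, k, n = 10, 3, 200
subspaces = [np.linalg.qr(rng.standard_normal((d, k)))[0] for _ in range(n)]
q = rng.standard_normal(d)

# preprocessing: map every subspace to a point in R^{d^2}, index the points
tree = cKDTree(np.stack([subspace_to_point(S) for S in subspaces]))
_, ans_idx = tree.query(query_to_point(q), k=1)

# sequential ground truth for comparison
def dist2(q, S):
    r = q - S @ (S.T @ q)
    return float(r @ r)
true_idx = int(np.argmin([dist2(q, S) for S in subspaces]))
```

Because the point distance is a monotone (affine) function of the subspace distance, an exact point NN query recovers exactly the nearest subspace; an approximate point query yields an approximate nearest subspace.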
## Finding a Reduction

Let S also denote a d × k orthonormal basis of the subspace, so the projection of q onto it is SSᵀq and

dist²(q, S) = ‖q − SSᵀq‖² = qᵀ(I − SSᵀ)q = −⟨Vec(SSᵀ − I), Vec(qqᵀ)⟩.

Feeling lucky? Take u = Vec(SSᵀ − I) and v = Vec(qqᵀ). Then

‖u − v‖² = ‖u‖² + ‖v‖² − 2⟨u, v⟩ = ‖u‖² + ‖v‖² + 2 dist²(q, S).

We are lucky!! The constants are harmless: ‖u‖² = d − k is the same for every database subspace, while ‖v‖² = ‖q‖⁴ depends only on the query.
## Basic Reduction

u = Vec(SSᵀ − I), v = Vec(qqᵀ)

‖u − v‖² = α · dist²(q, S) + β

Want: minimize β / α.
## Geometry of the Basic Reduction

With u = Vec(SSᵀ − I) and v = Vec(qqᵀ):

- Queries lie on a cone: ‖v‖² = ‖q‖⁴.
- Database points lie on a sphere (‖u‖² = d − k) and on a hyperplane.
## Improving the Reduction

Starting from u = Vec(SSᵀ − I) and v = Vec(qqᵀ), the mapping can be rescaled and shifted to shrink the additive constant.

## Final Reduction

The final mapping keeps the same form but introduces constants α, β, γ, chosen to minimize the ratio of the additive constant to the slope.
## Can We Do Better?

If the query q lies on S, then dist²(q, S) = 0, yet ‖u − v‖² = β. If we could force β = 0, then u = v whenever q ∈ S, i.e. the mapping would be trivial. The additive constant is therefore inherent.

## Final Mapping Geometry

[Figure: geometry of the final mapping.]
## ANS Complexities

- Preprocessing: O(nkd²), linear in n
- Query: O(d²) + T_ANN(n, d²), logarithmic in n
## Dimensionality May Be Large

- The embedding lives in d² dimensions
- Might need to use a small ε
- Current solution:
  - Use random projections (Johnson-Lindenstrauss lemma)
  - Repeat several times and select the nearest result
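One plausible reading of "repeat and select the nearest" can be sketched as follows: draw several independent Gaussian random projections (which approximately preserve distances, per Johnson-Lindenstrauss), collect the projected nearest neighbor from each trial as a candidate, then rank the candidates by their true distance in the original space. The dimensions and trial count below are illustrative, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(4)
n, D, m = 500, 400, 64            # e.g. D = d^2 mapped points, project down to m dims
points = rng.standard_normal((n, D))
query = rng.standard_normal(D)

candidates = set()
for _ in range(5):                # several independent random projections
    R = rng.standard_normal((D, m)) / np.sqrt(m)   # Gaussian JL projection
    Y = points @ R
    yq = query @ R
    # the nearest neighbor in the projected space becomes a candidate
    candidates.add(int(np.argmin(((Y - yq) ** 2).sum(axis=1))))

# select the candidate that is truly nearest in the original space
best = min(candidates, key=lambda i: float(((points[i] - query) ** 2).sum()))
```

Repeating the projection reduces the chance that an unlucky projection distorts the distance to the true nearest neighbor.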
## Synthetic Data

[Plots: run time vs. database size (d = 60, k = 4) and run time vs. dimension (n = 5000, k = 4), sequential search vs. our method.]
## Face Recognition (YaleB)

Database: faces under 64 illuminations each, represented as k = 9 dimensional subspaces. Query: a face under a new illumination.

## Face Recognition Result

[Figure: matches found by true NS vs. approximate NS; wrong matches and wrong-person errors are marked.]
## Retiling with Patches

[Figure: query image approximated from a patch database.]

## Retiling with Subspaces

[Figure: query image approximated from a subspace database.]

Patches + ANN: ~0.6 s. Subspaces + ANS: ~1.2 s.
## Summary

- Fast, approximate nearest subspace search
- Reduction to point ANN
- Useful applications in computer vision
- Limitations: embedding in d² dimensions; an inherent additive constant β
- Other methods?