
Hierarchical Image-Motion Segmentation using Swendsen-Wang Cuts

Adrian Barbu
Siemens Corporate Research, Princeton, NJ
Harvard, May 14th, 2007

Acknowledgements: S.C. Zhu, Y.N. Wu, A.L. Yuille et al.

                           Talk Outline
   The Swendsen-Wang Cuts algorithm
       The original Swendsen-Wang algorithm
       Generalization to arbitrary probabilities
   Multi-Grid and Multi-Level Swendsen-Wang Cuts
   Application: Hierarchical Image-Motion Segmentation
   Conclusions and future work




      Swendsen-Wang for Ising / Potts Models
Swendsen-Wang (1987) is an extremely smart idea that flips a whole patch at a time.


[Figure: the label lattice broken into connected components V0, V1, V2 by the edge-removal step.]
Each edge in the lattice, e = <s,t>, is assigned a probability q = e^{-β}.
1. If s and t have different labels at the current state, e is turned off.
   If s and t have the same label, e is turned off with probability q.
   Thus each object is broken into a number of connected components (subgraphs).
2. One or many components are chosen at random.
3. The collective label of each chosen component is changed randomly to any of the labels (a minimal sweep is sketched below).
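To make the three steps concrete, here is a minimal Python sketch of one such sweep on a 2D Potts lattice; the function name sw_sweep and the union-find bookkeeping are illustrative choices, not the original code.

```python
import numpy as np

def sw_sweep(labels, beta, n_labels, rng):
    """One Swendsen-Wang sweep for a Potts model on a 2D lattice (sketch).

    Edges between same-label neighbors are turned off with probability
    q = exp(-beta); the resulting connected components are then each
    assigned a new label uniformly at random.
    """
    h, w = labels.shape
    q = np.exp(-beta)                       # probability of turning an edge off

    # Union-find over pixels to track connected components of "on" edges.
    parent = np.arange(h * w)
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)

    # Bond step: keep an edge "on" with probability 1 - q if the labels agree.
    for y in range(h):
        for x in range(w):
            i = y * w + x
            if x + 1 < w and labels[y, x] == labels[y, x + 1] and rng.random() > q:
                union(i, i + 1)
            if y + 1 < h and labels[y, x] == labels[y + 1, x] and rng.random() > q:
                union(i, i + w)

    # Flip step: each connected component gets a uniformly random new label.
    new_label = {}
    out = labels.reshape(-1).copy()
    for i in range(h * w):
        r = find(i)
        if r not in new_label:
            new_label[r] = rng.integers(n_labels)
        out[i] = new_label[r]
    return out.reshape(h, w)

# Example usage:
# rng = np.random.default_rng(0)
# labels = rng.integers(3, size=(64, 64))
# labels = sw_sweep(labels, beta=0.9, n_labels=3, rng=rng)
```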
           The Swendsen-Wang Algorithm
Pros:
   Computationally efficient in sampling the Ising/Potts models
Cons:
   Limited to Ising/Potts models and factorized distributions
   Not informed by data; slows down in the presence of an external field (data term)

Swendsen-Wang Cuts
   Generalizes Swendsen-Wang to arbitrary posterior probabilities
   Improves the clustering step by using the image data


          SW Cuts: the Acceptance Probability
Theorem (Metropolis-Hastings). For any proposal probability q(A→B) and probability
p(A), if the Markov chain moves by taking samples from q(A→B) which are accepted
with probability

    α(A→B) = min( 1, [q(B→A) p(B)] / [q(A→B) p(A)] )

then the Markov chain is reversible with respect to p and has stationary distribution p.

Theorem (Barbu, Zhu '03). The acceptance probability for the Swendsen-Wang Cuts
algorithm is

    α(A→B) = min( 1, [ ∏_{e∈C(V0, Vl'\V0)} (1−qe) / ∏_{e∈C(V0, Vl\V0)} (1−qe) ]
                     × [ q(l|V0, B) / q(l'|V0, A) ]
                     × [ p(B|I) / p(A|I) ] )

where C(V0, Vl\V0) is the cut between V0 and the rest of the subgraph it leaves in
state A, and C(V0, Vl'\V0) is the cut between V0 and the subgraph it joins in state B.
The Swendsen-Wang Cuts Algorithm

Swendsen-Wang Cuts: SWC
Input: Go = <V, Eo>, discriminative probabilities qe, e ∈ Eo,
       and generative posterior probability p(W|I).
Output: Samples W ~ p(W|I).
1. Initialize a graph partition.
2. Repeat, for current state A = π:
3.   Repeat for each subgraph Gl = <Vl, El>, l = 1, 2, ..., n in A:
4.     For e ∈ El turn e = "on" with probability qe.
5.     Partition Gl into nl connected components: gli = <Vli, Eli>, i = 1, ..., nl.
6.   Collect all the connected components in CP = {Vli : l = 1, ..., n, i = 1, ..., nl}.
7.   Select a connected component V0 ∈ CP at random.
8.   Propose to reassign V0 to a subgraph Gl', where l' follows a probability q(l'|V0, A).
9.   Accept the move with probability α(A→B).

[Figure: the initial graph Go; the set CP of connected components obtained by turning edges on/off; and the move from state A to state B in which the component V0 is reassigned.]
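Below is a minimal Python sketch of one such move, using networkx for the connected-component step; posterior and q_relabel stand in for the generative posterior p(W|I) and the reassignment proposal q(l'|V0, A), and all names are illustrative assumptions rather than the original implementation.

```python
import numpy as np
import networkx as nx

def swc_move(graph, labels, q_edge, posterior, q_relabel, rng):
    """One Swendsen-Wang Cuts move, following steps 3-9 above (sketch).

    graph     : nx.Graph over the vertices V
    labels    : dict vertex -> subgraph label (current state A)
    q_edge    : dict (u, v) -> discriminative edge probability q_e
    posterior : callable labels -> unnormalized p(W|I)
    q_relabel : callable (component, labels) -> dict label -> proposal prob
                (assumed normalized; plays the role of q(l'|V0, A))
    """
    def qe(u, v):
        return q_edge[(u, v)] if (u, v) in q_edge else q_edge[(v, u)]

    # Steps 4-6: turn edges on with probability q_e within each subgraph and
    # collect the connected components CP.
    on = nx.Graph()
    on.add_nodes_from(graph.nodes)
    for u, v in graph.edges:
        if labels[u] == labels[v] and rng.random() < qe(u, v):
            on.add_edge(u, v)
    CP = list(nx.connected_components(on))

    # Step 7: select a connected component V0 at random.
    V0 = CP[rng.integers(len(CP))]
    old_label = labels[next(iter(V0))]

    # Step 8: propose a new label l' for V0, drawn from q(l'|V0, A).
    probs_A = q_relabel(V0, labels)
    cands = list(probs_A)
    new_label = cands[rng.choice(len(cands), p=[probs_A[c] for c in cands])]
    B = dict(labels)
    for v in V0:
        B[v] = new_label

    # Step 9: acceptance = (cut-weight ratio) x (proposal ratio) x (posterior ratio).
    def cut_weight(state, target):
        # Product of (1 - q_e) over edges between V0 and vertices labeled `target`.
        w = 1.0
        for u in V0:
            for v in graph.neighbors(u):
                if v not in V0 and state[v] == target:
                    w *= 1.0 - qe(u, v)
        return w

    probs_B = q_relabel(V0, B)
    ratio = (cut_weight(B, new_label) / cut_weight(labels, old_label)
             * probs_B.get(old_label, 0.0) / probs_A[new_label]
             * posterior(B) / posterior(labels))
    return B if rng.random() < min(1.0, ratio) else labels
```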
        Advantages of the SW Cuts Algorithm


   Our algorithm bridges the gap between the specialized
    and generic algorithms:
       Generally applicable – allows usage of complex models
        beyond the scope of the specialized algorithms
       Computationally efficient – performance comparable with the
        specialized algorithms
       Reversible and ergodic – theoretically guaranteed to
        eventually find the global optimum




Hierarchical Image-Motion Segmentation
Three-level representation (a schematic data layout is sketched after this list):
X2 – Level 2: Intensity regions are grouped into
     moving objects Oi with motion parameters θi


                X1 – Level 1: Atomic regions are grouped into
                     intensity regions Rij of coherent motion
                     with intensity models Hij
                X0 – Level 0: Pixels are grouped into atomic regions
                    rijk of relatively constant motion and intensity
                         – motion parameters (uijk,vijk)
                         – intensity histogram hijk
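As an illustration only, the three levels could be held in a nested structure such as the following sketch; all class and field names are assumptions for exposition, not the paper's implementation.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class AtomicRegion:                    # level 0: r_ijk
    pixels: List[Tuple[int, int]]      # pixel coordinates
    motion: Tuple[float, float]        # motion parameters (u_ijk, v_ijk)
    histogram: List[float]             # intensity histogram h_ijk

@dataclass
class IntensityRegion:                 # level 1: R_ij
    atoms: List[AtomicRegion]
    intensity_model: List[float]       # intensity model H_ij

@dataclass
class MovingObject:                    # level 2: O_i
    regions: List[IntensityRegion]
    motion_params: List[float]         # object motion parameters

@dataclass
class SceneState:                      # the full hierarchical state W = (X0, X1, X2)
    objects: List[MovingObject] = field(default_factory=list)
```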




                                     Multi-Grid SWC
[Figure: states XA and XB within an attention window; the connected component R is swapped between the subgraphs V1, V2, V3.]

1. Select an attention window Λ ⊂ G.
2. Cluster the vertices within Λ and select a connected component R.
3. Swap the label of R.
4. Accept the swap with probability α, using the labels outside Λ as boundary
   condition (a minimal sketch follows below).
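A minimal sketch of one Multi-Grid move, reusing the swc_move sketch above; the boundary treatment is simplified to clamping the labels outside the window when evaluating the posterior, which is an assumption of this sketch rather than the paper's exact construction.

```python
def multigrid_swc_move(graph, labels, q_edge, posterior, q_relabel, window, rng):
    """One Multi-Grid SWC move: run SWC inside the attention window only,
    keeping the labels outside the window fixed as boundary conditions."""
    sub = graph.subgraph(window).copy()            # vertices inside the window Λ

    def clamped_posterior(sub_labels):
        full = dict(labels)                        # labels outside Λ stay fixed
        full.update(sub_labels)
        return posterior(full)

    sub_labels = {v: labels[v] for v in window}
    new_sub = swc_move(sub, sub_labels, q_edge, clamped_posterior, q_relabel, rng)
    out = dict(labels)
    out.update(new_sub)
    return out
```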



                          Multi-Level SWC




1. Select a level s, usually in increasing order.
2. Cluster the vertices in G(s) and select a connected component R.
3. Swap the label of R.
4. Accept the swap with probability α, using the lower levels, denoted by X(<s),
   as boundary conditions (a minimal sketch follows below).
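Similarly, a sketch of a Multi-Level sweep that visits the levels in increasing order and conditions each level on the states X(<s) below it; the containers and callback signatures are assumptions made for illustration.

```python
def multilevel_swc_sweep(level_graphs, level_states, q_edges, posteriors,
                         q_relabel, rng):
    """One Multi-Level sweep: visit the levels in increasing order and run an
    SWC move on each level's graph G(s), conditioning on the lower levels X(<s)."""
    for s in range(len(level_graphs)):
        def posterior_s(state, s=s):               # p(X(s) | X(<s), I), schematically
            return posteriors[s](state, level_states[:s])
        level_states[s] = swc_move(level_graphs[s], level_states[s],
                                   q_edges[s], posterior_s, q_relabel, rng)
    return level_states
```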



      Hierarchical Image-Motion Segmentation
Modeling occlusion
   Accreted (disoccluded) pixels
   Motion pixels

Bayesian formulation (a schematic form is sketched below)
   Motion pixels are explained by motion.
   Accreted pixels are handled by a separate term.
   Intensity segmentation factor with generative and histogram models.
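A schematic form of such a posterior, written only to tie the bullets together; the occlusion term p_acc and the exact notation are assumptions rather than the paper's precise model.

```latex
% Schematic posterior: motion pixels explained by motion from the previous frame,
% accreted pixels by a separate occlusion term, plus an intensity segmentation factor.
p(W \mid I_1, I_2) \;\propto\;
    \prod_{v \in \text{motion pixels}} p\!\left(I_2(v) \mid I_1(v - m_v)\right)
    \prod_{v \in \text{accreted pixels}} p_{\mathrm{acc}}\!\left(I_2(v)\right)
    \prod_{i,j} p\!\left(I(R_{ij}) \mid H_{ij}\right)\, p(W)
```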

     Hierarchical Image-Motion Segmentation
The prior has factors for
   Smoothness of motion
   Main motion for each object
   Boundary length
   Number of labels
A schematic combination of these factors is sketched below.
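One schematic way these factors could combine into a prior; the weights α, λ, γ, μ and the potentials are illustrative placeholders, not the paper's exact energies.

```latex
% Schematic prior: motion smoothness within objects, agreement with each object's
% main motion θ_i, a boundary-length penalty, and a penalty on the number of labels |L|.
p(W) \;\propto\; \exp\Big\{
      -\alpha \sum_{\langle u,v \rangle} \|m_u - m_v\|^2
      \;-\; \lambda \sum_i \sum_{v \in O_i} \rho(m_v, \theta_i)
      \;-\; \gamma \sum_{i,j} |\partial R_{ij}|
      \;-\; \mu \, |L| \Big\}
```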



                  Designing the Edge Weights
Level 0:
   Pixel similarity
   Common motion

Level 1:
   Similarity of the intensity histograms Hi and Hj

Level 2:
   Similarity of the motion histograms Mi and Mj

An illustrative computation of such edge weights is sketched below.
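An illustrative way to turn such similarities into edge probabilities q_e in (0, 1); the symmetrized KL form and the constants here are assumptions, the actual formulas are those of the paper.

```python
import numpy as np

def sym_kl(p, q, eps=1e-8):
    """Symmetrized KL divergence between two histograms (normalized internally)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return 0.5 * float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def q_level0(intensity_diff, motion_diff, a=1.0, b=1.0):
    # Level 0: pixel similarity combined with common motion.
    return float(np.exp(-a * abs(intensity_diff) - b * abs(motion_diff)))

def q_level1(H_i, H_j, scale=1.0):
    # Level 1: similarity of the intensity histograms H_i, H_j.
    return float(np.exp(-scale * sym_kl(H_i, H_j)))

def q_level2(M_i, M_j, scale=1.0):
    # Level 2: similarity of the motion histograms M_i, M_j.
    return float(np.exp(-scale * sym_kl(M_i, M_j)))
```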
                 Experiments




[Figures: two example sequences, each shown with its image segmentation and motion segmentation results.]

                 Experiments




[Figures: two more example sequences, each shown with its image segmentation and motion segmentation results.]

                           Conclusion
Two extensions:
 Swendsen-Wang Cuts
       Samples arbitrary probabilities on Graph Partitions
       Efficient by using data-driven techniques
       Hundreds of times faster than the Gibbs sampler


   Marginal Space Learning
       Constrain search by learning in Marginal Spaces
       Six orders of magnitude speedup with great accuracy
       Robust, complex statistical model by supervised learning

                            Future Work
   Algorithm Boosting
        Any algorithm has a success rate and an error rate
        Can combine algorithms into a more robust algorithm by supervised learning
        Proof of concept for Image Registration

   Hierarchical Computing
        Efficient representation of Top-Down and Bottom-Up communication using
         specialized dictionaries
        Robust integration of multiple MSL paths by Algorithm Boosting


   Applications to medical imaging
        3D curve localization and tracking
        Brain segmentation
        Lymph node detection

                              References
A. Barbu, S.C. Zhu. Generalizing Swendsen-Wang to sampling arbitrary posterior
probabilities. IEEE Trans. PAMI, August 2005.
http://www.stat.ucla.edu/~abarbu/Research/partition-pami.pdf

A. Barbu, S.C. Zhu. Generalizing Swendsen-Wang for Image Analysis. To appear in
J. Comp. Graph. Stat. http://www.stat.ucla.edu/~abarbu/Research/jcgs.pdf


                                     Thank You!



