
Directional Multiscale Image Modeling using the Contourlet Transform

Directional Multiscale Modeling of Images


      Duncan Po and Minh N. Do
University of Illinois at Urbana-Champaign
  Motivation: Image Modeling




       (Figure: a randomly generated image vs. a “natural” image)

A simple and accurate image model is key to many
           image processing applications.
Background: Multiscale Modeling




•Initially: the wavelet transform as a good decorrelator
•Later: incorporate dependencies across scale and space
    New Image Representations




•Idea: efficiently represent smooth contours by directional and elongated basis elements
•Idea: successive refinement for edges in both location and direction
     The Contourlet Transform




•Contourlet transform (Do and Vetterli, 2002): an extension of the wavelet transform using directional filter banks
•Properties: sparse representation of smooth contours, efficient filter-bank algorithms, a tree data structure, …
The Contourlet Transform




 (Figure: wavelet vs. contourlet representation)
                        Our Goals
• Study statistics and properties of contourlet coefficients
  of natural images.
  -    A good understanding of these properties provides key
       insights for developing contourlet-based applications.

• Based on statistics and properties, develop a suitable
  model.
  -    Key Idea: Model all three fundamental parameters
       of visual information: scale, space, and direction.


• Apply the model to applications.
  -    Denoising and texture retrieval.
Marginal Distribution (Peppers)




   Kurtosis = 24.50 ≫ 3 ⇒ non-Gaussian! (See the sketch below.)
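A quick way to reproduce this kind of check is to compute the sample kurtosis of a subband directly. The sketch below is a minimal Python/NumPy illustration, assuming the contourlet coefficients of one subband are already available as an array (how they are produced is outside the slides); a Gaussian sample gives a kurtosis near 3, a heavy-tailed one well above it.

```python
import numpy as np
from scipy.stats import kurtosis

def subband_kurtosis(coeffs):
    """Sample kurtosis of one subband's coefficients.

    fisher=False returns the "plain" (Pearson) kurtosis, so a Gaussian
    gives a value near 3 and heavy-tailed data gives much larger values.
    """
    c = np.asarray(coeffs, dtype=float).ravel()
    return kurtosis(c, fisher=False)

# Synthetic sanity check: Gaussian vs. heavier-tailed Laplacian samples.
rng = np.random.default_rng(0)
print(subband_kurtosis(rng.normal(size=100_000)))   # ~3.0
print(subband_kurtosis(rng.laplace(size=100_000)))  # ~6.0
```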
Structure of Contourlet Coefficients




  Each coefficient X has:
  Parent (PX), Neighbors (NX), Cousins (CX)
  We refer to all of them as Generalized Neighbors (see the sketch below).
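As an illustration of this neighborhood structure, the sketch below gathers PX, NX, and CX for one coefficient. It assumes a hypothetical storage layout `coeffs[j][k]` (a 2-D array per scale j and direction k), dyadic subsampling between scales, and the same number of directions at adjacent scales; real contourlet subbands can be sampled anisotropically and change direction counts across scales, so this is only schematic.

```python
import numpy as np

def generalized_neighbors(coeffs, j, k, m, n):
    """Collect the generalized neighbors of coefficient X = coeffs[j][k][m, n].

    Assumed layout (illustrative, not the toolbox's actual indexing):
    coeffs[j][k] is the 2-D subband at scale j, direction k; scale j-1 is
    coarser by a factor of 2 per dimension with the same direction count.
    """
    X = coeffs[j][k]
    out = {}
    # Parent PX: same direction, next coarser scale, co-located position.
    if j > 0:
        out["PX"] = coeffs[j - 1][k][m // 2, n // 2]
    # Neighbors NX: the (up to 8) spatially adjacent coefficients in the same subband.
    out["NX"] = [X[mm, nn]
                 for mm in range(max(m - 1, 0), min(m + 2, X.shape[0]))
                 for nn in range(max(n - 1, 0), min(n + 2, X.shape[1]))
                 if (mm, nn) != (m, n)]
    # Cousins CX: same scale and position, all other directional subbands.
    out["CX"] = [coeffs[j][kk][m, n] for kk in range(len(coeffs[j])) if kk != k]
    return out
```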
Joint Statistics: Conditional
  Distributions (Peppers)




(Figure: conditional distributions of X given its parent, neighbor, and cousin)
Joint Statistics: Conditional Distributions on Far Neighbors (3 Coefficients Away)

(Figures: conditional distributions for Peppers and Goldhill)
     Joint Statistics: Conditional
       Distributions (Peppers)




P(X | PX = px): Kurtosis = 3.90
P(X | NX = nx): Kurtosis = 2.90
P(X | CX = cx): Kurtosis = 2.99
Conditional kurtoses are close to 3: conditioned on a generalized neighbor, X is approximately Gaussian.
      Joint Statistics: Quantitative
 •Use mutual information (Liu and Moulin, 2001)
$$I(X;Y) \;=\; \int_X \int_Y p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}\;dx\,dy$$

$$I(X;Y) \;=\; E_{XY}\!\left[\log\frac{p(x,y)}{p(x)\,p(y)}\right]$$

$$I(X;Y) \;=\; D\big(p(x,y)\,\|\,p(x)\,p(y)\big)$$

• A measure of how much information X conveys about Y
• In estimation, it quantifies how easily X can be estimated given Y
• In compression, if it takes m bits to encode X alone, then given Y it takes only m − I(X;Y) bits to encode X
     Joint Statistics: Quantitative
•Histogram estimator for mutual information (Moddemeijer, 1989):

$$\hat I(X;Y) \;=\; \sum_{i,j} \frac{k_{i,j}}{N}\,\log\frac{k_{i,j}\,N}{k_i\,k_j} \;-\; \frac{(J-1)(K-1)}{2N}$$

•Multiple variables: use sufficient statistics (Liu and Moulin, 2001)

$$I(X; Y_1, Y_2, \ldots, Y_n) \;\approx\; \max_{T = \sum_i w_i Y_i} I(X; T)$$
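A minimal Python/NumPy sketch of this histogram estimator is given below; the equal-width binning and the number of bins are choices assumed here for illustration, not taken from the slides.

```python
import numpy as np

def mutual_information_hist(x, y, bins=64):
    """Histogram estimate of I(X;Y) in nats with Moddemeijer's bias correction.

    x, y : 1-D arrays of paired samples (e.g. a coefficient and one of its
           generalized neighbors). `bins` equal-width cells per axis is assumed.
    """
    k_xy, _, _ = np.histogram2d(x, y, bins=bins)   # joint counts k_{i,j}
    n = k_xy.sum()
    k_x = k_xy.sum(axis=1)                         # marginal counts k_i
    k_y = k_xy.sum(axis=0)                         # marginal counts k_j
    nz = k_xy > 0                                  # skip empty cells (log 0)
    p = k_xy[nz] / n
    mi = np.sum(p * np.log(k_xy[nz] * n / np.outer(k_x, k_y)[nz]))
    # Bias correction term (J-1)(K-1)/(2N) with J = K = bins.
    mi -= (bins - 1) * (bins - 1) / (2.0 * n)
    return mi
```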
                 Results: I(X; ·)

            PX     NX     CX     PX/NX   NX/CX   PX/CX   all
Lena        0.11   0.23   0.19   0.24    0.26    0.21    0.26
Barbara     0.14   0.58   0.39   0.58    0.59    0.40    0.56
Peppers     0.10   0.17   0.14   0.17    0.20    0.16    0.20
                        Lena

Filters       PX     NX     CX
9-7, CD       0.11   0.23   0.19
Haar, CD      0.18   0.33   0.32
9-7, Haar     0.11   0.24   0.22
9-7, PKVA     0.11   0.24   0.15

       4 directions   8 directions   16 directions
NX     0.26           0.23           0.20
CX     0.14           0.19           0.19
  Average Mutual Information against
   Individual Generalized Neighbors
           I(X;PX)   I(X;NX)   I(X;CX)


Lena        0.11      0.09      0.08


Barbara     0.14      0.31      0.20


Peppers     0.08      0.07      0.06
                    Summary
•   Contourlet coefficients of natural images
    exhibit the following properties:
     1. marginally non-Gaussian;
     2. dependent on their generalized neighborhood;
     3. approximately Gaussian when conditioned on their generalized neighborhood;
     4. parents are (often) the most influential.


•   Next step: develop a simple statistical model that takes these properties into account
    Hidden Markov Tree (HMT) Model
      •Developed for wavelets (Crouse et al., 1998)
      •Each coefficient u has a hidden state S; given its state, u is drawn from one of two Gaussian densities (State 1: small variance, State 2: large variance)
      •Parent and child states are linked by a transition matrix

        $$A_{i,j} = \begin{bmatrix} a^{i,j}_{1,1} & a^{i,j}_{2,1} \\ a^{i,j}_{1,2} & a^{i,j}_{2,2} \end{bmatrix}$$

      (Figure: the two state-conditional Gaussian densities of a coefficient u, labeled State 1 and State 2)
               Contourlet HMT Model
• Each tree has the following parameters:
  –  $p_{1,1}$: root state probabilities
  –  $A_{j,k}$: state transition probability matrix between subband k at scale j and its parent subband at scale j − 1
  –  $\sigma_{p,q}$: Gaussian standard deviations of subband q at scale p
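To make this parameter list concrete, the sketch below defines a hypothetical container with exactly these fields (two hidden states per coefficient, one transition matrix per subband, per-state standard deviations), plus a random but valid initialization. Field names and shapes are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class ContourletHMTParams:
    """One tree's parameters, mirroring the slide's list (two states assumed)."""
    root_probs: np.ndarray                      # p_{1,1}: shape (2,)
    trans: dict = field(default_factory=dict)   # A_{j,k}: (j, k) -> 2x2 matrix
    sigmas: dict = field(default_factory=dict)  # sigma_{p,q}: (p, q) -> per-state stds, shape (2,)

def random_init(n_scales, dirs, rng=None):
    """Random but valid initialization: each row of A_{j,k} sums to 1.

    dirs[j] is the number of directional subbands at scale j (an assumption
    about how the decomposition is organized).
    """
    rng = rng or np.random.default_rng()
    params = ContourletHMTParams(root_probs=rng.dirichlet(np.ones(2)))
    for j in range(1, n_scales):                # the root scale has no parent
        for k in range(dirs[j]):
            params.trans[(j, k)] = rng.dirichlet(np.ones(2), size=2)
    for p in range(n_scales):
        for q in range(dirs[p]):
            # Sorted so state 1 is the small-variance state, state 2 the large one.
            params.sigmas[(p, q)] = np.sort(rng.uniform(0.1, 5.0, size=2))
    return params
```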
Contourlet HMT Model




(Figure: the HMT model for wavelets vs. contourlets)
   Denoising: Bayesian Estimation
                                             
Noisy image: $y = x + w$;  noisy contourlet coefficients: $v = u + n$

$$E\big[u_{i,j,k} \mid v_{i,j,k}, \theta_u\big] \;=\; \sum_m p\big(S_{i,j,k} = m \mid v_{i,j,k}, \theta_u\big)\, \frac{\sigma^{2,u}_{i,j,k,m}}{\sigma^{2,n}_{i,j,k} + \sigma^{2,u}_{i,j,k,m}}\, v_{i,j,k}$$
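The estimate above is a state-posterior-weighted Wiener gain applied to the noisy coefficient. A minimal sketch of that rule, assuming the state posteriors and per-state standard deviations are already available (e.g. from a trained HMT), follows.

```python
import numpy as np

def hmt_shrink(v, post, sigma_u, sigma_n):
    """Conditional-mean estimate of a clean coefficient from its noisy value v.

    post    : P(S = m | v, theta_u) for each mixture state m (sums to 1)
    sigma_u : per-state signal standard deviations sigma_{u,m}
    sigma_n : noise standard deviation in this subband
    Implements E[u | v] = sum_m post[m] * sigma_u[m]^2 / (sigma_n^2 + sigma_u[m]^2) * v
    """
    post = np.asarray(post, dtype=float)
    sigma_u = np.asarray(sigma_u, dtype=float)
    gain = sigma_u**2 / (sigma_n**2 + sigma_u**2)   # per-state Wiener gain
    return float(np.sum(post * gain)) * v

# Example: a coefficient likely to be in the large-variance state keeps most of its value.
print(hmt_shrink(v=4.0, post=[0.1, 0.9], sigma_u=[0.5, 3.0], sigma_n=1.0))
```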
          Denoising Results: PSNR (dB)
 Image     Noise      Noisy   Wiener2    Wavelet    Wavelet   Contourlet
          Std. Dev.                     Threshold    HMT        HMT
Lena
            10        28.13   33.03     32.00       33.84     33.38
            30        18.88   27.40     26.67       28.35     28.18
            50        14.63   24.75     24.20       25.89     26.04
Barbara
            10        28.11   31.38     29.90       31.36     29.18
            30        18.72   24.95     23.76       25.11     25.27
            50        14.48   22.57     21.96       23.71     23.74
Zelda
            10        28.13   34.06     33.37       35.33     33.45
            30        18.83   28.67     28.24       30.67     30.00
            50        14.61   25.78     26.05       27.63     27.07
    Denoising Results: Zelda
                         Noisy, σw = 50     Wiener2, 5×5
       Original           PSNR = 14.61      PSNR = 25.78




Wavelet Thresholding     Wavelet HMT      Contourlet HMT
T = 3σw, PSNR = 26.05    PSNR = 27.63     PSNR = 27.07
  Denoising Results: Barbara
                         Noisy, σw = 50     Wiener2, 5×5
       Original           PSNR = 14.48      PSNR = 22.57




Wavelet Thresholding     Wavelet HMT      Contourlet HMT
T = 3σw, PSNR = 21.96    PSNR = 23.71     PSNR = 23.74
Contourlet Texture Retrieval System
Texture Retrieval: Use the Kullback-Leibler Distance

$$D\big(p(X;\theta_q)\,\|\,p(X;\theta_i)\big) \;=\; \int p(x;\theta_q)\,\log\frac{p(x;\theta_q)}{p(x;\theta_i)}\,dx$$

where $\theta_q$ denotes the query model and $\theta_i$ the i-th candidate texture model.
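For a concrete view of the retrieval rule, the sketch below ranks database textures by KL distance to the query. The slides compare model distributions p(X; θ); here they are approximated with normalized histograms and a small epsilon guards against empty bins, both simplifying assumptions on my part.

```python
import numpy as np

def kl_distance(p_q, p_i, eps=1e-12):
    """Discrete approximation of D(p_q || p_i) between two coefficient histograms."""
    p_q = np.asarray(p_q, dtype=float); p_q = p_q / p_q.sum()
    p_i = np.asarray(p_i, dtype=float); p_i = p_i / p_i.sum()
    return float(np.sum(p_q * np.log((p_q + eps) / (p_i + eps))))

def retrieve(query_hist, db_hists, top_k=10):
    """Indices of the top_k database textures closest to the query in KL distance."""
    d = [kl_distance(query_hist, h) for h in db_hists]
    return np.argsort(d)[:top_k]
```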
          Retrieval Results
•Average retrieval rates: Wavelet HMT: 90.87%; Contourlet HMT: 93.29%
•Wavelets retrieve better (>5%): (figure)
•Contourlets retrieve better (>5%): (figure)
                  Conclusions
• Contourlets: a new, truly two-dimensional transform that
  allows modeling of all three visual parameters:
  scale, space, and direction
• Statistical measurements show:
  – Strong intra-subband, inter-scale, and inter-
    orientation dependencies
  – Conditioned on their neighborhood, coefficients are
    approximately Gaussian
• Contourlet hidden Markov tree model
  – Promising results in denoising and retrieval

								