
              Using Color Histogram Normalization Algorithm for Pose- and
                  Multiple Illuminations-Invariant Matching of Images


                          Soo-Chang Pei (貝蘇章), and Ching-Long Tseng (曾慶龍)


                     Department of Electrical Engineering, National Taiwan University
                       No. 1 Sec. 4, Roosevelt Rd., Taipei, 10617, Taiwan, R. O. C.
                            Tel.: 886-2-23635251-321, FAX: 886-2-23671909
                       E-mail: pei@cc.ee.ntu.edu.tw, d6942009@ms.cc.ntu.edu.tw


                        Abstract

We present an image-matching method that uses the covariance matrix of the R-G-B color histogram together with tensor theory to match images distorted by different color illuminations. The method is based on a simplified affine model of color-histogram deformation: when the illumination changes, the shape of the color histogram varies mainly under three basic distortions, namely translation, scaling, and rotation. We call the resulting procedure the color histogram normalization algorithm with simplified affine model. In this paper we consider color images of the same planar surface taken under several color-illumination environments or from different illumination poses, so three groups of color-distorted images are tested: images formed under different illumination poses, under multiple color illuminations, and under two illuminant sources, respectively. In all of these cases, the color histogram normalization algorithm carried out in the R-G-B coordinate achieves pose- and illumination-invariant matching of the color images. Simulation results show that the method is effective. For performance comparison, the method is also carried out in both the R-G-B coordinate and the chromaticity space.

Keywords: color histogram normalization algorithm with simplified affine model, pose-invariant, illumination-invariant, chromaticity space.

1. Introduction
      The difficulty of the color constancy problem comes from many natural factors, including shadow, illumination conditions, and specular reflection [1], [2]. The color of an object varies with changes in the illumination source, the illumination geometry, the viewing angle, and miscellaneous sensor parameters. Color constancy is the phenomenon of a steady spectral color response under different illumination [3]; removing the illumination-source color to obtain the pure surface reflectance color of an object is therefore one of the fundamental problems in computer vision, and color constancy algorithms play a major role in image retrieval from image databases.
      Previous research on computational color constancy focused on recovering an illumination-invariant surface description at each image location. Many assumptions are necessary in these approaches, for example that the spectral reflectance of the surface is known beforehand or that highlights of dielectric materials can be observed, and such assumptions do not hold in general situations. Maloney and Wandell proposed a significant computational color constancy algorithm in which color-constant surface descriptors can be computed if the photoreceptors have more classes than the degrees of freedom of the linear surface reflectance model [3]. Swain and Ballard proposed an algorithm called color indexing, which relies on matching color histogram intersections of an object against a large database over a range of conditions; although the technique uses no geometric information about the objects, color distributions are often sensitive to illumination changes. Funt and Finlayson later introduced color constant color indexing to alleviate this problem by matching distributions of color ratios under a varying-illumination environment, but significant restrictions remain; for example, color ratios are often sensitive to noise at low intensities.
      Healey et al. used the affine transform of the color-distribution change to propose several methods, for example a color neighborhood representation for illumination-invariant matching [4], but representing objects by all of their color distributions requires a large amount of memory and expensive comparisons. They then selected a vector of six color-distribution invariants as a feature to recognize objects in a medium-sized database. However, the approach is built on a finite-dimensional linear surface reflectance model in which changing the illumination color causes a linear transformation of the color distribution in color space. More recently, Lee et al. proposed using chromaticity space and chromaticity distributions for specularity-, illumination color- and illumination pose-invariant recognition of 3D objects [1], [2]. Their method, called model-based specularity detection/rejection, requires the specularity clusters to be detected and removed from the color image in the chromaticity plane; once a rough initial estimate of the illumination color is obtained, specularity invariance can be achieved. However, these methods require the specularity effect to be removed beforehand in order to achieve the objective of specularity invariance.
                                                                                                                  j 1

                                                                                              where S j   represent the fixed basis functions and
     Based on the theory of Healey et al., there is an im-
portant affine property of color histogram: the shape of
the color histogram is distorted under translation, scaling,                                   j x, y  are the surface weighting coefficients according
and rotation corresponding to the illumination conditions                                     to location; however, when the effect of human visual
change. For a unique illuminant source, as in [5], [6], we                                    sensitivity is included, the linear model approximations
focused on the discussion of the illumination color, so the                                   provides excellent fits of measure 0.9890 to the empirical
shadow and specular reflectance effects are not consi-                                        data for m  3 . Combining Eqs. (2) with (3), we obtain
dered. The color reproduction of the recovered images is                                          x, y   A x, y                                 4
quite correct, and matching performance is satisfactory by                                    where  x, y  denote the column vector of n sensor
using the proposed algorithm in [5], [6]. However, in this                                    measurements,  x, y  denote the column vector of m
paper, we will consider the environment condition for
                                                                                              fixed basis function weights, so A is an n  m matrix
several illumination sources in all directions at the same
                                                                                              with its entries as
time. That is, the color images of the same planar surface
are taken under multiple color illuminations or from dif-                                       A ij   L  S j   f i   d                                            5
ferent illumination poses. So the shadow and specular                                         We can know that A depends only on the light spectrum
reflectance effects are existed. Moreover, the algorithm                                      wavelength  , but has no relationship with the image
will be carried out in R-G-B coordinate and chromaticity                                      location (x,y). In most cases, m is equal n.
space for performance comparison in this paper.                                                     Consider the color images of the same planar surface
     This paper is organized as follows. In Section 2, we                                     are taken under multiple color illuminations or from dif-
                                                                                              ferent illumination poses as L  and L   , then the
discuss the modified modeling representation in color                                                                                          ~
images. The color histogram normalization algorithm for
                                                                                              two images are represented as
image matching is reviewed in Section 3. Section 4 de-
                                                                                                 x, y   A x, y ,  x, y   A x, y        6
                                                                                                                        ~           ~
scribes the experimental results and makes discussions for
                                                                                              Assume A and A corresponding to L  and L  
                                                                                                                         ~                                                 ~
several practical natural images. Finally, conclusions are
made in Section 5.                                                                            are nonsingular matrices, then the two images related by
                                                                                              using Eq. (6) can be shown as follow
2. Modeling Representation in Color Images
      The $n$ sensor measurements made by a color imaging system at each image location $(x, y)$ can be described as in [4]:
   $\rho_i(x, y) = \int_{\lambda} l(\lambda)\, s(x, y, \lambda)\, f_i(\lambda)\, d\lambda, \qquad 1 \le i \le n$    (1)
where $\lambda$ is the wavelength, and $l(\lambda)$ and $s(x, y, \lambda)$ are the spectral power distribution of the illumination source and the reflectance of the object's surface, respectively. $f_i(\lambda)$ denotes the spectral sensitivity of the $i$-th sensor filter; $n$ is usually three, representing the spectral responses of the red, green, and blue sensing elements of an R-G-B color system. The integration ranges over the entire visible wavelengths.
      Furthermore, if a color image is taken under several illumination sources $l_1(\lambda), l_2(\lambda), l_3(\lambda), \ldots, l_p(\lambda)$ in all directions at the same time, then Eq. (1) becomes
   $\rho_i(x, y) = \int_{\lambda} l_1(\lambda)\, s(x, y, \lambda)\, f_i(\lambda)\, d\lambda + \int_{\lambda} l_2(\lambda)\, s(x, y, \lambda)\, f_i(\lambda)\, d\lambda + \cdots + \int_{\lambda} l_p(\lambda)\, s(x, y, \lambda)\, f_i(\lambda)\, d\lambda$
              $= \int_{\lambda} [\, l_1(\lambda) + l_2(\lambda) + l_3(\lambda) + \cdots + l_p(\lambda)\,]\, s(x, y, \lambda)\, f_i(\lambda)\, d\lambda$
              $= \int_{\lambda} L(\lambda)\, s(x, y, \lambda)\, f_i(\lambda)\, d\lambda, \qquad 1 \le i \le n$    (2)
where $L(\lambda) = l_1(\lambda) + l_2(\lambda) + l_3(\lambda) + \cdots + l_p(\lambda)$ denotes the composed effect of the combination of the several illuminations.
      We can approximate the surface reflectance $s(x, y, \lambda)$ by a weighted sum of a series of fixed basis functions,
   $s(x, y, \lambda) = \sum_{j=1}^{m} \sigma_j(x, y)\, S_j(\lambda)$    (3)
where the $S_j(\lambda)$ are the fixed basis functions and the $\sigma_j(x, y)$ are the surface weighting coefficients at each location. When the effect of human visual sensitivity is included, this linear model provides excellent fits of measure 0.9890 to the empirical data for $m = 3$. Combining Eqs. (2) and (3), we obtain
   $\rho(x, y) = A\, \sigma(x, y)$    (4)
where $\rho(x, y)$ is the column vector of the $n$ sensor measurements, $\sigma(x, y)$ is the column vector of the $m$ fixed-basis-function weights, and $A$ is an $n \times m$ matrix with entries
   $A_{ij} = \int_{\lambda} L(\lambda)\, S_j(\lambda)\, f_i(\lambda)\, d\lambda$    (5)
Note that $A$ depends only on the light spectrum over the wavelength $\lambda$ and has no relationship with the image location $(x, y)$. In most cases, $m$ equals $n$.
      Consider color images of the same planar surface taken under multiple color illuminations or from different illumination poses, denoted $L(\lambda)$ and $\tilde{L}(\lambda)$; the two images are then represented as
   $\rho(x, y) = A\, \sigma(x, y), \qquad \tilde{\rho}(x, y) = \tilde{A}\, \sigma(x, y)$    (6)
Assuming that $A$ and $\tilde{A}$, corresponding to $L(\lambda)$ and $\tilde{L}(\lambda)$, are nonsingular matrices, the two images in Eq. (6) are related by
   $\tilde{\rho}(x, y) = M\, \rho(x, y)$    (7)
where $M = \tilde{A} A^{-1}$.
      On the other hand, according to [1], [2], the sensor measurement can be regarded as constructed only from the diffuse component of the object color. A shading effect exists due to the illumination direction, so the effect related to the surface orientation can be taken into account; Eq. (6), with multiple color illuminations or with illumination from the different illumination poses $L(\lambda)$ and $\tilde{L}(\lambda)$, then becomes
   $\rho(x, y) = [\, \mathbf{n}_{s1}(x, y) + \cdots + \mathbf{n}_{sp}(x, y)\,] \cdot \mathbf{n}(x, y)\; A\, \sigma(x, y),$
   $\tilde{\rho}(x, y) = [\, \tilde{\mathbf{n}}_{s1}(x, y) + \cdots + \tilde{\mathbf{n}}_{sp}(x, y)\,] \cdot \mathbf{n}(x, y)\; \tilde{A}\, \sigma(x, y)$    (8)
where $\mathbf{n}(x, y)$ is the surface normal vector, and $\mathbf{n}_{s1}(x, y), \ldots, \mathbf{n}_{sp}(x, y)$ and $\tilde{\mathbf{n}}_{s1}(x, y), \ldots, \tilde{\mathbf{n}}_{sp}(x, y)$ denote the illumination direction vectors at pixel $(x, y)$ associated with $l_1(\lambda), \ldots, l_p(\lambda)$ and $\tilde{l}_1(\lambda), \ldots, \tilde{l}_p(\lambda)$, respectively. Similarly, $A$ and $\tilde{A}$ are the matrices that depend only on the light spectrum corresponding to $L(\lambda)$ and $\tilde{L}(\lambda)$.
      Using Eq. (8), the two images are related, under the change of the illumination colors and the illumination poses, by
   $\tilde{\rho}(x, y) = \dfrac{[\, \tilde{\mathbf{n}}_{s1}(x, y) + \cdots + \tilde{\mathbf{n}}_{sp}(x, y)\,] \cdot \mathbf{n}(x, y)}{[\, \mathbf{n}_{s1}(x, y) + \cdots + \mathbf{n}_{sp}(x, y)\,] \cdot \mathbf{n}(x, y)}\; \tilde{A} A^{-1}\, \rho(x, y) = M\, \rho(x, y)$    (9)
where $M = \dfrac{[\, \tilde{\mathbf{n}}_{s1}(x, y) + \cdots + \tilde{\mathbf{n}}_{sp}(x, y)\,] \cdot \mathbf{n}(x, y)}{[\, \mathbf{n}_{s1}(x, y) + \cdots + \mathbf{n}_{sp}(x, y)\,] \cdot \mathbf{n}(x, y)}\; \tilde{A} A^{-1}$.
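      To make Eqs. (2), (4), (5), and (7) concrete, the following minimal NumPy sketch builds the lighting matrix A by numerical integration and verifies the linear relation of Eq. (7). The Gaussian illuminant spectra, basis functions, and sensor sensitivities used here are hypothetical placeholders, not the measured data of the paper.

```python
import numpy as np

# Minimal numerical sketch of Eqs. (2), (4), (5) and (7).
wl = np.linspace(400.0, 700.0, 301)          # visible wavelengths (nm)

def gauss(mu, sig):
    """Hypothetical Gaussian spectrum centered at mu with width sig."""
    return np.exp(-0.5 * ((wl - mu) / sig) ** 2)

def lighting_matrix(illuminants, basis, sensors):
    """A_ij = integral of L(lambda) S_j(lambda) f_i(lambda) d(lambda), Eq. (5),
    with L(lambda) the sum of the individual illuminant spectra, Eq. (2)."""
    L = np.sum(illuminants, axis=0)
    return np.array([[np.trapz(L * S_j * f_i, wl) for S_j in basis]
                     for f_i in sensors])

# n = m = 3: three sensors (R, G, B) and three reflectance basis functions.
sensors = [gauss(600, 40), gauss(540, 40), gauss(460, 40)]   # f_1..f_3
basis   = [gauss(450, 80), gauss(550, 80), gauss(650, 80)]   # S_1..S_3

illum_1 = [gauss(620, 60), gauss(480, 60)]                   # lights forming L(lambda)
illum_2 = [gauss(580, 60), gauss(500, 60), gauss(650, 60)]   # lights forming L~(lambda)

A  = lighting_matrix(illum_1, basis, sensors)
At = lighting_matrix(illum_2, basis, sensors)

sigma = np.array([0.3, 0.5, 0.2])       # surface weights sigma_j(x, y) at one pixel
rho   = A  @ sigma                      # Eq. (4) under L(lambda)
rho_t = At @ sigma                      # Eq. (6) under L~(lambda)

M = At @ np.linalg.inv(A)               # Eq. (7): rho~ = M rho
print(np.allclose(rho_t, M @ rho))      # True: the two measurements are linearly related
```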
      Obviously, because $\mathbf{n}(x, y)$, $\mathbf{n}_{s}(x, y)$ and $\tilde{\mathbf{n}}_{s}(x, y)$ differ, the sensor measurements $\tilde{\rho}(x, y)$ are observed over surfaces with different orientations, so the linear transform relationship between the tristimulus values at these points cannot be used directly to establish illumination invariance. Therefore, to transfer the color distributions into chromaticity space, chromaticity values are computed by normalizing the sensor measurements as
   $X = \dfrac{\rho_1}{\rho_1 + \rho_2 + \rho_3}, \qquad Y = \dfrac{\rho_2}{\rho_1 + \rho_2 + \rho_3}, \qquad Z = \dfrac{\rho_3}{\rho_1 + \rho_2 + \rho_3}$    (10)
When the sensor measurements are converted to chromaticity space, the fraction factor in Eq. (9) cancels, and the shading effect caused by the illumination direction with respect to the surface orientation disappears. That is, chromaticity space is independent of the illumination pose. It is worth mentioning, however, that the intensity information of the color distribution is removed in chromaticity space.
      In fact, from Eq. (9), if we consider only the various multiple illuminations, the numerator and the denominator of the fraction factor in Eq. (9) are equal; i.e., Eq. (9) simplifies to Eq. (7). On the other hand, when the illumination directions are considered, the effect of the illumination pose enters as the dot product of the illumination direction and the surface normal. A planar surface, however, has a constant surface normal, so the shading due to pose is merely a constant; the fraction factor in Eq. (9) is then only a linear transformation, i.e., a scaling. Therefore, Eq. (9) can be regarded as Eq. (7) for this analysis.
      Whatever coordinates are used, $H(\rho)$ and $H(\tilde{\rho})$ are the $n$-dimensional color histograms corresponding to the same surface illuminated by multiple color illuminations, or illuminated from the different illumination poses $L(\lambda)$ and $\tilde{L}(\lambda)$. From Eq. (7), we find
   $H(\tilde{\rho}) = H(M \rho)$    (11)
Thus, when the illumination changes, the 3D color histograms are related by an affine transformation of the coordinates, regardless of whether the R-G-B coordinate or the chromaticity space is used.
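      The chromaticity conversion of Eq. (10) amounts to a per-pixel normalization; a small sketch, with illustrative array names only, is given below. A uniform per-pixel scale factor, such as the pose-dependent fraction in Eq. (9), cancels under this mapping.

```python
import numpy as np

def to_chromaticity(rgb, eps=1e-12):
    """Project sensor measurements onto the chromaticity plane, Eq. (10):
    each pixel is divided by the sum of its three bands."""
    rgb = np.asarray(rgb, dtype=np.float64)
    s = rgb.sum(axis=-1, keepdims=True)
    return rgb / np.maximum(s, eps)

# A uniform scaling of a pixel leaves its chromaticity unchanged.
pixel = np.array([[120.0, 80.0, 40.0]])
print(np.allclose(to_chromaticity(pixel), to_chromaticity(3.7 * pixel)))  # True
```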
3. Review on Color Histogram Normalization Algorithm

3.1 Preface
      Referring to [6], we have carried out extensive experiments demonstrating that the simplified affine model is better than the general affine model for images distorted by illumination changes. It is well known that a general affine transformation can be separated into five basic forms: translation, scaling, rotation, skew, and shearing; we have shown, however, that translation, scaling, and rotation dominate the distortion of the color histogram with respect to illumination changes [6]. Visually, the shape of the distorted color histogram does not change drastically when the illumination color changes; i.e., the color histogram stays within the first octant of the R-G-B color space whatever the illumination change may be. A restricted, simplified affine model can therefore be adopted for this situation; in other words, the effects of skew and shearing are very small and not significant. The discussions and experiments of the color histogram normalization algorithm in this paper are therefore based on the simplified affine model, which we now describe succinctly.

3.2 Summary of the Color Histogram Normalization with Simplified Affine Model
      In this paragraph, we briefly describe how to normalize a color histogram in R-G-B space; the details are given in [6]. The main steps of the algorithm are summarized as follows:
(1) Compute the following moments of the original color histogram.
1. The mean vector $\mathbf{c}$, the center of the color histogram's appearance in 3D color space:
   $\mathbf{c} = [C_r\ \ C_g\ \ C_b]^T = \left[\int r f(r, g, b)\, dV,\ \ \int g f(r, g, b)\, dV,\ \ \int b f(r, g, b)\, dV\right]^T$    (12)
2. The joint central moments $\mu_{ijk}$:
   $\mu_{ijk} = E\!\left[(R - C_r)^i (G - C_g)^j (B - C_b)^k\right] = \int (r - C_r)^i (g - C_g)^j (b - C_b)^k f(r, g, b)\, dV$    (13)
3. The covariance matrix $\mathbf{M}$:
   $\mathbf{M} = \begin{bmatrix} \mu_{200} & \mu_{110} & \mu_{101} \\ \mu_{110} & \mu_{020} & \mu_{011} \\ \mu_{101} & \mu_{011} & \mu_{002} \end{bmatrix}$    (14)
(2) Find the eigenvalues $\lambda_i$ and eigenvectors $\mathbf{e}_i$ of $\mathbf{M}$.
    Since $\mathbf{M}$ is a symmetric matrix, the eigenvectors $\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3$ corresponding to $\lambda_1, \lambda_2, \lambda_3$ ($\lambda_1 \ge \lambda_2 \ge \lambda_3$) are orthonormal to one another. Hence the color histogram normalization can be represented as a transformation of the color histogram by the rotation matrix $\mathbf{E}_0$:
   $\begin{bmatrix} u \\ v \\ w \end{bmatrix} = \mathbf{E}_0 \begin{bmatrix} r - C_r \\ g - C_g \\ b - C_b \end{bmatrix} = \begin{bmatrix} \mathbf{e}_1^T \\ \mathbf{e}_2^T \\ \mathbf{e}_3^T \end{bmatrix} \begin{bmatrix} r - C_r \\ g - C_g \\ b - C_b \end{bmatrix}$    (15)
In order to solve the mirror-symmetry ambiguity of the normalized color histogram, we apply tensor theory to determine a new rotation matrix.
(3) According to tensor theory, compute the third-order tensors $\bar{T}^{111}$ and $\bar{T}^{222}$:
   $T^{ijk\ldots} = \int\!\!\int x^i x^j x^k \ldots f(x^1, x^2)\, dx^1\, dx^2$    (16)
   $\bar{T}^{ijk\ldots} = \int\!\!\int y^i y^j y^k \ldots f(y^1, y^2)\, dy^1\, dy^2$    (17)
   $\bar{T}^{ijk} = |A|\, A^i_{\,l}\, A^j_{\,m}\, A^k_{\,n}\, T^{lmn}$    (18)
where $\mathbf{x} = [x^1\ x^2\ x^3]^T$ represents $[r - C_r\ \ g - C_g\ \ b - C_b]^T$, $\mathbf{y} = [y^1\ y^2\ y^3]^T$ represents $[u - C_u\ \ v - C_v\ \ w - C_w]^T$, and the matrix $[A^i_{\,j}]$ denotes the affine transform matrix $[a_{ij}]$ relating them. Then the following criterion is applied:
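      Steps (1) and (2) reduce, in code, to computing the mean, the covariance matrix, and its eigen-decomposition. The sketch below treats each image pixel as one sample of the color distribution, which is an implementation choice rather than something prescribed by the paper.

```python
import numpy as np

def histogram_moments(pixels):
    """Steps (1)-(2): mean vector (Eq. 12), covariance matrix of the R-G-B
    distribution (Eqs. 13-14), and its eigen-decomposition.
    `pixels` is an (N, 3) array of R, G, B values."""
    pixels = np.asarray(pixels, dtype=np.float64)
    c = pixels.mean(axis=0)                        # [C_r, C_g, C_b], Eq. (12)
    centered = pixels - c
    M = centered.T @ centered / len(pixels)        # second central moments, Eq. (14)
    lam, E = np.linalg.eigh(M)                     # eigenvalues/eigenvectors of M
    order = np.argsort(lam)[::-1]                  # sort so lambda_1 >= lambda_2 >= lambda_3
    return c, M, lam[order], E[:, order]

# Example: random colored pixels standing in for an image.
rng = np.random.default_rng(0)
pixels = rng.normal(loc=[120, 90, 60], scale=[30, 20, 10], size=(5000, 3))
c, M, lam, E = histogram_moments(pixels)
print(c.round(1), lam.round(1))
```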
                              If T 222  0 , then e 2  e 2 ;                              r '     r  C r    '    C r 
                                                                                               '                      
                               e 3  e1  e 2                                                  g   G  g   C g   G C g 
                                                                                                                          '                       25
                                           e                                                b '     b  C b         C b 
                                                                                                                        
                                                                        T
                                                                        1                                           '


                                            
     We obtain the new rotation matrix E  e T  and
                                              2                                             Compared with Eq. (20), we can obtain the affine trans-
                                           e T                                            form coefficients
                                            3
                                                                                              a11 a12 a13           b1  C r  C r 
  1  2  3 .
                                                                                                                                '


                                                                                                                          G             26
                                                                                              a 21 a 22 a 23  G,   b2 C g   C g 
                                                                                                                                    '

(4) The normalized color histogram can be obtained by
                                                                                              a 31 a 32 a 33
                                                                                                                    b3 C b 
                                                                                                                               C b 
                                                                                                                                    
    the transformation                                                                                                          '



                                                                                            In other words, the color histogram of the original image
  u        r  C r                                                                      can be recovered from the color histogram of the illumi-
   v   c E g                                                               19       nation-changed image by the following transformation
           
                C g
   w       b  C b                                                                        r          r '  C r       C r 
                   
                                                                                                                                        '


                                                                                               g   G 1  '      G 1                   27 
3.3 Image Matching
      Furthermore, we intend to remove the illumination color or the illumination pose to obtain the pure surface reflectance color of the object. Here we assume a finite-dimensional linear surface reflectance model, in which changing the illumination color causes a linear transformation of the color distribution in color space. The 3D normalization technique above can therefore be applied to the color histogram in the color coordinate: the two 3D color histograms are related by an affine transformation of the color coordinates when the illumination changes. The relation between the color histograms of the original image and of the illumination-changed image can be expressed as
   $\begin{bmatrix} r' \\ g' \\ b' \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} \begin{bmatrix} r \\ g \\ b \end{bmatrix} + \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}$    (20)
By estimating the affine coefficients with the normalization algorithm, we can derive the effect of the illumination change between the two images. Using Eq. (19), the color histogram of the original image is normalized by the transformation
   $\begin{bmatrix} u \\ v \\ w \end{bmatrix} = \dfrac{c}{\sigma_o}\, \mathbf{E}_o \begin{bmatrix} r - C_r \\ g - C_g \\ b - C_b \end{bmatrix}$    (21)
The same procedure applied to the illumination-changed image gives
   $\begin{bmatrix} u \\ v \\ w \end{bmatrix} = \dfrac{c}{\sigma_a}\, \mathbf{E}_a \begin{bmatrix} r' - C_{r'} \\ g' - C_{g'} \\ b' - C_{b'} \end{bmatrix}$    (22)
Since the same normalized histogram is obtained in both cases, Eqs. (21) and (22) can be combined to give
   $\begin{bmatrix} r' - C_{r'} \\ g' - C_{g'} \\ b' - C_{b'} \end{bmatrix} = \dfrac{\sigma_a}{\sigma_o}\, \mathbf{E}_a^T \mathbf{E}_o \begin{bmatrix} r - C_r \\ g - C_g \\ b - C_b \end{bmatrix}$    (23)
Note that $\mathbf{E}_a^T \mathbf{E}_o$ is the rotation matrix, $\sigma_a / \sigma_o$ is the scaling factor, and $C_{r'} - C_r$ is the translation between the two histograms. Let
   $\mathbf{G} = \dfrac{\sigma_a}{\sigma_o}\, \mathbf{E}_a^T \mathbf{E}_o$    (24)
Using Eq. (24), Eq. (23) may be rearranged as
   $\begin{bmatrix} r' \\ g' \\ b' \end{bmatrix} = \mathbf{G} \begin{bmatrix} r - C_r \\ g - C_g \\ b - C_b \end{bmatrix} + \begin{bmatrix} C_{r'} \\ C_{g'} \\ C_{b'} \end{bmatrix}$    (25)
Comparing with Eq. (20), we obtain the affine transform coefficients
   $\begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} = \mathbf{G}, \qquad \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix} = \begin{bmatrix} C_{r'} \\ C_{g'} \\ C_{b'} \end{bmatrix} - \mathbf{G} \begin{bmatrix} C_r \\ C_g \\ C_b \end{bmatrix}$    (26)
In other words, the color histogram of the original image can be recovered from the color histogram of the illumination-changed image by the transformation
   $\begin{bmatrix} r \\ g \\ b \end{bmatrix} = \mathbf{G}^{-1} \begin{bmatrix} r' - C_{r'} \\ g' - C_{g'} \\ b' - C_{b'} \end{bmatrix} + \begin{bmatrix} C_r \\ C_g \\ C_b \end{bmatrix}$    (27)
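      Under these relations, a sketch of the matching step might look as follows: it estimates $\mathbf{G}$ and the translation term of Eqs. (24)-(26) from the two normalizations and then recovers the original colors with Eq. (27). It reuses `normalize_histogram()` from the earlier sketch; because that sketch stores eigenvectors as columns, `E_a @ E_o.T` plays the role of $\mathbf{E}_a^T \mathbf{E}_o$ in the paper's row-vector convention.

```python
import numpy as np

def estimate_affine(pixels_orig, pixels_changed):
    """Estimate G and b of Eqs. (24)-(26) from the two histogram normalizations.
    Builds on normalize_histogram() from the previous sketch."""
    _, E_o, sigma_o, c_o = normalize_histogram(pixels_orig)
    _, E_a, sigma_a, c_a = normalize_histogram(pixels_changed)
    G = (sigma_a / sigma_o) * (E_a @ E_o.T)     # rotation-plus-scaling factor, Eq. (24)
    b = c_a - G @ c_o                           # translation term, Eq. (26)
    return G, b, c_o, c_a

def recover_colors(pixels_changed, G, c_o, c_a):
    """Map illumination-changed colors back to the original ones, Eq. (27),
    with pixels stored as rows of an (N, 3) array."""
    changed = np.asarray(pixels_changed, dtype=np.float64)
    return (changed - c_a) @ np.linalg.inv(G).T + c_o
```

Given a template image and an illumination-changed image, running `estimate_affine` and then `recover_colors` would map the changed image's colors back toward the template, which is, in spirit, how the recovered images shown later in Figs. 4-6 are produced.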
                                                
4. Experimental Results and Discussions
      To examine the color reproduction of illumination-changed images under several kinds of illumination conditions, we designed three groups of experiments.
A. First Group of Experiments, for Pose Invariance:
Here we consider pure changes of illumination pose, as shown in Fig. 1. We first choose the image taken under two nearly normal illuminations as the standard template image for matching, and then demonstrate the matching performance on color images taken under six arbitrary poses of these two illuminants. Note that this experiment does not require transforming the color distributions to chromaticity space.
B. Second Group of Experiments, for Illumination Invariance with Multiple Illuminants:
We consider color images of the same planar surface taken under multiple color illuminations, so as to be close to a real-world environment. The experiment chooses as the standard template image the first image, taken under two main nearly normal illuminations together with several weak white illuminations in all directions. To demonstrate the matching performance, an additional eleven images of each scene were captured under eleven different illuminations: arbitrary poses, red & yellow, red & green, red & blue, red & magenta, yellow & green, yellow & blue, yellow & magenta, green & blue, green & magenta, and blue & magenta, respectively. The sketch map is shown in Fig. 2.
C. Third Group of Experiments, for Illumination Invariance with Two Illuminants:
We consider images illuminated by two different illuminants at the same time; the sketch map is shown in Fig. 3. The procedure chooses as the standard template image the first image, illuminated by just two nearly normal illuminations without any other illuminations in the scene. Similar to the second group, to demonstrate the matching performance for several illuminant colors, an additional eleven images of each scene were captured by changing these two nearly normal sources.
      We obtained these images using our image-acquisition facility, which includes a Sharp VL-H450 DIS Hi8 and Adobe Premiere 4.0. The illumination intensity was controlled to avoid clipping in any color band.
      To save space, we show only one picture from the first group. In Fig. 4, for each case, (a) shows the same image illuminated from arbitrary directions by the two nearly normal illuminations. The R-G-B color histogram of the image is displayed in (b); this distribution is not transformed to chromaticity space. The normalization result obtained with the color histogram normalization algorithm is shown in (c), and the recovered image is shown in (d). This procedure constitutes the pose-invariant matching of color images by the color histogram normalization algorithm. The complete experiments on the same image under arbitrary different illumination directions are shown in Fig. 4. These images were taken as the test images to be matched, and the matching judged from the recovered images is quite correct. The PSNR values of the recovered images are listed in Table 1, from which it is easy to see that the matching performance is good. In the table, the entry PSNR_R denotes the peak signal-to-noise ratio (PSNR) of the red band of the color image, and so on, and the last column gives the average PSNR over the three bands. The PSNR values of the recovered images are about 23~32 dB for the first group.
      Similarly, we show only one picture each from the second and the third group. The complete experiments concern the illumination-invariant matching of color images with multiple illuminants, with an additional eleven images captured for each scene. Here, in order to compare the matching performance, we implement the illumination-invariant procedure in the R-G-B coordinate and in the chromaticity space at the same time. The resulting images in chromaticity space are not displayed, to save space; only their PSNR values are listed, in Table 2 and Table 3. Fig. 5 and Fig. 6 show the experimental results of the second and the third group, respectively, and their PSNR values in the R-G-B coordinate are also given in Table 2 and Table 3. The PSNR values of the recovered images are about 18~29 dB in the R-G-B coordinate and 12~27 dB in chromaticity space for the second group, and about 16~29 dB in the R-G-B coordinate and 12~28 dB in chromaticity space for the third group. It is worth mentioning that, unlike [1], [2], we do not have to remove the specularity component beforehand to achieve good matching performance with the color histogram normalization algorithm; that is, with our algorithm the specular effect needs no special treatment when matching the color images.
      In these experiments, a matching performance of about 20 dB is regarded as good recovery. To see the difference in matching performance between the R-G-B coordinate and chromaticity space, compare the PSNR tables: the recovered images in the R-G-B coordinate are better than those in chromaticity space by about 2~3 dB on average. Recall from Section 2 that the intensity information of the color distribution is removed in chromaticity space; this is probably the reason for the poorer outcomes in chromaticity space. We therefore conclude that, when matching images with the color histogram normalization algorithm, the color distributions do not need to be transformed into chromaticity space; that is, whatever the illumination conditions are, the histograms in the R-G-B coordinate yield the better image matching. Finally, in order to remove the illumination color and obtain the pure surface reflectance color of the image, we have assumed a finite-dimensional linear surface reflectance model in which changing the illumination color causes a linear transformation of the color distribution in color space; in other words, when the illumination varies, the two 3D color histograms are related by a simplified affine transformation (translation, scaling, and rotation) of the coordinates.
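      For reference, the per-band PSNR reported in Tables 1-3 could be computed as in the sketch below; the peak value of 255 assumes 8-bit image data.

```python
import numpy as np

def band_psnr(reference, recovered, peak=255.0):
    """Per-band and average PSNR between a reference image and a recovered
    image, as reported in Tables 1-3. Images are (H, W, 3) arrays."""
    ref = np.asarray(reference, dtype=np.float64)
    rec = np.asarray(recovered, dtype=np.float64)
    mse = ((ref - rec) ** 2).reshape(-1, 3).mean(axis=0)        # per-band MSE
    psnr = 10.0 * np.log10(peak ** 2 / np.maximum(mse, 1e-12))  # [PSNR_R, PSNR_G, PSNR_B]
    return psnr, psnr.mean()
```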
5. Conclusions
      The color constancy problem is to remove the illumination-source color and obtain the pure surface reflectance color of an object. The problem arises from many natural factors: the color of an object varies with changes in the illumination source, the illumination geometry, the viewing angle, and miscellaneous sensor parameters. To address the color constancy problem thoroughly, the effects of shadow and of multiple illumination conditions in color images must be considered at the same time. In this paper, the color images are taken in an environment that is as close to the real world as possible, so we have designed three groups of experiments, with different illumination poses, multiple color illuminations, and two illuminant sources, respectively. All the recovered images are matched well by the color histogram normalization algorithm in the R-G-B coordinate, and the matching performance is quite successful and satisfactory. We therefore conclude that, whatever the environment in which the color images are taken, the color histogram normalization algorithm computed in the R-G-B coordinate can achieve pose- (illumination direction-) and multiple illumination- (illumination color-) invariant matching of color images.
References
[1] S. Lin and S. W. Lee, "Using chromaticity distributions and eigenspace analysis for pose-, illumination-, and specularity-invariant recognition of 3D objects," Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 426-431, 1997.
[2] D. Berwick and S. W. Lee, "A chromaticity space for specularity, illumination color- and illumination pose-invariant 3-D object recognition," Proc. of 6th International Conference on Computer Vision, pp. 165-170, 1998.
[3] L. Maloney and B. Wandell, "Color constancy: a method for recovering surface spectral reflectance," J. Opt. Soc. Am. A, vol. 3, pp. 29-33, Oct. 1986.
[4] D. Slater and G. Healey, "The illumination-invariant matching of deterministic local structure in color images," IEEE Trans. Pattern Anal. Machine Intell., vol. 19, pp. 1146-1151, Oct. 1997.
[5] S. C. Pei, C. L. Tseng and C. C. Wu, "The illuminant-invariant matching of images using color histogram normalization," Proc. of 4th Asian Conference on Computer Vision, Taipei, Taiwan, R.O.C., Jan. 2000.
[6] S. C. Pei, C. L. Tseng and C. C. Wu, "Using color histogram normalization for recovering chromatic illumination-changed images," J. Opt. Soc. Am. A, vol. 18, pp. 2641-2658, Nov. 2001.




Fig. 1: The sketch map of the first group experiments.
Fig. 2: The sketch map of the second group experiments.
Fig. 3: The sketch map of the third group experiments.

Table 1: The PSNR values of the first group experiment in R-G-B coordinate (unit: dB)

  Category     | PSNR_R  | PSNR_G  | PSNR_B  | PSNR (avg)
  Left Light   | 27.7288 | 27.4246 | 26.5112 | 27.2215
  Right Light  | 30.3364 | 30.7534 | 31.1855 | 30.7585
  Pose 1       | 29.7580 | 29.7112 | 29.2329 | 29.5674
  Pose 2       | 30.9791 | 31.2159 | 28.9041 | 30.3664
  Pose 3       | 24.8834 | 24.7253 | 24.8212 | 24.8101
Table 2: The PSNR values of the second group experiment in R-G-B coordinate and in chromaticity space (unit: dB)

  Category     | PSNR_R (RGB / Chrom.) | PSNR_G (RGB / Chrom.) | PSNR_B (RGB / Chrom.) | PSNR avg (RGB / Chrom.)
  Change pose  | 25.2728 / 22.7411     | 28.0951 / 26.0642     | 28.6884 / 24.0304     | 27.3521 / 24.2786
  R+Y          | 23.6054 / 18.8927     | 25.0977 / 22.7103     | 22.5458 / 21.1668     | 23.7496 / 20.9233
  R+G          | 24.8995 / 18.5111     | 24.9596 / 21.5789     | 23.2184 / 19.5842     | 24.3592 / 19.8914
  R+B          | 26.4346 / 14.3355     | 26.1632 / 18.6790     | 26.4796 / 16.2594     | 26.3592 / 16.4246
  R+M          | 25.3164 / 14.5824     | 24.0737 / 19.0919     | 25.0355 / 16.8722     | 24.8085 / 16.8488
  Y+G          | 28.8681 / 26.8529     | 28.8546 / 24.7823     | 23.9369 / 20.3479     | 27.2199 / 23.9944
  Y+B          | 30.5304 / 28.8776     | 29.6613 / 27.8767     | 27.7730 / 26.9348     | 29.3215 / 27.8964
  Y+M          | 29.2239 / 27.2889     | 28.7254 / 27.3111     | 26.1978 / 25.5520     | 28.0491 / 26.7173
  G+B          | 19.2959 / 19.2195     | 27.2467 / 17.7500     | 29.6207 / 25.4751     | 25.3877 / 20.8149
  G+M          | 20.3226 / 19.9888     | 27.6462 / 20.2403     | 29.2867 / 26.2672     | 25.7518 / 22.1655
  B+M          | 23.3163 / 22.2497     | 27.7371 / 21.8275     | 29.7381 / 20.6341     | 26.9305 / 21.5705

Table 3: The PSNR values of the third group experiment in R-G-B coordinate and in chromaticity space (unit: dB)

  Category     | PSNR_R (RGB / Chrom.) | PSNR_G (RGB / Chrom.) | PSNR_B (RGB / Chrom.) | PSNR avg (RGB / Chrom.)
  Change pose  | 30.6959 / 23.7894     | 30.6074 / 23.0230     | 28.4798 / 24.3600     | 29.9277 / 23.7241
  R+Y          | 24.2000 / 16.2143     | 27.2014 / 18.8782     | 15.1769 / 15.8337     | 22.1928 / 16.9754
  R+G          | 23.6680 / 25.4296     | 22.7698 / 21.7151     | 21.4350 / 19.6916     | 22.6243 / 22.2788
  R+B          | 27.5565 / 21.1808     | 28.6278 / 26.5274     | 21.1715 / 16.9644     | 25.7853 / 21.5575
  R+M          | 18.9670 / 13.2777     | 26.3043 / 20.2655     | 19.2169 / 19.7924     | 21.4961 / 17.7786
  Y+G          | 23.0700 / 12.8082     | 27.9338 / 17.9062     | 14.8161 / 16.2502     | 21.9400 / 15.6548
  Y+B          | 28.4828 / 27.8319     | 30.2984 / 30.1189     | 29.4437 / 26.5979     | 29.4083 / 28.1829
  Y+M          | 26.0500 / 25.9281     | 28.0537 / 27.3294     | 27.8538 / 26.0972     | 27.3192 / 26.4516
  G+B          | 23.6659 / 26.1303     | 25.5598 / 16.8724     | 27.2563 / 21.2461     | 25.4940 / 21.4163
  G+M          | 26.1849 / 27.7769     | 27.2370 / 19.9100     | 27.4137 / 25.8640     | 26.9452 / 24.5170
  B+M          | 23.1540 / 16.2535     | 25.0193 / 20.5128     | 21.5783 / 16.1557     | 23.2505 / 17.6407

(R = red, Y = yellow, G = green, B = blue, M = magenta illuminant pairs.)




[Figure panels omitted. Panel titles: Two standard illuminations pose; Only one left-direction illumination; Only one right-direction illumination; Arbitrarily change two illumination pose 1; Arbitrarily change two illumination pose 2; Arbitrarily change two illumination pose 3.]
Fig. 4: (a) Image; (b) color histogram; (c) normalized color histogram; (d) recovered image (in R-G-B coordinate).




[Figure panels omitted. Panel titles: Several standard illuminations; Change two main illuminations pose; Red and Yellow illuminations; Red and Green illuminations; Red and Blue illuminations; Red and Magenta illuminations; Yellow and Green illuminations; Yellow and Blue illuminations; Yellow and Magenta illuminations; Green and Blue illuminations; Green and Magenta illuminations; Blue and Magenta illuminations.]
Fig. 5: (a) Image; (b) color histogram; (c) normalized color histogram; (d) recovered image (in R-G-B coordinate).




[Figure panels omitted. Panel titles: Two standard illuminations; Change two main illuminations pose; Red and Yellow illuminations; Red and Green illuminations; Red and Blue illuminations; Red and Magenta illuminations; Yellow and Green illuminations; Yellow and Blue illuminations; Yellow and Magenta illuminations; Green and Blue illuminations; Green and Magenta illuminations; Blue and Magenta illuminations.]
Fig. 6: (a) Image; (b) color histogram; (c) normalized color histogram; (d) recovered image (in R-G-B coordinate).



