Information Rates for Two-Dimensional ISI Channels



             Jiangxin Chen and Paul H. Siegel

          Center for Magnetic Recording Research
            University of California, San Diego

                    DIMACS Workshop
                    March 22-24, 2004




3/23/04                                            1
                              Outline
• Motivation: Two-dimensional recording
• Channel model
• Information rates
• Bounds on the Symmetric Information Rate (SIR)
  • Upper bound
  • Lower bound
  • Convergence
• Alternative upper bound
• Numerical results
• Conclusions

          Two-Dimensional Channel Model
• Constrained input array x[i,j]

• Linear intersymbol interference h[i,j]

• Additive i.i.d. Gaussian noise n[i,j] \sim N(0, \sigma^2)

    y[i,j] = \sum_{k=0}^{n_1-1} \sum_{l=0}^{n_2-1} h[k,l] \, x[i-k, j-l] + n[i,j]
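The channel model above can be sketched directly. The following is an illustrative simulation of the 2-D ISI channel equation, with inputs outside the array treated as zero (a boundary assumption not stated in the slides); the function name and interface are hypothetical.

```python
import numpy as np

def channel_output(x, h, sigma, rng):
    """y[i,j] = sum_{k,l} h[k,l] x[i-k, j-l] + n[i,j], with x taken to
    be zero outside the array, and n[i,j] i.i.d. N(0, sigma^2)."""
    m, n = x.shape
    n1, n2 = h.shape
    y = np.zeros((m, n))
    for k in range(n1):
        for l in range(n2):
            # add h[k,l] times x shifted down k rows and right l columns
            y[k:, l:] += h[k, l] * x[:m - k, :n - l]
    return y + rng.normal(0.0, sigma, size=(m, n))
```

With sigma = 0 and the all-ones input, interior outputs equal the sum of the impulse response taps.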
          Two-Dimensional Processes
• Input process: X = \{X[i,j]\}

• Output process: Y = \{Y[i,j]\}

• Array Y_{i,j}^{i+m-1, j+n-1}:

  upper left corner: Y[i,j]

  lower right corner: Y[i+m-1, j+n-1]
                       Entropy Rates


• Output entropy rate:

    H(Y) = \lim_{m,n\to\infty} \frac{1}{mn} H(Y_{1,1}^{m,n})

• Noise entropy rate:

    H(N) = \frac{1}{2} \log(\pi e N_0)

• Conditional entropy rate:

    H(Y|X) = \lim_{m,n\to\infty} \frac{1}{mn} H(Y_{1,1}^{m,n} | X_{1,1}^{m,n}) = H(N)
               Mutual Information Rates
• Mutual information rate:

    I(X;Y) = H(Y) - H(Y|X) = H(Y) - H(N)

• Capacity:

    C = \max_{P(X)} I(X;Y)

• Symmetric information rate (SIR):
  Inputs X = \{x[i,j]\} are constrained to be independent, identically
  distributed, and equiprobable binary.
              Capacity and SIR

• The capacity and SIR are useful measures of
  the achievable storage densities on the two-
  dimensional channel.

• They serve as performance benchmarks for
  channel coding and detection methods.

• So, it would be nice to be able to compute
  them.


          Finding the Output Entropy Rate
• For the one-dimensional ISI channel model:

    H(Y) = \lim_{n\to\infty} \frac{1}{n} H(Y_1^n)

  and

    \frac{1}{n} H(Y_1^n) = -\frac{1}{n} E[\log p(Y_1^n)]

  where Y_1^n = (Y[1], Y[2], \ldots, Y[n]).
                        Sample Entropy Rate

• If we simulate the channel N times, using inputs with specified
  (Markovian) statistics, and generate output realizations

    y^{(k)} = (y[1]^{(k)}, y[2]^{(k)}, \ldots, y[n]^{(k)}),  k = 1, 2, \ldots, N,

  then

    -\frac{1}{N} \sum_{k=1}^{N} \log p(y^{(k)})

  converges to H(Y_1^n) with probability 1 as N \to \infty.
           Computing Sample Entropy Rate
• The forward recursion of the sum-product (BCJR) algorithm can be used
  to calculate the probability p(y_1^n) of a sample realization of the
  channel output.

• In fact, we can write

    \frac{1}{n} \log p(y_1^n) = \frac{1}{n} \sum_{i=1}^{n} \log p(y_i | y_1^{i-1})

  where the quantity p(y_i | y_1^{i-1}) is precisely the normalization
  constant in the (normalized) forward recursion.
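The normalized forward recursion can be sketched concretely. The example below uses the one-dimensional dicode channel y_i = x_i - x_{i-1} + n_i with i.i.d. equiprobable inputs in {-1, +1} — an illustrative choice, not the two-dimensional channel of these slides. Each normalization constant equals p(y_i | y_1^{i-1}), so the log-constants sum to log p(y_1^n).

```python
import numpy as np

def sample_entropy_rate(y, sigma):
    """-(1/n) log p(y_1^n) for the dicode channel, via the normalized
    BCJR forward recursion (sketch; dicode is an assumed example)."""
    states = (-1.0, 1.0)              # state = previous input symbol
    alpha = np.array([0.5, 0.5])      # uniform initial state distribution
    logp = 0.0
    for yi in y:
        nxt = np.zeros(2)
        for si, s in enumerate(states):        # previous input
            for ti, t in enumerate(states):    # current input, prob 1/2
                mean = t - s                   # noiseless dicode output
                like = (np.exp(-(yi - mean) ** 2 / (2 * sigma ** 2))
                        / np.sqrt(2 * np.pi * sigma ** 2))
                nxt[ti] += 0.5 * alpha[si] * like
        c = nxt.sum()                 # = p(y_i | y_1^{i-1})
        logp += np.log(c)
        alpha = nxt / c               # normalized forward state metrics
    return -logp / len(y)
```

Averaging this quantity over a long realization gives the sample entropy rate estimate described on the previous slides.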
          Computing Entropy Rates
• The Shannon-McMillan-Breiman theorem implies

    -\frac{1}{n} \log p(y_1^n) \to H(Y)  a.s.

  as n \to \infty, where y_1^n is a single long sample realization of
  the channel output process.
          SIR for Partial-Response Channels




          Capacity Bounds for Dicode




              Markovian Sufficiency


   Remark: It can be shown that optimized Markovian processes whose
   states are determined by their previous r symbols can asymptotically
   achieve the capacity of finite-state intersymbol interference
   channels with AWGN as the order r of the input process approaches
   infinity.
   (J. Chen and P.H. Siegel, ISIT 2004)




     Capacity and SIR in Two Dimensions

• In two dimensions, we could estimate H(Y) by calculating the sample
  entropy rate of a very large simulated output array.

• However, there is no counterpart of the BCJR algorithm in two
  dimensions to simplify the calculation.

• Instead, we use conditional entropies to derive upper and lower
  bounds on H(Y).

                          Array Ordering
• Permuted lexicographic ordering:

• Choose a vector k = (k_1, k_2), a permutation of (1, 2).

• Map each array index (t_1, t_2) to (t_{k_1}, t_{k_2}).

• Then (s_1, s_2) precedes (t_1, t_2) if

    s_{k_1} < t_{k_1},  or  s_{k_1} = t_{k_1} and s_{k_2} < t_{k_2}.

• Therefore,
  k = (1, 2): row-by-row ordering
  k = (2, 1): column-by-column ordering
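The ordering rule above amounts to comparing the permuted coordinates lexicographically, which a short helper (illustrative, not from the slides) makes explicit:

```python
def precedes(s, t, k):
    """True if index s = (s1, s2) precedes t = (t1, t2) under the
    permuted lexicographic ordering given by permutation k of (1, 2)."""
    k1, k2 = k
    a = (s[k1 - 1], s[k2 - 1])   # compare coordinate k1 first, then k2
    b = (t[k1 - 1], t[k2 - 1])
    return a < b                 # tuple comparison is lexicographic
```

For k = (1, 2) this is ordinary row-by-row order; for k = (2, 1) the column index dominates.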
                    Two-Dimensional “Past”

• Let l = (l_1, l_2, l_3, l_4) be a non-negative vector.

• Define Past_{k,l}\{Y[i,j]\} to be the elements preceding Y[i,j]
  inside the region

    Y_{i-l_1, j-l_3}^{i+l_2, j+l_4}   (with permutation k)
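The definition can be enumerated directly: take the rectangular window around (i, j) and keep the indices that precede (i, j) under the chosen ordering. This sketch (names are illustrative) combines the window and the permuted lexicographic comparison:

```python
def past_indices(i, j, l, k):
    """Indices of Past_{k,l}{Y[i,j]}: points of the window with rows
    i-l1 .. i+l2 and columns j-l3 .. j+l4 that precede (i, j) under
    the permuted lexicographic ordering k."""
    l1, l2, l3, l4 = l
    k1, k2 = k
    key = lambda t: (t[k1 - 1], t[k2 - 1])   # permuted comparison key
    return [(a, b)
            for a in range(i - l1, i + l2 + 1)
            for b in range(j - l3, j + l4 + 1)
            if key((a, b)) < key((i, j))]
```

For row-by-row ordering with l = (1, 0, 1, 0), the past of (1, 1) is the preceding row segment plus the left neighbor.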
          Examples of Past{Y[i,j]}




                 Conditional Entropies
• For a stationary two-dimensional random field Y on the integer
  lattice, the entropy rate satisfies

    H(Y) = H(Y[i,j] | Past_{k,\infty}\{Y[i,j]\})

  (The proof uses the entropy chain rule; see [5, 6].)

• This extends to random fields on the hexagonal lattice, via the
  natural mapping to the integer lattice.
                 Upper Bound on H(Y)
• For a stationary two-dimensional random field Y,

    H(Y) \le \min_{k} H_{k,l}^{U1}

  where

    H_{k,l}^{U1}(Y) = H(Y[i,j] | Past_{k,l}\{Y[i,j]\})
      Two-Dimensional Boundary of Past{Y[i,j]}

• Define Strip_{k,l}\{Y[i,j]\} to be the boundary of
  Past_{k,l}\{Y[i,j]\}.

• The exact expression for Strip_{k,l}\{Y[i,j]\} is messy, but the
  geometrical concept is simple.
      Two-Dimensional Boundary of Past{Y[i,j]}




                  Lower Bound on H(Y)
• For a stationary two-dimensional hidden Markov field Y,

    H(Y) \ge \max_{k} H_{k,l}^{L1}

  where

    H_{k,l}^{L1}(Y) = H(Y[i,j] | Past_{k,l}\{Y[i,j]\}, X_{St_{k,l}}\{Y[i,j]\})

  and X_{St_{k,l}}\{Y[i,j]\} is the "state information" for the strip
  Strip_{k,l}\{Y[i,j]\}.
                    Sketch of Proof
• Upper bound:

  Note that Past_{k,l}\{Y[i,j]\} \subset Past_{k,\infty}\{Y[i,j]\},
  and that conditioning reduces entropy.

• Lower bound:

  Use the Markov property of Y[i,j], given the "state information"
  X_{St_{k,l}}\{Y[i,j]\}.
            Convergence Properties
• The upper bound H_{k,l}^{U1} on the entropy rate is monotonically
  non-increasing as the size of the array defined by
  l = (l_1, l_2, l_3, l_4) increases.

• The lower bound H_{k,l}^{L1} on the entropy rate is monotonically
  non-decreasing as the size of the array defined by
  l = (l_1, l_2, l_3, l_4) increases.
                    Convergence Rate
• The upper bound H_{k,l}^{U1} and lower bound H_{k,l}^{L1} converge to
  the true entropy rate H(Y) at least as fast as O(1/l_{min}), where

    l_{min} = \min(l_1, l_3, l_4)  for row-by-row ordering k,
    l_{min} = \min(l_1, l_2, l_3)  for column-by-column ordering k.
              Computing the SIR Bounds

• Estimate the two-dimensional conditional entropies H(A|B) over a
  small array.

• Calculate P(A,B) and P(B) to get P(A|B) for many realizations of the
  output array.

• For column-by-column ordering, treat each row Y[i] as a variable and
  calculate the joint probability P(Y[1], Y[2], \ldots, Y[m])
  row-by-row using the BCJR forward recursion.

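The Monte Carlo step above averages log P(A|B) = log P(A,B) - log P(B) over realizations. A minimal sketch of that estimator, with an illustrative interface (the slides do not specify one):

```python
import numpy as np

def mc_conditional_entropy(draw_ab, log_p_ab, log_p_b, num_samples, rng):
    """Monte Carlo estimate of H(A|B) = -E[log P(A,B) - log P(B)].
    draw_ab(rng) yields one (a, b) realization; log_p_ab and log_p_b
    evaluate the joint and marginal log-probabilities."""
    total = 0.0
    for _ in range(num_samples):
        a, b = draw_ab(rng)
        total += log_p_ab(a, b) - log_p_b(b)
    return -total / num_samples
```

In the slides' setting, A is the symbol Y[i,j], B its past, and the log-probabilities come from the row-by-row BCJR forward recursion.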
                   2x2 Impulse Response
•   "Worst-case" scenario, large ISI:

      h_1[i,j] = \begin{bmatrix} 0.5 & 0.5 \\ 0.5 & 0.5 \end{bmatrix}

•   Conditional entropies computed from 100,000 realizations.

•   Upper bound:

      \min\left\{ H_{(2,1),(7,7,3,0)}^{U1} - \frac{1}{2}\log(\pi e N_0),\; 1 \right\}

•   Lower bound:

      H_{(2,1),(7,7,3,0)}^{L1} - \frac{1}{2}\log(\pi e N_0)

    (corresponds to the element in the middle of the last column)
          Two-Dimensional “State”




          SIR Bounds for 2x2 Channel




          Computing the SIR Bounds

• The number of states for each variable increases
  exponentially with the number of columns in the
  array.

• This requires that the two-dimensional impulse
  response have a small support region.

• It is desirable to find other approaches to computing
  bounds that reduce the complexity, perhaps at the
  cost of weakening the resulting bounds.



                        Alternative Upper Bound
• The modified BCJR approach is limited to small impulse response
  support regions.

• Introduce an "auxiliary ISI channel" and bound

    H(Y) \le H_{k,l}^{U2}

  where

    H_{k,l}^{U2} = -\int p(y[i,j], Past_{k,l}\{y[i,j]\}) \log q(y[i,j] | Past_{k,l}\{y[i,j]\}) \, dy

  and q(y[i,j] | Past_{k,l}\{y[i,j]\}) is an arbitrary conditional
  probability distribution.
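Because the bound holds for any auxiliary model q, it can be estimated by Monte Carlo as a cross-entropy: average -log q over realizations drawn from the true channel. A sketch with an illustrative interface:

```python
import numpy as np

def auxiliary_upper_bound(pairs, log_q):
    """Monte Carlo estimate of H^{U2}_{k,l} = -E[log q(y[i,j] | Past)].
    `pairs` holds (sample, past) realizations drawn from the TRUE
    channel; log_q is any auxiliary conditional model.  By Gibbs'
    inequality the estimate upper-bounds the true conditional entropy,
    with equality when q is the true conditional distribution."""
    return -float(np.mean([log_q(y, past) for y, past in pairs]))
```

The looser q is as a model of the true conditionals, the weaker (larger) the resulting upper bound.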
          Choosing the Auxiliary Channel
• Assume q(y[i,j] | Past_{k,l}\{y[i,j]\}) is the conditional
  probability distribution of the output of an auxiliary ISI channel.

• A one-dimensional auxiliary channel permits a calculation based upon
  a larger number of columns in the output array.

• Conversion of the two-dimensional array into a one-dimensional
  sequence should "preserve" the statistical properties of the array.

• Pseudo-Peano-Hilbert space-filling curves can be used to convert a
  rectangular array into a sequence.
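To illustrate serializing a 2-D array while keeping successive samples spatially adjacent, here is a simple boustrophedon (serpentine) scan. This is a stand-in for the pseudo-Peano-Hilbert curve of the slides, which keeps neighborhoods more compact; the function name is hypothetical.

```python
def serpentine_scan(m, n):
    """One-dimensional visiting order for an m x n array: left-to-right
    on even rows, right-to-left on odd rows, so consecutive indices are
    always adjacent in the array."""
    order = []
    for i in range(m):
        cols = range(n) if i % 2 == 0 else range(n - 1, -1, -1)
        order.extend((i, j) for j in cols)
    return order
```

Adjacency of consecutive indices is what lets a one-dimensional auxiliary channel with short memory model the serialized sequence reasonably well.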
             Pseudo-Peano-Hilbert Curve


 Y i , j   Past 2 ,1,7 ,8,7 ,l4 Y i , j 




          SIR Bounds for 2x2 Channel



             (Figure: alternative upper bounds indicated.)




              3x3 Impulse Response
•   Two-DOS transfer function:

      h_2[i,j] = \frac{1}{10} \begin{bmatrix} 0 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 0 \end{bmatrix}

•   Auxiliary one-dimensional ISI channel with memory length 4.

•   Useful upper bound up to E_b/N_0 = 3 dB.



          SIR Upper Bound for 3x3 Channel




             Concluding Remarks
• Upper and lower bounds on the SIR of two-
  dimensional finite-state ISI channels were presented.

• Monte Carlo methods were used to compute the
  bounds for channels with small impulse response
  support region.

• The bounds can be extended to multi-dimensional ISI channels.

• Further work is required to develop computable,
  tighter bounds for general multi-dimensional ISI
  channels.

                                References
1.  D. Arnold and H.-A. Loeliger, "On the information rate of binary-input
    channels with memory," IEEE International Conference on Communications,
    Helsinki, Finland, June 2001, vol. 9, pp. 2692-2695.
2.  H.D. Pfister, J.B. Soriaga, and P.H. Siegel, "On the achievable
    information rate of finite state ISI channels," Proc. Globecom 2001,
    San Antonio, TX, November 2001, vol. 5, pp. 2992-2996.
3.  V. Sharma and S.K. Singh, "Entropy and channel capacity in the
    regenerative setup with applications to Markov channels," Proc. IEEE
    International Symposium on Information Theory, Washington, DC, June
    2001, p. 283.
4.  A. Kavcic, "On the capacity of Markov sources over noisy channels,"
    Proc. Globecom 2001, San Antonio, TX, November 2001, vol. 5, pp.
    2997-3001.
5.  D. Arnold, H.-A. Loeliger, and P.O. Vontobel, "Computation of
    information rates from finite-state source/channel models," Proc. 40th
    Annual Allerton Conf. Commun., Control, and Computing, Monticello, IL,
    October 2002, pp. 457-466.


                           References
6.  Y. Katznelson and B. Weiss, "Commuting measure-preserving
    transformations," Israel J. Math., vol. 12, pp. 161-173, 1972.
7.  D. Anastassiou and D.J. Sakrison, "Some results regarding the entropy
    rates of random fields," IEEE Trans. Inform. Theory, vol. 28, no. 2,
    pp. 340-343, March 1982.





				