Complexity of Insertion Sort

                    Asymptotic Notation (O, Ω, Θ)
 • Describes the behavior of the time or
   space complexity for large instance
   characteristics
 • Common asymptotic functions
    – 1 (constant), log n, n (linear)
    – n log n, n^2, n^3
    – 2^n (exponential), n!
    – where n usually refers to the number of
      data items
               Common asymptotic functions
 [growth-rate comparison charts for the functions above; not reproduced in this text version]
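Since the charts do not survive here, a minimal Python sketch (mine, not part of the original slides) that tabulates the same functions gives a similar picture:

```python
import math

# Tabulate the common asymptotic functions listed above for a few values of n,
# so their relative growth rates are visible.
def growth_table(sizes=(1, 10, 100, 1000)):
    print(f"{'n':>6} {'log n':>8} {'n log n':>10} {'n^2':>10} {'n^3':>12} {'2^n':>10}")
    for n in sizes:
        log_n = math.log2(n)
        two_n = 2 ** n if n <= 30 else float("inf")   # 2^n quickly becomes astronomical
        print(f"{n:>6} {log_n:>8.1f} {n * log_n:>10.1f} {n**2:>10} {n**3:>12} {two_n:>10}")

growth_table()
```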
                Big-O notation
 • The big-O notation provides an upper
   bound for the function f
 • Definition:
    – f(n) = O(g(n)) iff positive constants c and n0
      exist such that f(n) <= c g(n) for all n >= n0
    – e.g. f(n) = 10n^2 + 4n + 2
      then f(n) = O(n^2), or O(n^3), O(n^4), …
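A quick numeric check of the example above; the constants c = 11 and n0 = 5 are my own illustrative choice, since the definition only requires that some such constants exist:

```python
# f(n) = 10n^2 + 4n + 2 from the example above.
def f(n):
    return 10 * n**2 + 4 * n + 2

c, n0 = 11, 5                                       # illustrative constants (my choice)
assert all(f(n) <= c * n**2 for n in range(n0, 10_000))
print("f(n) <= 11 n^2 for all tested n >= 5, consistent with f(n) = O(n^2)")
```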
              Ω (Omega) notation
 • The Ω notation provides a lower bound
   for the function f
 • Definition:
    – f(n) = Ω(g(n)) iff positive constants c and n0
      exist such that f(n) >= c g(n) for all n >= n0
    – e.g. f(n) = 10n^2 + 4n + 2
      then f(n) = Ω(n^2), or Ω(n), Ω(1)
              Θ (Theta) notation
 • The Θ notation is used when the function
   f can be bounded both from above and
   below by the same function g
 • Definition:
    – f(n) = Θ(g(n)) iff positive constants c1 and c2,
      and an n0, exist such that c1 g(n) <= f(n) <= c2 g(n)
      for all n >= n0
    – e.g. f(n) = 10n^2 + 4n + 2
      then f(n) = Ω(n^2) and f(n) = O(n^2), therefore
      f(n) = Θ(n^2)
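Continuing the same example: the lower bound makes f(n) = Ω(n^2), and together with the upper bound checked earlier this gives f(n) = Θ(n^2). The constants c1 = 10, c2 = 11, n0 = 5 are again my own illustrative choice:

```python
def f(n):
    return 10 * n**2 + 4 * n + 2

c1, c2, n0 = 10, 11, 5
assert all(f(n) >= c1 * n**2 for n in range(n0, 10_000))               # Omega(n^2)
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))  # Theta(n^2)
print("10 n^2 <= f(n) <= 11 n^2 for all tested n >= 5, so f(n) = Theta(n^2)")
```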
            Practical Complexities
            (10^9 instructions/second)

 n        n         n log n    n^2        n^3
 1000     1 μs      10 μs      1 ms       1 s
 10000    10 μs     130 μs     100 ms     17 min
 10^6     1 ms      20 ms      17 min     32 years
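The table entries can be reproduced by dividing the operation count by the assumed speed of 10^9 instructions per second; a small sketch (the printed values come out approximate):

```python
import math

SPEED = 1e9  # instructions per second, as assumed in the table

def seconds(ops):
    return ops / SPEED

for n in (1_000, 10_000, 1_000_000):
    print(f"n = {n:>9}: "
          f"n: {seconds(n):.0e} s, "
          f"n log n: {seconds(n * math.log2(n)):.0e} s, "
          f"n^2: {seconds(n**2):.0e} s, "
          f"n^3: {seconds(n**3):.0e} s")
```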
           Impractical Complexities
           (10^9 instructions/second)

 n        n^4             n^10                 2^n
 1000     17 min          3.2 x 10^13 years    3.2 x 10^283 years
 10000    116 days        ???                  ???
 10^6     3 x 10^7 years  ??????               ??????
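The same arithmetic, under the same 10^9 instructions/second assumption, shows why these rows are hopeless; 2^1000 does not even fit in a double, so its order of magnitude is estimated via logarithms (my own rough estimate):

```python
import math

SPEED = 1e9
SECONDS_PER_YEAR = 3600 * 24 * 365

def years(ops):
    return ops / SPEED / SECONDS_PER_YEAR

print(f"n = 1000, n^10: {years(1000**10):.1e} years")      # about 3.2e13 years
print(f"n = 10^6, n^4 : {years((10**6) ** 4):.1e} years")   # about 3e7 years

# 2^1000 operations are too many to convert to a float, so estimate the exponent.
log10_years = 1000 * math.log10(2) - math.log10(SPEED) - math.log10(SECONDS_PER_YEAR)
print(f"n = 1000, 2^n : on the order of 10^{log10_years:.0f} years")
```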
        Faster Computer vs Better Algorithm

 An algorithmic improvement is usually more useful than a
 hardware improvement, e.g. going from an O(2^n) algorithm
 to an O(n^3) one.
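A rough illustration of the point, with a one-second time budget and machine speeds chosen by me: a 1000x faster machine adds only about log2(1000) ≈ 10 to the largest n an O(2^n) algorithm can handle, while the largest n for an O(n^3) algorithm grows by a factor of 10:

```python
import math

# Largest n solvable within a fixed time budget on a machine doing `speed` ops/second,
# assuming the algorithm performs exactly n^3 or 2^n operations (a simplification).
def max_n_cubic(speed, budget=1.0):
    return round((speed * budget) ** (1 / 3))

def max_n_exponential(speed, budget=1.0):
    return int(math.log2(speed * budget))

for speed in (1e9, 1e12):  # current machine vs a 1000x faster one
    print(f"{speed:.0e} ops/s: n^3 handles n ~ {max_n_cubic(speed)}, "
          f"2^n handles n ~ {max_n_exponential(speed)}")
```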
     Amortized Complexity of Task Sequence
 • Suppose that a sequence of n tasks is
   performed.
 • The worst-case cost of a task is cwc.
 • Let ci be the (actual) cost of the ith task in this
   sequence.
 • So, ci <= cwc, 1 <= i <= n.
 • n * cwc is an upper bound on the cost of the
   sequence.
 • j * cwc is an upper bound on the cost of the
   first j tasks.
               Task Sequence
 • Let cavg be the average cost of a task in
   this sequence.
 • So, cavg = (Σ ci) / n.
 • n * cavg is the cost of the sequence.
 • j * cavg need not be an upper bound on the
   cost of the first j tasks.
 • Usually, determining cavg is quite hard.
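A tiny made-up sequence (the numbers are mine) illustrating why j * cavg need not bound the cost of the first j tasks, while j * cwc always does:

```python
costs = [5, 1, 1, 1, 1, 1]            # actual task costs c1..c6; the expensive task comes first
cwc = max(costs)                      # worst-case cost of a task = 5
cavg = sum(costs) / len(costs)        # average cost = 10/6

j = 1
print(f"actual cost of first {j} task(s): {sum(costs[:j])}")
print(f"j * cavg = {j * cavg:.2f}  (NOT an upper bound here)")
print(f"j * cwc  = {j * cwc}     (always an upper bound)")
```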
           Amortized Complexity

 • At times, a better upper bound on the sequence
   cost than j * cwc or n * cwc can be obtained
   using amortized complexity.
 • The purpose of amortized analysis is to get a
   tighter upper bound on the actual cost.
            Amortized Complexity
 • The amortized complexity of a task is the amount
   you charge the task.
 • The conventional way to bound the cost of doing a
   task n times is to use one of the expressions
    – n * (worst-case cost of a task)
    – Σ (worst-case cost of task i)
 • The amortized-complexity way to bound the cost
   of doing a task n times is to use one of the
   expressions
    – n * (amortized cost of a task)
    – Σ (amortized cost of task i)
           Amortized Complexity
 • The amortized complexity/cost of the individual
   tasks in any task sequence must satisfy
      Σ (actual cost of task i)
      <= Σ (amortized cost of task i)
 • So, we can use
      Σ (amortized cost of task i)
   as a bound on the actual complexity of the
   task sequence.
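That inequality is the only requirement the charges must meet; a generic checker, written as an illustration rather than anything from the slides:

```python
def is_valid_amortization(actual_costs, amortized_costs):
    """True iff the amortized costs can serve as a bound on the sequence cost."""
    return sum(amortized_costs) >= sum(actual_costs)

print(is_valid_amortization([1, 1, 8], [2, 2, 6]))   # True:  10 >= 10
print(is_valid_amortization([1, 1, 8], [2, 2, 2]))   # False:  6 <  10
```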
            Amortized Complexity
 • The amortized complexity of a task may
   bear no direct relationship to the actual
   complexity of the task.
            Amortized Complexity
 • In conventional complexity analysis, each
   task is charged an amount that is >= its
   cost:
      Σ (actual cost of task i)
      <= Σ (worst-case cost of task i)
 • In amortized analysis, some tasks may be
   charged an amount that is < their cost:
      Σ (actual cost of task i)
      <= Σ (amortized cost of task i)
           Amortized Complexity
 • example:
    – Suppose a sequence of insert/delete operations:
      i1, i2, d1, i3, i4, i5, i6, i7, i8, d2, i9.
      The actual cost of each insert is 1, d1 costs 8, and
      d2 costs 12, so the total actual cost is 9*1 + 8 + 12 = 29.
    – Conventionally, to get an upper bound on the cost of
      the sequence we take the worst case of each operation:
      an insert costs 1 and a delete costs 12. The upper
      bound on the cost of the sequence is therefore
      9*1 + 2*12 = 33.
           Amortized Complexity
    – In an amortization scheme we charge the
      actual cost of some operations to others.
    – The amortized cost of each insert could be 2,
      with d1 and d2 each charged 6. The total is then
      9*2 + 2*6 = 30, which is below the worst-case
      upper bound of 33 and still at least the actual
      cost of 29.
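A sanity check of this charging scheme; the operation sequence and costs are the ones from the example, the bookkeeping code is just illustrative:

```python
ops = ["i1", "i2", "d1", "i3", "i4", "i5", "i6", "i7", "i8", "d2", "i9"]

def total(cost_of_insert, cost_of_d1, cost_of_d2):
    table = {"d1": cost_of_d1, "d2": cost_of_d2}
    # Every operation not listed in `table` is an insert.
    return sum(table.get(op, cost_of_insert) for op in ops)

actual    = total(1, 8, 12)    # 9*1 + 8 + 12  = 29
amortized = total(2, 6, 6)     # 9*2 + 6 + 6   = 30
worst     = total(1, 12, 12)   # 9*1 + 12 + 12 = 33

assert actual <= amortized <= worst
print(actual, amortized, worst)   # 29 30 33
```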
          Amortized Complexity
 • More examples:
    – the amortized complexity of a sequence of
      Union and Find operations (to be discussed
      in Chapter 8)
    – the amortized complexity of a sequence of
      inserts and deletes on binomial heaps

								