CSCI 3330 Algorithms

Chapter 0: Prologue
   What is an algorithm?
   Why should we study algorithms?
   Example: the Fibonacci sequence
   Algorithm analysis




Algorithm
   Algorithm:
       Webster's Dictionary: in mathematics, any special method of
        solving a certain kind of problem
       CS textbook: a step-by-step procedure for solving a problem
        in a finite amount of time
   In this course, we will study methodologies for algorithm
    design, development, and analysis, with various applications
Why study algorithms?
   Algorithms form the heart of computer science, in both
    problem solving and system development
   "Fabulous hardware, but software sucks!" To develop
    efficient software, one must know algorithms well.
Problem solving
   Problem description/specification: before solving a problem,
    we must clearly understand the problem to be solved
   Problem formulation: appropriately define/model the problem
    to be solved
   Algorithm design: a step-by-step approach to solving the
    problem
   Verification and evaluation: does the designed algorithm
    solve the problem? How good is it?
Example 1: Revisit the Fibonacci sequence
   Fibonacci sequence: 0, 1, 1, 2, 3, 5, 8, …
   F(0) = 0, F(1) = 1, and for n > 1, F(n) = F(n-2) + F(n-1)
   The definition directly implies a recursive approach to
    solving the problem
      function Fib1(n)
      if n = 0: return 0
      if n = 1: return 1
      return Fib1(n-1) + Fib1(n-2)
Questions:
   Does it work correctly?
       Yes – it directly implements the definition of
        Fibonacci numbers.
   How long does it take?
       This is not so obvious…




   Running time analysis
          function Fib1(n)
          if n = 0: return 0
          if n = 1: return 1
          return Fib1(n-1) + Fib1(n-2)
Let T(n) = number of steps needed to compute F(n). Then:
  T(n) > T(n-1) + T(n-2)
But F(n) = F(n-1) + F(n-2). Therefore T(n) > F(n) ≈ 2^(0.694n)!
Exponential time... how bad is this?

E.g., computing F(200) needs about 2^140 operations.
How long does this take on a fast computer?
Why is exponential time bad?
   The Earth Simulator needs 2^95 seconds for F(200).

     Time in seconds   Interpretation
     2^10              17 minutes
     2^20              12 days
     2^30              32 years
     2^40              cave paintings
     2^45              homo erectus discovers fire
     2^51              extinction of dinosaurs
     2^57              creation of Earth
     2^60              origin of universe
   Postmortem
What takes so long? Let's unravel the recursion...

                              F(n)

                  F(n-1)                F(n-2)

            F(n-2)    F(n-3)      F(n-3)    F(n-4)

       F(n-3) F(n-4) F(n-4) F(n-5) F(n-4) F(n-5) F(n-5) F(n-6)

The same subproblems get solved over and over again!
      A better algorithm
     Subproblems: F(0), F(1), …, F(n). Solve them in order and save their values!
          function Fib2(n)
          if n = 0: return 0
          create an array fib[0..n]
          fib[0] = 0
          fib[1] = 1
          for i = 2 to n:
              fib[i] = fib[i-1] + fib[i-2]
          return fib[n]

   Does it return the correct answer?
   How fast is it?
       Number of operations is proportional to n. [Previous method: 2^(0.7n)]
       F(200) is now reasonable to compute, as are F(2000) and F(20000).
       Moral: the right algorithm makes all the difference.
Polynomial vs. exponential
   Running times like
       n, n^2, n^3, … are polynomial
   Running times like
       2^n, e^n, … are exponential
   To an excellent first approximation:
       polynomial is reasonable
       exponential is not reasonable
   This is one of the most fundamental dichotomies
    in the analysis of algorithms.
Analyzing algorithms
   Count operations, not time
       An operation is a "small step"
       e.g., a single program statement; an arithmetic
        operation; an assignment to a variable; etc.
   The number of operations depends on the input
       "the larger the number N, the larger the number
        of operations"
Big-O Notation
   Captures the magnitude of the number of operations
   Less precise than the exact number
   More useful for comparing two algorithms
    as the input grows larger
   Rough idea: the term in the formula which
    grows most quickly
Big-O Notation
   Quadratic Time
       largest term no more than c·n^2
       "big-O of n-squared": O(n^2)
       doubling the input increases the number of
        operations approximately 4 times or less
Big-O Notation
   Linear Time
       largest term no more than c·n
       "big-O of n": O(n)
       doubling the input increases the number of
        operations approximately 2 times or less
Big-O Notation
   Logarithmic Time
       largest term no more than c·log(n)
       "big-O of log n": O(log n)
       doubling the input increases the running time
        by a fixed number of operations
Big-Oh examples
   7n-2 is O(?)
   3n3 + 20n2 + 5 is O(?)
   3log n + log log n is O(?)




Relatives of Big-Oh
   big-Omega
       f(n) is Ω(g(n)) if there is a constant c > 0 and an integer
        constant n0 ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n0
   big-Theta
       f(n) is Θ(g(n)) if there are constants c' > 0 and c'' > 0 and an
        integer constant n0 ≥ 1 such that c'·g(n) ≤ f(n) ≤ c''·g(n)
        for n ≥ n0
   little-oh
       f(n) is o(g(n)) if, for any constant c > 0, there is an integer
        constant n0 ≥ 0 such that f(n) < c·g(n) for n ≥ n0
   little-omega
       f(n) is ω(g(n)) if, for any constant c > 0, there is an integer
        constant n0 ≥ 0 such that f(n) > c·g(n) for n ≥ n0
General Rules For Running Time Calculation
   Rule One: Loops
    The running time of a loop is at most the running time of the
    statements inside the loop, multiplied by the number of iterations.
     Example:
       for (i = 0; i < n; i++)           // n iterations
          A[i] = (1-t)*X[i] + t*Y[i];    // constant-time body

     The total running time is O(n).
   Rule Two: Nested Loops
    The running time of a nested loop is at most the running time
    of the statements inside the innermost loop, multiplied by the
    product of the numbers of iterations of all of the loops.
    Example:
       for (i = 0; i < n; i++)           // n iterations
          for (j = 0; j < n; j++)        // n iterations
             C[i][j] = j*A[i] + i*B[j];  // constant-time body

    Total running time: O(n^2).
   Rule Three: Consecutive Statements
    The running time of a sequence of statements is the sum of
    the running times of the individual statements.
    Example:
       for (i = 0; i < n; i++)
       {
          A[i] = (1-t)*X[i] + t*Y[i];
          B[i] = (1-s)*X[i] + s*Y[i];
       }
       for (i = 0; i < n; i++)
          for (j = 0; j < n; j++)
             C[i][j] = j*A[i] + i*B[j];

    Total running time: O(n) + O(n^2) = O(n^2).
   Rule Four: Conditional Statements
    The running time of an if-else statement is at most the
    running time of the conditional test, plus the maximum of
    the running times of the if and else blocks of statements.
    Example:
       if (amt > cost + tax)
       {
          count = 0;
          while ((count < n) && (amt > cost + tax))
          {
             amt -= (cost + tax);
             count++;
          }
          cout << "CAPACITY: " << count;
       }
       else
          cout << "INSUFFICIENT FUNDS";

    Total running time: O(n).

				