
Introduction to Algorithms 6.046J/18.401J

LECTURE 1

Analysis of Algorithms

• Insertion sort
• Asymptotic analysis
• Merge sort
• Recurrences

Copyright © 2001-5 Erik D. Demaine and Charles E. Leiserson

Prof. Charles E. Leiserson

Course information

1. Staff
2. Distance learning
3. Prerequisites
4. Lectures
5. Recitations
6. Handouts
7. Textbook
8. Course website
9. Extra help
10. Registration
11. Problem sets
12. Describing algorithms
13. Grading policy
14. Collaboration policy



Analysis of algorithms

The theoretical study of computer-program performance and resource usage.

What’s more important than performance?

• modularity
• correctness
• maintainability
• functionality
• robustness
• user-friendliness
• programmer time
• simplicity
• extensibility
• reliability


Why study algorithms and performance?

• Algorithms help us to understand scalability.
• Performance often draws the line between what is feasible and what is impossible.
• Algorithmic mathematics provides a language for talking about program behavior.
• Performance is the currency of computing.
• The lessons of program performance generalize to other computing resources.
• Speed is fun!



The problem of sorting

Input: sequence 〈a_1, a_2, …, a_n〉 of numbers.

Output: permutation 〈a'_1, a'_2, …, a'_n〉 such that a'_1 ≤ a'_2 ≤ … ≤ a'_n.

Example:

Input:  8 2 4 9 3 6
Output: 2 3 4 6 8 9


INSERTION-SORT, in “pseudocode”:

INSERTION-SORT (A, n)    ⊳ A[1 . . n]
  for j ← 2 to n
      do key ← A[j]
         i ← j − 1
         while i > 0 and A[i] > key
             do A[i+1] ← A[i]
                i ← i − 1
         A[i+1] ← key




[Figure: the array A[1 . . n] mid-sort; the prefix A[1 . . j−1] is already sorted, key = A[j], and index i scans left through the sorted prefix.]

Example of insertion sort

Each line shows the array after the next element has been inserted into the sorted prefix:

8 2 4 9 3 6
2 8 4 9 3 6
2 4 8 9 3 6
2 4 8 9 3 6
2 3 4 8 9 6
2 3 4 6 8 9   done
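The pseudocode translates almost line for line into a real program. Below is a minimal Python sketch; the function name and the 0-based indexing are choices made here, since the slides use 1-based arrays A[1 . . n].

def insertion_sort(a):
    # Mirrors the INSERTION-SORT pseudocode above; sorts the list a in place.
    # The slides index A[1 . . n]; Python lists are 0-based, so j runs 1 .. n-1.
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # Shift elements of the sorted prefix that are greater than key
        # one position to the right.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a

print(insertion_sort([8, 2, 4, 9, 3, 6]))   # [2, 3, 4, 6, 8, 9], as in the example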


Running time

• The running time depends on the input: an already sorted sequence is easier to sort.
• Parameterize the running time by the size of the input, since short sequences are easier to sort than long ones.
• Generally, we seek upper bounds on the running time, because everybody likes a guarantee.


Kinds of analyses

Worst-case: (usually)
T(n) = maximum time of algorithm on any input of size n.

Average-case: (sometimes)
T(n) = expected time of algorithm over all inputs of size n.
• Need assumption of statistical distribution of inputs.

Best-case: (bogus)
• Cheat with a slow algorithm that works fast on some input.




Machine-independent time

What is insertion sort’s worst-case time?

• It depends on the speed of our computer:
  • relative speed (on the same machine),
  • absolute speed (on different machines).

BIG IDEA:

• Ignore machine-dependent constants.
• Look at growth of T(n) as n → ∞.

“Asymptotic Analysis”


Θ-notation

Math:

Θ(g(n)) = { f(n) : there exist positive constants c_1, c_2, and n_0 such that 0 ≤ c_1·g(n) ≤ f(n) ≤ c_2·g(n) for all n ≥ n_0 }

Engineering:

• Drop low-order terms; ignore leading constants.
• Example: 3n³ + 90n² − 5n + 6046 = Θ(n³)
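A quick numerical illustration of the engineering rule (a throwaway sketch, not part of the lecture): for f(n) = 3n³ + 90n² − 5n + 6046, the ratio f(n)/n³ settles toward the leading constant 3 as n grows, which is why only the n³ term matters asymptotically.

def f(n):
    # f(n) = 3n^3 + 90n^2 - 5n + 6046, the example above.
    return 3 * n**3 + 90 * n**2 - 5 * n + 6046

for n in (10, 100, 1000, 10000):
    print(n, f(n) / n**3)   # ratios: ~18.0, ~3.9, ~3.09, ~3.009, tending to 3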



Asymptotic performance

When n gets large enough, a Θ(n²) algorithm always beats a Θ(n³) algorithm.


• We shouldn’t ignore asymptotically slower algorithms, however.
• Real-world design situations often call for a careful balancing of engineering objectives.
• Asymptotic analysis is a useful tool to help to structure our thinking.

[Figure: running time T(n) versus input size n; the two curves cross at n_0, beyond which the asymptotically faster algorithm wins.]

Insertion sort analysis

Worst case: Input reverse sorted.

    T(n) = Σ_{j=2}^{n} Θ(j) = Θ(n²)        [arithmetic series]

Average case: All permutations equally likely.

    T(n) = Σ_{j=2}^{n} Θ(j/2) = Θ(n²)

Is insertion sort a fast sorting algorithm?

• Moderately so, for small n.
• Not at all, for large n.
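For completeness, the arithmetic series behind the worst-case bound is the standard identity below, written out in LaTeX (it is referenced but not spelled out on the slide):

\[
\sum_{j=2}^{n} j \;=\; \frac{n(n+1)}{2} - 1 \;=\; \Theta(n^2),
\qquad\text{so}\qquad
T(n) \;=\; \sum_{j=2}^{n} \Theta(j) \;=\; \Theta(n^2).
\]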


Merge sort

MERGE-SORT A[1 . . n]

1. If n = 1, done.
2. Recursively sort A[1 . . ⌈n/2⌉] and A[⌈n/2⌉ + 1 . . n].
3. “Merge” the two sorted lists.

Key subroutine: MERGE


Merging two sorted arrays

The two sorted input arrays in the example are 〈2, 7, 13, 20〉 and 〈1, 9, 11, 12〉. At each step, compare the smallest remaining element of each array and append the smaller one to the output, which grows as

1  2  7  9  11  12  …

Time = Θ(n) to merge a total of n elements (linear time).
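A minimal Python sketch of MERGE and MERGE-SORT, following the structure just described; the function names, list slicing, and return-a-new-list convention are choices made here, not taken from the slides.

def merge(left, right):
    # Merge two already-sorted lists in Theta(n) time, as in the example above:
    # repeatedly compare the smallest remaining element of each list and
    # append the smaller one to the output.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    # One list is exhausted; the rest of the other is already sorted.
    return out + left[i:] + right[j:]

def merge_sort(a):
    # MERGE-SORT: if n = 1, done; otherwise sort each half recursively and merge.
    if len(a) <= 1:
        return a
    mid = (len(a) + 1) // 2   # ceil(n/2), matching A[1 . . ceil(n/2)]
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

print(merge([2, 7, 13, 20], [1, 9, 11, 12]))   # [1, 2, 7, 9, 11, 12, 13, 20]
print(merge_sort([8, 2, 4, 9, 3, 6]))          # [2, 3, 4, 6, 8, 9]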


Analyzing merge sort


The running time T(n) breaks down line by line:

MERGE-SORT A[1 . . n]
1. If n = 1, done.                                             Θ(1)
2. Recursively sort A[1 . . ⌈n/2⌉] and A[⌈n/2⌉ + 1 . . n].     2T(n/2)
3. “Merge” the 2 sorted lists.                                 Θ(n)

Sloppiness: Should be T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.

Recurrence for merge sort

T(n) =  Θ(1)              if n = 1;
        2T(n/2) + Θ(n)    if n > 1.

• We shall usually omit stating the base case when T(n) = Θ(1) for sufficiently small n, but only when it has no effect on the asymptotic solution to the recurrence.
• CLRS and Lecture 2 provide several ways to find a good upper bound on T(n).


Recursion tree

Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.

Expanding the recurrence gives a tree: cn at the root, two children costing cn/2 each, four grandchildren costing cn/4 each, and so on down to Θ(1) leaves.

    cn
    cn/2   cn/2
    cn/4   cn/4   cn/4   cn/4
    ...
    Θ(1)  Θ(1)  ...  Θ(1)

Height h = lg n. Each level sums to cn, and the #leaves = n leaves contribute Θ(n).

Total = Θ(n lg n)
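The recursion-tree answer can be spot-checked numerically. The sketch below evaluates the recurrence directly, with the constant c set to 1 and n restricted to powers of two (both choices made here); the ratio T(n)/(n lg n) levels off at a constant, consistent with Θ(n lg n).

import math

def T(n):
    # T(1) = 1; T(n) = 2*T(n/2) + n for n > 1 (c = 1, n a power of two).
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in (4, 8, 12, 16):
    n = 2 ** k
    print(n, T(n) / (n * math.log2(n)))   # 1.25, 1.125, 1.083..., 1.0625: approaches a constant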

Conclusions

• Θ(n lg n) grows more slowly than Θ(n²).
• Therefore, merge sort asymptotically beats insertion sort in the worst case.
• In practice, merge sort beats insertion sort for n > 30 or so.

• Go test it out for yourself!
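One way to test it out is to time both sorts on random inputs of growing size. The rough benchmark sketch below repeats compact versions of the two routines so it runs on its own; the exact crossover point depends on the machine and the implementation, so treat "n > 30 or so" as the slides' ballpark figure rather than something this snippet guarantees.

import random
import timeit

def insertion_sort(a):
    # Same routine as the insertion-sort sketch above, repeated for self-containment.
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a

def merge_sort(a):
    # Same routine as the merge-sort sketch above, with the merge inlined.
    if len(a) <= 1:
        return a
    mid = (len(a) + 1) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]

for n in (16, 64, 256, 1024):
    data = [random.random() for _ in range(n)]
    t_ins = timeit.timeit(lambda: insertion_sort(data[:]), number=100)
    t_mrg = timeit.timeit(lambda: merge_sort(data[:]), number=100)
    print(f"n={n:5d}  insertion sort: {t_ins:.4f}s  merge sort: {t_mrg:.4f}s")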

