
Algorithm Design and Analysis Divide and Conquer (1)


(1)

Algorithm Design and Analysis

Divide and Conquer (1)

(2)

Algorithm Design Strategy

• Focus not on “specific algorithms”

• but on “strategies” for “designing” algorithms

• First skill: Divide-and-Conquer (各個擊破/分治)

(3)

Outline

• Recurrence (遞迴)

• Divide-and-Conquer

• D&C #1: Tower of Hanoi (河內塔)

• D&C #2: Merge Sort

• D&C #3: Bitonic Champion

• D&C #4: Maximum Subarray

• Solving Recurrences

• Substitution Method

• Recursion-Tree Method

• Master Method

• D&C #5: Matrix Multiplication

• D&C #6: Selection Problem

• D&C #7: Closest Pair of Points Problem

The magic of Divide-and-Conquer

Divide-and-Conquer: Part One

(4)

What is Divide-and-Conquer?

• Solve a problem recursively

• Apply three steps at each level of the recursion

1. Divide the problem into a number of subproblems that are smaller instances of the same problem (比較小的同樣問題)

2. Conquer the subproblems by solving them recursively

• If a subproblem is small enough, solve it directly (base case)

• Otherwise, solve it recursively (recursive case)

3. Combine the solutions to the subproblems into the solution for the original problem

(5)

Divide-and-Conquer Benefits

• Easy to solve difficult problems

• Thinking: solve easiest case + combine smaller solutions into the original solution

• Easy to find an efficient algorithm

• Better time complexity

• Suitable for parallel computing (multi-core systems)

• More efficient memory access

• Subproblems and their data can be kept in cache instead of repeatedly accessing main memory

(6)

Recurrence (遞迴)

(7)

Recurrence Relation

• Definition

A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs.

• Example

Fibonacci sequence (費波那契數列)

• Base case: F(0) = F(1) = 1

• Recursive case: F(n) = F(n-1) + F(n-2)

n    0  1  2  3  4  5  6  7  8

F(n) 1  1  2  3  5  8  13 21 34

(8)

Recurrent Neural Network (RNN)

(9)

Recurrence Benefits

• Easy & Clear

• Define base case and recursive case

• Define a long sequence

Base case + recursive case → an unlimited sequence F(0), F(1), F(2), … and a program for solving F(n)

Fibonacci(n)                  // recursive function: a function that calls itself
    if n < 2                  // base case: termination condition (important; otherwise the program cannot stop)
        return 1
    // recursive case: call itself to solve the subproblems
    return Fibonacci(n-1) + Fibonacci(n-2)

(10)

Recurrence vs. Non-Recurrence

Recursive function: clear structure, but poor efficiency (it recomputes the same subproblems)

Fibonacci(n)
    if n < 2                  // base case
        return 1
    // recursive case
    return Fibonacci(n-1) + Fibonacci(n-2)

Non-recursive function: better efficiency, but less clear structure

Fibonacci(n)
    if n < 2
        return 1
    a[0] <- 1
    a[1] <- 1
    for i = 2 … n
        a[i] <- a[i-1] + a[i-2]
    return a[n]
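As a concrete reference (not from the slides), here is a minimal runnable Python sketch of both versions; the names fib_recursive and fib_iterative are illustrative.

def fib_recursive(n):
    """Recursive version: clear structure, but exponential time (recomputes subproblems)."""
    if n < 2:                                   # base case
        return 1
    # recursive case: call itself on smaller inputs
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    """Non-recursive version: O(n) time, filling a table bottom-up."""
    if n < 2:
        return 1
    a = [0] * (n + 1)
    a[0], a[1] = 1, 1
    for i in range(2, n + 1):
        a[i] = a[i - 1] + a[i - 2]
    return a[n]

if __name__ == "__main__":
    print([fib_recursive(i) for i in range(9)])   # [1, 1, 2, 3, 5, 8, 13, 21, 34]
    print(fib_iterative(8))                       # 34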

(11)

Recurrence Benefits

• Easy & Clear

• Define base case and recursive case

• Define a long sequence

Hanoi(n) is not easy to solve directly.

✓ It is easy to solve when n is small → base case

✓ We can find the relation between Hanoi(n) and Hanoi(n-1) → recursive case

If a problem can be reduced to a base case and a recursive case, then we can find an algorithm that solves this problem.

Base case + recursive case → an unlimited sequence F(0), F(1), F(2), … and a program for solving F(n)

(12)

D&C #1: Tower of Hanoi

(13)

Tower of Hanoi (河內塔)

• Problem: move n disks from A to C

• Rules

• Move one disk at a time

• Cannot place a larger disk onto a smaller disk

A B C

(14)

Hanoi(1)

• Move Disk 1 from A to C

→ 1 move in total (base case)

(15)

Hanoi(2)

• Move Disk 1 from A to B

• Move Disk 2 from A to C

• Move Disk 1 from B to C

→ 3 moves in total

(16)

Hanoi(3)

• How to move 3 disks?

• How many moves in total?


(17)

Hanoi(n)

• How to move n disks?

• How many moves in total?


(18)

Hanoi(n)

• To move n disks from A to C (for n > 1):

1. Move Disk 1~n-1 from A to B


(19)

Hanoi(n)

• To move n disks from A to C (for n > 1):

1. Move Disk 1~n-1 from A to B


(20)

Hanoi(n)

• To move n disks from A to C (for n > 1):

1. Move Disks 1~n-1 from A to B

2. Move Disk n from A to C

(21)

Hanoi(n)

• To move n disks from A to C (for n > 1):

1. Move Disks 1~n-1 from A to B

2. Move Disk n from A to C

(22)

Hanoi(n)

• To move n disks from A to C (for n > 1):

1. Move Disks 1~n-1 from A to B

2. Move Disk n from A to C

3. Move Disks 1~n-1 from B to C

(23)

Hanoi(n)

• To move n disks from A to C (for n > 1):

1. Move Disks 1~n-1 from A to B

2. Move Disk n from A to C

3. Move Disks 1~n-1 from B to C

→ 2·Hanoi(n-1) + 1 moves in total (recursive case)

(24)

Pseudocode for Hanoi

Hanoi(n, src, dest, spare)
    if n == 1                          // base case
        Move disk from src to dest
    else                               // recursive case
        Hanoi(n-1, src, spare, dest)
        Move disk from src to dest
        Hanoi(n-1, spare, dest, src)

No need to combine the results in this case

• Call tree of Hanoi(3, A, C, B):

Hanoi(3, A, C, B)
    Hanoi(2, A, B, C)
        Hanoi(1, A, C, B)
        Hanoi(1, C, B, A)
    Hanoi(2, B, C, A)
        Hanoi(1, B, A, C)
        Hanoi(1, A, C, B)
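A minimal runnable Python sketch of this pseudocode (illustrative; it prints each move instead of manipulating real pegs):

def hanoi(n, src, dest, spare):
    """Move n disks from src to dest using spare, printing each move."""
    if n == 1:                                  # base case
        print(f"move disk 1 from {src} to {dest}")
        return
    # recursive case
    hanoi(n - 1, src, spare, dest)              # move the top n-1 disks out of the way
    print(f"move disk {n} from {src} to {dest}")
    hanoi(n - 1, spare, dest, src)              # put them back on top of disk n

hanoi(3, "A", "C", "B")                         # prints the 7 moves of Hanoi(3)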

(25)

Algorithm Time Complexity

• T(n) = number of moves with n disks

• Base case: T(1) = 1

• Recursive case (n > 1): T(n) = 2T(n-1) + 1

• We will learn how to derive T(n) later

Hanoi(n, src, dest, spare)
    if n == 1                          // base case
        Move disk from src to dest
    else                               // recursive case
        Hanoi(n-1, src, spare, dest)
        Move disk from src to dest
        Hanoi(n-1, spare, dest, src)

(26)

Further Questions

• Q1: Is O(2^n) tight for Hanoi? Can T(n) < 2^n - 1?

• Q2: What about more than 3 pegs?

• Q3: Double-color Hanoi problem

• Input: 2 interleaved-color towers

• Output: 2 same-color towers

(27)

D&C #2: Merge Sort

Textbook Chapter 2.3.1 – The divide-and-conquer approach

(28)

Sorting Problem

Input: an unsorted list of size n, e.g., 6 3 5 1 8 7 2 4

Output: the sorted list of size n, e.g., 1 2 3 4 5 6 7 8

What are the base case and recursive case?

(29)

Divide-and-Conquer

• Base case (n = 1)

• Directly output the list

• Recursive case (n > 1)

• Divide the list into two sub-lists

• Sort each sub-list recursively

• Merge the two sorted lists

Example: 2 sorted sublists of size n/2, namely 1 3 5 6 and 2 4 7 8

# of comparisons for merging = Θ(n). How?

(30)

Illustration for n = 10

(Figure: divide phase for 6 3 5 1 8 9 7 2 10 4; the list is split into 6 3 5 1 8 and 9 7 2 10 4, then into 6 3 | 5 1 8 | 9 7 | 2 10 4, and so on down to single elements.)

(31)

Illustration for n = 10

(Figure: full recursion for 6 3 5 1 8 9 7 2 10 4; the list is recursively split down to single elements, then the sorted sublists are merged back level by level into 1 2 3 4 5 6 7 8 9 10.)

(32)

Pseudocode for Merge Sort

1. Divide: divide a list of size n into 2 sublists of size n/2

2. Conquer: sort the 2 sublists recursively using merge sort; in the base case (n = 1), return the list itself

3. Combine: merge the 2 sorted sublists into one sorted list in linear time

MergeSort(A, p, r)
    // base case
    if p == r
        return
    // recursive case
    // divide
    q = floor((p+r-1)/2)
    // conquer
    MergeSort(A, p, q)
    MergeSort(A, q+1, r)
    // combine
    Merge(A, p, q, r)
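A minimal runnable Python sketch of merge sort (not the slides' exact code; it uses 0-based inclusive indices and the usual midpoint floor((p+r)/2)):

def merge_sort(a, p, r):
    """Sort a[p..r] in place (0-based, inclusive indices)."""
    if p >= r:                      # base case: 0 or 1 element
        return
    q = (p + r) // 2                # divide
    merge_sort(a, p, q)             # conquer left half
    merge_sort(a, q + 1, r)         # conquer right half
    merge(a, p, q, r)               # combine

def merge(a, p, q, r):
    """Merge the sorted sublists a[p..q] and a[q+1..r] in Theta(n) time."""
    left, right = a[p:q + 1], a[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            a[k] = left[i]
            i += 1
        else:
            a[k] = right[j]
            j += 1

data = [6, 3, 5, 1, 8, 7, 2, 4]
merge_sort(data, 0, len(data) - 1)
print(data)                         # [1, 2, 3, 4, 5, 6, 7, 8]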

(33)

Time Complexity for Merge Sort

1. Divide: divide a list of size n into 2 sublists of size n/2

2. Conquer: sort the 2 sublists recursively using merge sort; in the base case (n = 1), return the list itself

3. Combine: merge the 2 sorted sublists into one sorted list in linear time

MergeSort(A, p, r)
    // base case
    if p == r
        return
    // recursive case
    // divide
    q = floor((p+r-1)/2)
    // conquer
    MergeSort(A, p, q)
    MergeSort(A, q+1, r)
    // combine
    Merge(A, p, q, r)

T(n) = time for running MergeSort(A, p, r) with r - p + 1 = n

(34)

Time Complexity for Merge Sort

• Simplify recurrences

• Ignore floors and ceilings (boundary conditions)

• Assume base cases are constant (for small n)

Expand the recurrence repeatedly: 1st expansion, 2nd expansion, …, kth expansion; the expansion stops when 2^k = n.
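The equations on this slide did not survive extraction; assuming the merge-sort recurrence T(n) = 2T(n/2) + cn with T(1) = c, the expansion can be reconstructed as:

\begin{align*}
T(n) &= 2\,T(n/2) + cn             && \text{1st expansion}\\
     &= 4\,T(n/4) + 2cn            && \text{2nd expansion}\\
     &\;\;\vdots\\
     &= 2^k\,T(n/2^k) + k\,cn      && \text{$k$th expansion}\\
     &= n\,T(1) + cn\log_2 n       && \text{(stop when } 2^k = n\text{)}\\
     &= \Theta(n\log n)
\end{align*}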

(35)

Theorem 1

• Theorem: if T(n) = 2T(n/2) + Θ(n) with T(1) = Θ(1), then T(n) = Θ(n log n)

• Proof

• There exist positive constants a, b s.t.

• Use induction to prove

• n = 1, trivial

• n > 1,

Inductive hypothesis

(36)

How to Solve Recurrence Relations?

1. Substitution Method (取代法)

• Guess a bound and then prove by induction

2. Recursion-Tree Method (遞迴樹法)

• Expand the recurrence into a tree and sum up the cost

3. Master Method (套公式大法/大師法)

• Apply Master Theorem to a specific form of recurrences

Let’s see more examples first and come back to this later

(37)

D&C #3: Bitonic Champion Problem

(38)

Bitonic Champion Problem

Problem: given a bitonic sequence A[1 … n], find the champion (the maximum element). A bitonic sequence is “increasing before the champion and decreasing after the champion” (冠軍之前遞增、冠軍之後遞減).

3 7 9 17 35 28 21 18 6 4

(39)

Bitonic Champion Problem Complexity

Why not Ω(n)?

Why?

(40)

Bitonic Champion Problem Complexity

• When there are n inputs, any solution has n different outputs

• Any comparison-based algorithm needs Ω(log n) time in the worst case (a decision tree with n possible outputs has height at least log n)

(41)

Bitonic Champion Problem Complexity

(42)

Divide-and-Conquer

• Idea: divide A into two subproblems and then find the final champion based on the champions from two subproblems

Champion(i, j)
    if i == j                      // base case
        return i
    else                           // recursive case
        k = floor((i+j)/2)
        l = Champion(i, k)
        r = Champion(k+1, j)
        if A[l] > A[r]
            return l
        if A[l] < A[r]
            return r

Output = Champion(1, n)
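A minimal runnable Python sketch of Champion (illustrative, using 0-based indices):

def champion(A, i, j):
    """Return the index of the maximum of A[i..j] by divide-and-conquer (0-based)."""
    if i == j:                          # base case
        return i
    k = (i + j) // 2                    # divide
    l = champion(A, i, k)               # conquer left half
    r = champion(A, k + 1, j)           # conquer right half
    return l if A[l] > A[r] else r      # combine: a single comparison

A = [3, 7, 9, 17, 35, 28, 21, 18, 6, 4]
print(champion(A, 0, len(A) - 1))       # 4  (A[4] = 35 is the champion)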

(43)

Illustration for n = 10

(Figure: the array 3 7 9 17 35 28 21 18 6 4 is recursively split into halves down to single elements; the champions of the sublists are then compared pairwise, and 35 emerges as the overall champion.)

(44)

Proof of Correctness

• Practice by yourself!

Champion(i, j)
    if i == j                      // base case
        return i
    else                           // recursive case
        k = floor((i+j)/2)
        l = Champion(i, k)
        r = Champion(k+1, j)
        if A[l] > A[r]
            return l
        if A[l] < A[r]
            return r

Output = Champion(1, n)

Hint: use induction on (j - i) to prove that Champion(i, j) returns the champion of A[i … j]

(45)

Algorithm Time Complexity

• T(n) = time for running Champion(i, j) with j - i + 1 = n

1. Divide: divide a list of size n into 2 sublists of size n/2

2. Conquer: find the champions of the 2 sublists recursively; in the base case, return the element itself

3. Combine: choose the final champion with a single comparison

Champion(i, j)
    if i == j                      // base case
        return i
    else                           // recursive case
        k = floor((i+j)/2)
        l = Champion(i, k)
        r = Champion(k+1, j)
        if A[l] > A[r]
            return l
        if A[l] < A[r]
            return r

(46)

Theorem 2

• Theorem: if T(n) = 2T(n/2) + Θ(1) with T(1) = Θ(1), then T(n) = Θ(n)

• Proof

• There exist positive constants a, b s.t.

• Use induction to prove

• n = 1, trivial

• n > 1,

Inductive hypothesis

(47)

Bitonic Champion Problem Complexity

Can we have a better algorithm by using the bitonic sequence property?

(48)

Improved Algorithm

Champion-2(i, j)
    if i == j                      // base case
        return i
    else                           // recursive case
        k = floor((i+j)/2)
        if A[k] > A[k+1]
            return Champion-2(i, k)
        if A[k] < A[k+1]
            return Champion-2(k+1, j)

Champion(i, j)
    if i == j                      // base case
        return i
    else                           // recursive case
        k = floor((i+j)/2)
        l = Champion(i, k)
        r = Champion(k+1, j)
        if A[l] > A[r]
            return l
        if A[l] < A[r]
            return r
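A minimal runnable Python sketch of the improved algorithm (illustrative, 0-based indices; it assumes the input is bitonic):

def champion2(A, i, j):
    """Return the index of the champion of a bitonic A[i..j] in O(log n) time (0-based)."""
    if i == j:                          # base case
        return i
    k = (i + j) // 2
    if A[k] > A[k + 1]:                 # the champion lies in the left half A[i..k]
        return champion2(A, i, k)
    else:                               # the champion lies in the right half A[k+1..j]
        return champion2(A, k + 1, j)

A = [3, 7, 9, 17, 35, 28, 21, 18, 6, 4]
print(champion2(A, 0, len(A) - 1))      # 4  (A[4] = 35)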

(49)

Illustration for n = 10

3 7 9 17 35 28 21 18 6 4

3 7 9 17 35

17 35

35

(50)

Correctness Proof

• Practice by yourself!

Champion-2(i, j)
    if i == j                      // base case
        return i
    else                           // recursive case
        k = floor((i+j)/2)
        if A[k] > A[k+1]
            return Champion-2(i, k)
        if A[k] < A[k+1]
            return Champion-2(k+1, j)

Output = Champion-2(1, n)

Two crucial observations:

• If A[1 … n] is bitonic, then so is A[i … j] for any indices i and j with 1 ≤ i ≤ j ≤ n.

• For any indices i, j, and k with 1 ≤ i ≤ k < j ≤ n, we have A[k] > A[k+1] if and only if the maximum of A[i … j] lies in A[i … k].

(51)

Algorithm Time Complexity

• T(n) = time for running Champion-2(i, j) with j - i + 1 = n

1. Divide: divide a list of size n into 2 sublists of size n/2

2. Conquer: find the champion of only 1 sublist recursively; in the base case, return the element itself

3. Combine: return that champion directly

Champion-2(i, j)
    if i == j                      // base case
        return i
    else                           // recursive case
        k = floor((i+j)/2)
        if A[k] > A[k+1]
            return Champion-2(i, k)
        if A[k] < A[k+1]
            return Champion-2(k+1, j)

(52)

Algorithm Time Complexity

• T(n) = time for running Champion-2(i, j) with j - i + 1 = n

Champion-2(i, j)
    if i == j                      // base case
        return i
    else                           // recursive case
        k = floor((i+j)/2)
        if A[k] > A[k+1]
            return Champion-2(i, k)
        if A[k] < A[k+1]
            return Champion-2(k+1, j)

The algorithm time complexity is O(log n):

• each recursive call cuts the size (j - i) in half

• so there are O(log n) levels

• each level takes O(1) time

(53)

Theorem 3

• Theorem: if T(n) = T(n/2) + Θ(1) with T(1) = Θ(1), then T(n) = Θ(log n)

• Proof: practice proving it by induction

(54)

Bitonic Champion Problem Complexity

(55)

D&C #4: Maximum Subarray

Textbook Chapter 4.1 – The maximum-subarray problem

(56)

Coding Efficiency

• How can we find the most efficient time interval for continuous coding?

(Figure: hourly coding power, in K, from 5pm to 3am, with values ranging from -4 to 4; the most efficient continuous interval is 7pm-2:59am, with total coding power 8K.)

(57)

Maximum Subarray Problem

Input: an array of n numbers, e.g., -3 7 -9 17 -5 28 -21 18 -6 4

Output: the contiguous subarray with the maximum sum (here 17 -5 28, with sum 40)

(58)

O(n³) Brute-Force Algorithm

MaxSubarray-1(A)
    for i = 1,…,n
        for j = 1,…,n
            S[i][j] = -∞
    for i = 1,…,n
        for j = i, i+1,…,n
            S[i][j] = A[i] + A[i+1] + … + A[j]
    return Champion(S)             // the entry of S with the maximum value

(59)

O(n²) Brute-Force Algorithm

MaxSubarray-2(A)
    for i = 1,…,n
        for j = 1,…,n
            S[i][j] = -∞
    R[0] = 0
    for i = 1,…,n
        R[i] = R[i-1] + A[i]       // R[i] is the prefix sum over A[1…i]
    for i = 1,…,n
        for j = i, i+1,…,n
            S[i][j] = R[j] - R[i-1]   // sum of A[i…j] in O(1)
    return Champion(S)

R[n] is the sum over A[1…n]
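A minimal runnable Python sketch of the O(n²) prefix-sum idea (illustrative; it keeps only the running best instead of the full table S):

def max_subarray_bruteforce(A):
    """O(n^2) brute force using prefix sums R[i] = A[1] + ... + A[i] (1-based view)."""
    n = len(A)
    R = [0] * (n + 1)
    for i in range(1, n + 1):
        R[i] = R[i - 1] + A[i - 1]          # prefix sums
    best = (float("-inf"), 0, 0)
    for i in range(1, n + 1):               # subarray A[i..j], 1-based
        for j in range(i, n + 1):
            s = R[j] - R[i - 1]             # sum of A[i..j] in O(1)
            if s > best[0]:
                best = (s, i, j)
    return best                             # (max sum, start, end), 1-based indices

A = [-3, 7, -9, 17, -5, 28, -21, 18, -6, 4]
print(max_subarray_bruteforce(A))           # (40, 4, 6): 17 - 5 + 28 = 40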

(60)

Max Subarray Problem Complexity

(61)

Divide-and-Conquer

• Base case (n = 1)

• Return itself (maximum subarray)

• Recursive case (n > 1)

• Divide the array into two sub-arrays

• Find the maximum sub-array recursively

• Merge the results

How?

(62)

Where is the Solution?

• The maximum subarray for any input must be in one of following cases:

Case 1: left

Case 2: right

Case 3: cross the middle

Case 1: MaxSub(A, i, j) = MaxSub(A, i, k)

Case 2: MaxSub(A, i, j) = MaxSub(A, k+1, j)

Case 3: MaxSub(A, i, j) cannot be expressed using MaxSub!

(63)

Case 3: Cross the Middle

• Goal: find the maximum subarray that crosses the middle

• Observation

• For the crossing maximum subarray A[x … y] (with x ≤ k < y), the sum of A[x … k] must be the maximum among all A[i′ … k] with i ≤ i′ ≤ k (left part)

• Likewise, the sum of A[k+1 … y] must be the maximum among all A[k+1 … j′] with k < j′ ≤ j (right part)

• Solvable in linear time → Θ(n)

(1) Start from the middle to find the left maximum subarray

(2) Start from the middle to find the right maximum subarray The solution of Case 3 is the combination of (1) and (2)

(64)

Divide-and-Conquer Algorithm

MaxCrossSubarray(A, i, k, j)
    left_sum = -∞
    sum = 0
    for p = k downto i
        sum = sum + A[p]
        if sum > left_sum
            left_sum = sum
            max_left = p
    right_sum = -∞
    sum = 0
    for q = k+1 to j
        sum = sum + A[q]
        if sum > right_sum
            right_sum = sum
            max_right = q
    return (max_left, max_right, left_sum + right_sum)

(65)

Divide-and-Conquer Algorithm

MaxSubarray(A, i, j)
    if i == j                                      // base case
        return (i, j, A[i])
    else                                           // recursive case
        // divide
        k = floor((i + j) / 2)
        // conquer
        (l_low, l_high, l_sum) = MaxSubarray(A, i, k)
        (r_low, r_high, r_sum) = MaxSubarray(A, k+1, j)
        // combine
        (c_low, c_high, c_sum) = MaxCrossSubarray(A, i, k, j)
        if l_sum >= r_sum and l_sum >= c_sum       // case 1
            return (l_low, l_high, l_sum)
        else if r_sum >= l_sum and r_sum >= c_sum  // case 2
            return (r_low, r_high, r_sum)
        else                                       // case 3
            return (c_low, c_high, c_sum)
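A minimal runnable Python sketch of the divide-and-conquer algorithm (illustrative, 0-based indices):

def max_cross_subarray(A, i, k, j):
    """Best subarray crossing the middle: scan left from k, right from k+1."""
    left_sum, s, max_left = float("-inf"), 0, k
    for p in range(k, i - 1, -1):
        s += A[p]
        if s > left_sum:
            left_sum, max_left = s, p
    right_sum, s, max_right = float("-inf"), 0, k + 1
    for q in range(k + 1, j + 1):
        s += A[q]
        if s > right_sum:
            right_sum, max_right = s, q
    return max_left, max_right, left_sum + right_sum

def max_subarray(A, i, j):
    """Return (low, high, sum) of the maximum subarray of A[i..j] (0-based)."""
    if i == j:                                  # base case
        return i, j, A[i]
    k = (i + j) // 2                            # divide
    left = max_subarray(A, i, k)                # conquer
    right = max_subarray(A, k + 1, j)
    cross = max_cross_subarray(A, i, k, j)      # combine
    return max(left, right, cross, key=lambda t: t[2])

A = [-3, 7, -9, 17, -5, 28, -21, 18, -6, 4]
print(max_subarray(A, 0, len(A) - 1))           # (3, 5, 40)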

(66)

Divide-and-Conquer Algorithm

MaxSubarray(A, i, j)
    if i == j                                      // base case
        return (i, j, A[i])
    else                                           // recursive case
        k = floor((i + j) / 2)
        (l_low, l_high, l_sum) = MaxSubarray(A, i, k)
        (r_low, r_high, r_sum) = MaxSubarray(A, k+1, j)
        (c_low, c_high, c_sum) = MaxCrossSubarray(A, i, k, j)
        if l_sum >= r_sum and l_sum >= c_sum       // case 1
            return (l_low, l_high, l_sum)
        else if r_sum >= l_sum and r_sum >= c_sum  // case 2
            return (r_low, r_high, r_sum)
        else                                       // case 3
            return (c_low, c_high, c_sum)

(67)

Algorithm Time Complexity

1. Divide: divide an array of size n into 2 subarrays of size n/2

2. Conquer: find the maximum subarray of each subarray recursively; in the base case (n = 1), return the element itself

3. Combine: find MaxCrossSub for the original array, then pick the one with the maximum sum among the 3 candidate subarrays

T(n) = time for running MaxSubarray(A, i, j) with j - i + 1 = n

(68)

Theorem 1

• Theorem: if T(n) = 2T(n/2) + Θ(n) with T(1) = Θ(1), then T(n) = Θ(n log n)

• Proof

• There exist positive constants a, b s.t.

• Use induction to prove

• n = 1, trivial

• n > 1,

Inductive hypothesis

(69)

Theorem 1 (Simplified)

• Theorem

• Proof

• There exist positive constants a, b s.t.

• Use induction to prove

• n = 1, trivial

• n > 1,

Inductive hypothesis

(70)

Max Subarray Problem Complexity

(71)

Max Subarray Problem Complexity

Exercise 4.1-5 page 75 of textbook

Next topic!

(72)

Solving Recurrences

Textbook Chapter 4.3 – The substitution method for solving recurrences Textbook Chapter 4.4 – The recursion-tree method for solving recurrences Textbook Chapter 4.5 – The master method for solving recurrences

(73)

D&C Algorithm Time Complexity

• T(n): running time for input size n

• D(n): time of Divide for input size n

• C(n): time of Combine for input size n

• a: number of subproblems

• n/b: size of each subproblem

• General form: T(n) = a·T(n/b) + D(n) + C(n) for n above the base-case size, and T(n) = Θ(1) otherwise

(74)

Solving Recurrences

1. Substitution Method (取代法)

• Guess a bound and then prove by induction

2. Recursion-Tree Method (遞迴樹法)

• Expand the recurrence into a tree and sum up the cost

3. Master Method (套公式大法/大師法)

• Apply Master Theorem to a specific form of recurrences

• Useful simplification tricks

• Ignore floors, ceilings, boundary conditions (proof in Ch. 4.6)

• Assume base cases are constant (for small n)

(75)

Substitution Method

Textbook Chapter 4.3 – The substitution method for solving recurrences

(76)

Review

• Time Complexity for Merge Sort

• Theorem

• Proof

• There exist positive constants a, b s.t.

• Use induction to prove

• n = 1, trivial

• n > 1,

Substitution Method (取代法)

guess a bound and then prove by induction

(77)

Substitution Method (取代法)

1. Guess: guess the form of the solution

2. Verify: verify by mathematical induction (數學歸納法)

• Prove it works for n = 1

• Prove that if it works for n = m, then it also works for n = m + 1

→ It then works for all positive integers n

3. Solve: solve for the constants to show that the solution works

• Prove O and Ω separately

(78)

Substitution Method Example

• Proof

There exist positive constants n₀, c s.t. for all n ≥ n₀, the guessed bound holds

• Use induction to find the constants n₀, c

• n = 1: trivial

• n > 1: substitute the inductive hypothesis and check when the bound holds (Guess → Verify → Solve)
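The equations on this slide were lost in extraction; as a sketch of the intended derivation, assume the recurrence T(n) = 2T(n/2) + n and the guess T(n) ≤ c·n·lg n (the upper-bound half of the claim):

\begin{align*}
T(n) &= 2\,T(n/2) + n
      \le 2\,c\,(n/2)\lg(n/2) + n && \text{(inductive hypothesis)}\\
     &= c\,n\lg n - c\,n + n
      \le c\,n\lg n               && \text{(holds whenever } c \ge 1\text{)}
\end{align*}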

(79)

Substitution Method Example

• Proof

There exist positive constants n₀, c s.t. for all n ≥ n₀, the guessed (tighter) bound holds

• Use induction to find the constants n₀, c

• n = 1: trivial

• n > 1: substitute the inductive hypothesis

Tighter upper bound?

The induction does not go through… Was the guess wrong, or was the derivation wrong?

Neither: the guess and the derivation are both fine.

(80)

Substitution Method Example

• Proof

There exist positive constants n₀, c₁, c₂ s.t. for all n ≥ n₀, the strengthened bound holds

• Use induction to find the constants n₀, c₁, c₂

• n = 1: holds for suitable constants

• n > 1: substitute the inductive hypothesis; the bound holds when the constants are chosen appropriately (Guess → Verify → Solve)

Strengthen the inductive hypothesis by subtracting a low-order term

(81)

Useful Tricks

• Guess based on seen recurrences

• Use the recursion-tree method

• From loose bound to tight bound

• Strengthen the inductive hypothesis by subtracting a low-order term

• Change variables

• E.g.,

1. Change variable:

2. Change variable again:

3. Solve recurrence
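The example equations are missing here; a typical change-of-variables example (an assumption, not necessarily the one shown in class) is:

\begin{align*}
T(n) &= 2\,T(\sqrt{n}) + \lg n\\
\text{1. Change variable } m = \lg n:\quad & T(2^m) = 2\,T(2^{m/2}) + m\\
\text{2. Change variable again } S(m) = T(2^m):\quad & S(m) = 2\,S(m/2) + m\\
\text{3. Solve: } & S(m) = O(m \lg m) \;\Rightarrow\; T(n) = O(\lg n \cdot \lg\lg n)
\end{align*}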

(82)

Recursion-Tree Method

Textbook Chapter 4.4 – The recursion-tree method for solving recurrences

(83)

Review

• Time Complexity for Merge Sort

• Theorem

• Proof

1st expansion, 2nd expansion, …, kth expansion

Recursion-Tree Method (遞迴樹法)

Expand the recurrence into a tree and sum up the cost

(84)

Recursion-Tree Method (遞迴樹法)

• Expand a recurrence into a tree

• Sum up the cost of all nodes as a good guess

• Verify the guess as in the substitution method

• Advantages

• Promote intuition

• Generate good guesses for the substitution method

1. Expand   2. Sum up   3. Verify

(85)

Recursion-Tree Example

(86)

Recursion-Tree Example

(87)

Recursion-Tree Example

(88)

Recursion-Tree Example


(89)

Master Theorem

Textbook Chapter 4.5 – The master method for solving recurrences

(90)

Master Theorem

• T(n) = a·T(n/b) + f(n): divide a problem of size n into a subproblems, each of size n/b, where each subproblem is solved recursively in time T(n/b)

• The recurrence should follow this format

• The proof is in Ch. 4.6

(91)

Recursion-Tree for Master Theorem


(92)

Three Cases

• a ≥ 1: the number of subproblems

• b > 1: the factor by which the subproblem size decreases

• f(n): the work to divide/combine subproblems

• Compare f(n) with n^(log_b a)

1. Case 1: f(n) grows polynomially slower than n^(log_b a)

2. Case 2: f(n) and n^(log_b a) grow at similar rates

3. Case 3: f(n) grows polynomially faster than n^(log_b a)
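The formulas on these slides did not survive extraction; the standard statement of the three cases (the CLRS form, which these slides follow) is, for T(n) = aT(n/b) + f(n) with a ≥ 1 and b > 1:

\begin{align*}
\text{Case 1: } & f(n) = O\bigl(n^{\log_b a - \varepsilon}\bigr) \text{ for some } \varepsilon > 0
  \;\Rightarrow\; T(n) = \Theta\bigl(n^{\log_b a}\bigr)\\
\text{Case 2: } & f(n) = \Theta\bigl(n^{\log_b a}\bigr)
  \;\Rightarrow\; T(n) = \Theta\bigl(n^{\log_b a}\lg n\bigr)\\
\text{Case 3: } & f(n) = \Omega\bigl(n^{\log_b a + \varepsilon}\bigr) \text{ for some } \varepsilon > 0,
  \text{ and } a\,f(n/b) \le c\,f(n) \text{ for some } c < 1
  \;\Rightarrow\; T(n) = \Theta\bigl(f(n)\bigr)
\end{align*}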

(93)

Case 1:

Total cost dominated by the leaves


(94)

Case 1:

Total cost dominated by the leaves

(95)


Case 2:

Total cost evenly distributed among levels

(96)

Case 2:

Total cost evenly distributed among levels

(97)

Case 3:

Total cost dominated by root cost


(98)

Case 3:

Total cost dominated by root cost

(99)

Master Theorem

• T(n) = a·T(n/b) + f(n): divide a problem of size n into a subproblems, each of size n/b, where each subproblem is solved recursively in time T(n/b)

• The proof is in Ch. 4.6

(100)

Examples

compare f(n) with n^(log_b a)
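The worked examples on this slide were lost in extraction; some typical instances (chosen for illustration, not necessarily the slide's own examples) are:

\begin{align*}
T(n) &= 4T(n/2) + n:\quad n^{\log_2 4} = n^2 \text{ grows faster than } f(n)=n \;\Rightarrow\; T(n) = \Theta(n^2) \quad \text{(Case 1)}\\
T(n) &= 2T(n/2) + \Theta(n):\quad f(n) = \Theta\bigl(n^{\log_2 2}\bigr) = \Theta(n) \;\Rightarrow\; T(n) = \Theta(n \lg n) \quad \text{(Case 2)}\\
T(n) &= 2T(n/2) + n^2:\quad f(n)=n^2 \text{ grows faster than } n^{\log_2 2}=n \;\Rightarrow\; T(n) = \Theta(n^2) \quad \text{(Case 3)}
\end{align*}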

(101)

Floors and Ceilings

• The master theorem can be extended to recurrences with floors and ceilings

• The proof is in Ch. 4.6

(102)

Theorem 1

• Case 2: T(n) = 2T(n/2) + Θ(n) ⟹ T(n) = Θ(n log n)

(103)

Theorem 2

• Case 1: T(n) = 2T(n/2) + Θ(1) ⟹ T(n) = Θ(n)

(104)

Theorem 3

• Case 2: T(n) = T(n/2) + Θ(1) ⟹ T(n) = Θ(log n)

(105)

To Be Continued…

(106)

Question?

Important announcements will be sent to your @ntu.edu.tw mailbox & posted to the course website

Course Website: http://ada.miulab.tw Email: ada-ta@csie.ntu.edu.tw
