**Lower Bounds**

### • **Best-, Average-, and Worst-Case Time Complexity**

**Ex.** Insertion sort of x_1, x_2, …, x_n.

For i = 2, 3, …, n, insert x_i into x_1, x_2, …, x_(i−1) such that these i data items are sorted.

input : 7, 5, 1, 4, 3, 2, 6
i = 2 : 5, 7
i = 3 : 1, 5, 7
i = 4 : 1, 4, 5, 7
i = 5 : 1, 3, 4, 5, 7
i = 6 : 1, 2, 3, 4, 5, 7
i = 7 : 1, 2, 3, 4, 5, 6, 7

T(n) : the number of comparisons made

best case : T(n) = O(n)
worst case : T(n) = O(n^2)
average case : T(n) = O(n^2)

Consider the insertion of x_i.

P(k) : the probability of making k comparisons

⇒ P(1) = P(2) = … = P(i−2) = 1/i, and P(i−1) = 2/i

⇒ the average number of comparisons for x_i
= (1 + 2 + … + (i−2)) × (1/i) + (i−1) × (2/i)
= (i+1)/2 − 1/i

The average number of comparisons for x_2, x_3, …, x_n is equal to

∑_(i=2..n) ((i+1)/2 − 1/i) = O(n^2).
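The average just derived can be checked by brute force over all permutations of a small input. The following Python sketch (the function name and the choice n = 4 are illustrative, not from the source) counts comparisons exactly as in the model above, where the scan stops at the first element not greater than x_i:

```python
import itertools
import math
from fractions import Fraction

def insertion_sort_comparisons(seq):
    """Straight insertion sort of a copy of seq; returns (sorted list, #comparisons)."""
    a = list(seq)
    comps = 0
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0:
            comps += 1              # compare x with a[j]
            if a[j] > x:
                a[j + 1] = a[j]     # shift a[j] right and keep scanning
                j -= 1
            else:
                break
        a[j + 1] = x
    return a, comps

n = 4
total = sum(insertion_sort_comparisons(p)[1]
            for p in itertools.permutations(range(n)))
avg = Fraction(total, math.factorial(n))
# the derived average: sum over i = 2..n of ((i+1)/2 - 1/i)
predicted = sum(Fraction(i + 1, 2) - Fraction(1, i) for i in range(2, n + 1))
print(avg, predicted)   # 59/12 59/12
```

The already-sorted input gives n−1 comparisons (best case) and the reversed input gives n(n−1)/2 (worst case), in agreement with the bounds above.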

**Ex.** Binary search of a_1, a_2, …, a_n.

Assume n = 2^k − 1.

T(n) : the number of comparisons made

best case : T(n) = O(1)
worst case : T(n) = O(log n)
average case : T(n) = O(log n)

P(i) : the probability of making i comparisons for a successful search

⇒ P(i) = 2^(i−1)/n, for i = 1, 2, …, k

The average number of comparisons for a successful search is equal to

∑_(i=1..k) i × 2^(i−1) × (1/n) = (1/n) × (2^k (k−1) + 1) = O(log n).

(∑_(i=1..k) i × 2^(i−1) = 2^k (k−1) + 1 can be proved by induction on k)

There are k = O(log n) comparisons for each unsuccessful search.
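The formula can be verified by simulation. In the sketch below (illustrative, not from the source), each loop iteration is counted as one three-way comparison of x with the middle element, matching the comparison model above:

```python
from fractions import Fraction

def binary_search_comparisons(a, x):
    """Three-way binary search; each loop iteration counts as one comparison."""
    lo, hi, comps = 0, len(a) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comps += 1                  # one three-way comparison of x with a[mid]
        if a[mid] == x:
            return comps            # successful search
        if a[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return comps                    # unsuccessful search

k = 3
n = 2 ** k - 1                      # n = 7
a = list(range(1, n + 1))
avg = Fraction(sum(binary_search_comparisons(a, x) for x in a), n)
predicted = Fraction(2 ** k * (k - 1) + 1, n)   # (1/n)(2^k (k-1) + 1)
print(avg, predicted)               # 17/7 17/7
```

Every unsuccessful search on this instance makes exactly k = 3 comparisons, as stated.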

**Exercise 1.** Analyze the time complexity of quick sort in the best, average, and worst cases. (refer to page 32 of the textbook)

### • **Lower Bound for a Problem**

A problem has a lower bound of Ω(g(n)).

⇒ Any algorithm that can solve it takes Ω(g(n)) time.

For example, sorting n data items requires Ω(n log n) time.

Unless stated otherwise, “lower bound” means “lower bound in the worst case”.

For a problem, if the time complexity of an algorithm matches a lower bound, then the algorithm is time-optimal and the lower bound is tight.

Otherwise (if the lower bound is lower than the time complexity of the algorithm), the lower bound or the algorithm can be improved.

### • **Lower Bound by Comparison Tree**

The method of comparison trees is applicable to comparison-based algorithms, which make comparisons among input data items.

Most sorting (excluding radix sort and bucket sort), searching, selection, and merging algorithms are comparison-based.

The execution of a comparison-based algorithm can be described by a comparison tree, and the tree depth is the greatest number of comparisons, i.e., the worst-case time complexity.

⇒ The minimal tree depth over all possible comparison trees is a lower bound.

**Ex.** Sequential search and binary search of A(1), A(2), …, A(n) for x.

(Figure: the comparison tree of sequential search compares x with A(1), A(2), …, A(n) in turn and has depth n; the comparison tree of binary search compares x with A((n+1)/2) first, then with A((n+1)/4) or A(3(n+1)/4), and so on, and is balanced with depth about log n. Every unsuccessful outcome is a “Failure” leaf.)

⇒ Searching has a lower bound of Ω(log n).

Since binary search takes O(log n) time, the lower bound is tight.

**Ex.** Sorting a_1, a_2, …, a_n.

Straight insertion sort of a_1, a_2, a_3 :

Sorting 3, 1, 2 :
3 → 3, 1 → 1, 3 → 1, 3, 2 → 1, 2, 3
(comparisons: (a_1 : a_2), (a_2 : a_3), (a_1 : a_2))

Sorting 2, 1, 3 :
2 → 2, 1 → 1, 2 → 1, 2, 3
(comparisons: (a_1 : a_2), (a_2 : a_3))

Bubble sort of a_1, a_2, a_3 :

Sorting 3, 1, 2 :
3, 1, 2 → 1, 3, 2 → 1, 2, 3
(comparisons: (a_1 : a_2), (a_2 : a_3), (a_1 : a_2))

Sorting 2, 1, 3 :
2, 1, 3 → 1, 2, 3
(comparisons: (a_1 : a_2), (a_2 : a_3))

one-to-one correspondence :
n! leaf nodes ↔ n! possible input sequences

When the comparison tree is balanced, the tree depth is ⌈log n!⌉ = Ω(n log n) (refer to page 47 of the textbook), which is minimum.

Heap sort takes O(n log n) time.

⇒ Ω(n log n) is tight for sorting.
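For concreteness, the information-theoretic bound ⌈log n!⌉ can be computed exactly for small n and compared with, say, insertion sort's worst case n(n−1)/2 (a sketch; the helper name is illustrative):

```python
import math

def sorting_lower_bound(n):
    """ceil(log2(n!)): minimum worst-case comparisons for any comparison sort."""
    # (m - 1).bit_length() equals ceil(log2 m) exactly for integers m >= 1
    return (math.factorial(n) - 1).bit_length()

for n in range(2, 8):
    print(n, sorting_lower_bound(n), n * (n - 1) // 2)
```

For n = 5 the bound is ⌈log 120⌉ = 7 comparisons, while insertion sort may use 10; the gap grows as n increases, since n(n−1)/2 = Θ(n^2) but ⌈log n!⌉ = Θ(n log n).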

The worst-case time complexity of sorting was considered above. In what follows, the average-case time complexity of sorting is considered.

The average time complexity of a sorting algorithm can be estimated as L/n!, where L is the total length of the paths in the comparison tree from the root to each of the leaf nodes.

Let L_min be the minimal L of all comparison trees.

⇒ L_min/n! is an average-case lower bound for sorting.

L_min occurs when the comparison tree is balanced.

For example, the left tree has L = 13, and the right tree has L = 12 (= L_min for n = 9).

Suppose that T is a balanced comparison tree with n! leaf nodes.

Let N = n! + (n! − 1) = 2(n!) − 1.

⇒ T is of height h = ⌊log N⌋.

Assume that there are x_1 leaf nodes of depth h−1 and x_2 leaf nodes of depth h.

⇒ x_1 + x_2 = n!  (A)
x_1 + (1/2)x_2 = 2^(h−1)  (x_2 is even)  (B)

(A), (B) ⇒ x_1 = 2^h − n!, x_2 = 2(n! − 2^(h−1))

⇒ L_min = (2^h − n!)(h−1) + 2(n! − 2^(h−1))h = (h+1)n! − 2^h

Since log N − 1 < h ≤ log N, we have

L_min > (log N)n! − 2^(log N) = (log N − 2)n! + 1.

For example, n = 3, N = 11, h = 3, x_1 = 2, and x_2 = 4.

⇒ x_1 + (1/2)x_2 = 2^2.
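The closed form (h+1)n! − 2^h can be cross-checked against a brute-force computation of the minimal total root-to-leaf path length over all binary trees with a given number of leaves. This is a sketch with illustrative names; the argument is the number of leaves (e.g. n! = 6 for n = 3):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def min_epl(m):
    """Minimal total root-to-leaf path length (L_min) of a binary tree with m leaves."""
    if m == 1:
        return 0
    # split the m leaves between the two subtrees; every leaf gets one step deeper
    return m + min(min_epl(l) + min_epl(m - l) for l in range(1, m))

def formula(m):
    """Closed form (h+1)*m - 2^h with h = ceil(log2 m), attained by the balanced tree."""
    h = (m - 1).bit_length()        # exact ceil(log2 m)
    return (h + 1) * m - 2 ** h

print(min_epl(6), formula(6))       # 16 16  (n = 3: n! = 6 leaves)
```

For 6 leaves the balanced tree has 2 leaves at depth 2 and 4 at depth 3, so L_min = 2·2 + 4·3 = 16, agreeing with (h+1)·6 − 2^h = 4·6 − 8 for h = 3.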

Quick sort takes O(n log n) time in the average case.

⇒ Ω(n log n) is the tight average-case lower bound for sorting.

Since heap sort takes O(n log n) time in the worst case, it also takes O(n log n) time in the average case.

**Ex.** Selection from n data items.

Let L(n) denote a lower bound for selecting the greatest data item from a_1, a_2, …, a_n.

Any comparison tree has leaf nodes labeled with a_1, a_2, …, a_n, and each root-to-a_i path represents a process to recognize that a_i is the greatest element.

Since at least n−1 comparisons are required to find the greatest data item, each root-to-leaf path has length ≥ n−1.

⇒ L(n) = n − 1
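The matching upper bound is the obvious scan, which always makes exactly n − 1 comparisons (an illustrative sketch, not from the source):

```python
def find_max(a):
    """Scan for the maximum, counting comparisons; always uses exactly n - 1."""
    best, comps = a[0], 0
    for x in a[1:]:
        comps += 1          # compare x with the current maximum
        if x > best:
            best = x
    return best, comps

print(find_max([3, 1, 4, 1, 5, 9, 2, 6]))   # (9, 7)
```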

**Exercise 2.** Let L_k(n) denote a lower bound for selecting the k greatest data items from a_1, a_2, …, a_n. Prove that for 2 ≤ k ≤ n, L_k(n) ≥ n − k + ⌈log n(n−1)…(n−k+2)⌉.

**Ex.** An n-player tournament.

An 8-player tennis tournament:

(Figure: a single-elimination bracket of players A–H; C wins the tournament.)

C is the best player.

Consider each match a comparison.

⇒ finding the best player is equivalent to finding the greatest data item.

According to Exercise 2, finding the first two best players requires at least n − 2 + ⌈log n⌉ matches.

There is an approach to finding the first two best players with exactly n − 2 + ⌈log n⌉ matches.

Consider the 8-player tennis tournament again.

The best player can be found with 7 (= n − 1) matches.

The second best player is one of 3 (= ⌈log n⌉) candidates, i.e., D, A, and H.

⇒ The second best player can be found with ⌈log n⌉ − 1 matches.

Therefore, n − 2 + ⌈log n⌉ is a tight lower bound for finding the first two best players.
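The n − 2 + ⌈log n⌉ approach can be sketched as follows (illustrative names, not from the source): run the knockout tournament while recording who lost directly to the eventual winner, then find the maximum of those ⌈log n⌉ candidates with ⌈log n⌉ − 1 further matches.

```python
def tournament(players):
    """Knockout tournament: returns (best, second best, #matches)."""
    matches = 0
    # each entry: (value, list of values this value has beaten directly)
    round_ = [(p, []) for p in players]
    while len(round_) > 1:
        nxt = []
        for i in range(0, len(round_) - 1, 2):
            (a, la), (b, lb) = round_[i], round_[i + 1]
            matches += 1
            if a > b:
                nxt.append((a, la + [b]))
            else:
                nxt.append((b, lb + [a]))
        if len(round_) % 2 == 1:        # odd player out gets a bye
            nxt.append(round_[-1])
        round_ = nxt
    winner, beaten = round_[0]
    # the second best must have lost directly to the winner
    second = beaten[0]
    for x in beaten[1:]:
        matches += 1
        if x > second:
            second = x
    return winner, second, matches

print(tournament([70, 30, 90, 20, 10, 50, 80, 60]))   # (90, 80, 9)
```

For n = 8 this uses 7 matches for the bracket plus 2 among the 3 candidates, i.e. 9 = n − 2 + ⌈log n⌉ in total.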

### • **Lower Bound by a Particular Problem Instance**

**Ex.** Merging two sorted sequences a_1 ≤ a_2 ≤ … ≤ a_n and b_1 ≤ b_2 ≤ … ≤ b_n.

Consider a problem instance with a_1 < b_1 < a_2 < b_2 < … < a_n < b_n.

When a_1 < b_1 < a_2 < b_2 < … < a_i < b_i is obtained, b_(i+1) must be compared with a_(i+1) and a_(i+2) before it is placed properly.

⇒ a lower bound of 2n − 1 comparisons

The merging algorithm, which repeatedly compares the two currently smallest elements of the two sorted lists and outputs the smaller one, performs 2n − 1 comparisons.

⇒ the lower bound is tight
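A sketch of the merging algorithm on the worst-case instance above (names and the sample values are illustrative):

```python
def merge_count(a, b):
    """Standard two-way merge of sorted lists; counts element comparisons."""
    out, i, j, comps = [], 0, 0, 0
    while i < len(a) and j < len(b):
        comps += 1
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])       # one list is exhausted; append the rest
    out.extend(b[j:])
    return out, comps

# the interleaved instance a_1 < b_1 < a_2 < b_2 < ... < a_n < b_n
n = 5
a = [2 * i for i in range(1, n + 1)]        # 2, 4, 6, 8, 10
b = [2 * i + 1 for i in range(1, n + 1)]    # 3, 5, 7, 9, 11
merged, comps = merge_count(a, b)
print(comps)                                # 2n - 1 = 9
```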

**Ex.** Fault diameter of the hypercube.

H_n : the n-dimensional hypercube.

(Figure: H_1, H_2, and H_3, the 1-, 2-, and 3-dimensional hypercubes, with nodes labeled by binary strings.)

D_(n−1) : the (n−1)-fault diameter of H_n, which is the maximal diameter of H_n with n−1 edges removed.

Consider the following problem instance.

(Figure: H_n with n−1 edges removed so that a shortest path from 0^n to 01^(n−1) is forced to take a detour.)

The distance between 0^n and 01^(n−1) is n+1.

⇒ D_(n−1) ≥ n + 1

Between any two distinct nodes of H_n, there are n node-disjoint paths whose maximal length is at most n+1.

⇒ D_(n−1) ≤ n + 1

Therefore, D_(n−1) = n + 1.

There is an advanced technique, named oracles, for deriving lower bounds.

In fact, an oracle (e.g., a fortune-telling lot, 籤詩) can be considered a scenario for a particular problem instance.

You are encouraged to read Sec. 10.2.4 of Ref. (2) (or L. Hyafil, “Bounds for Selection,” SIAM J. Comput., vol. 5, no. 1, 1976, pp. 109–114), where an example of selection is illustrated.

### • **Lower Bound by State Transition**

**Ex.** Finding the maximum and minimum of a_1, a_2, …, a_n.

Let (k, k^(+), k^(−), k^(±)) be a state, where

k : the number of a_i’s that have not been compared yet;
k^(+) : the number of a_i’s that have won but never lost;
k^(−) : the number of a_i’s that have lost but never won;
k^(±) : the number of a_i’s that have both won and lost.

The problem is equivalent to the state transition from (n, 0, 0, 0) to (0, 1, 1, n−2).

Each comparison induces a state transition from (k, k^(+), k^(−), k^(±)) to one of the following states:

(1) (k−2, k^(+)+1, k^(−)+1, k^(±));
(2) (k−1, k^(+), k^(−)+1, k^(±)) or (k−1, k^(+)+1, k^(−), k^(±)) or (k−1, k^(+), k^(−), k^(±)+1);
(3) (k, k^(+)−1, k^(−), k^(±)+1);
(4) (k, k^(+), k^(−)−1, k^(±)+1),

where

(1) occurs when k ≥ 2 and two from k are compared;
(2) occurs when k ≥ 1 and one from k is compared with one from k^(+) or k^(−);
(3) occurs when k^(+) ≥ 2 and two from k^(+) are compared;
(4) occurs when k^(−) ≥ 2 and two from k^(−) are compared.

Since the elements of k^(±) come from k^(+) or k^(−), not from k, the quickest way from (n, 0, 0, 0) to (0, 1, 1, n−2) is as follows.

**Case 1.** n = 2p.

(n, 0, 0, 0) →_p (0, p, p, 0) →_(2p−2) (0, 1, 1, 2p−2).

(“→_p” means p state transitions.)

There are 3p − 2 = (3n/2) − 2 state transitions.

**Case 2.** n = 2p+1.

(n, 0, 0, 0) →_p (1, p, p, 0) →_1 (0, p, p, 1) →_(2p−2) (0, 1, 1, 2p−1).

There are 3p − 1 = (3n/2) − 5/2 state transitions.

There is an algorithm that can find the maximum and minimum of n data items with ⌈3n/2⌉ − 2 comparisons (refer to Sec. 3.3 of Ref. (2)).

When n = 2p, (3n/2) − 2 = ⌈3n/2⌉ − 2.
When n = 2p+1, (3n/2) − 5/2 < ⌈3n/2⌉ − 2.
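The ⌈3n/2⌉ − 2 algorithm processes the items in pairs: one comparison inside the pair, then the pair's winner against the current maximum and its loser against the current minimum, i.e., three comparisons per pair. A sketch with illustrative names:

```python
def max_min(a):
    """Find (max, min) with ceil(3n/2) - 2 comparisons by processing pairs."""
    n = len(a)
    comps = 0
    if n % 2 == 0:
        comps += 1                              # seed max/min from the first pair
        mx, mn = (a[0], a[1]) if a[0] > a[1] else (a[1], a[0])
        i = 2
    else:
        mx = mn = a[0]                          # odd n: seed from a single item
        i = 1
    while i < n:
        x, y = a[i], a[i + 1]
        comps += 1                              # compare within the pair
        big, small = (x, y) if x > y else (y, x)
        comps += 1                              # winner vs current maximum
        if big > mx:
            mx = big
        comps += 1                              # loser vs current minimum
        if small < mn:
            mn = small
        i += 2
    return mx, mn, comps

print(max_min([4, 9, 1, 7, 3, 8]))      # (9, 1, 7): 3*6/2 - 2 = 7 comparisons
print(max_min([4, 9, 1, 7, 3, 8, 2]))   # (9, 1, 9): ceil(21/2) - 2 = 9 comparisons
```

This matches the state-transition lower bound exactly for even n, and is one comparison above it for odd n.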

### • **Lower Bound by Reduction**

A problem P_1 reduces to another problem P_2, denoted by P_1 ∝ P_2, if any instance of P_1 can be transformed into an instance of P_2 such that the solution for P_1 can be obtained from the solution for P_2.

T_∝ : the reduction time.
T : the time required to obtain the solution for P_1 from the solution for P_2.

**Ex.** the problem of selection ∝ the problem of sorting.

T_∝ : O(1). T : O(1).

**Ex.** Suppose that S_1 and S_2 are two sets of n elements and m elements, respectively.

P_1 : the problem of determining if S_1 ∩ S_2 = ∅.
P_2 : the problem of sorting.

T_∝ : O(n+m). T : O(n+m).

S_1 = {a_1, a_2, …, a_n}, S_2 = {b_1, b_2, …, b_m} : an arbitrary instance of P_1.

(a_1, 1), (a_2, 1), …, (a_n, 1), (b_1, 2), (b_2, 2), …, (b_m, 2) : an instance of P_2 created from P_1.

⇒ S_1 ∩ S_2 ≠ ∅ iff the sorted sequence contains two successive elements (a_i, 1) and (b_j, 2) with a_i = b_j.
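The reduction can be sketched directly (illustrative names; equal values sort with tag 1 before tag 2, so a common element yields adjacent (a_i, 1), (b_j, 2)):

```python
def disjoint(S1, S2):
    """Decide S1 ∩ S2 = ∅ by sorting tagged elements and scanning adjacent pairs."""
    tagged = sorted([(a, 1) for a in S1] + [(b, 2) for b in S2])
    for (v1, t1), (v2, t2) in zip(tagged, tagged[1:]):
        if v1 == v2 and t1 != t2:   # a common element of S1 and S2
            return False
    return True

print(disjoint({1, 3, 5}, {2, 4, 6}))   # True
print(disjoint({1, 3, 5}, {5, 7}))      # False
```

Beyond the O(n+m) tagging and scanning, the work is one call to sorting, as the reduction requires.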

L_1 : a lower bound of P_1.
L_2 : a lower bound of P_2.

⇒ L_1 ≤ T_∝ + L_2 + T

When L_1, T_∝, and T are known and T_∝ and T are negligible relative to L_1, we have L_2 = Ω(L_1), i.e., L_1 is also a lower bound of P_2.

**Ex.** P_1 : the sorting problem.
P_2 : the convex hull problem.

T_∝ : O(n). T : O(n).

x_1, x_2, …, x_n : an arbitrary instance of P_1.

(x_1, x_1^2), (x_2, x_2^2), …, (x_n, x_n^2) : an instance of P_2 created from P_1.

(Figure: the points (x_i, x_i^2) lie on the parabola y = x^2, so all of them are vertices of the convex hull, which lists them in sorted order of x.)

The sorting problem requires Ω(n log n) time.

⇒ The convex hull problem requires Ω(n log n) time.

There are O(n log n)-time algorithms for the convex hull problem.

⇒ Ω(n log n) is tight for the convex hull problem.
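The reduction can be demonstrated end to end with any convex hull routine that does not itself sort. The sketch below uses gift wrapping (Jarvis march), which visits hull vertices in counterclockwise order from the leftmost point, so for points on the parabola it emits them in increasing x (illustrative names; assumes at least three points with distinct x-values):

```python
def cross(o, a, b):
    """> 0 iff o -> a -> b is a counterclockwise turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def jarvis_hull(pts):
    """Gift-wrapping convex hull; vertices in CCW order from the leftmost point."""
    start = min(pts)
    hull, cur = [start], start
    while True:
        nxt = pts[0] if pts[0] != cur else pts[1]
        for p in pts:
            if p != cur and cross(cur, nxt, p) < 0:
                nxt = p                 # p lies clockwise of cur -> nxt: wrap further
        if nxt == start:
            break
        hull.append(nxt)
        cur = nxt
    return hull

xs = [3.0, 1.0, 4.0, 1.5, 9.0, 2.6]
pts = [(x, x * x) for x in xs]          # lift onto the parabola y = x^2
hull = jarvis_hull(pts)
print([p[0] for p in hull])             # sorted(xs): every point is a hull vertex
```

(Jarvis march itself takes O(n^2) time here; it is used only to show that the sorted order falls out of the hull, not as the O(n log n) algorithm.)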

**Ex.** P_1 : the sorting problem.
P_2 : the Euclidean minimum spanning tree (E-MST) problem.

T_∝ : O(n). T : O(n).

x_1, x_2, …, x_n : an arbitrary instance of P_1.

(x_1, 0), (x_2, 0), …, (x_n, 0) : an instance of P_2 created from P_1.

(Figure: the points (x_1, 0), (x_2, 0), …, (x_n, 0) on the x-axis; the E-MST connects consecutive points in sorted order.)

The E-MST problem requires Ω(n log n) time.

The E-MST problem can be solved in O(n log n) time.

⇒ Ω(n log n) is tight for the E-MST problem.

**Exercise 3.** For the example of page 25, prove that P_1 has a lower bound of Ω(n log n) by showing P_2 ∝ P_1. (Refer to Sec. 10.3.2 on page 475 of Ref. (2).)

Given n data items at intervals of one time step, the on-line median finding problem is to compute the median of the first i data items at the end of the i-th time step, where 1 ≤ i ≤ n.

For example, if the input sequence is (7, 15, 3, 17, 8, 11, 5), then the output sequence is (7, 7 or 15, 7, 7 or 15, 8, 8 or 11, 8).
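For intuition (this is not the reduction asked for in Exercise 4 below), the problem itself can be solved in O(n log n) total time with the standard two-heap method, matching the lower bound. A sketch with illustrative names:

```python
import heapq

def online_medians(seq):
    """After each arrival, report a median of all items seen so far."""
    lo, hi, out = [], [], []    # lo: max-heap (negated) of smaller half; hi: min-heap
    for x in seq:
        if not lo or x <= -lo[0]:
            heapq.heappush(lo, -x)
        else:
            heapq.heappush(hi, x)
        if len(lo) > len(hi) + 1:           # rebalance so |lo| is |hi| or |hi|+1
            heapq.heappush(hi, -heapq.heappop(lo))
        elif len(hi) > len(lo):
            heapq.heappush(lo, -heapq.heappop(hi))
        out.append(-lo[0])                  # top of lo is a median
    return out

print(online_medians([7, 15, 3, 17, 8, 11, 5]))   # [7, 7, 7, 7, 8, 8, 8]
```

Each step costs O(log i), and the outputs fall within the allowed choices of the example above (when i is even, either middle element is a valid median).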

**Exercise 4.** Prove that the on-line median finding problem has a lower bound of Ω(n log n) by showing a reduction from the sorting problem to it. (Refer to Sec. 10.3.3 of Ref. (2).)
**Project I : Lower Bounds of Some Problems**

You are required to survey lower bounds for some problems. You must provide the proofs of the lower bounds in your report.