**Undirected Single-Source Shortest Paths with Positive Integer Weights in Linear Time**

MIKKEL THORUP

*AT&T Labs Research, Florham Park, New Jersey*

Abstract. The single-source shortest paths problem (SSSP) is one of the classic problems in algorithmic graph theory: given a positively weighted graph $G$ with a source vertex $s$, find the shortest path from $s$ to all other vertices in the graph.

Since 1959, all theoretical developments in SSSP for general directed and undirected graphs have been based on Dijkstra's algorithm, visiting the vertices in order of increasing distance from $s$. Thus, any implementation of Dijkstra's algorithm sorts the vertices according to their distances from $s$.

However, we do not know how to sort in linear time.

Here, a deterministic linear time and linear space algorithm is presented for the undirected single-source shortest paths problem with positive integer weights. The algorithm avoids the sorting bottleneck by building a hierarchical bucketing structure, identifying vertex pairs that may be visited in any order.

**Categories and Subject Descriptors:** F.1.1 [**Computation by Abstract Devices**]: Models of Computation—*bounded-action devices (random access machines)*; F.2.2 [**Analysis of Algorithms and Problem Complexity**]: Nonnumerical Algorithms and Problems—*computations on discrete structures, sorting and searching*; G.2.2 [**Discrete Mathematics**]: Graph Theory—*graph algorithms*

General Terms: Algorithms

Additional Key Words and Phrases: RAM algorithms, shortest paths

*1. Introduction*

Let $G = (V, E)$, $|V| = n$, $|E| = m$, be an undirected connected graph with a positive integer edge weight function $\ell\colon E \to \mathbb{N}$ and a distinguished source vertex $s \in V$. If $(v, w) \notin E$, define $\ell(v, w) = \infty$. The single source shortest path problem (SSSP) is for every vertex $v$ to find the distance $d(v) = \operatorname{dist}(s, v)$ from $s$ to $v$. This is one of the classic problems in algorithmic graph theory. In this

*A preliminary short version of this paper appeared in Proceedings of the 38th IEEE Symposium on*
*Foundations of Computer Science (FOCS ’97). IEEE Computer Society Press, Los Alamitos, Calif.,*
pp. 12–21.

Some of this work was done while the author was at the University of Copenhagen.

Author’s address: AT&T Labs, 180 Park Avenue, Florham Park, NJ 07932, e-mail: mthorup@research.att.com.

Permission to make digital / hard copy of part or all of this work for personal or classroom use is granted without fee provided that the copies are not made or distributed for profit or commercial advantage, the copyright notice, the title of the publication, and its date appear, and notice is given that copying is by permission of the Association for Computing Machinery (ACM), Inc. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and / or a fee.

© 1999 ACM 0004-5411/99/0500-0362 $5.00

Journal of the ACM, Vol. 46, No. 3, May 1999, pp. 362–394.

paper, we present a deterministic linear time and linear space algorithm for undirected SSSP with positive integer weights. So far a linear time SSSP algorithm has only been known for planar graphs [Henzinger et al. 1997].

1.1. MODEL. Our algorithm runs on a RAM, which models what we program in imperative programming languages such as C. The memory is divided into addressable words of length $\omega$. Addresses are themselves contained in words, so $\omega \ge \log n$. Moreover, we have a constant number of registers, each with capacity for one word. The basic assembler instructions are: conditional jumps, direct and indirect addressing for loading and storing words in registers, and some computational instructions, such as comparisons, addition, and multiplication, for manipulating words in registers. The space complexity is the maximal memory address used, and the time complexity is the number of instructions performed.

All weights and distances are assumed to be integers represented as binary strings. For simplicity, we assume that all weights and distances fit in one word, so that the input and output size is $O(m)$; otherwise, the output size may be asymptotically larger than the input size, say, if we start with a huge weight when leaving the source. Our algorithm is easily modified to run in time and space linear in the output size for arbitrarily large integer weights.

Within the RAM model, one may prefer to use only the AC$^0$ operations among the computational instructions. A computational instruction is an AC$^0$ operation if it is computable by an $\omega^{O(1)}$-sized constant depth circuit with $O(\omega)$ input and output bits. In the circuit, we may have negation, and-gates, and or-gates with unbounded fan-in. Addition, shift, and bit-wise Boolean operations are all AC$^0$ operations. On the other hand, multiplication is not. Our linear time algorithm does use multiplication, but if we restrict ourselves to AC$^0$ operations, it can be implemented in $O(\alpha(m, n)m)$ time.

In contrast to the RAM, we have the pointer machine model, which disallows address arithmetic and hence bucketing, which is essential to our algorithm.

Also, we have the comparison-based model, where weights may only be compared. Of all the algorithms mentioned below, only those from Dijkstra [1959], Williams [1964], and Fredman and Tarjan [1987] work in either of these two restricted models. All the other algorithms assume a RAM model with integer weights, like ours.

1.2. HISTORY. Since 1959, all theoretical developments in SSSP for general directed or undirected graphs have been based on Dijkstra's algorithm [Dijkstra 1959]. For each vertex we have a super distance $D(v) \ge d(v)$. Moreover, we have a set $S \subseteq V$ such that $\forall v \in S\colon D(v) = d(v)$ and $\forall v \notin S\colon D(v) = \min_{u \in S}\{d(u) + \ell(u, v)\}$. Initially, $S = \{s\}$, $D(s) = d(s) = 0$, and $\forall v \ne s\colon D(v) = \ell(s, v)$. In each round of the algorithm, we visit a vertex $v \notin S$ minimizing $D(v)$. Then, as proved by Dijkstra, $D(v) = d(v)$, so we can move $v$ to $S$. Consequently, for all $(v, w) \in E$, if $D(v) + \ell(v, w) < D(w)$, we decrease $D(w)$ to $D(v) + \ell(v, w)$. Dijkstra's algorithm finishes when $S = V$, returning $D = d$.
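As a point of reference for the developments below, Dijkstra's algorithm as just described can be sketched as follows (a minimal sketch using a binary heap with lazy deletion; the adjacency-list representation and function name are our own choices for illustration):

```python
import heapq

def dijkstra(adj, s):
    """Dijkstra's algorithm: visit vertices in order of increasing D(v).

    adj: dict mapping each vertex to a list of (neighbor, weight) pairs.
    Returns D with D[v] = dist(s, v) for every vertex reachable from s.
    """
    D = {s: 0}
    S = set()                     # vertices whose distance is final
    heap = [(0, s)]               # priority queue keyed on tentative D(v)
    while heap:
        dv, v = heapq.heappop(heap)
        if v in S:                # stale entry: D(v) was decreased later
            continue
        S.add(v)                  # D(v) = d(v) by Dijkstra's lemma
        for w, lvw in adj[v]:
            if w not in S and dv + lvw < D.get(w, float('inf')):
                D[w] = dv + lvw   # decrease D(w) to D(v) + l(v, w)
                heapq.heappush(heap, (D[w], w))
    return D
```

With a Fibonacci heap in place of the binary heap, each of the at most $m$ decreases costs $O(1)$ amortized, which is the source of the $O(m + n \log n)$ bound discussed below.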

The complexity of Dijkstra's algorithm is determined by the $n - 1$ times that we find a vertex $v \in V \setminus S$ minimizing $D(v)$ and the at most $m$ times we decrement some $D(w)$. All subsequent theoretical developments in SSSP for general graphs have been based on various speed-ups and trade-offs in priority queues/heaps supporting these two operations. If we just find the minimum by searching all vertices, we solve SSSP in $O(n^2 + m)$ time. Applying Williams' heap [Williams 1964], we get $O(m \log n)$ time. Fredman and Tarjan's Fibonacci heaps [Fredman and Tarjan 1987] had SSSP as a prime application, and reduced the running time to $O(m + n \log n)$. They noted that this was an optimal implementation of Dijkstra's algorithm in a comparison model, since Dijkstra's algorithm visits the vertices in sorted order. Using Fredman and Willard's fusion trees, we get an $O(m \sqrt{\log n})$ randomized bound [Fredman and Willard 1993]. Their later atomic heaps give an $O(m + n \log n / \log\log n)$ bound [Fredman and Willard 1994]. More recently, Thorup's priority queues gave an $O(m \log\log n)$ bound and an $O(m + n \sqrt{(\log n)^{1+\varepsilon}})$ bound [Thorup 1996]. These bounds are randomized assuming that we want linear space. Finally, Raman has obtained an $O(m + n \sqrt{\log n \log\log n})$ bound in deterministic linear space [Raman 1996] and an $O(m + n \sqrt[3]{(\log n)^{1+\varepsilon}})$ randomized bound [Raman 1997].

There has also been a substantial development based on the maximal edge weight $C$, again assuming integer edge weights, each fitting in one word. First note that using van Emde Boas's general search structure [van Emde Boas 1977; van Emde Boas et al. 1977; Mehlhorn and Näher 1990], and bucketing according to $D(v)/n$, we get an $O(m \log\log C)$ algorithm for SSSP. Ahuja, Mehlhorn, Orlin, and Tarjan have found a priority queue for SSSP giving a running time of $O(m + n \sqrt{\log C})$ [Ahuja et al. 1990]. Recently, this has been improved by Cherkassky et al. [1997] to $O(m + n \sqrt[3]{\log C \log\log C})$ expected time, and a further improvement to $O(m + n (\log C)^{1/4+\varepsilon})$ has been presented by Raman [1997].

For the case of undirected graphs, we end the above quest by presenting an $O(m)$ algorithm.

1.3. TECHNIQUES. As observed in Fredman and Tarjan [1987], implementing Dijkstra's algorithm in linear time would require sorting in linear time. In fact, the converse also holds, in that Thorup has shown that linear time sorting implies that Dijkstra's algorithm can be implemented in linear time [Thorup 1996]. In this paper, we solve the undirected version of SSSP deterministically in $O(m)$ time and space. Since we do not know how to sort in linear time, this implies that we are deviating from Dijkstra's algorithm in that we do not visit the vertices in order of increasing distance from $s$. Our algorithm is based on a hierarchical bucketing structure, where the bucketing helps identify vertex pairs that can be visited in any order. It should be mentioned that using bucketing is not in itself new in connection with SSSP. In 1978, Dinitz [Dinic 1978] argued that if $\delta$ is the minimum edge weight, then in Dijkstra's algorithm, we can visit any vertex $v$ minimizing $\lfloor D(v)/\delta \rfloor$. Thus, bucketing according to $\lfloor D(v)/\delta \rfloor$, we can visit the vertices in the minimal bucket in any order. In this paper, we are, in some sense, applying Dinitz's idea recursively, identifying cuts where the minimum weight $\delta$ of the crossing edges is large.

1.4. CONTENTS. The paper is divided as follows: After the preliminaries in Section 2, we present in Section 3 the general idea of using bucketing to avoid the sorting bottleneck. This idea is then implemented recursively over Sections 4–8, allowing us to conclude in Section 9 with a linear time algorithm for the undirected SSSP problem with positive integer weights. Finally, in Appendix A, we discuss how to get a linear time algorithm if the weights are not integers but floating point numbers.

*2. Preliminaries*

Throughout the paper, we will assume that $G, V, E, \ell, s, D, d, S$ are as defined in the introduction in the description of Dijkstra's algorithm. In particular, concerning $S$ and $D$, $\forall v \in S\colon D(v) = d(v)$ and $\forall v \in V \setminus S\colon D(v) = \min_{u \in S}\{d(u) + \ell(u, v)\}$. As in Dijkstra's algorithm, initially, $S = \{s\}$, $D(s) = d(s) = 0$, and $\forall v \ne s\colon D(v) = \ell(s, v)$. We also inherit that we can visit a vertex $v \notin S$ only if $D(v) = d(v)$. Visiting $v$ implies that $v$ is moved to $S$ and that for all $(v, w) \in E$, $w \notin S$, we set $D(w) = \min\{D(w), D(v) + \ell(v, w)\}$. As for Dijkstra's algorithm, we have:

LEMMA 1. *If $v \in V \setminus S$ minimizes $D(v)$, then $D(v) = d(v)$.*

PROOF. Let $u$ be the first vertex outside $S$ on a shortest path from $s$ to $v$. Then $D(u) = d(u)$ by definition of $S$. Hence, we have $D(v) \ge d(v) \ge d(u) = D(u) \ge D(v)$, implying $D(v) = d(v)$. $\square$

However, in contrast to Dijkstra's algorithm, we may visit a vertex $v \notin S$ that does not minimize $D(v)$. Nevertheless, we inherit the following additional result from Dijkstra's algorithm:

LEMMA 2. *$\min D(V \setminus S) = \min d(V \setminus S)$ is nondecreasing.*

PROOF. If $S = V$, $\min D(V \setminus S) = \min d(V \setminus S) = \min \emptyset = \infty$. Otherwise, there is a $v \in V \setminus S$ minimizing $d(v)$. Let $u$ be the first vertex outside $S$ on a shortest path from $s$ to $v$. Then $D(u) = d(u)$ and $d(u) \le d(v)$. However, $v$ minimized $d(v)$, so $d(u) = d(v)$. Hence, we have $\min D(V \setminus S) \le D(u) = d(u) = \min d(V \setminus S)$. On the other hand, $D(w) \ge d(w)$ for all $w \in V$, so $\min D(V \setminus S) \ge \min d(V \setminus S)$. Hence, we conclude that $\min D(V \setminus S) = \min d(V \setminus S)$. Now, $\min d(V \setminus S)$ is nondecreasing because $S$ only increases and $d(w)$ does not change for any $w \in V$. Thus, we are minimizing over a smaller and smaller set of constant $d$-values, implying that $\min d(V \setminus S)$ can only increase. $\square$

We will let $\omega$ denote the word length. We will write $\lfloor x/2^i \rfloor$ as $x \gg i$ to emphasize that it may be calculated simply by shifting the $i$ least significant bits out to the right. Note that $x \le y \Rightarrow x \gg i \le y \gg i$, while $x < y \Leftarrow x \gg i < y \gg i$. If $f$ is a function on the elements from a set $X$, we let $f(X)$ denote $\{f(x) \mid x \in X\}$. We also adopt the standard that $\min \emptyset = \infty$. We define '$\gg$' to have lower precedence than '$\min$', '$+$', and '$-$'. For example, if $W \subseteq V$, $\min D(W) \gg i - 1 = (\min\{D(w) \mid w \in W\}) \gg (i - 1)$.
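The two monotonicity facts about shifting quoted above can be checked exhaustively on small inputs (an illustrative snippet; the helper name `shr` is ours):

```python
def shr(x, i):
    """x >> i, i.e. floor(x / 2**i): drop the i least significant bits."""
    return x >> i

# x <= y implies shr(x, i) <= shr(y, i): shifting weakly preserves order.
# Conversely, shr(x, i) < shr(y, i) implies x < y (but not vice versa).
for x in range(32):
    for y in range(32):
        for i in range(6):
            if x <= y:
                assert shr(x, i) <= shr(y, i)
            if shr(x, i) < shr(y, i):
                assert x < y
```

Note that the converse of the first implication fails: $5 \gg 2 = 4 \gg 2 = 1$ even though $5 > 4$, which is exactly why shifted comparisons let us visit some vertices out of distance order.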

By a bucket, we refer to a dynamic set $B$ into which elements can be inserted and deleted, and from which we can pick out an unspecified element. Each operation should be supported in constant time. A bucket could, for example, be implemented as a doubly-linked list where one can just insert and pick elements from the head. Using the indirect addressing of the RAM, we typically create an array $B(1 \ldots l)$ of buckets. Then, we can insert and pick elements from an arbitrary bucket $B(i)$ in constant time. Also, in constant time, we can delete an element from whatever bucket it is in.
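A minimal sketch of such a bucket array, using Python dictionaries in place of the doubly-linked lists described above (dictionaries also give constant-time insert, pick, and delete, though only in expectation; the class and method names are ours):

```python
class BucketArray:
    """Array B(1..l) of buckets supporting O(1) insert, pick, and delete.

    Each bucket is a dict used as an ordered set; `where` records which
    bucket every element currently sits in, so an element can be deleted
    from whatever bucket it is in, in constant time.
    """
    def __init__(self, l):
        self.buckets = [dict() for _ in range(l + 1)]  # indices 0..l
        self.where = {}

    def insert(self, i, x):
        self.buckets[i][x] = True
        self.where[x] = i

    def pick(self, i):
        """Return an unspecified element of bucket i (None if empty)."""
        return next(iter(self.buckets[i]), None)

    def delete(self, x):
        """Remove x from whatever bucket it currently is in."""
        i = self.where.pop(x)
        del self.buckets[i][x]
```

Moving an element to another bucket, as the algorithms below repeatedly do, is simply a `delete` followed by an `insert`.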

*3. Avoiding the Sorting Bottleneck*

We will now briefly indicate how the sorting bottleneck can be avoided. That is, we will discuss some simple conditions for $D(v) = d(v)$ where possibly $D(v) > \min D(V \setminus S)$. The resulting algorithm is far from efficient, but over the subsequent sections, we will apply the ideas recursively, so as to achieve a linear time solution to the SSSP problem.

LEMMA 3. *Suppose the vertex set $V$ divides into disjoint subsets $V_1, \ldots, V_k$ and that all edges between the subsets have length at least $\delta$. Further suppose for some $i$ and $v \in V_i \setminus S$ that $D(v) = \min D(V_i \setminus S) \le \min D(V \setminus S) + \delta$. Then $d(v) = D(v)$.*

PROOF. To see that $d(v) = D(v)$, let $u$ be the first vertex outside $S$ on a shortest path from $s$ to $v$. Then $d(u) = D(u)$ follows by definition of $S$. If $u \in V_i$, as in the proof of Lemma 1, we have $D(v) \ge d(v) \ge d(u) = D(u) \ge D(v)$, implying $d(v) = D(v)$, as desired. Now, suppose $u \notin V_i$. Since all edges from $V \setminus V_i$ to $V_i$ are of length $\ge \delta$, we have $D(v) \ge d(v) \ge d(u) + \delta = D(u) + \delta$. On the other hand, $D(u) + \delta \ge \min D(V \setminus S) + \delta \ge \min D(V_i \setminus S) = D(v)$, so we conclude that we have equality everywhere. In particular, this allows us to conclude that $d(v) = D(v)$. $\square$

Approaching a first simple SSSP bucketing algorithm, suppose $\delta = 2^\alpha$. Then $\min D(V_i \setminus S) \le \min D(V \setminus S) + \delta$ is implied by $\min D(V_i \setminus S) \gg \alpha \le \min D(V \setminus S) \gg \alpha$. Algorithmically, we will now bucket each $i \in \{1, \ldots, k\}$ according to $\min D(V_i \setminus S) \gg \alpha$. That is, we have an array $B$ of buckets where $i$ belongs in bucket $B(\min D(V_i \setminus S) \gg \alpha)$. Note that $\min D(V \setminus S) \gg \alpha = \min_i (\min D(V_i \setminus S) \gg \alpha)$. Hence, if $i$ is in the smallest indexed nonempty bucket, $\min D(V_i \setminus S) \gg \alpha = \min D(V \setminus S) \gg \alpha$. In more algorithmic terms, suppose $ix$ is maintained as a number $\le$ the smallest index of a nonempty bucket. If $i \in B(ix)$ and $v \in V_i \setminus S$ minimizes $D(v)$, then $D(v) = \min D(V_i \setminus S) \le \min D(V \setminus S) + \delta$, so $D(v) = d(v)$ by Lemma 3, and hence $v$ can be visited.

For the maintenance of $ix$, recall that $\min D(V \setminus S)$ is nondecreasing by Lemma 2. Hence, $\min D(V \setminus S) \gg \alpha$ is nondecreasing, so $ix$ will never need to be decreased. Also, note that $D(v) < \infty$ implies that there is a path in $G$ from $s$ to $v$ of length $D(v)$, and hence $D(v) \le \sum_{e \in E} \ell(e)$. Consequently, the maximum index $< \infty$ of any nonempty bucket is bounded by $\Delta = \sum_{e \in E} \ell(e) \gg \alpha$. That is, the only bucket indices used are $0, \ldots, \Delta, \infty$. Viewing $\infty$ as the successor of $\Delta$, we only need an array of $\Delta + 2$ buckets.

Based on the above discussion, we get the following SSSP algorithm.

**Algorithm A.** *Solves the SSSP problem where $V$ is partitioned into subsets $V_1, \ldots, V_k$ where all edges between the subsets have length at least $2^\alpha$.*

**A.1.** $S \leftarrow \{s\}$; $D(s) \leftarrow 0$; for all $v \ne s$: $D(v) \leftarrow \ell(s, v)$
**A.2.** for $ix \leftarrow 0, 1, \ldots, \Delta, \infty$: $B(ix) \leftarrow \emptyset$
**A.3.** for $i \leftarrow 1, \ldots, k$: add $i$ to $B(\min D(V_i \setminus S) \gg \alpha)$
**A.4.** for $ix \leftarrow 0$ to $\Delta$:
**A.4.1.** while $B(ix) \ne \emptyset$:
**A.4.1.1.** pick $i \in B(ix)$
**A.4.1.2.** pick $v \in V_i \setminus S$ minimizing $D(v)$
**A.4.1.3.** for all $(v, w) \in E$, $w \notin S$:
**A.4.1.3.1.** let $j$ be such that $w \in V_j$
**A.4.1.3.2.** $D(w) \leftarrow \min\{D(w), D(v) + \ell(v, w)\}$; if $\min D(V_j \setminus S) \gg \alpha$ is thereby decreased, move $j$ to $B(\min D(V_j \setminus S) \gg \alpha)$.
**A.4.1.4.** $S \leftarrow S \cup \{v\}$; if $\min D(V_i \setminus S) \gg \alpha$ is thereby increased, move $i$ to $B(\min D(V_i \setminus S) \gg \alpha)$.

The complexity of the above algorithm is $O(m + \Delta)$ plus the cost of maintaining $\min D(V_i \setminus S)$ for each $i$. The latter will essentially be done recursively, and $\Delta = \sum_{e \in E} \ell(e) \gg \alpha$ will be kept small by choosing $\alpha$ large.
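A sketch of Algorithm A under its stated partition assumption (illustrative only: each $\min D(V_i \setminus S)$ is recomputed by a naive scan rather than maintained recursively, so the code is correct but not linear time; all names are ours):

```python
INF = float('inf')

def sssp_bucketed(n, edges, parts, alpha, s):
    """Algorithm A sketch: SSSP where V is partitioned into V_1..V_k and
    every edge between different parts has weight >= 2**alpha.

    n: vertices are 0..n-1; edges: list of (u, v, weight) of an
    undirected graph; parts[v]: index of the part V_j containing v.
    """
    adj = [[] for _ in range(n)]
    for u, v, l in edges:
        adj[u].append((v, l))
        adj[v].append((u, l))

    D = [INF] * n                      # A.1: D(s) = 0, D(v) = l(s, v)
    D[s] = 0
    for v, l in adj[s]:
        D[v] = min(D[v], l)
    S = {s}
    k = max(parts) + 1

    def key(j):                        # min D(V_j \ S) >> alpha, naive scan
        m = min((D[v] for v in range(n)
                 if parts[v] == j and v not in S), default=INF)
        return m if m == INF else m >> alpha

    delta = sum(l for _, _, l in edges) >> alpha
    for ix in range(delta + 1):        # A.4: scan buckets 0..Delta
        while True:                    # A.4.1: empty bucket B(ix)
            js = [j for j in range(k) if key(j) == ix]
            if not js:
                break
            j = js[0]                  # A.4.1.1: pick i in B(ix)
            v = min((u for u in range(n)
                     if parts[u] == j and u not in S), key=lambda u: D[u])
            S.add(v)                   # D(v) = d(v) by Lemma 3
            for w, l in adj[v]:        # A.4.1.3: relax edges out of v
                if w not in S:
                    D[w] = min(D[w], D[v] + l)
    return D
```

The naive scans stand in for the bucket array: a real implementation keeps each part index in the bucket given by `key(j)` and moves it only when the key changes, exactly as in steps A.4.1.3.2 and A.4.1.4.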

*4. The Component Hierarchy*

We are now going to present a recursive condition for concluding $D(v) = d(v)$, which will later be used in a linear time SSSP algorithm. It is based on a *component hierarchy* defined as follows: By $G_i$, we denote the subgraph of $G$ whose edge set is the edges $e$ from $G$ with $\ell(e) < 2^i$. Then $G_0$ consists of singleton vertices. Recall that $\omega$ denotes the word length. Hence, all edge lengths and distances are $< 2^\omega$, so $G_\omega = G$. On level $i$ in the component hierarchy, we have the components (maximal connected subgraphs) of $G_i$. The component on level $i$ containing $v$ is denoted $[v]_i$. The children of $[v]_i$ are the components $[w]_{i-1}$ with $[w]_i = [v]_i$, that is, with $w \in [v]_i$.
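The levels of the component hierarchy can be computed bottom-up by adding edges level by level to a union–find structure (an illustrative sketch returning, for each level $i$, a representative of $[v]_i$ for every $v$; note that an edge of weight $\ell \ge 1$ first appears in $G_i$ for $i = $ `l.bit_length()`, since $\ell < 2^i$ exactly when $i \ge$ `l.bit_length()`):

```python
def component_hierarchy(n, edges, omega):
    """For each level i = 0..omega, compute the components of G_i, the
    subgraph of G containing the edges of weight < 2**i.

    Returns levels with levels[i][v] = a canonical representative of
    [v]_i, so [u]_i = [v]_i iff levels[i][u] == levels[i][v].
    Assumes all weights are >= 1 and < 2**omega.
    """
    parent = list(range(n))

    def find(x):                       # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    by_level = [[] for _ in range(omega + 1)]
    for u, v, l in edges:
        by_level[l.bit_length()].append((u, v))

    levels = []
    for i in range(omega + 1):
        if i > 0:
            for u, v in by_level[i]:   # edges newly present in G_i
                parent[find(u)] = find(v)
        levels.append([find(v) for v in range(n)])
    return levels
```

Grouping vertices by `levels[i][v]` (and components of level $i$ by their representative at level $i+1$) yields the parent/child structure of the hierarchy.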

LEMMA 4. *If $[v]_i \ne [w]_i$, then $\operatorname{dist}(v, w) \ge 2^i$.*

PROOF. Since $[v]_i \ne [w]_i$, any path from $v$ to $w$ contains an edge of length $\ge 2^i$. $\square$

By $[v]_i^-$, we will denote $[v]_i \setminus S$, noting that $[v]_i^-$ may not be connected. We say that $[v]_i$ is a *min-child* of $[v]_{i+1}$ if $\min D([v]_i^-) \gg i = \min D([v]_{i+1}^-) \gg i$. We say that $[v]_i$ is *minimal* if $[v]_i^- \ne \emptyset$ and, for $j = i, \ldots, \omega - 1$, $[v]_j$ is a min-child of $[v]_{j+1}$. The requirement $[v]_i^- \ne \emptyset$ is only significant if $V = S$: if $V = S$, $\min D([v]_i^-) = \infty$ for all $[v]_i$, and hence all $[v]_i$ would be minimal without the requirement $[v]_i^- \ne \emptyset$; now, no $[v]_i$ is minimal if $V = S$.

Below, in Lemma 8, we will show that $D(v) = d(v)$ if $[v]_0$ is minimal. If $v \in V \setminus S$ minimizes $D(v)$, as in Dijkstra's algorithm, $\forall i\colon \min D([v]_i^-) \gg i = \min D([v]_{i+1}^-) \gg i = D(v) \gg i$, so $[v]_0$ is minimal. The point is that $[v]_0$ may be minimal even if $D(v)$ is not minimized, thus providing us with a more general condition for $D(v) = d(v)$ than the one used in Dijkstra's algorithm.

The condition for $D(v) = d(v)$ that $[v]_0$ is minimal also holds for directed graphs. Our efficient use of the condition hinges, however, on the property of undirected graphs that $\forall u, w \in [v]_i\colon \operatorname{dist}(u, w) \le \sum_{e \in [v]_i} \ell(e)$. We shall return to the latter property in Section 6, where it will be used to limit the size of an underlying bucketing structure.

We will now prove several properties of the component hierarchy, one of which is that $D(v) = d(v)$ if $[v]_0$ is minimal. All these properties will prove relevant in our later algorithmic developments.

LEMMA 5. *If $v \notin S$, $[v]_i$ is minimal, and $i \le j \le \omega$, then $\min D([v]_i^-) \gg j - 1 = \min D([v]_j^-) \gg j - 1$.*

PROOF. The proof is by induction on $j$. If $j = i$, the statement is vacuously true. If $j > i$, inductively, $\min D([v]_i^-) \gg j - 2 = \min D([v]_{j-1}^-) \gg j - 2$, implying $\min D([v]_i^-) \gg j - 1 = \min D([v]_{j-1}^-) \gg j - 1$. Moreover, the minimality of $[v]_i$ implies that $[v]_{j-1}$ is a min-child of $[v]_j$; hence, $\min D([v]_{j-1}^-) \gg j - 1 = \min D([v]_j^-) \gg j - 1$. $\square$

LEMMA 6. *Suppose $v \notin S$ and there is a shortest path to $v$ where the first vertex $u$ outside $S$ is in $[v]_i$. Then $d(v) \ge \min D([v]_i^-)$.*

PROOF. Since $u$ is the first vertex outside $S$ on our path, $D(u)$ is a lower bound on its length, so $D(u) \le d(v)$. Moreover, since $u \in [v]_i^-$, $D(u) \ge \min D([v]_i^-)$. $\square$

LEMMA 7. *Suppose $v \notin S$ and $[v]_{i+1}$ is minimal. If there is no shortest path to $v$ where the first vertex outside $S$ is in $[v]_i$, then $d(v) \gg i > \min D([v]_{i+1}^-) \gg i$.*

PROOF. Among all shortest paths to $v$, pick one $P$ so that the first vertex $u$ outside $S$ is in $[v]_k$ with $k$ minimized. Then $k > i$ and $d(v) = \ell(P) = D(u) + \operatorname{dist}(u, v)$.

We prove the statement of the lemma by induction on $\omega - i$. If $u \notin [v]_{i+1}$, we have $i + 1 < \omega$ and the minimality of $[v]_{i+1}$ implies minimality of $[v]_{i+2}$. Hence, by induction, $d(v) \gg i + 1 > \min D([v]_{i+2}^-) \gg i + 1$. By minimality of $[v]_{i+1}$, $\min D([v]_{i+2}^-) \gg i + 1 = \min D([v]_{i+1}^-) \gg i + 1$. Thus, $d(v) \gg i + 1 > \min D([v]_{i+1}^-) \gg i + 1$, implying $d(v) \gg i > \min D([v]_{i+1}^-) \gg i$.

If $u \in [v]_{i+1}^-$, $D(u) \gg i \ge \min D([v]_{i+1}^-) \gg i$. Moreover, since $u \notin [v]_i$, by Lemma 4, $\operatorname{dist}(u, v) \ge 2^i$. Hence, $d(v) \gg i = (D(u) + \operatorname{dist}(u, v)) \gg i \ge (\min D([v]_{i+1}^-) \gg i) + 1$. $\square$

We are now in the position to prove that the minimality of $[v]_0$ implies $D(v) = d(v)$:

LEMMA 8. *If $v \notin S$ and $[v]_i$ is minimal, then $\min D([v]_i^-) = \min d([v]_i^-)$. In particular, $D(v) = d(v)$ if $[v]_0 = \{v\}$ is minimal.*

PROOF. Since $D(w) \ge d(w)$ for all $w$, $\min D([v]_i^-) \ge \min d([v]_i^-)$. Viewing $v$ as an arbitrary vertex in $[v]_i^-$, it remains to show that $d(v) \ge \min D([v]_i^-)$.

Among all shortest paths to $v$, pick one $P$ so that the first vertex $u$ outside $S$ is in $[v]_i$ if possible. If $u \in [v]_i$, Lemma 6 gives the result directly. If $u \notin [v]_i$, by Lemma 7, $d(v) \gg i > \min D([v]_{i+1}^-) \gg i$. However, $[v]_i$ is a min-child of $[v]_{i+1}$, so $\min D([v]_{i+1}^-) \gg i = \min D([v]_i^-) \gg i$. Thus, $d(v) \gg i > \min D([v]_i^-) \gg i$, implying $d(v) > \min D([v]_i^-)$. $\square$

The above lemma gives us our basic condition for visiting vertices, moving them to $S$. Our algorithms will need one more consequence of Lemmas 6 and 7.

LEMMA 9. *If $v \notin S$ and $[v]_i$ is not minimal but $[v]_{i+1}$ is minimal, then $\min d([v]_i^-) \gg i > \min D([v]_{i+1}^-) \gg i$.*

PROOF. Consider any $w \in [v]_i^-$. If there is no shortest path to $w$ where the first vertex outside $S$ is in $[v]_i$, then $d(w) \gg i > \min D([v]_{i+1}^-) \gg i$ follows directly from Lemma 7. Otherwise, by Lemma 6, $d(w) \ge \min D([v]_i^-)$. Moreover, the nonminimality of $[v]_i$ implies $\min D([v]_i^-) \gg i > \min D([v]_{i+1}^-) \gg i$. Hence, we conclude that $d(w) \gg i > \min D([v]_{i+1}^-) \gg i$ for all $w \in [v]_i^-$, as desired. $\square$

*5. Visiting Minimal Vertices*

In this section, we will discuss the basic dynamics of visiting vertices $v$ with $[v]_0$ minimal. First, we show a series of lemmas culminating in Lemma 13, stating that if $[v]_i$ has once been minimal, then $\min D([v]_i^-) \gg i = \min d([v]_i^-) \gg i$ at all times in the future. Based on this, we will present an abstract SSSP algorithm displaying the basic order in which we want to visit the vertices of $G$ in our later linear time algorithm.

*Definition 1.* In the rest of this paper, visiting a vertex $v$ requires that $[v]_0 = \{v\}$ is minimal. When $v$ is visited, it is moved to $S$, setting $D(w)$ to $\min\{D(w), D(v) + \ell(v, w)\}$ for all $(v, w) \in E$.

Note by Lemma 8 that $D(v) = d(v)$ whenever we visit a vertex $v$.

LEMMA 10. *For all $[v]_i$, $\max d([v]_i \setminus [v]_i^-) \gg i - 1 \le \min d([v]_i^-) \gg i - 1$.*

PROOF. Since $d$ is a constant function, it suffices to show that just before $w \in [v]_i^-$ is visited, $d(w) \gg i - 1 = \min d([v]_i^-) \gg i - 1$. By definition, $[w]_0$ is minimal just before the visit, so by Lemma 5, $D(w) \gg i - 1 = \min D([w]_0^-) \gg i - 1 = \min D([w]_i^-) \gg i - 1$. On the other hand, by Lemma 8, $D(w) = d(w)$ and $\min D([w]_i^-) = \min d([w]_i^-)$, so we conclude that $d(w) \gg i - 1 = \min d([v]_i^-) \gg i - 1$, as desired. $\square$

In the following, we will frequently study the situation before and after the event of visiting some vertex. We will then use the notation $\langle e \rangle^b$ and $\langle e \rangle^a$ to denote that the expression $e$ should be evaluated before, respectively after, the event. By Lemma 10, if $j \ge i - 1$, $\langle \min d([v]_i^-) \gg j \rangle^a \ge \langle \min d([v]_i^-) \gg j \rangle^b$. Hence, since $\forall w\colon D(w) \ge d(w)$,

$$\langle \min D([v]_i^-) \gg j \rangle^b = \langle \min d([v]_i^-) \gg j \rangle^b \;\Rightarrow\; \langle \min D([v]_i^-) \gg j \rangle^a \ge \langle \min D([v]_i^-) \gg j \rangle^b. \qquad (1)$$

LEMMA 11. *Suppose $\min D([v]_i^-) \gg i = \min d([v]_i^-) \gg i$ and that visiting a vertex $w \in V \setminus S$ changes $\min D([v]_i^-) \gg i$. Then $w \in [v]_i$, and if $[v]_i^-$ is not emptied, the change in $\min D([v]_i^-) \gg i$ is an increase by one.*

PROOF. We are studying the event of visiting the vertex $w$. By assumption, $\langle \min D([v]_i^-) \gg i \rangle^b = \langle \min d([v]_i^-) \gg i \rangle^b$. Hence, by (1), $\langle \min D([v]_i^-) \gg i \rangle^a \ge \langle \min D([v]_i^-) \gg i \rangle^b$. By assumption, $\langle \min D([v]_i^-) \gg i \rangle^a \ne \langle \min D([v]_i^-) \gg i \rangle^b$, so $\langle \min D([v]_i^-) \gg i \rangle^a > \langle \min D([v]_i^-) \gg i \rangle^b$. Since $D$-values never increase, we conclude $\langle [v]_i^- \rangle^a \subset \langle [v]_i^- \rangle^b$; hence that $w \in \langle [v]_i^- \rangle^b$ and $\langle [v]_i^- \rangle^a = \langle [v]_i^- \rangle^b \setminus \{w\}$.

Suppose $\langle [v]_i^- \rangle^a$ is nonempty. Since $[v]_i$ is connected, there must be an edge $(u, x)$ in $[v]_i$ with $u \notin \langle [v]_i^- \rangle^a$ and $x \in \langle [v]_i^- \rangle^a$. We will now argue that

$$d(u) \gg i \le \langle \min D([v]_i^-) \gg i \rangle^b. \qquad (2)$$

If $u \notin \langle [v]_i^- \rangle^b$, (2) follows from Lemma 10. Otherwise, $u = w$. By Lemma 8 and Lemma 5, the minimality of $[u]_0 = [w]_0$ implies $d(u) \gg i = D(u) \gg i = \langle \min D([v]_i^-) \gg i \rangle^b$. Thus, (2) follows.

Based on (2), since $\ell(u, x) < 2^i$, we conclude

$$\langle \min D([v]_i^-) \gg i \rangle^a \le \langle D(x) \gg i \rangle^a \le (d(u) + \ell(u, x)) \gg i \le \langle \min D([v]_i^-) \gg i \rangle^b + 1. \qquad \square$$

In connection with Lemma 11, it should be noted that with directed graphs, the increase could be by more than one. This is the first time in this paper that we use the undirectedness.

LEMMA 12. *If $[v]_i$ is minimal, it remains minimal until $\min D([v]_i^-) \gg i$ is increased, in which case $\min d([v]_i^-) \gg i$ is also increased.*

PROOF. Suppose $[v]_i$ is minimal, but visiting some vertex $w$ stops $[v]_i$ from being minimal. If $w$ was the last vertex not in $S$, the visit increases both $\min D([v]_i^-)$ and $\min d([v]_i^-)$ to $\infty$. Otherwise, some ancestor of $[v]_i$ is minimal, and we pick the smallest $j$ such that $[v]_{j+1}$ is minimal. Moreover, we pick $u \in [v]_{j+1}^-$ such that $[u]_j$ is a min-child of $[v]_{j+1}$. Hence, $[u]_j$ is minimal while $[v]_j$ is not minimal.

Before the visit to $w$, $[v]_i$ was minimal, so $\langle \min D([v]_i^-) \gg j \rangle^b = \langle \min D([v]_{j+1}^-) \gg j \rangle^b$ by Lemma 5. Also, $[v]_{j+1}$ was minimal, so by Lemma 8 and (1), $\langle \min D([v]_{j+1}^-) \gg j \rangle^a \ge \langle \min D([v]_{j+1}^-) \gg j \rangle^b$.

After the visit, since $[v]_{j+1}$ is minimal and $[v]_j$ is not a min-child of $[v]_{j+1}$, by Lemma 9, $\langle \min d([v]_j^-) \gg j \rangle^a > \langle \min D([v]_{j+1}^-) \gg j \rangle^a$. Thus

$$\begin{aligned}
\langle \min D([v]_i^-) \gg j \rangle^a &\ge \langle \min d([v]_i^-) \gg j \rangle^a \ge \langle \min d([v]_j^-) \gg j \rangle^a \\
&> \langle \min D([v]_{j+1}^-) \gg j \rangle^a \ge \langle \min D([v]_{j+1}^-) \gg j \rangle^b \\
&= \langle \min D([v]_i^-) \gg j \rangle^b = \langle \min d([v]_i^-) \gg j \rangle^b. \qquad \square
\end{aligned}$$

LEMMA 13. *If $[v]_i$ has once been minimal, then at all times in the future,*

$$\min D([v]_i^-) \gg i = \min d([v]_i^-) \gg i. \qquad (3)$$
PROOF. The first time $[v]_i$ turns minimal, (3) gets satisfied by Lemma 8. Now, suppose (3) is satisfied before visiting some vertex $w$. Since $\forall u\colon D(u) \ge d(u)$, (3) can only be violated by an increase in $\min D([v]_i^-)$. If $\min D([v]_i^-) \gg i$ is increased, by Lemma 11, $w \in [v]_i$ and the increase is by one. Visiting $w$ requires that $[w]_0$ is minimal, hence that $[w]_i = [v]_i$ is minimal. If $[v]_i$ is minimal after the visit, (3) follows from Lemma 8. Also, if $[v]_i^-$ is emptied, (3) follows with $\min D([v]_i^-) \gg i = \min d([v]_i^-) \gg i = \infty$. If $[v]_i$ becomes nonminimal and $[v]_i^-$ is not emptied, by Lemma 12, $\min d([v]_i^-) \gg i$ is also increased. Since $\min d([v]_i^-) \gg i \le \min D([v]_i^-) \gg i$ and $\min D([v]_i^-) \gg i$ was increased by one, we conclude that (3) is restored. $\square$

We are now ready to derive an algorithm for the undirected SSSP problem based on the component hierarchy. The algorithm is so far inefficient, but it shows the ordering in which we intend to visit the vertices in a later linear time algorithm. As our main routine, we have:

**Algorithm B.** *SSSP is given an input graph $G = (V, E)$ with weight function $\ell$ and distinguished vertex $s$. It outputs $D$ with $D(v) = d(v) = \operatorname{dist}(s, v)$ for all $v \in V$.*

**B.1.** $S \leftarrow \{s\}$
**B.2.** $D(s) \leftarrow 0$; for all $v \ne s$: $D(v) \leftarrow \ell(s, v)$
**B.3.** $Visit([s]_\omega)$ (Algorithm C below and later Algorithm F)
**B.4.** return $D$

A recursive procedure is now presented for visiting a minimal component $[v]_i$. The goal is to visit all $w \in [v]_i^-$ with $d(w) \gg i$ equal to the call time value of $\min D([v]_i^-) \gg i$. By Lemma 13, the call time minimality of $[v]_i$ implies that we preserve $\min D([v]_i^-) \gg i = \min d([v]_i^-) \gg i$ throughout the call. Thus, $\min D([v]_i^-) \gg i$ will not increase until we have visited the last vertex $w$ with $d(w) \gg i$ equal to the call time value of $\min D([v]_i^-) \gg i$. By Lemma 12, this in turn implies that $[v]_i$ will remain minimal until we have visited the last vertex we want to visit. We will maintain an index $ix([v]_i)$ that essentially equals $\min D([v]_i^-) \gg i - 1$. Then, a child $[w]_{i-1}$ of $[v]_i$ is minimal if $\min D([w]_{i-1}^-) \gg i - 1 = ix([v]_i)$. Hence, recursively, we can visit all vertices $z \in [w]_{i-1}^-$ with $d(z) \gg i - 1 = \min D([w]_{i-1}^-) \gg i - 1$. Since $\min D([w]_{i-1}^-) \gg i - 1 = ix([v]_i) = \min D([v]_i^-) \gg i - 1$, $d(z) \gg i = \min D([v]_i^-) \gg i$, as desired.

Finally, the visiting of $[v]_i$ is completed when $ix([v]_i) \gg 1 = \min D([v]_i^-) \gg i$ is increased. Formalizing in pseudo-code, we get

**Algorithm C.** *$Visit([v]_i)$ presumes that $[v]_i$ is minimal. It visits all $w \in [v]_i^-$ with $d(w) \gg i$ equal to the value of $\min D([v]_i^-) \gg i$ when the call is made.*

**C.1.** if $i = 0$, visit $v$ and return
**C.2.** if $[v]_i$ has not been visited previously, $ix([v]_i) \leftarrow \min D([v]_i^-) \gg i - 1$
**C.3.** repeat until $[v]_i^- = \emptyset$ or $ix([v]_i) \gg 1$ is increased:
**C.3.1.** while $\exists$ child $[w]_{i-1}$ of $[v]_i$ such that $\min D([w]_{i-1}^-) \gg i - 1 = ix([v]_i)$:
**C.3.1.1.** let $[w]_{i-1}$ be a child of $[v]_i$ with $\min D([w]_{i-1}^-) \gg i - 1 = ix([v]_i)$
**C.3.1.2.** $Visit([w]_{i-1})$
**C.3.2.** increment $ix([v]_i)$ by one

*Correctness.* We now prove that Algorithm C is correct; that is, if $[v]_i$ is minimal, $Visit([v]_i)$ visits exactly the vertices $w \in [v]_i^-$ with $d(w) \gg i$ equal to the value of $\min D([v]_i^-) \gg i$ when the call is made. The proof is by induction on $i$.

If $i = 0$, we just visit $v$ in step C.1. By Lemma 8, $D(v) = d(v)$. Hence, $d(v) \gg i$ equals the call time value of $\min D([v]_i^-) \gg i = D(v) \gg i$, as desired. After the visit to $v$, $[v]_i^- = \emptyset$, and we are done.

Now, assume $i > 0$. Inductively, if a subcall $Visit([w]_{i-1})$ (step C.3.1.2) is made with $[w]_{i-1}$ minimal, we may assume that it correctly visits all $u \in [w]_{i-1}^-$ with $d(u) \gg i - 1$ equal to the value of $\min D([w]_{i-1}^-) \gg i - 1$ when the subcall is made. We will prove the following invariants for when $[v]_i^- \ne \emptyset$:

$$ix([v]_i) \gg 1 = \min D([v]_i^-) \gg i = \min d([v]_i^-) \gg i \qquad (4)$$

$$ix([v]_i) \le \min d([v]_i^-) \gg i - 1 \qquad (5)$$

When $ix([v]_i)$ is first assigned in step C.2, it is assigned $\min D([v]_i^-) \gg i - 1$. Also, at that time, $[v]_i$ is minimal, so $\min D([v]_i^-) = \min d([v]_i^-)$ by Lemma 8. Thus $ix([v]_i) = \min D([v]_i^-) \gg i - 1 = \min d([v]_i^-) \gg i - 1$, implying both (4) and (5). Now, assume that (4) and (5) both hold at the beginning of an iteration of the repeat-loop C.3.

LEMMA 14. If min D([v]_i^-) >> i has not increased, [v]_i remains minimal and (4) and (5) remain true.

PROOF. By Lemma 12 and Lemma 8, [v]_i remains minimal with min D([v]_i^-) = min d([v]_i^-). Then, by (1), min D([v]_i^-) >> (i-1) is nondecreasing, so a violation of (5) would have to be due to an increase in ix([v]_i). However, ix([v]_i) is only increased in step C.3.2, which is only entered when every child [w]_{i-1} of [v]_i has min D([w]_{i-1}^-) >> (i-1) ≠ ix([v]_i). In particular, before the increase, ix([v]_i) ≠ min D([v]_i^-) >> (i-1), which is the minimum over the children [w]_{i-1} of [v]_i of min D([w]_{i-1}^-) >> (i-1). Moreover, by (5), ix([v]_i) ≤ min D([v]_i^-) >> (i-1). Hence ix([v]_i) < min D([v]_i^-) >> (i-1) = min d([v]_i^-) >> (i-1), so the increase in ix([v]_i) by one cannot violate (5). Moreover, since min D([v]_i^-) >> i is not increased and min D([v]_i^-) = min d([v]_i^-), (5) implies that (4) is preserved. □

LEMMA 15. If a subcall Visit([w]_{i-1}) (step C.3.1.2) is made before min D([v]_i^-) >> i is increased, all vertices u visited have d(u) >> i equal to the original value of min D([v]_i^-) >> i (as required for visits within Visit([v]_i)).

PROOF. By assumption, Lemma 14 applies when the subcall Visit([w]_{i-1}) is made, so (4) and (5) hold true. The choice in step C.3.1.1 implies that ix([v]_i) = min D([w]_{i-1}^-) >> (i-1), and clearly min D([w]_{i-1}^-) >> (i-1) ≥ min D([v]_i^-) >> (i-1) ≥ min d([v]_i^-) >> (i-1). Then (5) implies equality everywhere, so min D([w]_{i-1}^-) >> (i-1) = min D([v]_i^-) >> (i-1), and hence [w]_{i-1} inherits the minimality of [v]_i. Thus, by induction, Visit([w]_{i-1}) correctly visits the vertices u ∈ [w]_{i-1}^- with d(u) >> (i-1) equal to the value of min D([w]_{i-1}^-) >> (i-1) at the time of the subcall. However, at the time of the subcall, min D([w]_{i-1}^-) >> (i-1) = ix([v]_i), and by (4), ix([v]_i) >> 1 = min D([v]_i^-) >> i, so d(u) >> i = min D([v]_i^-) >> i. □

LEMMA 16. min D([v]_i^-) >> i has increased when the repeat-loop C.3 terminates.

PROOF. If min D([v]_i^-) >> i did not increase, (5) would hold by Lemma 14, and (5) implies ix([v]_i) >> 1 ≤ min D([v]_i^-) >> i. Initially, we have equality by (4). However, the repeat-loop can only terminate if ix([v]_i) >> 1 increases or [v]_i^- becomes empty, setting min D([v]_i^-) >> i = ∞. □

So far, we simply assume termination, deferring the proof of termination to the proof of efficiency in the next section. Thus, by Lemma 16, min D([v]_i^-) >> i eventually increases. Let Visit([w]_{i-1}) be the subcall during which the increase happens.

By Lemma 13, min d([v]_i^-) >> i increases together with min D([v]_i^-) >> i. Hence, by Lemma 15, Visit([w]_{i-1}) will visit no more vertices. Moreover, it implies that we have visited all vertices u ∈ [v]_i with d(u) >> i equal to the original value of min D([v]_i^-) >> i; that is, we have now visited exactly the required vertices.

Since min d([v]_i^-) >> i is increased and every child [w]_{i-1} of [v]_i has min D([w]_{i-1}^-) >> (i-1) ≥ min D([v]_i^-) >> (i-1) ≥ min d([v]_i^-) >> (i-1), ix([v]_i) will now just be incremented without recursive subcalls Visit([w]_{i-1}) until either [v]_i^- is emptied, or ix([v]_i) >> 1 is increased by one.

Since no more vertices are visited after the increase of min D([v]_i^-) >> i, by Lemma 11, the increase is by one. Thus, we conclude that all of ix([v]_i), min D([v]_i^-) >> i, and min d([v]_i^-) >> i are increased by one, restoring the equalities of (4). Since ix([v]_i) now has the smallest value such that ix([v]_i) >> 1 = min d([v]_i^-) >> i, we conclude that (5) is also satisfied.

By Lemma 13, from now on min d([v]_i^-) >> i = min D([v]_i^-) >> i. Moreover, ix([v]_i) and min d([v]_i^-) can only change in connection with calls Visit([v]_i), so we conclude that (4) and (5) will remain satisfied until the next such call. This completes the proof that Algorithm C is correct. □

*6. Towards a Linear Time Algorithm*

In this section, we present the ingredients of a linear-time SSSP algorithm.

6.1. THE COMPONENT TREE. Define the component tree T as representing the topological structure of the component hierarchy, skipping all nodes [v]_i = [v]_{i-1}. Thus, the leaves of T are the singleton components [v]_0 = {v}, v ∈ V. The internal nodes are the components [v]_i, i > 0, with [v]_{i-1} ⊂ [v]_i. The root of T is the node [v]_r = G with r minimized. The parent of a node [v]_i is its nearest ancestor of degree ≥ 2 in the component hierarchy. Since T has no degree-one nodes, the number of nodes is ≤ 2n - 1. In Section 7, we show how to construct T in time O(m). Given T, it is straightforward to modify our implementation of Visit in Algorithm C so that it recurses within T, thus skipping the components of the component hierarchy that are not in T. In the rest of this paper, when we talk about children or parents, it is understood that we refer to T rather than to the component hierarchy. A min-child [w]_h of [v]_i is a child minimizing min D([w]_h^-) >> (i-1). Thus, a component of T is minimal if and only if it is minimal in the component hierarchy.
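To make the definition concrete, the following brute-force sketch (ours, not the O(m) construction of Section 7) collects the distinct components of a tiny graph. Collapsing duplicate vertex sets across levels is exactly the skipping of nodes [v]_i = [v]_{i-1}, and the node count comes out ≤ 2n - 1.

```python
# Brute-force construction of the node set of the component tree T for a
# tiny graph. Level-i components are the connected components over edges of
# weight < 2^i; a component unchanged from level i-1 to level i is kept once.
edges = [(0, 1, 1), (1, 2, 2), (0, 2, 4), (2, 3, 8)]
n = 4
r = max(w for *_, w in edges).bit_length()      # all weights < 2^r

def components(i):
    """Vertex sets of the connected components using edges of weight < 2^i."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x
    for u, v, w in edges:
        if w < 2 ** i:
            parent[find(u)] = find(v)
    groups = {}
    for v in range(n):
        groups.setdefault(find(v), set()).add(v)
    return {frozenset(g) for g in groups.values()}

# Collect all distinct components; duplicates across levels collapse, which
# is exactly the skipping of nodes [v]_i = [v]_{i-1}.
nodes = set()
for i in range(r + 1):
    nodes |= components(i)
print(sorted(len(c) for c in nodes))   # [1, 1, 1, 1, 2, 3, 4]
print(len(nodes) <= 2 * n - 1)         # True: at most 2n - 1 nodes
```

Here the seven distinct components (four leaves, three internal nodes) meet the 2n - 1 = 7 bound with equality.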

6.2. A LINEAR-SIZED BUCKET STRUCTURE. We say a component [v]_i ∈ T is visited the first time Visit([v]_i) is called. Note that if a component is visited, then so are all its ancestors in T. The idea now is that, for each visited component [v]_i, we bucket the children [w]_h according to min D([w]_h^-) >> (i-1). That is, [w]_h is found in a bucket denoted B([v]_i, min D([w]_h^-) >> (i-1)). With ix([v]_i) = min D([v]_i^-) >> (i-1) as in Algorithm C, the minimal children of [v]_i are then readily found in B([v]_i, ix([v]_i)).

Concerning the bucketing of a visited component [v]_i itself, we can again use the index ix([v]_i), for if [v]_i has parent [v]_j in T, then [v]_i belongs in B([v]_j, ix([v]_i) >> (j - i)) = B([v]_j, min D([v]_i^-) >> (j-1)). The bucketing of unvisited children of visited components is deferred till later. In the rest of this subsection, the point is to show that we can efficiently embed all "relevant" buckets from B(·, ·) into one bucket array A with O(m) entries.

LEMMA 17. If [w]_h is a minimal child of [v]_i, then min d([v]_i) >> (i-1) ≤ min D([w]_h^-) >> (i-1) ≤ max d([v]_i) >> (i-1).

PROOF. By Lemma 8, min D([w]_h^-) = min d([w]_h^-), and by the definition of minimality, [w]_h^- is a nonempty subset of [v]_i. □

Let ix_0([v]_i) denote min d([v]_i) >> (i-1). We are going to compute some ix_∞([v]_i) ≥ max d([v]_i) >> (i-1). Then, by Lemma 17, any minimal child of [v]_i is found in one of the buckets B([v]_i, ix_0([v]_i)), . . . , B([v]_i, ix_∞([v]_i)). Conversely, if min D([w]_h^-) >> (i-1) ∉ {ix_0([v]_i), . . . , ix_∞([v]_i)}, we know that [w]_h is not minimal, and hence it is not necessary to have [w]_h placed in B([v]_i, ·). We therefore define B([v]_i, q) to be relevant if and only if ix_0([v]_i) ≤ q ≤ ix_∞([v]_i).

Note that the diameter of [v]_i is bounded by Σ_{e ∈ [v]_i} ℓ(e). This immediately implies max d([v]_i) ≤ min d([v]_i) + Σ_{e ∈ [v]_i} ℓ(e). Define Δ([v]_i) = ⌈Σ_{e ∈ [v]_i} ℓ(e)/2^{i-1}⌉ and ix_∞([v]_i) = ix_0([v]_i) + Δ([v]_i). Then max d([v]_i) >> (i-1) ≤ ix_∞([v]_i), as desired.
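As a sanity check of this last step, the snippet below verifies numerically that with Δ taken as the rounded-up quotient, every distance up to the minimum plus the weight sum still lands in the relevant index range after the shift. Here min_d and S are stand-ins for min d([v]_i) and Σ_{e ∈ [v]_i} ℓ(e).

```python
# Check max d([v]_i) >> (i-1) <= ix_0([v]_i) + Delta([v]_i): with Delta the
# rounded-up quotient ceil(S / 2^(i-1)), every value at most min_d + S lands
# in {ix_0, ..., ix_0 + Delta} after shifting right by i-1.
violations = 0
for i in range(1, 16):
    shift = i - 1
    for min_d in range(0, 300, 7):
        for S in range(0, 300, 11):            # stand-in for the weight sum
            ix0 = min_d >> shift
            delta = -(-S // (1 << shift))      # ceiling division
            if (min_d + S) >> shift > ix0 + delta:
                violations += 1
print(violations)   # 0
```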

LEMMA 18. The total number of relevant buckets is < 4m + 4n.

PROOF. In connection with [v]_i, we have Δ([v]_i) + 1 ≤ 2 + Σ_{e ∈ [v]_i} ℓ(e)/2^{i-1} relevant buckets. Since T is a rooted tree with n leaves in which every internal node has at least two children, the number of nodes [v]_i in T is ≤ 2n - 1. Thus, the total number of relevant buckets is at most

    Σ_{[v]_i ∈ T} (2 + Σ_{e ∈ [v]_i} ℓ(e)/2^{i-1}) < 4n + Σ_{[v]_i ∈ T, e ∈ [v]_i} ℓ(e)/2^{i-1} = 4n + Σ_{e ∈ E} Σ_{[v]_i ∋ e} ℓ(e)/2^{i-1}.

Now, consider any edge e = (u, w) ∈ E. Set j = ⌊log₂ ℓ(e)⌋ + 1. Then e ∈ [v]_i if and only if i ≥ j and [v]_i = [u]_i. Since ℓ(e) < 2^j, we get

    Σ_{[v]_i ∋ e} ℓ(e)/2^{i-1} < Σ_{i ≥ j} 2^j/2^{i-1} = 4.

Thus, the total number of relevant buckets is < 4m + 4n. □

We will now show how to efficiently embed the relevant buckets of B(·, ·) into a single bucket array A with index set {0, . . . , N}, where N = O(m) is the total number of relevant buckets.

The Δ-values, or in fact something better, will be found in connection with the construction of T in Section 7. Also, the value of min d([v]_i^-) will turn out to be available when we first visit [v]_i. Hence, both ix_0([v]_i) and ix_∞([v]_i) can be identified as soon as we start visiting [v]_i.

For each component [v]_i ∈ T, let N([v]_i) denote Σ_{[w]_j ≺ [v]_i} (Δ([w]_j) + 1), where ≺ is an arbitrary total ordering of the components in T, say, postorder. The prefix sums N(·) are trivially computed in time O(n) as soon as we have computed the Δ-values. Now, for any [v]_i ∈ T and x ∈ N_0, if B([v]_i, x) is relevant, that is, if x ∈ {ix_0([v]_i), . . . , ix_∞([v]_i)}, we identify B([v]_i, x) with A(x - ix_0([v]_i) + N([v]_i)); otherwise, the contents of B([v]_i, x) are deferred to a "waste" bucket A(N). In conclusion, the bucket structure B(·, ·) is implemented in linear time and space.
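A miniature sketch of this embedding: each component is reduced to a (name, ix_0, Δ) triple listed in postorder, and the helper bucket_index is our illustration of identifying B([v]_i, x) with a slot of A.

```python
# Sketch of embedding the relevant buckets B([v]_i, x) into one array A.
# Each component is abbreviated to a (name, ix0, Delta) triple in postorder;
# the names and the helper bucket_index are ours, not the paper's.
comps = [("a", 3, 2), ("b", 0, 0), ("c", 5, 3)]   # (name, ix_0, Delta)

offset = {}
N = 0
for name, ix0, delta in comps:      # prefix sums N([v]_i)
    offset[name] = N
    N += delta + 1                  # Delta + 1 relevant buckets per component

def bucket_index(name, x):
    """Slot of A for B(component, x); irrelevant x goes to waste bucket A(N)."""
    ix0, delta = next((i0, d) for (nm, i0, d) in comps if nm == name)
    if ix0 <= x <= ix0 + delta:
        return x - ix0 + offset[name]
    return N                        # waste bucket

print(N, bucket_index("a", 4), bucket_index("b", 0), bucket_index("c", 9))
# 8 1 3 8
```

Each component's relevant range occupies a contiguous block of A starting at its offset, so both the layout and each lookup take constant time per bucket.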

6.3. BUCKETING UNVISITED CHILDREN. Let U denote the unvisited subforest of T. An unvisited component [v]_i is a child of a visited component if and only if [v]_i is the root of a tree in U. In Section 8, we will present a data structure that, for the changing set of roots [v]_i in U, maintains the changing values min D([v]_i^-) in linear total time. Assuming the above data structure, the rest of this subsection presents the pseudo-code needed to maintain that every unvisited child [w]_h of a visited component [v]_i is correctly bucketed in B([v]_i, min D([w]_h^-) >> (i-1)).