## Logarithmic Space

### reachability Is NL-Complete

• reachability ∈ NL (p. 95).

• Suppose L is decided by a log n space-bounded TM N.

• Given input x, construct in logarithmic space the polynomial-sized configuration graph G of N on input x (see Theorem 21 on p. 176).

• G has a single initial node, call it 1.

• Assume G has a single accepting node n.

• x ∈ L if and only if the instance of reachability has a “yes” answer.
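The reduction above turns membership in L into a reachability question on the configuration graph. A minimal sketch of the reachability test itself, run on an explicitly given toy graph (the real reduction never stores G; it generates edges on demand in logarithmic space, and the names below are illustrative):

```python
from collections import deque

def reachable(graph, start, target):
    """Breadth-first search: is `target` reachable from `start`?

    `graph` maps each node to a list of successors.  Here it stands in
    for the configuration graph G of N on input x."""
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        if u == target:
            return True
        for v in graph.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return False

# Toy configuration graph: node 1 is the initial configuration and
# node 4 the unique accepting one.
g = {1: [2, 3], 2: [4], 3: []}
assert reachable(g, 1, 4)        # N would accept x
assert not reachable(g, 3, 4)
```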

### 2sat Is NL-Complete

• 2sat ∈ NL (p. 265).

• As NL = coNL (p. 191), it suffices to reduce the coNL-complete unreachability to 2sat.

• Start, without loss of generality, with an acyclic graph G.

• Identify each edge (x, y) with clause ¬x ∨ y.

• Add clauses (s) and (¬t) for the start and target nodes s and t.

• The resulting 2sat instance is satisfiable if and only if t is not reachable from s in G.
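The clause construction can be sketched in a few lines; the brute-force satisfiability check below is only for verifying the correspondence on toy instances (a real 2sat decision would use the implication-graph method; the helper names are made up here):

```python
from itertools import product

def edges_to_2sat(edges, s, t):
    """Identify edge (x, y) with the clause (not-x OR y); add units (s), (not-t).
    A literal is a pair (variable, polarity)."""
    clauses = [[(x, False), (y, True)] for x, y in edges]
    clauses.append([(s, True)])     # clause (s)
    clauses.append([(t, False)])    # clause (not-t)
    return clauses

def satisfiable(clauses, variables):
    """Exhaustive search over assignments; fine for tiny instances."""
    for bits in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, bits))
        if all(any(assignment[v] == pol for v, pol in c) for c in clauses):
            return True
    return False

# Acyclic graph: path 1 -> 2 -> 3, plus an isolated node 4.
edges = [(1, 2), (2, 3)]
# t = 3 is reachable from s = 1, so the instance is unsatisfiable.
assert not satisfiable(edges_to_2sat(edges, 1, 3), [1, 2, 3, 4])
# t = 4 is unreachable from s = 1, so the instance is satisfiable.
assert satisfiable(edges_to_2sat(edges, 1, 4), [1, 2, 3, 4])
```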

### The Class RL

• reachability is for directed graphs.

• It is not known if undirected reachability is in L.

• But it is in randomized logarithmic space, called RL.

• RL is RP in which the space bound is logarithmic.

• We shall prove that undirected reachability ∈ RL.^{a}

• As a note, undirected reachability ∈ coRL.^{b}

^a Aleliunas, Karp, Lipton, Lovász, and Rackoff (1979).

^b Borodin, Cook, Dymond, Ruzzo, and Tompa (1989).

### Random Walks

• Let G = (V, E) be an undirected graph with 1, n ∈ V .

• Add self-loops { i, i } at each node i.

• The randomized algorithm for testing if there is a path from 1 to n is a random walk.

### The Random Walk Framework

1: x := 1;
2: while x ≠ n do
3:   Pick y uniformly from x’s neighbors (including x);
4:   x := y;
5: end while
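A direct Python rendering of the framework; `adj[i]` is assumed to list i's neighbors with the self-loop included. Note that, like the framework itself, this sketch never terminates when n is unreachable:

```python
import random

def random_walk(adj, start, target, rng=random):
    """Walk from `start`, moving to a uniformly random neighbor
    (self-loop included) until `target` is hit; return the step count."""
    x, steps = start, 0
    while x != target:
        x = rng.choice(adj[x])
        steps += 1
    return steps

# A 4-node path graph, each node carrying its self-loop.
adj = {1: [1, 2], 2: [1, 2, 3], 3: [2, 3, 4], 4: [3, 4]}
random.seed(0)
assert random_walk(adj, 1, 4) >= 3   # at least the graph distance
```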

### Some Terminology

• v_t is the node visited by the random walk at time t.

• In particular, v_0 = 1.

• d_i denotes the degree of i (including the self-loops).

• Let p_t[ i ] = prob[ v_t = i ].

### A Convergence Result

Lemma 102 If G = (V, E) is connected, then lim_{t→∞} p_t[ i ] = d_i/(2·| E |) for all nodes i.

• Here is the intuition.

• The random walk algorithm picks the edges uniformly randomly.

• In the limit, the algorithm will be well “mixed” and forget about the initial node.

• Then the probability of each node being visited is proportional to its number of incident edges.

• Finally, observe that Σ_{i=1}^{n} d_i = 2·| E |.
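The lemma can be checked numerically: iterate the distribution one step at a time and compare the result with d_i/(2·|E|). The small graph and the helper `step` below are illustrative choices, not part of the original argument:

```python
def step(p, adj):
    """One step of the walk at the distribution level: node i sends
    p[i]/d_i along each incident edge (self-loop included)."""
    q = {i: 0.0 for i in adj}
    for i, mass in p.items():
        share = mass / len(adj[i])
        for j in adj[i]:
            q[j] += share
    return q

# Connected graph with a self-loop at every node.
adj = {1: [1, 2, 3], 2: [1, 2, 3], 3: [1, 2, 3, 4], 4: [3, 4]}
deg = {i: len(ns) for i, ns in adj.items()}
total = sum(deg.values())                # equals 2*|E|

p = {1: 1.0, 2: 0.0, 3: 0.0, 4: 0.0}    # the walk starts at node 1
for _ in range(200):
    p = step(p, adj)
for i in adj:                            # p_t[i] -> d_i/(2*|E|)
    assert abs(p[i] - deg[i] / total) < 1e-9
```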

### Proof of Lemma 102

• Let δ_t[ i ] = p_t[ i ] − d_i/(2·| E |), the deviation.

• Define ∆_t = Σ_{i∈V} |δ_t[ i ]|, the total absolute deviation.

• Now we calculate the p_{t+1}[ i ]’s from the p_t[ i ]’s.

• Each node divides its p_t[ i ] into d_i equal parts and distributes them to its neighbors.

• Each node adds those portions from its neighbors (including itself) to form p_{t+1}[ i ].

### The Flows

(Figure: node i splits p_t[ i ] into d_i equal shares of p_t[ i ]/d_i, one per incident edge, and assembles p_{t+1}[ i ] from the shares p_t[ x_j ]/d_{x_j} sent by its neighbors x_1, x_2, . . .)

### Proof of Lemma 102 (continued)

• p_t[ i ] = δ_t[ i ] + d_i/(2·| E |) by definition.

• Splitting and giving away the d_i/(2·| E |) part does not affect p_{t+1}[ i ] because the same 1/(2·| E |) is exchanged between any two neighbors.

• So we only consider the splitting of the δ_t[ i ] part.

• The δ_t[ i ]’s are exchanged between adjacent nodes.

### Proof of Lemma 102 (continued)

• Clearly Σ_i δ_{t+1}[ i ] = Σ_i δ_t[ i ] because of conservation.

• But ∆_{t+1} = Σ_i |δ_{t+1}[ i ]| ≤ Σ_i |δ_t[ i ]| = ∆_t.

– If the δ_t[ i ]’s are all of the same sign, then ∆_{t+1} = Σ_i |δ_{t+1}[ i ]| = Σ_i |δ_t[ i ]| = ∆_t.

– When δ_t[ i ]’s of opposite signs meet at a node, that will reduce Σ_i |δ_{t+1}[ i ]|.

• We next quantify the decrease ∆_t − ∆_{t+1}.
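Both conservation and the monotone decrease of ∆_t can be observed numerically (the graph and helper names are again illustrative):

```python
def step(p, adj):
    """Distribution after one step of the walk (self-loops included)."""
    q = {i: 0.0 for i in adj}
    for i, mass in p.items():
        share = mass / len(adj[i])
        for j in adj[i]:
            q[j] += share
    return q

adj = {1: [1, 2], 2: [1, 2, 3], 3: [2, 3, 4], 4: [3, 4]}
deg = {i: len(ns) for i, ns in adj.items()}
D = sum(deg.values())                    # 2*|E|

p = {1: 1.0, 2: 0.0, 3: 0.0, 4: 0.0}
prev = sum(abs(p[i] - deg[i] / D) for i in adj)   # Delta_0
for _ in range(50):
    p = step(p, adj)
    # Conservation: the deviations always sum to zero.
    assert abs(sum(p[i] - deg[i] / D for i in adj)) < 1e-12
    cur = sum(abs(p[i] - deg[i] / D) for i in adj)
    assert cur <= prev + 1e-12           # Delta never increases
    prev = cur
```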

### Proof of Lemma 102 (continued)

• There is a node i^+ with δ_t[ i^+ ] ≥ ∆_t/(2·| V |), and there is a node i^− with δ_t[ i^− ] ≤ −∆_t/(2·| V |).

– Recall that Σ_i δ_t[ i ] = 0 and Σ_{i∈V} |δ_t[ i ]| = ∆_t.

– So the sum of the nonnegative δ_t[ i ]’s equals ∆_t/2.

– As there are at most | V | such δ_t[ i ], there must be one with magnitude at least (∆_t/2)/| V |.

– Similarly for the negative δ_t[ i ]’s.

### Proof of Lemma 102 (continued)

• There is a path [ i_0 = i^+, i_1, i_2, . . . , i_{2m} = i^− ] with an even number of edges between i^+ and i^−.

– Add self-loops to make it true.

• The positive deviation δ_t[ i^+ ] from i^+ will travel along this path for m steps, always subdivided by the degree of the current node.

• Similarly for the negative deviation δ_t[ i^− ] from i^−.

### Proof of Lemma 102 (continued)

• At least a positive deviation equal to (1/| V |)^m of the original amount will arrive at the middle node i_m.

• Similarly for a negative deviation from the opposite direction.

• So after m ≤ n steps, a positive deviation of at least ∆_t/(2·| V |^n) will cancel an equal amount of negative deviation.

• We do not need to care about cases where deviations of the same sign meet at a node; they will not change ∆_t.

### Proof of Lemma 102 (concluded)

• So in n steps the total absolute deviation decreases from ∆_t to at most ∆_t(1 − 1/| V |^n).

• But we already knew that ∆_t will never increase.^a

• So in the limit, ∆_t → 0 (though exponentially slowly).

^a Contributed by Mr. Chih-Duo Hong (R95922079) on January 11, 2007.

### First Return Times

• Lemma 102 (p. 783) and the theory of Markov chains^a imply that the walk returns to i every 2·| E |/d_i steps, asymptotically and on average.

• Equivalently, if v_t = i, then the expected time until the walk comes back to i for the first time after t is 2·| E |/d_i, asymptotically.

– This is called the mean recurrence time.

^a Particularly, the theory of homogeneous Markov chains on first passage times.

### First Return Times (concluded)

• Although the above is an asymptotic statement, the expected return time is actually the same for any t, including the beginning, t = 0.

• So from the beginning onwards, the expected time between two successive visits to node i is exactly 2·| E |/d_i.
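A quick simulation of the mean recurrence time on a small graph; the graph, seed, number of steps, and tolerance are all arbitrary choices for illustration:

```python
import random

random.seed(42)
# 4-node path graph with self-loops; deg = (2, 3, 3, 2), so 2*|E| = 10.
adj = {1: [1, 2], 2: [1, 2, 3], 3: [2, 3, 4], 4: [3, 4]}
deg = {i: len(ns) for i, ns in adj.items()}
D = sum(deg.values())

target = 1                     # predicted mean recurrence time D/d_1 = 5
x, returns, steps = target, 0, 0
for _ in range(200_000):
    x = random.choice(adj[x])
    steps += 1
    if x == target:
        returns += 1
mean_return = steps / returns
assert abs(mean_return - D / deg[target]) < 0.3
```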

### Average Time To Reach Target Node n

• Assume there is a path [ 1, i_1, . . . , i_m = n ] from 1 to n.

– If there is none, we are done because the algorithm then returns no false positives.

• Starting from 1, we will return to 1 every expected
2 · | E |/d_{1} steps.

• Every cycle of leaving and returning uses at least two edges of 1.

– They may be identical.

### Average Time To Reach Target Node n (continued)

• So after an expected d_1/2 of such returns, the walk will head to i_1.

– There are d_1^2 pairs of edges incident on node 1 used for the cycles.

– Among them, d_1 of them leave node 1 by way of i_1, and d_1 of them return by way of i_1.

• The expected number of steps is (d_1/2) · (2·| E |/d_1) = | E |.

### Average Time To Reach Target Node n (concluded)

• Repeat the above argument from i_{1}, i_{2}, . . .

• After an expected number of ≤ n · |E| steps, we will have arrived at node n.

• Markov’s inequality (p. 410) suggests that we run the algorithm for 2n·| E | steps to obtain the desired probability of success, 0.5.

### Probability To Visit All Nodes

Corollary 103 With probability at least 0.5, the random walk algorithm visits all nodes in 2n · | E | steps.

• Repeat the above arguments for this particular path: [ 1, 2, . . . , n ].

### The Complete Algorithm

1: x := 1;
2: c := 0;
3: while x ≠ n and c < 2n·| E | do
4:   Pick y uniformly from x’s neighbors (including x);
5:   x := y;
6:   c := c + 1;
7: end while
8: if x = n then
9:   “yes”;
10: else
11:   “no”;
12: end if
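A Python sketch of the complete algorithm, assuming adjacency lists with self-loops included (the edge count below is rough bookkeeping, since each self-loop appears once in its list):

```python
import random

def undirected_reachability(adj, s, t, rng=random):
    """Random walk from s, capped at 2*n*|E| steps; answer whether t
    was hit.  One-sided error: a "no" answer is wrong with probability
    at most 1/2 when t is in fact reachable."""
    n = len(adj)
    num_edges = sum(len(ns) for ns in adj.values()) // 2
    x, c = s, 0
    while x != t and c < 2 * n * num_edges:
        x = rng.choice(adj[x])
        c += 1
    return x == t

adj = {1: [1, 2], 2: [1, 2, 3], 3: [2, 3, 4], 4: [3, 4], 5: [5]}
random.seed(1)
assert not undirected_reachability(adj, 1, 5)   # disconnected: never "yes"
hits = sum(undirected_reachability(adj, 1, 4) for _ in range(200))
assert hits > 50                                # succeeds well over half the time
```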

### Some Graph-Theoretic Notions

• A d-regular (undirected) graph has degree d for each node.

• Let G be d-regular.

• Each node’s d incident edges are labeled 1 to d.

– An edge is labeled at both ends.

(Figure: example 3-regular graphs with every node’s incident edges labeled 1, 2, 3 at both ends.)

### Universal Sequences


• A sequence of numbers between 1 and d results in a walk on the graph if given the starting node.

– E.g., (1, 3, 2, 2, 1, 3) from node 1.

• A sequence of numbers between 1 and d is called universal for d-regular graphs with n nodes if:

– For any labeling of any n-node d-regular graph G, and for any starting node, all nodes of G are visited.

– A node may be visited more than once.

• Useful for museum visitors, security guards, etc.
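A label sequence drives a deterministic walk once a labeled graph and a start node are fixed. A sketch on a labeled 4-cycle (the graph, its labeling, and `walk_by_labels` are all illustrative):

```python
def walk_by_labels(edge_label, start, sequence):
    """Follow a sequence of labels on a labeled d-regular graph.
    `edge_label[(i, k)]` is the neighbor reached from node i via the
    edge labeled k at i.  Returns the set of visited nodes."""
    x = start
    visited = {start}
    for k in sequence:
        x = edge_label[(x, k)]
        visited.add(x)
    return visited

# The 4-cycle, 2-regular: at every node, label 1 is the clockwise
# edge and label 2 the counterclockwise one.
cycle = {}
for i in range(1, 5):
    cycle[(i, 1)] = i % 4 + 1            # clockwise neighbor
    cycle[(i, 2)] = (i - 2) % 4 + 1      # counterclockwise neighbor

# (1, 1, 1) goes clockwise around the cycle, so it visits every node
# from any start, i.e., it covers this particular labeled graph.
for start in range(1, 5):
    assert walk_by_labels(cycle, start, (1, 1, 1)) == {1, 2, 3, 4}
```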

### Existence of Universal Sequences

Theorem 104 For any n, a universal sequence exists for the set of d-regular connected undirected n-node graphs.

• Enumerate all the different labelings of d-regular n-node connected graphs and all starting nodes.

• Call them (G_{1}, v_{1}), (G_{2}, v_{2}), . . . (finitely many).

• S_{1} is a sequence that traverses G_{1}, starting from v_{1}.
– A spanning tree will accomplish this.

• S_{2} is a sequence that traverses G_{2}, starting from the
node at which S_{1} ends when applied to (G_{2}, v_{2}).

### The Proof (concluded)

• S_{3} is a sequence that traverses G_{3}, starting from the
node at which S_{1}S_{2} ends when applied to (G_{3}, v_{3}), etc.

• The sequence S ≡ S_{1}S_{2}S_{3} · · · is universal.

– Suppose S starts from node v of a labeled d-regular n-node graph G′.

– Let (G′, v) = (G_k, v_k), the kth enumerated pair.

– By construction, S_k will traverse G′ (if not earlier).

### An O(n^3 log n) Bound on Universal Sequences

Theorem 105 For any n and d, a universal sequence of length O(n^3 log n) for d-regular n-node connected graphs exists.

• Fix a d-regular labeled n-node graph G.

• A random walk of length 2n·| E | = n^2 d = O(n^2) fails to traverse G with probability at most 1/2.

– By Corollary 103 (p. 797).

– This holds wherever the walk starts.

• The failure probability for G drops to 2^{−Θ(n log n)} if the random walk has length Θ(n^3 log n).

### The Proof (continued)

• There are 2^{O(n log n)} d-regular labeled n-node graphs.

– Each node has ≤ n^d choices of neighbors.

– So there are ≤ n^{dn} d-regular graphs on nodes { 1, 2, . . . , n }.

– Each node’s d edges are labeled with unique integers between 1 and d.

– Hence the count is ≤ n^{dn}(d!)^n = n^{O(n)} = 2^{O(n log n)}.
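For fixed d, the arithmetic behind the final bound can be spelled out: each of the n nodes has at most n^d choices of neighbors and at most d! labelings of its incident edges, and d! ≤ d^d ≤ n^d, so

```latex
\underbrace{(n^{d})^{n}}_{\text{neighbors}} \cdot
\underbrace{(d!)^{n}}_{\text{labels}}
\;\le\; n^{dn} \cdot (n^{d})^{n}
\;=\; n^{2dn}
\;=\; 2^{\,2dn \log_2 n}
\;=\; 2^{O(n \log n)} \quad (d \text{ fixed}).
```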

### The Proof (concluded)

• The probability that there exists a d-regular labeled n-node graph that the random walk fails to traverse can be made at most 1/2.

– Increase the length of the walk suitably.

• Because the probability is less than one, there exists a walk that traverses all labeled d-regular graphs.