Tracking random walks with geodesics

In this post I’ll tell you about a property of random walks on hyperbolic groups. To make a random walk on a group G, just start from the identity in the Cayley graph, then move to a neighbor with uniform probability, and keep going. For each n this gives a probability measure \mu_n on G telling you the probability of ending up at any given g\in G; \mu_n is the law of a random variable X_n, and the sequence \{X_n\} is called the (simple) random walk.
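To make this concrete, here is a minimal simulation sketch on the free group F_2=\langle a,b\rangle, the most hands-on example of a non-virtually cyclic hyperbolic group (the code and all names in it are mine, purely for illustration):

```python
import random

# Simple random walk on F_2 = <a, b>: elements are stored as reduced
# words over a, b and their inverses ("A" stands for a^{-1}, "B" for b^{-1}).
GENS = ["a", "A", "b", "B"]
INVERSE = {"a": "A", "A": "a", "b": "B", "B": "b"}

def step(word):
    """Multiply the reduced word by a uniformly chosen generator, then reduce."""
    g = random.choice(GENS)
    if word and INVERSE[word[-1]] == g:
        return word[:-1]          # free cancellation
    return word + g

def walk(n):
    """Trajectory X_0 = 1, X_1, ..., X_n as reduced words."""
    traj = [""]
    for _ in range(n):
        traj.append(step(traj[-1]))
    return traj

# The word length of X_n is its distance from the identity in the Cayley graph.
print(len(walk(1000)[-1]))
```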

Theorem: Let \{X_n\} be a simple random walk on a non-virtually cyclic hyperbolic group G. Then the expected Hausdorff distance between \{X_i\}_{i\leq n} and a geodesic [1,X_n] connecting 1 to X_n is O(\log(n)).

Surprisingly, I couldn’t find this result in the literature, except for free groups (Ledrappier) [EDIT: a reference has been pointed out to me (Corollary 3.9). The proof there is quite different, and the aims of that paper go deeper.]. This post contains an essentially complete proof of the theorem. It’ll be a bit more technical than usual, but I just couldn’t resist blogging the full proof, as it’s so short. I’m almost done writing up the same result for relatively hyperbolic groups (as well as a related result for mapping class groups). [EDIT: here it is.]

First, we show that any point on [1,X_n] is close to a point in \alpha_n=\{X_i\}_{i\leq n}. This has very little to do with random walks; it’s just a fact about paths connecting the endpoints of a geodesic in a hyperbolic space.

Lemma 1: There exists C_1\geq 1 so that if the path \alpha (say of length at least 2) connects the endpoints of a geodesic \gamma in a \delta-hyperbolic space and p\in\gamma, then d(p,\alpha)\leq C_1\log(length(\alpha)).

Another way of saying this is: hyperbolic spaces have (at least) exponential divergence.

The proof is not hard; you can find all the details in Bridson-Haefliger’s book, Proposition III.H.1.6.
The procedure is the following: split \alpha into two subpaths of equal length and consider the geodesic triangle as in the picture:

[Figure: Divergence in hyperbolic spaces]

p is close to some point p_1 on one of the other two sides. Then you repeat using p_1 instead of p; each step costs at most \delta and halves the subpath of \alpha under consideration, so after logarithmically many steps you reach \alpha.
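Made quantitative, the recursion gives the following estimate (a sketch of the bookkeeping; the constant in Bridson-Haefliger’s statement is packaged slightly differently):

```latex
% Each step moves the current point at most \delta (thin triangles) and
% halves the subpath of \alpha it is associated to; after
% k = \lceil \log_2 \mathrm{length}(\alpha) \rceil steps that subpath has
% length at most 1, so the current point is within 1 of \alpha.  Hence
d(p,\alpha) \;\le\; \delta \big\lceil \log_2 \mathrm{length}(\alpha) \big\rceil + 1.
```

Since length(\alpha)\geq 2, the right-hand side is at most C_1\log(length(\alpha)) for some C_1 depending only on \delta.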

And now we have to exclude the existence of large detours, like the one in the following picture:

[Figure: no detours]

The idea is that the first and last points X_{i_0},X_{i_1} of the detour will be close to some point on [1,X_n], and hence close to each other; the details are below. In any case, this motivates showing that if |i-j| is at least logarithmically large, then d(X_i,X_j) is bounded below linearly in |i-j|.

Lemma 2: There exists C_2 so that
\mathbb{P}[\exists i,j\leq n: |i-j|\geq C_2\log(n), d(X_i,X_j)\leq |i-j|/C_2]  =o(n^{-2}).

The “|i-j|\geq C_2\log(n)” part is there just to make the proof work, but we’re happy if |i_0-i_1|\leq C_2\log(n) for i_0,i_1 as in the picture above, because this already gives a logarithmic bound on the length of the detour.

The proof is based on the following classical inequality due (as far as I know) to Kesten, which holds for nonamenable groups: There exists K>0 so that

\mathbb{P}[d(1,X_n)\leq n/K]\leq Ke^{-n/K}.

In words, the probability that X_n does not make linear progress decays exponentially.
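Here is one way to extract this from Kesten’s theorem (a sketch of my reconstruction, not necessarily the original argument): nonamenability means the Markov operator of the walk has spectral radius \rho<1, so \mathbb{P}[X_n=g]\leq\rho^n for every g, while the ball of radius r in the Cayley graph contains at most v^r elements for some v>1. Summing over the ball of radius n/K:

```latex
\mathbb{P}\big[d(1,X_n)\le n/K\big]
  \;=\; \sum_{g\,:\,d(1,g)\le n/K} \mathbb{P}[X_n=g]
  \;\le\; v^{n/K}\,\rho^{\,n}
  \;=\; \exp\Big(\!-n\big(\log(1/\rho)-\tfrac{\log v}{K}\big)\Big),
```

which decays exponentially once K>\log(v)/\log(1/\rho); enlarging K if needed gives the stated form Ke^{-n/K}.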

In order to prove the lemma, we just have to sum these inequalities up (using the fact that d(X_i,X_j) has the same distribution as d(1,X_{|i-j|})).
Fixing i first (n possibilities) and letting the difference between i and j vary, we see that the probability in the statement of Lemma 2 is at most

n\times \sum_{j\geq C_2\log(n)} Ke^{-j/K}\leq K n\cdot n^{-C_2/K} \times \sum_{j\geq 0} e^{-j/K},

and this is o(n^{-2}) for C_2 large enough: the sum on the right is a constant, so the whole expression is a constant times n^{1-C_2/K}, and C_2>3K does the job.

OK, the endgame. If there are no points of \alpha_n at distance more than C_1\log(n) from [1,X_n], we are happy. Otherwise, let i be such that d(X_i,[1,X_n])> C_1\log(n). By Lemma 1, each point of [1,X_n] is within C_1\log(n) of \alpha_n, hence of either \alpha^-_n=\{X_j\}_{j<i} or \alpha^+_n=\{X_j\}_{j>i} (it cannot be that close to X_i). As you move from 1 to X_n along the geodesic you are at first close to \alpha^-_n and at the end close to \alpha^+_n, so at some point p you are close to both. In particular, there are i_0<i<i_1 so that d(X_{i_0},X_{i_1})\leq 2C_1\log(n). By Lemma 2 (whose bad event we may assume does not occur) this forces |i_0-i_1|\leq 2C_1C_2\log(n). Hence d(X_i,X_{i_0})\leq |i-i_0|\leq|i_0-i_1| is also logarithmically bounded, and so is d(X_i,[1,X_n])\leq d(X_i,X_{i_0})+d(X_{i_0},[1,X_n]). The end.
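Not part of the proof, but one can sanity-check the statement numerically on F_2, where the geodesic [1,X_n] is simply the set of prefixes of the reduced word representing X_n. A rough sketch (all names are mine, for illustration only):

```python
import math
import random

GENS = ["a", "A", "b", "B"]                     # "A" = a^{-1}, "B" = b^{-1}
INVERSE = {"a": "A", "A": "a", "b": "B", "B": "b"}

def step(word):
    """Multiply the reduced word by a uniform generator and free-reduce."""
    g = random.choice(GENS)
    if word and INVERSE[word[-1]] == g:
        return word[:-1]
    return word + g

def dist(u, v):
    """Word metric on F_2 for reduced words u, v: strip the common prefix."""
    k = 0
    for x, y in zip(u, v):
        if x != y:
            break
        k += 1
    return (len(u) - k) + (len(v) - k)

def hausdorff(A, B):
    """Hausdorff distance between two finite sets of reduced words."""
    d_AB = max(min(dist(a, b) for b in B) for a in A)
    d_BA = max(min(dist(a, b) for a in A) for b in B)
    return max(d_AB, d_BA)

if __name__ == "__main__":
    n = 400
    traj = [""]
    for _ in range(n):
        traj.append(step(traj[-1]))
    # In F_2 the geodesic [1, X_n] consists of the prefixes of X_n.
    geodesic = [traj[-1][:i] for i in range(len(traj[-1]) + 1)]
    print("Hausdorff distance:", hausdorff(traj, geodesic))
    print("log(n):            ", round(math.log(n), 2))
```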
