Analyticity of the entropy and the escape rate of random walks in hyperbolic groups
- Sebastien Gouezel, LMJL, Nantes, CNRS
Editorial introduction
Analyticity of the entropy and the escape rate of random walks in hyperbolic groups, Discrete Analysis 2017:7, 37pp.
Let $\mu$ be a probability measure on an infinite finitely generated group $G$. We can use $\mu$ to define a random walk: we start at the identity, and then at each step, if we are at a point $g$, we move to $gh$ with probability $\mu(h)$. The distribution of the element of $G$ arrived at after $n$ steps of this random walk is given by the $n$-fold convolution $\mu^{*n}$ of $\mu$.
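To make this concrete, here is a minimal Python sketch taking the free group $F_2$, with the uniform measure on its four standard generators, as a running example; the reduced-word representation and all names in it are illustrative choices of ours, not anything taken from the paper.

```python
import random
from collections import Counter

# Elements of the free group F_2 as reduced words: tuples over
# {1, -1, 2, -2}, where 1 and 2 stand for the generators a and b,
# and -1, -2 for their inverses.  The empty tuple is the identity.

def mul(w, v):
    """Multiply two reduced words, cancelling at the junction."""
    out = list(w)
    for g in v:
        if out and out[-1] == -g:
            out.pop()          # cancellation: x x^{-1} = e
        else:
            out.append(g)
    return tuple(out)

# A step distribution mu supported on the four generators.
mu = {(1,): 0.25, (-1,): 0.25, (2,): 0.25, (-2,): 0.25}

def convolve(p, q):
    """Convolution of distributions: (p * q)(g) = sum_h p(h) q(h^{-1} g)."""
    r = Counter()
    for w, pw in p.items():
        for v, qv in q.items():
            r[mul(w, v)] += pw * qv
    return dict(r)

def convolution_power(mu, n):
    """Exact distribution mu^{*n} of the walk after n steps."""
    p = {(): 1.0}
    for _ in range(n):
        p = convolve(p, mu)
    return p

def sample_walk(mu, n):
    """One sample of the position after n steps, as a reduced word."""
    pos = []
    for step in random.choices(list(mu), weights=list(mu.values()), k=n):
        for g in step:
            if pos and pos[-1] == -g:
                pos.pop()
            else:
                pos.append(g)
    return tuple(pos)
```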
The entropy $H(\nu)$ of a probability distribution $\nu$ on a discrete set $X$ is defined to be $\sum_{x\in X}\nu(x)\log(1/\nu(x))$. Given a random walk as just described, one can define its limiting entropy to be $\lim_{n\to\infty} n^{-1}H(\mu^{*n})$, the rough idea being that we expect the random walk to spread out in such a way that a typical probability is exponentially small (where “typical” is in terms of the probability distribution itself), and then the limiting entropy gives us the relevant exponent.
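Incidentally, the limit exists because $Z_{m+n}$ is the product of $Z_m$ and an independent copy of $Z_n$, so $H(\mu^{*(m+n)}) \le H(\mu^{*m}) + H(\mu^{*n})$, and Fekete's subadditivity lemma gives $\lim_n n^{-1}H(\mu^{*n}) = \inf_n n^{-1}H(\mu^{*n})$. Continuing the sketch above (reusing `mu` and `convolution_power`), one can compute $n^{-1}H(\mu^{*n})$ exactly for small $n$:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = sum_x p(x) log(1 / p(x))."""
    return sum(px * math.log(1.0 / px) for px in p.values() if px > 0)

# n^{-1} H(mu^{*n}) for the walk on F_2; by subadditivity and Fekete's
# lemma, every printed value is an upper bound for the limiting entropy.
for n in (1, 2, 4, 8, 10):
    print(n, entropy(convolution_power(mu, n)) / n)
```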
The limiting entropy is easily seen to be zero for some groups. For example, the standard random walk in $\mathbb{Z}^d$ will be supported in a set of size less than $(2n+1)^d$ after $n$ steps, so its entropy will grow at most logarithmically with $n$. But for other groups, the limiting entropy is positive: for example, the standard random walk on the free group with $d$ generators spreads out much more quickly.
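The contrast is easy to see numerically. The sketch below (reusing the `entropy` helper above) computes the exact distribution of the simple random walk on $\mathbb{Z}$; since that distribution is roughly a Gaussian of width $\sqrt{n}$, its entropy grows like $\tfrac12\log n$, and $n^{-1}H(\mu^{*n})$ tends to $0$.

```python
from collections import Counter

def convolution_power_Z(n):
    """Exact distribution of the simple random walk on Z after n steps."""
    p = {0: 1.0}
    for _ in range(n):
        q = Counter()
        for x, px in p.items():
            q[x - 1] += 0.5 * px
            q[x + 1] += 0.5 * px
        p = dict(q)
    return p

# Entropy per step tends to 0 on Z, unlike on the free group above.
for n in (10, 100, 1000):
    print(n, entropy(convolution_power_Z(n)) / n)
```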
A closely related quantity is the escape rate. Given a symmetric set of generators of the group, we can define the distance between two elements $x$ and $y$ to be the length of the shortest word in the generators that is equal to $x^{-1}y$. The escape rate of $\mu$ is the limit as $n$ tends to infinity of $n^{-1}$ times the expected distance from the identity after $n$ steps of the associated random walk. Again, this is zero for groups like $\mathbb{Z}^d$, at least if the measure $\mu$ is symmetric, and positive for groups that are more like free groups.
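On the free group with its standard generators, the word metric is just the length of the reduced word, so the escape rate is easy to estimate by simulation. Here is a small Monte Carlo sketch reusing `sample_walk` and `mu` from the first code block; away from the identity this walk increases the word length with probability $3/4$ and decreases it with probability $1/4$, so the estimate should come out close to $1/2$.

```python
# Monte Carlo estimate of n^{-1} E[d(e, Z_n)] for the walk on F_2.
n, trials = 1000, 500
total = sum(len(sample_walk(mu, n)) for _ in range(trials))
print(total / (trials * n))
```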
In 1987, Gromov introduced the important and influential notion of a hyperbolic group. We shall not give the definition here, but roughly speaking a group is hyperbolic if it is more like a free group than like a group $\mathbb{Z}^d$. In such groups, an interesting question arises. Suppose we look at all the probability measures $\mu$ defined on some finite set $A$ that generates the whole group. Each one has a limiting entropy, which is not zero (assuming that the hyperbolic group is not “elementary” – it turns out that finite groups and groups that are close to $\mathbb{Z}$, that is, virtually cyclic groups, are hyperbolic, and these are the elementary ones), so we can ask whether the limiting entropy depends in a nice way on the measure. The purpose of this paper is to give a very strong positive answer to this question, which was asked by Kaimanovich: the entropy and the escape rate depend analytically on the measure $\mu$. This improves on partial results by several authors, who established weaker properties such as continuity or differentiability, or proved analyticity under extra assumptions. A by-product of the proof is a new proof of the central limit theorem for random walks on hyperbolic groups.
The description of the entropy and the escape rate above as limits is not well suited to proving regularity statements: even continuity in terms of $\mu$ is not obvious. Indeed, there are examples of groups, such as the infinite dihedral group, for which the entropy does not depend continuously on the measure. One useful feature of random walks in hyperbolic groups is the existence of integral formulas for the entropy and the escape rate, which make precise the heuristic that these quantities measure the average increase in entropy and in distance to the origin at each step of the random walk. The main technical point of the paper is to show that the objects appearing in these integral formulas (called the stationary measure at infinity and the Martin kernel) depend analytically on the measure $\mu$.
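To give the flavour of these formulas, here is a schematic version (sign and inversion conventions vary between references, so this should be read as a sketch rather than as the exact statements used in the paper):

$$h(\mu) \;=\; -\sum_{g\in G}\mu(g)\int_{\partial G}\log\frac{d(g^{-1})_{*}\nu}{d\nu}(\xi)\,d\nu(\xi), \qquad \ell(\mu) \;=\; -\sum_{g\in G}\mu(g)\int_{\partial_h G}\xi(g)\,d\nu(\xi).$$

Here $\nu$ denotes the stationary measure on the boundary (in the second formula, on the horofunction boundary $\partial_h G$, whose points are horofunctions $\xi$ normalized to vanish at the identity), and in a hyperbolic group the Radon–Nikodym derivatives $d(g_*\nu)/d\nu$ are given by the Martin kernel $K(g,\cdot)$.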
The video below is of the first of four lectures (all available on YouTube) in which the author gives an introduction to the area of entropy and escape rates for hyperbolic groups. He does not discuss the result of this paper, but covers related results and useful background material.