Analyticity of the entropy and the escape rate of random walks in hyperbolic groups
- Sébastien Gouëzel, LMJL, Nantes, CNRS
Editorial introduction
Analyticity of the entropy and the escape rate of random walks in hyperbolic groups, Discrete Analysis 2017:7, 37pp.
Let $\mu$ be a probability measure on an infinite finitely generated group $G$. We can use $\mu$ to define a random walk: we start at the identity, and then at each step, if we are at a point $x$, we move to $xy$ with probability $\mu(y)$. The distribution of the element of $G$ arrived at after $n$ steps of this random walk is given by the $n$-fold convolution $\mu^{*n}$ of $\mu$.
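In symbols, writing $X_n$ for the position of the walk after $n$ steps, $X_n$ has law $\mu^{*n}$, where the convolutions are computed recursively by
$$\mu^{*1}=\mu, \qquad \mu^{*n}(x)=\sum_{y\in G}\mu^{*(n-1)}(y)\,\mu(y^{-1}x).$$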
The entropy of a probability distribution $p$ on a discrete set is defined to be $H(p)=-\sum_x p(x)\log p(x)$. Given a random walk as just described, one can define its limiting entropy to be $h=\lim_{n\to\infty}H(\mu^{*n})/n$, the rough idea being that we expect the random walk to spread out in such a way that a typical probability $\mu^{*n}(x)$ is exponentially small in $n$ (where “typical” is in terms of the probability distribution itself), and then the limiting entropy gives us the relevant exponent.
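As a sanity check of this heuristic, suppose $\mu^{*n}$ were exactly the uniform distribution on a set of $e^{cn}$ elements. Then every point of the support has probability $e^{-cn}$, so
$$H(\mu^{*n})=cn \qquad\text{and}\qquad h=\lim_{n\to\infty}\frac{H(\mu^{*n})}{n}=c,$$
the exponent governing the decay of typical probabilities.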
The limiting entropy is easily seen to be zero for some groups. For example, the standard random walk in $\mathbb{Z}^d$ will be supported in a set of size less than $(2n+1)^d$ after $n$ steps, so its entropy will grow at most logarithmically with $n$. But for other groups, the limiting entropy is positive: for example, the standard random walk on the free group with $k\geq 2$ generators spreads out much more quickly.
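To see quantitatively why the free group is different, consider the simple random walk on the free group $F_k$, in which each of the $2k$ generators and inverses is chosen with probability $1/(2k)$. As long as the walk is not at the identity, a step increases the word length by $1$ unless the chosen letter cancels the last letter of the current word, so the distance from the identity behaves like a random walk on the non-negative integers with upward drift
$$\frac{2k-1}{2k}-\frac{1}{2k}=1-\frac{1}{k}>0.$$
The walk therefore escapes linearly, and since it spreads out over the exponentially large sphere of radius roughly $(1-1/k)n$, its entropy grows linearly as well.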
A closely related quantity is the escape rate. Given a symmetric set of generators of the group, we can define the distance $d(x,y)$ between two elements $x$ and $y$ to be the length of the shortest word in the generators that is equal to $x^{-1}y$. The escape rate $\ell$ of $\mu$ is the limit as $n$ tends to infinity of $1/n$ times the expected distance from the identity after $n$ steps of the associated random walk. Again, this is zero for groups like $\mathbb{Z}^d$, at least if the measure $\mu$ is symmetric, and positive for groups that are more like free groups.
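The existence of this limit follows from subadditivity: $d(e,X_{n+m})\leq d(e,X_n)+d(X_n,X_{n+m})$, and $X_n^{-1}X_{n+m}$ has the same distribution as $X_m$ and is independent of $X_n$, so $a_n=\mathbb{E}\,d(e,X_n)$ satisfies $a_{n+m}\leq a_n+a_m$. Fekete's lemma then gives
$$\ell=\lim_{n\to\infty}\frac{\mathbb{E}\,d(e,X_n)}{n}=\inf_{n\geq 1}\frac{\mathbb{E}\,d(e,X_n)}{n},$$
and Kingman's subadditive ergodic theorem upgrades this to almost sure convergence of $d(e,X_n)/n$.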
In 1987, Gromov introduced the important and influential notion of a hyperbolic group. We shall not give the definition here, but roughly speaking a group is hyperbolic if it is more like a free group than like a group such as $\mathbb{Z}^d$. In such groups, an interesting question arises. Suppose we look at all the probability measures $\mu$ defined on some finite set that generates the whole group. Each one has a limiting entropy, which is not zero (assuming that the hyperbolic group is not “elementary” – it turns out that finite groups and groups that are close to $\mathbb{Z}$ are hyperbolic, and these are precisely the elementary ones), so we can ask whether the limiting entropy depends in a nice way on the measure. The purpose of this paper is to give a very strong positive answer to this question, which was asked by Kaimanovich: the entropy and the escape rate depend analytically on the measure $\mu$. This improves on partial results by several authors, who showed weaker properties such as continuity or differentiability, or proved analyticity under extra assumptions. A by-product of the proof is a new proof of the central limit theorem for these random walks.
The description of the entropy and the escape rate above as limits is not well suited to proving regularity statements: even the continuity in terms of $\mu$ is not obvious. Indeed, there are examples of groups, such as the infinite dihedral group, for which the entropy does not depend continuously on the measure. One useful feature of random walks in hyperbolic groups is the existence of integral formulas for the entropy and the escape rate, which make precise the heuristics that these quantities measure the average increase in entropy and distance to the origin at each step of the random walk. The main technical point in the paper is to show that the objects in these integral formulas (called the stationary measure at infinity and the Martin kernel) depend analytically on the measure $\mu$.
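Schematically, and up to the normalisation conventions spelled out in the paper, the entropy formula is of Furstenberg–Kaimanovich type: if $\nu$ denotes the $\mu$-stationary measure on the Gromov boundary $\partial G$, it takes the shape
$$h=-\sum_{g\in G}\mu(g)\int_{\partial G}\log\frac{dg_*\nu}{d\nu}(\xi)\,d\nu(\xi),$$
with the Radon–Nikodym derivatives expressed through the Martin kernel, while the escape rate is given by a similar integral of a Busemann-type cocycle against $\mu$ and $\nu$. Analyticity of $h$ and $\ell$ in $\mu$ thus reduces to the analytic dependence of $\nu$ and of the Martin kernel on $\mu$.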
The video below is of the first of four lectures (all available on YouTube) in which the author gives an introduction to the area of entropy and escape rates for hyperbolic groups. He does not discuss the result of this paper, but covers related results and useful background material.