The sixth moment of the Riemann zeta function and ternary additive divisor sums

Nathan Ng, Mathematics and Computer Science, University of Lethbridge
**ORCID iD:** 0000-0002-1660-4824

*Discrete Analysis*, July 2021. https://doi.org/10.19086/da.22057.

### Editorial introduction

The sixth moment of the Riemann zeta function and ternary additive divisor sums, Discrete Analysis 2021:6, 60 pp.

The Riemann hypothesis states that every non-trivial zero of the Riemann zeta function lies on the critical line $\Re(s) = 1/2$. It is considered by many to be the most important unsolved problem in mathematics, partly because it is a fundamental question that relates to many other questions, and partly because a large number of interesting number-theoretic statements have been shown to follow from it.

It turns out that some of these consequences follow from statements about the zeta function that are weaker than the Riemann hypothesis. In particular, one important weakening is the *Lindelöf hypothesis*. Like the Riemann hypothesis itself, this has several equivalent formulations. One of them states that $|\zeta(1/2+it)| = O(|t|^{\epsilon})$ for every $\epsilon > 0$. Another is that the number of zeros with real part at least $1/2+\epsilon$ and imaginary part between $T$ and $T+1$ is $o(\log T)$. (For comparison, the number of zeros without the condition on the real part is known to be $O(\log T)$.) If the Riemann hypothesis holds, there are no such zeros, so the Lindelöf hypothesis is indeed a weakening.
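For orientation, the unconditional $O(\log T)$ count just quoted comes from the classical Riemann–von Mangoldt formula: if $N(T)$ denotes the number of zeros with imaginary part in $(0,T]$, then

$$N(T) = \frac{T}{2\pi}\log\frac{T}{2\pi} - \frac{T}{2\pi} + O(\log T),$$

and differencing this formula at $T$ and $T+1$ shows that the number of zeros with imaginary part between $T$ and $T+1$ is $O(\log T)$.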

The first formulation is a pointwise statement about the function $t \mapsto \zeta(1/2+it)$. However, it is often easier to prove statements about averages, and it turns out that the Lindelöf hypothesis is also equivalent to the statement that

$$I_k(T) = \frac{1}{T}\int_0^T |\zeta(1/2+it)|^{2k}\,dt = O(T^{\epsilon})$$

for every $\epsilon > 0$. That is, for each $k$, the $2k$th moment of the zeta function grows more slowly than any positive power of $T$.

This formulation has the attractive property that it breaks the Lindelöf statement into cases that can be examined separately: one can try to obtain bounds on the moments for individual small values of $k$.

Unfortunately, even this problem seems to be very hard: Hardy and Littlewood proved in 1918 that the bound holds when $k=1$, and Ingham proved it for $k=2$ in 1926. But the case $k=3$ remains open (and therefore, since the moments increase with $k$, the problem is open for all $k \geq 3$), and it is now considered a major problem in its own right.
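In the proved case $k=1$ the classical results in fact give an asymptotic, $I_1(T) \sim \log T$, and the slow growth is easy to observe numerically. The following sketch (plain Python standard library; the function names `borwein_coeffs`, `zeta`, and `I1` are ours, chosen for illustration) evaluates $\zeta(1/2+it)$ by P. Borwein's accelerated alternating-series algorithm and estimates $I_1(T)$ with a trapezoid rule:

```python
import cmath
import math
from fractions import Fraction

def borwein_coeffs(n):
    # Coefficients d_0, ..., d_n from P. Borwein's algorithm for accelerating
    # the alternating zeta (eta) series; exact rational arithmetic throughout.
    d, s = [], Fraction(0)
    for i in range(n + 1):
        s += Fraction(math.factorial(n + i - 1) * 4 ** i,
                      math.factorial(n - i) * math.factorial(2 * i))
        d.append(n * s)
    return d

_N = 80
_D = borwein_coeffs(_N)
_COEF = [(-1) ** k * float(1 - _D[k] / _D[_N]) for k in range(_N)]
_LOGS = [math.log(k + 1) for k in range(_N)]

def zeta(s):
    # Riemann zeta via Borwein's accelerated alternating series; accurate for
    # moderate |Im(s)| (the truncation error grows with |Im(s)|).
    eta = sum(c * cmath.exp(-s * L) for c, L in zip(_COEF, _LOGS))
    return eta / (1 - 2 ** (1 - s))

def I1(T, step=0.05):
    # Trapezoid-rule estimate of I_1(T) = (1/T) * integral_0^T |zeta(1/2+it)|^2 dt.
    ts = [i * step for i in range(int(round(T / step)) + 1)]
    vals = [abs(zeta(0.5 + 1j * t)) ** 2 for t in ts]
    return step * (sum(vals) - 0.5 * (vals[0] + vals[-1])) / T
```

For instance, `I1(30)` comes out well below $\log 30 \approx 3.4$, reflecting the more precise classical asymptotic $T\log(T/2\pi) + (2\gamma - 1)T$ for the unnormalized second-moment integral.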

A more recent development, based on heuristics relating the distribution of zeros of the Riemann zeta function to eigenvalues of random matrices, is a series of surprisingly precise conjectures about the growth rate of the moments $I_k(T)$, which is believed to be of the form $C_k(\log T)^{k^2}$ for an explicitly defined constant $C_k$. These conjectures agree with the known results for $k=1,2$, and it is now known unconditionally that the conjectured order of magnitude is a lower bound for $I_k(T)$. Also, conditional on the Riemann hypothesis, it has been established as an upper bound, up to a constant factor.
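Written out, the conjectured asymptotic (with the normalization of $I_k(T)$ used above) is

$$I_k(T) \sim C_k(\log T)^{k^2} \qquad (T \to \infty).$$

The known cases correspond to $C_1 = 1$ (Hardy–Littlewood) and $C_2 = 1/(2\pi^2)$ (Ingham), and for the sixth moment the predicted constant is $C_3 = 42/9!$, as conjectured by Conrey and Ghosh.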

The main purpose of this paper is to obtain an improvement by a power of $T$ in the upper bound for $I_3(T)$. This improvement is not unconditional, but the assumption on which it depends is more basic and elementary than the Riemann hypothesis, so it sheds interesting light on the problem.

The assumption made in the paper concerns sums of the form

$$\sigma_{x_1,\dots,x_k}(n) = \sum_{n_1 \cdots n_k = n} n_1^{-x_1} \cdots n_k^{-x_k}.$$

Note that if $x_1 = \cdots = x_k = s$, then this sum is equal to $n^{-s}d_k(n)$, where $d_k(n)$ is the number of ways of writing $n$ as a product of $k$ positive integers (with the order mattering). Sums of the above kind can be used to build another sum, which relates in a natural way to the problem of estimating moments of the zeta function, and for which there is a known asymptotic formula. The assumption made in the paper is that the error term in that formula is sufficiently small.
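To make the identity concrete, here is a small brute-force check (plain Python; the helper names `ordered_factorizations`, `sigma`, and `d_k` are ours, not the paper's). When all the exponents are equal to $s$, every term of the sum is $(n_1 \cdots n_k)^{-s} = n^{-s}$, so the sum collapses to $n^{-s}d_k(n)$:

```python
import math

def ordered_factorizations(n, k):
    # All ordered k-tuples (n_1, ..., n_k) of positive integers with product n.
    if k == 1:
        return [(n,)]
    result = []
    for d in range(1, n + 1):
        if n % d == 0:
            result.extend((d,) + rest
                          for rest in ordered_factorizations(n // d, k - 1))
    return result

def sigma(xs, n):
    # sigma_{x_1,...,x_k}(n): sum of n_1^{-x_1} * ... * n_k^{-x_k} over
    # ordered factorizations n_1 * ... * n_k = n.
    return sum(math.prod(f ** -x for f, x in zip(fac, xs))
               for fac in ordered_factorizations(n, len(xs)))

def d_k(n, k):
    # Number of ordered factorizations of n into k positive factors.
    return len(ordered_factorizations(n, k))
```

For example, $d_3(12) = 18$ (since $d_3$ is multiplicative and $d_3(p^2) = 6$, $d_3(p) = 3$), and `sigma((1, 1, 1), 12)` returns $12^{-1} \cdot 18$.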

While proving this assumption is probably still very hard, the result is nevertheless interesting: for the first time, the sixth-moment problem has been precisely and rigorously reduced to a single concrete counting problem, in contrast with the mixture of (sometimes imprecisely formulated) heuristics that had been applied previously.