Relationships between the number of inputs and other complexity measures of Boolean functions
- Mathematics, MIT
Editorial introduction
Relationships between the number of inputs and other complexity measures of Boolean functions, Discrete Analysis 2022:20, 21 pp.
The paper concerns two questions about Boolean functions, that is, functions f:{−1,1}^n→{−1,1}.
Write n(f) for the number of relevant variables of f, meaning the number of i such that there exist x and y that differ only in the i-th coordinate and for which f(x)≠f(y). The first question asks whether we can give tight bounds for n(f) in terms of other well-studied complexity measures. Those other measures are the following (a small brute-force illustration of all of them appears just after the list).
- The sensitivity s(f,x) of f at a point x is the number of coordinates i such that flipping the i-th bit of x flips the value of f. The max-sensitivity s(f) of f is the maximum of s(f,x) over all x∈{−1,1}^n.
- The block sensitivity bs(f,x) of f at x is the maximum m such that there are disjoint sets B_1,…,B_m (called blocks), such that for each i, if all the values of x in B_i are flipped, then the value of f is flipped. The max-block-sensitivity bs(f) of f is the maximum of bs(f,x) over all x.
- The degree of f, denoted deg(f), is simply the degree of f when it is considered as a multilinear polynomial in n variables over R.
- The certificate complexity of f at x, denoted C(f,x), is the size of the smallest set of coordinates S such that every y that has the same restriction to S as x satisfies f(y)=f(x). The certificate complexity C(f) of f is the maximum of C(f,x) over all x.
- The decision tree complexity of f, denoted DT(f), is the smallest depth of a decision tree that computes f. (That is, one has a binary tree; at each internal vertex one queries some chosen bit of x and goes left or right according to its value, and when one reaches a leaf, the value of f(x) must be determined.)
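To make these definitions concrete, here is a small brute-force sketch (not taken from the paper) that computes each of the measures for the 3-bit majority function. The choice of function and all helper names are illustrative, and the exhaustive search is feasible only for very small n.

```python
# Brute-force computation of the complexity measures defined above, for a toy example.
from itertools import product, combinations
from functools import lru_cache

n = 3
def f(x):                        # 3-bit majority, an illustrative choice (not from the paper)
    return 1 if sum(x) > 0 else -1

points = list(product([-1, 1], repeat=n))

def flip(x, block):              # flip the coordinates of x indexed by `block`
    return tuple(-x[i] if i in block else x[i] for i in range(n))

# n(f): number of relevant variables
n_f = sum(1 for i in range(n)
          if any(f(x) != f(flip(x, (i,))) for x in points))

# s(f): maximum over x of the number of single-bit flips that change f
s_f = max(sum(f(x) != f(flip(x, (i,))) for i in range(n)) for x in points)

# bs(f): maximum over x of the largest family of pairwise disjoint sensitive blocks
def bs_at(x):
    blocks = [frozenset(b) for r in range(1, n + 1)
              for b in combinations(range(n), r) if f(flip(x, b)) != f(x)]
    for r in range(len(blocks), 0, -1):
        for fam in combinations(blocks, r):
            if all(a.isdisjoint(b) for a, b in combinations(fam, 2)):
                return r
    return 0

bs_f = max(bs_at(x) for x in points)

# C(f): size of the smallest certificate at x, maximised over x
def cert_at(x):
    for r in range(n + 1):
        for S in combinations(range(n), r):
            if all(f(y) == f(x) for y in points if all(y[i] == x[i] for i in S)):
                return r

C_f = max(cert_at(x) for x in points)

# deg(f): largest |S| with a nonzero coefficient in the multilinear (Fourier) expansion
def coeff(S):
    total = 0
    for x in points:
        m = 1
        for i in S:
            m *= x[i]
        total += f(x) * m
    return total / len(points)

deg_f = max((len(S) for r in range(n + 1) for S in combinations(range(n), r)
             if abs(coeff(S)) > 1e-9), default=0)

# DT(f): optimal decision-tree depth, by exhaustive recursion over partial assignments
@lru_cache(maxsize=None)
def dt_depth(fixed=frozenset()):
    d = dict(fixed)
    live = [x for x in points if all(x[i] == v for i, v in d.items())]
    if len({f(x) for x in live}) <= 1:
        return 0
    free = [i for i in range(n) if i not in d]
    return 1 + min(max(dt_depth(fixed | {(i, b)}) for b in (-1, 1)) for i in free)

DT_f = dt_depth()

print(n_f, s_f, bs_f, C_f, deg_f, DT_f)   # for 3-bit majority: 3 2 2 2 3 3
```

For the 3-bit majority this prints s(f)=bs(f)=C(f)=2, deg(f)=DT(f)=3 and n(f)=3, consistent with the chain of easy inequalities recalled below.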
The second question is whether one can improve the best known bounds between block sensitivity and degree, a question that relates closely to a recent breakthrough by Huang, who proved the long-standing sensitivity conjecture that sensitivity and block sensitivity are polynomially related.
For the first question, the paper improves on known results by obtaining better bounds. The improvements are in the constant factors rather than the asymptotics; however, in some of these cases the asymptotic behaviour was already determined in previous work, so improving the constants is all that is left to do.
It is an easy exercise to prove that s(f)≤bs(f)≤C(f)≤DT(f)≤n(f). It is also straightforward to show that deg(f)≤DT(f): if the root of a decision tree of depth d queries the coordinate x_i, then f = ((1+x_i)/2)f|_{x_i=1} + ((1−x_i)/2)f|_{x_i=−1}, so the degree of f exceeds the larger of the degrees of the two restrictions by at most 1, and the bound follows by induction on d.
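As a sanity check on this interpolation argument, the sketch below (illustrative only, not from the paper) converts a hard-coded depth-3 decision tree for the 3-bit majority into its multilinear polynomial using exactly that identity, and confirms that the degree does not exceed the depth.

```python
# A depth-d decision tree yields a multilinear polynomial of degree at most d via
# f = ((1+x_i)/2)*f|_{x_i=1} + ((1-x_i)/2)*f|_{x_i=-1} applied at each internal vertex.
import sympy as sp

x = sp.symbols('x0 x1 x2')

# A tree is either a leaf value (+1 or -1) or a triple (queried index, subtree for +1, subtree for -1).
maj_tree = (0,
            (1, 1, (2, 1, -1)),      # x0 = +1: answer 1 unless x1 = x2 = -1
            (1, (2, 1, -1), -1))     # x0 = -1: answer -1 unless x1 = x2 = +1

def tree_to_poly(t):
    if not isinstance(t, tuple):     # leaf
        return sp.Integer(t)
    i, hi, lo = t
    return sp.expand((1 + x[i]) / 2 * tree_to_poly(hi) + (1 - x[i]) / 2 * tree_to_poly(lo))

p = tree_to_poly(maj_tree)
print(p)                              # x0/2 + x1/2 + x2/2 - x0*x1*x2/2 (up to term order)
print(sp.Poly(p, *x).total_degree())  # 3, which does not exceed the tree depth 3
```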
Bounds in the other direction are less simple to prove. In 1994 Nisan and Szegedy proved that n(f)≤deg(f)⋅2^{deg(f)−1}, which was improved to 6.62⋅2^{deg(f)} by Chiarelli, Hatami and Saks in 2018. A lower bound of 2^{deg(f)}−1 is straightforward to prove inductively: if we have a function f of degree d−1 that genuinely depends on m variables, then we can take two disjointly supported copies f_0 and f_1 of f, take a further variable x_i, and form the function g(x)=(1+x_i)f_0(x)/2+(1−x_i)f_1(x)/2, which takes the value f_0(x) when x_i=1 and f_1(x) when x_i=−1. This function has degree d and genuinely depends on 2m+1 variables, which proves the inductive step. (The base case is that a degree-1 function can genuinely depend on one variable.) Chiarelli, Hatami and Saks (in the same paper) improved this lower bound by a factor of 3/2; a similar construction was found independently by Shinkar and Tal.
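The construction just described can be carried out mechanically. The following sketch (illustrative; the name selector_function and the variable names are ours, not the paper's) runs the recursion starting from a single variable and reports the degree and the number of variables on which the result depends, which come out as d and 2^d − 1.

```python
# Start with a single variable and repeatedly combine two disjointly-supported copies
# using a fresh "selector" variable, as in the paragraph above.
import sympy as sp

def selector_function(d):
    """A ±1-valued function of degree d depending on 2^d - 1 variables."""
    counter = [0]
    def fresh():                      # a variable not used anywhere else
        counter[0] += 1
        return sp.Symbol(f'x{counter[0]}')
    def build(deg):
        if deg == 1:
            return fresh()            # base case: degree 1, one relevant variable
        f0, f1 = build(deg - 1), build(deg - 1)   # disjoint supports, since fresh() never repeats
        xi = fresh()
        # selects f0 when xi = 1 and f1 when xi = -1
        return sp.expand((1 + xi) * f0 / 2 + (1 - xi) * f1 / 2)
    return build(d)

for d in range(1, 5):
    g = selector_function(d)
    print(d, sp.Poly(g, *g.free_symbols).total_degree(), len(g.free_symbols))
    # each line prints: d, d, 2**d - 1
```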
Thus, the upper and lower bounds agree to within a constant factor. One of the main results of this paper is an improvement to the upper bound, which replaces the constant 6.62 by 4.4. As with previous proofs, the argument proceeds inductively. To obtain an improvement, the author keeps careful track of both the degree of the function and its block sensitivity. It was shown by Nisan and Szegedy that bs(f)≤deg(f)^2, so in principle one could just keep track of the degree, but keeping track of both allows for more accurate answers. A second idea that feeds into the proof is to obtain better bounds for n(f) when deg(f) and bs(f) are small. For instance, the author shows that when deg(f)≤14, the bound bs(f)≤deg(f)^2 can be somewhat improved, by a factor of roughly 3/5.
Another result proved is that n(f)≤(1/2)⋅4^{C(f)}, which improves on a bound of 4^{C(f)}⋅C(f) that follows easily from previously known results. The results in the first part of the paper are proved via a unified framework that leads to shorter proofs and that may be of independent interest.
The second main theme of the paper is improving on the bound bs(f)≤deg(f)^2 mentioned above. The improvement is by a factor √2/3. One of the tools used is the Markov brothers' inequality, which states that if P is a polynomial of degree n that takes values of modulus at most 1 on the interval [−1,1], then the kth derivative of P takes values of modulus at most
n^2(n^2−1^2)…(n^2−(k−1)^2) / (1⋅3⋅5…(2k−1))
on the same interval. This is applied in the case k=2 in order to exploit information that can be obtained about the second derivative of f. It appears to be the first use of the Markov brothers' inequality in Boolean function analysis.
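As a quick numerical illustration (this is not from the paper), the Chebyshev polynomial T_n is the extremal example for the Markov brothers' inequality, attaining the stated bound at the endpoints of [−1,1]. The snippet below checks this for k=2 with numpy; the helper markov_bound is our own name for the right-hand side of the inequality.

```python
# Chebyshev polynomials attain the Markov brothers' bound at the endpoints of [-1, 1].
import numpy as np
from numpy.polynomial import chebyshev as C

def markov_bound(n, k):
    # n^2 (n^2 - 1^2) ... (n^2 - (k-1)^2) / (1 * 3 * 5 * ... * (2k-1))
    return np.prod([n**2 - j**2 for j in range(k)]) / np.prod([2 * j + 1 for j in range(k)])

n, k = 6, 2
Tn = C.Chebyshev.basis(n)                 # T_n, which satisfies |T_n| <= 1 on [-1, 1]
xs = np.linspace(-1, 1, 2001)
print(np.max(np.abs(Tn.deriv(k)(xs))))    # max of |T_n''| over the interval, approx. 420
print(Tn.deriv(k)(1.0))                   # attained at the endpoint x = 1
print(markov_bound(n, k))                 # n^2 (n^2 - 1) / 3 = 420.0
```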
The inequality bs(f)≤deg(f)^2 of Nisan and Szegedy is the step that turns Huang's theorem, which states that s(f)≥√(deg(f)), into a proof of the sensitivity conjecture, so one consequence of the improvement of the constant is a corresponding improvement to the bound bs(f)≤s(f)^4 that Huang's proof gives.
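In symbols, the deduction is simply the following chain (a sketch; the constant c is our shorthand for whatever factor multiplies deg(f)^2, not notation from the paper):

```latex
% Huang's theorem gives deg(f) <= s(f)^2; combined with bs(f) <= c*deg(f)^2 this yields
\[
  \operatorname{bs}(f) \;\le\; c\,\deg(f)^{2} \;\le\; c\,\bigl(s(f)^{2}\bigr)^{2} \;=\; c\,s(f)^{4},
\]
% so any improvement in the constant c improves the final bound by the same factor.
```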