Information complexity is an important tool for proving lower bounds in communication complexity. It measures the amount of information that the transcript of a communication protocol reveals about the players' inputs. It has been used to prove lower bounds for problems such as DISJOINTNESS, as well as composition results such as direct sum theorems. While it appears to be quite powerful, its relationship to other lower bound techniques, such as the corruption bound, was previously not well understood.
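For concreteness, one standard formulation (the internal information cost with respect to an input distribution $\mu$; the notation here is ours, not fixed by this abstract) is
\[
\mathrm{IC}_\mu(\pi) \;=\; I(\Pi : X \mid Y) \;+\; I(\Pi : Y \mid X),
\qquad
\mathrm{IC}_\mu(f,\epsilon) \;=\; \inf_{\pi}\ \mathrm{IC}_\mu(\pi),
\]
where $(X,Y) \sim \mu$ are the players' inputs, $\Pi$ is the protocol transcript (including the public randomness), and the infimum is over protocols $\pi$ computing $f$ with error at most $\epsilon$. The two terms measure what each player learns about the other's input.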
We show that information complexity subsumes essentially all known lower bound techniques in communication complexity. Namely, we prove that information complexity is at least the relaxed partition bound, which in turn subsumes the rectangle/corruption, smooth rectangle, discrepancy, and $\gamma_2$ bounds. This lends support to the conjecture that information complexity is equal to communication complexity.
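Schematically, and up to constant factors and small changes in the error parameter (this is a sketch of how the bounds line up, not a precise statement of the theorem), the result can be pictured as
\[
R_\epsilon(f) \;\ge\; \mathrm{IC}_\mu(f,\epsilon) \;\ge\; \Omega\!\bigl(\log \overline{\mathrm{prt}}_\epsilon(f)\bigr),
\]
where $R_\epsilon(f)$ is the randomized communication complexity and $\overline{\mathrm{prt}}_\epsilon(f)$ denotes the relaxed partition bound, whose logarithm in turn dominates the corruption, smooth rectangle, discrepancy, and $\gamma_2$ bounds.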
As applications of our results, we give lower bounds on the information complexity of the Gap Hamming Distance problem and the Vector in Subspace problem; these in turn give an exponential separation between quantum communication complexity and classical information complexity.
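For reference, in the standard formulation of Gap Hamming Distance (the usual gap, not spelled out in this abstract), the players receive $x, y \in \{0,1\}^n$ and must decide which case of the promise
\[
\Delta(x,y) \;\ge\; \tfrac{n}{2} + \sqrt{n}
\quad\text{or}\quad
\Delta(x,y) \;\le\; \tfrac{n}{2} - \sqrt{n}
\]
holds, where $\Delta(x,y)$ is the Hamming distance. In the Vector in Subspace problem, as usually stated, Alice holds a unit vector and Bob holds a subspace of half the dimension, with the promise that the vector lies either in the subspace or in its orthogonal complement.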
This is joint work with Iordanis Kerenidis, Sophie Laplante, Virginie Lerays, and Jeremie Roland.