- Prof. Dr. Amin Coja-Oghlan
- Prof. Dr. Raman Sanyal
- Prof. Dr. Thorsten Theobald
- Prof. Dr. Claus-Peter Schnorr
- Dr. Samuel Hetterich
- Dr. Noela Müller
- Stephan Gardoll
- Oliver Gebhard
- Max Hahn-Klimroth
- Joon Lee
- Philipp Loick
- Sebastian Manecke
- Helen Naumann
- Jean Ravelomanana
- Maurice Rolvien
Former members
The systematic study of random discrete structures such as random graphs, codes, formulas and matrices commenced with the seminal work of Paul Erdős and Alfréd Rényi in the 1950s and 1960s. In one of their most important contributions they identified the phase transition for the emergence of the giant component in a random graph, a mean-field version of the famous percolation problem. Phase transitions have remained the protagonists of the discipline ever since.
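The giant-component phase transition is easy to observe experimentally. Below is a minimal Python sketch (the function name `largest_component_fraction` and its parameters are our own illustration, not part of the text): it samples the Erdős–Rényi graph G(n, p) with p = c/n and returns the relative size of the largest connected component, which jumps from nearly zero to a constant fraction of n as c crosses 1.

```python
import random

def largest_component_fraction(n, c, seed=0):
    """Sample G(n, p) with p = c/n and return |largest component| / n."""
    rng = random.Random(seed)
    p = c / n
    parent = list(range(n))  # union-find forest over the n vertices

    def find(x):
        # path-halving find
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    # include each of the C(n, 2) potential edges independently with prob. p
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv

    # count component sizes and report the largest, normalized by n
    sizes = {}
    for x in range(n):
        r = find(x)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / n
```

For c < 1 the largest component has only O(log n) vertices, while for c > 1 a giant component of linear size emerges; comparing, say, c = 0.5 and c = 2 for n = 1000 makes the transition plainly visible.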
One of the initial applications of probabilistic combinatorics was the construction of combinatorial structures with seemingly paradoxical properties by way of a random experiment. A prime example is Erdős’ construction of a graph with high girth and high chromatic number. A second well-known example is Shannon’s random code. The general technique, known as the probabilistic method, has become a mainstay of modern discrete mathematics. Applications range from extremal combinatorics to discrete geometry.
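As a self-contained illustration of the probabilistic method (a textbook example beyond the two named above), consider Erdős' lower bound on the diagonal Ramsey numbers: if C(n, k) · 2^(1 − C(k, 2)) < 1, then a uniformly random 2-coloring of the edges of K_n contains in expectation fewer than one monochromatic K_k, so some coloring contains none and hence R(k, k) > n. A small Python check (the helper name is our own):

```python
from math import comb

def ramsey_lower_bound_holds(n, k):
    """Erdős' first-moment bound: if C(n, k) * 2**(1 - C(k, 2)) < 1,
    the expected number of monochromatic K_k in a random 2-coloring
    of K_n is below one, which certifies R(k, k) > n."""
    return comb(n, k) * 2 ** (1 - comb(k, 2)) < 1
```

For instance, for k = 4 the bound certifies R(4, 4) > 6 (expected count 15/32 < 1) but fails already at n = 7 (expected count 35/32).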
Furthermore, over the years intimate connections between probabilistic combinatorics and statistical physics, particularly the physics of disordered systems, have emerged. This interdisciplinary angle has led to several remarkable discoveries, one of which is the Survey Propagation algorithm for the k-SAT problem, a prominent benchmark in computer science. In addition, the cavity method has led to important predictions on phase transitions. The rigorous verification of these predictions is an ongoing research effort, to which we have been contributing actively. Our research on probabilistic combinatorics was supported by an ERC grant (2011–2016). Currently we receive funding from the DFG for a joint project with colleagues from TU Graz.
Mathematical foundations of data science
A great many fundamental computational tasks can best be described as inference problems. Graph clustering is a prime example: given a complex network, can we detect and infer a latent community structure? A further important example is decoding: a codeword gets sent along a noisy channel. The receiver’s goal is to infer the original message from the noisy observation. The mathematical study of such inference problems is closely related to fundamental questions in probabilistic combinatorics.
Heuristic arguments suggest that, depending on the signal-to-noise ratio, inference problems typically undergo an impossible-hard-easy transition. Specifically, beyond a certain noise level, inference is information-theoretically impossible. At very low noise levels, on the other hand, efficient algorithms may be available to solve the inference problem. In the middle ground, inference may be information-theoretically feasible but computationally intractable. In some examples, such as the stochastic block model or the Hopfield neural network, statistical physics calculations predict that these three regimes are separated by sharp phase transitions. One of our research objectives is to investigate these predictions rigorously.
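For concreteness, here is a hedged sketch of the two-community stochastic block model mentioned above, together with the Kesten–Stigum condition (a − b)² > 2(a + b), where a and b denote the expected within- and between-community degrees; by results of Mossel–Neeman–Sly and Massoulié this condition marks the threshold for efficient community detection in the symmetric two-community case. The function names and parameterization are our own illustration:

```python
import random

def sample_sbm(n, p_in, p_out, seed=0):
    """Sample a symmetric two-community stochastic block model:
    each vertex gets a uniform random label in {0, 1}; an edge appears
    with probability p_in inside a community and p_out across."""
    rng = random.Random(seed)
    labels = [rng.randrange(2) for _ in range(n)]
    edges = []
    for u in range(n):
        for v in range(u + 1, n):
            p = p_in if labels[u] == labels[v] else p_out
            if rng.random() < p:
                edges.append((u, v))
    return labels, edges

def above_kesten_stigum(a, b):
    """Kesten-Stigum condition (a - b)**2 > 2 * (a + b) for the sparse
    symmetric two-community model with p_in = a/n, p_out = b/n; above
    it, efficient detection is known to be possible."""
    return (a - b) ** 2 > 2 * (a + b)
```

For example, degree parameters (a, b) = (5, 1) lie above the threshold (16 > 12), whereas (3, 2) lie below it (1 < 10), matching the easy versus hard/impossible regimes sketched above.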
A further line of work deals with the analysis of inference algorithms, particularly message passing algorithms such as Belief Propagation. This algorithm is perfectly understood on acyclic networks, and it is known to perform poorly in the worst case. Yet in practice loopy Belief Propagation turns out to be remarkably successful. Explaining this success and investigating the connections between Belief Propagation and other algorithmic paradigms such as semidefinite programming is one of the research objectives for which we currently receive funding from the DFG.
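On an acyclic network, the sum-product recursion underlying Belief Propagation computes marginals exactly, which is the fact alluded to above. The following is a minimal Python sketch for binary pairwise models on a tree (all names are our own illustration, not a reference implementation):

```python
def bp_marginals(n, edges, unary, pairwise):
    """Sum-product Belief Propagation for binary variables on a TREE.
    unary[v][x]: node potential; pairwise[(u, v)][x][y]: edge potential
    for the ordered pair as listed in `edges`. Exact on acyclic graphs."""
    adj = {v: [] for v in range(n)}
    for (u, v) in edges:
        adj[u].append(v)
        adj[v].append(u)

    def pot(u, v, xu, xv):
        # look up the edge potential regardless of orientation
        if (u, v) in pairwise:
            return pairwise[(u, v)][xu][xv]
        return pairwise[(v, u)][xv][xu]

    msgs = {}
    def message(u, v):
        # message m_{u->v}: sum over x_u of unary, edge potential and
        # all incoming messages to u except the one from v
        if (u, v) in msgs:
            return msgs[(u, v)]
        m = [0.0, 0.0]
        for xv in (0, 1):
            for xu in (0, 1):
                prod = unary[u][xu] * pot(u, v, xu, xv)
                for w in adj[u]:
                    if w != v:
                        prod *= message(w, u)[xu]
                m[xv] += prod
        msgs[(u, v)] = m
        return m

    marginals = []
    for v in range(n):
        b = [unary[v][0], unary[v][1]]
        for w in adj[v]:
            mw = message(w, v)
            b = [b[0] * mw[0], b[1] * mw[1]]
        z = b[0] + b[1]
        marginals.append([b[0] / z, b[1] / z])
    return marginals
```

On graphs with cycles the same update rules can be iterated ("loopy" Belief Propagation), but the exactness guarantee is lost, which is precisely the gap between theory and practice discussed above.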