Staff
Main lecturer: Prof. Florent Krzakala, head of the IdePHICS lab (with help from the amazing Prof. Lenka Zdeborová, head of the SPOC lab).
Teaching assistants: Yatin Dandi, Guillaume Dalle, Luca Pesce.
Overview
Interest in the methods and concepts of statistical physics is rapidly growing in fields as diverse as theoretical computer science, probability, machine learning, discrete mathematics, optimization and compressed sensing. The present course will cover this rich and active interdisciplinary research landscape.
More specifically, we will review the statistical physics approach to problems ranging from graph theory (percolation, community detection) through discrete optimization and constraint satisfaction (satisfiability, coloring) to machine learning (learning with two-layer neural networks, mean-field dynamics, gradient descent and SGD, low-rank matrix and tensor factorization, etc.).
We will present theoretical methods of analysis (replica, cavity, …), algorithms (message passing, spectral methods, …), and neural network theory (SGD, …), discuss concrete applications, highlight rigorous justifications, and draw the connection to the physics of glassy and disordered systems.
The aim is to make this growing field accessible to students from computer science, math, and engineering, not only physics (but of course, physicists are more than welcome). The course is set up so that open problems will be handed out during and at the end of the semester.
References
- M. Mézard & A. Montanari, Information, Physics and Computation, Oxford Graduate Texts, 2009
- L. Zdeborová & F. Krzakala, Statistical Physics of Inference: Thresholds and Algorithms, Advances in Physics, Vol. 65, Issue 5, 2016
Logistics
Lectures will take place on Fridays (10:00-12:00 in CM 011) and exercise sessions will take place on Thursdays (10:00-12:00 in CM 011).
Recorded lectures will be made available on mediaspace.
Grading: the final mark will be the average of
- either your 4 best homeworks
- or your 2 best homeworks + a 15-minute presentation of your progress on the open problem
Detailed lecture notes
Chapters 1-13 (still under construction)
Classes
Week 1: The Curie-Weiss model
Lecture content: Curie-Weiss model in Chap 1
Exercises: 1.1 (Bounds on the binomial) and 1.5 (Glauber dynamics). Due on March 2nd at 10 AM
Main lecture
Mean-Field model in Julia
A seminar on large deviations
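For readers who want to experiment before opening the notebook, here is a minimal standalone Julia sketch (ours, not the course notebook) that solves the Curie-Weiss self-consistency equation $m = \tanh(\beta(m+h))$ by fixed-point iteration; the initialization, tolerance, and the two sample temperatures are arbitrary choices.

```julia
# Fixed-point iteration for the Curie-Weiss magnetization m = tanh(β(m + h)).
# At h = 0, the only solution is m = 0 for β < 1; a spontaneous magnetization
# appears for β > 1.
function curie_weiss_magnetization(β, h; m0=0.5, tol=1e-10, maxiter=10_000)
    m = m0
    for _ in 1:maxiter
        m_new = tanh(β * (m + h))
        abs(m_new - m) < tol && return m_new
        m = m_new
    end
    return m
end

for β in (0.5, 1.5)  # paramagnetic vs ferromagnetic phase
    println("β = $β: m = ", curie_weiss_magnetization(β, 0.0))
end
```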
Week 2: Random Field Ising Model
Lecture content: Random Field Ising Model in Chap 2, see also the slides about the replica method
Exercises: 2.1 (Gaussian Poincaré inequality) & 2.3 (Mean-field algorithm). Due on March 9th at 10 AM
Julia notebook: RFIM & mean field
Main lecture
RFIM in Julia
Stieltjes transform of Wigner matrices with replica
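As a quick numerical companion to the replica computation above, the following Julia sketch (ours, not part of the course material) compares the empirical Stieltjes transform of a sampled Wigner matrix with the semicircle prediction $g(z) = (z - \sqrt{z^2 - 4})/2$, valid for entry variance $1/N$ and real $z > 2$; the matrix size and evaluation point are arbitrary.

```julia
using LinearAlgebra, Random, Statistics

# Empirical Stieltjes transform of a Wigner matrix vs the replica prediction.
Random.seed!(42)
N = 2000
A = randn(N, N) / sqrt(2N)
W = A + A'                      # symmetric matrix, off-diagonal variance 1/N
λs = eigvals(Symmetric(W))      # spectrum concentrates on [-2, 2]

z = 3.0                         # a point outside the support
g_emp = mean(1 ./ (z .- λs))
g_th  = (z - sqrt(z^2 - 4)) / 2
println("empirical g(z) = $g_emp   replica prediction = $g_th")
```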
Week 3: Random Field Ising Model with non-trivial network topology
Lecture content: Random Field Ising Model on trees and sparse graphs.
The lecture notes for this chapter are here
Exercises (also here): 4.2 (population dynamics), but 4.1 (Erdős–Rényi graphs) is also strongly recommended. Due on March 16th at 10 AM.
Julia notebook: RFIM on random graphs & Potts model
Main lecture
RFIM on random graphs
Potts model & Erdős–Rényi degree
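If you want a feel for what exercise 4.2 asks before writing your own version, here is a hedged Julia sketch of population dynamics for the RFIM cavity equations on a random $c$-regular graph; all parameters, the population size, and the biased initialization (which selects a ferromagnetic branch when one exists) are illustrative choices, not those of the exercise.

```julia
using Random, Statistics

# Population dynamics for the RFIM cavity equations on a random c-regular graph.
# An edge passes the bias u(h) = atanh(tanh(βJ) tanh(βh)) / β, and a cavity
# field is the local random field plus (c-1) incoming biases.
Random.seed!(1)
β, J, Δ, c = 1.0, 1.0, 0.5, 3   # inverse temperature, coupling, field width, degree
M = 10_000                      # population size
pop = 1.0 .+ Δ .* randn(M)      # cavity fields, biased toward the up state

u(h) = atanh(tanh(β * J) * tanh(β * h)) / β

for sweep in 1:200, i in 1:M
    pop[i] = Δ * randn() + sum(u(pop[rand(1:M)]) for _ in 1:(c - 1))
end

# The full local field collects c incoming messages; average the magnetization.
m = [tanh(β * (Δ * randn() + sum(u(pop[rand(1:M)]) for _ in 1:c))) for _ in 1:M]
println("average magnetization ≈ ", mean(m))
```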
Week 4: Random Energy Model
Lecture content: We study the Random Energy Model.
The lecture notes for this chapter are in Chap 13 of the notes.
Exercises: 13.1 (REM as a p-spin model, a replica computation) OR 13.3 (Maximum of Gaussians and denoising). You may do both, but it is OK if you just focus on one of them! Deadline is March 23rd at 10 AM.
Errors in Ex 13.1:
- The Hamiltonian for the $p$-spin model is $\mathcal{H}(\mathbf{S}) = -\sum_{i_1 < \cdots < i_p} J_{i_1, \ldots, i_p} S_{i_1} \cdots S_{i_p}$
- The interaction terms are all sampled according to a Gaussian distribution with standard deviation $\sigma = \sqrt{\frac{p!}{2 N^{p-1}}}$, that is, $\mathbb{P}(J) = \sqrt{\frac{N^{p-1}}{\pi p!}} \exp\left(-\frac{N^{p-1}}{p!} J^2\right)$
- In question 2, the actual result you should get is (up to a change of sign on $\widehat{Q}$):
\[G(\mathbf{Q}, \widehat{\mathbf{Q}}) = -\frac{\beta^2 n}{4} - \frac{\beta^2}{2} \sum_{a<b} \left(Q^{ab}\right)^p + \mathrm{i} \sum_{a<b} \widehat{Q}^{ab} Q^{ab} - \frac{1}{N} \log \sum_{\mathbf{S}^1, \ldots, \mathbf{S}^n} \exp\left[\mathrm{i} \sum_{a<b} \widehat{Q}^{ab} \left(\mathbf{S}^a \cdot \mathbf{S}^b \right) \right]\]
- In question 3, this is for $p \to \infty$
- In question 4, just show that $\mathbb{E} [\mathcal{H}(\mathbf{S})]$ and $\mathbb{E} [\mathcal{H}(\mathbf{S}) \mathcal{H}(\mathbf{S}')]$ match with the REM for $p \to \infty$
Solutions: p-spin model (PDF), maximum of Gaussians (Julia notebook)
Main lecture
Maximum of Gaussians
Replica for the p-spin model
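A one-screen Julia sketch (in the spirit of, but not identical to, the solution notebook) of the key fact behind exercise 13.3: the maximum of $M$ iid standard Gaussians concentrates around $\sqrt{2 \log M}$, which is what controls the REM ground state (with $M = 2^N$ configurations). The sample sizes are arbitrary, and the logarithmically slow convergence is visible in the output.

```julia
using Random, Statistics

# Average the maximum of M iid standard Gaussians and compare to sqrt(2 log M).
Random.seed!(7)
for M in (10^2, 10^4, 10^6)
    avg = mean(maximum(randn(M)) for _ in 1:20)
    println("M = $M:  ⟨max⟩ ≈ ", round(avg, digits=3),
            "   √(2 log M) ≈ ", round(sqrt(2 * log(M)), digits=3))
end
```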
Week 5: Graphical models & Open problems
We are arriving at the important moment of choosing your open problem: fill in your preferences for the open problems here: https://docs.google.com/spreadsheets/d/1guW79xOLy_fOBGnfnI6lmXtH2IMZAlom-1hMBxD-rAw/edit?usp=sharing.
For this week, the exercise is quite simple: Exercise 4.1: Representing problems by graphical models and Belief Propagation
Lecture content: graphical models & introduction to open problems
Main Lecture
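To see why belief propagation is exact on trees, here is a minimal Julia sketch (ours, not the exercise solution) running BP on a three-spin Ising chain and checking the marginal at the first site against brute-force enumeration; the couplings and fields are arbitrary.

```julia
# Belief propagation on a tiny Ising chain 1-2-3 (a tree, so BP is exact).
β = 1.0
J = [0.5, -0.3]          # couplings on edges (1,2) and (2,3)
h = [0.2, 0.0, -0.1]     # local fields on the 3 sites

# Cavity bias passed along an edge with coupling Jij from a site with field hcav.
u(Jij, hcav) = atanh(tanh(β * Jij) * tanh(β * hcav)) / β

# Sweep messages inward from the leaf toward site 1.
h_3to2 = h[3]
h_2to1 = h[2] + u(J[2], h_3to2)
m1_bp = tanh(β * (h[1] + u(J[1], h_2to1)))

# Brute-force marginal over the 2^3 configurations.
E(s) = -J[1]*s[1]*s[2] - J[2]*s[2]*s[3] - sum(h .* s)
confs = [[s1, s2, s3] for s1 in (-1, 1), s2 in (-1, 1), s3 in (-1, 1)]
Zsum = sum(exp(-β * E(s)) for s in confs)
m1_exact = sum(s[1] * exp(-β * E(s)) for s in confs) / Zsum
println("BP marginal: $m1_bp   exact marginal: $m1_exact")
```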
Week 6: Inference problems
Lecture content: Inference problems, the spin glass game & AMP.
Main Lecture
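The simplest version of this inference "game" is scalar denoising. As an illustration (our example, not the lecture's), the Julia snippet below compares the Bayes-optimal posterior-mean estimator $\tanh(y/\sigma^2)$ with the naive estimate $\hat{x} = y$ on the channel $y = x + \sigma z$, $x = \pm 1$ uniform; the noise level and sample size are arbitrary.

```julia
using Random, Statistics

# For x = ±1 uniform and y = x + σz with z ~ N(0,1), the posterior mean is
# E[x|y] = tanh(y/σ²); it beats the naive estimator x̂ = y, whose MSE is σ².
Random.seed!(3)
σ, n = 0.8, 100_000
x = rand([-1, 1], n)
y = x .+ σ .* randn(n)
xhat = tanh.(y ./ σ^2)
println("MSE (Bayes): ", mean((xhat .- x) .^ 2),
        "   MSE (naive): ", mean((y .- x) .^ 2))
```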
Week 7: AMP & State Evolution
Lecture content: AMP & State Evolution
Homework (for May 4th): exercises on information theory
Warning: the previous version of the homework contained typos; we have tried to fix them, but some may remain. Proceed with caution.
Main Lecture
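As a hedged sketch of what state evolution looks like in practice, the Julia snippet below iterates the scalar recursion for AMP on the spiked Wigner model with a Rademacher prior, $m_{t+1} = \mathbb{E}_Z[\tanh(\lambda m_t + \sqrt{\lambda m_t}\, Z)]$, whose informative fixed point appears for $\lambda > 1$; the convention for the signal-to-noise ratio $\lambda$, the initialization, and the Monte Carlo sample size are our choices, not necessarily those of the lecture.

```julia
using Random, Statistics

# Scalar state evolution for the spiked Wigner model with ±1 prior.
# The expectation over Z ~ N(0,1) is estimated by Monte Carlo.
Random.seed!(0)
function state_evolution(λ; m0=1e-3, T=100, nmc=10^5)
    Z = randn(nmc)
    m = m0
    for _ in 1:T
        m = max(mean(tanh.(λ * m .+ sqrt(λ * m) .* Z)), 0.0)  # clamp MC noise
    end
    return m
end

for λ in (0.8, 1.5)  # below vs above the λ = 1 transition
    println("λ = $λ:  fixed-point overlap ≈ ", state_evolution(λ))
end
```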
Week 8: I-MMSE & Generalized Linear Models
Main Lecture
Exercises
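A tiny Julia check of the I-MMSE relation $\frac{dI}{d\mathrm{snr}} = \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr})$ in the one case where everything is in closed form, a standard Gaussian input on a Gaussian channel; this is our illustration, not course material, and the finite-difference step is arbitrary.

```julia
# For a standard Gaussian input on a Gaussian channel, I(snr) = log(1 + snr)/2
# and mmse(snr) = 1/(1 + snr); we verify dI/dsnr = mmse/2 by finite differences.
I(snr) = log(1 + snr) / 2
mmse(snr) = 1 / (1 + snr)

snr, ε = 2.0, 1e-6
dI = (I(snr + ε) - I(snr - ε)) / (2ε)
println("dI/dsnr ≈ $dI   mmse/2 = ", mmse(snr) / 2)
```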
Week 9: GAMP & Applications
Main Lecture
Week 10: Random constraint satisfaction problem
Main Lecture
Week 11:
Main Lecture