Statistical Physics for Sparse Statistical Inference
Physicists, when modelling physical systems with many degrees of freedom, and statisticians, when performing data analysis, have developed their own concepts and methods for making the `best' inference. The inference problems arising in statistics and in statistical physics are mathematically similar, and the viewpoint of statistical physics can aid the quantitative understanding of inference problems. Over the last few years, it has become increasingly clear that ideas from the statistical physics of disordered systems can help to develop new algorithms for important inference problems in many fields of application. This interdisciplinary field between statistical physics and statistics is now attracting much attention, but there is as yet no book that summarizes this synergy. By reviewing the present landscape, this book will help researchers interested in the application of statistical inference and will stimulate further developments in both statistical physics and statistics. It explains how the analytical tools of statistical physics can be exploited in understanding a wide range of inference problems. The authors describe how important statistical problems, including maximum likelihood estimation, Bayesian inference, sparse estimation, information criteria, and model selection, can be mapped onto the statistical physics framework, and how the analytical tools of statistical physics can be used to solve them.
Key Features:
- First book on the topic
- Self-contained
- Mathematically accessible to graduate students
- Author team of physicist and statistician

SCIENCE / Physics / Mathematical & Computational, Statistical physics, MATHEMATICS / Probability & Statistics / General, Probability and statistics

1 Introduction to sparse estimation
1.1 Problem settings: Explanation of estimation with sparsity and its motivation
1.2 Algorithms: Explanation of the algorithms for sparse estimation used in signal processing and statistics
1.2.1 OMP, IHT, IRLS: Introduction to algorithms mainly used in signal processing
1.2.2 Coordinate descent, DC algorithm: Introduction to algorithms used in statistics and operations research
1.3 Estimation with penalties: Explanation of estimation methods using sparse penalties
1.3.1 Linear relaxation: Explanation of the L1 penalty
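The L1-penalized least-squares (LASSO) problem behind the linear relaxation is often minimized by iterative soft thresholding. A minimal illustrative sketch of ISTA, with names and defaults chosen here for illustration rather than taken from the book:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1: shrink each entry toward zero by tau."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, y, lam, n_iter=500):
    """ISTA for (1/2)||y - Ax||^2 + lam*||x||_1 (minimal sketch)."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x
```

The soft-threshold map is exactly where the sparsity comes from: any coordinate whose effective observation is smaller than the threshold is set to zero.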
1.4 Theoretical bounds: Explanation of theoretical guarantees on the estimation performance
1.4.1 Restricted Isometry Property and Mutual Coherence
1.4.2 Null space property
1.4.3 Integral geometry
2 Statistical physics methods: Explanation of the statistical physics tools for the analysis of sparse estimation problems
2.1 Replica method
2.1.1 Replica symmetry
2.1.2 De Almeida-Thouless instability
2.1.3 One-step replica symmetry breaking and complexity
2.1.4 Comments on the analytic continuation
3 Graphical models
Introduction to the basics of statistical physics-based learning
3.1 Algorithms on graphical models: belief propagation and approximate message passing
3.2 State Evolution
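As a taste of the message-passing machinery, here is a hypothetical sketch of approximate message passing with a soft-threshold denoiser for noiseless compressed sensing. The Onsager correction term in the residual update is what distinguishes AMP from plain iterative thresholding; the adaptive threshold schedule used here is a common heuristic, not necessarily the book's choice:

```python
import numpy as np

def amp_soft(A, y, alpha=2.0, n_iter=50):
    """Soft-threshold AMP (minimal sketch).
    Assumes A has roughly unit-norm columns (e.g. i.i.d. N(0, 1/m) entries).
    Threshold theta = alpha * ||z|| / sqrt(m) tracks the effective noise level."""
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        theta = alpha * np.linalg.norm(z) / np.sqrt(m)      # adaptive threshold
        r = x + A.T @ z                                     # effective observations
        x = np.sign(r) * np.maximum(np.abs(r) - theta, 0.0) # soft threshold
        z = y - A @ x + z * (np.count_nonzero(x) / m)       # Onsager correction
    return x
```

The state evolution of Section 3.2 predicts, in the large-system limit, exactly how the effective noise level tracked by theta shrinks from iteration to iteration.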
4 Compressed sensing
4.1 Phase diagram with respect to perfect reconstruction
4.1.1 L0 penalty
4.1.2 L1 penalty
4.1.3 Sampling method
4.1.4 Bayesian estimation
4.1.5 Nonconvex penalty
4.2 Comparison with RIP and Mutual Coherence
5 Noisy compressed sensing
5.1 Compression of data and rate distortion
5.1.1 L1 penalty
5.1.2 Sampling method
5.1.3 Nonconvex penalty
5.2 Replica symmetry breaking: Explanation of the relationship between the difficulty of estimation and the phase transition picture in statistical physics
5.2.1 L0 penalty
5.2.2 Sampling method
5.2.3 Nonconvex penalty
6 Model selection: Parameter selection and its relationship with statistical physics, using AIC, cross-validation errors, and likelihood
6.1 AIC
6.2 Cross Validation
6.3 Model selection using likelihoods
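For concreteness, the k-fold cross-validation loop for choosing a penalty strength can be sketched as below. Ridge regression is used here as a stand-in penalized estimator because it has a closed form; the function name, defaults, and setup are illustrative assumptions, and the same loop applies to sparse penalties:

```python
import numpy as np

def cv_penalty_strength(A, y, lambdas, n_folds=5, seed=0):
    """Choose a penalty strength by k-fold cross validation (minimal sketch).
    Uses ridge regression as the penalized estimator for illustration."""
    rng = np.random.default_rng(seed)
    m = len(y)
    folds = np.array_split(rng.permutation(m), n_folds)
    cv_err = []
    for lam in lambdas:
        err = 0.0
        for fold in folds:
            train = np.setdiff1d(np.arange(m), fold)
            At, yt = A[train], y[train]
            # closed-form ridge solution on the training part
            w = np.linalg.solve(At.T @ At + lam * np.eye(A.shape[1]), At.T @ yt)
            err += np.sum((y[fold] - A[fold] @ w) ** 2)  # held-out error
        cv_err.append(err / m)
    return lambdas[int(np.argmin(cv_err))], cv_err
```

The held-out prediction error plays the role of an estimate of the generalization error, which is the quantity the statistical physics analysis of cross validation tracks.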
7 Variants of compressed sensing
7.1 1-bit compressed sensing
7.2 Group testing
8 Statistical physics for feature learning: This chapter deals with feature learning as an advanced setting of the compressed sensing problem.
8.1 Sparse coding by Olshausen and Field: Explanation of the motivation for feature learning
8.2 Matrix factorization
8.2.1 L0 penalty
8.2.2 L1 penalty
8.2.3 Bayesian inference
9 Compressed sensing in Natural Sciences
To encourage readers to apply sparsity-based estimation methods in their own work, we show examples where these methods have achieved fruitful results, including but not limited to:
9.1 Fluid dynamics by dynamic mode decomposition
9.2 Analysis of earthquakes
9.3 Compressed sensing for earth sciences
9.4 Black hole observation
9.5 Mass spectrometry