High-Dimensional Probability With Engineering Applications: Theory, Examples, and Python Implementations (Computational Mathematics Library)
Format: Hardcover
In stock
1.18 kg
Yes
New
Amazon
USA
A rigorous, code-backed blueprint for nonasymptotic analysis in modern statistics, machine learning, and signal processing. This graduate-level text develops concentration inequalities, empirical process theory, and random matrix methods from first principles to advanced results, then translates them into reproducible Python experiments that verify theory at scale.

Every chapter follows a deliberate structure (Theory, Examples, and a Python Implementation), bridging deep proofs with engineering practice. Readers move from sub-Gaussian and transport inequalities to chaining, multiplier processes, Hanson–Wright, high-dimensional CLTs, matrix Laplace methods, spectral limit laws, RIP/JL embeddings, randomized numerical linear algebra, and low-rank estimation, equipped with tools to certify performance, generalization, and robustness of high-dimensional algorithms.

Who this book is for
- Graduate students in mathematics, statistics, electrical engineering, and computer science who need a proof-first, computation-ready treatment of high-dimensional phenomena.
- Researchers and practitioners in machine learning, signal processing, and optimization who require nonasymptotic guarantees for algorithms operating in high dimension.
- Engineers seeking principled methods for uncertainty quantification in large-scale systems, from MIMO to MRI, radar, and data center telemetry.

What you will learn
- Derive and deploy sharp concentration bounds (Chernoff, Bernstein, McDiarmid, Freedman; Poincaré, log-Sobolev; transportation–entropy; self-normalized).
- Control suprema via Gaussian process tools (Slepian, Sudakov–Fernique, Borell–TIS) and optimal chaining (Dudley, generic chaining, majorizing measures).
- Master empirical process theory (VC theory, Rademacher/Gaussian complexities, localized complexities, multiplier and offset processes) under sub-Gaussian and heavy tails.
- Analyze quadratic forms and chaoses (Hanson–Wright, decoupling) and apply high-dimensional CLTs and anti-concentration to simultaneous inference.
- Prove nonasymptotic bounds for random matrices (matrix Bernstein/Chernoff/Hoeffding/Freedman, noncommutative Khintchine) and study spectral distributions (semicircle, Marchenko–Pastur).
- Quantify error in PCA, spiked models, and covariance estimation; certify JL embeddings, RIP, and randomized NLA algorithms; guarantee sample complexity for matrix completion.
- Translate proofs into code to validate inequalities, visualize finite-sample regimes, and stress-test assumptions against real data.
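As a flavor of the "translate proofs into code" workflow the book describes, here is a minimal illustrative sketch (not taken from the book) that empirically checks Johnson–Lindenstrauss distance preservation with NumPy. The constants `n`, `d`, `eps`, the random seed, and the factor 8 in the embedding dimension are arbitrary choices for this demo, not values from the text.

```python
import numpy as np
from itertools import combinations

# Johnson-Lindenstrauss: a random Gaussian projection from R^d to R^k
# with k = O(eps^-2 log n) preserves all pairwise distances among n
# points up to a (1 +/- eps) factor, with high probability.

rng = np.random.default_rng(0)
n, d, eps = 50, 10_000, 0.25
k = int(np.ceil(8 * np.log(n) / eps**2))  # embedding dimension (demo constant 8)

X = rng.standard_normal((n, d))               # n points in R^d
P = rng.standard_normal((k, d)) / np.sqrt(k)  # scaled Gaussian projection
Y = X @ P.T                                   # embedded points in R^k

# Ratio of squared distances after vs. before projection, over all pairs.
ratios = np.array([
    np.sum((Y[i] - Y[j]) ** 2) / np.sum((X[i] - X[j]) ** 2)
    for i, j in combinations(range(n), 2)
])
print(f"k = {k}, distortion range: [{ratios.min():.3f}, {ratios.max():.3f}]")
```

Running this, the distortion ratios concentrate tightly around 1 and stay within the (1 ± eps) band for typical draws, which is exactly the kind of finite-sample verification of a nonasymptotic bound the book advertises.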