Fluctuations of Eigenvalues
Beyond the Semicircle Law: How do linear statistics of eigenvalues fluctuate around their mean?
Classical Central Limit Theorem
- The Setup: Let $X_1, X_2, \dots, X_n$ be independent and identically distributed (i.i.d.) random variables with mean $\mu$ and variance $\sigma^2$.
- The Scaling: The sum $S_n = \sum_{i=1}^{n} X_i$ fluctuates around $n\mu$ with magnitude of order $\sqrt{n}$. The normalized sum converges in distribution to a standard normal: $$ \frac{S_n- n\mu}{\sigma\sqrt{n}} \xrightarrow{d} \mathcal{N}(0, 1) $$
[Figure: density of the normal distribution $\mathcal{N}(\mu, \sigma^2)$]
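A minimal numerical sketch of this scaling (an added illustration, not part of the original slides; the Exp(1) distribution, so $\mu = \sigma = 1$, and the sample sizes are arbitrary choices):

```python
import numpy as np

# Classical CLT scaling: sums of i.i.d. Exp(1) variables (mu = 1, sigma = 1),
# centered by n*mu and scaled by sigma*sqrt(n).
rng = np.random.default_rng(0)
n, trials = 2_000, 5_000

samples = rng.exponential(scale=1.0, size=(trials, n))  # rows of i.i.d. X_1, ..., X_n
S_n = samples.sum(axis=1)                               # S_n for each trial
Z = (S_n - n * 1.0) / (1.0 * np.sqrt(n))                # (S_n - n*mu) / (sigma * sqrt(n))

print("sample mean of Z:", Z.mean())   # close to 0
print("sample var  of Z:", Z.var())    # close to 1
```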
Linear Eigenvalue Statistics
We consider the Linear Eigenvalue Statistic defined for a test function $f$: $$ \mathcal{N}_n(f) = \sum_{i=1}^{n} f(\lambda_i), $$ where $\lambda_1, \dots, \lambda_n$ are the eigenvalues of the matrix.
Unlike the classical case where terms are independent, the eigenvalues $\lambda_i$ repel each other. This rigidity creates a self-averaging effect.
Key Difference
- Classical CLT: Fluctuations are of order $\mathbf{O(\sqrt{n})}$.
- RMT CLT: Fluctuations are of order $\mathbf{O(1)}$.
"The eigenvalues are so correlated that the global fluctuations do not grow with the system size."
Random Band Matrices
A Random Band Matrix is a Hermitian matrix where entries $A_{ij}$ are zero if $|i-j| > b_{n}$.
- $b_{n} = n$: Recovers the standard Wigner matrix model.
- $b_{n} = O(1)$: Resembles the Anderson Model (localization).
The bandwidth $b_{n}$ interpolates between two distinct physical regimes: delocalized states (conductors) and localized states (insulators).
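A construction sketch (an added illustration with Gaussian entries; the $1/\sqrt{b_n}$ normalization follows the definition $M_n = W_n/\sqrt{b_n}$ used in the 2014 paper listed below):

```python
import numpy as np

def random_band_matrix(n, b, rng=None):
    """Real symmetric random band matrix: N(0,1) entries on the band |i - j| <= b,
    zero outside, scaled by 1/sqrt(b) as in M_n = W_n / sqrt(b_n)."""
    rng = rng or np.random.default_rng()
    G = rng.normal(size=(n, n))
    W = np.triu(G) + np.triu(G, 1).T              # symmetrize the upper triangle
    i, j = np.indices((n, n))
    W[np.abs(i - j) > b] = 0.0                    # zero out entries outside the band
    return W / np.sqrt(b)

A = random_band_matrix(500, 25)
print(np.count_nonzero(A) / A.size)               # fraction of nonzero entries, roughly 2b/n
```

With $b = n$ the band covers the whole matrix (the Wigner case); with $b = O(1)$ only a few diagonals survive (the Anderson-like case).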
Fluctuations in Random Band Matrices
Random Band Matrices (RBM) serve as an intermediate model between Wigner matrices (full) and Anderson models (diagonal). The bandwidth $b_n$ controls the sparsity.
Theorem (I. Jana)
Consider a random band matrix $A_n$ with bandwidth $b_n$. If the test function $\varphi$ is sufficiently regular (e.g., analytic, or in a Sobolev space $H^s$ with $s > 1/2$), and the bandwidth grows such that $b_n \to \infty$ and $b_n \ll n$, then the centered linear statistic converges to a Gaussian distribution.
This result extends the universality of the $O(1)$ Gaussian fluctuations to the band matrix regime, establishing that even with sparse structure, eigenvalue repulsion maintains rigid spectral statistics.
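A rough numerical check of the corresponding scaling (an added illustration; it reuses the band construction above and the normalization $\sqrt{b_n/n}\,[\mathcal{N}(\phi)-\mathbb{E}\mathcal{N}(\phi)]$ from the 2014 paper below, with an arbitrary test function $f(\lambda)=\lambda^2$): the rescaled standard deviation should stay roughly constant across bandwidths.

```python
import numpy as np

rng = np.random.default_rng(2)

def band_stat_std(n, b, trials=100, f=lambda lam: lam**2):
    """Sample std of the centered linear statistic sum_i f(lambda_i) for bandwidth b."""
    i, j = np.indices((n, n))
    mask = np.abs(i - j) > b
    vals = []
    for _ in range(trials):
        G = rng.normal(size=(n, n))
        W = np.triu(G) + np.triu(G, 1).T          # symmetric band matrix W_n
        W[mask] = 0.0
        lam = np.linalg.eigvalsh(W / np.sqrt(b))  # eigenvalues of M_n = W_n / sqrt(b_n)
        vals.append(f(lam).sum())
    return np.std(vals)

n = 400
for b in (40, 100, 400):                          # all with b >> sqrt(n)
    s = band_stat_std(n, b)
    print(b, round(s, 3), round(s * np.sqrt(b / n), 3))  # sqrt(b/n) * std stays roughly constant
```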
My Publications On This Topic
1. Spectrum of random centrosymmetric matrices; CLT and circular law
I. Jana, S. Rani
Random Matrices: Theory and Applications, 2025
Abstract
We analyze the asymptotic fluctuations of linear eigenvalue statistics of random centrosymmetric matrices with i.i.d. entries. We prove that for a complex analytic test function, the centered and normalized linear eigenvalue statistics of random centrosymmetric matrices converge to a normal distribution. We find the exact expression of the variance of the limiting normal distribution via combinatorial arguments. Moreover, we also argue that the limiting spectral distribution of properly scaled centrosymmetric matrices follows the circular law.
2. CLT for non-Hermitian random band matrices with variance profiles
I. Jana
Journal of Statistical Physics, 2022
Abstract
We show that the fluctuations of the linear eigenvalue statistics of a non-Hermitian random band matrix of increasing bandwidth $b_{n}$ with a continuous variance profile $w_{\nu}(x)$ converge to a $N(0,\sigma_{f}^{2}(\nu))$, where $\nu=\lim_{n\to\infty}(2b_{n}/n)\in [0,1]$ and $f$ is the test function. When $\nu\in (0,1]$, we obtain an explicit formula for $\sigma_{f}^{2}(\nu)$, which depends on $f$ and the variance profile $w_{\nu}$. When $\nu=1$, the formula is consistent with Rider and Silverstein (2006). We also independently compute an explicit formula for $\sigma_{f}^{2}(0)$, i.e., when the bandwidth $b_{n}$ grows slower than $n$. In addition, we show that $\sigma_{f}^{2}(\nu)\to \sigma_{f}^{2}(0)$ as $\nu\downarrow 0$.
3. Linear eigenvalue statistics of random matrices with a variance profile
K. Adhikari, I. Jana, K. Saha
Random Matrices: Theory and Applications, 2021
Abstract
We give an upper bound on the total variation distance between the linear eigenvalue statistic, properly scaled and centered, of a random matrix with a variance profile and the standard Gaussian random variable. The second-order Poincaré inequality-type result introduced in [S. Chatterjee, Fluctuations of eigenvalues and second order Poincaré inequalities, Probab. Theory Relat. Fields 143(1) (2009) 1–40] is used to establish the bound. Using this bound, we prove central limit theorems for linear eigenvalue statistics of random matrices with different kinds of variance profiles. We re-establish some existing results on fluctuations of linear eigenvalue statistics of some well-known random matrix ensembles by choosing appropriate variance profiles.
4. Fluctuations of linear eigenvalue statistics of random band matrices
I. Jana, K. Saha, A. Soshnikov
Theory of Probability and its Applications, 2014
Abstract
In this paper, we study the fluctuations of linear eigenvalue statistics of random band matrices defined by $M_{n}=\frac{1}{\sqrt{b_{n}}}W_{n}$, where $W_{n}$ is an $n\times n$ Hermitian band random matrix of bandwidth $b_{n}$, i.e., the diagonal elements and only the first $b_{n}$ off-diagonal elements are nonzero. The variances of the matrix elements are of constant order. We study the linear eigenvalue statistics $\mathcal{N}(\phi)=\sum_{i=1}^{n}\phi(\lambda_{i})$ of such matrices, where $\lambda_{i}$ are the eigenvalues of $M_{n}$ and $\phi$ is a sufficiently smooth function. We prove that $\sqrt{\frac{b_{n}}{n}}[\mathcal{N}(\phi)-\mathbb{E} \mathcal{N}(\phi)]\stackrel{d}{\to} N(0,V(\phi))$ for $b_{n}\gg\sqrt{n}$, where $V(\phi)$ is given in Theorem 1.
Questions?
Thank you for your attention.