antognini
6 days ago
The behavior of a random walk in a high-dimensional space can be counterintuitive. If you take the random walk trajectory and perform principal components analysis on it, it turns out that more than half of the variance lies along a single direction. More than 80% lies along the first two principal components.
Even more surprisingly, if you project the random walk trajectory down onto these PCA subspaces, the projections no longer look random at all. Instead the trajectory traces out a Lissajous curve. (For example, see figure 1 of this paper: https://proceedings.neurips.cc/paper/2018/file/7a576629fef88...)
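You can check this yourself with a quick sketch (Python with numpy/scikit-learn; the Gaussian steps, walk length, and dimensionality below are arbitrary illustrative choices, not anything from the paper):

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    n_steps, dim = 500, 10_000                # far fewer steps than dimensions
    steps = rng.standard_normal((n_steps, dim))
    walk = np.cumsum(steps, axis=0)           # trajectory: one row per time step

    pca = PCA(n_components=10).fit(walk)
    # Most of the variance concentrates in the first couple of components.
    print(pca.explained_variance_ratio_[:3])

    # Project onto the first two components; plotting proj[:, 0] against
    # proj[:, 1] gives a smooth Lissajous-like arc rather than anything random.
    proj = pca.transform(walk)[:, :2]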
thirtygeo
6 days ago
I use PCA quite often for a variety of signal enhancement tasks in natural sciences. This paper presented something that I would not have expected and I found it really interesting.
vladimirralev
6 days ago
They say "these results are completely general for any probability distribution with zero mean and a finite covariance matrix with rank much larger than the number of steps". It's not clear to me whether that condition means the number of steps is much smaller than the dimensionality of the random walk space, or whether the probability distribution needs to be concentrated in a smaller number of dimensions to begin with. In the latter case the result would be much less shocking.
antognini
6 days ago
The condition is the former. The probability distribution spans the full dimensionality of the space. Basically, the result will hold for an infinite number of dimensions and a finite number of steps. But it will also hold if you take both the number of steps and the dimensionality to infinity while holding the ratio N_steps / D fixed, provided N_steps / D << 1.
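A rough numerical check (a Python sketch; the sizes and the ±1 step distribution are just illustrative choices): keep the number of steps fixed, grow D, and the variance fractions in the leading components stay roughly the same, even with a non-Gaussian step distribution.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    n_steps = 100
    for dim in (1_000, 10_000, 100_000):
        # zero-mean, finite-covariance steps that are not Gaussian (random signs)
        steps = rng.choice([-1.0, 1.0], size=(n_steps, dim))
        walk = np.cumsum(steps, axis=0)
        ratio = PCA(n_components=2).fit(walk).explained_variance_ratio_
        print(dim, ratio.round(2))  # leading fractions barely change with D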
jadbox
6 days ago
Thank you for sharing. Learning about PCA subspaces and Lissajous curves wasn't originally on my agenda today.