A.V. Bernstein
Skolkovo Institute of Science and Technology, Moscow, 143026 Russia
Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow, 127051 Russia
For citation: Bernstein A.V. Manifold learning in statistical tasks. Uchenye Zapiski Kazanskogo Universiteta. Seriya Fiziko-Matematicheskie Nauki, 2018, vol. 160, no. 2, pp. 229–242.
Abstract
Many tasks of data analysis deal with high-dimensional data, and the curse of dimensionality is an obstacle to the use of many methods for solving them. In many applications, real-world data occupy only a very small part of the high-dimensional observation space, and their intrinsic dimension is essentially lower than the dimension of this space. A popular model for such data is the manifold model, according to which the data lie on or near an unknown low-dimensional data manifold (DM) embedded in an ambient high-dimensional space. Data analysis tasks studied under this assumption are referred to as manifold learning tasks. Their general goal is to discover the low-dimensional structure of high-dimensional manifold-valued data from a given dataset. If the dataset points are sampled according to an unknown probability measure on the DM, we face statistical problems on manifold-valued data. The paper gives a short review of statistical problems regarding high-dimensional manifold-valued data and the methods for solving them.
Keywords: data analysis, mathematical statistics, manifold learning, manifold estimation, density estimation on manifolds, regression on manifolds
Acknowledgements. This work was supported by the Russian Science Foundation (project no. 14-50-00150).
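To make the manifold model described in the abstract concrete, the following minimal Python sketch (not part of the paper; it uses scikit-learn's make_swiss_roll and Isomap as illustrative stand-ins for the manifold learning methods reviewed, cf. refs. 40, 41) samples points from a two-dimensional data manifold embedded in a three-dimensional ambient space and recovers a two-dimensional representation.

```python
# Illustration of the manifold model: points sampled on (or, with noise,
# near) a 2-D data manifold (a Swiss roll) embedded in R^3, followed by a
# standard manifold learning method (Isomap) that recovers a 2-D
# representation. The choice of Isomap here is illustrative, not the
# paper's specific method.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Sample n points near the data manifold.
X, t = make_swiss_roll(n_samples=2000, noise=0.05, random_state=0)
print(X.shape)  # (2000, 3): ambient dimension 3, intrinsic dimension 2

# Embed into the intrinsic dimension q = 2 using geodesic distances
# approximated over a k-nearest-neighbor graph (cf. refs. 40, 41).
embedding = Isomap(n_neighbors=10, n_components=2)
Y = embedding.fit_transform(X)
print(Y.shape)  # (2000, 2): low-dimensional representation of the data
```

Here the ambient dimension is only 3 for visualization purposes; in the applications discussed in the paper, the ambient dimension is typically much larger than the intrinsic dimension.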
References
1. Cleveland W.S. Data science: An action plan for expanding the technical areas of the field of statistics. Int. Stat. Rev., 2001, vol. 69, no. 1, pp. 21–26. doi: 10.1111/j.1751-5823.2001.tb00477.x.
2. Chen L.M., Su Z., Jiang B. Mathematical Problems in Data Science. Theoretical and Practical Methods. Springer, 2015. xviii, 213 p. doi: 10.1007/978-3-319-25127-1.
3. Donoho D.L. High-dimensional data analysis: The curses and blessings of dimensionality. Proc. AMS Conf. on Math Challenges of the 21st Century, 2000, pp. 1–33.
4. Pinoli J.-Ch. Mathematical Foundations of Image Processing and Analysis. Vols. 1, 2. John Wiley & Sons, 2014. Vol. 1: 464 p. Vol. 2: 496 p.
5. Verleysen M. Learning high-dimensional data. In: Ablameyko S. et al. (Eds.) Limitations and Future Trends in Neural Computation. IOS Press, 2003. pp. 141–162.
6. Stone Ch.J. Optimal rates of convergence for nonparametric estimators. Ann. Stat., 1980, vol. 8, no. 6, pp. 1348–1360.
7. Stone Ch.J. Optimal global rates of convergence for nonparametric regression. Ann. Stat., 1982, vol. 10, no. 4, pp. 1040–1053.
8. Wasserman L. All of Nonparametric Statistics. New York, Springer, 2007. xii, 270 p. doi: 10.1007/0-387-30623-4.
9. Cacoullos T. Estimation of a multivariate density. Ann. Inst. Stat. Math., 1966, vol. 18, no. 1, pp. 179–189.
10. Rosenblatt M. Remarks on some nonparametric estimates of a density function. Ann. Math. Stat., 1956, vol. 27, no. 3, pp. 832–837.
11. Parzen E. On estimation of a probability density function and mode. Ann. Math. Stat., 1962, vol. 33, no. 3, pp. 1065–1076.
12. Bunte K., Biehl M., Hammer B. Dimensionality reduction mappings. Proc. IEEE Symp. on Computational Intelligence and Data Mining (CIDM). Paris, IEEE, 2011, pp. 349–356. doi: 10.1109/CIDM.2011.5949443.
13. Jolliffe I.T. Principal Component Analysis. New York, Springer, 2002. xxx, 488 p. doi: 10.1007/b98835.
14. Cox T.F., Cox M.A.A. Multidimensional Scaling. London, Chapman and Hall/CRC, 2000. 328 p.
15. Hecht-Nielsen R. Replicator neural networks for universal optimal source coding. Science, 1995, vol. 269, no. 5232, pp. 1860–1863. doi: 10.1126/science.269.5232.1860.
16. Hinton G.E., Salakhutdinov R.R. Reducing the dimensionality of data with neural networks. Science, 2006, vol. 313, no. 5786, pp. 504–507. doi: 10.1126/science.1127647.
17. Kramer M. Nonlinear principal component analysis using autoassociative neural networks. AIChE J., 1991, vol. 37, no. 2, pp. 233–243. doi: 10.1002/aic.690370209.
18. DeMers D., Cottrell G.W. Non-linear dimensionality reduction. Proc. Conf. Adv. Neural Inf. Process. Syst. 5. San Francisco, Morgan Kaufmann Publ., 1993, pp. 580–587.
19. Schölkopf B., Smola A., Müller K.-R. Nonlinear component analysis as a kernel eigenvalue problem. Neural Comput., 1998, vol. 10, no. 5, pp. 1299–1319. doi: 10.1162/089976698300017467.
20. Seung H.S., Lee D.D. Cognition. The manifold ways of perception. Science, 2000, vol. 290, no. 5500, pp. 2268–2269. doi: 10.1126/science.290.5500.2268.
21. Cayton L. Algorithms for manifold learning. Technical Report CS2008-0923. Univ. of Calif. at San Diego, 2005, pp. 541–555.
22. Huo X., Ni X., Smith A.K. Survey of manifold-based learning methods. In: Liao T.W., Triantaphyllou E. (Eds.) Recent Advances in Data Mining of Enterprise Data. Singapore, World Sci., 2007, pp. 691–745. doi: 10.1142/6689.
23. Izenman A.J. Introduction to manifold learning. WIREs Comput. Stat., 2012, vol. 4, no. 5, pp. 439–446. doi: 10.1002/wics.1222.
24. Ma Y., Fu Y. Manifold Learning Theory and Applications. London, CRC Press, 2011. 314 p.
25. Niyogi P., Smale S., Weinberger S. Finding the homology of submanifolds with high confidence from random samples. Discrete Comput. Geom., 2008, vol. 39, nos. 1–3, pp. 419–441. doi: 10.1007/s00454-008-9053-2.
26. Wong Y.-C. Differential geometry of Grassmann manifolds. Proc. Natl. Acad. Sci. USA, 1967, vol. 57, no. 3, pp. 589–594.
27. Jost J. Riemannian Geometry and Geometric Analysis. Berlin, Heidelberg, Springer, 2011. xiii, 611 p. doi: 10.1007/978-3-642-21298-7.
28. Lee J.M. Manifolds and differential geometry. In: Graduate Studies in Mathematics. Vol. 107. Providence, R.I., Am. Math. Soc., 2009. 671 p.
29. Pennec X. Probabilities and statistics on Riemannian manifolds: Basic tools for geometric measurements. J. Math. Imaging Vision, 2006, vol. 25, no. 1, pp. 127–154. doi: 10.1007/s10851-006-6228-4.
30. Pelletier B. Kernel density estimation on Riemannian manifolds. Stat. Probab. Lett., 2005, vol. 73, no. 3, pp. 297–304. doi: 10.1016/j.spl.2005.04.004.
31. Henry G., Muñoz A., Rodriguez D. Locally adaptive density estimation on Riemannian manifolds. Stat. Oper. Res. Trans., 2013, vol. 37, no. 2, pp. 111–130.
32. Bennett R.S. The intrinsic dimensionality of signal collections. IEEE Trans. Inf. Theory, 1969, vol. 15, no. 5, pp. 517–525. doi: 10.1109/TIT.1969.1054365.
33. Katětov M., Simon P. Origins of dimension theory. In: Aull C.E., Lowen R. (Eds.) Handbook of the History of General Topology. Vol. 1. Dordrecht, Springer, 1997, pp. 113–134. doi: 10.1007/978-94-017-0468-7_11.
34. Levina E., Bickel P.J. Maximum likelihood estimation of intrinsic dimension. Proc. 17th Int. Conf. on Neural Information Processing Systems. Cambridge, MA, MIT Press, 2004, pp. 777–784.
35. Fan M., Qiao H., Zhang B. Intrinsic dimension estimation of manifolds by incising balls. Pattern Recognit., 2009, vol. 42, no. 5, pp. 780–787. doi: 10.1016/j.patcog.2008.09.016.
36. Fan M., Gu N., Qiao H., Zhang B. Intrinsic dimension estimation of data by principal component analysis. arXiv:1002.2050 [cs.CV], 2010, pp. 1–8.
37. Rozza A., Lombardi G., Rosa M., Casiraghi E., Campadelli P. IDEA: Intrinsic dimension estimation algorithm. Proc. Int. Conf. on Image Analysis and Processing - ICIAP 2011. Lecture Notes in Computer Science. Vol. 6978. Berlin, Heidelberg, Springer, 2011, pp. 433–442. doi: 10.1007/978-3-642-24085-0_45.
38. Campadelli P., Casiraghi E., Ceruti C., Rozza A. Intrinsic dimension estimation: Relevant techniques and a benchmark framework. Math. Probl. Eng., 2015, art. 759567, pp. 1–21. doi: 10.1155/2015/759567.
39. Camastra F., Staiano A. Intrinsic dimension estimation: Advances and open problems. Inf. Sci., 2016, vol. 328, pp. 26–41. doi: 10.1016/j.ins.2015.08.029.
40. Tenenbaum J.B., de Silva V., Langford J.C. A global geometric framework for nonlinear dimensionality reduction. Science, 2000, vol. 290, no. 5500, pp. 2319–2323. doi: 10.1126/science.290.5500.2319.
41. Bernstein M., de Silva V., Langford J.C., Tenenbaum J.B. Graph approximations to geodesics on embedded manifolds. Technical Report. Stanford University, 2000. 26 p.
42. Belkin M., Niyogi P. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Comput., 2003, vol. 15, no. 6, pp. 1373–1396. doi: 10.1162/089976603321780317.
43. Rosasco L., Belkin M., De Vito E. On learning with integral operators. J. Mach. Learn. Res., 2010, vol. 11, pp. 905–934.
44. Singer A., Wu H.-T. Vector diffusion maps and the connection Laplacian. Commun. Pure Appl. Math., 2012, vol. 65, no. 8, pp. 1067–1144. doi: 10.1002/cpa.21395.
45. Yanovich Yu. Asymptotic properties of eigenvalues and eigenfunctions estimates of linear operators on manifolds. Lobachevskii J. Math., 2017, vol. 38, no. 6, pp. 1–12.
46. Roweis S.T., Saul L.K. Nonlinear dimensionality reduction by locally linear embedding. Science, 2000, vol. 290, no. 5500, pp. 2323–2326. doi: 10.1126/science.290.5500.2323.
47. Donoho D.L., Grimes C. Hessian eigenmaps: New locally linear embedding techniques for high-dimensional data. Proc. Natl. Acad. Sci. USA, 2003, vol. 100, no. 10, pp. 5591–5596. doi: 10.1073/pnas.1031596100.
48. Weinberger K.Q., Saul L.K. Maximum variance unfolding: Unsupervised learning of image manifolds by semidefinite programming. Int. J. Comput. Vision, 2006, vol. 70, no. 1, pp. 77–90.
49. Brand M. Charting a manifold. Proc. Int. Conf. on Advances in Neural Information Processing Systems. Cambridge, MA, MIT Press, 2003, pp. 961–968.
50. Lee J.A., Verleysen M. Quality assessment of dimensionality reduction based on k-ary neighborhoods. JMLR Workshop and Conf. Proc. Vol. 4: New Challenges for Feature Selection in Data Mining and Knowledge Discovery. Antwerpen, 2008, pp. 21–35.
51. Lee J.A., Verleysen M. Quality assessment of dimensionality reduction: Rank-based criteria. Neurocomputing, 2009, vol. 72, nos. 7–9, pp. 1431–1443. doi: 10.1016/j.neucom.2008.12.017.
52. Saul L.K., Roweis S.T. Think globally, fit locally: Unsupervised learning of low dimensional manifolds. J. Mach. Learn. Res., 2003, vol. 4, no. 2, pp. 119–155. doi: 10.1162/153244304322972667.
53. Zhang Z., Zha H. Principal manifolds and nonlinear dimension reduction via local tangent space alignment. SIAM J. Sci. Comput., 2005, vol. 26, no. 1, pp. 313–338. doi: 10.1137/S1064827502419154.
54. Bernstein A.V., Kuleshov A.P. Tangent bundle manifold learning via Grassmann & Stiefel eigenmaps. arXiv:1212.6031, 2012, pp. 1–25.
55. Bernstein A.V., Kuleshov A.P. Manifold learning: Generalizing ability and tangent proximity. Int. J. Software Inf., 2013, vol. 7, no. 3, pp. 359–390.
56. Freedman D. Efficient simplicial reconstructions of manifolds from their samples. IEEE Trans. Pattern Anal. Mach. Intell., 2002, vol. 24, no. 10, pp. 1349–1357. doi: 10.1109/TPAMI.2002.1039206.
57. Boissonnat J.-D., Ghosh A. Manifold reconstruction using tangential Delaunay complexes. Discrete Comput. Geom., 2014, vol. 51, no. 1, pp. 221–267. doi: 10.1007/s00454-013-9557-2.
58. Karygianni S., Frossard P. Tangent-based manifold approximation with locally linear models. Signal Process., 2014, vol. 104, pp. 232–247. doi: 10.1016/j.sigpro.2014.03.047.
59. Canas G.D., Poggio T., Rosasco L. Learning manifolds with k-means and k-flats. arXiv:1209.1121, 2013, pp. 1–19.
60. Kuleshov A., Bernstein A., Yanovich Y. Asymptotically optimal method in manifold estimation. Abstr. 29th Eur. Meet. of Statisticians. Budapest, 2013, p. 325.
61. Wang L., Wang X., Feng J. Subspace distance analysis with application to adaptive Bayesian algorithm for face recognition. Pattern Recognit., 2006, vol. 39, no. 3, pp. 456–464. doi: 10.1016/j.patcog.2005.08.015.
62. Hamm J., Lee D.D. Grassmann discriminant analysis: A unifying view on subspace-based learning. Proc. 25th Int. Conf. on Machine Learning (ICML-08). Helsinki, 2008, pp. 376–383. doi: 10.1145/1390156.1390204.
63. Hotelling H. Relations between two sets of variables. Biometrika, 1936, vol. 28, nos. 3–4, pp. 321–377. doi: 10.1093/biomet/28.3-4.321.
64. Genovese C.R., Perone-Pacifico M., Verdinelli I., Wasserman L. Minimax manifold estimation. J. Mach. Learn. Res., 2012, vol. 13, pp. 1263–1291.
65. Achlioptas D. Random matrices in data analysis. Proc. 15th Eur. Conf. on Machine Learning. Lecture Notes in Computer Science. Vol. 3202. Pisa, Springer, 2004, pp. 1–8. doi: 10.1007/978-3-540-30116-5_1.
66. Tyagi H., Vural E., Frossard P. Tangent space estimation for smooth embeddings of Riemannian manifolds. arXiv:1208.1065, 2013, pp. 1–35.
67. Coifman R.R., Lafon S., Lee A.B., Maggioni M., Warner F., Zucker S. Geometric diffusions as a tool for harmonic analysis and structure definition of data: Diffusion maps. Proc. Natl. Acad. Sci., 2005, vol. 102, no. 21, pp. 7426–7431. doi: 10.1073/pnas.0500334102.
68. Yanovich Yu. Asymptotic properties of local sampling on manifold. J. Math. Stat., 2016, vol. 12, no. 3, pp. 157–175. doi: 10.3844/jmssp.2016.157.175.
69. Yanovich Yu. Asymptotic properties of nonparametric estimation on manifold. Proc. 6th Workshop on Conformal and Probabilistic Prediction and Applications, 2017, vol. 60, pp. 18–38.
70. Kaslovsky D.N., Meyer F.G. Non-asymptotic analysis of tangent space perturbation. Inf. Inference: J. IMA, 2014, vol. 3, no. 2, pp. 134–187. doi: 10.1093/imaiai/iau004.
71. Bernstein A., Kuleshov A., Yanovich Yu. Information preserving and locally isometric & conformal embedding via tangent manifold learning. Proc. 2015 IEEE Int. Conf. on Data Science and Advanced Analytics (DSAA). Paris, IEEE, 2015, pp. 1–9. doi: 10.1109/DSAA.2015.7344815.
72. Wagner T.J. Nonparametric estimates of probability densities. IEEE Trans. Inf. Theory, 1975, vol. 21, no. 4, pp. 438–440.
73. Hendriks H. Nonparametric estimation of a probability density on a Riemannian manifold using Fourier expansions. Ann. Stat., 1990, vol. 18, no. 2, pp. 832–849.
74. Henry G., Rodriguez D. Kernel density estimation on Riemannian manifolds: Asymptotic results. J. Math. Imaging Vis., 2009, vol. 34, no. 3, pp. 235–239. doi: 10.1007/s10851-009-0145-2.
75. Ozakin A., Gray A. Submanifold density estimation. Proc. Conf. "Neural Information Processing Systems" (NIPS 2009), 2009, pp. 1–8.
76. Park H.S. Asymptotic behavior of kernel density estimation from a geometry viewpoint. Commun. Stat. – Theory Methods, 2012, vol. 41, no. 19, pp. 3479–3496. doi: 10.1080/03610926.2011.585009.
77. Kim Y.T., Park H.S. Geometric structures arising from kernel density estimation on Riemannian manifolds. J. Multivar. Anal., 2013, vol. 114, pp. 112–126. doi: 10.1016/j.jmva.2012.07.006.
78. Berry T., Sauer T. Density estimation on manifolds with boundary. Comput. Stat. Data Anal., 2017, vol. 107, pp. 1–17. doi: 10.1016/j.csda.2016.09.011.
79. Kuleshov A., Bernstein A., Yanovich Yu. High-dimensional density estimation for data mining tasks. Proc. 2017 IEEE Int. Conf. on Data Mining (ICDMW). New Orleans, LA, IEEE, 2017, pp. 523–530. doi: 10.1109/ICDMW.2017.74.
80. Pelletier B. Nonparametric regression estimation on closed Riemannian manifolds. J. Nonparametric Stat., 2006, vol. 18, no. 1, pp. 57–67. doi: 10.1080/10485250500504828.
81. Loubes J.M., Pelletier B. A kernel-based classifier on a Riemannian manifold. Stat. Decis., 2008, vol. 26, no. 1, pp. 35–51. doi: 10.1524/stnd.2008.0911.
82. Steinke F., Hein M., Schölkopf B. Nonparametric regression between general Riemannian manifolds. SIAM J. Imaging Sci., 2010, vol. 3, no. 3, pp. 527–563. doi: 10.1137/080744189.
83. Banerjee M., Chakraborty R., Ofori E., Okun M.S., Vaillancourt D., Vemuri B.C. A nonlinear regression technique for manifold valued data with applications to medical image analysis. Proc. 2016 IEEE Conf. on Computer Vision and Pattern Recognition (CVPR). Las Vegas, NV, IEEE, 2016, pp. 4424–4432. doi: 10.1109/CVPR.2016.479.
84. Hinkle J., Muralidharan P., Fletcher P.T. Polynomial regression on Riemannian manifolds. arXiv:1201.2395, 2012, pp. 1–14.
85. Liu G., Lin Z., Yu Y. Multi-output regression on the output manifold. Pattern Recognit., 2009, vol. 42, no. 11, pp. 2737–2743. doi: 10.1016/j.patcog.2009.05.001.
86. Shi X., Styner M., Lieberman J., Ibrahim J.G., Lin W., Zhu H. Intrinsic regression models for manifold-valued data. Med. Image Comput. Comput.-Assist. Interv., 2009, vol. 12, pt. 2, pp. 192–199.
87. Kim H.J., Adluru N., Collins M.D., Chung M.K., Bendlin B.B., Johnson S.C., Davidson R.J., Singh V. Multivariate general linear models (MGLM) on Riemannian manifolds with applications to statistical analysis of diffusion weighted images. Proc. 2014 IEEE Conf. on Computer Vision and Pattern Recognition. Columbus, OH, IEEE, 2014, pp. 2705–2712. doi: 10.1109/CVPR.2014.352.
88. Kim H.J., Adluru N., Bendlin B.B., Johnson S.C., Vemuri B.C., Singh V. Canonical correlation analysis on Riemannian manifolds and its applications. In: Fleet D., Pajdla T., Schiele B., Tuytelaars T. (Eds.) Computer Vision - ECCV 2014. ECCV 2014. Lecture Notes in Computer Science. Vol. 8690. Springer, 2014, pp. 251–267. doi: 10.1007/978-3-319-10605-2_17.
89. Bickel P., Li B. Local polynomial regression on unknown manifolds. In: IMS Lect. Notes – Monogr. Ser. Vol. 54: Complex Datasets and Inverse Problems: Tomography, Networks and Beyond. 2007, pp. 177–186.
90. Aswani A., Bickel P., Tomlin C. Regression on manifolds: Estimation of the exterior derivative. Ann. Stat., 2011, vol. 39, no. 1, pp. 48–81. doi: 10.1214/10-AOS823.
91. Cheng M.Y., Wu H.T. Local linear regression on manifolds and its geometric interpretation. J. Am. Stat. Assoc., 2013, vol. 108, no. 504, pp. 1421–1434. doi: 10.1080/01621459.2013.827984.
92. Yang Y., Dunson D.B. Bayesian manifold regression. arXiv:1305.0617, 2014, pp. 1–40.
93. Einbeck J., Evers L. Localized regression on principal manifolds. Proc. 25th Int. Workshop on Statistical Modeling (IWSM 2010). Glasgow, 2010, pp. 1–6.
94. Fletcher P.T. Geodesic regression on Riemannian manifolds. Proc. Int. Workshop on Mathematical Foundations of Computational Anatomy (MFCA), 2011, pp. 75–86.
95. Fletcher P.T. Geodesic regression and the theory of least squares on Riemannian manifolds. Int. J. Comput. Vision, 2013, vol. 105, no. 2, pp. 171–185. doi: 10.1007/s11263-012-0591-y.
96. Banerjee M., Chakraborty R., Ofori E., Vaillancourt D., Vemuri B.C. Nonlinear regression on Riemannian manifolds and its applications to neuro-image analysis. In: Lecture Notes in Computer Science. Vol. 9349: Medical Image Computing and Computer-Assisted Intervention, pt. I. Springer, Heidelberg, 2015, pp. 719–727.
97. Kuleshov A., Bernstein A. Nonlinear multi-output regression on unknown input manifold. Ann. Math. Artif. Intell., 2017, vol. 81, nos. 1–2, pp. 209–240. doi: 10.1007/s10472-017-9551-0.
Received October 17, 2017
Bernstein Alexander Vladimirovich, Doctor of Physical and Mathematical Sciences, Professor of the Center for Computational and Data-Intensive Science and Engineering; Leading Researcher of the Intelligent Data Analysis and Predictive Modeling Laboratory
Skolkovo Institute of Science and Technology
ul. Nobelya, 3, Territory of the Innovation Center "Skolkovo", Moscow, 143026 Russia
Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences
Bolshoy Karetny pereulok 19, str. 1, Moscow, 127051 Russia
E-mail: a.bernstein@skoltech.ru
Content is available under the Creative Commons Attribution 4.0 License.