Open Access
ARTICLE
Tensor Train Random Projection
1 School of Information Science and Technology, ShanghaiTech University, Shanghai, 201210, China
2 Peng Cheng Laboratory, Shenzhen, 518055, China
3 Innovation Academy for Microsatellites of Chinese Academy of Sciences, Shanghai, 201210, China
* Corresponding Author: Qifeng Liao. Email:
(This article belongs to the Special Issue: Numerical Methods in Engineering Analysis, Data Analysis and Artificial Intelligence)
Computer Modeling in Engineering & Sciences 2023, 134(2), 1195-1218. https://doi.org/10.32604/cmes.2022.021636
Received 25 January 2022; Accepted 25 March 2022; Issue published 31 August 2022
Abstract
This work proposes a Tensor Train Random Projection (TTRP) method for dimension reduction that approximately preserves pairwise distances. Our TTRP is systematically constructed through a Tensor Train (TT) representation with TT-ranks equal to one. Thanks to the tensor train format, this random projection method speeds up the dimension reduction procedure for high-dimensional datasets and requires less storage than existing methods, with little loss in accuracy. We provide a theoretical analysis of the bias and the variance of TTRP, which shows that this approach is an expected isometric projection with bounded variance, and we show that the scaling Rademacher variable is an optimal choice for generating the corresponding TT-cores. Detailed numerical experiments with synthetic datasets and the MNIST dataset are conducted to demonstrate the efficiency of TTRP.
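The abstract's construction can be illustrated with a short sketch: when all TT-ranks equal one, the projection matrix factorizes into one small Rademacher (±1) core per tensor mode, so the full projection matrix never has to be formed. The function name `ttrp`, the mode-wise contraction, and the 1/√M scaling (M being the target dimension) are illustrative assumptions based on the abstract, not the authors' reference implementation.

```python
import numpy as np

def ttrp(x, row_dims, col_dims, seed=0):
    """Illustrative rank-one Tensor Train Random Projection (assumption,
    not the paper's reference code).

    x        : vector of length prod(col_dims)
    row_dims : target mode sizes (m_1, ..., m_d)
    col_dims : input mode sizes  (n_1, ..., n_d)
    Returns a vector of length M = prod(row_dims).
    """
    rng = np.random.default_rng(seed)
    # One Rademacher (+1/-1) core of shape (m_k, n_k) per mode; with all
    # TT-ranks equal to one, these cores define the whole projection.
    cores = [rng.choice([-1.0, 1.0], size=(m, n))
             for m, n in zip(row_dims, col_dims)]
    # Reshape the input vector into a tensor over the column dimensions.
    t = x.reshape(col_dims)
    # Contract mode k of the tensor with core G_k (a mode-k product),
    # then restore the axis order.
    for k, G in enumerate(cores):
        t = np.moveaxis(np.tensordot(G, t, axes=(1, k)), 0, k)
    m_total = int(np.prod(row_dims))
    # Scale by 1/sqrt(M) so the projection is an isometry in expectation.
    return t.reshape(m_total) / np.sqrt(m_total)
```

Because only the small cores are stored, the memory cost is Σ m_k·n_k instead of the M·N entries of a dense Gaussian projection matrix, which is the storage saving the abstract refers to.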
This work is licensed under a Creative Commons Attribution 4.0 International License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.