Big data has revolutionized the landscape of computational mathematics and has increased the demand for new numerical linear algebra tools to handle the vast amount of data. One crucial task is to efficiently capture inherent structure in data using dimensionality reduction and feature extraction. Tensor-based approaches have gained significant traction in this setting by leveraging multilinear relationships in high-dimensional data. In this talk, we will describe a matrix-mimetic tensor algebra that offers provably optimal compressed representations of multiway data via a family of tensor singular value decompositions (SVDs). Moreover, using the linear algebra properties inherited from this framework, we will prove that these tensor SVDs outperform the corresponding matrix SVD and two closely related tensor decompositions, the Higher-Order SVD and the Tensor-Train SVD, in terms of approximation accuracy. Throughout the talk, we will provide numerical examples that support the theory and demonstrate the practical efficacy of constructing optimal tensor representations.
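As a concrete illustration, the best-known member of this family is the t-SVD built on the t-product, in which a third-order tensor is transformed along its third mode (here with the DFT, one choice of invertible transform), each frontal slice is compressed with a truncated matrix SVD, and the result is transformed back. The sketch below is a minimal NumPy implementation under those assumptions; the function name `tsvd_truncate` is illustrative, not from the paper.

```python
import numpy as np

def tsvd_truncate(A, k):
    """Rank-k truncated t-SVD approximation of a real n1 x n2 x n3 tensor A.

    Assumes the t-product algebra with the DFT along the third (tube) mode;
    other invertible transforms yield other members of the tensor SVD family.
    """
    Ahat = np.fft.fft(A, axis=2)                  # move slices to transform domain
    Bhat = np.zeros_like(Ahat)
    for i in range(A.shape[2]):
        # truncated matrix SVD of each frontal slice in the transform domain
        U, s, Vh = np.linalg.svd(Ahat[:, :, i], full_matrices=False)
        Bhat[:, :, i] = (U[:, :k] * s[:k]) @ Vh[:k, :]
    return np.real(np.fft.ifft(Bhat, axis=2))     # transform back

# usage: compress a random 6 x 5 x 4 tensor to t-rank 3
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 5, 4))
B = tsvd_truncate(A, 3)
rel_err = np.linalg.norm(A - B) / np.linalg.norm(A)
```

Because each slicewise truncation is optimal in the Frobenius norm, the assembled approximation inherits an Eckart-Young-type optimality over all t-rank-k tensors, which is the kind of guarantee the talk establishes for the whole family.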
This presentation will serve as an overview of our PNAS paper "Tensor-tensor algebra for optimal representation and compression of multiway data" (https://www.pnas.org/doi/10.1073/pnas.2015851118).