# Tensors

## Reading List

### Introductory Material

Papers with at least some very introductory content include:

- Le Song's Georgia Tech class on machine learning - see lecture_tensors1.pdf and lecture_tensors2.pdf for gentle introductory slides.
- A Gentle Introduction to Tensors, Boaz Porat, The Technion, 2014 - a more mathematical introduction.
- The ICML 2007 and ECML/PKDD 2013 tutorials.

### Research-Level Material

Here are some papers whose content is currently at the research level for machine learning applications.

- Tensor Decompositions and Applications, Kolda and Bader, 2009 - I (Akshay) believe this is the best existing survey of tensor decompositions.
- Tensor Decompositions for Learning Latent Variable Models, Anandkumar et al., 2012 - Sec. 2 is a quick overview; the rest covers applications.
- Fourier PCA, Goyal, Vempala, and Xiao, 2014 - a long paper, but see especially p. 6 and Sec. 3.1; its references cover more advanced tensor decomposition material.

### Glossary of terms

This is a list of terms we need to understand in order to follow the work on tensors in ML:

- Tensor rank, mode, order
- CP Decomposition
- Tucker Decomposition
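A small NumPy sketch may help make the first terms above concrete. The vectors below are made up for illustration; the tensor is built directly in CP form (a sum of outer products), so its order, modes, and a mode unfolding can be read off:

```python
import numpy as np

# Build an order-3 tensor in CP form: T = a1⊗b1⊗c1 + a2⊗b2⊗c2.
# The component vectors here are arbitrary illustrative choices.
a1, b1, c1 = np.array([1., 0.]), np.array([1., 2.]), np.array([0., 1.])
a2, b2, c2 = np.array([0., 1.]), np.array([3., 1.]), np.array([1., 1.])

T = (np.einsum('i,j,k->ijk', a1, b1, c1)
     + np.einsum('i,j,k->ijk', a2, b2, c2))

print(T.ndim)    # order = number of modes = 3
print(T.shape)   # dimension along each mode: (2, 2, 2)

# Mode-0 unfolding: flatten all other modes into columns.
T0 = T.reshape(T.shape[0], -1)
print(T0.shape)  # (2, 4)
```

Since T was constructed as a sum of two outer products, its CP rank is at most 2; finding the *minimal* such decomposition for a given tensor is the hard problem the research papers address.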

Critical intuitions:

- Tensors (even symmetric ones) need not admit orthogonal decompositions.
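The intuition above can be checked numerically. The sketch below (with arbitrarily chosen vectors) builds a symmetric order-3 tensor from two *non-orthogonal* unit vectors, showing that symmetry of the tensor does not force orthogonality of its CP components:

```python
import numpy as np

# Two non-orthogonal unit vectors: <u, v> = sqrt(1/2) != 0.
u = np.array([1.0, 0.0])
v = np.array([np.sqrt(0.5), np.sqrt(0.5)])

# T = u⊗u⊗u + v⊗v⊗v is symmetric under any permutation of its
# three indices, even though u and v are not orthogonal.
T = (np.einsum('i,j,k->ijk', u, u, u)
     + np.einsum('i,j,k->ijk', v, v, v))

print(np.allclose(T, T.transpose(1, 0, 2)))  # symmetric in modes 0,1
print(np.allclose(T, T.transpose(0, 2, 1)))  # symmetric in modes 1,2
print(u @ v)                                 # nonzero inner product
```

This is one reason tensor decomposition is harder than matrix eigendecomposition: a symmetric matrix always has an orthogonal eigenbasis, but the analogous statement fails for symmetric tensors.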