You can think of your data as a matrix, where the rows represent the instances and the columns represent the features. Let's call this matrix A. Now A^TA is a feature \times feature matrix; if the columns are mean-centered, this is (up to a scaling factor) the co-variance matrix. On the other hand, AA^T is an instance \times instance matrix, i.e. a similarity matrix. This matrix is called the Gram matrix.
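To make the two products concrete, here is a minimal numpy sketch with a hypothetical data matrix (the shape 5 instances \times 3 features is just an illustrative choice):

```python
import numpy as np

# Hypothetical data matrix: 5 instances (rows) x 3 features (columns)
A = np.arange(15, dtype=float).reshape(5, 3)

cov_like = A.T @ A   # feature x feature: covariance-like matrix
gram = A @ A.T       # instance x instance: Gram (similarity) matrix

print(cov_like.shape)  # (3, 3)
print(gram.shape)      # (5, 5)
```

Note the shapes: the feature \times feature product stays small even when the number of instances grows, and vice versa.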
This is useful in computing the SVD. The SVD is a method to decompose a matrix A into 3 components:
A=U\Sigma V^T
U is obtained from the eigenvectors of the Gram matrix ( AA^T), while V is obtained from the eigenvectors of the co-variance matrix ( A^TA). The diagonal entries of \Sigma, the singular values, are the square roots of the nonzero eigenvalues, which the two matrices share.
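A quick numpy sketch can check this relationship numerically: the squared singular values match the eigenvalues of A^TA, the columns of V are eigenvectors of A^TA, and the columns of U are eigenvectors of AA^T (eigenvectors are only determined up to sign and ordering, so we verify the eigenvector equation directly rather than comparing matrices):

```python
import numpy as np

# A random instance-by-feature matrix (shape is an arbitrary illustration)
A = np.random.default_rng(0).normal(size=(5, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The eigenvalues of A^T A equal the squared singular values
eigvals = np.linalg.eigvalsh(A.T @ A)          # ascending order
assert np.allclose(np.sort(s**2), eigvals)

# Columns of V are eigenvectors of A^T A: (A^T A) v_i = s_i^2 v_i
for i in range(3):
    v = Vt[i]
    assert np.allclose((A.T @ A) @ v, s[i]**2 * v)

# Columns of U are eigenvectors of A A^T: (A A^T) u_i = s_i^2 u_i
for i in range(3):
    u = U[:, i]
    assert np.allclose((A @ A.T) @ u, s[i]**2 * u)

print("SVD factors agree with the eigendecompositions")
```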
hello! I just discovered this website and it's wonderful... everything is well explained, great job. Is there gonna be more?
I will try to update it more frequently! Thanks for the comment.