## SciPy – 7 – autovalori e autovettori

Continuing from here, copying from here.
Really, the title above is the Italian translation of eigenvalues and eigenvectors; or at least I think so 😎

The first topic that you will tackle is eigenvalues and eigenvectors.

Eigenvalues are a new way to see into the heart of a matrix. But before you go more into that, let’s first explain what eigenvectors are. Almost all vectors change direction when they are multiplied by a matrix. Certain exceptional vectors, however, point in the same direction after the multiplication as they did before. These are the eigenvectors.

In other words: multiply an eigenvector by a matrix, and the resulting vector equals the original eigenvector multiplied by `λ`, the eigenvalue: `Ax=λx`.

This means that the eigenvalue gives you very valuable information: it tells you whether one of the eigenvectors is stretched, shrunk, reversed, or left unchanged—when it is multiplied by a matrix.


You use the `eig()` function from the `linalg` SciPy module to solve ordinary or generalized eigenvalue problems for square matrices.

Note that the `eigvals()` function is another way of unpacking the eigenvalues of a matrix.
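The two functions can be sketched like this on a small matrix of my own (the values are illustrative, not from the tutorial):

```python
import numpy as np
from scipy import linalg

# A simple 2x2 matrix (my own toy example)
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# eig() returns the eigenvalues and the eigenvectors (as columns)
eigenvalues, eigenvectors = linalg.eig(A)
print(eigenvalues)  # → [5.+0.j 2.+0.j]

# Check the defining property A x = λ x for the first pair
x = eigenvectors[:, 0]
assert np.allclose(A @ x, eigenvalues[0] * x)

# eigvals() returns only the eigenvalues, skipping the eigenvectors
print(linalg.eigvals(A))  # → [5.+0.j 2.+0.j]
```

Note that `eig()` returns the eigenvalues as complex numbers even when they happen to be real.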

When you’re working with sparse matrices, you can fall back on the `scipy.sparse.linalg` module, which provides the corresponding functions to find the eigenvalues and eigenvectors: `la, v = sparse.linalg.eigs(myMatrix,1)`.

Note that the code above specifies the number of eigenvalues and eigenvectors that have to be retrieved, namely 1.
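A minimal runnable sketch of that call, with a small sparse matrix of my own making:

```python
import numpy as np
from scipy import sparse
from scipy.sparse import linalg as sla

# A small sparse diagonal matrix (illustrative values of my own)
myMatrix = sparse.diags([1.0, 2.0, 3.0, 4.0, 5.0]).tocsc()

# Ask for the single (k=1) eigenvalue of largest magnitude
la, v = sla.eigs(myMatrix, k=1)
print(la)        # → [5.+0.j], the dominant eigenvalue
print(v.shape)   # → (5, 1): one eigenvector, stored as a column
```

By default `eigs()` looks for the eigenvalues of largest magnitude, and `k` must stay well below the matrix dimension (ARPACK, the library behind it, is built for a few eigenpairs of a large sparse matrix, not for all of them).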

The eigenvalues and eigenvectors are important concepts in many computer vision and machine learning techniques, such as Principal Component Analysis (PCA) for dimensionality reduction and EigenFaces for face recognition.

### Singular Value Decomposition (SVD)

Next, you need to know about SVD if you want to really learn data science. The singular value decomposition of a matrix `A` is the decomposition or factorization of `A` into the product of three matrices: `A = U ∗ Σ ∗ Vt`.

The size of the individual matrices is as follows if you know that matrix `A` is of size `M x N`:

• Matrix `U` is of size `M x M`
• Matrix `V` is of size `N x N`
• Matrix `Σ` is of size `M x N`

The `∗` indicates that the matrices are multiplied and the `t` that you see in `Vt` means that the matrix is transposed, which means that the rows and columns are interchanged.

Simply stated, singular value decomposition provides a way to break a matrix into simpler, meaningful pieces. These pieces may contain some data we are interested in.
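A small sketch with a toy `3 x 2` matrix of my own, checking both the sizes listed above and the reconstruction `A = U ∗ Σ ∗ Vt`:

```python
import numpy as np
from scipy import linalg

# A 3x2 matrix, so M=3 and N=2 (my own toy example)
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

U, s, Vt = linalg.svd(A)
print(U.shape, Vt.shape)   # → (3, 3) (2, 2)

# svd() returns the singular values as a 1-D array;
# place them on the diagonal of an M x N matrix Σ to rebuild A
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)
assert np.allclose(U @ Sigma @ Vt, A)
```

Note the one practical wrinkle: `svd()` hands back the singular values as a plain vector, so you have to build the `M x N` matrix `Σ` yourself before multiplying the pieces back together.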

As usual, the result is not the expected one 😡

Note that for sparse matrices, you can use the `sparse.linalg.svds()` function to perform the decomposition.
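A quick sketch of `svds()` on a random sparse matrix (my own example; note that, unlike the dense `svd()`, it returns only `k` singular values, and in ascending order):

```python
import numpy as np
from scipy import sparse
from scipy.sparse import linalg as sla

# A sparse 4x3 matrix with random entries (values are my own)
A = sparse.random(4, 3, density=0.5, random_state=0).tocsc()

# svds() computes only k singular values/vectors, with k < min(M, N)
u, s, vt = sla.svds(A, k=2)
print(s)                   # the two largest singular values, ascending
print(u.shape, vt.shape)   # → (4, 2) (2, 3)
```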

If you’re new to data science, matrix decomposition will be quite opaque to you. Well, so is the example that follows, which I’m not equipped for. Maybe Karlijn’s tutorial (she rocks) is too specialized for me. A break, then we continue 😊
