One of the most common applications of eigenvalues and eigenvectors is Principal Component Analysis (PCA). PCA is beyond the scope of this course. For now, think of PCA as a tool that reduces complexity in your model.
PCA is typically used to reduce the number of dimensions in your model when your data is made up of hundreds of variables. When performing PCA, you are trying to find the axes along which the data points have the most variance. These axes are called the principal components of the data. The goal is to explain most of the variance in the data using these principal components, which are fewer in number than the original variables. This is how we reduce dimensionality.
Given some variables, it turns out that finding these axes of maximum variance is equivalent to finding the eigenvectors of the covariance matrix of those variables; the eigenvector with the largest eigenvalue points in the direction of greatest variance. Thus, the principal components are essentially eigenvectors of the covariance matrix.
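To make that connection concrete, here is a minimal sketch in Python with NumPy, using a small synthetic data set made up purely for illustration: it computes the covariance matrix, takes its eigenvectors as the principal components, and projects the data onto the top two of them.

```python
import numpy as np

# Synthetic data: 200 observations of 5 variables (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))

# Center each variable, then form the covariance matrix between variables.
X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)   # shape (5, 5)

# Eigenvectors of the covariance matrix are the principal components.
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Sort from largest to smallest eigenvalue: each eigenvalue is the
# variance of the data along its eigenvector.
order = np.argsort(eigenvalues)[::-1]
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

# Keep the top 2 components and project the data onto them,
# reducing 5 dimensions to 2.
k = 2
X_reduced = X_centered @ eigenvectors[:, :k]
print(X_reduced.shape)                   # (200, 2)
print(eigenvalues / eigenvalues.sum())   # fraction of variance per component
```

The printed ratios show how much of the total variance each principal component explains, which is how you would decide how many components to keep.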