From the course: Machine Learning Foundations: Linear Algebra
Scalar and vector projection
- [Narrator] Scalar and vector projections are an extremely important part of machine learning because they make mathematical operations and applying machine learning models easier. Before we dive into understanding them, we first need to understand the magnitude, or length, of a vector. The magnitude of a vector, also called the norm or the geometric length, is the distance from the tail to the head of the vector, and it's computed using the standard Euclidean distance formula: the square root of the sum of the squared vector elements. Vector magnitude is indicated using double vertical bars around the vector. The formula we use to calculate it looks like this: for a vector a with elements a1 through an, ||a|| = √(a1² + a2² + … + an²). Don't panic if it looks complex. In NumPy, we have a function that does this job for us called norm, which lives in the linalg module. So if we want to calculate the magnitude of a vector A, we would just need to type magnitude equals np.linalg.norm(a).

Cool. Let's see what a vector projection is. A vector projection of a vector A onto another vector B is the orthogonal projection of A onto B. Let's understand the concept of vector projection by looking at a simple graphic. You can visualize the projection of A onto B as the shadow of A falling on B if the sun were to shine on B at a right angle. In order to project vector A onto vector B, we have the following formula: take the dot product of vectors A and B and divide it by the magnitude of B. This gives us a scalar value, which is called the component, or scalar projection, of A in the direction of vector B. In order to calculate the vector projection, we have to multiply the scalar projection by the unit vector of B, that is, B divided by its own magnitude.

Now, let's try it out by opening our Jupyter notebook. We'll need to import the norm function from the linalg module by typing from numpy import linalg as lng. Next, let's create two vectors, A and B. To calculate the norm of vector A, we just have to type lng.norm(a). Now let's calculate the projection of vector A onto vector B. To get the vector projection of A onto B, we divide the dot product of A and B by the dot product of B with itself and multiply the result by vector B. Great, and we just print it out to display it.
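To make the notebook steps concrete, here is a minimal sketch of the same calculations in NumPy. The vectors a and b and their values are illustrative assumptions, not necessarily the ones used in the course's exercise files.

import numpy as np
from numpy import linalg as lng

# Illustrative vectors (assumed values for this sketch).
a = np.array([3, 4])
b = np.array([6, 0])

# Magnitude (norm) of a: sqrt(3**2 + 4**2) = 5.0
magnitude = lng.norm(a)

# Scalar projection of a onto b: (a . b) / ||b||
scalar_projection = np.dot(a, b) / lng.norm(b)

# Vector projection of a onto b: ((a . b) / (b . b)) * b
vector_projection = (np.dot(a, b) / np.dot(b, b)) * b

print(magnitude)          # 5.0
print(scalar_projection)  # 3.0
print(vector_projection)  # [3. 0.]

Notice that the scalar projection is a single number, the length of the shadow, while the vector projection points that same length along the direction of B.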