- Vector dot product and vector length
- Proving vector dot product properties
- Proof of the Cauchy-Schwarz inequality
- Vector triangle inequality
- Defining the angle between vectors
- Defining a plane in R3 with a point and normal vector
- Cross product introduction
- Proof: Relationship between cross product and sin of angle
- Dot and cross product comparison/intuition
- Vector triple product expansion (very optional)
- Normal vector from plane equation
- Point distance to plane
- Distance between planes
Proving the "associative", "distributive" and "commutative" properties for vector dot products. Created by Sal Khan.
Want to join the conversation?
- In Linear Algebra, we are also learning about inner products. I was wondering what the difference was between dot products and inner products, and if you could make a video about inner products. Thanks!(6 votes)
- I think that the best answer I can give you is to say that the inner product is a generalized version of the dot product. The dot product is well defined in Euclidean vector spaces, but the inner product is defined so that it also functions in abstract vector spaces, mapping the result into the real numbers.
In any case, all the important properties remain:
1. The norm (or "length") of a vector is the square root of the inner product of the vector with itself.
2. The inner product of two orthogonal vectors is 0.
3. And the cosine of the angle between two vectors is the inner product of those vectors divided by the product of their norms.
Hope that helps!(7 votes)
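Those three properties are easy to sanity-check numerically. Here is a minimal Python sketch that uses the ordinary dot product as the Euclidean inner product (the helper names are just for illustration):

```python
import math

def dot(u, v):
    # Euclidean inner product: sum of componentwise products
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(v):
    # Property 1: the length of v is the square root of <v, v>
    return math.sqrt(dot(v, v))

def angle(u, v):
    # Property 3: cos(theta) = <u, v> / (|u| |v|)
    return math.acos(dot(u, v) / (norm(u) * norm(v)))

u, v = [3.0, 0.0], [0.0, 4.0]
print(norm(u))       # 3.0
print(dot(u, v))     # 0.0 -- property 2: orthogonal vectors
print(angle(u, v))   # pi/2
```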
- If a vector is a matrix with one column or row, mustn't we apply the rules of matrix multiplication? In that case we could only multiply, let's say, a 1x3 vector with a 3x1 vector to get a scalar (a 1x1 matrix). Is that what the dot product is doing, but without formally writing the first vector as a row vector?
But for matrix multiplication the commutative property does not apply.(6 votes)
- Since the vectors are one column matrices, why aren't we multiplying vectors the same way we multiply matrices? Matrix multiplication does not allow for commutativity, and yet the dot product does. I am willing to "allow" that the dot product gives us a scalar, not another vector (as one would expect when multiplying two matrices together), but why can we do this with vectors and not matrices? I even can understand the idea that the scalar is the "shadow" of one vector onto another --but where does the matrix behavior appear? Or do matrices have their own "dot products"?(4 votes)
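To see the connection concretely, here is a small Python sketch (a naive matrix product, no libraries) showing that a 1x3 row times a 3x1 column reproduces the dot product as a 1x1 matrix, while reversing the order gives a 3x3 outer product instead:

```python
def matmul(A, B):
    # Naive matrix product: A is m x k, B is k x n
    m, k, n = len(A), len(B), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(k)) for j in range(n)]
            for i in range(m)]

v = [1, 2, 3]
w = [4, 5, 6]

row_v = [v]                # 1x3 row vector
col_w = [[x] for x in w]   # 3x1 column vector

# (1x3)(3x1) -> 1x1 matrix whose single entry is the dot product
print(matmul(row_v, col_w))   # [[32]]
# (3x1)(1x3) -> 3x3 outer product, so the order clearly matters
print(matmul(col_w, row_v))
```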
- In this video, Sal uses Rn to generalize it to all n-tuples, but wouldn't the proof be just as valid if proven only in R2? (since it is mundane, R2 would save space and time)(3 votes)
- If he proved it just in R2, the proof might not work in other dimensions; in this case, the R2 proof does happen to work for the other Rn. The problem is that you won't know whether the R2 proof carries over until you prove it in Rn, so you might as well prove it in Rn from the start.(11 votes)
- How can I solve the equation of the dot product when the vectors are parallel?(3 votes)
- When they both point in the same direction, the dot product is equal to their magnitudes multiplied by each other:
a·b ≡ |a|·|b|·cos(θ)
a·b = |a|·|b|·cos(0) when they point in the same direction, and cos(0) = 1, so
a·b = |a|·|b|(4 votes)
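A quick numeric check of the parallel case, sketched in Python:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mag(v):
    return math.sqrt(dot(v, v))

a = [3.0, 4.0]
b = [6.0, 8.0]   # b = 2a, so a and b are parallel (theta = 0)

# For parallel vectors cos(0) = 1, so a.b equals |a|*|b|
print(dot(a, b))         # 50.0
print(mag(a) * mag(b))   # 50.0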
- But my physics instructor said ∇.F != F.∇ for a force field F. Why so?(3 votes)
- I don't want to step on any physicists' toes, but "del notation" is a non-mathematical hack that only provides useful memory aids for valuable real-world ideas. Be sure to think of ∇. as a single operator and not as the dot product of a "del" with something. There is danger in trying to take the metaphor too far.
- Why does the vector cross product not form a group?(3 votes)
- For a set G to be a group under a binary operation x [formally, we say the ordered pair (G, x) is a group], the following must hold for all elements u, v, and w in G:
1. There is an identity element e, where u x e = e x u = u.
2. For every element u, there is an element u⁻¹, called the inverse of u, such that u x u⁻¹ = u⁻¹ x u = e.
3. The operation is associative, i.e. (u x v) x w = u x (v x w).
The cross product, for one, fails associativity. It also has no identity element.
Thus R^3 under the cross product binary operation is not a group.(2 votes)
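You can verify the failure of associativity with a one-line counterexample built from the standard basis vectors; a small Python sketch:

```python
def cross(u, v):
    # Cross product in R^3
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

i, j = [1, 0, 0], [0, 1, 0]

# Associativity fails: (i x i) x j != i x (i x j)
print(cross(cross(i, i), j))   # [0, 0, 0]
print(cross(i, cross(i, j)))   # [0, -1, 0]
```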
- It looks to me like you're adding the first term plus the second term up until n, which looks like a series up to n. Is there a connection between vectors and series? Could you please elaborate on this in a video if there is one (can the relation be proved, for example)?(2 votes)
- Well, a series is just a compact notation for writing arbitrary sums. In the case of dot products, if we have two vectors x = (x1, x2, ... , xn), and y = (y1, y2, ... , yn), and we wanted to write the dot product as a series (which we can because we can write every sum as a series), then it would be like this:
x · y = Σ (from i = 1 to n) of xi·yi.(2 votes)
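In code, that summation is exactly what a dot product implementation computes; a minimal Python sketch:

```python
def dot(x, y):
    # x . y = the sum over i of x_i * y_i, a finite series
    return sum(x_i * y_i for x_i, y_i in zip(x, y))

x = [1, 2, 3]
y = [4, 5, 6]
print(dot(x, y))   # 1*4 + 2*5 + 3*6 = 32
```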
- I understand dot products and its properties, but what does a dot product represent? What does a dot product show/mean?(2 votes)
- How much of one vector is in the direction of the other. Think about it: a·b = |a|·|b|·cos(θ). If you make a triangle with vectors a and b as sides, the |b|·cos(θ) part is how much of vector b is in the direction of a (and then you multiply this by the magnitude of a to get a scalar). And the magnitude of the cross product is how much of vector b is -perpendicular- to vector a, times the magnitude of a (that is, |a|·|b|·sin(θ)).(2 votes)
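That "how much of b points along a" quantity is the scalar projection; a Python sketch (the function name is just for illustration):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mag(v):
    return math.sqrt(dot(v, v))

def scalar_projection(b, a):
    # |b| cos(theta): the component of b along the direction of a
    return dot(a, b) / mag(a)

a = [1.0, 0.0]   # unit vector along the x-axis
b = [3.0, 4.0]
print(scalar_projection(b, a))   # 3.0 -- the x-component of b
```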
- Is it possible to multiply 2 vectors of different dimensions??(1 vote)
- While you can't technically multiply vectors of different dimensions, what you could do is set the extra components to 0. For example, if you wanted to multiply A (1,2) with B (1,2,3), you could think of A as (1,2,0) to get a dot product of (1)(1)+(2)(2)+(0)(3)=5. Which is to say, vectors of a smaller dimension can be embedded as a subspace of the larger dimension, so the calculation can be done in the higher dimension.(3 votes)
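That zero-padding trick is a two-liner in Python; a sketch (the `pad` helper is just an illustrative name):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def pad(v, n):
    # Embed v into R^n by appending zeros (assumes len(v) <= n)
    return v + [0] * (n - len(v))

A = [1, 2]
B = [1, 2, 3]
print(dot(pad(A, len(B)), B))   # 1*1 + 2*2 + 0*3 = 5
```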
In this video, I want to prove some of the basic properties of the dot product, and you might find what I'm doing in this video somewhat mundane. You know, to be frank, it is somewhat mundane. But I'm doing it for two reasons. One is, this is the type of thing that's often asked of you when you take a linear algebra class. But more importantly, it gives you the appreciation that we really are kind of building up a mathematics of vectors from the ground up, and you really can't assume anything. You really do have to prove everything for yourself. So the first thing I want to prove is that the dot product, when you take the vector dot product, so if I take v dot w, that it's commutative. That the order in which I take the dot product doesn't matter. I want to prove to myself that that is equal to w dot v. And so, how do we do that? Well, and this is the general pattern for a lot of these vector proofs. Let's just write out the vectors. So v will look like v1, v2, all the way down to vn. Let's say that this is equal to v. And let's say that w is equal to w1, w2, all the way down to wn. So what does v dot w equal? v dot w is equal to-- I'll switch colors here-- v1 times w1. Plus v2 w2 plus all the way to vn wn. Fair enough. Now what does w dot v equal? Well w dot v-- you know, when I made the definition, you just multiply the corresponding components. But I'll just do it in the order that they gave it to us. So it equals w1 v1 plus w2 v2. Plus all the way to wn vn. Now, these are clearly equal to each other because if you just match up the first term with the first term, those are clearly equal to each other. v1 w1 is equal to w1 v1. And I can say this now because now we're just dealing with regular numbers. Here we were dealing with vectors and we were taking this weird type of multiplication called the dot product. But now I can definitely say that these are equal because this is just regular multiplication. And this is just the commutative property. Let me see if I'm spelling commutative right.
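The commutativity argument above can be sanity-checked numerically; a minimal Python sketch:

```python
def dot(v, w):
    # Dot product: multiply corresponding components, then add
    return sum(vi * wi for vi, wi in zip(v, w))

v = [1, 2, 3]
w = [4, 5, 6]

# Commutativity: each term v_i * w_i equals w_i * v_i,
# so the two sums agree
print(dot(v, w), dot(w, v))   # 32 32
```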
We learned this in-- I don't know when you learned this, in second or third grade. So you know that those are equal, and by the same argument you know that these two are equal. You could just rewrite each of these terms by switching that around. That's just from basic multiplication of scalar numbers, of just regular real numbers. So that's what tells us that these two things are equal or these two things are equal. So we've proven to ourselves that order doesn't matter when you take the dot product. Now the next thing we could take a look at is whether the dot product exhibits the distributive property. So let me just define another vector x here. Another vector x, and you can imagine how I'm going to define it. x1, x2, all the way down to xn. Now, what I want to see is whether the dot product deals with the distributive property the way I would expect it to. If it does, then I should be able to add v plus w and then dot that with x. And first of all, it shouldn't matter what order I do that in. I could do x dot this thing. It shouldn't matter because I just showed you it's commutative. But if the distribution works, then this should be the same thing as v dot x plus w dot x. If these were just numbers and this was just regular multiplication, you would multiply it by each of the terms, and that's what I'm showing here. So let's see if this is true for the dot product. So what is v plus w? v plus w is equal to-- we just add up each of their corresponding terms. v1 plus w1, v2 plus w2, all the way down to vn plus wn. That's that right there. And then when we dot that with x1, x2, all the way down to xn, what do we get? Well we get v1 plus w1 times x1 plus v2 plus w2 times x2 plus all the way to vn plus wn times xn. I just took the dot product of these two. I just multiplied corresponding components and then added them all up. That was the dot product. This is v plus w dot x. Let me write that down. This is v plus w dot x.
Now, let's work on these things up here. Let me write it over here. What is v dot x? v dot x, we've seen this before. This is just v1 x1. No vectors now. These are just actual components. Plus v2 x2, all the way to vn xn. What is w dot x? w dot x is equal to w1 x1 plus w2 x2, all the way to wn xn. Now what do you get when you add these two things? And notice, here I'm adding two scalar quantities. That's a scalar. That's a scalar. We're not doing vector addition anymore. So this is a scalar quantity and this is a scalar quantity. So what do I get when I add them? So v dot x plus w dot x is equal to v1 x1 plus w1 x1 plus v2 x2 plus w2 x2, all the way to vn xn plus wn xn. I know, it's very monotonous. But you could immediately see we're just dealing with regular numbers here. So we can take the x's out and what do you get? Let me write it here. This is equal to-- we could just take the x out, factor the x out. v1 plus w1, x1 plus v2 plus w2 x2, all the way to vn plus wn xn. And we see this is the same thing as this thing right here. So we just showed that this expression right here is the same thing as that expression, or the distributive property seems to-- or does-- apply the way we would expect to the dot product. I know this is so mundane. Why are we doing this? But I'm doing this to show you that we're building things up. We couldn't just assume this. But the proof is pretty straightforward. And in general, I didn't do these proofs when I did it for vector addition and scalar multiplication, and I really should have. But you can prove the commutativity of it. Or for the scalar multiplication you could prove that distribution works for it by doing a proof exactly the same way as this. A lot of math books or linear algebra books just leave these as exercises to the student because it's mundane, so they didn't think it was worth their paper. But let me just show you, I guess, the last property, associativity, the associative property.
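The distributive identity just shown is easy to spot-check with concrete numbers; a small Python sketch:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def add(u, v):
    # Componentwise vector addition
    return [a + b for a, b in zip(u, v)]

v, w, x = [1, 2], [3, 4], [5, 6]

# (v + w) . x should equal v . x + w . x
print(dot(add(v, w), x))       # (1+3)*5 + (2+4)*6 = 56
print(dot(v, x) + dot(w, x))   # 17 + 39 = 56
```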
So let me show you. If I take some scalar and I multiply it times v, some vector v. And then I take the dot product of that with w, if this is associative the way multiplication in our everyday world normally works, this should be equal to-- and it's still a question mark because I haven't proven it to you. It should be equal to c times v dot w. So let's figure it out. What's c times the vector v? c times the vector v is c times v1, c times v2, all the way down to c times vn. And then the vector w, we already know what that is. So dot w is equal to what? It's equal to this times the first term of w. So c v1 w1 plus this times the second term of w, c v2 w2, all the way to c vn wn. Fair enough. That's what this side is equal to. Now let's do this side. What is v dot w? I'll write it here. We've done this multiple times. This is just v1 w1 plus v2 w2, all the way to vn wn. I'm getting tired of doing this and you're probably tired of watching it, but it's good to go through the exercises. You know, if someone asked you to do this now, you'll be able to do this. Now what is c times this? So if I multiply some scalar times this, that's the same thing as multiplying some scalar times that. So I'm just multiplying a scalar times a big-- this is just the regular distributive property of just numbers, of just regular real numbers. So this is going to be equal to c v1 w1 plus c v2 w2 plus all the way to c vn wn. And we see that this is equal to this because this is equal to this. Now the hardest part of this-- I remember when I first took linear algebra, I found that when the professor would assign, you know, "prove this," I would have trouble doing it because it almost seems so ridiculously obvious. That hey, well, obviously if you just look at the components of them, it just turns into multiplying each individual component and adding them up, and those are associative, so that's obviously-- what's there to prove?
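The scalar property just derived, (c v) . w = c (v . w), can be spot-checked the same way; a minimal Python sketch:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def scale(c, v):
    # Scalar multiplication: multiply every component by c
    return [c * vi for vi in v]

c, v, w = 3, [1, 2], [4, 5]

# (c v) . w should equal c (v . w)
print(dot(scale(c, v), w))   # 3*1*4 + 3*2*5 = 42
print(c * dot(v, w))         # 3 * 14 = 42
```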
And it only took me a little while to realize that they just wanted me to write that down. They didn't want something earth-shattering. They just wanted me to show that when you go component by component, and all you have to do is assume kind of the distributive or the associative or the commutative property of regular numbers, you can prove that the same properties also apply, in a very similar way, to vectors and the dot product. So hopefully you found this reasonably useful, and I'll see you in the next video, where we can use some of these tools to actually prove some more interesting properties of vectors.