In the last video we saw why mathematicians have left any non-zero number divided by zero undefined. But that might have raised a question in your brain: what about zero divided by zero? Isn't there an argument for why that could be defined? So we're going to think about zero divided by zero.

Well, there are a couple of lines of reasoning here. One, you could start taking numbers closer and closer to zero and dividing them by themselves. For example, take 0.1 divided by 0.1. That's going to be one. Let's get even closer to zero: 0.001 divided by 0.001. That also equals one. Let's get super close to zero: 0.000001 divided by 0.000001. Once again, that equals one. And it didn't even matter whether these were positive or negative: I could make both of them negative and I'd still get the same result. Negative this thing divided by negative this thing still gets me to one. So based on this logic you might say, "Hey, this seems like a pretty reasonable argument for zero divided by zero to be defined as equal to one."

But someone could come along and say, "Well, what happens if we divide zero by numbers closer and closer to zero; not a number by itself, but zero divided by smaller and smaller numbers?" And so they'd say, "For example, zero divided by 0.1 is just going to be zero. Zero divided by 0.001 is also going to be zero. Zero divided by 0.000001 is also going to be equal to zero." And it didn't matter whether we were dividing by a positive or negative number; make all of these negative and you still get the same answer. So this line of reasoning tells you that it's completely legitimate to think that maybe zero divided by zero could be equal to zero.

And these are equally valid arguments. Because they're equally valid, and because neither of them is consistent with the rest of mathematics, mathematicians have once again left zero divided by zero undefined.
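The two competing lines of reasoning above can be checked numerically. Here is a minimal Python sketch (the function name `ratios` and the choice of exponents are illustrative, not from the video) that evaluates x divided by x and zero divided by x for values of x shrinking toward zero, showing that the two approaches suggest different answers, which is exactly why no single definition works:

```python
def ratios(exponents=(1, 3, 6)):
    """For x = 10**-k, return the pairs (x / x, 0 / x).

    x / x stays at 1.0 no matter how small x gets,
    while 0 / x stays at 0.0 for every non-zero x.
    """
    results = []
    for k in exponents:
        x = 10 ** -k          # e.g. 0.1, 0.001, 0.000001
        results.append((x / x, 0 / x))
    return results

for same_over_same, zero_over_small in ratios():
    # x/x is always 1.0; 0/x is always 0.0
    print(same_over_same, zero_over_small)
```

The sign of x makes no difference, matching the video's point: (-x) / (-x) is still 1, and 0 / (-x) is still 0. Since the two sequences settle on different values, neither 1 nor 0 can be singled out as the value of 0/0.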