Computer science theory
- Primality test challenge
- Trial division
- What is computer memory?
- Algorithmic efficiency
- Level 3: Challenge
- Sieve of Eratosthenes
- Level 4: Sieve of Eratosthenes
- Primality test with sieve
- Level 5: Trial division using sieve
- The prime number theorem
- Prime density spiral
- Prime Gaps
- Time space tradeoff
- Summary (what's next?)
How can we improve the speed of a (deterministic) primality test? Created by Brit Cruise.
Want to join the conversation?
- @2:36 Brit introduced the concept that the square root of the number in question is the worst case. Why can we assume that if n is not divisible by anything up to sqrt(n), then it is prime?(5 votes)
- Hey James, in the CS program I explain this in detail (Level 1). Realize that every number must have a largest and smallest factor. As you search for a factor starting from 2, eventually you will hit the smallest factor. Well, how big can the smallest factor be?
The worst case occurs when the smallest factor EQUALS the largest factor (such as 49=7*7). So given any number n, the smallest factor can be at most sqrt(n), OR ELSE n is prime.
Does this make sense? It's important!(28 votes)
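This idea can be sketched in code. Here is a minimal Python sketch of trial division stopping at sqrt(n) (the function name and details are illustrative, not the lesson's actual program):

```python
import math

def is_prime(n):
    """Trial division: search for the smallest factor of n."""
    if n < 2:
        return False
    # A composite n must have a factor no larger than sqrt(n)
    # (worst case: n = p * p, e.g. 49 = 7 * 7), so we can stop there.
    for a in range(2, math.isqrt(n) + 1):
        if n % a == 0:
            return False  # found a divisor: n is composite
    return True  # no factor up to sqrt(n): n is prime
```

For example, `is_prime(49)` finds the factor 7 exactly at sqrt(49), the worst case described above.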
- Using known properties of prime numbers to limit the search space is one way to improve the algorithmic efficiency, wouldn't changing the primary instruction set of the processor be another?(10 votes)
- What if you could calculate all of the prime values up to a certain limit and then cache them in an array, and when you want to check if a number is prime or not, you could run a for loop to check if the test number is equal to any of the prime numbers in the array. If it is, then it's prime; otherwise it is composite. I think this is impractical and slow, but I can't point out why.(3 votes)
- There are a couple of problems with this approach:
-It becomes a chicken-and-egg problem. How did you calculate the list of prime numbers in your array in the first place? You would want to have the most efficient algorithm you could to find the numbers for that array.
-For large primes it can take a long time to figure out if they are prime. So it would take too much time (many, many years) to test numbers from 2 up to a huge number and then put them in an array.
Making a list beyond the first trillion primes would become impractical, due to the time to make the list and the space it would require. However, we want to deal with much bigger primes, so it would be impractical to make a list of all of these really big primes we want to use.
As a side note, instead of organizing a list of primes in an array and searching with a for loop, if we organized it in a special kind of way called a tree, we could check to see if a number was in our list very quickly. Too bad making the list is not practical.(7 votes)
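As a sketch of the cached-list idea: the list can be precomputed with a sieve, and stored in a Python set so the lookup is a hash check rather than a for-loop scan (a set stands in here for the tree mentioned in the answer; names are illustrative):

```python
def primes_up_to(limit):
    """Sieve of Eratosthenes: return the set of all primes <= limit."""
    sieve = [True] * (limit + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            # Cross off every multiple of p starting at p*p
            for multiple in range(p * p, limit + 1, p):
                sieve[multiple] = False
    return {n for n in range(limit + 1) if sieve[n]}

prime_set = primes_up_to(1000)
# Membership in a set is a fast hash lookup, not a linear search:
print(97 in prime_set)    # True
print(100 in prime_set)   # False
```

The chicken-and-egg and space problems remain, of course: this only works up to a limit you can afford to sieve.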
- What does the computer actually do to divide numbers? Does it do long division like a person does?
What I'm wondering is whether using tricks to check divisibility would be faster than dividing and looking at the remainder. Like, if the number ends in 2, 4, 6, 8, or 0, would checking that be faster than checking if it divides by two? (It could be something like "if last_digit in [2, 4, 6, 8, 0]")(3 votes)
- Computers divide in binary (ones and zeros), so the last digit in the numbers they see is either 1 or 0. They don't take remainders into account unless they are told to (you would have to store the result in a variable that can hold fractions).
However, checking whether a binary number divides by two is very simple: if the last (rightmost) bit is 0, it divides by 2; if it's 1, it doesn't. This property sometimes comes in handy for certain computations.(3 votes)
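The rightmost-bit check can be written directly (illustrative Python; the function name is made up):

```python
def is_even_by_bit(n):
    # In binary, the rightmost bit of an even number is always 0,
    # so masking with 1 checks divisibility by two directly.
    return n & 1 == 0

# It agrees with the remainder check for every input:
for n in range(100):
    assert is_even_by_bit(n) == (n % 2 == 0)
```

In practice compilers and interpreters often perform this kind of bit trick for you, so the plain `n % 2 == 0` is usually just as fast.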
- Did anyone else notice just how often the beginning of the number of steps approximated the digits of pi? Seeing 314..., 3142..., etc. several times surprised me. Is that related to how your number was constantly very close to the digits of the square of pi? (i.e. it was in the 987... range for most if not all of the time, and the square of pi is 9.8696...). I'm just guessing here, but if there's a connection there, that's pretty cool. Great videos! They really make me think.(3 votes)
- The number of cycles for algorithm A is always equal to the square root of the number being tested, rounded down, and then plus one. This is present in the program because the variable representing the upper bound of numbers to try, called "upperBound," is set to floor(sqrt(inputNum)).
For example, to find the number of cycles for factoring 1000 using algorithm A, you would find the square root of 1000, which is about 31.62, then round it down to 31, and then add one to get 32.
So I guess it makes sense that if the number being factored is close to pi squared, the number of cycles would be close to pi. That was a really good observation though, and it took me a bit to figure it out.(3 votes)
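That counting rule can be written as a tiny helper (hypothetical name; this just restates the floor(sqrt(n)) + 1 formula from the answer):

```python
import math

def steps_for_algorithm_a(n):
    # Number of cycles: floor(sqrt(n)), then plus one,
    # matching upperBound = floor(sqrt(inputNum)) in the program.
    return math.isqrt(n) + 1

print(steps_for_algorithm_a(1000))  # 32: floor(31.62...) + 1
```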
- It seems like this algorithm would take just as much time as the factorization would. What do you think?(3 votes)
- If the number is composite, then I only need to find one factor (other than 1 and itself) to prove a number is not prime, but I need to find all the factors to factor a number.
If the number is prime, then factoring and proving the primality of the number are essentially the same problem.
Thus, factoring is at least as hard as proving the number is prime or not.(3 votes)
- True or false: is zero prime?(2 votes)
- Any number multiplied by zero gives zero, including zero itself, so zero is NOT prime: it has infinitely many factors.
It's actually not composite either, because a product can only be zero if at least one factor is zero (no two numbers, or one number used twice, other than zero can form a product of zero).(3 votes)
- Couldn't you make it only scan prime numbers less than n? Because if a number isn't divisible by 3, then it can't possibly be divisible by, say, 9.(3 votes)
- This would be more efficient, but it requires a list of primes up to the square root of the largest prime your program can handle. For a more detailed description about why this would be impractical, see the question posted by sauj123 and answered by Cameron.(1 vote)
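A sketch of primes-only trial division, assuming a precomputed list of small primes covering every prime up to sqrt(n) (illustrative Python, not the lesson's program):

```python
import math

def primes_up_to(limit):
    """Sieve of Eratosthenes, returned as a sorted list."""
    sieve = [True] * (limit + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, math.isqrt(limit) + 1):
        if sieve[p]:
            for m in range(p * p, limit + 1, p):
                sieve[m] = False
    return [n for n, flag in enumerate(sieve) if flag]

def is_prime(n, small_primes):
    """Trial division over prime candidates only.
    small_primes must contain every prime up to sqrt(n)."""
    if n < 2:
        return False
    for p in small_primes:
        if p * p > n:
            break        # passed sqrt(n) with no factor found
        if n % p == 0:
            return False
    return True

small = primes_up_to(1000)       # enough for any n up to 1,000,000
print(is_prime(999983, small))   # True: largest prime below a million
```

This shows the tradeoff from the answer: the scan is faster, but the helper list itself must be built and stored first.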
- I still don't get why we need to do it fast.(1 vote)
- Check out the series "Journey into Cryptography"; it will give context for why a fast primality test is important (we need to be able to generate primes quickly).(4 votes)
Voiceover: I have a report now that they are sending a new rover to Mars. We are responsible for doing one thing, which is programming the algorithm inside the rover which checks if numbers are prime. Let's say our rover is communicating using RSA. It needs an algorithm inside that can do a primality test. When you design a rover, or anything to go into space, you have to be very efficient in every way. The components used have to be very light. The amount of power every subsystem uses has to be minimal. You have to be saving energy and mass at every point in the design process. We have our work cut out for us. We have to be able to deal with numbers up to this size, and it has to do it very fast. If we give it a very small input, let's say just 90, it should be able to calculate that almost as fast as this entire number. As the input grows, we don't want to see any noticeable slowdown.

I want to analyze the user questions, or the ideas users have, which were really good from a mathematical standpoint. We are checking if N is prime or composite. Given some number N, think of this space of all possible divisors. If N is 100, this space is 2 to 100. What our algorithm is doing is searching this space. Think of the algorithm as searching a space. At each point it is asking a question; think of it as a step, a primitive step. It is asking a question, and it's actually a composite test. Say this is a number A. In the trial division method we would ask "does A divide N?" We just try this: we drop A in, we compute N divided by A, and we see if the remainder is zero, which means A is a divisor of N. Then we raise the flag and know 100 percent sure it's composite. Otherwise, at each step, we aren't quite sure. It might be prime, but we're not sure. We continue searching until we hit the end. Remember, our wall in this case was at the square root of N.
The worst case occurs when N is prime: we search all the way to the square root of N, and then we can say "Ah, it is prime," and we are 100 percent sure. The same happens when N is the product of two equal primes: seven times seven equals 49, so if we throw 49 into our algorithm, we go all the way to the square root of N.

Now, the first set of questions. For example, TheFourthDimension asks, "Once we rule out two as not divisible, then all multiples of two could be ruled out, the same for three, four, five, et cetera." That's a really great point. Our old algorithm was stepping along one at a time. We could be using patterns we know about composite numbers, such as: we know for sure that if a number is divisible by two, and it is greater than two, then it's composite. So two is prime, and then we can say four is composite, six, eight, 10 (I missed five here, sorry about that), and instead we can step like this: three, five, seven, nine. One category of questions all try to reduce this space. If we eliminate all the even numbers, the check space, instead of being up to the square root of N, is the square root of N divided by two. We can find other patterns in composite numbers to try to make this even smaller.

The other type of question concerns the case where we find a composite witness. That is, we find an A that allows us to say "Oh, we know N is composite." Lola said, "Wouldn't leaving the loop as soon as primeCheck equals false cut down on some of the time?" Yes, that is totally correct and is something we want to do. As soon as we are searching along, using some step defined by whatever pattern we are using, and we find a composite witness, that means we can say with 100 percent confidence that we should halt early. We stop and say "Oh, we're done. We won't continue searching." This halting early is great, except it doesn't help us in the worst case: if N is prime, then the early halting won't help us.
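The two user improvements described here (skipping even candidates and halting early) can be sketched together in Python (an illustrative sketch of algorithm B's ideas, not the actual program in the visualization):

```python
import math

def is_prime_improved(n):
    """Trial division with both suggestions applied: halt as soon
    as a composite witness is found, and skip even candidates."""
    if n < 2:
        return False
    if n == 2:
        return True
    if n % 2 == 0:
        return False              # 2 is a composite witness: halt early
    for a in range(3, math.isqrt(n) + 1, 2):   # odd candidates only
        if n % a == 0:
            return False          # composite witness found: halt early
    return True   # worst case: searched all the way to sqrt(n)
```

Note the worst case (n prime) still searches the whole reduced space, which is exactly the limitation the transcript points out.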
Now, we can visualize how these improvements allow us to reduce the space, thus preventing as many checks happening inside the computer, which, depending on our computer, will reduce the amount of time it takes. I can show you the visualization I set up below, which allows us to compare two algorithms based on how many steps occur during their execution. On the left is algorithm A, which is trial division checking from two to the square root of N. On the right is algorithm B, which is, let's say, our improved algorithm. In this case, I have applied the check for divisibility by two, so we only do half as many steps, and I also terminate early in algorithm B. It's not perfect; I just applied a few user modifications so we can see what happens.

Now, I am going to play with this to show you. Here we can see, as I scroll, algorithm A. We have the output, whether it is composite or prime, and at the bottom you'll see the number of steps. The first thing to notice is that on the right-hand side every other number takes only one step. We know why that is: if it is an even number such as this, it will quit. Our old algorithm took 314 steps; our new algorithm only took one step, because it checks if it is divisible by two. That seems like a really nice optimization. However, as we grow our input, you'll notice the steps increase, and the bar grows and turns red once we approach the region where we are going to crash. This red line up here is the step limit. If your bar hits there, we fail, which means our rover would break. In this case, the browser will actually crash. I'll try to get close to it. I am close to it now, and it's running very slow. I can feel that my browser is just about to crash and give me the circle of death. Notice the improved algorithm took only two steps in that case, but remember the worst case. I have a few large prime numbers saved here, for example. We have to be able to handle any case.
We don't know what we are throwing at our algorithm. If I throw a large prime at it, look what happens. Algorithm B, as we know, is taking half as many steps in the worst case, but it is still taking many steps here, because it is the worst case. We are getting close to crashing here, and this is not a very long prime; we are still way under 15 digits. When I put this 12-digit prime into our algorithm, let's see what happens. It's lagging; it's going to crash. Look what happened: both algorithms went way overboard. It didn't work. Our improved algorithm is not good enough yet.

But now we have an understanding of what we are trying to improve, which is the number of steps in the worst case. We also have this cool tool which allows us to visualize how fast the number of steps grows as our input grows. Below, I'll explain how I set this up so you can set up your own algorithm, compare it with the base case, and see how much better you can do.
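One way to count steps the way the visualization does is to instrument each algorithm with a counter (a rough Python sketch; the real tool's exact counting may differ):

```python
import math

def steps_a(n):
    """Algorithm A: trial division from 2 up to sqrt(n)."""
    steps = 0
    for a in range(2, math.isqrt(n) + 1):
        steps += 1
        if n % a == 0:
            break                 # composite witness: halt
    return steps

def steps_b(n):
    """Algorithm B: check 2 first, then odd candidates only."""
    steps = 1                     # the divisibility-by-2 check
    if n % 2 == 0:
        return steps
    for a in range(3, math.isqrt(n) + 1, 2):
        steps += 1
        if n % a == 0:
            break
    return steps

for n in (90, 997):               # a composite and a prime
    print(n, "A:", steps_a(n), "B:", steps_b(n))
```

On primes, the worst case, B takes roughly half as many steps as A, which matches what the video shows: an improvement, but still growing with sqrt(N).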