
I have an update: NASA has said that there will be a hardware random number generator on the rover that we have access to. They also made one more comment. They reminded me that we just need our algorithm to work in practice. For something to work in practice means that there is always some possibility of error, but perhaps the possibility is so small that we can ignore it in practice. If this sounds crazy, just realize that in the physical world nothing is ever certain; there are always chances for error. For example, chip packaging contains small amounts of radioactive contaminants, and when these decay they release alpha particles which can actually flip bits in memory and change a number unexpectedly. Even more interesting, cosmic rays can also cause soft errors. We can never remove the chance of error completely.

So I asked NASA exactly what error probability is acceptable. They said we just need to make sure that the probability of error for a given trial is less than the odds of hitting the lottery twice in a row. The odds of hitting a lottery such as 6/49 twice in a row are about 6 × 10^-14, so let's be safe, round down, and say our error probability must be below 10^-15. That is safe enough: we would not expect to see an error even if the algorithm runs hundreds or thousands of times.

Now, my question is: would access to randomness help us speed up a decision algorithm such as this primality test? This is a deep question, so let's step back and look at the big picture. Given some collection of integers from 1 up to some limit, let's think of it as our universe of integers. We can always divide a collection into two sets, and a decision problem can actually be thought of as a partition into two sets. For example, the question "is n less than 100?" outputs yes for all integers less than 100, which is one collection, and no for all others, which is another collection. Okay, so let's now focus on recognizing primes with a decision algorithm.
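The partition view can be sketched in a few lines of Python. This is only an illustration; the universe bound of 200 and the function name are choices made here, not part of the lesson:

```python
# A decision problem partitions a universe of integers into two sets:
# the "yes" instances and the "no" instances.

def is_less_than_100(n):
    """Decision procedure: answers yes/no for a single integer."""
    return n < 100

universe = range(1, 201)  # a small universe of integers, 1..200
yes_set = {n for n in universe if is_less_than_100(n)}
no_set = {n for n in universe if not is_less_than_100(n)}

# The two sets are disjoint and together cover the whole universe.
assert yes_set.isdisjoint(no_set)
assert yes_set | no_set == set(universe)
print(len(yes_set), len(no_set))  # 99 integers answer yes, 101 answer no
```

Any decision procedure, including a primality test, induces such a partition of the universe.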
Ideally, what we would like is some simply computed criterion that all primes satisfy and no composites satisfy. If we knew some simple pattern which describes the location of all primes and only primes, then given some number n we could simply check whether n follows that pattern. The problem is that so far no exhaustive, simply computed pattern has been found for all primes and no composites, or for all composites and no primes.

But we do know fast tests for most composite numbers. For example, a simply computed criterion is a test for evenness: all even numbers are divisible by two, and the test is fast because we can tell whether a number is even by just looking at the last digit. Even as n grows, the problem doesn't get any harder; we just look at that last digit, also known as the low-order bit.

Now realize that we can use this evenness test as a low-quality composite test. Our algorithm accepts an integer n, and all it does is check whether n is even. If n is even and greater than two, then we know with 100% certainty it is composite, because we have proof: think of two as our composite witness. However, if not, we aren't exactly sure. If n is odd, it could be prime, since all primes greater than two are odd, or it could be one of the many composites which are odd, such as 9 or 15. This massive region of odd composites makes for an unacceptable test.

To be clear, let's draw this. Provided some n, it can be either prime or composite. If n is prime, our algorithm will say "prime" 100% of the time, since no primes greater than two are even; it will never say "composite" when a prime is provided. However, if n is composite, our algorithm will say "composite" only about 50% of the time and "prime" about 50% of the time. So if our algorithm outputs "composite", it found proof, a composite witness. But if our algorithm outputs "prime", we know there is an unacceptably large chance of error. Until now, our strategy has dealt with this unacceptably large error by iteration.
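The evenness test described above can be written as a tiny Python sketch. The return convention (a label plus an optional witness) is a choice made here for clarity:

```python
def evenness_composite_test(n):
    """A low-quality composite test: checks only the low-order bit.

    Returns ("composite", 2) when 2 is a witness that n is composite,
    or ("unsure", None) when n is odd and could be prime or composite.
    """
    if n > 2 and n % 2 == 0:      # even and greater than 2: proof of compositeness
        return ("composite", 2)   # 2 is our composite witness
    return ("unsure", None)       # odd: could be prime (7) or composite (9, 15)

print(evenness_composite_test(10))  # ('composite', 2)
print(evenness_composite_test(9))   # ('unsure', None) -- composite, but the test can't tell
print(evenness_composite_test(7))   # ('unsure', None) -- actually prime
```

Notice the asymmetry: a "composite" answer is proof, while an "unsure" answer carries the large error the transcript describes.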
Let's draw the flow of our machine. Given some n, our machine does a series of divisibility tests starting at a = 2. It asks: does a divide n? If 2 doesn't divide n, then we can eliminate all of this region. Then it asks the next question: does 3 divide n? If not, we eliminate that whole section. Then does 5 divide n, then 7, and so on. Notice these are disjoint sets which gradually fill up all possible composites. If we ever answer yes along the way, then we have proof that n is composite, and a, whatever it is at that point, is our witness, so we halt and output "composite". Otherwise we continue trying until we exhaust all possible composites, that is, until a exceeds the square root of n, since we remember that every composite number n must have a factor less than or equal to the square root of n. This leads to the final question in our algorithm: if no witness is found and a is greater than the square root of n, then we suddenly have proof, and we halt and output "prime".

So realize we have two exit paths through our algorithm, and both paths represent proof, certainty that n is either prime or composite. When n is prime, we try all possible divisors and rule them all out, and that is our proof that n is prime. The problem with this strategy is that our test value a at minimum requires us to cycle through every prime from 2 up to the square root of n, and as n grows, the number of primes grows with it. That results in too many iterations in the worst case, such as when we provide a prime to our algorithm. Looking at the big picture now, realize the algorithm is providing proof when n is prime, which we now know we don't exactly need. No one said we needed to prove it; we just need to be 99.9999…% certain. Hmm. So we actually need to think about the composite test that's being used in our algorithm. Remember, our fastest trial division primality tests thus far have tried to use prime patterns, such as the fact that all primes are of the form 6k ± 1, to
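The flow just described can be sketched as a minimal trial-division routine. For simplicity this sketch steps through every integer a rather than primes only, which is correct but slightly slower than the version the transcript mentions:

```python
def trial_division(n):
    """Deterministic primality test by trial division.

    Tries candidate divisors a = 2, 3, 4, ... up to sqrt(n). Both exit
    paths are proofs: a found divisor a is a witness that n is composite,
    and exhausting all a <= sqrt(n) proves n is prime.
    """
    if n < 2:
        return ("composite", None)   # treat 0 and 1 as non-prime for this sketch
    a = 2
    while a * a <= n:                # only need divisors up to sqrt(n)
        if n % a == 0:
            return ("composite", a)  # a is our composite witness: halt early
        a += 1
    return ("prime", None)           # no witness found: proof of primality

print(trial_division(91))   # ('composite', 7) since 91 = 7 * 13
print(trial_division(97))   # ('prime', None) -- worst case: every a is tried
```

The worst case is exactly the one the transcript complains about: when n is prime, the loop must run all the way to sqrt(n) before it can halt.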
help us walk along the primes only and eliminate many of the composites, to save time. And remember, a test like this can be turned into a composite test. Given some integer n, if n is of the form 6k ± 1, then we can say it's probably prime, but there are many composites which still follow this pattern: the pattern doesn't include primes only, just all primes and some composites. We can think of these extra composites as a leak, and this leak is our source of error. Now, if we invert it and ask whether n is not of the form 6k ± 1, then we can say with 100% certainty that n is composite. So the leak is our source of error on the prime side, and in this case it's unacceptable: it's a non-negligible error, a much larger probability than we can accept. We need to define a new testing procedure which is able to shrink this space and make the chance of error negligible, which means we'll need to review how we can actually shrink errors with iterations. I think we should now post our ideas below regarding what kinds of tests we might want to perform and, more importantly, how a random number generator could really help us.
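The inverted 6k ± 1 test can be sketched as follows; note that "of the form 6k ± 1" is the same as n mod 6 being 1 or 5, and the function name and return strings are choices made here:

```python
def six_k_composite_test(n):
    """Composite test built from the 6k +/- 1 pattern.

    All primes greater than 3 are of the form 6k+1 or 6k-1, so any n > 3
    that is NOT of that form is certainly composite. Numbers that do fit
    the pattern are only "probably prime": composites like 25 leak through.
    """
    if n > 3 and n % 6 not in (1, 5):  # not of the form 6k +/- 1
        return "composite"             # 100% certain: proof
    return "probably prime"            # the leak: some composites land here

print(six_k_composite_test(27))  # composite, since 27 = 6*4 + 3
print(six_k_composite_test(25))  # probably prime -- wrong! 25 = 5*5 leaks through
print(six_k_composite_test(29))  # probably prime -- actually prime
```

As a preview of shrinking error by iteration: if each independent trial of some test wrongly reports "probably prime" with probability at most 1/2, then k independent trials shrink the error to at most 2^-k, and around 50 trials would already push it below our 10^-15 target.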