I have a new update. I contacted the engineering department at NASA and found out that the new rover is using the same memory platform used on Curiosity. The Curiosity rover was equipped with two computers, though only one was active at a time, with the following specs: two gigabytes of flash memory, 256 megabytes of random access memory, and 256 kilobytes of read-only memory, which held core system routines. We want to be able to store our entire program in RAM. However, because we have to share this space with other programs, we are allocated 5% of RAM, the maximum we can use, which is about 12.8 megabytes total.

I bring this up because I want to introduce the idea of the time-space trade-off, or space-time trade-off, a commonly used term in computer science. I was looking at a program submitted by Ivy 46, which used a million-prime array so that the algorithm could step along primes only, as far as possible, when doing a trial division primality test. And it begs the question: why not just store all the primes we need, up to some limit, in an array instead of calculating them on the fly? We mentioned in a previous video that this would be optimal for a trial division algorithm. But although this algorithm does not use many steps, it began to run very slowly and eventually crashed before hitting the step limit, so it wasn't able to quickly solve the problem for the sizes I defined earlier. In this case, time, in the form of repeated divisibility tests, was being saved at the expense of space: the memory needed to hold the array.

Now, why didn't this work? Let's do a rough calculation using what we've learned to find out whether this method is possible under our memory limit. Remember, we must be able to deal with numbers up to just over 9 quadrillion, and our trial division algorithm only needs to check for factors up to the square root of a number; if it reaches that point with no factors found, it can say with 100% certainty whether or not the number is prime.
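The table-plus-trial-division scheme described above can be sketched as follows. This is a minimal illustration, not Ivy 46's actual program: the cutoff of 1,000 is a hypothetical value chosen to keep the example small, so this test is only valid for numbers up to about one million. To cover numbers up to 9 quadrillion, the table would need every prime up to roughly 94.9 million.

```python
import math

def primes_up_to(limit):
    """Sieve of Eratosthenes: builds the precomputed prime table (the 'space' side)."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, math.isqrt(limit) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [n for n, flag in enumerate(is_prime) if flag]

# Hypothetical small cutoff: primes up to 1,000 make the test valid for n up to ~1,000,000.
PRIME_TABLE = primes_up_to(1000)

def is_prime(n):
    """Trial division (the 'time' side), dividing only by stored primes up to sqrt(n)."""
    if n < 2:
        return False
    root = math.isqrt(n)
    for p in PRIME_TABLE:
        if p > root:
            break          # no factor found up to sqrt(n), so n is prime
        if n % p == 0:
            return False   # found a factor
    return True
```

The trade-off is visible here: the table costs memory up front, and in exchange each test divides only by primes rather than by every candidate up to the square root.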
So, how many primes are there up to the square root of this limit? The square root of 9 quadrillion is about 94.9 million, so: how many primes are there under 95 million? Luckily, we have learned a mathematical way to approximate this answer: the prime number theorem. The number of primes under x is approximately x divided by the natural logarithm of x, and with x just over 94.9 million, we find the number of primes is approximately 5.1 million.

Now, because we are storing these primes, we need to know the size of the largest one, in this case approximately the 5.1-millionth prime, which we know will be some number less than 94.9 million. I double-checked a table, and the largest prime we would need to store under the square root of our limit is roughly 89 million. To see how much space that takes, let's convert the number into binary notation, which is how the computer will store it using tiny switches in memory (we learned about this in the computer memory video). Written out in bits, this largest prime is 27 bits long, so storing this single number takes a little under three and a half bytes.

So let's say we want to store each number in memory. Since we know the largest number requires 27 bits, we can just imagine a long list of 27-bit blocks, one per prime. How many bits are needed for this entire list? The memory needed is the number of values, that is, the number of primes, times the space per value. We have around 5.1 million values times 27 bits per value, which is just under 138 million bits, or, converted to bytes, about 17.2 million bytes. Call this just over 17 megabytes. So it is not possible to store even this bare list of numbers within our 12.8-megabyte allocation. And this is a toy example; it's actually an underestimate of what you'd really need. For instance, an array may need space for a pointer to each element, and each pointer on a 32-bit machine is 4 bytes, so the actual amount of memory needed would be much more than 17 megabytes.
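The back-of-the-envelope estimate above can be reproduced in a few lines (a sketch assuming the decimal megabytes used in the text, and 2^53 as the "just over 9 quadrillion" limit):

```python
import math

RAM_BUDGET = 12.8e6          # our 5% share of 256 MB of RAM, in bytes
LIMIT = 2 ** 53              # just over 9 quadrillion

root = math.isqrt(LIMIT)                 # ~94.9 million: largest factor we must test
prime_count = root / math.log(root)      # prime number theorem: x / ln x, ~5.1 million
bits_per_prime = root.bit_length()       # every stored prime fits in 27 bits
bytes_needed = prime_count * bits_per_prime / 8

print(f"primes to store: {prime_count:,.0f}")        # ≈ 5.2 million
print(f"bits per value:  {bits_per_prime}")          # 27
print(f"bytes needed:    {bytes_needed:,.0f}")       # ≈ 17.4 million
print(f"fits in {RAM_BUDGET:,.0f}-byte budget? {bytes_needed <= RAM_BUDGET}")
```

The last line prints `False`: even this optimistic bit-packed list overshoots the allocation before any pointer or bookkeeping overhead is counted.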
That being said, we now know that storing all primes up to the square root of our relatively small limit is not even possible within our memory allocation; we can't do it this way. And what about when the numbers grow by a factor of a thousand, or ten thousand? Modern-day cryptographic systems use 512-bit or even larger numbers, making this kind of search and enumeration impossible. For example, if someone asks you to make a list of all prime numbers up to numbers which are two hundred digits in length, you shouldn't even consider it, because the hard drive needed to store all those primes would be heavier than the mass of the observable universe. I'll leave the calculation for you to try next time you're at a restaurant with crayons and paper all over the table, but remember: there are around 10 to the 80 atoms in the observable universe. That's an 80-digit number.

Realize now that there is a fundamental limit on how much space, or memory, we have access to, never mind how much time it would take. There is always this push and pull between using space or time to solve our problems. So, to solve this problem of testing for primality quickly, using a small amount of space and a small amount of time, we are going to have to approach it in an entirely new way.
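For the restaurant-napkin calculation mentioned above, a rough sketch of why a list of all primes up to 200 digits outweighs the universe (same prime-number-theorem estimate as before; 10^200 stands in for "numbers up to 200 digits"):

```python
import math

ATOMS_IN_UNIVERSE = 1e80     # the rough figure quoted above, an 80-digit number
LIMIT = 10 ** 200            # numbers up to 200 digits in length

prime_count = LIMIT / math.log(LIMIT)    # prime number theorem: ~2.2e197 primes
bytes_each = LIMIT.bit_length() / 8      # ~665 bits, ~83 bytes per 200-digit number
total_bytes = prime_count * bytes_each

# Even at an absurd one byte per atom, the drive would need ~1e119 times
# more atoms than the observable universe contains.
print(f"primes below 10^200: {prime_count:.1e}")    # ≈ 2.2e197
print(f"bytes to store them: {total_bytes:.1e}")    # ≈ 1.8e199
print(f"atoms in universe:   {ATOMS_IN_UNIVERSE:.0e}")
```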