Lesson 1: Ancient information theory
- What is information theory?
- Origins of written language
- History of the alphabet
- The Rosetta Stone
- Source encoding
- Visual telegraphs (case study)
- Decision tree exploration
- Electrostatic telegraphs (case study)
- The battery and electromagnetism
- Morse code and the information age
- Morse code Exploration
What is information theory?
A broad introduction to this field of study. Created by Brit Cruise.
Want to join the conversation?
- What is bit density?(186 votes)
- Bit density measures how many bits can be stored in some area or volume. As far as I'm aware there is no formal definition but you can get an informal idea from a few examples.
A typed paper has some number of letters on it, let's say 1000. Then a rough estimate for the bit density would be 1 KB per sheet of paper, assuming 1 byte per character. There could be many ways to increase the density, e.g. use a smaller font or use something more efficient than letters, like tiny squares that can either be on (black) or off (white). I've heard of software to back up information on paper that can store 500 KB per sheet.
On the other hand there are fingernail-sized microSD cards that can store 64 GB - a much higher bit density!
Given that a sheet of paper has an area of ~624 cm^2, that means its bit density is about 1 KB / 624 cm^2 = 0.0016 KB / cm^2. But a microSD card has an area of about 1.65 cm^2, so its bit density is about 64 GB / 1.65 cm^2 = 38.8 GB / cm^2. So, by this rough calculation, the bit density of a microSD card is about 24 billion times greater than that of a sheet of paper.(239 votes)
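As a rough sketch of the arithmetic in this answer (the capacities and areas are the commenter's ballpark figures, not measured values), expressed in Python:

```python
# Rough bit-density comparison, using the ballpark figures from the answer above.
paper_bits = 1000 * 8             # ~1000 characters at 1 byte (8 bits) each
paper_area_cm2 = 624              # roughly one A4 sheet
microsd_bits = 64e9 * 8           # a 64 GB card
microsd_area_cm2 = 1.65           # ~15 mm x 11 mm

paper_density = paper_bits / paper_area_cm2         # bits per cm^2
microsd_density = microsd_bits / microsd_area_cm2   # bits per cm^2

print(f"paper:   {paper_density:.1f} bits/cm^2")           # ~12.8
print(f"microSD: {microsd_density:.3e} bits/cm^2")         # ~3.1e11
print(f"ratio:   {microsd_density / paper_density:.2e}")   # ~2.4e10, i.e. about 24 billion
```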
- In electronics, a 'bit' is the best 'fundamental unit of information,' because the 'smallest unit of information' in electronics IS a bit (with only two options, on or off). Intuition tells me that we use the bit because (1) It is the smallest prime number and (2) It is the basic unit for information stored in electronics (where we see much of the practical application of this theory).
Am I right to assume that a bit is not always either the most representative or the most fundamental unit to quantify information?
For example, DNA has 4 molecules, so the smallest DNA pair has 4 options rather than 2... right? The coding of proteins in the human body is driven by 20 proteinogenic amino acids (base 20). In the English language, there are 26 letters (perhaps punctuation should be added, making it more)... we just translate it to bits/bytes out of convenience (for conceptualizing and structuring the programming of computers). Though I can't think of any examples, I bet there are examples in nature that are 'base 3', which would not translate 'naturally' to 'base 2.' Some units of information are not even discrete. The smallest unit of light (a single photon) could theoretically be divided into an 'infinite spectrum of wavelengths' (base infinity??), limited only by our ability to detect small differences.
Also, I imagine 'base 2' is theoretically NOT the most efficient way to store information if the primary goal is increased information density.
Great video! Seems like one of those concepts that is both simple and complex at the same time. It is messing with me. Are the above questions/assumptions correct?(21 votes)
- In RNA there is also uracil, which replaces thymine (so A pairs with U instead of T). Check out https://www.khanacademy.org/science/biology/evolution-and-natural-selection/v/dna(4 votes)
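One way to make the poster's question concrete: a symbol drawn from an alphabet of N equally likely options carries log2(N) bits, which is why bases other than 2 still translate to bits (just not to whole numbers of them). A minimal sketch, with the alphabet sizes taken from the question above:

```python
import math

# Bits per symbol for a few "alphabets", assuming every symbol is equally likely.
alphabets = {
    "coin flip (base 2)": 2,
    "DNA base (base 4)": 4,
    "amino acid (base 20)": 20,
    "English letter (base 26)": 26,
}

for name, size in alphabets.items():
    print(f"{name}: log2({size}) = {math.log2(size):.2f} bits per symbol")
```

Note the equal-likelihood assumption: real English letters (and real DNA) are not uniformly distributed, so their true per-symbol information content is lower than these figures.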
- At 2:27, he says one bit is the answer to a yes or no question. So is one bit one yes or one no to a question, or the question itself, or the question and the answer?(7 votes)
- The bit is the yes or no, the 1 or 0, the on or off.(5 votes)
- Hey, how could the value on the scale not be an integer? =/ I mean, bits can't be divided into smaller parts, or can they?(8 votes)
- Great question. Yes, they can! We will cover this in detail soon (the video on entropy has not been posted yet; still working on it).(8 votes)
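To see how a non-integer number of bits can arise, here is a small sketch of Shannon entropy (the topic of the upcoming video mentioned in the reply above) for coins of varying bias:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit per flip
print(entropy([0.9, 0.1]))   # biased coin: ~0.47 bits per flip
print(entropy([1.0]))        # certain outcome: 0 bits
```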
- The various forms of Alice's message at 2:30 all have a bit value of 25.5. Does that mean there are 25.5 parts to the idea of her message or 25.5 letters in the message?(7 votes)
- Some of something, but not all of it; only a little bit.(2 votes)
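Another way to read that figure (an illustrative interpretation, not stated in the video itself): 25.5 bits does not mean 25.5 parts or letters; it means the message singles out one possibility from roughly 2^25.5 equally likely alternatives. A quick check in Python:

```python
# 25.5 bits of information = one choice out of about 2**25.5 equally likely messages.
print(2 ** 25.5)    # ~47.5 million distinguishable messages
print(25.5 / 4.7)   # ~5.4 letters' worth, if letters were equally likely at ~4.7 bits each
```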
- Did you have to rip out the page of a perfectly good book? I mean, surely you could have made your point some less damaging way. I literally recoiled in horror. (Yes, I mean literally.)(6 votes)
- Asked the same question. I was mortified that he would do that to a book (ಥ‸ಥ)(3 votes)
- What communication method has been used the most, and is still in use, since the very beginning?(4 votes)
- I think art, because people still draw things to show their feelings, events, and maybe even a story (like in cartoons or anime).(4 votes)
- What's information theory in a nutshell?(4 votes)
- A theory which answers two important questions (among other things):
1. What is the speed limit of information? How do we measure this?
2. How can we send information reliably in the presence of noise?(5 votes)
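The video doesn't go into it, but as a concrete taste of question 1: Shannon's later channel-capacity result (the Shannon-Hartley theorem) puts the speed limit of a noisy channel at C = B · log2(1 + S/N) bits per second. A minimal sketch:

```python
import math

def channel_capacity(bandwidth_hz, signal_to_noise_ratio):
    """Shannon-Hartley capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + signal_to_noise_ratio)

# Example: a 3 kHz telephone-grade channel with a signal-to-noise ratio of 1000 (30 dB)
print(channel_capacity(3000, 1000))   # ~29,900 bits per second
```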
- At 2:57 he says, "Does information have a speed limit?" What does he mean by information's speed?(4 votes)
- In which way should I interpret the statement at 2:28 of a bit being 'a measure of surprise'? Is it 'surprise' as in "Ooh, interesting, I did not expect that to happen"? Perhaps it alludes to the surprise that may be contained within or caused by the information itself? Or is there another, different meaning that is getting lost in translation here? (English is not my native language)(3 votes)
- Great question! Surprise will be linked to the number of *possible* outcomes. When you win the lottery you are surprised because there were so many other possible outcomes. When you flip a coin and guess correctly, you are not as surprised.(2 votes)
Video transcript
Imagine Alice has an idea and she wants to share it. There are so many ways to share an idea. She could draw a picture, make an engraving, write a song, (piano music) send a telegraph or an email. But how are these things different? And more importantly, why are they the same? This story is about a fundamental particle of all forms of communication. It begins with a special skill you likely take for granted: language. All language allows you to take a thought or mental object and break it down into a series of conceptual chunks. These chunks are externalized using a series of signals or symbols. Humans express themselves using a variation in sound and physical action, as do chirping birds and dancing bees, and man-made machines exchanging a dancing stream of electrical vibrations. Even our bodies are built according to instructions stored inside microscopic books known as DNA. All are different forms of one thing: information.

In simplest terms, information is what allows one mind to influence another. It's based on the idea of communication as selection. Information, no matter the form, can be measured using a fundamental unit, in the same way we can measure the mass of different objects using a standard measure such as kilograms or pounds. This allows us to precisely measure and compare the weight of, say, rocks, water, or wheat using a scale. Information too can be measured and compared using a measurement called entropy. Think of it as an information scale. We intuitively know that a single page from some unknown book has less information than the entire book. We can describe exactly how much using a unit called the bit, a measure of surprise.

So no matter how Alice wants to communicate a specific message (hieroglyphics, music, computer code), each would contain the same number of bits, though in different densities. And a bit is linked to a very simple idea: the answer to a yes or no question. Think of it as the language of coins.

So how is information actually measured? Does information have a speed limit? A maximum density? Information theory holds the exciting answer to these questions. It's an idea over 3,000 years in the making. But before we can understand this, we must step back and explore perhaps the most powerful invention in human history: the alphabet. And for this, we return to the cave.
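The transcript's "answer to a yes or no question" idea can be made concrete with a guessing game: k yes/no questions can distinguish at most 2^k possibilities, so pinning down one item out of N equally likely options takes about log2(N) questions, i.e. about log2(N) bits. A small illustrative sketch:

```python
import math

def questions_needed(possibilities):
    """Yes/no questions (bits) needed to single out one of `possibilities` equally likely options."""
    return math.ceil(math.log2(possibilities))

print(questions_needed(2))          # 1 question: heads or tails?
print(questions_needed(26))         # 5 questions to pin down one English letter
print(questions_needed(1_000_000))  # 20 questions to pin down one number in a million
```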