The Internet began its life as a network connecting universities and research centers. Once it became available and affordable for consumers, it shot up in popularity and is now used by an estimated 4.5 billion people.
Fortunately, the protocols powering the Internet and the Web were designed for scalability. A scalable system is one that can continue functioning well even as it experiences higher usage.
What features increase the scalability of the Internet?
- Any computing device can send data around the Internet if it follows the protocols. There is no bureaucratic process that blocks a device from joining or prevents a programmer from learning how the protocols work.
- The IPv6 addressing system can uniquely address more than a trillion times the number of devices currently connected to the Internet.
- Routing is dynamic, so new routers can join a network at any time and help to move data packets around the Internet.
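To get a feel for how roomy the IPv6 address space is, here's a quick back-of-the-envelope calculation. The ~30 billion connected-device figure is an assumed estimate for illustration, not a measured value:

```python
# Rough comparison of the IPv6 address space to today's device count.
# The ~30 billion connected-device figure is a hypothetical estimate.
ipv6_addresses = 2 ** 128           # IPv6 addresses are 128 bits long
connected_devices = 30_000_000_000  # assumed estimate for illustration

ratio = ipv6_addresses // connected_devices
print(f"Total IPv6 addresses: {ipv6_addresses:.2e}")
print(f"Addresses per connected device: {ratio:.2e}")
```

Even with tens of billions of devices online, each one could claim more than 10^27 addresses, which is why IPv6 is unlikely to run out any time soon.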
The Internet was designed to be scalable, but no system is infinitely scalable.
What threatens the scalability of the Internet? Or put another way, what could go wrong if every single device in the world connected to the Internet right now and attempted to download a movie?
Here are a few ideas:
- Network connections have limited bandwidth. A huge amount of data may flow easily through very high-bandwidth connections, but it could easily overwhelm low-bandwidth connections, leading to delays or dropped packets.
- Routers have limited throughput (the amount of data they can forward per second). A modern consumer router has a throughput of around 1 Gbps, while much more expensive enterprise routers can forward up to 10 Gbps. An average movie is around 1 to 10 GB, so a worldwide download-a-thon could get bottlenecked in the routers.
- Wireless routers often limit the number of devices that can connect to them, typically to around 250. If everyone tried to use a shared WiFi network at the same time (like in a university or library), they might find themselves simply unable to join.
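The router bottleneck above can be estimated with simple arithmetic. This sketch assumes a 5 GB movie (the middle of the 1 to 10 GB range) and a router serving just one download at a time; real routers share their throughput across many flows:

```python
# Back-of-the-envelope: time to push one movie through a router at full
# throughput. The 5 GB movie size is an assumption from the 1-10 GB range
# above; real downloads share the link with lots of other traffic.
movie_gigabytes = 5
movie_bits = movie_gigabytes * 8 * 10**9   # 1 GB = 8 * 10^9 bits (decimal)

for label, gbps in [("consumer router, 1 Gbps", 1),
                    ("enterprise router, 10 Gbps", 10)]:
    seconds = movie_bits / (gbps * 10**9)
    print(f"{label}: {seconds:.0f} seconds for one exclusive download")
```

Forty seconds per movie sounds fast until millions of downloads queue up behind the same router, which is exactly how a bottleneck forms.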
🤔 Based on everything you've learned about the Internet, what else might affect its scalability, for good or for bad?
Web application scalability
A web application that runs on top of the Internet must also be scalable, whether it's an iPhone app, a website, or a multiplayer game. Now that there are billions of people connected to the Internet, any application can suddenly experience a surge in users. If the application doesn't scale to meet the demand, users might experience increased latency or a complete outage. 😬
During the COVID-19 pandemic, people across the globe were asked to stay indoors to decrease infection and many of them rushed to online services for virtual versions of what they were missing in person. Due to the many students turning to Khan Academy for virtual practice, our website experienced a 250% increase in server load.
That high usage took our systems by surprise, and a few of them were just barely handling the sudden onslaught of requests. For example, our system to notify students about new assignments was getting backlogged, so notifications would take a few minutes to show up (versus a few seconds).
Fortunately, our engineers quickly bumped up the capacity of those systems, and most users never noticed anything amiss.
Engineering teams can prepare for spikes in usage by doing load testing: simulating high amounts of traffic in a short period of time to see if the system buckles under the load. Load testing can uncover bottlenecks or hard-coded limits in the system.
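Here's a toy sketch of the load-testing idea. The "server" is a hypothetical stand-in with a hard-coded capacity of 100 requests per second; real load tests fire actual traffic at a staging environment using tools built for the job:

```python
# A toy load-test sketch. The fake server and its 100 requests/second
# capacity are hypothetical stand-ins, not a real service or tool.

def fake_server(requests_per_second, capacity=100):
    """Serve up to `capacity` requests/second; anything beyond is dropped."""
    served = min(requests_per_second, capacity)
    dropped = requests_per_second - served
    return served, dropped

def load_test(peak_rps):
    """Return True if the server handled the simulated peak cleanly."""
    _, dropped = fake_server(peak_rps)
    return dropped == 0

print(load_test(80))   # normal traffic: passes
print(load_test(500))  # simulated spike: fails, bottleneck found
```

The value of a test like this is that it surfaces the hard-coded `capacity` limit before real users hit it, instead of after.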
Ever played Pokémon Go? It's a mobile game that came out in the summer of 2016 and was an instant hit. The game developers did load testing before releasing the game, simulating 5 times the highest amount of traffic they expected, and the game servers handled it just fine.
They vastly underestimated the popularity of Pokémon Go, however. On launch day, their servers saw 50 times the estimated traffic.
The game servers weren't ready for that level of extreme load, so many players were greeted with a disappointing error screen.
The team scrambled to improve the scalability of the system, amidst growing demand from frustrated users plus multiple DDoS attacks on their servers from cybercriminals.
After reconfiguring their server architecture to be more scalable, the team released Pokémon Go to the rest of the world. In the three years since, it's been downloaded more than a billion times from mobile app stores.
The spectrum of scalability
A system is scalable when it has the capacity to accommodate a greater amount of usage. Some systems aren't at all scalable, and can only handle exactly the amount of usage they were designed for.
Scalable systems can handle extra usage, but their capacity varies. Some systems might only scale to handle double the current usage; other systems might scale to handle 1000x the current usage.
When we're designing systems with potentially global reach—such as the Internet itself or applications that run on top of it—we need to always consider the scalability of our approach.
Want to join the conversation?
- How do we increase scalability? Is there any other option?(10 votes)
- Imagine you are trying to deliver masks to areas affected by COVID-19. To scale your efforts, you'd ideally
1) Look for many volunteers who can work in parallel (more computational power)
2) Build efficient delivery techniques (algorithmic efficiency)
3) Have backup plans in case volunteers have to take time off or there are surges (redundancy, load balancing)
Most of the things you can think of from this example have been incorporated in some form when building scalable systems, be it increasing the number of computational devices (capacity) or using faster algorithms.
Hope this helps!(42 votes)
- Can anyone please tell me what a DDoS attack is, I've seen the phrase several times, but I never quite understand the concept.(5 votes)
- A distributed denial-of-service (DDoS) attack is a malicious attempt to disrupt the normal traffic of a targeted server, service, or network by overwhelming the target or its surrounding infrastructure with a flood of Internet traffic.
These attacks achieve effectiveness by utilizing multiple compromised computer systems as sources of attack traffic. Exploited machines can include computers and other networked resources such as IoT devices.(16 votes)
- Does it cost money to increase scalability? If not, why don't developers make it very high before launch? Or is it because higher scalability = more to supervise?(0 votes)
- Scalability requires resources such as computer servers, electricity, etc. So, scalability does cost money. More computer servers will require more supervision as well (though much of the supervision can be automated). Fortunately, cloud computing has grown tremendously in recent years, which has made it much easier to build highly scalable systems. With a cloud platform, you can automatically scale a system up or down as needed and only pay for the resources you are using.(5 votes)
- In terms of scalability, how many years in the future would IPv6 be replaced, and what features would make its successor more scalable than IPv6?(1 vote)
- How many online users can an average server handle?(0 votes)
- It depends on the service / system. Khan Academy servers likely handle many millions of users every day, but the number of users is only a rough proxy for the load (i.e. some users barely use the site, while a small group generates a lot of traffic).
Hope that helps!(2 votes)
- In real life, or in actual code, how is a system's scalability set or increased?(0 votes)
- What are "hard-coded limits in the system"? This is under the Load Testing section.(0 votes)
- I'm not sure if this is correct, but hard-coded limits are values permanently written into the software, changeable only by editing the program's source code. They can be a problem because usage is always changing and is never fixed, so it's important to find these limits during load testing and remove them.(1 vote)