It is said that Google processes 40,000 search queries every second, which translates to 3.5 billion searches every day. Did you know that Google's bots crawl some 20 billion pages daily? Like Google, Facebook also processes an enormous number of requests. Every day, 864 million Facebook users log in to check their newsfeeds, view photos, watch videos and update their statuses. Facebook's Like button is hit more than 4.5 billion times a day, and more than 10 million messages are sent daily.
What is remarkable about Google and Facebook is how fast they are. How do they manage to load pages so quickly when so many requests are hitting their servers?
How does Google work so fast?
Exactly how Google's servers respond so quickly is something only a handful of people at the company can fully explain. One thing worth appreciating is that when a query is submitted, Google searches its index of the web, not the live web itself. The main technologies that make every Google query fast include the following.
- Multiple data centers with global load balancing
While Google remains mum about the exact locations of its data centers, it is clear that they are spread across the globe. Google operates 15 data centers located in North America, South America, Asia and Europe. When a query is entered, it is directed to the nearest data center for a prompt reply. If the nearest data center is overwhelmed, the request is redirected to the next nearest one. By locating servers closer to users, responses to queries are generated very quickly.
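The routing idea described above can be sketched in a few lines. This is a hypothetical illustration, not Google's actual system: the data center names, latencies and capacities are invented, and a real global load balancer would use DNS and anycast routing rather than a simple loop.

```python
# Hypothetical sketch of global load balancing: route a query to the
# nearest data center, falling back to the next nearest if it is full.
# Names, latencies and capacity figures are illustrative only.

DATA_CENTERS = [
    {"name": "us-east", "latency_ms": 20, "load": 100, "capacity": 100},
    {"name": "eu-west", "latency_ms": 90, "load": 40, "capacity": 100},
    {"name": "asia-se", "latency_ms": 180, "load": 10, "capacity": 100},
]

def route_query(centers):
    """Pick the nearest data center that still has spare capacity."""
    for dc in sorted(centers, key=lambda d: d["latency_ms"]):
        if dc["load"] < dc["capacity"]:
            return dc["name"]
    raise RuntimeError("all data centers overloaded")
```

Here `us-east` is the closest but already at capacity, so the query spills over to `eu-west`, exactly the overflow behavior the paragraph describes.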
- Using multiple computers for distributed lookups
In every data center, hundreds of networked computers split each query across multiple servers. Their combined effort generates results far faster than a single computer could. When a user enters a query, many servers work on it in parallel to return results as quickly as possible.
- Custom software and file systems
The software running in Google's data centers is built specifically for Google's needs. From the Colossus distributed file system onward, the main focus is on increasing query-processing speed and cutting the time required to return search results.
- Caching
One convenient property of search is that results do not have to be perfectly consistent or up to the second. When people make queries, the generated results are stored as copies on data center machines, so the servers do not have to recompute the answer every time a related search arrives. This is done mainly for common searches: if someone else has already run a similar query, the answer is waiting for you the moment you hit enter.
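The core of this caching idea fits in a few lines. The sketch below is a minimal illustration, not Google's caching layer: it stores each answer with a timestamp and serves the stored copy until a time-to-live expires, which is why slightly stale data is acceptable.

```python
# Minimal result-cache sketch: serve a stored answer when the same
# query arrives again within the TTL window, instead of recomputing.
import time

CACHE = {}
TTL_SECONDS = 60  # cached answers may be up to a minute stale

def expensive_search(query):
    """Stand-in for the real (slow) index lookup."""
    return f"results for {query!r}"

def cached_search(query):
    entry = CACHE.get(query)
    if entry and time.monotonic() - entry[1] < TTL_SECONDS:
        return entry[0]          # cache hit: no recomputation
    result = expensive_search(query)
    CACHE[query] = (result, time.monotonic())
    return result
```

The trade-off is explicit: a hit skips the expensive lookup entirely, at the cost of possibly returning results that are a few seconds old.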
How does Facebook handle so many users?
- The BigPipe system
Facebook uses a system called BigPipe that helps its pages load faster. BigPipe breaks each page into small sections called pagelets, then uses JavaScript to load the crucial pagelets first and the less important ones later. By rendering the important parts immediately and filling in the rest afterwards, Facebook appears to load much faster than it would if the browser waited for the entire page. In addition, the news feed is broken into small chunks so that only a few posts load as you scroll down.
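The ordering logic behind BigPipe can be sketched conceptually. This is not Facebook's implementation (BigPipe is a server-plus-JavaScript pipeline); the sketch below only shows the priority idea, with invented pagelet names: the server streams high-priority pagelets first so the browser can render them while the rest are still being generated.

```python
# Conceptual BigPipe-style sketch: split the page into pagelets and
# flush the most important ones first. Names/priorities are invented.

PAGELETS = [
    {"name": "ads_sidebar", "priority": 3},
    {"name": "navigation", "priority": 1},
    {"name": "news_feed", "priority": 2},
]

def stream_pagelets(pagelets):
    """Yield pagelets in priority order (lower number = more important)."""
    for p in sorted(pagelets, key=lambda x: x["priority"]):
        yield p["name"]
```

A generator fits naturally here: each pagelet can be flushed to the client as soon as it is yielded, rather than buffering the whole page.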
- Traffic scaling
The second method used to make Facebook pages load faster is scaling. So how does Facebook's scaling work? An auto-scaling system sits between the company's servers and incoming traffic. When traffic is high, it spreads the load across different servers and keeps them running at medium capacity. The remaining servers are left idle, or handle only batch-processing jobs that consume very little power.
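The capacity math behind this is simple to sketch. All numbers below (fleet size, per-server throughput, the target utilization) are invented for illustration: the idea is to activate only as many servers as needed to keep each active one at a comfortable "medium" load, leaving the rest idle or free for batch work.

```python
# Rough auto-scaling sketch: size the active fleet so each server runs
# at a target medium utilization. All figures are illustrative.
import math

TOTAL_SERVERS = 10
PER_SERVER_CAPACITY = 100      # requests/sec a server can handle flat-out
TARGET_UTILIZATION = 0.5       # aim to run active servers at ~50%

def servers_needed(incoming_rps):
    """How many servers to activate for the current request rate."""
    target = PER_SERVER_CAPACITY * TARGET_UTILIZATION
    return min(TOTAL_SERVERS, max(1, math.ceil(incoming_rps / target)))
```

Running at medium rather than full capacity leaves headroom for traffic spikes, so a sudden surge does not immediately push any server past its limit.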
Facebook uses source-IP-hash load balancing to help optimize server response. If one server fails or gets overwhelmed, traffic is rerouted to the next server, so users are never left staring at a blank page. Another method employed is layer-7 load balancing, which inspects incoming traffic and redirects it to the most available resource.
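Source-IP hashing with failover can be sketched as follows. The server names are invented; the point is that the same client IP always hashes to the same server (which keeps any per-server state sticky), and if that server is marked down, the request moves to the next one in the ring instead of failing.

```python
# Source-IP-hash sketch with failover: a client IP consistently maps
# to one server; if it is down, try the next. Server names invented.
import hashlib

SERVERS = ["web-1", "web-2", "web-3"]
DOWN = set()  # servers currently marked unhealthy

def pick_server(client_ip):
    """Hash the client IP onto a server, skipping any that are down."""
    digest = hashlib.md5(client_ip.encode()).hexdigest()
    start = int(digest, 16) % len(SERVERS)
    for offset in range(len(SERVERS)):
        candidate = SERVERS[(start + offset) % len(SERVERS)]
        if candidate not in DOWN:
            return candidate
    raise RuntimeError("no healthy servers")
```

Because the mapping is deterministic, repeat visitors keep landing on the same machine while it is healthy, which is the property that distinguishes IP hashing from plain round-robin.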
Top technology firms such as Facebook and Google are committed to adopting the latest technology to guarantee fast page loads. Scaling, caching and BigPipe have all demonstrated high efficiency in reducing load times.