
Note: Part of what makes Facebook servers so fast isn’t just the new network but the dedicated servers it uses as well. Try a 30-day, obligation-free dedicated hosting solution with NetHosting today to get that same performance from your website.

Facebook noticed a spike in its internal traffic and came up with new network, server, and data center designs to better accommodate users.

When Facebook decided it needed another data center to manage all of its traffic, it was hoping for an easy solution. The company had just built a data center in Prineville, Oregon, that was doing very well, so when ground was broken in Forest City, North Carolina, it was only reasonable to plan an exact copy of the facility already operational in the Oregon desert. Fortunately and unfortunately for the social networking behemoth, that proved to be a bad idea.


It’s fortunate that the plans wouldn’t work, because Facebook was seeing a spike in its network traffic. While that’s a great thing for the website, it meant that architects and engineers had to go back to the drawing board to figure out the best data center design to handle that traffic. More importantly, Jay Parikh, who oversees Facebook’s entire data center infrastructure, realized that the company’s best move would be to future-proof the data center.

Note: Data center security is just as important as data center network speeds. Take a tour of the NetHosting data center online to see the digital and physical security measures we take to make sure our customers’ data is safe.

Future-proofing has been looked down on in consumer electronics because, for the extra money, it doesn’t really buy you much time before the next latest and greatest gizmo hits the shelves. But for a much larger and more cumbersome investment, like a data center, future-proofing might just save your bacon as server technology evolves in the coming years. Of course, it’s hard to proof your gear too far into the future, because who knows what the next great thinker will come up with to revolutionize the way data is stored and hosted.

Two months into building to the original data center plans, Facebook engineers noticed that internal service traffic was already higher than the traffic coming into the site or leaving it, and it was growing faster than either of those categories as well. The numbers show that inter-server traffic in Facebook’s data centers has more than doubled in the past seven months. Services like real-time notifications, friend recommendations, and the rest of the features that connect users to each other are what generate this inter-server traffic.
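To see why internal traffic can outgrow user-facing traffic like this, consider that a single user request fans out into many backend calls. The sketch below is purely illustrative; the function name and the numbers are invented, not Facebook figures:

```python
def internal_requests(user_requests, fanout_per_request):
    """Estimate inter-server calls generated by user-facing requests.

    Each user-facing request triggers some number of backend calls
    (notifications, friend suggestions, feed assembly, and so on),
    so internal traffic is a multiple of external traffic.
    """
    return user_requests * fanout_per_request

# Hypothetical numbers: 1,000 external requests, each fanning out
# to 5 internal service calls.
print(internal_requests(1_000, 5))  # → 5000
```

If the average fan-out grows as new user-connecting features are added, internal traffic grows even when external traffic stays flat, which is exactly the trend the engineers observed.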

Aside from just building a brand new data center, Parikh and his colleagues reworked the entire Facebook network to better handle this uptick in traffic. The team also came up with more efficient server designs to make sure the hardware was performing as fast and reliably as possible for Facebook’s specific needs. The new server designs were already in use at the North Carolina data center and are freely available thanks to the social network’s open source initiative, the Open Compute Project. Facebook has now joined the illustrious ranks of big Internet names like Amazon, eBay, Google, and Microsoft by building not only its own hardware but its own data centers as well.

From the network cards in its servers to the switches connecting server racks in the data center, Facebook has completely overhauled its network. The entire thing now runs at ten gigabits per second, roughly ten times faster than before, which partly drove the need for new hardware that could keep up with the blazing new network speed. Again, Facebook keeps good company: Google also designs its own network gear.

But to get to the nitty-gritty of the network change: Facebook moved from a layer 2 network to a layer 3 network, meaning traffic is now routed by the Border Gateway Protocol (BGP). That same protocol directs traffic at some of the base levels of the Internet itself.
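As a rough illustration of what running BGP inside a data center looks like (not Facebook’s actual configuration), a top-of-rack switch in a layer 3 fabric announces its rack’s server subnet to an upstream switch over a BGP session. In FRRouting-style syntax, with invented AS numbers and addresses, that might read:

```
! Hypothetical sketch only: a top-of-rack switch peering with a spine
! switch over eBGP and advertising its local rack subnet.
router bgp 65001
 neighbor 10.0.0.2 remote-as 65002
 address-family ipv4 unicast
  network 192.168.10.0/24
 exit-address-family
```

Because every switch routes rather than bridges, traffic between any two servers follows explicitly advertised routes, the same mechanism carriers use between networks on the public Internet.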

Time will tell whether the data center plans are truly future-proof, but one thing is certain: Facebook knows how to optimize its hardware to deliver the best browsing experience to its users. To read more about Facebook hardware customization, check out our blog post about the changes the company made to its server hard drives.