Our latest case study takes a closer look at the ins and outs of Google Inc.
There isn’t a more prolific company on the Internet today than Google. From email to social networking, the Mountain View, CA-based company has products for almost every aspect of online activity. And for your reading pleasure, we just released a case study examining Google in more depth. Here are some highlights:
Despite the company’s omnipresence now, it had humble beginnings in 1997. Larry Page and Sergey Brin’s research project was originally named BackRub, because the search algorithm ranked pages by their backlinks (and backlinks still factor into ranking today). Thank goodness for the name change – if it had stuck, caught on, and then become a verb like Google has, we’d all have to say we just “backrubbed” a search term. I much prefer telling someone that I googled information.
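The backlink idea eventually grew into the PageRank algorithm. As a rough illustration only – Google’s real ranking system is vastly more elaborate and secret – here is a minimal power-iteration sketch of how counting backlinks turns into a ranking, using a made-up three-page web:

```python
# Minimal PageRank-style sketch (illustrative only; not Google's actual
# algorithm). Pages repeatedly pass a share of their score along their
# outgoing links, so pages with more/better backlinks end up ranked higher.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share  # each backlink passes on score
        rank = new_rank
    return rank

# Hypothetical toy web: "c" has the most backlinks.
web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
scores = pagerank(web)
print(max(scores, key=scores.get))  # "c"
```

Page “c” wins because both “a” and “b” link to it; that, in miniature, was BackRub’s insight.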
The name change came about from a misspelling of googol. A googol is a number – a one followed by one hundred zeros. Brin and Page picked it to represent the wealth of information they wanted to provide their user base. Talk about self-fulfilling prophecies.
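If you want to see just how large that number is, Python’s arbitrary-precision integers can hold a googol directly:

```python
# A googol is 10**100 -- a 1 followed by 100 zeros.
googol = 10 ** 100
print(len(str(googol)) - 1)  # 100 zeros after the leading 1
```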
One interesting fact that might be less well known is that the two Stanford PhD students attempted to sell their company in its early stages. In 1999, both founders decided that their side project was taking too much time away from their academics. They approached Excite search engine CEO George Bell and tried to sell him the concept and implementation of Google Inc. for one million dollars. Bell declined, and is probably still kicking himself today.
As for the systems behind the magic, Google has remained notoriously tight-lipped about its own hardware and software. From what we can gather, Google primarily codes its products in C++, Java, and Python. Additionally, Google runs its own Linux-based web server, called Google Web Server, on all of its servers. The hardware was unveiled briefly in 2009, and since then no one outside the company has been given another look at the servers.
Power was the name of the game at the last glimpse of Google’s servers. The only really surprising thing about them was that each server had a 12-volt battery in addition to its power supply. This redundancy ensures that if a power supply fails, the server has time to keep running and providing the services billions of users have now come to depend on.
The speed of Google’s search results is thanks in part to the in-house server index and search system the company developed, Caffeine (that is the name of the software, not a nod to the company’s break room coffee). Caffeine is continually indexing, which is why it can return search results in milliseconds.
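To make the “continually indexing” idea concrete, here is a toy inverted index – an assumption-level sketch, not Caffeine’s actual design – where new documents are merged in as they arrive, so searches never wait on a full rebuild:

```python
# Toy incremental inverted index. Each new document is folded into the
# index immediately, so it is searchable the moment it is added.
from collections import defaultdict

class IncrementalIndex:
    def __init__(self):
        self.postings = defaultdict(set)  # term -> set of doc ids

    def add_document(self, doc_id, text):
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, query):
        """Return ids of documents containing every query term."""
        terms = query.lower().split()
        if not terms:
            return set()
        results = self.postings[terms[0]].copy()
        for term in terms[1:]:
            results &= self.postings[term]
        return results

index = IncrementalIndex()
index.add_document(1, "Google indexes the web")
index.add_document(2, "Caffeine indexes the web continually")
print(sorted(index.search("indexes web")))  # [1, 2]
```

Because indexing happens per document rather than in giant batch rebuilds, fresh content shows up in results almost immediately – the same basic trade-off Caffeine was reported to make at web scale.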
A common data center practice for massive companies like Google (Microsoft uses the same method) is to build data centers out of shipping containers. That way, every facility is modular and can easily be moved or expanded. Google started doing this with at least some of its data centers in 2005.
For as frequently as we all use Google, it’s surprising how many details about its history and systems we don’t know. Supplement your knowledge with a brief synopsis of this powerhouse company’s past and current operations by reading our latest case study.