Do you know where your Facebook data is stored? It's at the Arctic Circle!
Saturday 30 November 2013


Until now, the giant companies have bought their server racks from the usual suspects; Facebook, by contrast, designs its own systems and outsources the manufacturing work. In April 2011, the social networking company began publishing its hardware blueprints as part of its Open Compute Project, which lets other companies build on the work of its engineers. The project now sits at the heart of the data center industry's biggest shift in more than a decade.

Now, about the place where the data lives: it's just south of the Arctic Circle, in the Swedish town of Luleå. In the middle of a forest at the edge of town, the company in June opened its latest mega-sized data center, a giant building made up of thousands of rectangular metal panels that looks like a wayward spaceship. By the usual measures, it is the most energy-efficient computing facility ever built, a colossus that helps Facebook process 350 million photographs, 4.5 billion "likes," and 10 billion messages a day. While an average data center needs 3 watts of energy for power and cooling to produce 1 watt for computing, the Luleå facility runs nearly three times more efficiently, at a ratio of 1.04 to 1. "What Facebook has done to the hardware market is dramatic," says Tom Barton, the former chief executive officer of server maker Rackable Systems. "They're putting pressure on everyone."
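To make that efficiency figure concrete, the ratio being compared is essentially what the industry calls PUE (power usage effectiveness): total power drawn by the facility divided by the power that actually reaches the computing gear. The snippet below is a minimal illustrative sketch, taking the article's 3:1 and 1.04:1 figures at face value; the 1 MW load is a made-up number for the example, not a Facebook figure.

```python
def pue(total_facility_watts: float, it_watts: float) -> float:
    """Power Usage Effectiveness: total facility power divided by
    the power delivered to the computing equipment."""
    return total_facility_watts / it_watts

# Hypothetical 1 MW of computing load, scaled by the ratios quoted above.
it_load = 1_000_000  # watts of computing

average_dc = pue(total_facility_watts=3.00 * it_load, it_watts=it_load)
lulea = pue(total_facility_watts=1.04 * it_load, it_watts=it_load)

print(f"Average data center ratio: {average_dc:.2f}")          # 3.00
print(f"Lulea facility ratio:      {lulea:.2f}")                # 1.04
print(f"Improvement factor:        {average_dc / lulea:.1f}x")  # ~2.9x
```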

The reason Facebook selected this corner of the world has a lot to do with the system's efficiency. The main reason is that Sweden has a vast supply of cheap, reliable power produced by its network of hydroelectric dams. Just as important, Facebook has engineered its data center to turn the frigid Swedish climate to its advantage. Instead of relying on enormous air-conditioning units and power systems to cool its tens of thousands of computers, Facebook lets the outside air enter the building and wash over its servers, after the building's filters clean it and misters adjust its humidity. Unlike a conventional, warehouse-style server farm, the whole structure functions as one big device, quite different from the data centers of the other big companies.
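A rough way to picture that "whole building as one device" idea is as a control loop: outside air is filtered and sent straight over the racks, and the facility only intervenes when that air is too dry, too cold, or too warm. The sketch below is purely illustrative; the thresholds and function names are assumptions for the example, not Facebook's actual control logic.

```python
# Illustrative sketch of an outside-air ("free") cooling decision.
# All setpoints are hypothetical, not Facebook's real values.

SUPPLY_TEMP_MAX_F = 85.0   # servers tolerate warm intake air (per the article)
SUPPLY_TEMP_MIN_F = 50.0   # below this, mix in server exhaust to avoid over-cooling
HUMIDITY_MIN_PCT = 20.0    # too dry risks static discharge

def plan_airflow(outdoor_temp_f: float, outdoor_humidity_pct: float) -> dict:
    """Decide how to condition incoming air before it washes over the racks."""
    actions = {"filter": True, "mist": False, "mix_exhaust": False}

    if outdoor_humidity_pct < HUMIDITY_MIN_PCT:
        actions["mist"] = True          # misters add moisture to overly dry air
    if outdoor_temp_f < SUPPLY_TEMP_MIN_F:
        actions["mix_exhaust"] = True   # recirculate warm exhaust to temper the intake
    elif outdoor_temp_f > SUPPLY_TEMP_MAX_F:
        actions["mist"] = True          # evaporative misting knocks the temperature down

    return actions

# Example: a typical sub-Arctic winter day in Lulea.
print(plan_airflow(outdoor_temp_f=10.0, outdoor_humidity_pct=70.0))
# -> {'filter': True, 'mist': False, 'mix_exhaust': True}
```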

Facebook’s engineers have tried to simplify its servers, which are used mostly to build Web pages, by stripping away typical components such as memory slots, cables, and protective plastic cases. The servers are basically slimmed-down, exposed motherboards that slide into a fridge-size rack. The engineers say this design means better airflow over each server. The systems also require less cooling, because with fewer components they can function at temperatures as high as 85°F.

When Facebook started to outline its ideas, traditional data center experts were skeptical, especially of hotter-running servers. “People run their data centers at 60 or 65 degrees with 35-mile-per-hour wind gusts going through them,” says one such skeptic.

The custom hardware designed by Web giants such as Google and Amazon.com has remained closely guarded, but Facebook’s openness has raised interest in its data center models beyond Internet companies. Facebook has provided a road map for any company with enough time and money to build its own state-of-the-art data mega-factory. Executives from Intel and Goldman Sachs have joined the board of the Open Compute Project’s foundation, a 501(c)(6) corporation chaired by Facebook’s Frank Frankovsky. Taiwanese hardware makers such as Quanta Computer and Tyan Computer have started selling systems based on Open Compute designs. Facilities on the scale of Luleå, which can cost as much as $300 million to build, will continue to be outliers, but companies of all sizes can take advantage of the cheaper, more power-efficient equipment.

Your data, stored safely and with plenty of security far away at the other end of the world. Seems amazing!

 
