Take a look at Facebook's gorgeous data centers from around the world

Here's a look inside Facebook's data center in Forest City, North Carolina. The company launched this center in 2010.

Facebook began construction on its second data center building in Lulea, Sweden, in March 2014.

Inside Lulea's first data center building, you can see Facebook's "vanity free" approach to design. For instance, there are no plastic bezels in front of its servers — something commonly found in other data centers — to allow those servers to draw in more air.

In the Lulea data center, web server and storage designs use snaps and spring-loaded catches to hold components in place.

Lulea's rapid deployment data center (RDDC) design is all about being lean, which allows Facebook to deploy two data halls in the time it previously took to deploy one, thus reducing the cost of construction.

This is Facebook's data center in Prineville, Oregon — the first data center built using the company's Open Compute Project designs.

Facebook used 1,560 tons of steel to build its Prineville data center — the weight of roughly 900 mid-size cars.

Facebook's Prineville data center also uses a lot of wires and cables. In fact, there are 950 miles worth of wires and cables in this data center alone — roughly the distance between Boston and Indianapolis.

The Prineville data center also contains an enormous amount of concrete: 14,254 cubic yards, to be exact. Imagine a sidewalk that's 24.3 miles long.

Thanks to Facebook's unique server design, technicians, like this one working in Prineville, don't have to hunt for the right tools and unscrew multiple parts every time they need to replace a failed component.

Facebook's rapid-deployment data center structure is assembled much like a car: the structural frame is built first, the components are attached on a factory assembly line, and the entire structure is then driven to the building site on a truck.

With the efficiency gains afforded by these unique server designs, Facebook has cut the average time to swap out a failed part by more than 50%.

Here you can see technicians delivering server racks to Lulea's building one, the company's first data center building on that campus.

Thanks to these unique data centers, Facebook can handle the billions of "Likes" and photos posted daily, as well as the trillions of messages sent since the company was founded over a decade ago.
