Inside Google’s Top Secret Data Centers

It’s the physical network of thousands of fiber miles and servers that creates the multibillion-dollar infrastructure that makes Google Google

Screenshot from Google Street View data center tour

Google’s constantly refined search algorithm changed the way we use and conceptualize information and launched the company into its place as one of the world’s most successful and influential entities. But it’s the physical network of thousands of fiber miles and servers that creates the multibillion-dollar infrastructure that makes Google Google. And while at least some of these facilities are visually striking, it’s the technology inside that makes them valuable. Google views its network as the ultimate competitive advantage and has allowed only key employees to venture inside. Until now.

Lenoir, North Carolina, a town of 18,000 once defined by furniture factories, today hosts a Google data center. Wired’s Steven Levy took a peek inside the “top secret” complex to reveal the intricacies at the center of the digital age. Levy begins his tour:

We have passed through the heavy gate outside the facility, with remote-control barriers evoking the Korean DMZ. We have walked through the business offices, decked out in Nascar regalia. (Every Google data center has a decorative theme.) We have toured the control room, where LCD dashboards monitor every conceivable metric. Later we will climb up to catwalks to examine the giant cooling towers and backup electric generators, which look like Beatle-esque submarines, only green. We will don hard hats and tour the construction site of a second data center just up the hill. And we will stare at a rugged chunk of land that one day will hold a third mammoth computational facility.

Levy visited “the floor” with Joe Kava, who runs Google’s data center operations, and had to don earplugs to protect himself from the roar of the massive fans that control airflow.

Now we enter the floor. Big doesn’t begin to describe it. Row after row of server racks seem to stretch to eternity. Joe Montana in his prime could not throw a football the length of it.

During my interviews with Googlers, the idea of hot aisles and cold aisles has been an abstraction, but on the floor everything becomes clear. The cold aisle refers to the general room temperature—which Kava confirms is 77 degrees. The hot aisle is the narrow space between the backsides of two rows of servers, tightly enclosed by sheet metal on the ends. A nest of copper coils absorbs the heat. Above are huge fans, which sound like jet engines jacked through Marshall amps.

Employees tote replacement parts from “the pharmacy,” the area on the floor that stocks spare gear, around the Lenoir facility’s 49,923 operating servers. Levy points out that raw server counts aren’t as relevant as they once were, though, since a single Google server today may be the equivalent of 20 servers a generation ago. Rather, Google thinks in terms of clusters: huge numbers of machines acting in unison to provide a service or run an application.

Approaching the end of his starry-eyed tour among server lights twinkling with the activity of thousands of Google users, Levy realizes that, in a company renowned for innovation and constant improvement, the secrets he gleaned at Lenoir will likely be rendered obsolete within a few short years:

As we leave the floor, I feel almost levitated by my peek inside Google’s inner sanctum. But a few weeks later, back at the Googleplex in Mountain View, I realize that my epiphanies have limited shelf life. Google’s intention is to render the data center I visited obsolete. “Once our people get used to our 2013 buildings and clusters,” Hölzle says, “they’re going to complain about the current ones.”

More from Smithsonian.com:

Amazing Shots Captured by Google Street View
Smithsonian Gets Google Mapped
