Efficiency: How others can do it
Five things you can do now
At Google, we've spent more than a decade improving the energy efficiency of our data centers, and we've picked up some best practices along the way. Whether you're running a small or large data center, you can apply several simple design choices to improve the efficiency of your facility, reduce costs, and reduce your impact on the environment.
Here are our top five best practices:
In addition to the large-scale data centers used to deliver our web services, we maintain several small network points of presence (POPs). See how we applied some of these efficiency best practices during a retrofit to save money and reduce greenhouse gas emissions.
Measure PUE
You can't manage what you don't measure, so be sure to track your data center's energy use. The industry uses a ratio called Power Usage Effectiveness (PUE) to measure and help reduce the energy used for non-computing functions like cooling and power distribution. To effectively use PUE, it's important to measure often. We sample at least once per second. It's even more important to capture energy data over the entire year, since seasonal weather variations affect PUE. Learn more.
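As a rough illustration of the metric itself, PUE is total facility energy divided by the energy delivered to IT equipment, and averaging it over a full year smooths out seasonal swings. The sketch below is a minimal example, assuming hypothetical energy samples; the field names and numbers are illustrative, not from the source.

```python
# Sketch: computing an annual PUE from periodic energy samples.
# Sample data and field names are illustrative assumptions.

def annual_pue(samples):
    """PUE = total facility energy / IT equipment energy,
    accumulated over the whole measurement period."""
    total_facility = sum(s["facility_kwh"] for s in samples)
    total_it = sum(s["it_kwh"] for s in samples)
    return total_facility / total_it

# Two coarse intervals shown here; a real deployment would log far
# more frequently (the text above suggests at least once per second).
samples = [
    {"facility_kwh": 130.0, "it_kwh": 100.0},  # cooler-season interval
    {"facility_kwh": 155.0, "it_kwh": 100.0},  # warmer-season interval
]
print(round(annual_pue(samples), 3))  # 285 / 200 = 1.425
```

Note how the summer interval alone would report a worse PUE than the winter one, which is why a single-season snapshot can be misleading.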
Manage airflow
Good air flow management is crucial to efficient data center operation. Minimize hot and cold air mixing by using well-designed containment. Then, eliminate hot spots and be sure to use blanking plates (or flat sheets of metal) for any empty slots in your rack. We've found that a little analysis can have big payoffs. For example, thermal modeling using computational fluid dynamics (CFD) can help you quickly characterize and optimize air flow for your facility without having to reorganize your computing room. Learn more.
Adjust the thermostat
The need to keep data centers at 70°F is a myth. Virtually all equipment manufacturers allow you to run your cold aisle at 80°F or higher. If your facility uses an economizer (which we highly recommend), run elevated cold aisle temperatures to enable more days of "free cooling" and higher energy savings. Learn more.
Use free cooling
Chillers typically use the most energy in a data center's cooling infrastructure, so you'll find the largest opportunity for savings by minimizing their use. Take advantage of "free cooling" to remove heat from your facility without using a chiller. This can include using low temperature ambient air, evaporating water, or a large thermal reservoir. While there's more than one way to free cool, water and air-side economizers are proven and readily available. Learn more.
Optimize power distribution
You can minimize power distribution losses by eliminating as many power conversion steps as possible. For the conversion steps you must have, be sure to specify efficient equipment transformers and power distribution units (PDUs). One of the largest losses in data center power distribution is from the uninterruptible power supply (UPS), so it's important to select a high-efficiency model. Lastly, keep your high voltages as close to the power supply as possible to reduce line losses. Learn more.
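The arithmetic behind eliminating conversion steps is simple: overall distribution efficiency is the product of each stage's efficiency, so every extra stage multiplies in another loss. The stage names and efficiency figures below are illustrative assumptions, not measurements from the source.

```python
# Sketch: end-to-end power distribution efficiency as a product of
# per-stage conversion efficiencies. All numbers are illustrative.

def chain_efficiency(stage_efficiencies):
    """Multiply per-stage efficiencies; each additional conversion
    step compounds the losses."""
    eff = 1.0
    for e in stage_efficiencies:
        eff *= e
    return eff

typical = [0.98, 0.94, 0.98]  # transformer, double-conversion UPS, PDU
leaner = [0.98, 0.99]         # transformer, high-efficiency UPS (fewer stages)

print(f"{chain_efficiency(typical):.3f}")  # 0.903
print(f"{chain_efficiency(leaner):.3f}")   # 0.970
```

In this toy comparison, dropping a conversion step and choosing a high-efficiency UPS recovers roughly seven points of efficiency, consistent with the advice to minimize conversion steps and select efficient UPS equipment.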
At Google, we believe that industry collaboration is the key to creating a greener and more efficient technology sector. In 2009 and 2011, we hosted two events on data center efficiency where we invited our industry peers to join a candid discussion on how we can improve as a whole. In 2013, we hosted the “How green is the Internet?” summit to explore questions around the environmental impacts and benefits of the Internet. Take a look at our past events.
In June 2013, we hosted the “How green is the Internet?” summit. During this gathering, experts from industry, academia, government, and NGOs explored questions about the environmental impacts and benefits of the Internet. The event featured speakers Al Gore, Eric Schmidt, Jon Koomey, and others, as well as a preview of a new study on the energy impact of cloud computing. Topics of in-depth working sessions included digital content, e-commerce, and collaboration tools.
To explore a European perspective on best practices, we hosted a summit in Zurich, Switzerland. Here, we presented a video about our sea-water cooled data center in Hamina, Finland, and released a white paper and series of videos on our data center best practices. This event featured keynotes, case studies, and strategies from industry leaders and Google executives, as well as two panel discussions covering eight viewpoints across Europe and the US.
We hosted our first data center efficiency event at our Mountain View campus in 2009. Here, we gave a video tour of one of our data centers and showcased a Google server for the first time, revealing details like our onboard uninterruptible power supply (UPS). We also talked about how we manage water and e-waste at our data centers. This summit featured presentations by the Green Grid, the US Environmental Protection Agency, Amazon Web Services, and several Google data center experts.
We collaborate with other members of the data center community to improve efficiency on a broader scale.
In 2007, we teamed with Intel and other industry partners to found the Climate Savers Computing Initiative. This non-profit consortium was committed to cutting the energy consumed by computers in half—reducing global CO2 emissions by 54 million tons per year. In 2012, Climate Savers Computing Initiative merged with The Green Grid to create a single, global organization working to address energy efficiency and sustainability issues across the entire computing ecosystem.
Today, Google serves on the board at The Green Grid. We're actively involved across many of the technical working groups focused on pushing the state of data center efficiency forward. We're also advocating public policies that accelerate energy efficiency and renewable energy use.