How Google used 6SigmaDCX CFD to increase efficiency & optimize the DataCenter


Every year, Google saves millions of dollars and avoids emitting tens of thousands of tons of carbon dioxide thanks to our data center sustainability efforts. In fact, our facilities use half the energy of a typical data center. This case study shows how you can apply some of the cost-saving measures we employ at Google to your own data centers and networking rooms. At Google, we run many large proprietary data centers, but we also maintain several smaller networking rooms, called POPs or “Points of Presence”. POPs are similar to the millions of small and medium-sized data centers around the world. This case study covers the retrofit of one of these smaller rooms, highlighting best practices and simple changes you can make to save thousands of dollars each year. For this retrofit, Google spent a total of $25,000 on plastic curtains, air return extensions, and a new air conditioner controller to optimize the room’s airflow and reduce air conditioner use. That investment returned savings of $67,000 per year, and the retrofit was performed without any operational downtime.
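To put those figures in perspective, here is a back-of-the-envelope payback calculation using only the numbers quoted above; a minimal sketch, with the cost breakdown in the comments assumed rather than itemized in the case study:

```python
# Simple payback check for the retrofit described above.
capex = 25_000           # curtains, return extensions, A/C controller ($)
annual_savings = 67_000  # reduced air-conditioning energy spend ($/year)

payback_months = capex / annual_savings * 12
first_year_roi = annual_savings / capex

print(f"Payback period: {payback_months:.1f} months")  # ~4.5 months
print(f"First-year ROI: {first_year_roi:.0%}")         # ~268%
```

At roughly four and a half months to break even, airflow optimization is one of the rare facility upgrades that pays for itself within the same budget year.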

Click here to check out how thermal modeling helped make an immediate improvement and identify hot spots.



Video – The DataCenter Ops Performance Challenge

To watch the recording, click here.


Many data center operators assume that future IT-related changes demanded by the business can be accommodated within their existing infrastructure. In reality, this assumption does not hold up. In fact, blind faith in operational best practices is the cause of performance problems that put the data center itself, and hence the business as a whole, at risk. Unfortunately, most companies don’t systematically assess performance or quantify risk. This keeps the problem hidden until it becomes severe enough to register on the DCIM dashboard, and by then the damage is done.

Other industries such as oil and gas, automotive, aerospace and defense have standardized on computer simulation techniques to predict and analyze system performance. However, this approach is rare in the data center industry despite the documented benefits that have been realized in other industries.

In this webcast, we will present a new measure of data center performance called “ACE” (availability, capacity and efficiency) that reveals the relationship between operational decisions and business outcomes. The ACE Score is currently being reviewed by The Green Grid for consideration as a recommended data center performance measurement.
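The webcast does not publish the scoring formula, so the sketch below is purely illustrative: it treats an ACE-style score as a plain average of the three dimensions, each normalized to a 0-100 scale. The real metric under review by The Green Grid may be computed quite differently.

```python
# Illustrative only: a toy composite of the three ACE dimensions.
# The actual ACE Score formula is not given in the webcast.

def ace_score(availability: float, capacity: float, efficiency: float) -> float:
    """Average the three ACE dimensions, each expressed on a 0-100 scale."""
    dims = (availability, capacity, efficiency)
    if not all(0.0 <= d <= 100.0 for d in dims):
        raise ValueError("each dimension must be between 0 and 100")
    return sum(dims) / len(dims)

# Hypothetical facility: excellent availability, but stranded capacity
# and mediocre cooling efficiency drag the composite down.
print(f"Composite score: {ace_score(99.0, 70.0, 60.0):.1f}")  # 76.3
```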

Next, the audience will learn how a major global bank uses computer modeling to make operational decisions that meet business requirements. The case study will describe how computer modeling was used to safely raise the air temperatures within two mission-critical facilities, saving more than $1 million per year in energy bills while improving IT resilience and facility loading capacity.

To watch the recording, click here.


Video – DataCenter Free Cooling Design, Control & Delivery

Figure: External Flow

With increased demand for reducing the overall energy consumption of data centers, organizations are moving away from conventional cooling methods to newer technologies such as “free cooling.” However, adopting such technologies requires an understanding of the performance not just of the data center, but of the individual components that comprise the entire cooling system and how they respond to the climate outside. The answers lie in the use of CFD simulation to visualize the entire cooling path before committing to a design.
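As a taste of the climate analysis involved, the sketch below estimates how many hours a year an air-side economizer could carry the load, given hourly outdoor temperatures. Both the 18°C threshold and the synthetic temperature series are assumptions for illustration; a real study would use measured weather data and the full simulation.

```python
# Estimate annual free-cooling hours from hourly outdoor temperatures.
# The temperature series below is synthetic; substitute a real TMY/weather
# file for an actual site assessment.
import random

random.seed(42)
hourly_temps_c = [random.gauss(mu=12.0, sigma=8.0) for _ in range(8760)]

FREE_COOLING_LIMIT_C = 18.0  # assumed max outdoor temp for full free cooling

free_hours = sum(1 for t in hourly_temps_c if t <= FREE_COOLING_LIMIT_C)
print(f"Estimated free-cooling hours: {free_hours} "
      f"({free_hours / 8760:.0%} of the year)")
```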

Join Future Facilities in this webcast to learn how you can use the power of computer simulation in your next data center design. The webcast will cover the use of 6SigmaDCX to create custom evaporative cooling and air-to-air heat exchanger units along with sophisticated control systems to create the best cooling delivery for your data center.

Watch the video here: http://bit.ly/DCXdesign


Modular Cold Aisle Containment and Legacy Cold Aisle Containment: A Comparative CFD Analysis


February 11, 2015

Last year, Upsite Technologies introduced the first-ever ‘modular’ containment system to the market, AisleLok® Modular Containment. Many operators were intrigued by the unique design, as no one had previously seen a containment system with an open architecture. But as with many things in life, appearance isn’t everything. Though the open architecture of AisleLok® appears counterintuitive, it was specifically engineered to effectively separate the cold supply air from the hot exhaust air with a minimum of materials, while being extraordinarily easy to install, limiting obstructions to fire suppression and providing flexibility for relocation.

But beyond these unique features, we wanted to thoroughly examine how our new AisleLok® solution stacked up against more traditional legacy containment systems, so we set out to do some research. We created a CFD model using Future Facilities’ 6Sigma software and laid out a typical 5,280 sq ft data center. We then analyzed the conditions using three distinct approaches: Approach 1 had no containment (the baseline), Approach 2 had cold aisles contained with AisleLok Modular Containment, and Approach 3 had cold aisles contained with a legacy containment system. Here’s what the CFD results revealed, as fully explained in our new white paper.

Fig 1: Baseline Model of Data Center at 5,280 Sq Ft (490 Sq M)


Temperature Reduction

Both forms of cold aisle containment had a significant impact on cabinet inflow air temperature. For mid-row cabinets, AisleLok Modular Containment reduced the cabinet air inflow temperature from 85.9°F (29.9°C) down to 67.3°F (19.6°C). The legacy containment system reduced the air inflow temperature down to 63.3°F (17.4°C), a difference of only 4.0°F (2.2°C). This showed that both AisleLok Modular Containment and legacy containment allow for significant inflow temperature reduction.

Airflow and Cooling Reduction

Once IT equipment intake air temperatures were reduced, fan speeds could be lowered to realize operational cost savings. Again, both forms of containment allowed for a significant reduction in cooling airflow rate compared to an uncontained cold aisle. AisleLok Modular Containment enabled a 30% reduction in airflow to the raised floor without IT equipment intake temperatures exceeding the ASHRAE recommended maximum of 80.6°F (27°C). Legacy containment performed slightly better, allowing for a 35% reduction in airflow to the raised floor.
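Those percentages understate the energy impact, because fan power scales roughly with the cube of airflow (the fan affinity laws). The sketch below applies that shortcut to the two reduction figures; it is an idealized upper bound, not a result from the white paper’s CFD model.

```python
# Fan affinity laws: power scales ~ (flow ratio)^3, so modest airflow cuts
# yield outsized fan-energy savings. Idealized; real savings depend on the
# fan curve and how the reduction is achieved.

def relative_fan_power(airflow_reduction: float) -> float:
    """Fan power as a fraction of baseline after cutting airflow."""
    return (1.0 - airflow_reduction) ** 3

for name, cut in [("AisleLok Modular Containment", 0.30),
                  ("Legacy containment", 0.35)]:
    saved = 1.0 - relative_fan_power(cut)
    print(f"{name}: {cut:.0%} less airflow -> ~{saved:.0%} lower fan power")
# AisleLok: ~66% lower; legacy: ~73% lower (ideal-case upper bounds)
```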

Energy Savings

Energy savings are often a major motivator for data centers to install a cold aisle containment solution. As expected, both forms of containment yielded significant savings. Assuming an energy cost of $0.10/kWh, AisleLok Modular Containment yielded annual energy savings of about $32,000, a substantial sum for such a simple installation. Legacy cold aisle containment provided slightly more, at about $35,000 per year.

ROI & Total Cost of Ownership (TCO)

This is a biggie, and quite frankly where the divide between the two solutions becomes more apparent. Because AisleLok Modular Containment attaches with magnets and requires zero customization, it can be installed by internal staff in a very short amount of time, which saves the cost of having a third party come out to measure, design, and install a solution. Ordering and installing full containment requires many man-hours, which further inflates its already higher product cost. Because of this, the ROI difference between the two was significant: AisleLok Modular Containment can pay for itself in about 13.5 months, while legacy containment takes at least 24 months to fully pay for itself. The chart below elaborates on this:

*Cost estimates are based on full installation of the computer room configuration in section 2.2 of the white paper. The model features 138 IT cabinets and measures 5,280 ft² (490 m²).
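As a rough cross-check of those payback periods, here is a minimal breakeven sketch. The annual savings come from the figures above; the installed-cost values are hypothetical placeholders chosen to reproduce the quoted 13.5- and 24-month paybacks, not prices from the white paper.

```python
# Breakeven month = installed cost / monthly energy savings.

def breakeven_months(capex: float, annual_savings: float) -> float:
    """Months until cumulative savings cover the installed cost."""
    return capex / (annual_savings / 12)

options = {
    # name: (assumed installed cost $, annual energy savings $)
    "AisleLok Modular Containment": (36_000, 32_000),
    "Legacy full containment": (70_000, 35_000),
}

for name, (capex, annual) in options.items():
    print(f"{name}: payback in ~{breakeven_months(capex, annual):.1f} months")
# AisleLok: ~13.5 months; legacy: ~24.0 months
```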

The spread is even greater if you consider changes that may need to be made to the containment system after installation. If additional cabinets are added to a row, for example, a legacy containment system will have to be reconfigured, which requires even more design and installation work from third parties. Conversely, AisleLok containment is modular, so you can move the Bi-Directional Doors yourself to the new outermost rack and add more Rack Top Baffles as needed.

Conclusion

When choosing a cold aisle containment solution, many factors determine which will work best for your situation. But comparing AisleLok Modular Containment with a legacy containment system, it’s easy to see that the performance of the two is very similar. Each is effective at decreasing IT intake temperatures and enabling the reduction of energy costs. However, the benefits of modularity and self-installation give AisleLok the edge in many situations, especially when it comes to Total Cost of Ownership (TCO).

Learn more about how AisleLok Modular Containment stacks up to full containment by reading our newest white paper, AisleLok Modular Containment vs. Full Containment: A Comparison Study.


Video Demo Introducing 6SigmaDCX

We invite you to watch and see why 6SigmaDCX (http://www.6sigmadcx.com/) has been branded the ‘iOS of data center CFD software packages’, with its new interface and revolutionary new features.

Watch the software demo: http://www.6sigmadcx.com/media/webinars/6SigmaDCX_Webcast.php


Technology Adoption – 2 beliefs you need to undo

Originally posted on Technology Trend Analysis:

Here is further proof that “Consumerization of IT” (CoIT) is a reality, and that it has significantly altered the dynamics of technology adoption. Before I explain this shift, let us look at…

How experts explain technology adoption cycle

The accepted premise is that every new technology goes through the following phases:

  1. Hype: The search for the next big thing leads to hype around any new technology.
  2. Struggle: Adoption of these bleeding-edge technologies depends on visionaries who have the vision, energy, and money to make them work.
  3. Success: Mainstream adoption requires convincing the pragmatists, who need success stories and a support system around the technology.

Not all technologies made it to the mainstream. All of this is from the perspective of the enterprise; consumers had very little role to play in this lifecycle. This underlying theme comes out both in the “Hype Cycle” model used by Gartner since 1995 and in the…

View original post (612 more words)


CFD Modeling Can Help Maximize Datacenter Performance


By Maryellen Lo Bosco

By analyzing air flow, computational fluid dynamics (CFD) tools can help facility managers maximize data center capacity. While data center infrastructure management (DCIM) has good value, supplementing it with CFD modeling provides a higher level of monitoring in the data center and more predictive ability.

CFD modeling is emerging as a new tool that data center facility managers can use to fight hot spots. Most data centers, especially older ones, have cooling problems and hot spots, explains David Cappuccio, managing vice president and chief of research for data centers at Gartner. In the past, the problem was solved by moving equipment around as issues came up, but the current trend is to look at the entire floor space to get the most efficient airflow possible, which is where CFD modeling comes in.

“By running these models you can see how air is being distributed around the data center,” says Mark Evanko, principal, BRUNS-PAK. CFD modeling focuses on airflow, static pressure, and temperature, and can predict where hot spots will show up, so it can be used to determine the best place to put new equipment. “Before CFD modeling, you might be placing 10 different air-conditioning units throughout the data center, but when you deploy the CFD model you might just need six. It’s also a tool for optimizing planning going forward,” Evanko says.

The need for modeling arises from the way IT equipment is installed in the data center. Data centers are designed and built on a projection of the computing capacity needed to support the business for many years, explains Christopher Wade, national program director, critical environment operations, Newmark Grubb Knight Frank. In reality, this projection is just a “best guess,” because the type of equipment that goes into them changes over time as organizations install whatever systems the business needs. IT systems are upgraded or changed every six months in some cases, and those changes have to be addressed on the data center floor. Amid this rapid evolution, it’s hard to predict how much power, space, and cooling will be needed.

Using Space Efficiently

Predictive analytics used for data center infrastructure management (DCIM) are intended to make more efficient use of the data center floor: “more compute per kilowatt and more compute per square foot,” Cappuccio says. “You can get higher density in the racks and not have to build the next data center for years to come.”

While DCIM does a good job of keeping track of temperatures, power, and space, it cannot tell the facility manager about air flow or predict the impact on the data center of future system installations. Wade says it is now necessary to have a higher level of monitoring in the data center, and supplementing DCIM with CFD provides engineering simulation tools that can predict the impact of installing equipment in any location on the floor.

“Adding CFD to DCIM allows you to predict the nature of real computer deployments you have going out to the data center,” Wade says. The modeling tool will immediately let the data center manager know whether the placement of the new equipment will create a hot spot or some other problem.
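To make the idea concrete, here is a toy version of that placement check. A real tool such as 6SigmaDCX solves the full airflow field; this sketch only applies the standard heat-balance rule of thumb (CFM ≈ 3.16 × watts ÷ ΔT in °F) to flag racks whose available supply airflow cannot absorb their heat load. All rack data below are invented.

```python
# Toy placement check: does each rack get enough cold air for its IT load?
# Rule of thumb: CFM ~= 3.16 * watts / deltaT_F (from q = m*cp*dT for air).

def required_cfm(it_load_watts: float, delta_t_f: float = 20.0) -> float:
    """Airflow needed to carry away the rack's heat at the given temp rise."""
    return 3.16 * it_load_watts / delta_t_f

racks = [
    # (rack id, IT load in watts, supply airflow available at the rack, CFM)
    ("A1", 5_000, 900.0),
    ("A2", 8_000, 1_100.0),
    ("A3", 12_000, 1_400.0),  # candidate spot for the new equipment
]

for rack_id, load_w, supply_cfm in racks:
    need = required_cfm(load_w)
    status = "OK" if supply_cfm >= need else "HOT SPOT RISK"
    print(f"Rack {rack_id}: needs {need:.0f} CFM, has {supply_cfm:.0f} -> {status}")
```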

“The data center is now in a similar place that individual computers were a few decades ago,” says Jonathan Koomey, research fellow at the Steyer-Taylor Center for Energy Policy and Finance at Stanford University and an expert on data center energy use and management. “The software and hardware are now good enough so that we can do computer models of the whole data center.” It is possible to take into consideration every individual piece of equipment — for example, a server with a specific type of fan that blows in a certain way — and incorporate it into the data center model in a way that provides insight, Koomey says.

The industry is moving away from reconfiguring data centers by “rules of thumb” and toward 3-D models that use temperature, airflow, and equipment location to predict hot spots, both in existing facilities and in the design of new ones.

Original article: http://www.facilitiesnet.com/datacenters/article/CFD-Modeling-Can-Help-Maximize-Data-Center-Capacity–15586?source=FeaturedBOM-01/2015#
