There’s been a lot of press over the last few years about “Google type” data centers that use 100% fresh air for cooling. In many cases, these designs bring fresh air directly in from outside and may use some type of evaporative cooling to help temper the air. In general terms, these data centers are willing to accept wider variation in the ambient environment, along with any contaminants carried in with that air. Filtration can obviously help, but you can only filter so much. Relying on filters to ensure data center uptime is risky: a major nearby fire, a sandstorm, or some other event that clogs the filters leaves you exposed to a single point of failure. The question I ask is whether this is a good idea for typical data centers. I get asked this question often, and the feelings I hear from data center operators are mixed. There tends to be a mentality of “I’m not sure I’m like ‘Google’, even though I’d like to lower energy costs.”

To be clear, I’m not trying to bash the ‘Google type’ data center; it makes a lot of sense for its intended application (those guys at Google are pretty smart!). However, the fact is that many data centers today are more limited in their ability to leverage a fresh air design. For instance, if the data center isn’t close to an exterior wall or roof, then it can be overly complicated to get the fresh air into the IT room. What if it’s a small data center in the middle of an office building? Our data center in St. Louis is an example where significant work would have to occur to the building to utilize 100% fresh air. And being business people, we have to look at whether it’s worth the expense.

Not to mention that many of the people I talk to are inherently more risk-averse than the ‘Google types’. Issues like a potential chemical spill or outdoor pollution being brought directly into the IT room are concerns I have heard raised. One data center manager told me that no matter what ASHRAE says, he’s not going to allow the temperature to vary much in his data center. It’s the old mentality of “I won’t get fired if the electric bill is high, but I will if the application goes down.” We developed our EcoBreeze solution to separate the outside air from the inside air by using an air-to-air heat exchanger. We did this in response to our research showing that the mainstream market has serious concerns about bringing in fresh air directly. The irony is that now, when I present EcoBreeze at conferences, people challenge me during the Q&A session on how much it costs to run the air-to-air heat exchanger, implying that it’s inefficient.
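
A rough back-of-envelope comparison shows why moving heat with fans through an air-to-air heat exchanger can still beat compressor-based cooling by a wide margin. Every number below (IT load, fan power fraction, chiller COP) is an illustrative assumption, not an EcoBreeze specification:

```python
# Back-of-envelope energy comparison: air-to-air heat exchanger vs. chiller.
# All figures are illustrative assumptions, not vendor specifications.

it_load_kw = 500.0         # assumed IT heat load to reject

# Indirect air economizer: the energy cost is mostly fan power on both streams.
fan_power_fraction = 0.08  # assume fans draw ~8% of the heat load they move
economizer_kw = it_load_kw * fan_power_fraction

# Conventional mechanical cooling: compressor work = load / COP, plus air handlers.
chiller_cop = 3.5          # assumed coefficient of performance
chiller_kw = it_load_kw / chiller_cop + it_load_kw * 0.05  # +5% for fans

print(f"Heat exchanger (fans only): {economizer_kw:6.1f} kW")
print(f"Mechanical cooling:         {chiller_kw:6.1f} kW")
print(f"Approximate savings:        {chiller_kw - economizer_kw:6.1f} kW")
```

Even if the assumed fan fraction were doubled, the heat exchanger would still consume a fraction of what the compressors do, which is the point the Q&A skeptics tend to miss.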

It’s interesting to me that I have yet to see someone at one of these presentations stand up publicly and say they don’t think fresh air is a good idea. Usually they come up to me afterwards and discuss their concerns one on one: “I didn’t want to say anything in front of the group, but I’m not risking my job on 100% fresh air.” It’s a fascinating dynamic to experience. I’m really not sure whether it’s smart to be aggressive or conservative with the use of 100% fresh air. Regardless, I do believe that most data centers can get a lot more efficient by doing some basic things well: utilizing air containment strategies, right-sizing IT load against power and cooling capacity, and ensuring economization is programmed and operating properly. Time will tell, but I expect we will still see water in a lot of data centers for the foreseeable future. Fresh air or water — take your pick.
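
The arithmetic behind the “basic things” point is just the standard PUE ratio (total facility power divided by IT power). The before-and-after figures in this sketch are hypothetical, chosen only to show how containment, right-sizing, and working economization move the number:

```python
# Minimal PUE (Power Usage Effectiveness) arithmetic: PUE = facility kW / IT kW.
# The before/after figures below are hypothetical illustrations.

def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    """Total facility power divided by power delivered to IT equipment."""
    return (it_kw + cooling_kw + other_kw) / it_kw

# Before: no containment, misconfigured economizer, oversized cooling plant.
before = pue(it_kw=500, cooling_kw=400, other_kw=100)  # -> 2.00

# After: containment + right-sizing + a properly programmed economizer.
after = pue(it_kw=500, cooling_kw=150, other_kw=75)    # -> 1.45

print(f"PUE before: {before:.2f}, after: {after:.2f}")
```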

Please follow me on Twitter @KevinBrown77. Kevin Brown is Vice President, Data Center Global Offer for Schneider Electric. He leads a team of industry professionals who develop and bring to market solutions for the data center market. In this role, he is responsible for articulating the vision for Schneider Electric’s data center offer and creating comprehensive data center solutions that solve real customer problems today. Kevin is an experienced professional in both the IT and HVAC industries, with over 20 years at Schneider Electric in a variety of senior management roles, including product development, product management, marketing, and sales.

We've all done it — made that stupid mistake and hoped nobody saw it, prayed that it wouldn't have an adverse effect on the systems or the network. And it's usually okay, so long as the mistake didn't happen in the data center. It's one thing to let your inner knucklehead come out around end-user desktop machines. But when you're in the server room, that knucklehead needs to be kept in check. Whether you're setting up the data center or managing it, you must always use the utmost caution.

Well, you know what they say about the best-laid plans... Eventually you will slip up. But knowing about some of the more common mistakes can help you avoid them.

You know the old adage — measure twice, cut once. How many times have you visited a data center and seen cables everywhere? On the floor, hanging down from drop ceilings, looped over server racks and desks. This should simply not happen. Cable layout should be given the care it needs. It is not only a safety hazard; it is a disaster waiting to happen. Someone gets tangled up and goes down, and you run the risk of a lawsuit AND data loss, all because someone was too lazy to measure cable runs or take the time to zip-tie some Cat5.

I know, this might seem crazy, but I've witnessed it firsthand too many times. Admins (or other IT staff) enter the data center, drink in hand, and spill that drink onto (or into) a piece of equipment. In a split second, that equipment goes from life to death with no chance for you to save it.

Every data center should have a highly visible sign that says, "No drink or food allowed." This policy must be enforced with zero tolerance and no exceptions. Even covered drinks should be banned.

The same carelessness applies to nearly any electricity problem: accidentally shutting off power, lack of battery backups, no generator, pulling too much power from a single source. Electricity is the lifeblood of the data center. Without it, your data center is nothing. At the same time, electricity is your worst enemy. If you do not design your electrical systems to prevent failures, your data center begins its life at a disadvantage. Make sure all circuit breakers (and any other switch that could cause an accidental power loss) have covers, and that your fire alarms and cutoff switches are not located where they might tempt pranksters.

How many keys to your data center have you given out? Do you have a spreadsheet with every name associated with every key? If you aren't keeping track of who has access to the data center, you might as well open up the door and say, "Come steal my data!"
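
That spreadsheet of names and keys is worth keeping machine-readable so you can actually audit it. A minimal sketch, assuming a hypothetical keys.csv with key_id, holder, issued_on, and returned_on columns:

```python
# Minimal key-audit sketch. Assumes a hypothetical keys.csv of the form:
#   key_id,holder,issued_on,returned_on
#   K-014,J. Smith,2023-02-01,
#   K-015,Intern #3,2023-06-10,2023-08-15
import csv

def outstanding_keys(path: str) -> list[dict]:
    """Return rows for keys that were issued but never marked returned."""
    with open(path, newline="") as f:
        return [row for row in csv.DictReader(f)
                if not (row["returned_on"] or "").strip()]

if __name__ == "__main__":
    for row in outstanding_keys("keys.csv"):
        print(f"{row['key_id']} still held by {row['holder']} "
              f"(issued {row['issued_on']})")
```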

And what about that time you propped the exit door open so you could carry in all of those blades and cable? How long was that open door left unattended? Or what about when you gave out the security code to the intern or the delivery man to make your job easier... See where this is going?

When you step into the data center, what is your first impression? Would you bring the CEO of the company into that data center and say, "This is the empire your money has paid for"? Or would you need a day's notice before letting the chairman of the board lay eyes on your work?

How exactly did you map out that network? What are the domain credentials, and which server does what? If you're about to head out on vacation and you've neglected to document your data center, your second in command might have a bit of drama on his or her hands. Or worse, you've forgotten the domain admin credentials yourself. I know, I know — fat chance. But there's this guy named Murphy. He has this law. You know how it goes.
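
Documentation doesn't have to be elaborate to spare your second in command the drama. A minimal sketch of a machine-readable inventory; the hostnames, racks, and roles here are made up, and actual credentials stay in a password vault, referenced by entry name only:

```python
# Minimal data center inventory sketch. Hostnames, racks, and roles are
# made up; real credentials belong in a vault, referenced here by name only.
INVENTORY = {
    "dc1-dns01": {"role": "primary DNS",       "rack": "A3", "vault_entry": "dc1-dns01-admin"},
    "dc1-dc01":  {"role": "domain controller", "rack": "A3", "vault_entry": "domain-admin"},
    "dc1-sql01": {"role": "ERP database",      "rack": "B1", "vault_entry": "dc1-sql01-sa"},
}

def whoami(host: str) -> str:
    """One-line answer to 'which server does what?' for a given host."""
    info = INVENTORY[host]
    return (f"{host}: {info['role']} (rack {info['rack']}, "
            f"creds in vault entry '{info['vault_entry']}')")

print(whoami("dc1-dc01"))
```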

If you're not documenting your data center, eventually the fates will decide it's time to deal you a dirty hand, and you will have a tangled mess to sift through.

How many times have you caught yourself or your IT staff using one of the machines in the data center as a desktop? Unless that machine is a Linux or Mac desktop, one time is all it takes to send something like the sexy.exe virus running rampant through your data center. Yes, an end user can do the same thing. But why risk having that problem originate in the heart of your network topology? Sure, it'd be cool to host a LAN party in your data center and invite all your buds for a round of CoD or WoW. Don't.

When was the last time you actually visited your data center? Or did you just "set it and forget it"? Do you think that because you can remote into your data center, everything is okay? That data center needs a regular visit. It doesn't need to be an all-day tour. Just stop by to check batteries, temperature, cabling, etc. If you fail to give the data center the face time it needs, you could wind up with a disaster on your hands.
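
Remote monitoring is no substitute for face time, but it can at least tell you when a visit can't wait. A minimal sketch of a threshold check; the sensor names, readings, and limits are all hypothetical:

```python
# Minimal threshold check on remote readings. Sensor names, values, and
# limits are hypothetical; this complements physical walkthroughs, it does
# not replace them.
READINGS = {                       # e.g., pulled from your monitoring system
    "cold-aisle-temp-C": 24.5,
    "ups-battery-charge-pct": 62.0,
    "humidity-pct": 42.0,
}
LIMITS = {                         # (min, max) acceptable range per sensor
    "cold-aisle-temp-C": (18.0, 27.0),
    "ups-battery-charge-pct": (80.0, 100.0),
    "humidity-pct": (20.0, 80.0),
}

for sensor, value in READINGS.items():
    lo, hi = LIMITS[sensor]
    if not lo <= value <= hi:
        print(f"ALERT: {sensor} = {value} outside [{lo}, {hi}] -- go look in person")
```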

You're proud of your data center — so much so that you want to show it off to the outside world. So you bring in the press; you allow tours to walk through and take in its utter awesomeness. But then one of those tourists gets a bit too curious, and down goes the network. You've spent hundreds of thousands of dollars on that data center (or maybe just tens of thousands, or even just thousands). You can't risk letting the prying eyes and fingers of the public gain access to the tenth wonder of the world.

Don't deny it: You've spent all-nighters locked in your data center. Whether it was a server rebuild or a downed data network, you've sucked down enough caffeine that you're absolutely sure you're awake enough to do your job and do it right. But if you've already spent nine or 10 hours at work, the last thing you need to do is spend another five or 10 trying to fix something. Most likely you'll break more things than you fix. If you have third-shift staff members, let them take care of the problem.