For Data Center Issues: Use Your Senses To Help Your Sensors
Technology is wonderful. Automation, software, programming, robotics. All great stuff.
In the data center business, the hub of so much of today's technology, day-to-day operations once performed by humans with clipboards, pencils, and spreadsheets have long since been replaced by powerful automated processes for monitoring power, space, cooling, and hardware.
Besides the countless man-hours these menial tasks burned up, the primary reason for replacing data center tours and inspections is that the greatest cause of unplanned downtime is human error. That's a bummer; it makes us humans feel outclassed and uneasy.
Now, operators basically no longer have to leave their desks. Automated scripts quietly work their magic and send notice of environmental and hardware abnormalities via texts and emails.
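At heart, those scripts boil down to comparing sensor readings against thresholds and raising an alert when something drifts out of range. Here is a minimal sketch of that idea; the sensor names, limits, and readings are illustrative, not from any real DCIM product.

```python
# Hypothetical sketch of a DCIM-style threshold check.
# Sensor names and limit values here are made up for illustration.

def check_readings(readings, limits):
    """Return alert messages for any sensor outside its (low, high) limits."""
    alerts = []
    for sensor, value in readings.items():
        low, high = limits.get(sensor, (float("-inf"), float("inf")))
        if value < low or value > high:
            alerts.append(f"{sensor}: {value} outside range {low}-{high}")
    return alerts

# Example: a cold-aisle supply temperature that has drifted too high.
alerts = check_readings(
    {"cold_aisle_temp_f": 80.5, "humidity_pct": 45.0},
    {"cold_aisle_temp_f": (64.0, 80.0), "humidity_pct": (40.0, 60.0)},
)

# In production, each alert would be handed to an email or SMS gateway;
# here it is simply printed.
for alert in alerts:
    print(alert)
```

A real deployment would of course poll live sensors and route alerts through a notification service, but the core logic is this simple comparison loop.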
However, we humans are not completely done. We still have a purpose, and we have our own bodily tools to assist our software-driven electronic-mechanical friends living in rows of data cabinets.
The next time you’re strolling through a data center, particularly up and down cold and hot aisles, take a quick stop to observe the environment. You’d be surprised at what you can discover with just your naked eyes, hands, and a dollar bill. Yes, no need for heat guns, handheld devices running DCIM software, or monitoring equipment. For now, anyway.
A modern data center, a technological wonder world filled with automation. Most are unmanned these days, but humans can still add their two cents (or senses).
It’s nice to have these tools, and in the end, please do use them, but your natural senses are wonderful tools, too.
Let’s take the sense of touch. Walk down a cold aisle of a data center. A cold aisle consists of parallel rows of data cabinets in which the fronts of the servers face each other. The floor tiles are slotted or perforated for upward-shooting cold air from CRAC or CRAH units. The aisle is usually sealed with doors at each end and a ceiling to keep the cold air contained.
As you walk, place your arms out horizontally. With your hands, you should feel the cold air hitting the perforated doors of the server racks. Move your arms upward and downward as you walk. You may notice a change in temperature, a bit warmer perhaps. Now, take a look. Inspect the inside of the rack. Maybe a blanking panel is damaged or missing, allowing backend exhaust from servers to flow through the opening to mix in with the cold air.
This may also be a good time to break out the old George Washington, or a fiver, or a sawbuck. Place the bill on the perforated door. If the servers are working correctly, the paper money should stick to the door like a magnet on a refrigerator. This is because the front ends of the servers are drawing in the cold air with fans to cool the inside of the chassis. The exhaust is discharged at the back end into the hot aisle.
Now, take the dollar bill and place it at the warmer spot. Maybe the bill slips down a bit, wavers, or falls off, depending on the location of the warm spot. Check: is a blanking panel cracked or missing? That may be the source of the warmer air pushing through. Make a note and repair the opening. Sometimes blanking panels fall due to improper installation after hardware maintenance. It’s always best practice to check for proper blanking panel installation after maintenance, installs, and decommissions.
Another reason for warm spots is nontraditional intake/exhaust equipment in the data racks. These types of hardware are usually network devices that intake and exhaust from the sides or tops, although some servers may intake from the top. Again, take out your trusty dollar bill. Open the cabinet door. Place the bill on the front of the hardware. If it falls, the device is either dead or taking in air from the sides or top. Run the dollar bill along the sides to identify intake or exhaust, just as was done at the rack door earlier.
This image captures the cooling process for servers in a data center. DCIM software monitors environmental conditions in the data center as well as the health of servers.
“Side-breathers” and “top-breathers” should not live in the same racks as traditional front-to-back-breathing hardware. Mixing them causes warm-air recirculation within the rack. Make it a habit to test every piece of hardware for airflow direction prior to installation.
Sometimes switches in server racks ship with reversed airflow. Most of the time, this is corrected by installing the correct airflow fan and power supply kits, but it still requires checking prior to installation. Once the device is live, fixing the issue requires downtime.
The same touchy-feely sense also works at the back end of the racks. To get a quick idea of how many kilowatts a piece of equipment is churning out, a simple test is to stand behind the exhaust side of the rack. Racks with exhaust chimneys have solid doors; those without employ perforated or slotted doors.
Try to gauge the intensity of the exhaust by comparing it to the hot-air blast of a common bathroom blow dryer. A typical blow dryer draws between 800 and 1,800 watts. Does the exhaust coming from the back of a 1RU or 2RU server feel as strong as that of the blow dryer? A 1,500-watt blow dryer running around the clock would consume 36 kWh per day, or about 13,140 kWh per year; at an average rate of 10 cents per kWh, that works out to roughly $1,314 a year. That’s a lot of hot air and money!
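The arithmetic behind that comparison is worth making explicit. Here is a worked version of the blow-dryer estimate; the 1,500-watt draw and the $0.10/kWh rate are illustrative averages, not measured values.

```python
# Worked version of the blow-dryer energy-cost comparison.
# The wattage and electricity rate are illustrative assumptions.

def annual_energy_cost(watts, hours_per_day=24, rate_per_kwh=0.10):
    """Return (kWh per day, kWh per year, dollars per year) for a constant load."""
    kwh_per_day = watts / 1000 * hours_per_day   # convert W to kW, scale by hours
    kwh_per_year = kwh_per_day * 365
    return kwh_per_day, kwh_per_year, kwh_per_year * rate_per_kwh

per_day, per_year, cost = annual_energy_cost(1500)
print(per_day, per_year, round(cost, 2))  # -> 36.0 13140.0 1314.0
```

Plug in your own rack's draw and local utility rate to see what all that hot air is really costing.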
Besides touch or feel, plain eyesight is also a good observation tool. Most equipment is fitted with colored lights to indicate states of operation. For example, green may signify power on while red or amber signify some sort of component failure. Some lights blink, and some are solid.
Auditing and inspecting racks on a daily basis tunes the eyes to zero in on changes. It becomes natural when a light state does not look right. A red light on a power supply may mean a failure or a loose or disconnected power cord. Hopefully, DCIM or monitoring software picks this up, but if not, the naked eye could.
Our sense of smell can detect burning or melting plenum cable or circuitry. Sometimes power supplies pop and servers ignite. Most data centers are equipped with environmental sensors that detect fire or smoke, but smaller occurrences sometimes go undetected.
The last human sensor is our ears. In general, data centers are noisy. IT equipment and facility infrastructure make all kinds of sounds. Servers usually get louder during periods of heavier processor load.
Ears, combined with the other senses, can detect a problem when a particular server is cranking at an unusually loud level. Further inspection may reveal a red-faulted power supply (eyes) that forced the server to ramp up fan speed, with its accompanying noise (ears), to compensate for lost cooling, while the nose sniffs out the burnt power supply. At this point, keep the dollar bill in your pocket.
Our human senses can pick up a lot during simple saunters through data centers, but a lot of today’s sites are unmanned and rely on monitoring, DCIM software, and automation processes to detect issues in and outside the racks.
Many data center infrastructure software companies offer a large variety of intelligent PDUs, environmental sensors, and DCIM software to assist our hands, eyes, noses, and ears, and combined, the solutions should save you a lot of energy dollars.
This article was originally published on @vpanageas