What’s The Difference Between AC and DC Power?
Posted by RJ Tee on October 26, 2017
- Power Distribution & Monitoring
Are you wondering about the best way to transmit electricity to network devices throughout your data center? Some argue that alternating current (AC) is the less expensive and more accessible way to deliver power, while others claim that it's safer and more efficient to use direct current (DC).
What's the difference between the two types of current? In a nutshell, AC periodically reverses direction, while DC flows steadily in one direction, a "direct" line. In the U.S., AC oscillates at a frequency of 60 Hz, or 60 cycles per second. Because AC power alternates, you need only a transformer to adjust the voltage, the force that pushes the current.
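To make the contrast concrete, here is a minimal sketch (values chosen for illustration, not from the article) that samples one cycle of a 60 Hz AC sine wave alongside a constant DC level and counts how often the AC voltage reverses sign:

```python
import math

def ac_voltage(t, v_peak=170.0, freq=60.0):
    """Instantaneous AC voltage: a sine wave that reverses polarity.
    170 V is roughly the peak of a 120 V RMS U.S. mains supply."""
    return v_peak * math.sin(2 * math.pi * freq * t)

def dc_voltage(t, v=12.0):
    """DC voltage: constant over time (12 V chosen for illustration)."""
    return v

# Sample slightly more than one full 60 Hz cycle (1/60 s), ten samples
# per cycle, offset by half a step to avoid landing exactly on zero.
samples = [ac_voltage((n + 0.5) / 600.0) for n in range(11)]

# Count polarity reversals between consecutive samples.
sign_changes = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)

print(sign_changes)               # AC reverses direction twice per cycle
print(dc_voltage(0.0) == dc_voltage(0.01))  # DC never changes: True
```

The AC waveform crosses zero twice every cycle (120 times per second at 60 Hz), while the DC function returns the same value no matter when you sample it.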
The voltage of a DC supply (such as a fuel cell or battery), by comparison, cannot be stepped up or down with a simple transformer; it requires more complex conversion equipment, as in a high-voltage direct current (HVDC) system. Transmitting DC is therefore generally less efficient and more expensive than transmitting AC, though it can be done.
What's the better power solution for your data center? The truth is that both have specific uses; it's not a matter of choosing one or the other. In fact, migrating away from AC would be impractical and difficult because your facility is most likely already wired to receive it. It's much less expensive and much more practical to use AC directly from the power grid for your daily data center power needs. Most data centers take AC, convert it to DC to charge batteries and run DC equipment, and then convert it back to AC to power their machines.
Server Technology carries a full line of rack mount power strips that allow you to feed AC or DC power to your data center equipment cabinets or remote sites, all while providing advanced power monitoring metrics and management capabilities.