When purchasing a power adaptor for an electrical device, you will encounter two general types: AC adaptors and DC adaptors. While they may look very similar and have similar-looking inputs and outputs, their functions are very different.
Alternating Current Adaptors
According to The American Heritage Dictionary of the English Language, Fourth Edition, alternating current (AC) is "an electric current that reverses direction in a circuit at regular intervals." In other words, the current (measured in amperes, or amps) flows first in one direction and then in the other, reversing at a regular rate. That rate is measured in cycles per second, or hertz (abbreviated Hz). An AC adaptor converts one AC voltage into another AC voltage. For example, you may see an AC adaptor that converts the voltage from a wall outlet into 12 volts alternating current (abbreviated 12 V AC). In the United States, that means the voltage is converted from 120 V AC to 12 V AC.
An AC adaptor will also be rated for maximum power output in watts. Watt's Law (named for James Watt, the 18th-century Scottish engineer) states that 1 watt (1 W) is the power in a circuit where 1 amp (1 A) of current flows across a potential difference of 1 volt (1 V). In other words, power equals voltage multiplied by current. Therefore, a 12 V AC adaptor rated at 36 W will have a maximum output of 3 A.
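The arithmetic above can be sketched in a few lines of Python; the figures are the 12 V, 36 W example from this article:

```python
def max_current_amps(power_watts, voltage_volts):
    """Watt's Law: power = voltage * current, so the
    maximum current an adaptor can deliver is power / voltage."""
    return power_watts / voltage_volts

# A 12 V adaptor rated at 36 W can deliver at most 3 A.
print(max_current_amps(36, 12))  # 3.0
```

The same calculation applies to both AC and DC adaptors, since both are rated in volts and watts.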
Unless an AC adaptor changes the rate at which the current reverses direction, it will be rated only at the input frequency (such as 60 Hz in the United States).
Direct Current Adaptors
The American Heritage Dictionary of the English Language defines direct current (DC) as "an electric current flowing in one direction only." Direct current therefore flows in a constant direction, with no frequency and no reversal of polarity (switching between positive and negative).
A DC adaptor differs from an AC adaptor in that the DC adaptor converts AC electricity into DC electricity. For example, a 12 V DC adaptor sold in the United States will convert 120 V AC at 60 Hz into 12 V DC.
Like its AC counterpart, a DC adaptor is rated for maximum power output in watts. A 12 V DC adaptor rated at 36 W is therefore also capable of a maximum output of 3 A. The difference is that its current is constant, flowing in one direction, rather than the constantly reversing alternating current that an AC adaptor puts out.
Differences Between AC and DC Adaptors
While AC and DC adaptors are rated using similar terminology and units, their outputs are very different. Because its current constantly reverses direction, AC electricity can damage some electrical circuits that were designed for DC electricity. Similarly, DC electricity will cause excess heat in some electrical components, such as transformers, damaging or destroying them.
Because damage can occur if the wrong type of adaptor is plugged into a device, it is important to be able to tell the two apart. An AC adaptor lists the voltage and current settings it converts from (such as 120 V AC, 1.5 A, 60 Hz) first. The voltage it converts to should appear below the "from" voltage (such as 12 V AC, 15 A). AC voltage may also be indicated by a wavy line resembling a tilde (~). An example would be "120 V ~ 1.5 A, 60 Hz".
A DC adaptor likewise lists the voltage and current settings it converts from first (such as 120 V AC, 1.5 A, 60 Hz). However, the voltage it converts to will be labeled "DC" (such as 12 V DC, 10 A) or marked with a symbol showing a dashed line below a solid line (see photo).
Which Adaptor Is Correct for My Device?
Typically, your electronic device will tell you which type of electrical current it requires. The requirement will appear in the same form, such as "12 V DC, 10 A" or "12 V AC, 10 A, 60 Hz" (see photo).
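The matching rule described above can be sketched as a small check. This is a minimal illustration, not a substitute for reading the actual labels; the dictionary fields ("type", "volts", "amps") are hypothetical names chosen for this example:

```python
def adaptor_is_suitable(device, adaptor):
    """Return True if the adaptor's output matches the device's needs.

    Both arguments are dicts like {"type": "DC", "volts": 12, "amps": 10}.
    The current type (AC or DC) and voltage must match exactly; the
    adaptor's maximum current must meet or exceed what the device draws.
    """
    return (adaptor["type"] == device["type"]
            and adaptor["volts"] == device["volts"]
            and adaptor["amps"] >= device["amps"])

# A device requiring 12 V DC at 10 A, paired with a 12 V DC, 15 A adaptor:
device = {"type": "DC", "volts": 12, "amps": 10}
adaptor = {"type": "DC", "volts": 12, "amps": 15}
print(adaptor_is_suitable(device, adaptor))  # True
```

Note that the adaptor may be rated for more amps than the device needs (the device only draws what it requires), but the current type and voltage must match exactly.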