
How to reduce voltage with resistors

Updated February 21, 2017

Reducing voltage in an electrical circuit with resistors is a simple way of controlling electrical power. The most important aspect of using a resistor is sizing it correctly so that it does not burn out. By following a few basic procedures and applying Ohm's Law, you can reduce the voltage of most electrical circuits.

  1. Understand Ohm's Law, "V = I*R", where "V" is the voltage across the circuit, "I" is the current that flows through the circuit and "R" is the resistance of the electrical circuit. In this use of Ohm's Law, "V" is the voltage reduction across a resistor of a given size carrying a fixed current.

  2. Calculate the amount of voltage drop, or reduction, in an electrical circuit that has a power source of "100" volts at "0.1" amperes and a resistor of "100" ohms. Using the formula "V = I*R", the equation becomes "V = 0.1*100", which gives "10" volts. This means a voltage reduction of "10" volts occurs across the "100"-ohm resistor, and the circuit has only "90" volts past the resistor.
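The arithmetic in step 2 can be checked with a few lines of Python; this is a minimal sketch, and the function name `voltage_drop` is my own:

```python
def voltage_drop(current_amps, resistance_ohms):
    """Ohm's Law: V = I * R, the voltage lost across a resistor."""
    return current_amps * resistance_ohms

drop = voltage_drop(0.1, 100)   # 10.0 volts across the 100-ohm resistor
remaining = 100 - drop          # 90.0 volts left past the resistor
```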

  3. Find the power rating of the resistor needed for the circuit. The correct wattage rating is important; otherwise the resistor may burn up while the circuit is operating. The Ohm's Law formula for power dissipated in a resistor is "P = I^2*R", where "P" is the power in watts, "I^2" is the current squared ("I*I") and "R" is the resistance in the circuit. The wattage requirement for the resistor in step 2 is then "0.1^2 * 100", which equals "1" watt. A safe resistor for this circuit would be rated at "2" watts.
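The power calculation in step 3 can be sketched the same way; `resistor_power` and the doubled safety margin are illustrative, not a formal sizing rule:

```python
def resistor_power(current_amps, resistance_ohms):
    """Power dissipated in a resistor: P = I**2 * R."""
    return current_amps ** 2 * resistance_ohms

dissipated = resistor_power(0.1, 100)  # about 1 watt dissipated
safe_rating = 2 * dissipated           # a 2-watt part gives a safety margin
```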

  4. Find the resistance of a "60"-watt light bulb in a common "120" VAC circuit. First find the current drawn by the light bulb, using the power formula "P = I*V", where "P" is the power in watts, "I" is the current and "V" is the voltage. Rearranging the formula to solve for current gives "I = P/V", so "I = 60/120", which equals "0.5" amperes.

  5. Modify the power formula from step 3 to find the resistance of the "60"-watt light bulb that draws "0.5" amperes of current. Rearranging "P = I^2*R" gives "R = P/I^2". Plugging in the numbers for the light bulb, the equation reads "R = 60/0.5^2", or "60/(0.5*0.5)". "R" is then equal to "240" ohms.
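Steps 4 and 5 together can be sketched as two small functions; the names `bulb_current` and `bulb_resistance` are my own:

```python
def bulb_current(power_watts, voltage_volts):
    """Rearranged power formula: I = P / V."""
    return power_watts / voltage_volts

def bulb_resistance(power_watts, current_amps):
    """Rearranged dissipation formula: R = P / I**2."""
    return power_watts / current_amps ** 2

current = bulb_current(60, 120)            # 0.5 amperes
resistance = bulb_resistance(60, current)  # 240.0 ohms
```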

  Tip

    Use a resistor colour-code chart to identify correctly sized resistors for your circuits. The more power a resistor must consume, or reduce from a circuit, the larger it will be physically and the more heat it will emit. An incandescent light bulb is a very large and common resistor that emits light and heat while reducing voltage in a circuit.


Things You'll Need

  • Ohm's Law: "V = I*R", "P = I^2*R", "P = I*V"

About the Author

G.K. Bayne