Calculating the value of the series resistor

To calculate the resistance of the series resistor required, we need to look at the current and voltage values recommended by the LED manufacturer.

Suppose a manufacturer recommends a maximum voltage of \(2\,V\) across their LEDs and a maximum current of \(10\,mA\).

To use one of these LEDs in a circuit powered by a \(6\,V\) battery, we would calculate the required series resistance as follows.

A circuit diagram shows a \(6\,V\) battery, a resistor \(R\) and an LED connected in series.

Using the properties of a series circuit, the sum of the voltages across the LED and the resistor \(R\) must equal the supply voltage.

\({V_s} = {V_{LED}} + {V_R}\)

Now substitute the supply voltage (\(6\,V\)) and the maximum LED voltage (\(2\,V\)).

\(6 = 2 + {V_R}\)

\({V_R} = 4\,V\)

So the voltage across the series resistor will be \(4\,V\).

In a series circuit, the current is the same at all points.

So if the current in the LED is \(10\,mA\), the current in resistor \(R\) is also \(10\,mA\).

Using Ohm's Law, calculate the value of the resistor required.

\(V = IR\)

Remember: \(I\) must be changed from \(mA\) to \(A\) (\(10\,mA = 0.01\,A\)).

\(4 = 0.01 \times R\)

\(R = \frac{4}{{0.01}}\)

\(R = 400\,\Omega\)

A \(400\,\Omega\) resistor is required.
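The same two steps can also be scripted. Here is a minimal Python sketch; the function name `series_resistor` and its parameter names are illustrative, not part of the lesson:

```python
def series_resistor(supply_v, led_v, led_current_a):
    """Return the series resistance in ohms for an LED circuit.

    supply_v:      supply (battery) voltage in volts
    led_v:         recommended LED voltage in volts
    led_current_a: recommended LED current in amperes
    """
    v_r = supply_v - led_v         # series rule: V_s = V_LED + V_R
    return v_r / led_current_a     # Ohm's Law: R = V / I


# Worked example from above: 6 V supply, 2 V LED, 10 mA = 0.01 A
print(series_resistor(6, 2, 0.01))  # 400.0
```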

Question

An LED takes a current of \(12.5\,mA\) when the voltage across it is \(2\,V\).

Draw a suitable circuit and calculate the value of the series resistor needed to operate the LED from a \(9\,V\) battery.
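To check your answer, the same method applies. Remember to convert \(12.5\,mA\) to \(0.0125\,A\):

\({V_R} = {V_s} - {V_{LED}} = 9 - 2 = 7\,V\)

\(R = \frac{{V_R}}{I} = \frac{7}{{0.0125}} = 560\,\Omega\)

A \(560\,\Omega\) resistor is required.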
