tRasH cAn maN wrote: Of course the current matters. It's all Ohms law. But to say that the resistor is there to limit current is simply not right. You alter the brightness of an led by changing the voltage over the resistor not the current through it.
Well, yes and no. An LED (and indeed, any diode) is fundamentally different from a resistor with regard to its voltage/current response.
A resistor has a linear response. Double the voltage over a resistor and you also double the current. Easy as pie.
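For example, 5 V across a 1 kΩ resistor gives 5 mA, and 10 V gives 10 mA.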
For a diode, on the other hand, the current increases exponentially with increasing voltage. Double the voltage and the current can jump by several orders of magnitude. See this graph. The term "forward voltage drop" is actually a somewhat arbitrarily chosen point on this graph denoting a voltage that is "ideal", i.e. high enough for the diode to conduct, but low enough that the current doesn't skyrocket.
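To make the contrast concrete, here's a little Python sketch using the ideal (Shockley) diode equation, I = Is * (exp(V / (n * Vt)) - 1). The saturation current Is and ideality factor n below are made-up ballpark values for a red LED, not numbers from any datasheet:

    import math

    # Ideal (Shockley) diode equation: I = Is * (exp(V / (n * Vt)) - 1).
    # Is and n are illustrative ballpark values, not from a datasheet.
    Is = 1e-18    # saturation current, amps (assumed)
    n  = 2.0      # ideality factor (assumed)
    Vt = 0.026    # thermal voltage at room temperature, about 26 mV

    def diode_current(v):
        """Diode current (A) at forward voltage v (V)."""
        return Is * (math.exp(v / (n * Vt)) - 1)

    def resistor_current(v, r=100.0):
        """Resistor current (A): plain Ohm's law, linear in v."""
        return v / r

    for v in (1.0, 1.5, 2.0):
        print(f"{v:.1f} V   resistor: {1000 * resistor_current(v):5.1f} mA"
              f"   diode: {1000 * diode_current(v):.3g} mA")

Running it, the resistor current simply doubles between 1 V and 2 V, while the diode current goes from essentially nothing to tens of milliamps.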
So what does this mean? Well, keep in mind that current and voltage are not separate entities but completely interrelated. Any change in current comes with a change in voltage and vice versa; in a given circuit, the component's I/V curve ties the two together, so fixing one fixes the other. In reality, the "forward voltage drop" defines not a point but a narrow band, maybe a hundred millivolts or so, where the voltage stays roughly constant while the current varies over quite a large range. (Maybe 5-40 mA for a regular 5 mm LED.)
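Using the same made-up Is and n as above, you can invert the diode equation to see how narrow that band is:

    import math

    Is, n, Vt = 1e-18, 2.0, 0.026   # same illustrative values as before

    def forward_voltage(i):
        """Forward voltage (V) needed for current i (A), from the Shockley equation."""
        return n * Vt * math.log(i / Is + 1)

    v_low  = forward_voltage(0.005)   # 5 mA
    v_high = forward_voltage(0.040)   # 40 mA
    print(f"5 mA  -> {v_low:.2f} V")
    print(f"40 mA -> {v_high:.2f} V")
    print(f"only {1000 * (v_high - v_low):.0f} mV more for 8x the current")

With these numbers the whole 5-40 mA range fits inside roughly a tenth of a volt; a real LED will differ a bit, but the shape of the curve is the same.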
So, does the resistor control current or voltage? It controls both, as voltage and current are closely related. However, it's much more meaningful to say that it controls current, since current is the quantity that changes the most when you vary the resistance. (At least as long as you stay near the "forward voltage drop".)
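That's also why the usual series-resistor recipe works: treat the LED's forward voltage as fixed, let the resistor take the rest of the supply voltage, and Ohm's law on the resistor alone sets the current. The supply and forward-voltage numbers here are just typical assumed values:

    supply_v  = 5.0    # assumed supply voltage
    forward_v = 2.0    # assumed LED forward voltage, treated as constant
    target_a  = 0.020  # desired LED current, 20 mA

    # The resistor sees whatever voltage the LED doesn't drop.
    r = (supply_v - forward_v) / target_a
    print(f"series resistor: {r:.0f} ohms")          # 150 ohms

    # Halving the resistance roughly doubles the current, while the LED's
    # forward voltage barely moves.
    for r_try in (150, 75):
        i_ma = 1000 * (supply_v - forward_v) / r_try
        print(f"{r_try} ohms -> about {i_ma:.0f} mA")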
I hope that clarified some of the theory behind LEDs (or if not, piqued someone's interest.)