Subject: Re: What is maximum current for LED?
Date: Thu, 3 Oct 2002 17:06:06 +0100
References: <firstname.lastname@example.org> <7cZm9.email@example.com>
NNTP-Posting-Host: modem-193-19-60-62.vip.uk.com (188.8.131.52)
X-Newsreader: Microsoft Outlook Express 5.50.4522.1200
"Spehro Pefhany" wrote in message
> The renowned markp wrote:
> > This is true, the eye remembers peak brightness too. By the way, I don't
> > agree with Spehro here. Let's say you had a 5V source, a 330R resistor and
> > an LED at 1.8V forward voltage. So 3.2V across the resistor gives 9.6mA:
> > power 9.6e-3 * 3.2 = 31mW in the resistor and 9.6e-3 * 1.8 = 17mW in the
> > LED. Now double the current for 50% of the time. Power is now (19.2e-3 *
> > 3.2)/2 = 31mW in the resistor and (19.2e-3 * 1.8)/2 = 17mW in the LED,
> > i.e. the average power in each is the same.
> The LED dissipates more power, however, since Vf is not fixed- it
> increases with increasing current. If your supply is fixed voltage, there
> is no efficiency consequence to this (you're throwing away the majority of
> the power anyhow) but the LED runs hotter, and hence less reliably.
OK, I didn't take this into account.
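To make the point above concrete, here is a minimal sketch of the two calculations. The numbers (5V supply, 330R, 1.8V nominal Vf, 9.6mA steady vs. 19.2mA at 50% duty) come from the thread; the rising-Vf model (+10mV per mA) is a made-up illustrative slope, not a real device characteristic. With a fixed Vf the average LED power comes out identical for both drive schemes, but once Vf rises with current, the pulsed LED averages more power, which is Spehro's point:

```python
# Compare average LED power: steady drive vs. double current at 50% duty.
# Circuit values are from the thread; the rising-Vf slope is an assumption.

def vf_fixed(i):
    """Idealized LED: forward voltage independent of current (markp's assumption)."""
    return 1.8

def vf_rising(i):
    """Toy model: Vf creeps up ~10 mV per mA of drive (illustrative only)."""
    return 1.8 + 10.0 * i

def led_avg_power(i, duty, vf_model):
    """Average power dissipated in the LED: I * Vf(I) * duty cycle."""
    return i * vf_model(i) * duty

steady = (9.6e-3, 1.0)    # 9.6 mA, continuous
pulsed = (19.2e-3, 0.5)   # 19.2 mA, 50% duty cycle

for model in (vf_fixed, vf_rising):
    p_s = led_avg_power(*steady, model)
    p_p = led_avg_power(*pulsed, model)
    print(f"{model.__name__}: steady {p_s*1e3:.1f}mW, pulsed {p_p*1e3:.1f}mW")
```

With `vf_fixed` both schemes give about 17mW in the LED; with `vf_rising` the pulsed case dissipates more, so the LED runs hotter even though the nominal average current is unchanged.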
> > However, the persistence of peak brightness means you can probably get the
> > same perceived brightness at a lower average current, hence less power.
> Not significantly *if* the frequency is high enough to eliminate visual
> flicker. Try it and see, it's an easy test to do on the bench (use a
> visual matching technique). From my tests, any differences are not
> significant in the context of the overall design. I used super-bright
> (red) and high efficiency (orange-red) ~9 mil dies, IIRC.
Yes, my mistake, please see my other post. If the frequency is slightly
below the fusion frequency the apparent brightness can increase, but then
you get flicker!