Sometimes there will appear a figure that flies in the face of reason, and challenges everything you think you know about a subject. Just such a moment came from [Chris Taylor] at Milton Keynes Makerspace when he characterised a set of LED strips, and the figure in question was that he found an LED strip creates the same amount of heat as its equivalent incandescent bulb.
We can hear your coffee hitting the monitor and your reaching for the keyboard to place a suitably pithy comment, because yes, that’s a pretty unbelievable statement. But it’s true nonetheless, although the key lies in the details. If you have a 100 W incandescent bulb, 88% of the energy is radiated as light and infra-red, leaving 12 W heating the bulb itself. To get the same light output from an LED, meanwhile, we’d need only 17 W, of which 11.9 W would be left to heat the LED. Which means that an LED strip can get as hot as an incandescent bulb with equivalent light output, and he’s run some tests to prove it.
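The claim boils down to a couple of multiplications. A minimal sketch, using the fractions claimed above rather than any measured data:

```python
# Back-of-envelope check of the article's heat figures.
# All fractions here are the article's claims, not measurements.
incandescent_watts = 100.0
incandescent_radiated_fraction = 0.88   # claimed: radiated away as visible light + IR
incandescent_heat = incandescent_watts * (1 - incandescent_radiated_fraction)  # ~12 W

led_watts = 17.0          # claimed draw of an equivalent-output LED
led_light_fraction = 0.30 # claimed: fraction emitted as visible light
led_heat = led_watts * (1 - led_light_fraction)  # ~11.9 W

print(incandescent_heat, led_heat)
```

Under those assumptions the two sources leave almost identical amounts of heat at the lamp itself, which is the whole point of the article.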
If you’ve worked with LEDs, you’ll know that they get hot. But to learn that they have the potential to get as hot as their incandescent equivalents is something of an eye-opener, and should demonstrate the need for adequate thermal mitigation. It’s easy to take them for granted, and we’ve taken a look before at some of their safety pitfalls.
“an LED strip creates the same amount of heat as its equivalent incandescent bulb.” The heat produced by an incandescent bulb is not shown here, and is not shown in the linked article/test data. In any case, I read and dismiss many unbelievable statements like this as bait hooks.
And don’t forget that the wattage being consumed by these low-cost LEDs is being limited by series resistors. So, let’s assume for the sake of this discussion that the LEDs drop 2.1V and you power the strip with 12V; the balance – 9.9V – is dropped entirely across the series resistors! Simple Ohm’s law tells you that 9.9/12ths, or 82.5%, of your power is being converted into heat before you even get to the LED!
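The figure above can be reproduced under the commenter’s stated assumption of a single 2.1 V LED per resistor. A sketch (note that typical 12 V white strips actually stack three LEDs in series per resistor, which shrinks the resistor’s share considerably — both voltages here are illustrative assumptions):

```python
# Commenter's scenario: one 2.1 V LED plus a resistor on a 12 V supply.
supply_v = 12.0
led_drop_v = 2.1
resistor_v = supply_v - led_drop_v           # 9.9 V across the resistor
resistor_fraction = resistor_v / supply_v    # 0.825 -> 82.5% burned in the resistor

# More typical: three ~3.2 V white LEDs in series per resistor,
# which leaves the resistor a much smaller share of the power.
three_led_fraction = (supply_v - 3 * 3.2) / supply_v  # ~0.20 -> ~20%

print(resistor_fraction, three_led_fraction)
```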
The trick with LEDs is that all manufacturers quote the nominal figures at a junction temperature of 25 °C – in other words they blink the lights on for a millisecond, or just long enough to take the measurements for colour temperature, efficiency, luminous output etc., and then turn them off. This is standard practice, and all LEDs need to be de-rated for actual operation. Better manufacturers provide the de-rating curves, while the cheap Chinese ones don’t.
Many manufacturers now rate LEDs at 85 °C. The practice you describe isn’t really happening anymore, although there are still disreputable outfits out there.
And of note, 85 °C is about middle of the road for the surface temperature of a 100W incandescent bulb.
Something fishy here. I’ve run an LED strip in a design for years and it’s only ever stone cold… OK, they’re running different colours and only half of them are on at any one time, but still… there should be scorch marks on the thick card they’re glued to, shouldn’t there?
Depends on how hard you’re driving them, and how much cooling there is. LED efficiency goes down as current goes up. It’s equally possible to have a dead cold LED setup, or one that sets fire to the enclosure.
This agrees with my observations. Temperature at the bulb for incandescent >>> temperature for LED with same light output. If the discrepancy was due to the great amount of infrared in the light, the temperature difference would be seen at the thing being illuminated, where the IR is absorbed, not at the glass. So unless the glass envelope is absorbing most of the IR, what is asserted in the article does not agree with observed facts. And if it IS the envelope absorbing the IR, then the envelope will get a good deal hotter than the IR-poor LEDs, which would invalidate the basic premise.
I’ll give it the benefit of the doubt – the vernacular “hot” could be interpreted as heat and not temperature. But I suspect in this context, people think “temperature” when they hear hot, and that is clearly not true. (I’ll let someone else explain now). It does get to the conventional issue of thermal management for LEDs, of course – you’re /conducting/ away similar amounts of heat as a light bulb, but your materials can withstand far lower maximum temperatures than glass and metal, so you really need very good heatsinks.
Disagree. How “hot” something is is expressed completely by its temperature. By definition. How much HEAT it contains, or dissipates, or whatever else, is a whole other thing.
There is some inverse correlation between surface temperature (for a given ambient temperature) and weight (inversely correlated to “hotness” in a very nonlinear way). Thus there is correlation between surface temperature and “hotness”. For example, Naomi Wu eats a lot yet she is crazy skinny. (I also know two other TV show hosts – Joanne Chiang and Dany Kao – who are like that, as well as my friend Allie Moore.) The laws of thermodynamics state that energy cannot be destroyed, and I have a clue a thermal camera would show where that energy is going.
I agree that it’s a nonlinear function, but perhaps the “hotness” of a woman can be measured by temperature rise (so to speak) induced in others.
That’s not necessarily true though. Since the outer atmosphere of the sun reaches quite a ways out, the gas the earth passes through is quite hot in terms of temperature. Like thousands, if not millions, of degrees. So the mean molecular energy definition of temperature is kind of a lousy one. I doubt the entropic definition of temperature is much use there either (negative entropic temperature, incidentally, describes very *HOT* things). Maybe a definition of temperature that only takes into account vibrational energy is good for that scenario?
As to the original story, it’s still misleading. While I’m prepared to believe that ohmic heating from the circuit elements amounts to 12W on both, that’s not (as BrightBlueJim points out) a good measure of temperature. It’s (if anything) a measure of how much more efficiently an incandescent bulb works as a black body, compared to its efficiency as a conductive heater, and how much more efficiently an LED works as a conductive heater compared to its blackbody efficiency.
Seems to be the same number as in the article. “88% of the energy is radiated as light and infra-red” == “5% visible light, 83% infrared radiation”.
Yeah, but how does that 83% infrared radiation NOT contribute to heating the bulb and fixture? It certainly will if you enclose the bulb – let’s not forget our easy-bake ovens and similar… An unexpected (?) issue with the latest high-efficiency LED is that if you block their emissions (say, by putting your flashlight face down on a table while on) the temperature rises rapidly, since they’re counting on that emitted energy GOING AWAY… Don’t forget that “color temperature” is essentially black body temperature needed for a particular spectrum. Your 2700K incandescent filaments are at about 2400C, which makes the statements like “the LED surface temperature of 57C is relatively high” a bit incongruous…
” An old-fashioned 100W incandescent light bulb produces around 5% visible light, 83% infrared radiation and 12% heat. An equivalently bright LED bulb might produce 30% visible light and 70% heat. Although the total power consumption is much less (around 17W for the LED), the amount of heat which must be dissipated at the source is almost the same (12% of 100W is 12W, 70% of 17W is 11.9W), and LEDs are easily damaged by excessive heat.”
The point is that there’s equal amounts of heat left in the lamp itself, which gets just as hot, and this becomes a problem because LEDs are much more sensitive to heat.
Can you explain why the 83% infrared output isn’t factored as heat? Does it not do the large majority of the heating of the bulb or fixture?
She said “light and infrared”. But the 5% figure is also wrong: visible light only accounts for about TWO percent of the total power. But I still don’t buy the whole premise of the article. 12 Watts doesn’t account for how hot a 100 W incandescent bulb gets. Where does the 88% figure come from, anyway?
It probably comes from the luminous efficacy of incandescent bulbs, which ranges from 6–24 lm/W with a typical figure around 12 lm/W, and the author understood it to mean direct efficiency.
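Converting efficacy to a raw efficiency fraction is a one-liner if you use the 683 lm/W ceiling for ideal monochromatic 555 nm light (a standard photometric constant). It also lands close to the “about TWO percent” figure mentioned above:

```python
# Rough luminous-efficacy -> luminous-efficiency conversion.
MAX_EFFICACY = 683.0  # lm/W for an ideal monochromatic 555 nm source

def luminous_efficiency(efficacy_lm_per_w):
    """Fraction of input power emitted as photopically weighted visible light."""
    return efficacy_lm_per_w / MAX_EFFICACY

# A typical incandescent at ~12 lm/W:
print(round(luminous_efficiency(12.0) * 100, 1))  # ~1.8 percent visible light
```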
(Sigh), this is a real forehead-smacker of an article. It could be restated in a much less click-baity way as “things that generate heat but don’t get rid of it somehow will get hot. Things that generate lots of heat but get rid of lots of it can reach the same temperature as things that don’t generate as much but also don’t get rid of it very easily.” Mind still blown now?
Hey, I’m not the one who started an article with “Sometimes there will appear a figure that flies in the face of reason, and challenges everything you think you know about a subject.”
Unless you somehow thought LEDs were magical heat-free electricity-to-happy converters, it’s a little hard to understand how the information in the article would be surprising.
Ok, I concede: if your definition of click-bait requires a misleading or provocative title specifically, then this article doesn’t qualify. My apologies for the misuse of the word. Perhaps “hyperbolic and odd” fits better?
LED Strips Are So Hot Right Now is also a clickbait title, because it’s playing on the ambiguity of taking “hot” literally, and if you do then why “right now”? It’s intentionally confusing.
In other words, the title avoids giving away what the article is really about and uses ambiguity to generate curiosity, which leads to clicks, which makes it… guess what?
A non-clickbaity title would have been: “Waste heat in LED strips turns out to be unintuitively high.”
>A non-clickbaity title would have been: “Waste heat in LED strips turns out to be unintuitively high.”
And none of us would be here reading it. We’d still be watching archival footage of juvenile felines manipulating small objects in a manner seemingly designed to elicit from the typical human observer a high-pitched vocalization indicative of delight, to trigger in said observer internal sensations akin to liquification of core cardiac components, and to inspire that observer to distribute references to the footage to any and all persons with whom the observer has ever had even a passing acquaintance.
Maybe [Xian] writes descriptions for patents. And now I’ll go back to watching time varying images of feline activities.
Unrelated question but how do I embed pics, just HTML? There seems to be some HaD owner’s manual I’m missing because I see others doing cool stuff where I’m stuck with ASCII art.
I don’t know if you can “embed” pictures, but you can link to an existing picture on a web service, such as pinterest, youtu.be, or ggggle through copy link and paste link. That was in Hackaday User Manual 2.3.1.a
A manual for hackaday? If you’re lucky, it’s available in Chinese or, if you’re very lucky, Engrish. If you can’t find it, just look at the comments in the source. If you can’t find the source, decompile it from the binary. If you’ve not got the binary, poke around at the JTAG headers. If you’ve not got JTAG, de-cap the chip…
There is clearly a linguistic issue here because the assertion made is in conflict with the first law of thermodynamics.
Yeah, no. If your LEDs ever get as hot as an incandescent lightbulb, they’ll only do it for a small fraction of a second and then stay cool forever… The comparison is also bollocks: the mentioned 83% infrared radiation is also heat, so in total an old-fashioned incandescent bulb produces 95% heat, potentially more.
Really the only difference is: incandescent light bulbs don’t care about the heat while LEDs do, and that creates some interesting challenges, e.g. in car headlights, because in damp situations LEDs don’t produce enough heat to dry the insides of the headlamp, while in hot weather getting rid of the heat can be rather tricky…
It can generate the same heat figure without any one part getting as hot as an incandescent bulb because it’s spread out over the entire length of the strip. Also an LED strip is not typically the most optimized light source in the LED world, those actually produced for illumination are more efficient typically than those intended for decoration or indication…
“it’s spread out over the entire length of the strip.” Thank you, that is what I was going to say/post/type/write.
Leave a 15ft strip on its plastic reel and power it up. Within about a minute the reel will be melting and the whole mess will be too hot to touch. This is what happens when the entire length of the strip is gathered in one spot.
A good LED strip features aluminum or copper throughout its length to ensure all that heat is spread out as much as possible. Users often disregard thermal management and current/voltage tolerances. They also tend to assume all strips are made the same, so they focus entirely on price and lumens, and sometimes CRI. I’m sure anyone reading the comments on this article has better awareness than the typical user.
Oddly the “copper” in my LED light fixture is so minimal it’s practically impossible to solder.
Well, depends on the strips. I have plenty of strips for household use that don’t get nearly as hot as you say, even when coiled up and turned on.
Many LED strips are configured to run off of 12 volts. Rather than use a dedicated constant current regulator, they just have three white LEDs in series plus a resistor.
Typical LED incan-replacement bulbs have more LEDs in series driven by a constant-current switching regulator.
Last, as others have mentioned, I strongly question the base premise of 88% efficiency from an incan.
BTW, back in the early-mid 2000s, white LEDs were indeed less efficient than large halogens in lumens/watt. They were popular for flashlights though because small halogens had far lower efficiency and poor reliability/lifetime, plus LEDs dim far more gracefully than incans. Over the past 10-15 years, LEDs have plummeted in price and increased significantly in efficiency.
True, but the LED strips I have use a 39 ohm resistor. The ones above have a 27 ohm resistor. Mine dissipate about 25% of consumed power directly as heat from the resistor.
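That 25% figure checks out, and it reveals something slightly counter-intuitive: the resistor value sets the current, but the *fraction* of power it burns is fixed by the voltage split alone. A quick sanity check, assuming three ~3.0 V white LEDs per segment on 12 V (an illustrative guess, not from any datasheet):

```python
# Share of total power burned in the series resistor of one strip segment.
supply_v = 12.0
vf = 3.0                          # assumed forward voltage per LED
resistor_v = supply_v - 3 * vf    # 3 V left for the resistor

def resistor_share(r_ohms):
    current = resistor_v / r_ohms  # same current flows through LEDs and resistor
    return (resistor_v * current) / (supply_v * current)  # reduces to resistor_v / supply_v

# 39R and 27R strips both burn 25% in the resistor -- R only changes the current.
print(resistor_share(39), resistor_share(27))
```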
An incandescent puts out about a tenth of the lumens per watt of an LED. So a 12W LED puts out much more light than a 12 W incandescent, and about a tenth of the heat. Author is confused.
1:5 would be closer to truth if you care about light quality at all. Don’t forget the power converter is not 100% efficient either.
I have been into LEDs for years and years. I’ve got a 12v lighting system throughout my house. Ever since the beginning, when they made white LEDs affordable, it took me forever to figure out why they kept burning out on me. Then it hit me: drop the voltage a bit. Now, after so many years, I calculate the voltage drop over my wires so that I get no more than 11 volts to run my 12v LEDs, but I like 10.75. I have not had to replace as many LEDs now in my lights. And yes, I am running a lot of LEDs, and it has taken me years to train the family to use them properly. And in some cases I had to automate things because they just could not understand what to do. (Like turning things on in my workshop.)
All the powerful LEDs are usually driven with constant current regulators (350mA and 700mA are probably the most common). This is the safe way to drive any power LEDs, because the forward voltage is very temperature dependent, so you could overpower the LED with a constant voltage supply. And it adds the benefit that LEDs powered in series all get the same power and output if you regulate the current. If you regulate the voltage for a series string, there could be one LED that pulls a lot more power because of little differences in the voltage/current curves of the LEDs. So, I’m not sure why lowering the voltage of your system from 12V to 11V would do you any good. You seem to drive the LEDs wrong to begin with…
Lowering the voltage shifts the operating point of the system further away from “catastrophic failure”.
LEDs are self-regulating up to a point because their forward voltage increases with current. This is why they light up below their nominal threshold voltage, and the current increases more or less linearly with voltage until the threshold is reached.
The voltage-current relationship beyond the threshold is exponential, and a small increase in voltage, e.g. from turning off a parallel lightbulb, can result in huge increases in current and that’s what keeps burning the lights out.
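The exponential sensitivity described above can be illustrated with the Shockley diode equation. This is only a sketch: the saturation current and ideality factor below are made up for illustration, and real LEDs also have series resistance that softens the curve somewhat.

```python
# Illustrative Shockley-diode model showing why a small voltage bump
# past the knee causes a large current jump. Parameters are invented,
# not from any datasheet; series resistance is ignored.
import math

I_S = 1e-20    # assumed saturation current, A
N = 2.0        # assumed ideality factor
V_T = 0.02585  # thermal voltage at ~25 C, V

def diode_current(v):
    return I_S * (math.exp(v / (N * V_T)) - 1)

# With these parameters, ~0.6 mA at 2.0 V rises to ~4 mA at 2.1 V:
ratio = diode_current(2.1) / diode_current(2.0)
print(ratio)  # roughly 7x the current for a 0.1 V increase
```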
Had some LEDs crossing the river Styx in the last years. (all from China) #1 – a 24V white strip with rubber dome (eg. “waterproof”) started fading after only a week and died in week three. #2 – 4 sets of 6x10W LEDs in a row – with 2 sets each mounted on 5mm thick, 4m long aluminum profiles. After 2.5 years the first LEDs started to cross over to the netherworld – first flickering, then breaking the whole string. (ran them at their 1000mA current, now switched the broken LEDs out and run them @ 750mA) #3 – A 50W 230V COB LED module – mounted on a 20x20x3 finned cooler – died after only 3 days – but it ran so hot it was no wonder (about 75°C on the cooler!) – but I needed the light at that moment, so I did not care. But I imagine it would either need a heatsink the size of a 90s laptop or active cooling.
Oh and a set of 3 CREE 3W LEDs in a 20x20x160mm 2mm alu profile is shining as my bedlight in its 10th year now.
The cheapo $2 ebay 12V led strips I have under the kitchen cabinets seem to only have lost a little brightness over the years compared to new ones. (Or the new ones are a tiny bit brighter or the old ones are filthy) These are 2835 and run barely warm at 20mA so it’s no surprise at all.
Anyway, the total efficiency of the strips is quite poor: measured at the power supply they use about 40W for an illumination level similar to 20W of compact LED lamps.
But the compact LED lamps keep burning out, and none lasts more than a year. These are the $2 6-9W LED lamps with multiple “1W” 9V LED modules of 3 LED chips in a 2835 package, all in series. Even with the diffuser lid removed, these run quite hot at 77mA. Fortunately they are easy to repair, since the driver is a current regulator and you only need to short the dead LED modules and replace the current-setting resistor to drive them at a more reasonable current.
So at a 20W difference it is still way cheaper to replace the burnt lamps yearly than to pay the extra electricity bill from the efficiency loss, but the strips look prettier.
This experiment doesn’t account for heat produced by the strip conductors that link the LEDs to each other. The longer the strip, the more heat is produced by the conductors as resistance increases. If the strip is kept on the roll for testing, the heat is restricted from dissipating, which accounts for the rise in temperature.
Those series current limiting resistors can generate a lot of the heating as well, so it’s not really an “apples to apples” comparison.
These are usually 150 ohm resistors, with one for every 3 LEDs in series. So for a 5m strip of 300 LEDs we have 100 resistors.
Similar for the efficiency of common LED power supplies. This is where the system efficiency gets hairy.
bull. if the LED got as hot as the incandescent does, it would radiate as much IR as the incandescent does.
i really expected this to be a true but sad commentary on power supply efficiency but instead it’s just junk science.
While you guys argue over the subtle differences in how temperature is measured against wattage, I’m going to say “Thank You” for reminding us that LEDs get hot also and are not necessarily the solution to a thermal problem.
We use a lot of different types of LEDs for film lighting, and every person who is not familiar with them is always surprised by how hot the fixtures get, even with passive or active cooling. We use a particular fluorescent-tube look-a-like bi-color LED fixture that is constant current that can get almost as hot as a 750w halogen lamp fixture after burning for a few hours.
It’s simply not possible. A Quasar 4′ tube is 50W, so at absolute most you get 50W of heat, and in reality you’re probably getting less than 25W of heat and a bunch of light. If it’s a 4 tube unit, then 100W of heat or less. Halogen is roughly 10% visible light and 90% heat and IR light, which get converted to heat when they hit the glass of the bulb, the fixture, the air, etc., so well over 600W of heat. Enough to burn scars into your arm, as you know. Now right at the LED chip the temperature may be very high, and whatever the heatsink is may get hot, but nowhere near a 750W halogen as far as total heat or even temperature of the external parts of the fixture. Yes, they do get quite warm, but think in terms of wasted wattage and you can see the difference.
The little LED chips get really hot, but in a small area, and use heatsinks to spread out and dissipate that heat. So the temperature may be high in a small area, but the total heat is much, much less than an incandescent equivalent. The article is confusing and somewhat misleading.
The premise is silly. Saying that an incandescent bulb “created the same amount of heat” because most of the energy is radiated as ir would precisely imply that putting a heatsink on the led would similarly reduce the heat it “creates” by radiating it away through the heatsink.
The object itself doesn’t get hot, but only because it dumps most of its heat into the objects around it.
That’s not what is said. What is said is that old light bulbs are 10 times less efficient at producing visible light than LEDs, because they put out a lot of IR, but each light source produces the same waste heat separate from that IR. It’s just that LEDs don’t need to expend energy creating IR, so they draw less power.
Whether that’s true, however, depends on the make of the LED. Each year they’re made more efficient, with less waste heat, but the stuff in the commercial pipeline doesn’t use the latest and greatest tech of course, and that tech also costs more initially and people prefer cheap.
So the luminous efficiency of the strips are probably at least triple that of an incandescent bulb, correct me if I’m wrong. So the incandescent is pumping out a higher percentage of heat at a higher wattage for the same lumens. Equals a lot more heat, not a similar amount. The article seems to ignore this simple fact while not very clearly discussing something different. It confused me more than anything.
Of course the efficacy is much higher, but I was referring to the actual efficiency, since we’re talking about a fixture in the real world. My guess is the efficiency is still triple or more.
To clarify: All (reputable) physics publications ascribe the mediation (read “carrying”) of ‘heat’ to photons whose wavelength lies in the infrared region of the electromagnetic spectrum. Not to put too fine a point on this: ‘heat’ equals infrared radiation (one can purchase therapeutic heating pads containing I-R LEDs). To obfuscate: One of my (very knowledgeable) physics professors puts the light vs. heat output of a 60W incandescent light bulb at 99% infrared output. Hey, he only researches this stuff, and he readily admits that this is just a guess. I’ll go with his guess.
You are correct: I stated, which you quoted accurately, that, “…’heat’ equals infrared radiation…”.
You seem to be arguing–academically–that the statement was made that infrared radiation equals heat; a statement that is diametrically opposed to my comment.
This method of argument – reversing what was stated, then showing the fallacy of the reversed statement – is a prevalent problem with most people, me included (but of epic proportions among lawyers, the news media, and politicians), and the only cure is to be on the defensive against this problem when observing, rebutting, and offering a contrary position.
By the way, Planck’s original work will answer your question, “…How hot does a blackbody radiator have to be for peak emission at 800 nm?”
Reminds me of a calorimetric test I’ve never seen done: like with food, if you place the bulb in a known amount of water and compare the power consumed by the bulb with the rise in water temperature, you can determine the conversion efficiency.
So was the point of the article to warn us about starting fires with LEDs, or is it about the misinformation on the packaging? I’d like to chime in, however extraneously, by stating that I have converted almost every light in my home to an LED and I have not noticed a significant reduction in my monthly bill.
Some of those LEDs that I have purchased actually say on the package, “Costs an average of X cents per year.” where X is often either 3, 13, or 33. Go figure that one. Also they say they’ll last 13 (or more) years. I have already replaced several but since they were supposed to last 13 years I, of course, had no need for the receipt. ahem.
I’ve been duped like this before. It was called the fluorescent light. And then it was the electroluminescent night lights.
I have LED bulbs going on five years, and my electroluminescents going ten+ years. In other words buy quality and you’ll get life.
Well, doing a bit of math beforehand can help you understand what the results are going to be. Or use a cheapo Kill A Watt clone.
AC, dryer, fridge, electric stove, dishwasher, washing machine, vacuum, maybe lots of computers… all use well over several kWh/day, and you expect to see a significant reduction in your bill by just swapping to LED, when even all-incandescent lamps in the house would barely sum to a mere kWh/day?
If your electricity saving using LEDs is 75% of a generous 3% of total house consumption, the actual saving is 2.25%.
Unless you keep all the lights in your house on all day and night, you won’t see any significant reduction in the monthly bill, since the other appliances are the bulk of the consumption.
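The 2.25% arithmetic above in two lines (both fractions are the commenter’s assumptions, not measured household data):

```python
# Bill impact of swapping lighting to LED, per the assumptions above.
lighting_share = 0.03  # assumed: lighting as a fraction of total household use
led_saving = 0.75      # assumed: fraction of lighting energy saved by LEDs

bill_reduction = lighting_share * led_saving  # ~0.0225 -> about 2.25% off the bill
print(bill_reduction)
```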
In this case then, since we are doing math (and I try not to do that more than once a year), I would save money by using incandescent bulbs. Someone said that it’s the quality of the product that makes the difference; something that often can only be determined in hindsight but usually translates to a more expensive product. So yeah, I could buy the sixty five dollar fixtures instead of the twenty dollar fixtures but I wouldn’t start seeing my five dollar a year savings until 2035, rather than 2028 with the cheaper option- maybe. I think we’re due for an asteroid strike before that so…meh.
And…in the list of big guzzlers, you forgot the wife’s curling iron, sometimes left ON ALL DAY. Heating the house is another one unless you use a wood-burning stove. Sometimes I think it would be cheaper to just burn the money directly.
Well, prices for the cheapest LED and CFL bulbs are about the same, 2€. Incandescents were 0.60€ but are now made from unobtanium. The branded ones are 5x that.
When I write lamp I mean the light bulb thingie, not the apparatus where they are installed. You buy new fixtures when the bulbs fail?
Lights are the worst target to look at to save electricity, because just one day of (not) using the AC offsets a full month of normal lighting use.
You are right that if LED+power is more expensive, then it makes sense to use incandescent, even environmentally if you use hydro power.
I’ve had as many failed “quality” Osram and Philips LED and CFL lamps as the cheapo Chinese ones. So price may be proportional to the probability of being lucky, but not to how long they will actually last. At 1/5x the price you can try your luck many times.
In my experience just a few each of the LED, CFL or incandescent lamps lasted more than a year. And I always write the installation date on the lamp body. But you can’t fix incandescent lamps, and most CFLs also fail with open heater filaments. The LED ones, on the other hand, are a 2-minute fix.
The branded lamps definitely use better quality parts and have RFI filtering and proper fuses, which almost none of the Chinese ones have; some are even a fire or electrocution risk. The Chinese lamps are usually easier to fix though, since at such a price you probably have more to cannibalize for parts if needed, but most times you just have to short the dead chip and replace the current-sense resistor. You could replace the LED chip, but the 3-LED type is hard to find and difficult to replace without causing more damage than it’s worth for a 2€ lamp.
As stated in the article, if you have 100W, 88% is radiated as light and infrared. So what happens to the other 12% of the power?
Saying the 12W goes to heating the bulb itself is wrong. (Although, initially, some energy goes to raising the temperature of the bulb.) But the bulb temperature doesn’t rise indefinitely.
At some point you must account for all the energy going in. If it’s not doing some sort of work or being stored, then 100W going into the bulb must equal 100W coming out.
The heat leaves by three principal routes: 1) By conduction, through the filament supports to the lamp socket, 2) By radiation, mostly infrared, but also a little visible light, 3) By convection currents if the fixture allows air flow.
There are also routes that combine these, such as radiation absorbed by the anti-glare coating on the inside of the envelope, conducted through the glass to the socket and also convected to surrounding air.
The 500mm LED strip used in the OP example is made up of 10 segments, each segment 50mm long, containing three 5050 LED modules in series and a resistor. The resistor, as shown in the photo, is 22 ohms.
Each of these 10 modules is connected in parallel to the 12V supply, and each module of three 5050 LEDs has those three connected in series with the resistor.
The 5050 LED itself has three dies in it, and these three actual LEDs are connected in parallel. (This seems a bit dodgy and probably won’t work reliably due to current imbalancing, with no balancing resistors on each separate LED.)
Let’s assume the supply voltage is 12V and the diode forward voltage is 3.33V. So with ~2V across the resistor we get about 90 mA.
900mA total supply current for the 50cm strip, and 30mA into each of the three dies assuming they’re matched. Power consumption is 10.8W.
Resistor power dissipation is 182 mW, or 1.82W across the whole strip. Power delivered to the LEDs is 83% of the supply power, which is not bad.
The resistor approach remains fairly efficient, if the diode forward voltage is stacked up in series pretty close to the supply voltage.
The resistor approach isn’t really practical for single high-power LEDs such as 5W Luxeon Stars or similar… but LED tape strips with many small LEDs make it practical to get away with it.
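The numbers worked through above can be reproduced in a few lines. This uses the component values assumed in that calculation (12 V supply, 3.33 V forward drop, 22 R resistor, 10 segments), not measurements:

```python
# Recomputes the 50 cm strip figures from the calculation above.
supply_v = 12.0
vf = 3.33      # assumed forward voltage per 5050 module
r = 22.0       # series resistor per segment, ohms
segments = 10  # 10 x 50 mm segments in the 500 mm strip

resistor_v = supply_v - 3 * vf                   # ~2.01 V across each resistor
i_segment = resistor_v / r                       # ~91 mA per segment
i_total = i_segment * segments                   # ~0.91 A (rounded to 900 mA above)
p_total = supply_v * i_total                     # ~11 W (quoted as 10.8 W after rounding)
p_resistors = resistor_v * i_segment * segments  # ~1.8 W lost in the resistors
led_fraction = 1 - p_resistors / p_total         # ~0.83 of supply power reaches the LEDs

print(i_total, p_total, led_fraction)
```

Note that `led_fraction` reduces to the stacked forward voltage divided by the supply voltage, which is why stacking LEDs close to the supply voltage keeps the resistor approach efficient.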
Somewhat perplexed, I looked up the specs of a random halogen light and found this: https://www.ushio.com/files/manual/halogentechnicalspecs.pdf
100% energy in: 93.5% all radiation (broken down into 7.1% visible light and 86.4% IR), plus 6.5% loss (made up of 1.5% loss at the terminals and 5% absorption by the glass/base).
These figures are for a halogen globe, not a conventional incandescent as indicated in the article, but they are relatively close. There is no mention of “heat” as an independent loss apart from IR radiation, which is where the heat comes from.
Compare this to the specs of a 5050 LED: https://d114hh0cykhyb0.cloudfront.net/pdfs/5050-WW6000.pdf which has the bulk of its output between 550 and 600nm, well below the IR spectrum. It would be nice to have the same specs published in data sheets for all similar products.
Interestingly, while the blogger included temperature measurements of a strip of LEDs, he rather critically omitted temperature measurements of an incandescent globe of similar luminous output, which would have shown his conclusion to be absolute garbage.
I have a length of LED strip of unknown eBay-special LEDs under a cupboard, so I thought I’d quickly make some measurements. Current draw 1.03A, ambient temperature 23degC, LED temperature after 20 mins 30degC, which is WAY below the values measured by the blogger. I acknowledge I don’t have the same LED strip and I don’t know what the light output is, but it’s sufficient to prepare a meal and read under, so say equivalent to at least a 25W light globe, which after 30 seconds exceeded the range of my thermometer (50degC).
I would have to make up a similar test bed to really compare my values, but to suggest that the LED strip produces as much heat as an incandescent globe for a given luminosity is pure bunkum. I noted also that the blog was closed for comments.
Who could have guessed that a 10W LED and a 10W incandescent bulb would need to dissipate similar amounts of heat energy?
Ten watts of power dissipated is ten watts, whether it be dissipated, as V x I by an incandescent bulb, an LED, or in a resistor. What is at issue is the amount (the power) of the visible light (≃400-700 nm) generated by each device.
Actually, due to the higher relative efficiency of LEDs, a lot of the energy is emitted as light, not heat. The problem is that resistive (ie non-radiative recombination) losses go up exponentially the hotter the diode gets. It’s called “Auger droop” and is a right pain in the cloaca, because even simple reflection back into some LED flashlights can burn them out very quickly. A little tip if it helps: if you want to get absurd levels of power out of a phlatlight (ie 50-90W), try cooling it down to liquid nitrogen temperatures; even putting active cooling on the sucker will work. White LEDs have the added problem of phosphor emission *dropping* at low temperatures, so the colour spectrum changes a bit. Scientists claim to have solved Auger droop by redesigning the emitter surface, however.
Re. repairing LED emitters: I have some spares here if anyone wants a few for tinkering. I believe them to be the 3-series variety, from a linear “LED” lamp that lasted less than a month in actual use because the manufacturers put no surge protection in whatsoever. There are several vacant pads for capacitors and series limiting components. It looks like the emitters are in good shape, as I ended up removing them with minimal heat (260°C) and gentle tweezerage.
Useful tip with these, similar units can be snagged from old defunct LED TVs with scrap panels and I made lighting replacement panels with a few. The usual failure mode is one or more go open circuit but replacing the bad ones invariably leads to a failure somewhere else shortly afterwards so the best option is recycling as bargraphs etc where a bad emitter isn’t terminal.
I think this is misleading. I totally agree that LED lights can and often do get as hot as their incandescent equivalent, but to suggest they create the same heat? All that usually unwanted near and mid-far IR from a tungsten bulb is heating your room, just not the bulb. For as long as I can get the supplies (they are banned in Europe), I switch my bulbs out in winter for incandescent ones, because the net efficiency of my house is significantly increased due to IR heating. Much better to lower the thermostat 2C (4F) and top up the difference only in the room you are in, and only while you are there. It’s like zonal heating that uses the light switch to detect your presence.