Lew Poppleton (Lewpopp)
Posted on Monday, January 02, 2006 - 9:59 pm:
I heard from what I figure are some pretty reliable sources at the Arcadia rally that you can put 12V LED lights in place of 24V markers and it won't harm anything. Is that the consensus of the rest of you experts?
Gary Stadler (Boogiethecat)
Posted on Tuesday, January 03, 2006 - 9:16 pm:
The worst that can happen is the LED will burn out, or its resistor will get a bit hot. But in either case it won't hurt your bus... and they're probably right, it will most likely work just fine. LEDs these days are pretty forgiving, and 2x operating current will likely just make them brighter while still working fine. If you're really worried and your LEDs are single LEDs (not multiple groupings), get some 470 ohm 1/4 watt resistors from Radio Shack and stick one in series with each marker LED. That'll make things proper...
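A quick sanity check on Gary's 470 ohm / 1/4 watt figure can be sketched like this. Note the ~20 mA single-LED current is an assumed typical value, not something stated in the post:

```python
# Sanity check of a 470 ohm series resistor for one marker LED.
# ASSUMPTION: a typical single LED runs at about 20 mA; the post
# does not give the actual current of Gary's hypothetical marker.
led_amps = 0.020                  # assumed LED operating current
resistor_ohms = 470.0             # Gary's suggested series resistor

drop_volts = led_amps * resistor_ohms       # V = I * R -> 9.4 V dropped
power_watts = led_amps**2 * resistor_ohms   # P = I^2 * R -> 0.188 W

print(f"Voltage dropped by resistor: {drop_volts:.1f} V")
print(f"Resistor dissipation: {power_watts:.3f} W")
```

At the assumed 20 mA the resistor dissipates about 0.188 W, which is under a 1/4 watt (0.25 W) rating, so Gary's part choice is at least self-consistent for a single low-current LED.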
Tim Strommen (Tim_strommen)
Posted on Wednesday, January 04, 2006 - 6:07 pm:
For the technically/mathematically inclined, a current-limiting resistor (what Gary is describing) will make a 12V fixture work with a 24V system. It's prudent to measure the current the fixture draws from a 12-volt power source (many manufacturers are nice enough to publish the current draw anyway), then calculate the correct resistor to use.

As an example: the Maxxima M11300Y (2 1/2" 13-LED amber fixture - details here) draws 112mA (0.112 A) at 14VDC. A typical 24-volt charging system will run at 28.8 volts. As a safety measure, it's a good idea to add 20% to this number before calculating anything (i.e. worst case - yada, yada, yada...), so: 34.56 volts.

Now you need to subtract the design voltage of the fixture from the (worst-case) system voltage:

34.56 - 14 = 20.56 volts

To pick a resistor value that will "consume" the correct amount of extra voltage, engineers have this cool little equation (Ohm's law):

E = I * R

Since we know what amount of voltage (E, volts) we need to "consume", and we know the rated draw of the fixture (I, amps), we can solve for R (ohms). Thus:

20.56 V / 0.112 A = 183.57 ohms

You would need to pick the next standard resistor value ABOVE this result, or you would over-current the fixture. Additionally, you need to pick a power rating (watts) that is suitable for the load. Power can be figured as Volts * Amps = Watts, thus:

20.56 V * 0.112 A = 2.3 watts

Again, you need to pick a resistor that is rated better than the actual dissipation or it can catch fire (I'm not kidding - this is serious). Here you can go crazy and super-rate the resistor to 2x the actual requirement, to 5 watts (a very common value).

Although many fixtures are rated to support two voltages (24 and 12), the fixtures that are rated for 12 only will overload very quickly with no current limiter (poof! no light). To further demonstrate this effect, let's do the calculation again. (Remember that LEDs are current-driven devices - any voltage beyond what they need is dissipated as heat.) We'll figure the "equivalent circuit resistance" of the Maxxima fixture:

14 V / 0.112 A = 125 ohms

Now if we crank the voltage up to 34.56 volts, here's what happens:

34.56 V / 125 ohms = 0.276 A

Most manufacturers will design their fixtures to take 20% over the rated power as a "just in case", but anything above this will severely degrade the service life of the fixture.

There is a further loss from this incorrect use of the fixture - the power draw goes up as well. Examples:

With limiting resistor: 34.56 V * 0.112 A (limited) = 3.87 watts required
Without limiting resistor: 34.56 V * 0.276 A (unlimited) = 9.54 watts required

At ~10 watts per fixture, you might as well stay with the incandescent bulbs.

Cheers!

-Tim
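Tim's worked calculation can be sketched in a few lines of code. The fixture numbers (14 V design voltage, 0.112 A draw) are his Maxxima M11300Y example, and the 20% margin follows his worst-case assumption:

```python
# Series-resistor calculation for running a 12V LED fixture on a 24V
# bus system, following Tim's Maxxima M11300Y example above.
design_volts = 14.0     # fixture's rated operating voltage
draw_amps = 0.112       # fixture's rated current draw (112 mA)
system_volts = 28.8     # typical 24 V charging-system voltage
worst_case = system_volts * 1.20        # +20% safety margin -> 34.56 V

excess_volts = worst_case - design_volts    # 20.56 V to drop
resistor_ohms = excess_volts / draw_amps    # Ohm's law: R = E / I
resistor_watts = excess_volts * draw_amps   # P = E * I

print(f"Series resistor: {resistor_ohms:.2f} ohms")   # ~183.57, round UP
print(f"Dissipation: {resistor_watts:.2f} watts")     # ~2.30, use a 5 W part

# Without the resistor, the fixture's equivalent resistance sets the draw:
equiv_ohms = design_volts / draw_amps       # 14 / 0.112 = 125 ohms
over_amps = worst_case / equiv_ohms         # ~0.276 A, well over rating
print(f"Unlimited current: {over_amps:.3f} A")
```

Running it reproduces Tim's figures: about 183.57 ohms (so pick the next standard value up), 2.3 watts across the resistor, and roughly 0.276 A through the unprotected fixture.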