Below is a simplified schematic of the described switching supply.
Is a buck resistor a speed control with a coil in line? I thought the speed control/dimmer was the way to go.
That's the basic theory behind it - but it's a little bit different (and please allow me a slight terminology correction - it's a buck regulator). A PWM speed controller is a device that simply cuts off and turns on the power in rapid sequence - based on the user's set-point, not based on any circuit conditions. Since the setting of the dimmer is external to the system, and does not get affected by the conditions of the system, it is defined as an "open loop system".
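To make the "open loop" point concrete, here's a minimal sketch (plain Python, with made-up example numbers) of all a PWM dimmer really computes - the output depends only on the user's knob, never on the load:

```
# Open-loop PWM: the average output depends only on the user's
# set-point (duty cycle), never on feedback from the load.
def pwm_average_voltage(v_in, duty):
    """Average voltage from an ideal PWM chopper.

    v_in : supply voltage, volts
    duty : fraction of each cycle the switch is on (0.0 - 1.0)
    """
    return v_in * duty

# Example: a dimmer set to 60% on a 12 V supply.
print(pwm_average_voltage(12.0, 0.60))  # -> 7.2 (volts, averaged)
```

Notice that nothing in that calculation looks at the load - which is exactly what makes it open loop.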
The "coil" (an inductor), is not a "consumer" of energy but rather a "resistor of energy chage" - much like a capacitor. Where a capacitor acts like a "short" for high freqencies (i.e. it allows changes at a high rate) and blocks lower frequncies or DC (resists change by dumping its stored energy, or storing surplus energy), an inductor acts like a "short" for low frequencies (i.e. it allows changes at a slow rate) and blocks higher frequencies (resists change by dumping its stored energy, or storing surplus energy). With the buck regulator, there is also current-sense resistor in the circuit (this IS
a "consumer"). This resistor works on the same principal as every other resistor - only the circuit is designed to cash in on the fact that for a fixed resistance, if either the voltage or current changes - the other will change in same direction.
E = voltage
I = current (in amps)
R = resistance
Ohm's law ties them together: E = I x R
If you use a fixed resistance - say 5 ohms - multiply it by a current value like 700mA (0.7 amps) to get the voltage across that resistor (5 x 0.7 = 3.5 volts). If the current changes to, say, 900mA (0.9 amps), the voltage will change to 4.5 volts (5 x 0.9 = 4.5).
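The same arithmetic in a couple of lines of Python (using the 5-ohm figure from the example above - real current-sense resistors are usually far smaller, but the principle is identical):

```
def sense_voltage(r_ohms, i_amps):
    """Ohm's law: E = I x R, the voltage across the sense resistor."""
    return r_ohms * i_amps

R_SENSE = 5.0                       # the 5-ohm example value from above
print(sense_voltage(R_SENSE, 0.7))  # 700 mA -> 3.5 (volts)
print(sense_voltage(R_SENSE, 0.9))  # 900 mA -> 4.5 (volts)
```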
In order to measure across a resistor, the driver IC needs two points. Typically with high-side switch drivers, this current-sense resistor sits between the load and ground, so if the driver IC has a ground pin, that's one point. A current sense pin that hooks up to the side of the resistor between the resistor and the load is the other point. In the schematic, that second point is marked as "CS" on the driver part.
This lets the driver IC "watch" the current of the load by watching the voltage across the current sense resistor (if the current goes up, the detected voltage goes up, and if the current goes down, the detected voltage goes down). This is compared to an internal voltage reference (since this is a very low current reference, a very accurate linear regulator can be used without generating a lot of heat or burning a lot of power). When the voltage rises above the reference the switch is turned off, and when the voltage falls below the reference the switch is turned back on. This makes this type of control a "closed loop system", where the system reacts to conditions within the same system.
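In pseudocode form, that comparator logic is nothing more than this (a sketch, not any particular driver IC's internals - the threshold and dead-band values are made up for illustration):

```
def update_switch(v_cs, v_ref, switch_is_on, hysteresis=0.05):
    """One pass of the comparator logic described above.

    v_cs         : voltage seen at the current-sense (CS) pin
    v_ref        : the internal reference voltage
    switch_is_on : present state of the switch
    hysteresis   : small dead-band so the switch doesn't chatter
                   (an arbitrary example value, not from any datasheet)
    """
    if v_cs > v_ref + hysteresis:
        return False        # current too high -> switch off
    if v_cs < v_ref - hysteresis:
        return True         # current too low  -> switch back on
    return switch_is_on     # inside the band  -> leave it alone
```

Because the decision is driven by the CS voltage, and the CS voltage is driven by the load current, the loop closes on itself - the defining feature of a closed loop system.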
Based on the size of the coil (measured in henries, marked "H"), the speed of the switching action can be very fast or very slow (but at faster frequencies, the overall efficiency goes down). A common frequency range is between 30kHz and 100kHz (30 to 100 thousand switch cycles per second). A low-inductance coil causes the switching frequency to go up, while a high-inductance coil causes the switching frequency to go down. Higher-inductance coils for a given current rating require more wire to be wrapped - so as a result they end up being physically larger. There are some small inductor designs built for very compact systems (at the expense of a few percent of efficiency) which have very low inductance - and they can have switching speeds in the megahertz (MHz) range (I've seen a few designs at 3MHz and higher... that's 3 million on-off switch cycles per second).
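You can see that inverse relationship in the standard ripple-current relation for a buck stage (a back-of-the-envelope estimate only - all the example values below are arbitrary, and real parts are not ideal):

```
def buck_switching_frequency(v_in, v_out, l_henries, ripple_amps):
    """Estimate f from the standard continuous-conduction ripple
    relation: ripple = (v_in - v_out) * v_out / (v_in * L * f),
    solved for f. Assumes ideal parts.
    """
    return (v_in - v_out) * v_out / (v_in * l_henries * ripple_amps)

# Example: 12 V in, one 3.7 V LED out, 100 mA of allowed ripple.
for L in (470e-6, 47e-6, 4.7e-6):
    f = buck_switching_frequency(12.0, 3.7, L, 0.1)
    print(f"L = {L*1e6:5.1f} uH -> f = {f/1e3:7.1f} kHz")
# ~470 uH -> ~54.5 kHz   (in the common 30-100 kHz band)
#  ~47 uH -> ~545 kHz
# ~4.7 uH -> ~5445 kHz   (the multi-MHz compact designs)
```

A hundred times less inductance gives a hundred times the switching frequency, which is exactly the "small coil, fast switching" trade described above.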
The only problem with a MOSFET switching voltage regulator, or dimmer switch, is I can see the light flicker and it bugs me...
...I assume that it would in fact be worse with an LED since LEDs react so quickly to on or off (no lingering light after it is turned off)...
With PWM dimming, this can be an issue depending on the implementation - and based on the dimmer topology used, the switching frequency CAN be very low. I too have noticed that with low-frequency PWM (like that which is used on newer cars for their brake lights), I can detect a flicker as the car passes into my peripheral vision. This is because the PWM is not a true high-frequency PWM (above 30kHz), but instead a modified "pseudo PWM" at a very much lower frequency (sub-100Hz, so it uses differing "chunks" of on-off time - this is probably what you are seeing). I've discussed this with some people in the industry and it seems to be a trade-off between dedicated hardware and the use of software on slightly more expensive in-vehicle networks. Basically, more and more systems in a car are being pulled into a computer, as it helps with diagnostics and management (something the car makers will kill for). There just aren't enough processor cycles to run PWM at high frequencies (the change on the current sense line would be too fast for the software to react to), and there isn't enough incentive to use dedicated hardware drivers in every fixture (yet - prices are coming down, and this will soon be a "past" problem).
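A rough feel for why software PWM ends up slow (the numbers here are illustrative, not measurements from any actual vehicle module):

```
# Time the firmware gets per brightness step of software PWM:
def time_per_step_us(pwm_hz, steps_per_cycle=256):
    """Microseconds per step at a given PWM frequency (8-bit dimming)."""
    return 1e6 / (pwm_hz * steps_per_cycle)

print(time_per_step_us(100))     # ~39 us/step   - easy for software
print(time_per_step_us(30_000))  # ~0.13 us/step - hardware territory
# At 30 kHz with 8-bit resolution the code would have to act every
# ~130 ns; at 100 Hz it gets a leisurely ~39 us - which is why the
# cheap all-software approach ends up at the low, visible frequency.
```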
I have never met anyone who could see the flicker in a true PWM dimmer on LEDs at a frequency over 1kHz (and I'm in the video industry - where everyone's eyes are about as good as it gets).
One other point of interest - because the current sense voltage does not have to fall all the way to 0 volts (which would mean 0 current across the resistor - and through the load), the brightness doesn't change as aggressively as with PWM dimming. It may fluctuate about 5-10%, and you will not be able to see that (people need at least a 30% change over a long period of time to notice a difference).
I would think the best regulation for your output is just to get as many LEDs as necessary to run at that voltage without regulation (like series-wired Christmas lights).
I did mention this before as an option to help with efficiency - but I also mentioned the primary drawback of this approach: it's designed to work at or slightly above the design voltage. If the voltage changes, one could over-saturate the LEDs (causing them to overheat and have a very short life), or they could just turn off from under-saturation (no electrons can "jump" the gap in the LED die, so no photons are generated). This is how the lower-cost "single voltage" LED fixtures are designed. As anyone in the LED industry will tell you, "The #1 killer of LEDs is HEAT". Over-heating the LEDs will cost you longevity and reliability.
I used a 6 volt rated zener diode, low wattage, to drop the 12 volts to six volts on my restored '46 Ford. I converted the Ford to 12 volts and wanted to still use the 6 volt instruments. It worked great and never got hot at all. I really see no reason why a person could not do the same to drop 24 volts to 12 volts. Cheap and easy by using a 12 volt zener...
First, I don't imagine that the instruments draw a lot of power. This is a big help with zener regulation. With these LEDs, we are frequently talking in the 500mA-and-above range (the highest-power single-die LED I've seen draws 1.5 amps at 3.7 volts - that's 5.5 watts!). To run three of those from a 12 volt supply (the max you could do, because of the combined voltage drop of the LEDs [3 x 3.7 = 11.1 volts]), you'd need an 11 volt zener rated at 2 watts or more (this is achievable). It would also cost you about 40-50 cents. But if your supply voltage dropped below 11 volts (a lead-acid battery is considered "dead" below 12.01v) due to some high transient load like a starter or something else, the LEDs would shut off.
Now if you throw an alternator into that, where the charge voltage is now 13.65 volts or up to 14.4 volts, the zener solution is a bit more risky. You can't just add another LED to raise the regulation point up to 14.8 volts, because the LEDs would never get enough voltage to turn on. Also, because the supply is now 2.65 volts above the regulation point of the zener, it has to dissipate 4 watts of power to keep the LEDs safe. This means your efficiency has gone down from ~92.5% (with the 12 volt supply) to only ~80.4% with the 13.65 volt charging supply. That's a big penalty, and a lot more heat that has to be dealt with. If your supply does actually go up to 14.4 volts, your efficiency takes another dive to ~76.4% [almost a quarter of the power consumed is lost to heat!] and you need a zener that can absorb more than 5 watts (those power ratings start to get hard to come by in small quantities...).
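For the curious, here's the arithmetic behind those figures (a sketch of the series-zener case above: three 3.7 V LEDs at 1.5 amps, with the string rounded to an even 11 V clamp - which is why the 12 V number lands a fraction of a percent off the ~92.5% quoted above):

```
LED_STRING_V = 11.0  # three 3.7 V LEDs, taken as the ~11 V clamp point
I_LOAD = 1.5         # amps - the high-power LED example from above

for v_supply in (12.0, 13.65, 14.4):
    p_zener = (v_supply - LED_STRING_V) * I_LOAD   # heat in the zener
    efficiency = LED_STRING_V / v_supply           # LED power / total
    print(f"{v_supply:5.2f} V in: zener burns {p_zener:4.2f} W, "
          f"efficiency {efficiency:5.1%}")
# Printed results (rounded):
#   12.00 V in: ~1.5 W in the zener, ~91.7% efficient
#   13.65 V in: ~4.0 W in the zener, ~80.6% efficient
#   14.40 V in: ~5.1 W in the zener, ~76.4% efficient
```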
Buck regulators are more consistent with their efficiencies - the National Semiconductor part I mentioned stays above 94% no matter what voltage you give it, up to 60 volts! One other problem is that power LEDs tend to change their equivalent series resistance (ESR) in the negative direction as they get warmer. This means that they can draw more power as they get warmer, shortening their life even further. A current-regulating supply like a buck - which watches a voltage derived from the load - has a better chance of regulating a dynamic load like a power LED. More advanced LED driver ICs even integrate a negative-temperature-coefficient (NTC) resistor that changes with temperature variations, to create another reference voltage to compare to (and if that voltage changes, it can again be used to either add or subtract drive current, based on the circuit design).
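As a sketch of what that temperature compensation does (the thresholds and derating curve here are invented for illustration - real driver ICs each do this their own way):

```
def derated_current_target(i_nominal, temp_c,
                           derate_start_c=60.0, shutoff_c=110.0):
    """Scale the regulated LED current down as the NTC reports heat.

    Full current below derate_start_c; linear ramp down to zero at
    shutoff_c. Every number here is a hypothetical example, not a
    value from any real driver's datasheet.
    """
    if temp_c <= derate_start_c:
        return i_nominal
    if temp_c >= shutoff_c:
        return 0.0
    return i_nominal * (shutoff_c - temp_c) / (shutoff_c - derate_start_c)

# Example: a 700 mA LED at a few reported temperatures.
for t in (25, 70, 100):
    print(t, "C ->", round(derated_current_target(0.7, t), 3), "A")
# 25 C -> 0.7 A,  70 C -> 0.56 A,  100 C -> 0.14 A
```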
Again, to each their own - and there are many "owns" that will work. One person's solution may not be another's - or it might be.