
>If you can run a circuit without a converter, that would generally be superior.

No, this will waste a lot of power in the high-supply-voltage region of the battery's discharge curve. There are three regimes to consider, for (say) the case of running 3.3V nominal parts off a 1S 3.7V Li-Ion cell. Other nominal supply and battery voltages are very similar, they just shift the locations of the regimes around a bit or introduce boost converters.

Regime 1: Battery voltage well above system voltage. Here you want a high-efficiency buck converter. This minimizes the current you draw from the battery: instead of drawing (say) 10mA at 3.7V, you draw 10mA × 3.3/3.7 / 0.98 ≈ 9.1mA at 3.7V. A buck converter is generally superior to a buck-boost as the peak efficiency is higher, and for this scenario boost functionality will never be necessary.
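As a sanity check on that arithmetic (the 98% converter efficiency is just an assumed round number):

```python
# Battery-side input current for a buck converter feeding a fixed load.
# Conservation of power: I_in = P_out / (efficiency * V_in).
def buck_input_current_ma(v_in, v_out, i_load_ma, efficiency=0.98):
    return i_load_ma * v_out / v_in / efficiency

# 10 mA load at 3.3 V, fed from a 3.7 V cell through a 98%-efficient buck:
print(round(buck_input_current_ma(3.7, 3.3, 10.0), 1))  # -> 9.1 (mA)
```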

Regime 2: Battery voltage slightly above system voltage. In this regime the appropriate solution is a low-quiescent-current low-dropout linear regulator (aka LDO). This regime exists because of the finite efficiency of a switching regulator. With an imaginary 100% efficient switcher, you never need an LDO. The lower the draw on the system rail, the wider this regime gets. For micropower circuits (average draw in the single-digit microamps) one might always be here. For higher power draws (tens of milliamps), this regime might be too small to justify the added complexity of including an LDO and you just skip right over it.
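A quick sketch of why micropower loads may live in this regime permanently. The quiescent currents and the 90% buck efficiency below are illustrative assumptions, not specs of any particular part:

```python
# Battery draw comparison, LDO vs buck, for a micropower load.
def ldo_draw_ua(i_load_ua, i_q_ua=1.0):
    # An LDO passes the load current straight through and adds its own I_q.
    return i_load_ua + i_q_ua

def buck_draw_ua(v_in, v_out, i_load_ua, eff=0.90, i_q_ua=10.0):
    return i_load_ua * v_out / v_in / eff + i_q_ua

# At a 5 uA average load, a 1 uA-I_q LDO beats a 10 uA-I_q buck:
print(ldo_draw_ua(5.0))                       # -> 6.0 (uA)
print(round(buck_draw_ua(3.7, 3.3, 5.0), 1))  # -> 15.0 (uA)
```

The switcher's own quiescent current swamps the conversion savings at these load levels, which is why the LDO regime can cover the whole discharge curve for micropower designs.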

Regime 3: Battery voltage at or below nominal system rail voltage but above minimum. The best thing to do here is absolutely nothing: just run the system directly off the battery. This is possible with a regulator with a bypass FET. Once the regulator goes into dropout, it just turns on the bypass transistor and connects things directly (with a small R_ds(on) drop of course). A few regulators have these built in. Other regulators are efficient enough in dropout that they don't actually benefit much from the bypass transistor. This also assumes you're near the end of the battery discharge curve. If you aren't, then you're potentially heading into boost or buck-boost territory, which I'm not going to cover here.
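The three regimes can be sketched as a toy selector. The dropout, minimum-voltage, and regime-boundary numbers are illustrative assumptions for a 3.3 V rail off a 1S Li-ion cell:

```python
# Toy regime selector following the three regimes above.
V_RAIL = 3.3
V_LDO_DROPOUT = 0.15  # assumed LDO dropout voltage
V_MIN = 3.0           # assumed minimum operating voltage of the 3.3 V parts

def pick_regime(v_batt):
    if v_batt >= V_RAIL + V_LDO_DROPOUT + 0.3:  # well above rail
        return "regime 1: buck converter"
    if v_batt >= V_RAIL + V_LDO_DROPOUT:        # slightly above rail
        return "regime 2: LDO"
    if v_batt >= V_MIN:                         # at/below rail, above minimum
        return "regime 3: direct / bypass FET"
    return "below minimum: shut down (or boost, not covered here)"

print(pick_regime(4.2))  # -> regime 1: buck converter
print(pick_regime(3.5))  # -> regime 2: LDO
print(pick_regime(3.2))  # -> regime 3: direct / bypass FET
```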

Deciding which of these three regimes are important enough to warrant the design time for optimizing is a key part of low-power circuit design.



You make some solid points; it all depends on the battery, I guess.

As you've noted, a 3.7V Lithium Ion battery (which is 4.2V when fully charged) would waste a lot of power if it fed a 3.3V circuit directly: the excess voltage has to be dropped somewhere and ends up dissipated as heat.

But for a 3.3V part, you really want to be running a LiFePO4 cell, with a nominal voltage of 3.2V. The chemistry itself gives you maximum efficiency, as opposed to adding circuitry to convert 3.7V down to 3.3V.

----------

Compensating for the flaws of a slightly mismatched cell or battery is one approach. But my bet is that selecting the proper battery to begin with is the optimal approach.

In any case, I think I can agree with you: if you're running a standard 3.7V (4.2V when fully charged) battery on a 3.3V circuit, it probably is most efficient to use a pure buck converter. Dropping the voltage linearly off a full cell is only 3.3/4.2 ≈ 79% efficient, so a good converter gains you energy early in the discharge, and even near the end, where linear is 3.3/3.6 ≈ 92%, a 92%+ converter is still a net benefit.
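Spelling out that efficiency comparison (the flat 95% buck efficiency is an assumed round number):

```python
# Effective efficiency of running a 3.3 V rail straight off the cell
# (linear drop: efficiency = V_out / V_in) versus an assumed 95% buck.
BUCK_EFF = 0.95  # assumed

def linear_eff(v_batt, v_rail=3.3):
    return v_rail / v_batt

for v_batt in (4.2, 3.9, 3.6):
    print(f"{v_batt} V: linear {linear_eff(v_batt):.0%} vs buck {BUCK_EFF:.0%}")
# -> 4.2 V: linear 79% vs buck 95%
# -> 3.9 V: linear 85% vs buck 95%
# -> 3.6 V: linear 92% vs buck 95%
```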


Li-ion isn't even that great a choice; its self-discharge is fairly high. That can "use" more current than your microcontroller!
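Back-of-envelope on that claim. The 2%/month rate and 2000 mAh capacity are illustrative round numbers, not figures for a specific cell:

```python
# Average current equivalent of self-discharge.
def self_discharge_ua(capacity_mah, pct_per_month):
    hours_per_month = 30 * 24
    return capacity_mah * pct_per_month / 100 / hours_per_month * 1000

print(round(self_discharge_ua(2000, 2)))  # -> 56 (uA)
```

Tens of microamps of self-discharge dwarfs a single-digit-microamp sleeping microcontroller.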



