General — This forum is for non-technical general discussion which can include both Linux and non-Linux topics. Have fun!
No, it's not homework; Andy is into RC helicopters.
I think michaelk has it correct. In short, my thinking was "a bit longer than 17.5 hours," since the mA rate is lower than in your 17 hour example. This all assumes the battery is discharged to a uniform point, and battery age is a factor, etc. I'm sort of agog at these charge times; to me it would seem that you should be able to charge a battery within 4-5 hours.
It does, indeed. 'Traditional' chargers depend on a series resistor, and the charging rate goes down as the cell voltage rises. 'Modern', 'high rate' and multi-chemistry chargers don't work like that, though.
These use genuine electronics, and it is hard to predict what they do as cell voltage rises, short of actually measuring it.
And then you have to do something like integrating the varying current over time (simpler if the current doesn't really change much until the end point).
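To make that integration concrete, here's a rough sketch of summing current over time with the trapezoidal rule. The logged samples are made up purely for illustration (a mostly-constant current that tapers near the end), not a real charger log:

```python
# Estimate the charge delivered (in mAh) by integrating charger current
# over time, using the trapezoidal rule.

def charge_mah(times_h, currents_ma):
    """Trapezoidal integration of current (mA) over time (hours) -> mAh."""
    total = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        total += 0.5 * (currents_ma[i] + currents_ma[i - 1]) * dt
    return total

# Hypothetical log: roughly constant 170 mA that tapers off at the end
# of the charge, as the poster above describes.
times = [0, 4, 8, 12, 16, 17, 18, 18.5]      # hours
currents = [170, 170, 170, 170, 170, 150, 100, 60]  # mA

print(round(charge_mah(times, currents), 1))  # -> 3045.0 mAh delivered
```

Note the delivered charge comes out well above the nominal 2300 mAh cell capacity, which is consistent with NiMH charging being less than 100% efficient.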
Quote:
Originally Posted by michaelk
It depends on the charger. Simple trickle chargers by design use rates <= c/10. You can charge faster if you monitor temperature and voltage etc.
The modern ones tend to charge at a roughly constant rate, higher than the nominal c/10, until the voltage kicks up towards the end. You can get chips specifically for this, and they have the advantage that they work reasonably well with cells of different chemistries and hence different cell voltages. OTOH, the battery capacity does tend to fall off a bit faster over its life than you would expect.
Quote:
I have a battery charger and the charging time for a 2300 mAh NiMH AA battery is 18.5 hrs at a charging rate of 170 mA.
My math skills are rusty.
What would the charging time be if the charging rate is 150 mA?
If that really were the end of the story, the time ought to be (roughly) 170/150 times the original (charge/discharge efficiency varies with charge rate, but usually not by much if the current doesn't vary much), but I suspect that the current varies enough to preclude an exact calculation. Why not measure it?
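Treating the charger as roughly constant-current, the scaling described above works out like this, using the figures from the quoted post:

```python
# If the same total charge must be delivered, time scales inversely
# with current: t_new = t_old * (i_old / i_new).
t_original_h = 18.5    # charge time at the original rate (hours)
i_original_ma = 170.0  # original charging rate (mA)
i_new_ma = 150.0       # new charging rate (mA)

t_new_h = t_original_h * (i_original_ma / i_new_ma)
print(round(t_new_h, 2))  # roughly 21 hours
```

So the back-of-the-envelope answer is about 21 hours, with the caveats about varying current and charge efficiency noted above.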
To expand on what a couple of others said about charging: if you are running a simple trickle charger (a series resistor), you can wear out your batteries early by overcharging. Heat is the enemy of batteries, and trickle charging slowly cooks them.
This is why a lot of cordless phone batteries never last more than a year or two: they use simple trickle chargers. I managed to find one with a proper charger, and it's over 5 years old.
You would be wise to invest in a MAHA C9000. It might seem expensive for a charger, but if you have a lot of batteries you can keep them working for a decade; mine has paid for itself already.