I'm just too stupid for this - despite Google…
Maybe someone can explain it for REAL "dummies".
So I would like to know how much power my laptop "sucks" from a car / caravan battery, or how long I could theoretically run it off one.
The following values are printed on my charger:
input: 100-240V ~ 1.2A 50-60 Hz (probably irrelevant, right?)
output: 20V - 2.25A
At home I once "hooked" the device up to a measuring device.
Among other things, I got these values there at 220 V:
approx. between 5 and 15 W / max. 23.5 W
and
approx. between 0.05 and 0.15 A / max. 0.2 A
So now I want to know how much current in Ah is drawn from a 12V battery in the car.
So how long could I work on my laptop with, e.g., an 80 Ah, 100 Ah or 120 Ah car battery?
A short, SIMPLE explanation for real beginners would be great. Which values and numbers do I need, preferably with an example calculation?
You can't operate the laptop, which requires 20V supply voltage, with 12V.
Thank you, but that wasn't the question. I already realize that you need a converter, etc. This is about a motorhome… and about which solar system I need, or how big the battery should be if you stand somewhere self-sufficient overnight. That's why I'm interested in how much power I take from the battery, in order to then work out how big my battery needs to be…
You didn't write that you wanted to convert the voltage. How do you do that? First up to 230V, and then you run the power supply off that?
The power supply unit has an output of 45W (20V x 2.25A). You have now measured the maximum power consumption of the laptop as 23.5 watts. Was that with an empty battery?
I would assume 45 watts for the power supply. That would be about 4 A at 12V. You also have to take the losses of the voltage converter into account, so I would expect about 5A = 60 watts.
If you divide the capacity of the battery (Ah) by these 5A, you are left with hours as the unit. That would be the operating time.
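If it helps, here is that worst-case estimate as a small Python sketch (the 45 W rating is from your charger label; the 5 A including converter losses is just the rough assumption from above):

```python
# Worst-case runtime: battery capacity (Ah) divided by the current
# that is drawn from the 12V battery.

supply_power_w = 45.0        # power supply rating: 20V x 2.25A
battery_voltage_v = 12.0     # car / caravan battery

current_a = supply_power_w / battery_voltage_v     # ~3.75 A before losses
current_with_losses_a = 5.0                        # rounded up for converter losses (~60 W)

print(f"theoretical current without losses: {current_a:.2f} A")
for capacity_ah in (80, 100, 120):
    runtime_h = capacity_ah / current_with_losses_a
    print(f"{capacity_ah} Ah battery: approx. {runtime_h:.0f} h (worst case)")
```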
At 20 W on average (laptop + inverter losses), this means converted to 12V: 20W / 12V = approx. 1.7A.
So theoretically 120Ah / 1.7A = approx. 70h
Since a (lead) battery should only be discharged down to about 40% remaining capacity, this gives: 70h * 0.6 = approx. 42h.
So you can run your laptop safely and in a "battery-friendly" way for about 40 hours on the 120Ah battery, provided it is in good shape and charged.
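The same estimate written out as a small Python sketch (the 20 W average and the 60% usable capacity are the assumptions from above, not measured values):

```python
# Runtime at an assumed average draw of 20 W (laptop + inverter losses)
# from a 120 Ah lead battery that should only be discharged by about 60%.

average_power_w = 20.0
battery_voltage_v = 12.0
capacity_ah = 120.0
usable_fraction = 0.6        # lead battery: don't go below ~40% remaining charge

current_a = average_power_w / battery_voltage_v          # ~1.7 A
runtime_h = capacity_ah * usable_fraction / current_a    # ~42-43 h
print(f"approx. {runtime_h:.0f} h of laptop use")
```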
Theoretically, you could also connect your laptop directly to the 12V battery via a step-up DC/DC converter (12V to 20V). That would be a device like this: http://vi.raptor.ebaydesc.com/ws/eBayISAPI.dll?ViewItemDescV4&item=164195340414&category=65507&pm=1&ds=0&t=1601965393141
You can "theoretically calculate" the whole thing, but for that you would have to know a lot…
Let's start with the basics. To calculate the running time of a device, we first have to find a common denominator. The capacity of a car battery is usually stated in ampere-hours (Ah), while the laptop or power supply is usually rated in watts. So you either have to convert the watts to amps or the amps to watts.
At 12 volts, for example, 100 ampere-hours are 1200 watt-hours.
With a consumption of, say, 60 watts, you divide the 1200 watt-hours by 60 watts.
The watts cancel out of the watt-hours when you divide, leaving hours: 20 hours in this case.
Since you can't connect the device directly to 12 volts, you still need a converter of some kind, and that eats up energy again too. Apart from that, the stated power of the device only applies at full load.
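Just to write that down, a small Python sketch of the watt-hour route (the 60 W load is the full-load example from above, and the 85% converter efficiency is only an assumed figure):

```python
# Watt-hour route: battery capacity in Wh divided by the load in W gives hours.

battery_voltage_v = 12.0
capacity_ah = 100.0
energy_wh = battery_voltage_v * capacity_ah      # 1200 Wh

load_w = 60.0                                    # assumed full-load consumption
runtime_h = energy_wh / load_w                   # Wh / W = h  ->  20 h

# With a converter in between (assumed 85% efficiency) the time gets shorter:
converter_efficiency = 0.85
runtime_with_converter_h = energy_wh * converter_efficiency / load_w   # ~17 h

print(f"without converter: {runtime_h:.0f} h, with converter: {runtime_with_converter_h:.0f} h")
```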
In a nutshell: for dummies there's really only one option at this point - get yourself a stopwatch, a car battery and the right adapter, then try it out.
We simply lack far too many factors for the calculation, such as the efficiency of the inverter (if you are using one). The reactive load component that the power supply generates can only be determined by measuring, or roughly estimated from the nameplate.
Another little tip: http://www.reichelt.de offers universal laptop power supplies with a 12 volt input from Ansmann. Compared to running everything through an inverter, they have the huge advantage that you don't have to convert from 12 volt DC up to 230 volt AC and then back down to 19 volt DC, but can go directly from 12 to 19 volts DC. That saves a lot of energy.
If you choose the direct current method, it is also a little easier to calculate. We're going over amps here. Let's say your laptop draws 3.5 amps at 19 volts.
Theoretically that is 66.5 watts (3.5 amps times 19 volts). Divided by 12 volts, that would be around 5.5 amps.
the whole thing is called the rule of three.
Even the DC converter does not work without losses. Let's say the efficiency is 90 percent, which is 0.9 as a decimal. To account for the loss you don't have to do anything more than divide your value by the efficiency. So 5.5 amps / 0.9 is just under 6.2 amps.
now you only need to divide the 100 ampere-hours by 6.2 amps and you have the hours.
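For completeness, the DC route as a small Python sketch (the 3.5 A at 19 V and the 90% efficiency are the example figures used above, not measured values):

```python
# Direct 12V -> 19V DC converter route ("rule of three" plus efficiency).

laptop_current_a = 3.5         # assumed draw of the laptop at 19 V
laptop_voltage_v = 19.0
battery_voltage_v = 12.0
converter_efficiency = 0.9     # assumed 90%

power_w = laptop_current_a * laptop_voltage_v                    # 66.5 W
battery_current_a = power_w / battery_voltage_v                  # ~5.5 A
battery_current_with_losses_a = battery_current_a / converter_efficiency   # ~6.2 A

capacity_ah = 100.0
runtime_h = capacity_ah / battery_current_with_losses_a          # ~16 h
print(f"approx. {runtime_h:.0f} h")
```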
is that all too complex for you? Then I just say: stopwatch!
Please don't get me wrong, this is not an insult. With all my experience, I honestly wouldn't do it any differently, because there are even more factors that come into play.
The laptop doesn't always run at full throttle, and you can't just empty the battery completely.
Thank you. But trying it out is a bit of a problem… I'd like to calculate first, so I know how big the battery I'm going to install needs to be.
It's not about a 100% exact value, just "roughly". I also realize that such an inverter consumes electricity itself. Can you say in general what losses such an inverter causes? 15 or 25%?
I had already thought of a 12V charger. Could there be problems with the device if you don't use the original charger? And how much "loss" does that involve?
But unfortunately my basic question has not been answered yet: how many "Ah" does my laptop actually use?
I plugged the laptop into one of those electricity measuring devices to get the values. Which of the many values do I have to multiply or divide by what, or take roots of?
Thank you. So in my case that would be the value "approx. between 5 and 15 W / max. 23.5 W", which I measured with the meter, right? If my laptop averages 10 W, then it would run twice as many hours. Right?
And what does this reading on my measuring device mean: "approx. between 0.05 and 0.15 A / max. 0.2 A"?
The link does not lead directly to eBay. Is that the same as in the post above it? So just buy a 12V power supply?
Is 45W the maximum, or could more also be drawn "by accident"? Or is that always the upper limit (for my power supply)?
The measurement was only a few minutes, maybe during switch-on or something. My battery is actually charged anyway and the laptop stays connected to the power supply.
however, the calculation would now be based on the continuous maximum output with approx. 25% output losses. Right?
If I roughly assume that my laptop uses an average of maybe 20 W (I can measure again with my measuring device over a longer period) and add another 5 W of losses, then I consume a total of 25 W.
Then it would be 25 W / 12 V = approximately 2 Ah (per hour). Right?
If I had a 50 Ah battery and could use it completely (lithium battery), then I would get around 25 hours of running time. With a buffer, maybe only 20 hours. Correct?
But one thing still unsettles me. When I charge my laptop at 220V at home, would the calculation then be as follows:
25W / 220V = 0.114 Ah? Then something must be wrong, right?
Yes, at 10W the time would be twice as long. But be careful: if you use a 12V to 230V inverter, it already has "losses" of 3-10W (even without load), depending on the quality of the device.
If you have measured a current of 0.15A at 230V, this corresponds to an (apparent) power (P = U * I) of 230V * 0.15A = 34.5VA (W). To determine the current for the same power at 12V, the following applies: I = P / U = 34.5W / 12V = 2.87A. Or, including converter losses, at least 3.1A of current from the 12V battery. So to stay with the example: 120Ah * 0.6 / 3.1A = approx. 23h
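The same numbers as a small Python sketch (the 0.15 A reading and the 120 Ah / 60% figures are from the example above; the 3.1 A including converter losses is an assumption):

```python
# From the current measured on the 230V side back to the current drawn at 12V.

mains_voltage_v = 230.0
measured_current_a = 0.15                         # reading at the wall socket

apparent_power_va = mains_voltage_v * measured_current_a      # 34.5 VA
battery_voltage_v = 12.0
battery_current_a = apparent_power_va / battery_voltage_v     # ~2.9 A
battery_current_with_losses_a = 3.1               # assumed incl. inverter losses

capacity_ah = 120.0
usable_fraction = 0.6                             # lead battery
runtime_h = capacity_ah * usable_fraction / battery_current_with_losses_a
print(f"approx. {runtime_h:.0f} h")               # ~23 h
```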
No, not a 12V power supply, but a DC/DC step-up converter from 12V to 20V, because your laptop needs 20V and you only have 12V available.
Or do the double conversion as you intended: a 12V DC to 230V AC inverter, and plug your standard 230V laptop power supply into it. This solution is more flexible, but has higher (converter) losses.
Is 45W the maximum, or could more also be drawn "by accident"? Or is that always the upper limit (for my power supply)?
The power supply doesn't deliver more than that. It can be assumed that the laptop draws at most this power.
however, the calculation would now be based on the continuous maximum output with approx. 25% output losses. Right?
Yes.
Then it would be 25 W / 12 V = approximately 2 Ah (per hour). Right?
No. You also have to calculate with the units. Watt / Volt = A. No Ah can come out of that.
2A in one hour would be 2Ah. In 2 hours it would be 4Ah (A x time).
If I had a 50 Ah battery and could use it completely (lithium battery), then I would get around 25 hours of running time. With a buffer, maybe only 20 hours. Correct?
Right:-)
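To make the units explicit, a tiny Python sketch with your numbers (the 25 W average including losses and the fully usable 50 Ah lithium battery are your assumptions):

```python
# Watts divided by volts gives amps; amps times hours gives amp-hours.

average_power_w = 25.0        # assumed average draw incl. losses
battery_voltage_v = 12.0
current_a = average_power_w / battery_voltage_v    # ~2.1 A  (an A, not an Ah)

hours = 1.0
charge_ah = current_a * hours                      # ~2.1 Ah after one hour

capacity_ah = 50.0                                 # lithium battery, assumed fully usable
runtime_h = capacity_ah / current_a                # ~24 h; with a buffer rather ~20 h
print(f"{charge_ah:.1f} Ah per hour, approx. {runtime_h:.0f} h total")
```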
But one thing still unsettles me. When I charge my laptop at 220V at home, would the calculation then be as follows:
25W / 220V = 0.114 Ah? Then something must be wrong, right?
That's not right either. 25 watts / 230V (that is the mains voltage) = 0.11A (not Ah).
The higher the voltage, the lower the current. But the power remains the same.
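A tiny Python sketch of that last point (same assumed 25 W load, only the voltage changes):

```python
# The same 25 W load at different voltages: the power stays the same,
# only the current changes (I = P / U).

power_w = 25.0
for voltage_v in (12.0, 230.0):
    current_a = power_w / voltage_v
    print(f"{power_w:.0f} W at {voltage_v:.0f} V -> {current_a:.2f} A")
# 25 W at 12 V  -> 2.08 A
# 25 W at 230 V -> 0.11 A
```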
OK. Thank you.
So I could always use the power supply rating to determine the maximum that a device consumes. However, the average value will mostly be lower.
Yes, but A x hour is Ah. That's why I wrote "(per hour)" in brackets after it.
So I can now roughly say that my laptop sucks about 2 Ah from the battery per hour.
I didn't quite understand the last sentence though. Wouldn't it then have to use fewer watts at higher voltages?
And what does the reading on my measuring device mean: "between 0.05 and 0.15 A / max. 0.2 A"?
Thank you. But that went a bit over my head :-)
Now amps, alternating current, etc. come into it as well…
But even for me as a layman it is easy to understand that a lot of power really is "lost".
Is it really the case that 2/3 of the power "disappears"?
And what is the difference with the 12V power supply? Doesn't it have to convert "up" too?
Do I really have 2/3 converter losses, as described here, if I also convert to 230V?
My measuring device shows A and also W. If you multiply the amps by 230, the values should actually match. But they don't, and that I don't understand.
E.g. I just checked: I have 0.14 A (max). That times 230 = 32.2W,
but my W reading says only 16.5 W max.
Power adapter:
You described that your power supply has a 20V DC output. But if your laptop has a direct 12V DC input, you can of course use that directly (without an additional converter).
Losses arise with every conversion.
If you convert twice, 1. from DC to AC (12V DC to 230V AC) and 2. back from AC to DC (230V AC to 20V DC), you have losses twice; I never wrote about 2/3 losses.
Depending on the quality of the "converter", you have to reckon with 2-15% conversion losses per conversion. Converters also have losses in idle mode; they are small, but if no "useful energy" is drawn, they are 100% losses, even if the power is small.
OK, that can still be true, because with AC voltage you don't only have the ohmic load. For AC voltage the following applies: apparent power S = U * I (unit VA); real power P = S * cos phi = U * I * cos phi (unit watt, W).
If you convert the 12V DC to 230V AC with an inverter and use your standard power supply, you have to expect at least the apparent power plus the additional losses of the converter, so as in my example 3.1A.
In AC technology, there's also reactive power in addition to real power. The apparent power is made up of the two.
Since the reactive power also loads the supply, you have to reckon with the apparent power. As I said: the power supply draws up to 1.2 amps at 100 volts on the input side. That gives an apparent power of 100 volts times 1.2 amps, i.e. 120 volt-amps.
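If it helps, here is that apparent-power arithmetic as a small Python sketch (the 0.14 A and 16.5 W readings are the ones you posted; the 100 V / 1.2 A comes from your nameplate):

```python
# Apparent power S = U * I; real power P = S * cos(phi).

mains_voltage_v = 230.0
measured_current_a = 0.14           # max. current shown by the meter
measured_power_w = 16.5             # max. real power shown by the meter

apparent_power_va = mains_voltage_v * measured_current_a   # ~32.2 VA
power_factor = measured_power_w / apparent_power_va        # cos(phi) ~0.51

# Worst case from the nameplate: 1.2 A at 100 V on the input side.
nameplate_apparent_va = 100.0 * 1.2                        # 120 VA

print(f"cos(phi) approx. {power_factor:.2f}, nameplate worst case {nameplate_apparent_va:.0f} VA")
```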
Best regards, anna
PS: my "offer" still stands. Have a look around for a DC regulator at http://www.reichelt.de. That is far more efficient.
So I could always use the power supply rating to determine the maximum that a device consumes. However, the average value will mostly be lower.
Yes. The power supply must be able to deliver the maximum power that is required.
So I can now roughly say that my laptop sucks about 2 Ah from the battery per hour.
Yes. 2Ah means 2A x 1 hour.
I didn't quite understand the last sentence though. Wouldn't it then have to use fewer watts at higher voltages?
No. The power remains the same regardless of the voltage. But: Because power = voltage x current, the higher the voltage, the lower the current for the same power.
Is 45W the maximum, or could more also be drawn "by accident"? Or is that always the upper limit (for my power supply)?
One should assume that the manufacturer has dimensioned the power supply so that it can provide the maximum required power.
however, the calculation would now be based on the continuous maximum output with approx. 25% output losses. Right?
Yes, where the 25% is an assumed value.
If I had a 50 Ah battery and could use it completely (lithium battery), then I would get around 25 hours of running time. With a buffer, maybe only 20 hours. Correct?
But now I'm really confused. Some speak of 12V power supplies, you speak of DC regulators, and others again of step-up converters. Good question what all that is and which is best.