How do you calculate the correct, maximum power consumption of a power supply?

La

I would be interested in the following:
I have a power supply for my notebook, 19.5 volts, 2.31 amps = 45 watts.

How much would that pull out of the 230 volt socket? 45 watts too?
Or how do you calculate that?

Sa

Or how do you calculate that?

For that you need the efficiency of the power supply.

Ti

A power supply's label states how much it can take in and put out.

What goes in is of course usually a bit more than what comes out; after all, a power supply also has losses.

And you surely mean ~2.37 A ^^

Pl

For that you need the efficiency of the power supply. Most power supplies have around 80 % efficiency (at a load of at least 20 % they must reach at least 80 % efficiency), so the 45 W are only 80 % of what goes in :-)
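The calculation above can be sketched in a few lines of Python; the 80 % efficiency is the thread's assumption, not a value from the adapter's datasheet:

```python
# Input power from output power, assuming a given efficiency.
# 80 % is the rough figure quoted in this thread, not a measured value.

def input_power(output_watts, efficiency=0.80):
    """Power drawn from the wall for a given output power and efficiency."""
    return output_watts / efficiency

p_out = 19.5 * 2.31          # volts * amps, about 45 W output
p_in = input_power(p_out)    # about 56 W drawn from the socket
print(round(p_out, 1), "W out ->", round(p_in, 1), "W in")
```

So the 45 W adapter would draw roughly 56 W from the socket under full load, matching the answer below.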

La

Oh, close. The little thing has 19.5 volts at 2.31 amps. But you are right, the back actually says: 100-240 volts, max. 1.6 A. At 230 volts that would be 368 watts; so my notebook will never in its life draw that much?!

La

Well, that would be around 56 watts then?!

Pl

At a load of at least 20 %, theoretically yes; when it is first plugged in, a slightly higher inrush current may flow for a short time.

Am

You measure the current that the power supply draws from the socket… which is not always easy, because ordinary meters are built for 50 Hz sine waves, while these are switched-mode power supplies whose current draw is not sinusoidal and feeds interference back into the mains. But before you tinker for a long time and end up buying yet another meter: multiply the power supply's output voltage by its output current… then multiply by 1.5 and you will probably have the maximum power it draws.
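The rule of thumb above, sketched in Python (the factor 1.5 is the heuristic from this post, and the mains voltage of 230 V is assumed):

```python
# Rule-of-thumb upper bound from the post above: output power times 1.5
# as a generous estimate of input power, then the current at 230 V mains.
V_OUT, I_OUT = 19.5, 2.31     # values from the adapter's label
MAINS_V = 230.0               # assumed European mains voltage

p_out = V_OUT * I_OUT         # about 45 W output
p_in_max = p_out * 1.5        # heuristic upper bound, about 68 W
i_mains = p_in_max / MAINS_V  # about 0.29 A drawn from the socket
print(f"{p_in_max:.1f} W max in, {i_mains:.2f} A at {MAINS_V:.0f} V")
```

That is far below the 1.6 A printed on the label, which is a worst-case rating for the whole 100-240 V input range.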

Besides, your laptop does not always draw the same power: sometimes the hard drive is running, sometimes the battery is charging… so such questions are not very meaningful!