Your computer consumes a fair amount of power just idling there awaiting your command, so does charging a smartphone or tablet off one of its USB ports impose much of an additional demand on it?

Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

Image available as wallpaper at WallpapersWide.

The Question

SuperUser reader Arnehehe is curious whether charging a USB device imposes an additional load on his computer:

So what’s the story? How much energy is used when you charge up your phone via your computer?

The Answer

SuperUser contributor Zakinster offers some great insight into how charging from your computer differs from charging from a wall charger, and into the efficiency of each:

In other words: if you’re really concerned about efficiency, use your computer to charge as many USB devices as you can (while you’re there using the computer anyway) rather than plugging in a separate wall-wart for each device. Realistically, however, the losses and gains are minimal, and you should charge your devices whichever way is most convenient.

Long answer:

A USB port can output a maximum of 500 mA (USB 1 and 2) or 900 mA (USB 3) at 5 V, which works out to a maximum of 2.5 W (USB 1 and 2) or 4.5 W (USB 3).
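To make the arithmetic concrete, here is a minimal sketch; the 5 V bus voltage and the per-port current limits are the figures quoted above, nothing else is assumed:

```python
# Maximum power a single USB port can deliver: P = V * I
BUS_VOLTAGE = 5.0  # volts, nominal USB bus voltage

port_limits_amps = {
    "USB 1/2": 0.500,
    "USB 3":   0.900,
}

for spec, amps in port_limits_amps.items():
    watts = BUS_VOLTAGE * amps
    print(f"{spec}: {amps * 1000:.0f} mA at {BUS_VOLTAGE:.0f} V -> {watts:.2f} W max")
```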

USB ports don’t consume power by themselves. With nothing plugged in, they are just open circuits.

Now, if you draw 1 A (5 W) from a USB 3 port, it will usually increase the computer’s overall power consumption by roughly 6 W (depending on your power supply’s efficiency), which works out to an increase of 2% to 5% in the computer’s total draw.
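Here is a rough sketch of where those numbers come from; the 83% efficiency and the baseline wattages are illustrative assumptions, not measurements:

```python
# What a 5 W USB load costs at the wall if the PSU efficiency stays roughly constant.
usb_load_w = 5.0       # power delivered to the charging device
psu_efficiency = 0.83  # assumed efficiency near the computer's operating point

extra_at_wall_w = usb_load_w / psu_efficiency
print(f"Extra draw at the wall: ~{extra_at_wall_w:.1f} W")

# Relative increase against a few illustrative desktop baselines:
for baseline_w in (120, 200, 300):
    print(f"  vs. a {baseline_w} W computer: +{extra_at_wall_w / baseline_w:.1%}")
```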

But, in some cases, it may be different.

If you take a look at a PSU efficiency curve (this one from AnandTech):

You’ll see that efficiency is not a constant value; it varies a lot with the load applied to the PSU. On that 900 W PSU, at low power (50 W to 200 W) the curve is so steep that an increase in load brings a substantial increase in efficiency.
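The figures in the sketch below are invented to mimic the general shape of such a curve (poor efficiency at light load, a peak somewhere around mid-load); they are not the AnandTech measurements, but they show why the operating point matters:

```python
# Hypothetical efficiency-vs-load samples for a 900 W PSU (illustrative values only).
CURVE = [(50, 0.70), (100, 0.79), (200, 0.86), (450, 0.90), (900, 0.87)]

def efficiency_at(load_w):
    """Linearly interpolate efficiency between the sampled points above."""
    for (x0, e0), (x1, e1) in zip(CURVE, CURVE[1:]):
        if x0 <= load_w <= x1:
            return e0 + (e1 - e0) * (load_w - x0) / (x1 - x0)
    raise ValueError("load outside sampled range")

for load in (60, 150, 300, 600):
    print(f"{load:>3} W load -> ~{efficiency_at(load):.0%} efficient")
```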

If the increase in efficiency is large enough, it means that in some cases your computer may not actually need to draw a full extra 5 W from the wall socket when you’re pulling an extra 5 W from a USB port.

Let’s take the example of a computer drawing 200 W from a PSU with an actual efficiency of 80% at 200 W:
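That baseline works out to 200 W / 80% = 250 W drawn from the wall socket, and the cases below all start from this figure. A minimal sketch of the arithmetic:

```python
# Baseline before plugging anything into USB: 200 W of load at 80% efficiency.
load_w = 200.0
efficiency = 0.80
wall_w = load_w / efficiency  # power actually drawn from the wall socket
print(f"Baseline wall draw: {wall_w:.1f} W")  # 250.0 W
```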

Now, depending on the shape of the PSU’s efficiency curve between 200 W and 205 W, the relative power cost of the USB device can be completely different:

This is the usual simplified case: the efficiency is the same at both loads, so the power cost of the USB device at the wall is 5 W / 80% = 6.25 W.
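Worked out under that assumption (80% efficiency at both 200 W and 205 W of load):

```python
# Case where efficiency is unchanged between 200 W and 205 W of load.
efficiency = 0.80
wall_before = 200.0 / efficiency  # 250.00 W
wall_after = 205.0 / efficiency   # 256.25 W
print(f"Extra draw at the wall: {wall_after - wall_before:.2f} W")  # 6.25 W
```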

In this second case, the PSU’s efficiency is increasing between 200 W and 205 W, so you can’t deduce the relative power cost of the USB device without taking the whole computer’s power consumption into account, and the increase measured at the wall socket may actually be lower than 5 W.

This behavior only occurs because the PSU is under-loaded here, so it isn’t the usual situation, but it is a practical possibility.
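As an illustration of this second case, suppose the efficiency climbs from 80% at 200 W to 82% at 205 W; the 82% figure is an invented example, not a measured value:

```python
# Case where efficiency improves slightly as the load rises.
wall_before = 200.0 / 0.80  # 250.0 W at 80% efficiency
wall_after = 205.0 / 0.82   # ~250.0 W at the assumed 82% efficiency
print(f"Extra draw at the wall: {wall_after - wall_before:.2f} W")  # ~0 W
```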

In this third case, the PSU draws the same power from the wall socket regardless of the load it supplies. This is the behavior of a Zener regulator, where all the unneeded power is dissipated as heat. It can be observed in some low-end PSUs at very light loads.
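Here is a sketch of that behavior, using the same 250 W figure as a stand-in for the constant wall draw (an illustrative assumption):

```python
# Case where the supply draws a fixed amount from the wall regardless of load.
WALL_DRAW_W = 250.0  # constant input power of this (hypothetical) supply

for load_w in (200.0, 205.0):
    wasted_w = WALL_DRAW_W - load_w  # everything not delivered is dissipated as heat
    print(f"{load_w:.0f} W load -> {WALL_DRAW_W:.0f} W at the wall, {wasted_w:.0f} W as heat")
```

The extra 5 W of USB load costs nothing extra at the wall here; it simply displaces 5 W that would otherwise have been burned off as heat.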

That last case is purely hypothetical: the PSU would actually consume less power at a higher load. As @Marcks Thomas said, this is not something you’ll observe in a practical power supply, but it is theoretically possible, and it shows that the instinctive TANSTAAFL rule can’t always be applied so easily.

Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.