Yup. Usually the device being charged can scale down the power throughput so it’s not getting 60W+ if it’s not able to handle it.
That’s the core of charging management: The charged device controls the process, not the charger.
Anything else won’t work if you think about it.
In this thread: people who don’t understand what power is.
Power isn’t something that is “pushed” into a device by a charger. Power is the rate at which a device uses energy. Power is “consumed” by the device, and the wattage rating on the charger is simply how much it can supply, which is determined by how much current it can handle at its output voltage. A device only draws the power it needs to operate, and this may go up or down depending on what it’s doing, e.g. whether your screen is on or off.
As long as the voltage is correct, you could hook your phone up to a 1000W power supply and it will be absolutely fine. This is why everything’s OK when you plug devices into your gaming PC with a 1000W power supply, or why you can swap out a power-hungry video card for a low-power one, and the power supply won’t fry your PC. All that extra power capability simply goes unused if it isn’t called for.
The “pushing force” that is scaled up or down is voltage. USB chargers advertise their capabilities, or a power delivery protocol is used to negotiate voltages, so the device can choose to draw more current and thus power from the charger, as it sees fit. (If the device tries to draw too much, a poorly-designed charger may fail, and in turn this could expose the device to inappropriate voltages and currents being passed on, damaging both devices. Well-designed chargers have protections to prevent this, even in the event of failure. Cheap crappy chargers often don’t.)
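To make the “devices draw, chargers offer” idea concrete, here’s a rough sketch (plain Python, with made-up profile numbers and function names, not actual USB-PD message handling): the charger advertises voltage/current profiles, the device picks one it supports, and then draws only the current it actually needs.

    # Sketch only: a charger *offers* profiles; the device chooses and draws.
    # Hypothetical profiles a 65W charger might advertise: (volts, max amps)
    SOURCE_CAPABILITIES = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 3.25)]

    def pick_profile(offered, supported_voltages, power_needed_w):
        """Device-side choice: best usable profile among voltages it supports."""
        usable = [(v, a) for (v, a) in offered if v in supported_voltages]
        if not usable:
            return (5.0, 0.5)                        # nothing matches: plain 5V
        v, a_max = max(usable, key=lambda p: p[0] * p[1])
        amps_drawn = min(power_needed_w / v, a_max)  # draw only what's needed
        return (v, amps_drawn)

    # A phone that needs 10W and only supports 5V and 9V input:
    volts, amps = pick_profile(SOURCE_CAPABILITIES, {5.0, 9.0}, 10.0)
    print(f"{volts:.0f}V at {amps:.2f}A = {volts*amps:.1f}W")  # 9V at 1.11A = 10.0W

The 1000W-PSU point falls out of the same arithmetic: a bigger charger just raises the max-amps numbers in the list; the device’s draw doesn’t change.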
Oh, please enlighten us, oh wise one. You might want to google “power draw” before you reply.
Have you ever heard the story of Darth Plagueis the Wise?
I guess he should have included the subtitles.
Great write up! Definitely filled in the info I only know a little about.
Not usually, but all the time. It’s part of the USB standard to negotiate the power that the device and even the cable can handle.
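For the cable part, a hedged sketch of the idea (assuming the common rule of thumb that a USB-C cable without an e-marker chip is treated as a 3A cable; the function names and numbers are illustrative, not spec wording):

    # Sketch: negotiated current is capped by what the cable declares it can carry.
    def max_cable_current(has_emarker: bool, emarker_rating_a: float = 5.0) -> float:
        # Assumption for illustration: cables without an e-marker are limited to 3A.
        return emarker_rating_a if has_emarker else 3.0

    def negotiated_power(voltage_v: float, charger_max_a: float, cable_a: float) -> float:
        return voltage_v * min(charger_max_a, cable_a)

    # A 20V/5A-capable charger through a plain (non-e-marked) cable:
    print(negotiated_power(20.0, 5.0, max_cable_current(has_emarker=False)))  # 60.0, not 100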
When all USB could do was 5V, I already didn’t trust any charger but mine - I couldn’t believe people dared to plug their devices into any public USB charger to charge.
Now that they can go up to 20V, and we have to trust everything will work with the negotiation and wiring to get the right voltage, it’s even scarier!
Will go up to 48V (240W) with the next USB-PD standard.
But as long as it’s reputable hardware that actually implements the standard, I’m not too worried.
48V and we’re back to POTS (plain old telephone service) voltages :-)
I agree, but that’s the problem: even with reputable hardware, glitches happen. Old 5V-only chargers would need many more things to go wrong to fry our devices. A 20V (or 48V!) one is just one small (SW or HW) glitch away from zapping a device that doesn’t support such voltages.
The USB standard is usually really robust and the chance of SW errors is small. If you have a laptop from a good brand, it will probably come with a very reliable charger as well. I really don’t worry about it.
Is there some exception to USB-C I’m not aware of? Am I putting myself in danger using high-power chargers to charge low-power devices?
No, they do a handshake through the USB connection and negotiate the best charging wattage.
And to add: if the handshake fails, or no common voltage can be agreed on, it will stay at 5V.
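A tiny sketch of that default behaviour (same simplified mental model as above, not spec wording):

    # Sketch: with no successful negotiation, the port behaves like legacy 5V USB.
    def charge_voltage(negotiated_v=None):
        return negotiated_v if negotiated_v is not None else 5.0  # fallback to 5V

    print(charge_voltage())      # 5.0  - handshake failed or never happened
    print(charge_voltage(20.0))  # 20.0 - only after an explicit agreement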
If you use really cheap 3rd party chargers there is a possibility.