There are articles I won’t click on based on the title alone:
- it asks a yes/no question
- it contains the word ‘expert’
Guess which one this article falls into.
Why not both?
I don’t know…
Yes.
You could save yourself cents per year!
They don’t actually say how much power a charger uses on standby, and make an unsubstantiated claim that they “wear out” due to “voltage fluctuations”. Sure thing.
Absolutely pointless article.
I would argue that you are much more likely to break the cable physically by constant unplugging and replugging.
Or wear out the wall switch.
I have never heard of someone having to replace their wall outlets due to wear and tear.
I’ve encountered a number of outlets in American airports that should be replaced due to wear. They have very little friction on the prongs after millions of uses.
Life pro tip: bend the prongs a little to give your device more grip if you encounter outlets like this.
Pro tip: bend the wall where the outlet is to assert dominance.
As someone who just moved into a 1965 house - yep, plugs absolutely wear out. These are some sloppy bois.
Around here, people usually have to replace the wiring in such old houses, since they tend to only have two wires (i.e. no PE / protective earth). But the Schuko sockets themselves are most likely fine.
I have had to do that - so now you have heard of someone… My house was built in 1973, and some of the outlets in locations where I believe previous owners would have plugged/unplugged often have worn out, so I had to replace them (think kitchen appliances or vacuum cleaners - the same outlets I’m using all the time). There are other outlets that still work, but they don’t grip plugs as well as they should anymore, and I’m planning to replace those too. Despite the above, the vast majority of outlets I’ve replaced have been perfectly fine, but with young kids around I wanted modern TR (tamper-resistant) outlets anywhere the kids are likely to be playing.
> You could save yourself cents per year!
That’s pretty much it. Maybe even tens of cents. In the pre-USB era that actually made sense: Nokia chargers with a barrel jack (and other wall-warts of that era) could consume several watts at idle, but modern USB bricks (assuming good quality) are way more efficient. They still consume a non-zero amount of power when plugged in, but you’re not going to see that on your power bill. You’ll waste far more energy if you leave your bathroom lights on overnight, even with LED bulbs.
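To put rough numbers on that, here’s a back-of-the-envelope sketch (Python). The idle draws and the $0.30/kWh price are assumptions for illustration, not measurements:

```python
# Back-of-the-envelope standby cost. The idle wattages and the
# electricity price are assumed figures, not measurements.
HOURS_PER_YEAR = 24 * 365   # 8760 h
PRICE_PER_KWH = 0.30        # assumed electricity price, $/kWh

def annual_cost(idle_watts: float) -> float:
    """Yearly cost of leaving a device plugged in at a given idle draw."""
    kwh = idle_watts * HOURS_PER_YEAR / 1000
    return kwh * PRICE_PER_KWH

print(f"old barrel-jack wall-wart (2 W): ${annual_cost(2.0):.2f}/yr")  # ~$5.26
print(f"modern USB brick (0.1 W):        ${annual_cost(0.1):.2f}/yr")  # ~$0.26
```

So the old wall-warts were arguably worth unplugging; a decent modern brick really is tens of cents a year.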
Tbh, you are supposed to ALSO unplug EVERYTHING else you aren’t using to actually start saving money.
In the EU, they are not allowed to consume more than 0.5 watts on idle, and this regulation has been in force since 2008.
Since pretty much everybody designs for that, I expect this norm also benefits other countries. So this is not really an issue, unless you are in a country without such a regulation and you buy some cheap off-brand charger.

https://energy-efficient-products.ec.europa.eu/product-list/standby-networked-standby-and-mode_en
Since the standby power is so low, the wear is most likely insignificant too.
An idle unit that draws 0.5 watts constantly for a month consumes about 1/3 kWh (quick check below), but since this regulation has been in force since 2008, I suspect idle draw has improved further for most devices. 0.5 is the maximum allowed value, and most manufacturers would prefer to stay below it to not get into trouble.

I unplug the switching power supplies when I’m not using them because I don’t want to listen to the RFI that they produce.
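That 1/3 kWh figure is easy to verify; a minimal check (Python), assuming the full 0.5 W cap over an average 730-hour month:

```python
# Sanity check: monthly energy at the EU 0.5 W standby cap.
watts = 0.5            # maximum allowed standby draw under the EU rule
hours_per_month = 730  # average month: 8760 h / 12
kwh = watts * hours_per_month / 1000
print(f"{kwh:.3f} kWh per month")  # 0.365 kWh, roughly 1/3 kWh
```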
The switch on my power strip broke before any of my USB chargers did, from turning it on and off every day. So I stopped turning it off and on, and it won’t break again. I’d rather waste cents of idle power than have to buy a new power strip every 6 months. Cheaper and easier.
Has any study been done on how efficient they are as heaters? The electricity they use when idle doesn’t vanish; it’s given off as heat. In the winter it might be worthwhile not to bother unplugging them, because what they give off could offset what other, more conventional heat sources would otherwise provide. i.e. you leave a charger plugged in, your house heating goes off half a second sooner, and you save the pennies there that the charger costs otherwise.
Admittedly, this doesn’t apply to summer and hotter climates, so most people, most of the time, probably ought to be unplugging them, but there’s a small percentage of cases where the reverse might actually be beneficial.
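A rough sketch of that offset logic (Python). The heating-season fraction and the resistive-electric-heating assumption are mine, purely for illustration:

```python
# Sketch of the "charger as heater" offset argument.
# Assumption: resistive electric heating, so 1 kWh of charger standby
# heat displaces exactly 1 kWh of heater output during the season.
PRICE_PER_KWH = 0.30     # assumed electricity price, $/kWh
IDLE_WATTS = 0.5         # EU standby cap
HEATING_FRACTION = 0.4   # assumed fraction of the year the heating runs

annual_kwh = IDLE_WATTS * 8760 / 1000               # ~4.4 kWh/yr
unoffset_kwh = annual_kwh * (1 - HEATING_FRACTION)  # only this part is waste
print(f"gross cost: ${annual_kwh * PRICE_PER_KWH:.2f}/yr")    # ~$1.31
print(f"net cost:   ${unoffset_kwh * PRICE_PER_KWH:.2f}/yr")  # ~$0.79
```

With cheaper heat sources (gas, a heat pump) the offset per kWh shrinks, so the net saving is smaller.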
100% is the typical claimed number, as it is easy to measure watts of electricity in and find that it is exactly equal to watts of heat out - or if not, the difference is easily explained by measurement error. There is no hypothesis (much less theory!) of where the energy could go if it isn’t heat, and conservation of energy is enough to conclude 100% efficiency.
The above isn’t the whole story, though. If you could somehow measure watts from the power plant output, you would discover that 4-12% (depending on a bunch of factors) of the energy is lost before it even gets to your house, so your efficiency goes down. If you measure fuel into the power plant, the figures range from 10% (old systems from the 1920s, only run in emergencies) to 60% (combined-cycle power plants) - I’m not sure how you would measure wind and solar. Eventually the universe will die a heat death, so 0% in the long run.
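Chaining those losses is just multiplication; a quick sketch (Python), with the particular values picked from the ranges quoted above:

```python
# End-to-end efficiency of "charger as heater", fuel to room heat.
# The specific values are picked from the ranges in the comment above;
# real numbers depend on your grid and generation mix.
plant_efficiency = 0.60  # combined-cycle plant (the upper figure quoted)
grid_loss = 0.08         # within the quoted 4-12% transmission loss range

end_to_end = plant_efficiency * (1 - grid_loss)
print(f"fuel -> room heat: {end_to_end:.0%}")  # ~55%
```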