> Our power grid is very reliable. Looking at availability information from “uptimed”, my home file server has been powered on for 99.97% of the time in the last 14 years. That includes time spent moving house and a day when the house power was off for several hours while the kitchen was refitted!
> However, in December 2023 a fault with our electric oven popped the breaker for the sockets causing everything to be harshly powered off. My fileserver took it badly and one drive died. That wasn’t a huge issue as it has a redundant filesystem, but I didn’t like it.
> I decided I could afford to treat myself to a relatively cheap UPS.
Just from that, I'd say he should for sure have avoided UPSes and gotten a "real" surge protector. The "shockwave" from that high-amperage short is far more likely to have done the damage than the mere loss of power. And the output of cheap UPSes is often far worse than the input.
[Edit: And good surge protection is never a feature of cheap UPS's.]
Pure Sine-wave UPSs are massive overkill and offer no advantage for any normal consumer.
If you have incredibly power-sensitive $100k equipment that needs protecting, maybe. But not for normal consumer electronics. None of them are that picky about the power.
I did a deep dive into this when buying a UPS last year, and ended up with a sine wave UPS. But just now I couldn't find my sources for why I made that choice. I believe it was something to do with the new ATX 3.0 PSU standard.
I did find this older thread[1], but the answer is more of a "maybe if your PC or server is using an active PFC PSU." However, this is out of date (2015), and may no longer be correct.
Cyberpower's website says PSUs with active PFC require pure sine wave.[2] But that might be marketing. IDK, seems like it's something that may need to be tested to be sure.
It's a sign of quality. If you get a sine wave UPS, it is probably more robust in general. OTOH, for a conventional UPS, unless you're knowledgeable and do lots of research, it can be hard to tell a good UPS from one where the lack of sine wave is just the tip of the cost-cutting iceberg.
That's not my experience. Modern desktop PC power supplies either don't work with or trip up non-pure-sine-wave UPSes. I don't know the technical specifics, but it has to do with active PFC on power supplies.
I've used square wave UPSes on my PCs for as long as I can remember, with power supplies from various manufacturers, and I have never had a problem. I have heard this before, but I suspect it's either a myth or it was only ever a problem with very rare combinations of poorly-made UPSes and poorly-made power supplies.
I was forced to change UPS relatively recently (a few years ago) because the cheap but not bargain-basement PSU on my computer didn't work with the non-sine-wave output of my UPS. Call them poorly-made if you like, but either I got unlucky or they're fairly common.
Power supplies tend not to advertise their compatibility with non-sinewave power, but UPSes will certainly make it clear if they produce sinewave output. So the safe option is to get the UPS which gives the PSU what it's expecting.
The quoted material sounds like a poor analysis. It doesn't take into account all the times the UPS would have failed while the grid was fine. UPSes are not pure goods with no downsides that merely add to availability. Commercial UPSes are very unreliable, bordering on a scam.
Yea, a disturbing number of reviews on Amazon for the lower-end (still 1350/1500VA though!) UPSes contain words like "smoke" and "fire".
My own experience with UPSes didn't involve smoke or fires, but it did involve remote equipment going offline for months until I had time to make a trip out to replace the UPS with a surge protector.
We recently upgraded to a whole house surge protector installed at the panel. They're not that expensive ($100-$200 around me, depending) and with the reduced quality of power we're seeing lately here and the damage it has done to our circuit breakers, it's become a necessity.
Can you be more precise about how a high-amperage short circuit can produce a "shockwave" of a form that would be ameliorated by a "real" surge protector?
I'm not an electrician by any means, so take everything I say with a grain of salt, but: in general, electricity flowing through a system has some amount of "inertia" (the technical term for this is inductance), and shutting off a circuit breaker will cause a brief voltage spike upstream of the breaker due to the flow of electrical current suddenly being forced to stop.
Have you ever unplugged a running appliance from the wall and noticed a brief spark at the plug? That's the same effect: motors in general have high inductance and try quite hard to fight changes in current.
It's bad enough that when designing circuits that switch power to inductive loads like motors, you have to include a flyback diode (read: one way valve) to allow a path for electricity to continue to flow internal to the load when the power is switched off, otherwise it will generate a voltage spike high enough to damage the switching circuit.
Put another way: it's not the "high amperage short circuit" that causes the voltage spike; it's the rapid stop to said high current that happens when the circuit breaker realizes what's going on and trips.
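For the formula-minded, a minimal sketch of the same point, assuming an ideal inductance L carrying current i (parasitics ignored):

    V_L = -L \, \frac{di}{dt}

The voltage across the inductance is proportional to how fast the current changes; force a large current to zero in microseconds and the induced voltage gets large.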
Eh, they do raise a valid point: Disconnecting voltage from a coil (which could be a motor winding or could be anything else inductive) generates back-EMF. It has to.
If there was a motor running (aircon, furnace blower, clothes dryer, whatever) when the main breaker tripped, then: Back-EMF is very likely to be a thing.
It's a simple thing, and it's how spark plug coils on car engines still work today: send (boring 12VDC in this case) power to a coil, and take it away rapidly. This essentially angers the coil and makes it produce a high voltage: high enough to jump the gap of the spark plug. The voltage maximum will be higher as the spark gap is increased, and lower as it is decreased.
And in the author's case, that back-EMF voltage will be sent to all of the household wiring. That's obvious on single-phase systems like Europe mostly has, and here it would also be the case even on split-phase systems like in North America (I don't know where the author was located), wherein: The electric range was reportedly faulted short, bridging the two hot legs together.
However, a modern household's electrical system is full of all kinds of stuff that is trying to use electricity. It presents a fairly low impedance. Things with linear power supplies (rectifiers and capacitors) will work to soak up any rise in voltage. The increasingly-common switch-mode power supplies that modern homes are full of tend to have MOVs on their mains input, as is also the tendency for even dollar-store power strips that modern homes tend to be littered with. (PC power supplies are also protected with MOVs.)
And the lower the impedance, the lower the voltage maximum of the back-EMF spike. It's got almost zero current behind it, and Ohm's law didn't just drop out of existence when the main breaker tripped.
This has the net combined effect of tending to reduce the back-EMF voltage from the ZOMG! level that it could be on an isolated test bench, to a complete non-issue status in a real household.
Our motor loads at home are not usually very beefy, and we've got MOVs (and power supplies with caps, including LED bulbs) scattered all over the place, and that all conspires to snub the back-EMF.
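A crude first-order bound on that, assuming the interrupted motor current I_0 has nowhere to go but the combined impedance Z of everything still connected:

    V_{peak} \approx I_0 \cdot Z

With a house full of low-impedance loads, capacitor-input supplies and MOVs, Z is small, so the spike gets clamped well before it reaches bench-demo levels.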
---
So what happened, then?
I think it's just a case of false causation here. The author is rightly proud of his 99.97% uptime, which is pretty cool -- but it also means that the hard drive had been spinnamathingin' nearly continuously for as long as 14 years.
An old hard drive that dies after a sudden power drop seems more likely to be caused by age, than by back-EMF smoking only that particular computer component and leaving everything else in the house unscathed.
All of us here have probably had hard drives die (this is how we learn about backups!), including hard drives that were working seemingly-fine yesterday and that did not work after turning the computer off (and back on again) today.
That is very normal and expected behavior for hard drives, which have always shared the common trait that all of them must die eventually.
There's nothing to see here. There is no smoke, and no mirrors, and no-one is behind the curtain.
Apologies - I know enough to do long-winded hand-waving on the subject...but if you want precise and concise (and really accurate), you'll need to ask a real electrical engineer.
I have one of the nicer APC units (for use in offices, not racks). Recently, when the power blipped, it went into backup mode and got stuck there, draining the battery. I had to physically unplug it and plug it back in to get it to turn back on.
I'm not sure if there's a decent UPS brand any more. I get the impression APC has been going downhill since the acquisition.
> I'm not sure if there's a decent UPS brand any more. I get the impression APC has been going downhill since the acquisition.
I’ve noticed a similar thing.
I had a 2005 Smart-UPS tower that never failed, but developed some transformer buzz that was annoying in an office environment—and hey, improved standby efficiency and an LCD panel would be nice—so I replaced it in 2013 with its latest equivalent. This one ran for about nine years and then started rebooting itself randomly, dropping the load each time. (It had no issues transferring and holding a load on battery, and self-test passed.) Its 2022 replacement now uses a non-standard USB-A male-to-male cable, is missing information that used to exist in the LCD menu, seems to have a problem charging from 98% to 100%, and has a broken event log (event 1 is always “Site wiring” and every other event is always “None”, even though there have been multiple power events, and there is no site wiring issue). It works, but QA issues are evident.
I’d previously tried a prosumer-grade CyberPower UPS (CP1500PFCLCD) on some other less critical equipment. When its battery failed after two years, it cut power to the load. When the charger failed a few years later, it cut power to the load. It died completely in about six years.
Tripp-Lite’s consumer grade stuff seems to work well enough for what it is, but their higher-end equipment seemed to all be designed for environments where noise doesn’t matter, which makes it a non-starter for an office. Eaton (their parent) seems the same.
So it would be great to know what is an actually good choice these days. At the moment I’m not in the market for anything and hopefully what I’ve got will run for another decade, but if it doesn’t, I have no good idea about what else to buy today.
> event 1 is always “Site wiring” and every other event is always “None”, even though there have been multiple power events, and there is no site wiring issue
Some UPSes are adamant about wanting live and neutral on specific pins of the power plug and will throw that error message if they're swapped.
If it's a reversible power plug, flip it around. If the power plug can only be connected in one orientation the outlet is likely incorrectly installed (which is not all that uncommon, as 99.99% of stuff will work perfectly fine with live and neutral swapped).
> I’d previously tried a prosumer-grade CyberPower UPS (CP1500PFCLCD) on some other less critical equipment. When its battery failed after two years, it cut power to the load. When the charger failed a few years later, it cut power to the load. It died completely in about six years.
I've never (n=4) had a stock CyberPower UPS battery last me 2 years. 3rd party replacement batteries have all lasted significantly longer than what shipped in the box.
Mine has been running fine for 4 years now. It's a PR1500ELCD so one step up from the CP1500PFCLCD. I also bought it directly from the official distributor in my country if that matters.
No one on this site has a large enough sample size to speak authoritatively about the quality of consumer-oriented UPSes.
In my home right now, I have an APC (Schneider) Back-UPS Pro 1000, Tripp Lite (Eaton) AVR550U, and CyberPower CP1500PFCLCDa.
I purchased the APC in April 2017, the CyberPower in Nov 2020, and the Tripp Lite in Sep 2021.
The APC was originally for all my IT gear but for some reason that load caused it to cycle to battery at random times. I replaced it with the CyberPower which works fine with my IT gear. Meanwhile the APC works fine with my AV gear where I re-purposed it.
Both of those are still on their original batteries which I test every 6 months.
The Tripp Lite I use at my desk and I also test it every 6 months, but last month the power went out and the Tripp Lite immediately shut down. The battery had failed w/o warning and the Tripp Lite battery-monitoring is apparently useless since it still thought the battery was good when power was restored, but an actual load test proved otherwise.
When I replaced the battery, the Tripp Lite had a cheap Chinese battery in it. I replaced it with a Duracell (manufactured in Vietnam). I've had less trouble with non-Chinese UPS batteries.
Another fun thing: Tripp Lite has reused the model "AVR550U" for three different UPSes, all of which use a different battery. Turns out my version uses the quite common APC RBC2-sized 7Ah battery.
Which is all to say: I've had a lot of UPSes over the years and you can't make any claims about any single manufacturer. They all manufacture a range of models and none of us have any actual data about failure rates. The best you can do if you have the expertise is to take one apart and evaluate its design and manufacturing quality.
I have an Eaton 5P1150i that was about half that. It has plenty of capacity to run my NAS, webserver, router, switch and one access point for about half an hour. IIRC I paid around €450 for it.
Have low-end UPS units gone lithium iron phosphate yet? That's the battery technology of choice now for non-portable applications. Cheaper and longer-lived than lithium-ion, no thermal runaway risk, about the same energy density per cubic meter, but twice the weight.
Such UPSs are available in small rackmount units.[1]
The usual serious players, such as Tripp-Lite, have LiFePO4 battery powered units.
Searching for consumer-grade LiFePo4 UPSs turns up many articles on how to replace crappy batteries in UPS units with newer battery technology. The batteries themselves are cheap now.
APC doesn't seem to have caught up yet.
> Have low-end UPS units gone lithium iron phosphate yet?
no, and afaik it's a cost thing. the unit you link is "request for quote" and being a rackmount unit it's probably not what consumers would think of as inexpensive.
I have three of the classic fire-hazard cyberpower 1500PFCLCD and each of them has a pair of small sealed lead-acid (SLA) batteries that cost maybe $30-40 each to replace. If you assume that the OEM gets them for $20 apiece, how much LiFePO4 battery does that buy? I know the big industrial cells are cheaper than what you can get as small individual cells, but I'm positive it's still going to be 4-10x more expensive.
now remember that's a unit that retails for $135-150, and there are like 3-4 tiers of units below that... how much LiFePO4 can you buy for $10 to put into that $50 UPS?
(fake edit I looked one of them up from the datasheet and it's $700 for the lowest-tier 1000VA unit lol)
> (fake edit I looked one of them up from the datasheet and it's $700 for the lowest-tier 1000VA unit lol)
That's the bigger issue, that there's nothing medium price.
I can get a LiFePo4 battery that can do well over a kilowatt for an hour for <$150 with free shipping, and I can get an entire UPS for <$100, but the combination costs so much more.
It’s not only a cost thing: they don’t have the power output for the same form factor. You can technically swap those 12V LiFePO4 “replacements” into a cheap UPS, but they don’t put out nearly as many peak amps as good old SLAs. Lithium ion and polymer can, but they also run the risk of blowing up. I’ve played with basically all these chemistries doing model airplanes and DIY drones.
I thought the bigger concern was that the charge and duty cycles of the different battery chemistries were incompatible, so it was not a good idea to swap lithium-based batteries into lead-acid UPSes.
One of those modding articles discusses how to reprogram the UPS's battery manager with numbers suited to LiFePo4. A bit iffy, that; who's validated those values?
Most of the packaged-up drop-in replacement lead acid to LiFePO4 batteries I am seeing aren't just a group of raw cells, but have a battery management system internally. Even the least expensive should be able to protect the cells from overcharging without modification to the UPS.
From what I read, it's not the overcharging or the charging that is the issue; it's that the cutoff voltage below which a LiFePO4 battery is harmed is higher than for lead acid, so you have to have a BMS for the LiFePO4, which increases the cost and can cause the UPS to miscalculate runtime, as the pack may cut off when the UPS thinks it still has 10% remaining.
But if there is a hack to recalibrate them, then great. I'll do some searching for it now as I have two little bms's that need batteries.
It's very likely their switching time between wall power and battery isn't quick enough to keep a computer on. With ATX you need to switch over within about 16ms or the computer will shut off.
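For anyone wondering where a number like that comes from: it's set by how long the PSU's bulk capacitor can carry the load after mains drops out. A generic hold-up-time estimate (not taken from the ATX spec itself; C, V_bulk, V_min and η are illustrative symbols for the bulk capacitance, its charged voltage, the minimum input the regulator tolerates, and converter efficiency):

    t_{holdup} \approx \frac{\eta \, C \left(V_{bulk}^2 - V_{min}^2\right)}{2 \, P_{load}}

If the UPS takes longer than that to transfer, the PSU's output sags and the machine resets, which is why transfer time matters more than battery capacity for this failure mode.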
Ecoflow switching time is listed at “<30ms” and the new Anker power stations are listed at “<20ms” so not the speed you’re looking for. APC’s tend to be listed at less than 12ms.
Energy density isn't the issue. Need to either buy the best cells or oversize the pack to meet normal loads. Both options cost additional money.
Go look at the spec sheets on the cheap ones: they aren't power-dense enough to build a small (1000-1500VA) UPS with comparable, or even double, the runtime of the lead-acid ones.
Lithium isn't common for 120V units, but I have several APC CP12142LI units for 12V devices. (No USB monitoring, and they're not terribly price-competitive for the capacity, but they've worked well for me.)
I have one of these APC Back-UPS 900 (BR900GI) units here, and it works perfectly. Fully supported under apcupsd, that device has saved my bacon quite a few times. The only thing is that you MUST use the USB cable they provide.
The connector on the back is essentially a non-standard serial connector, and the USB cable is a USB-serial adaptor. You can probably make another work, but only by making an adaptor cable yourself. They have some models where the USB-serial adaptor is integrated into the unit, which is generally a bit easier to use and a more typical approach.
(Generally, such a protocol is a better option from a compatibility point of view: it's usually fairly easy to reverse engineer. It would be interesting to sniff what the Windows software is doing, to work out what protocol or behaviour difference is responsible.)
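If anyone does want to poke at the serial side from Linux, a minimal probe might look something like the sketch below. This is purely a starting point: the device path, baud rates and probe bytes are guesses, not anything documented for this model.

    #!/usr/bin/env python3
    # Dump whatever an unknown UPS serial interface replies with.
    # Requires pyserial (pip install pyserial). All values are guesses;
    # adjust the port, baud rates and probe bytes for your own unit.
    import time
    import serial  # pyserial

    PORT = "/dev/ttyUSB0"             # assumed device path for the USB-serial cable
    BAUD_RATES = [2400, 9600, 19200]  # common candidates, not confirmed
    PROBES = [b"\r", b"Y"]            # arbitrary probe bytes

    for baud in BAUD_RATES:
        with serial.Serial(PORT, baud, timeout=1) as ser:
            for probe in PROBES:
                ser.reset_input_buffer()   # discard anything stale
                ser.write(probe)
                time.sleep(0.5)
                reply = ser.read(128)      # grab up to 128 bytes of response
                print(f"{baud} baud, sent {probe!r}: got {reply!r}")

Comparing what the official Windows tool sends (via a serial sniffer) against what apcupsd sends would narrow down which behaviour difference matters.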
APC doing horrible things with "standard" cables isn't new, though - they used a nonstandard null modem serial cable back in the day, such that if you tried to connect to the UPS using a standard null modem cable, it'd trigger shutting off the UPS.
This isn't to say it's good, just that it's not new.
That always drove me nuts, because the underlying issue is a great idea, frustrated by its implementation.
The serial port serves a dual purpose. Alongside the obvious serial, you can also provide some inputs by grounding certain pins, and some outputs by sinking current from others. This has got me through some unorthodox integrations by having the UPS signal that it's on battery by having these "simple signals" close a contact/relay on the load. It's not quite a full suite of dry contacts, but it's pretty serviceable.
The problem is that 9 pins doesn't leave many spare, so to achieve this they've repurposed some. I really wish they'd implemented this as an alternate mode, so you'd, for example, short TX to RX to change the behaviour, instead of having the 'alternate behaviour' sitting on standard DTR/RTS pins.
Getting industrial interfaces at SOHO prices was awesome. The unintended consequences .. less so.
Mine has a USB-A to RJ45 cable with a ferrite bead on it; as far as I can tell it just wires the four USB wires (no ground) to the pins, with spaces in between. I have never understood why they do this.
It works perfectly with apcupsd though, even showing up in my desktop power settings as a UPS.
It’s baffling. When APC first added native USB support to their Smart-UPS series, they chose a standard USB-B port. When they introduced “SmartConnect” a few years ago, this port was replaced by a USB-A port which requires a non-compliant USB-A male-to-male cable for monitoring. The only ‘good’ reason I can think of for this change is that someone realised that these new cloud-connected UPSes would be easily bricked by some bad firmware update (since apparently that’s a feature now), and wanted to use USB thumb drives as a recovery mechanism, but I have no idea. I think even though they have a built-in Ethernet port now, one still needs to buy their network management card for non-cloud remote management, so it could just as easily be another attempt at some weird vendor lock-in.
APC was the only UPS company to have their device drivers natively included with Windows 95.
That was before USB was available, so they used regular COM port communication. I'm not surprised that, with that early foothold, there was no reason to change and sufficient inertia to continue ever since.
At the time it seemed like the unnecessarily non-standard RS-232 cable was just there to create a "profit center" from the cables themselves, along with the non-standard replacement batteries. They would have been a lot quicker to market and had lower up-front engineering costs if they had used standard batteries from the beginning.
It's good to recognize early when excessively high TCO is the primary feature around which a product (or company) is designed. There can be significant PR effort from the beginning to divert any perception of anti-consumer attitude.
Lots of stealth can be involved to muddy the comparison with alternatives which offer normal TCO or well-engineered low-target TCO.
I had a 20-year-old APC start misbehaving on me a few months back, and new batteries didn't fix it. After I messed up the repair on the first one, a second one started doing the exact same thing. I immediately took it apart and, now an expert on how not to fix them and what the root cause was, swapped every cap on the board that my ESR meter claimed was bad. It's now back in operation, but while I was testing it, I realized it's so inefficient that about 10% of its max rating is just going up in heat. I assumed the replacement for the first one would be much more efficient, and it is, but it's still dumping about 7% of its energy into heating the transformer.
OTOH, the older UPS behaves much better with apcupsd, going so far as giving me the firmware versions, battery replacement times, better power monitoring, etc. While the new one works, it's missing most of what the older UPS could do (although it's not an APC).
So I'm sort of on the fence about just using "backup generators" from Amazon/etc., because one of the ones I have has about 5x the capacity of the new UPS and seems to be able to switch over without the computer plugged into it having an issue. Plus, its power waste is less than 1%.
The batteries are definitely worth more than the unit is at this point.
A backup generator for me is not an option but this is mostly just to keep internet and a couple machines going in the event of a power failure. One day maybe the used lithium ion one will pop up and I will swap it out.
If you don't actually need the ~2 kW peak power output, there are a bunch of off-brand 100-200W power stations with 50-150Wh batteries that don't cost much more new than a new set of batteries for that old UPS (examples from a quick search: a Flashfish 40800mAh solar generator at $129, a Powkey 200W at $100). The Wh ratings are suspect on many of them, though, and they aren't online/line-interactive (which is why they are more efficient), so you lose the power conditioning since they are effectively offline UPSes.
I see people dumping old Jackery, EcoFlow, etc. models for 1/3 of retail on Craigslist somewhat regularly, although those tend to be larger models. It's a bit of a mystery to me why some of them aren't advertised as having UPS functions if they have fast switch times, which many of them do since they don't have physical relays. My old line-interactive UPSes all have physical relays which click when they switch over; the assumption is that the energy in the transformer is sufficient to cover the switching time, but of course they do brown out a bit when near their current limits.
I had an AMD-based home server a decade ago. I connected an HP printer, and it would frequently cause disconnect and reconnect events in dmesg, sometimes ending up in some error state where Linux would just give up trying to identify the device. After trying different ports and cables I eventually, out of desperation, threw in an unpowered old USB 2.0 hub and lo and behold, the connection to the printer was rock solid.
From my own adventure in this area, the battery is very standard and IMO reasonably priced. What gets expensive is shipping it, since lead is so heavy. I remember having trouble finding local sellers for them that would have avoided the shipping cost.
Also consider how the size of the UPS affects the battery replacement cost. I purchased my most recent UPS, refurbished by CyberPower, from its eBay store <https://www.ebay.com/sch/i.html?_ssn=cyberpower>. For a few dollars more I could have bought a larger size, but
* I didn't need more capacity (the one I bought will power the server it serves for an hour, far more time than I need or want) and, more to the point,
* The larger-size model has two batteries to replace, significantly increasing the cost.
I wish the battery layout and amp-hours were more obvious in UPS sales listings. The watts/VA are always front and center, but generally I want to know runtime, which is harder to find and compare.
Also these brands are just using off the shelf batteries anyway (eg: a Cyberpower model using two 12v/8Ah SLA batteries connected together[1], with various OEM brands seen on the labels). Searching online discussions one finds which OEMs make better quality batteries.
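Since vendors rarely publish comparable runtime figures, a back-of-envelope estimate from the battery specs is often the best you can do. A rough sketch, where the inverter efficiency and usable depth of discharge are just placeholder assumptions (and the Peukert effect, which makes real SLA runtime at high loads worse, is ignored):

    # Rough UPS runtime estimate from battery specs; the efficiency numbers
    # below are assumptions, not measurements.
    def estimate_runtime_minutes(batt_voltage_v, batt_capacity_ah, num_batteries,
                                 load_w, inverter_efficiency=0.85, usable_fraction=0.8):
        """Optimistic upper bound: SLA capacity falls off sharply at high
        discharge rates, so real runtime will be lower."""
        energy_wh = batt_voltage_v * batt_capacity_ah * num_batteries
        usable_wh = energy_wh * usable_fraction * inverter_efficiency
        return usable_wh / load_w * 60

    # Example: two 12 V / 8 Ah SLAs (like the CyberPower model mentioned above)
    # feeding a 150 W load.
    print(round(estimate_runtime_minutes(12, 8, 2, 150)))  # ~52 minutes, optimistically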
First, it's extra work, but it's possible to configure apcupsd to run a script on events, then use that script to ignore spurious events.
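As I understand it, apcupsd's apccontrol will run a user-provided executable named after the event (e.g. /etc/apcupsd/onbattery) if one exists, so the filtering can live there. A sketch of such a handler, with the grace period, paths and logging purely as placeholders:

    #!/usr/bin/env python3
    # Hypothetical /etc/apcupsd/onbattery handler (must be executable).
    # Waits out short blips, then re-checks the UPS status with apcaccess
    # before treating the event as real. All thresholds are arbitrary.
    import subprocess
    import sys
    import time

    GRACE_SECONDS = 30  # ignore transfers shorter than this

    def still_on_battery() -> bool:
        out = subprocess.run(["apcaccess", "status"],
                             capture_output=True, text=True).stdout
        return "ONBATT" in out

    time.sleep(GRACE_SECONDS)
    if not still_on_battery():
        sys.exit(0)  # spurious blip; do nothing

    subprocess.run(["logger", "-t", "ups", "still on battery after grace period"])
    # ...notify, or start an orderly shutdown, here...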
Second, for people who can't necessarily afford either new UPSes or something other than "consumer grade", a good option is to find local sellers, perhaps on Craigslist or Facebook Marketplace, who are selling or giving away old UPSes. Many people and businesses don't bother replacing batteries, so they just buy new devices when the batteries near or reach end of life, and since they're heavy, they often just want to give the hardware away.
New batteries are not expensive. A common 9AH battery for many / most common smaller UPSes costs around $25 USD from Amazon sellers that have 4.5 or more stars.
I've personally never bought a new UPS for myself, yet all of my equipment and everywhere I've set up home Internet equipment for others has UPSes. I've only ever bought batteries. Just a thought.
We get a lot of power failures. I use laptops, but for other non-battery powered equipment, I have a BN1500M2-CA with a raspberry pi running debian plugged into one of the USB charging ports and the USB Data Port cable connected from the UPS to the raspberry pi. It keeps my ISP's 185 watt router going for longer than the ISP's switches will stay up when the power is out. I get more than 3 hours with the router and a 4K monitor plugged into the UPS. I remember seeing NUL characters in the text of an old log, but otherwise apcupsd works well:
2024-01-30 08:15:13 -0800 Power failure.
2024-01-30 08:15:19 -0800 Running on UPS batteries.
2024-01-30 08:20:06 -0800 Mains returned. No longer on UPS batteries.
2024-01-30 08:20:06 -0800 Power is back. UPS running on mains.
2024-02-04 12:52:58 -0800 Power failure.
2024-02-04 12:53:04 -0800 Running on UPS batteries.
2024-02-04 12:54:04 -0800 Mains returned. No longer on UPS batteries.
2024-02-04 12:54:04 -0800 Power is back. UPS running on mains.
2024-02-18 12:10:13 -0800 UPS Self Test switch to battery.
2024-02-18 12:10:21 -0800 UPS Self Test completed: Battery OK
You are correct. The input on the wall wart is rated 100-120 VAC at 1.5 A (150 W to 180 W), but the output is only 12.0 VDC at 4.6 A: 55.2 watts.
The "xFi Advanced Gateway (XB7)" router is definitely warm to somewhat hot on the outside. The AC here varies from 118vac to as high as 126vac (maximum rating.) Right now it's at 122vac. I think my 185 watt number might have been 123.3vac x 1.5a, or maybe it was the previous "xFi Advanced Gateway (XB6)" which was also a chimney.
I am not an ISP. We work remotely — there is no fiber here yet.
Man I hate to be that guy... but I do in fact have a blade chassis in my basement that can pull 6kW. I didn't even blink at 185W for a router, one of my switches can pull 125-200W alone. Some of us really do have that stuff just sitting around.
I have had maybe 10 UPS units over the years but I would never buy another one. I was once severely poisoned by an APC unit with a leaking battery. It released an odorless vapor, had me in the hospital twice before I figured it out. Not a fan of APC quality anymore either. I don't trust the product, don't want to plug it into my equipment. Most everything I use now has a battery or will just reset on power failure anyway.
The model he is referring to is the last UPS I owned, and it had some very serious QA issues:
It came from a leak in the battery during charging. Exactly what the gas was is still unknown, but it knocked me for six a couple of days after I received the unit. I thought about getting it tested and even suing, but it took out my whole apartment and I just had to get rid of it and cut my losses. I was in the ER unable to stand and in quite a state, lost balance and coordination and was incoherent for a while. CAT scan with contrast, it was a whole production on Thanksgiving at 3am. They monitored me for a couple days and sent me back to the apartment, where all the symptoms immediately returned, and then I put it together and looked closer at the UPS. Underneath it were a couple of drops of black liquid that had come from the battery compartment, and after returning to the ER again to go throw up some more I went back and took it straight to the garbage. I think it was one of those types of gas that can affect you in the PPM range. Totally odorless. This was the enterprise APC UPS; the model was the SUA1500.
The SUA1500 is a pretty reliable unit and I highly doubt that's the issue. If there was a problem with the battery, those were poor quality batteries, but nothing to do with the UPS and not a reason to avoid them if you have a good use. I've also never heard of such a gas being produced that was odorless, do you know this is what happened for sure or are you just speculating?
I agree the SUA1500 is probably the best UPS APC made, certainly my favorite, but this happened. Surprised at all the doubt. I know because I smelled this liquid, it was odorless, and about 5 minutes later I was pretty violently ill. Could it have all been a series of coincidences? I would bet a million dollars against it, but technically, sure. So it went in the trash and I was fine. This was a while ago and the unit came with batteries, so that really isn't here nor there. In hindsight I should've made a big deal about it. I wrote a few angry emails and such, but I was mostly relieved that I didn't die at that point and couldn't care less about, well, quite a few hundred dollars at the time, if I recall. This was a while back now, somewhere around, I want to say, 2010ish. I had health insurance and no damages to show; lawyers are expensive. When you're actually in this kind of a situation very few people care to listen. There's nobody to call and very little to be gained from a fight anyway. I wasn't going to hand over the evidence to the company that made the product in exchange for another gd UPS. I had better things to do.
If it was the UPS then that may have been a rare allergy to something.
Lead-acid batteries contain H2SO4 and can emit H2S when being charged vigorously; the latter is definitely not odorless, and the former won't have a smell but instead produce a burning sensation if you did inhale any vapors (which is difficult, since it has a very low vapor pressure at room temperature.)
A UPS is really just a car battery and a voltage regulator. Actually, the cheaper APC models have no voltage regulator, which is why you should avoid them. It's obsolete technology. Modern regulated power adapters are enough; just use a strip with a surge protector. If you really need something to protect your gear, get a voltage regulator. But if the power drops frequently then maybe you do need a UPS, I don't know. If you really do, get the APC enterprise units. But I'd avoid it if at all possible.
Can anyone recommend a good modern UPS? It seems like Lithium Ion would have the best power density, but perhaps lead-acid is actually better for this application? Good surge protection would also be great. It would be nice if there existed a UPS brand that is like Anker for USB cables and power adapters or Brother for printers.
The Cyberpower series sold at Costco is unironically good. No frills or cool stuff, UI from 1995 but supports USB monitoring out of the box and just… works. I was about to make the brother comparison but after reading your entire comment realize you already did it.
Just FYI, Cyberpower, at least for their consumer line, use problematic yellow glue[1] which becomes brittle over time and can become conductive, causing arcing/fire hazards[2][3]. There was also a prior HN discussion about it[4], with some similar experiences.
It can be removed[5], with some effort, though. Given its use in various electronics it wouldn't surprise me if some other UPSes use it too, though I have only seen discussions about Cyberpower.
Lead acid is used for most UPSes. Lithium-ion is a bad battery type for a UPS since it has fewer cycles and worse safety, and power density doesn't matter. LiFePO4 would be perfect, with good cycle life, current delivery, and safety.
But manufacturers don't use it. Maybe they haven't redesigned since the price dropped. Or maybe because the price is still too high. I think LiFePO4 would be most useful for rack mounts, or for small UPSes that aren't practical with lead acid.
Regular li-ion has more recharge cycles than most lead-acid batteries that come with a UPS. Even high-power-density li-ion usually manages 400-500 recharge cycles before its capacity drops to 80% of the original. With smarter power management you can get away with much more (for example, if you don't fully cycle the battery).
Lithium in smaller UPS is mostly stuck in a marketing gap at the moment.
Generally lithium UPSes win on 10-year TCO, but SLA wins on up-front cost. Which leaves us with two issues: most consumers look at the sticker price, not the TCO, and no one wants to market their products as "Our lead-acid offering only /looks/ cheap, until you see what we charge you for consumables!"
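A toy version of that math, with every number below invented purely for illustration (not real product pricing):

    # Back-of-envelope 10-year TCO; all prices are made-up placeholders.
    HORIZON_YEARS = 10

    def tco(upfront, pack_price, pack_life_years):
        # replacement packs bought after the one that ships in the box
        replacements = max(0, -(-HORIZON_YEARS // pack_life_years) - 1)
        return upfront + replacements * pack_price

    sla_tco = tco(upfront=150, pack_price=160, pack_life_years=4)     # vendor cartridge pricing
    lifepo4_tco = tco(upfront=450, pack_price=0, pack_life_years=10)  # pack outlives the horizon

    print(sla_tco, lifepo4_tco)  # 470 vs 450 with these invented numbers

Swap the $160 vendor cartridge for a $40 third-party battery and lead acid wins again, which is exactly the sticker-price-versus-consumables gap being described.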
LiFePO4 can be had in the "portable solar generator" space, and some of them have enough input/output capacity to function as a usable UPS. All that they lack is comms for status and automatic shutdown.
It seems like a fairly recent shift -- heck, the whole solar generator concept is a fairly recent shift -- but maybe it's an indication that LiFePO4 devices are becoming more common in the consumer space.
Hey, thanks very much for the tip! I did some (random) searches and found, for example, this 7 pound device[1] that offers 300W output, 256Wh capacity, and 30ms failover at $180. Seems like a great little device that could serve a dual purpose as both a UPS and also a camping/project power supply if you can't have/don't want a gas generator. They claim a 10-year lifetime of the battery and 80% capacity after ~3000 cycles, which seems hard to believe, but maybe LiFePO4 is really some sort of miracle material.
No, LiFePO4 is actually pretty excellent in that way: It's long lasting, and not harmed nearly as much by charge/discharge cycles as many other types. 3,000 cycles is not an unreasonable claim, and it's a lovely chemistry in that way. (It also tends not to self-combust.)
It has other tradeoffs, though, like being physically large compared to a lead-acid battery of similar capacity (or, say, a bunch of 18650s).
And like other Lithium-based cells, longevity is maximized by never charging it to 100% and leaving it there long-term -- which, unfortunately, is probably not what Ecoflow is doing. (Ideal long-term state-of-charge is closer to somewhere between 50 and 80%.)
After having tried several brands, I like CyberPower for the combination of price/performance.
Things may have changed, but the APC UPS systems I've used required you to power off to replace the batteries. The CyberPower units I've used allow for replacing the batteries hot.
I've had terrible results with Tripp Lite UPS systems - where they fail for no reason at all, and will not auto power up after a power failure.
At a very large size, 30 kVA, I've used Eaton UPS systems and they have been rock solid.
In general, a UPS is just supposed to buy you time to assess the situation and shut things down gracefully. Power density doesn't come into the equation at all. It's just supposed to give you an hour to figure out what you need to do. It's all about the lifetime of the battery. I go with Cyberpower.
I put an oversized UPS with extended battery pack on my network gear (Cable modem, Router, ethernet switch and Wifi nodes) to give me almost 8 hours of runtime in a power outage. I've found, however, that the cable company's UPS only lasts around 6 hours since that's how long it takes for the internet to go out and the cable modem to lose carrier.
In the winter months (when we have most of our power outages), I pay for a pre-paid cellular SIM for backup. The cell tower(s) serving my area must be on a generator; they don't seem to go down even in long power outages.
That makes sense, network gear doesn't suck a lot of power. For compute though, I always have to have the conversation with clients that multiple hours of UPS backup for a server rack is crazy expensive. They are usually better off with a UPS that can take over while a generator spins up if they really need that kind of reliability. I always ask, what are the odds that your 8 hour power outage scenario isn't from something catastrophic like a flood or a fire, eg, if the power is out for that long, something is probably literally on fire and you've got bigger problems.
Are Anker cables considered good? I have a few, but recently one of my Anker USB-C cables started to only go up to 15W, making me think something broke with the USB PD chip.
Anker is good. The limit for USB-C cables is that normal ones go to 60W and e-marked ones go to 100W. The cables are passive except for the marker chip; it is the charger that does USB-PD. Do you have it plugged into a regular charger that does 15W, or a USB-PD charger that does more?
I guess you could have a bad cable where the CC wire has broken and can't do USB-PD.
Anker products are generally good (and they were the first to make consistently-good cables and chargers -- in the early smartphone days, everything non-OEM was junk), but things still break eventually.
Their customer service is supposed to be legendary. If you drop them a (polite) descriptive note, signed with your name and address, I'll bet you a beer that you'll have a new cable in your hands by the middle of next week.
Look for used enterprise-grade or at least higher-end office grade equipment on eBay and other surplus marketplaces, and install new batteries. Cyberpower is a good brand in my experience.
The only advantage to buying new is weight. Newer UPSes tend to be a lot lighter, but that's often because they cut corners on the magnetics and/or use expensive, hard-to-replace lithium batteries. Weight is a high priority for the manufacturer to optimize, but not for the consumer. When it comes to UPSes, old school big-iron transformers and heavy-metal batteries are the way to go IMO.
I have a BR700G and it works flawlessly with apcupsd, with minimal to no configuration... APC's really cheap units are for really, really desperate consumers; in my experience those batteries didn't last more than a year, unlike the BR* ones, which usually last 3 years. The BR* line is the better of both worlds: cheap, yet sophisticated enough to work "smart", without being as expensive as the SMT/SMC units.
Also, like everyone says... remember to look for pure sine-wave output if you are going to "protect" servers; for network equipment you could use the PWM-based ones.
Just a warning: there are differences in the output from UPSes with the cheaper ones' output not working well with higher efficiency power supplies. Better UPSes will output less 'jagged' AC.
I would never buy a cheap UPS. The one time I did so many years ago, it ended up catching fire. UPS fires are the most common cause of datacenter fires, and that's the "good" stuff. Adding a UPS is not a zero-risk action that has only upsides. You need to actually think about it.
At home, I have had a single rack w/ network and server gear for more than a decade and I always run two good quality UPS, that I run a test on annually, tied to two different circuits. My setup now is a bit annoying because I'm renting currently, but when I owned a home prior to my recent move I had an electrician run two 20A Dual Function (AFCI+GFCI) circuits with hospital grade outlets to power the two UPSes in the same room but on separate circuits and breakers at the panel. I don't think that's overkill, but I think anything more would be overkill at home. When I buy a house again, I intend to do the same.
Since buying better quality UPS and testing them regularly so I can replace batteries when needed I haven't had any fires, but using batteries past their useful lifespan can greatly increase risk of fire. Expect that you should replace the batteries every 3-5 years, and that they will cost nearly as much as the UPS itself. If you're not comfortable with that, invest in a quality surge protector instead and figure out how to deal with failures. UPS batteries are a big fire risk, so you need to understand it and work to contain it. Also, keep a good dry fire extinguisher near your rackmount equipment and get it tested annually.
For home use I bought a lead acid UPS specifically to avoid a fire. For a 5 hour power outage it makes no difference to me if the UPS powers my machine for 5 minutes or 20 minutes; as long as I can save my stuff and shut down it works for me.
> Judging by all these incorrect assumptions about electrical, I'd suggest stopping while you're ahead before you burn your own house down.
For anyone reading this, this person has no idea what they're talking about. Besides the fact that I once considered becoming an electrician and completed my apprenticeship prior to entering the tech industry, it is a published standard how things should be installed in residential properties in the US, called the National Electric Code or NEC.
The NEC 2020 states that GFCI is required (in addition to a general AFCI requirement) for any 125V or 250V receptacles in kitchens, bathrooms, laundry areas, finished or unfinished basements, garages and anywhere within six feet of a sink. Since my server rack was in a closet in a finished basement, it is /REQUIRED/ by code to have a dual-function breaker.
Also, I once again refer you regarding fire risk to simply search "UPS fire datacenter". UPS fires are the primary cause of datacenter fires. But sure, I don't know what I'm talking about, I've only been responsible for multiple datacenters spanning the globe for one of the world's largest hosting providers as part of my career after giving being an electrician a go and deciding against it as a career in my youth.
If anyone is ever in doubt about how to install something that's electrical: 1. Hire a certified electrician and 2. Read the applicable standard, which is the most recent published version of the NEC. Don't listen to people who make rude smart-ass comments on the Internet.
> UPS fires are the primary cause of datacenter fires
Do you have a source for this claim? I can find no documented evidence of it being true.
The Uptime Institute found[0] data center fires of any cause to be exceedingly rare. Data Center Incident Reporting network reached similar findings[1].
The only example I could find of anyone claiming data center fires have a principal cause is this article[2], and while I'm not saying it's wrong, I've never heard of these folks before and have no idea how credible they are:
"Electrical failures are the most common cause of data center fires. These failures can stem from overloaded circuits, malfunctioning equipment, or defective wiring, each capable of generating sufficient heat to ignite a fire when in proximity to combustible materials."
(Note: they break out battery fires separately, so they are not supporting your claim that batteries are the most common cause of data center fires.)
But, in any case, with details about data center fires being so rare, either because they are rare or because people don't want to talk about it, stating a primary cause accurately seems to me to be very difficult.
Lastly, since you've run multiple data centers, surely you understand the scale of the battery strings involved. Total energy storage is way higher, ampacity is higher, and to the extent lead acid batteries are involved, much, much, more hydrogen off gassing is possible. I just don't see how you can extrapolate that down to fire risk for a 12V, 12AH battery found in a typical home UPS.
The fact remains that GFCI/AFCI aren't really doing anything to address the UPS battery fire risk you're describing, while increasing the chance of problems that knock the power out.
Also the NEC has gotten ever more outlandish with those requirements. Like last time I looked there was no exception for sump pumps, yet you'd be foolish to actually use a GFCI there. Actually a quick search says it's now even explicitly required. rolls eyes. I mean sorry, I'd love to but I'm just too busy restraightening all the prongs on my plugs due to these wonderful "tamper resistant" receptacles.
UPS's are notorious for nuisance tripping GFCI's due to the extensive power filtering they usually have inside. It may not happen right away but is dependent on how 'dirty' your power is and on how sensitive your particular brand of GFCI is. All it may take is a neighbour using his AC or power saw and it could generate a spike large enough to knock your system offline (UPS filters spike to ground, leakage current trips GFCI which needs to be manually reset, UPS runs out of battery because no one's home to restore power).
This advice is specific to the US & Canada where GFCI's are calibrated to trip at 5 mA. In the UK, their RCD's trip at 30 mA so it's less likely.
If you read the fine print for your UPS and/or GFCI they will say not to use them together.
What about AFCI's? Well they are basically nuisance devices by design and most electricians hate them. Do not use unless absolutely necessary by code.
The difficulty comes into play because UPS's, servers, etc are essentially industrial equipment while the code is written for such common residential usages as plugging in a lamp (where a dog will chew on the cord, a good way to start a fire hence a perceived need for AFCI protection.)
What if you're stuck because you want to locate your UPS and equipment in an area where GFCI's are required (like a residential basement)? Well I won't tell you what to do in your own home but I'm sure you can devise a creative solution. :)
AFCIs feel like the NEC's attempt to create tons of service calls under the guise of "safety". And eventually, it seems, they will be required for new builds everywhere, as areas adopt the newer codes that require them slathered everywhere.
Of course it's "for safety", but of course these breakers are famous for false tripping and causing expensive service calls.
anecdote:
In my case, small server rack, on circuit a with an AFCI - since "bedroom". No problem.
Until the washing machine on circuit b (no AFCI) runs, then trips circuit a's AFCI. Repeatedly, every time (the debug readout on breaker a returns an ArcFault detection reason too). So... either a) don't wash your clothes, b) don't have tech, or c) quietly violate the code.
Indeed electrician quietly suggested(after a bunch of triage) I swap out the breaker with a normal one. But of course he wasn't allowed to do that... lol
Been here a decade - nary an issue since - so clearly the usual case of nuisance tripping and nothing more.
I suspect the switching power supplies were close to annoying the AFCI, and a beefy motor on an adjacent circuit was enough to push it over the line. Incidentally, I swapped UPSes, power supplies (quality Seasonic), etc., and nothing improved.
The issues with UPS products tripping GFCI breakers are due to leakage current that is out of spec for the GFCI circuit; this is almost always caused by the load attached to the UPS, not the UPS itself. While most UPS manufacturers do recommend against connecting one to a GFCI outlet, AFCI is required in all new construction for all circuits, and GFCI is required for many circuits, including any circuits in basement areas, which necessitates a dual-function circuit. Since the original thread is about APC, even though I don't use their products, here's what they have to say on the topic: https://www.apc.com/us/en/faqs/FA369034/
FWIW, running my server and network equipment never caused the breaker to trip erroneously while I was still living there. I sold the house when I moved cross-country for unrelated reasons so I can't say what happened afterwards, but my setup was installed according to code and functioned perfectly fine. I used quality UPS and decent quality network and server equipment. Cheap stuff may be farther out of spec and have more issues with leakage current.
> AFCI is required in all new construction for all circuits
This is absolutely false.
Though I'm sure you gave your electrician an earful and he did whatever was needed to keep his customer satisfied and happily took your money. Hope those $50 hospital-grade outlets are working well.
"For new construction, Section 210.12 (A) of the National Electrical Code states that all 120-volt, single-phase, 15- and 20-ampere branch circuits supplying all outlets must be Arc-Fault Circuit-Interrupter protected in the following dwelling unit locations:
Kitchens, Family Rooms, Dining Rooms, Living Rooms, Parlors, Libraries, Dens, Bedrooms, Sunrooms, Recreation Rooms, Closets, Hallways, Laundry Areas, or Similar Rooms or Areas.
Even though it is not listed, this includes finished basements because once the basement is finished, the area becomes one of the rooms listed above.
Adding to the confusion, most people assume that outlets are only plugs or receptacles. However, outlets is defined in Article 100 of the National Electrical Code as “A point on the wiring system at which current is taken to supply utilization equipment”. That means that the requirements for AFCI protection is required in the areas stated above at all 120-volt, single-phase, 15- and 20-amp receptacles, lighting fixtures, switches, smoke alarms, dishwashers, refrigerators, and so on."
So, yes, /technically/ it's not /all/ circuits. However, please identify for me a room commonly found in a residential property that's not a Kitchen, Family Room, Living Room, Dining Room, Bedroom, Hallway, Laundry Area, Closet or "similar rooms or areas". Look at the floor plan of most residential properties in the United States and every room noted on that floor plan is on the list. So /technically/ it's not /all/ circuits, and I was /technically/ incorrect, but in practice what I said is absolutely true.
Maybe a bit less snark from you and a bit more reading comprehension and you wouldn't come off as such a dick?
You conveniently left out that few jurisdictions are on the 2020 NEC code cycle when this provision was introduced. New codes do not get adopted everywhere immediately. Your town or city needs to adopt it and that can take many years. So, no, not all new construction.
After using hundreds and hundreds of APC UPSes, from very cheap to huge and expensive, I can say I have never experienced a fire. They have all used lead-acid batteries, some as heavy as 50 lbs.
I have even had to cut UPSes open to remove swollen batteries.
I love hooking up large lead acid batteries to old UPSes, there's no reason to be limited by the original battery capacity.
However...I discovered why some APC UPSes are sealed and the battery cannot be accessed. I was converting one such UPS which had no monitor interface. Having already opened it up to splice in an external battery, I noticed an empty header for a serial port. Cool. Plugged that into my PC and flash/bang/smoke. Fortunately my PC had a good path to ground and was not damaged. But the UPS had some pretty large burn marks at various points in the PCB and was certainly dead.
It took me a bit to figure out what I did wrong. The reason it went bang is directly related to why there's no user access to the battery, and why there's no external monitoring port. They went cheap on the electronics and did not include an isolation transformer. Any points within the UPS which are "ground", including the battery negative terminal, are actually referenced to the full bridge rectifier, which provides a direct path to live. Ahh...I'm just very glad I didn't touch any of the battery terminals while I had the thing plugged in.
Nonsense. UPSes don't pose any sort of serious fire risk. UPSes catching fire is exceedingly rare. Sure, it's probably happened. People get hit by lightning too.
>I then asked about this on the apcupsd mailing list. The first response:
“Something’s wrong with your..."
"Returning to my thread on the apcupsd mailing list, I asked again if there was actually anyone out there who had one of these working with non-Windows."
Linux's reputation in a nutshell. It just works, after you spend days checking whether you have a bad device and another day on the mailing list. And then you are forced to find a working alternative.
I probably spent 20 hours this week trying to get a not-that-obscure piece of software, which explicitly states "Debian 12" support on its website, working on a fresh install (as in: install Debian 12, update system, install software); and it still just doesn't work. After hammering my way through 4 different errors, and hopelessly out-of-date (and wrong) documentation, I finally got them to issue a refund this morning.
"Linux compatibility" often means "someone got it working once 6 years ago, on an extremely out-of-date version of the OS, but it's probably still fine".
This is why you do an internet search instead of just looking at what the vendor says, and/or look at what the vendor actually ships for support. If you see something like a binary blob, run away. If you see stuff shipped upstream, go ahead.
As a counterpoint to your experience, I've had great luck buying laptops off of Canonical's approved hardware list.
As a corollary, I've had worse luck buying computers that come with Linux pre-installed, only to discover that the pre-installed distribution is very specifically and carefully configured and includes proprietary bits and it's impossible to install any other Linux on the device or upgrade it.
Once you learn a few soft guidelines (not rules, nothing is perfect), you don't really even need to research.
Intel/AMD: probably fine. Realtek: works, one driver codebase for everything means weirdness is likely. Broadcom: don't even bother. Nvidia/mobile: what kernel/stack are you intending to use?
I haven't had to care about compatibility with desktop or server components in over a decade.
It's fingerprint readers and dual-GPU laptops that really bum a lot of people out.
Well, good point. In addition to wrong or incomplete compatibility lists, sometimes the internal details of products change without even changing SKUs, breaking compatibility.
With something like a UPS using some crummy non-documented protocol with Windows-only support, it really is 100% up to the vendor.
The best Linux can do as an ecosystem is reverse-engineer some stuff, which may or may not work for all models for varying degrees of "work", and that's it. The people working on apcupsd probably don't even have access to this model, and they're probably also not keen to dish out several hundred Euro/Dollar/Pounds to purchase one at their own expense.
Often things are complex and multiple people are "at fault" for something going wrong. Sometimes it's very simple and it's just one party doing something wrong. This is one of those simple cases. Linux is very much not "the problem", crummy vendors with crummy undocumented protocols are "the problem".
Lots of devices do not support Windows. I am typing this on an ARM-based tablet that does not, but does run Linux. Can you install Windows on all Chromebooks, or Apple Macs?
I do not usually check for hardware compatibility with Linux either. The only things I have checked that I have bought in the last few years have been graphics cards and USB wifi, and the latter was just a matter of reading product descriptions.
A lot of software I use either does not run on Windows, or is harder to install or is less mature on Windows. Obviously lots of server side stuff, but even some desktop stuff I would be doubtful about because the bulk of users are on Linux.
The post you're replying to doesn't say that Windows runs on everything. He's saying that every piece of commercial software runs on Windows, and that every widget has a Windows driver. I have found this to be quite true. If you have Windows, you can impulse-buy pretty much anything and end up OK.
And look how you immediately pivoted from compatibility with Windows to the ability to run Windows itself.
> The only things I have checked
That's great, but nobody needs to check whether something is compatible with Windows. That whole step just isn't needed.
So if the "everything just works in Linux!" people say "oh you should had just checked" then the former was clearly not in a good faith or outright delusional.
> A lot of software I use either does not run on Windows, or is harder to install or is less mature on Windows. Obviously lots of server side stuff, but even some desktop stuff I would be doubtful about because the bulk of users are on Linux.
And most people's houses (and most people themselves) can't run welding equipment, so what? How did we get from discussing the compatibility of a regular Joe's widgets and gadgets to the highly specialized and very narrow scope of server software?
As far as I know, the actual testing is done by the vendor. It might get reviewed by an MS or Apple employee, but that isn't the same thing as actively testing the hardware themselves, or providing software to work with third-party hardware.
It's a chicken-and-egg problem. Hardware vendors don't worry about Linux compatibility, at least for consumer/desktop products, because the market share is low. But the lack of good hardware support is itself one of the reasons adoption of Linux on the desktop is hindered.
And what happens if it doesn't work? MS and Apple aren't going to change their software to work with the device, they will tell the vendor to fix it. But for linux it is the other way around.
Raymond Chen's blog 'The Old New Thing' details the steps MS took to ensure that partner vendor products _would_ work with Windows. MS tested products extensively and did modify their own product (Windows) to ensure that products supplied by partner vendors worked reliably and consistently between different windows versions.
That is making sure that things that worked on an old version work on a new version. That's not the same as making sure something new works on an existing version of the OS.
Is there a power-delivery equivalent of Ubiquiti (I know we’ve discussed their shortcomings here) that offers better than consumer grade for less than enterprise prices?
My UPS is my hybrid inverter. It offers at least 1 kWh of usable capacity (after losses), more if the sun is up or the battery is charged above 20%.
Of course there is no native signaling on grid loss, but in practice that's not an issue; for planned maintenance I can force-charge the battery and keep the servers running for a day or more, and actual power-loss incidents are very brief in my neck of the woods.
We have widely deployed the workstation devices and have had no maintenance issues with those. Of the ~10 9PX models we installed, two failed suddenly and completely just out of warranty, one had a failed battery charger (also just out of warranty), and one has a gigabit management card that works for a week or so, then goes offline and requires a local reboot from the console; but we have about 10 deployed with no issues too. So buy the extended support, because their support team is good but useless if you don't pay the protection money. We also had a rack PDU just die right outside its warranty period, and we only bought 6 of those. Not great quality, it seems, but it does the job. We did one install of a full rack UPS and it's been running for more than 7 years, but you have to keep up the maintenance and schedule battery swaps. They also make a really nice 1U lithium-ion UPS, which is neat. The IPM software is a little janky but works well.
I mean -- I built a PC a few months ago, and formatting the Windows install disk from a non-Windows environment was not trivial.
So yes, Linux is still not on par with Windows in many ways, but boy oh boy was it weirder than it should have been to format a USB drive to install Windows.
I lost a solid 6 hours to that: try, yay it works; wait, now I'm getting a nonsensical error at a strange screen; retry with different flags, with different hardware, etc., etc.
Ditto. I was building a Windows desktop and spent hours faffing around attempting to follow the official Microsoft instructions, even with another Windows computer available; in desperation I tried Rufus Installer, which worked on the first attempt. Then, months later, building another desktop, I went straight to Ventoy, installed from Linux, and just dropped a downloaded ISO onto the USB stick -- it worked immediately.
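For anyone stuck at the same point, the Ventoy route is roughly the following, assuming the stick shows up as /dev/sdX (a placeholder; double-check the device name before writing, since the install wipes it):

    # from the extracted Ventoy release directory; installs Ventoy onto the stick
    sudo sh Ventoy2Disk.sh -i /dev/sdX
    # then copy the Windows ISO onto the exFAT data partition Ventoy created
    # (most desktops auto-mount it somewhere like /media/$USER/Ventoy)
    cp /path/to/windows.iso /media/$USER/Ventoy/

Boot the stick and Ventoy presents a menu of whatever ISOs are sitting on it.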
Seems like your best bet is to use an industrial battery inverter/charger system (Victron Multiplus) for slow switching (>20 ms), and then layer a secondary industrial-grade UPS on top of it to provide fast switching (<15 ms) and a shutdown process over USB/serial/network, etc. Is it cheap? No, but you shouldn't be using cheap power supplies to protect your more expensive server equipment.
My biggest gripe with the APC UPSes is that they initiate the shutdown process immediately on power loss. I live in a location where we frequently have 10-60 second power cuts. The optimal solution would be for the UPS to wait something like 5 minutes before starting to shut down the connected PC. For this reason alone I stopped connecting my desktop PC to the UPS and stopped installing the PowerChute software. I wrote their support and asked if I could daisy-chain a battery with the UPS, so the battery would be used first and, when it was depleted, the UPS would take over and start the shutdown process. They advised me against this and pointed me to this FAQ, which doesn't directly address daisy-chaining a UPS with a battery. Anyone here have experience with doing that?
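For what it's worth, when the monitoring side is apcupsd rather than PowerChute, that delay is configurable; a sketch of the relevant /etc/apcupsd/apcupsd.conf knobs (values here are examples, not recommendations):

    # seconds on battery before initiating shutdown; 0 disables the pure timer
    TIMEOUT 300
    # also shut down if estimated charge drops below this percentage...
    BATTERYLEVEL 10
    # ...or if estimated runtime falls below this many minutes, whichever hits first
    MINUTES 3

Whether PowerChute exposes an equivalent delay I don't know, but this at least avoids shutdown-on-every-blip behaviour on the Linux side.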
I once installed apcupsd only to be so appalled by its lack of security design I ripped it out PDQ. It demands to run as root so it can run shutdown (and maybe access the USB?!) and goes downhill from there.
There's many safer ways to let a mostly untrusted process run shutdown. Like a sudo setup letting the UPS user only run shutdown. Or /etc/shutdown.allow. Or something using CAP_SYS_BOOT. systemd might have a solution too. I get the impression I just spent more time thinking about this writing my response to you than the APC folks ever did.
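A minimal sketch of the sudo variant, assuming the daemon were changed to run as a dedicated unprivileged user called apcupsd (which, as noted, is not how the stock package behaves):

    # /etc/sudoers.d/apcupsd
    # allow the UPS monitoring user to run shutdown as root, and nothing else
    apcupsd ALL=(root) NOPASSWD: /sbin/shutdown

The daemon's shutdown hook would then call "sudo /sbin/shutdown -h now" instead of the whole process needing root.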
Yes, I am afraid my UPS will hack me. More specifically I'm afraid this badly written closed source software will have some security hole that can be used to escalate to root.
There was some discussion here [0] a few months ago. Seems they are stuck in the design phase. I had high hopes for this project; shame nothing has come of it.
Yeah, it is. I own two good old APC SmartUPS 1000 units (one for the desktop and a second for the server). They are absolutely reliable, quiet, and output a sine wave when on battery.
Good old reliable tech, easily monitored using apcupsd, for example.
If you're home when it switches to battery power, the equipment will probably have been protected from any surge, and you'll hear the beeps from the UPS, and can manually shut down equipment.
If you're not home, and the power doesn't come on soon enough, power will be cut to protected equipment, without a surge, and at most the equipment probably only needs the auto-`fsck`.
And because having no monitoring connection means you can keep the UPS air-gapped for data, there's less theoretical risk of a remote firmware hack causing a fire/explosion or toxic leak.
The point is for when you're not home, it can automatically shut down your PC. It's actually quite useful if you happen to have a home lab. The nicer units can monitor temperature and humidity and send you an email if they exceed specified limits, which can be VERY useful if you live in a flood prone area, for example.
I have a homelab of a few servers, with a nice rackmount UPS, and I decided data connection to the UPS is more risk than good, in my case.
Environmental monitoring doesn't have to be part of the UPS. And APC has long made separate devices for that, for data centers/racks. In addition to the modern IoT home devices.
Oh sure, I'm just thinking in the realm of HN... perhaps it's more common than average! But having a networked UPS is non-negotiable for me, as far as I'm concerned it's no use if the servers power off hard anyways. It's on a VLAN without internet access so really not much risk either.
>> TL;DR: Despite otherwise seeming to work correctly, I can’t monitor a Back-UPS BX1600MI in Linux without seeing a constant stream of spurious battery detach/reattach and power fail/restore events that last less than 2 seconds each.
It sounds like the Windows software probably debounces the incoming events, only bubbling up those that are sufficiently time-stable, to account for bargain sensor false positives?
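Debouncing that kind of event stream is not hard; here's a rough Python sketch of the idea (the 2-second window and the state names are illustrative, not apcupsd's actual event names):

    DEBOUNCE_SECONDS = 2.0  # ignore transitions that revert within this window

    def debounce(events, window=DEBOUNCE_SECONDS):
        """Filter (timestamp, state) pairs, yielding only transitions that
        stay stable for at least `window` seconds."""
        confirmed = None   # last state actually reported
        candidate = None   # (first_seen, state) awaiting confirmation
        for ts, state in events:
            if state == confirmed:
                candidate = None          # a blip that reverted: drop it
                continue
            if candidate is None or candidate[1] != state:
                candidate = (ts, state)   # new transition: start the clock
            if ts - candidate[0] >= window:
                confirmed = state         # stable long enough: report it
                candidate = None
                yield ts, state

    # e.g. list(debounce([(0.0, "onbattery"), (0.5, "online"), (30.0, "onbattery")]))

A real implementation would also need a timer so a pending transition gets confirmed even when no further event arrives, but the principle is the same.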
I'd bet you're right. Experience at $Job is that certain cheapo UPS's can be miserably twitchy about detecting power problems and switching to their inverters. After a week of the clickity-clicks of the cut-over relays - $Employee and his office mates may ask for the UPS to be removed.
Right, the Windows tool debounces the events. IMO the battery attach/disconnect messages are a poor man's heartbeat signal on the protocol, confirming that ACPI/Windows didn't "sleep" the USB port and that the controller in the UPS (Chinese junk) didn't crash.
> popped the breaker for the sockets causing everything to be harshly powered off. My fileserver took it badly and one drive died.
Maybe someone with more hardware knowledge could chime in, but my default assumption would be that the hard drive died either because it couldn't take another power-off/on cycle, or because the event that caused the circuit breaker to pop was harsh enough to damage it (not sure how much the computer's power supply insulates components from things like that, but intuitively, there are much more voltage-sensitive components in PCs than hard drives). While a UPS might have been effective against a surge, it would not have saved the drive from dying by power cycling.
My hypothesis is that most equipment doesn't need to be notified when on UPS, it could in fact be harmful when it causes a power cycle that turns out to be unnecessary. An orderly shutdown doesn't matter that much unless you're dealing with an app that can become internally inconsistent, such as a database. In all other scenarios, the journaling FS takes care of data integrity.
So, especially for home use, why not just let the UPS run until it's out?
In my experience, all modern operating systems and widely-used databases handle power interruption well. I can't recall a time in decades when a filesystem or database was corrupted by power loss. Adding a monitoring/shutdown system for basic UPSes is not worth the risk/hassle. I'd just monitor the power directly and let the system fall over when the UPS runs out of battery.
> Even worse, two of the fifteen devices became massively corrupted, with one no longer registering on the SAS bus at all after 136 fault cycles, and another suffering one third of its blocks becoming inaccessible after merely 8 fault cycles
I've been monitoring several dozen APC Back-UPS units for more than 10 years using apcupsd on Debian without any issues.
There were only a few times (maybe 2 or 3 in all that time) when I saw some messages indicating that something was wrong with the battery (though they never were as frequent as in the author's case). In each of those cases, the messages went away after replacing the battery.
What's interesting is that APC's Windows-only software, Powerchute Serial Shutdown, is actually Java-based, accessed through the browser, and trivially de-compilable and readable.
After many UPSes at home over the years, I currently have only 1 (a business-grade rackmount Cyberpower), and once it needs a new battery pack, I'll probably just switch to a surge protector.
My local electric is pretty stable, I no longer run servers at home that need to be up 24/7, most of my work at home is on a laptop with integrated battery backup, I can use my phone WiFI AP mode for backup Internet, UPSes are a pain to move between apartments, and UPSes have a small risk of dangerous battery leakage.
I'm kinda sick of buying these cheap lead-acid based UPS units (only tried APC and Cyberpower so far) to have the batteries die in just a couple of years.
I had some battery issues with my previous BackUPS 750, but once I got a new battery, things were fine with apcupsd. I've been on a BackUPS-Pro 1500 S for a couple of years now, also using apcupsd, with zero problems. It legitimately records a low-power event when my Brother printer first starts up: it switches to battery, then comes right back. Something similar happens when it's stormy here and power lines are going down and being re-routed.
I have no issues with Back-UPS RS 1500MS2 on Linux.
I had to configure a udev rule, though, for it to get a static USB device name, since that wasn't happening out of the box; but other than that -- no problems.
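For reference, the kind of rule needed looks roughly like this (051d is APC's USB vendor ID, but the idProduct, symlink name, and permissions here are examples; check your own unit with lsusb and udevadm info):

    # /etc/udev/rules.d/99-ups.rules -- give the UPS a stable /dev name
    SUBSYSTEM=="usb", ATTRS{idVendor}=="051d", ATTRS{idProduct}=="0002", SYMLINK+="ups", MODE="0660"

Then reload with "udevadm control --reload-rules" and re-plug the UPS.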
I bought one because that's what I could get at Best Buy. This was particularly important because being able to purchase it at a known physical store near me meant I could get it the same day from a relatively trusted vendor, be acceptably confident that I wasn't getting a counterfeit, and hope for at least some minimum level of quality (read: it probably isn't going to burn down my house).
I just looked at Eaton's website. They recommended two companies near me that carry their products: one doesn't have a website (it's not a retail address, maybe just some random person's home), the other's website has lorem ipsum text on their Eaton partnership page, and their shopping site has no Eaton products at all.
So, yeah, it's understandable why people might not be buying Eaton.
This. I finally switched from using various APC models to an Eaton 5S 1500 unit for my desktop and a couple peripherals and it's much better while still being comparably priced.
I’ve had an ultra cheap Eaton 5E (900 VA) for a few weeks. Because I still don’t entirely trust it, I’ve tried cutting the power to it several times (like 10x maybe) and it’s been fine every single time (it talks to a Debian box with NUT). The whole thing was around $80, if it doesn’t burn the house down, there are no downsides.
I have my APC Back-UPS CS650 (BK650EI) plugged into my router (running OpenWrt) which works fine with Network-UPS Tools. The router shares the UPS status with the whole LAN, and the machines on the LAN that are powered by the UPS will shut down automatically when the UPS battery goes critical.
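A minimal sketch of that style of NUT setup (the UPS name, addresses, and password are placeholders; the driver depends on the model, and OpenWrt's packaging may configure this via UCI rather than these files directly):

    # on the router -- /etc/nut/ups.conf
    [apc]
        driver = usbhid-ups
        port = auto

    # /etc/nut/upsd.conf: listen on the LAN interface
    LISTEN 192.168.1.1 3493

    # /etc/nut/upsd.users
    [upsmon]
        password = secret
        upsmon slave

    # on each LAN machine fed by the UPS -- /etc/nut/upsmon.conf
    MONITOR apc@192.168.1.1 1 upsmon secret slave

Newer NUT releases say primary/secondary instead of master/slave, but the idea is the same: the router runs the driver and upsd, everything else just runs upsmon against it.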
I refused to go down this route; the hours lost in frustration are not worth it.
I purchased a UPS that supports a network management card, basically an overpriced SNMP daemon with an Ethernet interface, and have been happy ever since, since every system can query the status of the UPS.
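For example, if the management card supports the standard UPS-MIB (RFC 1628) -- an assumption; many cards only expose a vendor MIB -- the estimated charge can be polled from anywhere with something like:

    # upsEstimatedChargeRemaining (percent), RFC 1628 UPS-MIB
    snmpget -v1 -c public ups.example.lan 1.3.6.1.2.1.33.1.2.4.0

APC's cards also ship their own PowerNet MIB with richer data, and things like NUT's snmp-ups driver or a monitoring exporter can sit on top of either.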
I own two Back-UPS 1400VA units, one monitored with apcupsd and the other with a MikroTik router - so I guess it's just the newer units that have this issue?
I have an APC on macOS too, and as of a couple of OS iterations ago it recognizes it natively. I used to run apcupsd on macOS, but it was quite tricky to keep working, due to the USB interface. This was on an older Mac than I'm using now (the trashcan Mac Pro), but it mostly worked, and I would always have an indication when it stopped working, as the icon in the menubar would change. I think the usual issue was a lock file owned by root that couldn't be deleted.