Hacker mods an M1 Mac mini to receive power over Ethernet instead of AC (inferse.com)
436 points by nrsapt on Aug 1, 2023 | 163 comments



The site is possibly getting hugged to death by HN; cached link to the article here: https://web.archive.org/web/20230801195510/https://www.infer...


" the M1 Mac mini was the perfect hardware to test out PoE, as on idle, the device only consumes 6W. When some load is applied to the internals, that power draw can go up to 40W. After some thorough research, we found out that the maximum throughput of Power over Ethernet was 15.4W "

They'll have to bump it up to 802.3bt (PoE++), which can support 60W.

Cool project though. I've been wanting to mod my Mac mini M1 to run off USB-C PD, which should be possible with modification because it uses the same PD IC as the MacBooks (CD3217). That could mean I could eventually get it to run off of a battery pack.


Minor correction: 802.3bt added 51W (Type 3) or 71W (Type 4). 60W isn't a standard power level. Some switches support up to 95W per port with PoH.
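
For anyone confused by the 60W vs. 51W/71W figures: the headline numbers differ depending on whether you count power sourced at the switch port (PSE) or power assured at the device (PD) after worst-case cable loss. A rough sketch from memory - treat it as approximate and check the spec before relying on it:

    # Nominal PoE power budgets (W): power sourced at the switch/injector (PSE)
    # vs. power assured at the powered device (PD) after worst-case cable loss.
    POE_LEVELS = {
        "802.3af (Type 1, PoE)":   (15.4, 12.95),
        "802.3at (Type 2, PoE+)":  (30.0, 25.5),
        "802.3bt (Type 3, PoE++)": (60.0, 51.0),
        "802.3bt (Type 4, PoE++)": (90.0, 71.0),
    }

    for name, (pse_w, pd_w) in POE_LEVELS.items():
        print(f"{name}: {pse_w} W sourced at the port, {pd_w} W assured at the device")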


Surely they're all 'up to'? It's not 'you must sink this much current or else not compliant'?

It seems like a weird thing for TFA to say anyway - my PoE[+] switch was the cheapest 8 port I could get a few years ago on Amazon, and does 30W per port. I don't really understand how you could look into it at all, be willing to attempt the hack, but not use a switch (or injector or whatever) that's capable of powering it under load.


Depends. Within the standard, you can do 51W at the full 100m on standard Cat6, which is 300mA/pair. If your runs are shorter, you should be good to go at the full Type 4 current of 960mA.

This is a good reference for this: https://www.5gtechnologyworld.com/what-every-engineer-should... which goes into how to calc your total bundle dissipation and whatnot.


Hah, yes they are all 'up to' on the device side. A switch port or injector is not compliant with a particular standard if it does not provide for the type-specific load, however.


“By ‘Energy Star’ compliant, we mean that it burns as much energy as the sun.”


They said it "can support 60W." As you point out, Type 4 can support up to 71W, so yes it can support 60W. They didn't assert that it is a standard power level.


What's the goal of having a Mac mini running off an external battery? I mean, it seems less cost effective than just getting a MacBook Air and not using the screen...


The goal of most of these projects is usually to demonstrate that they're possible. Actual use cases are left as an exercise for the reader.


Headless servers where you want a modular or easily removable battery backup.

I really wish desktop PC power supplies would support such a thing. It seems stupid that we need to use a UPS with an inverter when the power supply should be rigged to accept DC input straight from a battery.


Exactly, the PC PSU converts AC to 12V for the PC anyway, so 12V batteries should slot into that scheme without a separate PSU and inverter.


You would still need a DC-to-DC converter, as lead-acid batteries deliver anywhere from 12.8V down to about 10V depending on state of charge and load.


Remember that DC-to-DC power supplies are much smaller, because an AC PSU also has to deal with the transformer stage.

I worked for an ISP that was also a CLEC back in the late 90s / early 2000s. All of the telco side was DC (massive 48V battery banks), so when we were doing server implementation it made sense to get all-DC PSUs on the servers. Not sure why DC PSUs aren't a more universal option, as you'd think UPS providers could easily offer DC output models.

There are options out there, I ran across this [0] looking for typical ATX DC to DC PSU.

[0] https://lazer3d.com/dc-dc-power-supply-guide/


For anything larger than about 500W that'd become untenable, because at 12V you'd need cables that can carry in excess of 50A.

UPS systems start connecting the batteries in series to provide higher voltages for this reason


Or multiple cables, just as PC power supplies of 500W+ have, and in fact some of their power is delivered at 5V.


Even more annoying: why aren't there decent USB-only UPS systems?

Just ditch the 110V outlets for USB-A or USB-C ports and skip the inverter.

I'd love to power a Pi or three from UPS-based USB ports.


Where I live, this is actually extremely common now - typically referred to as a "Mini DC UPS" [1].

These cheaper ones just provide 9V and 12V output to power a fibre and/or WiFi router, as well as 5V USB-A, and start at around 30Wh capacity - some for less than USD 20. If you pay a little more, you can get 24/48V PoE included as well.

Unfortunately, ones providing USB-PD are quite rare. You do get "power stations" [2] with USB-PD (in addition to an AC inverter). These are typically USD 100 - USD 1000, and have much larger capacity.

Examples (I haven't tested these and do not specifically recommend them):

[1]: https://www.amazon.com/Lithium-Battery-Appliance-Monitor-uni...

[2]: https://www.amazon.com/Portable-Solar-Panel-Power-Station-Ge...


Yeah that would be great. Pimoroni makes a version of the Pi Pico that takes a LiPo battery and automatically switches to battery power when the USB power goes away, but I don't know of any full computers that support that. I think Adafruit makes a LiPo add-on for Picos. I wonder if it could be modded to work on a Zero or something?


That's where USB-C PD is nice. It's all DC at several different voltages the recipient can request. The only downside is it's not quite powerful enough for many desktop PCs, but it's getting better.
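
For context, a rough list of the fixed USB-PD voltage levels as I understand them (the 28/36/48V "extended power range" levels are newer and need EPR-rated cables and chargers, so treat this as a sketch rather than gospel):

    # Fixed USB-PD voltage rails and their max current; power = V * A.
    SPR_LEVELS = [(5, 3), (9, 3), (15, 3), (20, 5)]   # "standard" range, up to 100 W
    EPR_LEVELS = [(28, 5), (36, 5), (48, 5)]          # extended range, up to 240 W

    for volts, amps in SPR_LEVELS + EPR_LEVELS:
        print(f"{volts:>2} V @ {amps} A -> {volts * amps} W")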


Pretty much expect the Mini to end up this way within a few generations. Somebody in the Mac team really hates wall warts, but given all their other hardware is going USB-C one would think they’d want to standardize & eliminate the space/heat of the internal power supply.


Avoiding the inevitable spicy pillow seems like a good reason

An external/easily replaceable battery would be excellent come time to deal with cell age


Does this read like an attack against Apple laptops specifically, or something? Already -2 five minutes after posting it.

Leaving as-is for feedback. I don't get the controversy. I could've been a jerk and said just buy a UPS - this is an established concept.

Edit: thank you kind souls for restoring the imbalance - carry on :D


Not really. However, as a long-time Mac and Apple fan, I can say the community has a lot of overly aggressive people who can't stand any criticism (valid or not) of Apple and will do various things in retaliation.


No it just reads as kinda paranoid because li-ion cells typically become spicy pillows long after the device becomes obsolete. It's also trivial to avoid by limiting cell voltage. Many devices, most notably iPhones, automatically go into kiosk mode after charging for a few days for exactly this reason: https://support.apple.com/en-gu/HT208710


Definitely disagree. My own anecdotal evidence is that many many laptops left plugged in 24/7 will develop a spicy pillow after a year or two. That includes a set of Apple laptops and hundreds of Dell laptops.


Anecdotally: this has never happened to me despite owning a whole bunch of laptops over the past 25 years or so.


Thank you for taking the time to follow up!

Definitely a bit paranoid - even if I'd prefer to call it robust :)

I've had very bad luck in this regard... but haven't honestly used a laptop in probably a decade. I was pleasantly surprised to see my latest (dust-collecting) Lenovo will stop somewhere around 80%!

Point being, sure - there are safety things... but batteries are consumables. I like these things to be easy-to-replace!

I find that batteries being difficult to replace accelerates this pattern of obsolescence. CPUs and (especially) disks tend to pack plenty of punch/life these days - well beyond the mechanical/chemical things they depend on.

Something to ponder - these chips may see particularly long use, being so power efficient. The utility bill won't be such a driver, sipping power and parallelizing decently.


> ... but batteries are consumables. I like these things to be easy-to-replace!

Replaceable, consumable batteries often get thrown out in the trash. ... and can set garbage trucks and recycling centers on fire.

https://www.waste360.com/safety/lithium-ion-batteries-are-ca...

https://www.dispatch.com/story/news/local/2023/02/09/garbage...

https://arstechnica.com/gadgets/2022/12/recycling-firm-fined...

https://gothamist.com/news/lithium-ion-batteries-a-growing-f...

https://www.abc.net.au/news/2023-05-29/garbage-truck-fires-c...

(and many more)

How does one make sure that removable, consumable, high capacity batteries are not discarded as trash but instead taken to the proper facilities to handle them?


Respectfully, today people just chuck the entire phone/laptop in the bin. I don’t think making the battery module removable changes the outcome one bit, except that it would allow a tremendous amount more life out of these devices, reducing all the other (non-battery) waste streams and reducing the environmentally costly manufacture of new devices…. which is the reason why the shareholders love the current strategy and why all electronic manufacturers are moving the same way. Why allow users to get 3 more years of life from their laptop when you could convince them it’s e-waste after 3 because of a $15 battery?


We can't. Instead we're making it worse by selling disposable vapes with rechargeable li-ion batteries [1].

[1] https://www.recyclingmagazin.de/2023/07/17/das-ungenutzte-po...


Is the dichotomy here really that we can either have modular batteries, or we can have more intact waste trucks and workers?

However, having just taken an entire stroller out of my shared duplex recycling bin, I know there will never be an ethical way to get 100% of people to care enough to dispose of waste properly.


I wish I knew a good answer - this is an excellent concern. I'm skeptical of recyclers/waste centers, but that's what I've got.


Having things that require special handling for disposal not be as user serviceable is one solution... but that goes against the replaceable battery.

The AA lithium ion batteries getting tossed into the trash and setting garbage trucks on fire are problematic enough. The energy capacity of a cell phone is quite a bit more and correspondingly more spectacular in the combustion.

From iFixit:

https://www.ifixit.com/News/69041/how-batteries-can-catch-fi...

https://www.ifixit.com/News/34034/lithium-ion-batteries-are-...

> USA Today reports that 65 percent of fires at waste facilities in California were started by lithium-ion batteries. In a 2018 survey of 21 waste facilities across California, 86 percent reported a fire at their facility in the last two years, according to the California Products Stewardship Council (CPSC). Of those fires, 56 percent were attributed to batteries, with the remainder attributed to “traditional hazards of combustibles.” In other words, batteries are causing more fires than the oils, fuels, and other hazardous materials of waste management—combined.

> And that’s only the fires actually reported.


heh, I understood what you meant by "spicy pillow" from context but it's a new phrase for me, I like it


Can you explain



The batteries puff up like an overstuffed pillow and if you let the spice out, you're going to have a bad time.


lipo batteries expand


The article is about a Mac Mini. Those machines are mains powered, not battery powered.


I was replying to a question on why one wouldn't "just get a MB Air/not use the screen"... not the article.

How is this relevant? I don't say this to be rude, but this internet phenomenon of relentless pedantry is annoying.

Things tend to make [some degree of] sense when your first reaction isn't to shoot from the hip.

Mains batteries exist - most people should save the effort, get a UPS. Hackers on the other hand... it's kind of silly to ask why. The answer is 'because'.


No idea, but if they got it running on USB PD, battery pack aside, it would make it a one cable connection to a monitor that has PD support. Multiple cords don’t actually bother me but it sounds kinda neat.

Anyhow I guess the goal is “because you can”


I could see it making sense in a 'van life' context to use a smaller local battery rather than plug everything in to your main leisure battery. Use the latter to charge Makita packs say and then run most other stuff off those (there's a decent amount of open 3D printable adapters for them, as well as third-party/AliExpress stuff).


Would make way way more sense to have a macbook that you can then charge direct from a DC battery over USB-C. With the mac mini you still have to work out how to power an external screen as well.


There's a lot of low power HD screens out there. It's not super difficult. Not the ideal case for sure but low power is all about tradeoffs.


Battery aside, PoE also allows you to put it anywhere you have Ethernet cable with sufficient power delivery. In this case you could have a workstation powered entirely off Ethernet delivered to the location (some monitors will also work with a PoE splitter).


That it happens to be 12 volts DC has some value versus whatever power loss an inverter has for use in a vehicle. Though I'm with you on the relative ease of starting with a laptop instead.


> it seems less cost effective than just getting a MB Air and not using the screen...

How? Why would paying for a screen and not using it be cheaper?


Economy of scale, mostly.


E.g. to run home automation in case of a power outage. I have an Intel NUC with a li-ion DC UPS exactly for this reason.


In the event of that solar flare that wipes out the grid.


In that event, personal computing will be the least of our worries


Two large monitors or projectors with Mac mini.


Cheaper and bigger batteries? Always replaceable in the future? And tbh laptop batteries are not meant to be used 24/7; personally I would avoid it.


This would be fun for when LEOs come in to take your computer, they can easily keep it powered so all of the decrypted keys stay in memory. Killing power means going back to an encrypted state. In high profile cases for desktops, there's techniques for splicing the power cable to switch to a battery pack. This would make it much easier for the unskilled LEO to take your shit. Cause we all know you're the one they're after. Sleep tight! ;-)


That's why pretty much all desktop and server mobos have a "chassis intrusion" header that you can connect to a sensor/switch. Your machine should wipe keys or reboot if it's opened or moved.


how does that work if the case is never opened?


You can connect multiple switches in series. Eg one could be a reed switch that detects whether the PC is within 1mm of a tiny magnet in the floor.


This is a good idea! Kind of makes me wish I was a drug kingpin with a PC full of secrets so that I could justify trying that. Sadly I would just end up with a kid rebooting the PC by bumping into it :D


I'm going to try this with my extra Mac mini and one of my switches that outputs PoE++ at the full 60W. I should even be able to do this using a fairly bog-standard PoE splitter from PoE Texas that'll actually deliver a full 60W.

Edit: Here's the injector I'll probably use - https://shop.poetexas.com/collections/splitters/products/gbt...


This change/mod involving one of the most locked-down devices seems pointless. What do you like about the project? Or is it that the additional cable is so irksome?

I have my own "must-do" projects that from an outside perspective are seen as pointless. So this is more about understanding than poking holes.


Another way to look at it: you can put one of the most powerful mini computers at the edge with PoE - think a machine vision with industrial USB camera type of scenario.


If you don't get PD working, you can always just build a battery pack that charges itself over USB-PD and outputs 12V. Apparently the built-in PSU on the Mini takes 120/240V and outputs nothing but 12V.


I’d be very curious to understand (even very crudely) the breakdown of those 40W. Roughly how much does each internal component consume at its peak load? Or at least for the top offenders.


80%+ will be the M1/M2 SoC with its on-package RAM, ~5% will be the SSD, 5-10% the fan, and the remainder supporting circuitry (buck converters, regulators, etc.).
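
Putting rough watt figures on those shares against the ~40W peak quoted in the article (these are guesses, not measurements):

    # Back-of-envelope split of the ~40 W peak draw using the rough shares above.
    total_w = 40
    shares = {"M1 SoC + on-package RAM": 0.80, "SSD": 0.05, "fan": 0.075, "VRMs/other": 0.075}

    for part, share in shares.items():
        print(f"{part}: ~{total_w * share:.0f} W")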


I'm very skeptical of PoE now after 3 PoE adapters killed 3 Raspberry Pis I have.

I wouldn't want to risk something more expensive to that shit.

At this point I'd rather just have straight up 19V + and - cables bundled together with the Ethernet with some heatshrink around the whole thing to make it look like 1 cable.


PoE is used extensively in network deployments involving rather expensive hardware.

I understand the instinct to avoid it at this point, but I’m curious what happened in your case because I’ve never experienced issues.

I did work somewhere where someone fried equipment by incorrectly terminating a batch of Ethernet cables thereby sending voltage to the wrong place.


That should only be possible if using non-standard "passive" PoE?

802.3bt has negotiation specifically to prevent faulty cables/devices from being powered.


I have 13-15 devices running over PoE at my house and all of them work just fine. Everything from the high end Ubiquiti gear to raspberry 3/4 PoE hats. It’s powered by a Unifi 24 port PoE switch.

The only time I had issues was with 2 mesh APs and sketchy power at a condo that has minor power outages due to thunderstorms. I brought the devices back home and they work fine, and the new device doesn't have an issue. I was curious whether a UPS would've smoothed the power blips, but the new AP restarts and connects just fine.


PoE switches and devices are widely used in industry. I suspect your 3 PoE adapters are "passive PoE" consumer crap that doesn't meet spec


You sure you want to blame poe as a whole instead of the RPI hat?


It’s definitely that.

PoE is fine, and the issue surely lies with some $5 Chinese RasPi part with little buffering and protection circuitry.


Sure it could be the RPi hat, but it just means the standard is so complicated that people don't implement it correctly.

I wouldn't trust the hat or whatever you call it for Mac Mini.


> Sure it could be the RPi hat, but it just means the standard is so complicated that people don't implement it correctly.

Or there are ways to cheap out on a PoE device that may work in some cases but don't fully and properly implement the standard.

The standard is widely used in VoIP phones, wireless access points, security cameras, and all sorts of other networked devices that get installed in places that may not have nearby power outlets or where a single wire solution is beneficial.

Personally I have three Pis that have been on PoE their entire lives and have had no problem, but I used a name brand PoE hat (the Waveshare hat with the OLED display) and am powering them from a mainstream PoE switch. If you're using some random AliExpress hat with janky injectors you get what you pay for.

> I wouldn't trust the hat or whatever you call it for Mac Mini.

Almost any large commercial building has had hardware running on PoE that costs more than the average Mac Mini for years. Most PTZ cameras for example, high-end directional wireless bridges, even some nicer wireless access points.


I mean, one of my hats that fried one of my Pis was a Waveshare hat. This one actually

https://www.amazon.com/gp/product/B0974TK3KD/

PoE operates at a pretty high voltage (almost half of line voltage!) and Waveshare products aren't UL listed.


>PoE operates at a pretty high voltage

No, it operates at ELV (extra-low voltage): https://en.wikipedia.org/wiki/Extra-low_voltage

The whole point of PoE, 48V PD USB-C, and similar tech is that they don't need to be UL listed. PoE is also electronically current limited unlike mains power so you can't pull 200A to start a motor nor can you start fires without a lot of effort. That's assuming you use a real PoE switch that negotiates power levels, not cheap passive injectors.

Anyway that hat has some bad reviews claiming DoA, missing components, and dead Pis. Why not get the official hat? https://www.raspberrypi.com/products/poe-hat/


You're dealing with the same manufacturers that won't even bother to include the USB C termination resistors required to make "USB C" charging ports on their devices work with actual USB-compliant PD. You're talking about the kind of part that's so cheap that adding five of them is going to round down below a penny in your BOM. The kind of manufacturer that, if it had a motto, that motto would be "no corner too cheap to cut".


What kind of PoE Adapter/Hat did you use?

From memory, and it's been a few years/models... but the GPIO pins, including the 5V input pin, bypassed all of the circuit protection fuses and whatnot that the normal barrel jack had.

It wasn't uncommon for people to roast their RPi with incorrectly wired or poor-quality devices powering it through the GPIO.


This is still true. There are ample warnings to be careful when using the GPIO as input because of this. The hat probably passed on some variance that is supported by actual consumer devices and fried the RasPi because it has no protections.


Don't buy eBay junk and blame it on the standard, which is used by tons of high-end commercial equipment and is more appropriately a commercial feature. PoE isn't cheap - if yours was, that probably explains your poor results.


When it is off you can charge it with a simple mobile phone charger.


I would actually like being able to power my Mac mini with just a USB-C dock, like a MacBook.


I have been thinking it would be cool to just salvage a functional MacBook Air motherboard from a "broken" device, install it in a small case, 3D print an I/O shield, and you have what you're looking for!


Might be too easy to disconnect compared to a regular power cable.


We powered the Mac Mini M1 using 12V DC, bypassing the built-in AC power adapter, and used it for some quadruped robot experiments. Some details on the connector are here: https://www.ifixit.com/Answers/View/574827/What+PSU+connecto... (that article mentions Mac Mini 2018 but the connector/pinout still works fine for Mac Mini M1)


Ivan also rack mounted a few Mac Minis using a similar approach

https://www.instagram.com/p/CvP8FuLtzzr/?img_index=1


That looks cool! Here is our paper with videos of the quadruped robot with the Mac Mini M1 on its back, receiving power from the robot: https://sites.google.com/view/fast-and-efficient I did a version without an enclosure (just 250 grams) with a DC-to-DC buck converter for a stable 21V -> 12V, but it was too vulnerable, hard to replicate, and it didn't fit properly inside the robot.


I've been wondering more than once if we couldn't improve (larger) offices a lot by distributing power pre-transformed so every computer and every screen wouldn't have to include a transformer.

I also sometimes wonder why, these days when every new lamp is LED, we don't replace most of the dangerous 230V outlets in new houses with 12V and use 230V only for dishwashers, laundry machines, dryers and that kind of stuff.

Is there some kind engineer or hobbyist here who could shoot this idea down for me so it won't bother me for another 3 years?

(I have a couple of years of studies in electronics, but not power distribution. I think I understand why backbone networks use extremely high voltages to reduce losses, but for now I think that is more an issue over long distances; I am willing to reconsider.)


For a given wattage, voltage and current can be adjusted (inversely proportional) within reason, but higher current with lower voltage results in some problems: voltage drop (which mostly only matters at long distances, typically negligible within an office) and thicker conductors (double the cross-sectional area for double the current, if memory serves). This is why things like PoE and phantom power for audio are often 48V instead of something lower.
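
A quick sketch of why the higher voltage helps: same power means a quarter of the current at 48V versus 12V, and resistive loss in a given cable falls with the square of the current (the 0.5 ohm cable resistance and 60W load here are arbitrary assumptions for illustration):

    # Same load power at different distribution voltages over the same cable:
    # current scales as 1/V, so the I^2 * R cable loss scales as 1/V^2.
    def current_and_loss(power_w, volts, cable_ohms):
        amps = power_w / volts
        return amps, amps ** 2 * cable_ohms

    for volts in (12, 48, 230):
        amps, loss_w = current_and_loss(60, volts, 0.5)  # 60 W load, 0.5 ohm round trip (assumed)
        print(f"{volts:>3} V: {amps:5.2f} A, ~{loss_w:5.2f} W lost in the cable")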


Many IP phones are PoE.

Ubiquiti used to sell PoE office lighting, oddly enough. Now discontinued https://www.bhphotovideo.com/c/product/1431160-REG/ubiquiti_...


It's kind of cool that you'd control lighting over IP supplied via the same PoE port, but a missed opportunity to not offer a unit with an AP inside.


It really is bonkers. I wonder if it was the case of a VP with a pet project who pushed it hard to ship, got a promotion to a different org, at which point it was promptly canceled.

Only makes sense if it had a line-of-sight wireless AP in it, like 40 GHz or (to a lesser extent) the new 6 GHz band.


Plenty of domestic applications still require higher voltage (well actually power but I presume you aren't suggesting 12V with high current). Vacuum cleaners, hair dryers, anything portable that produces heat. Building houses with low voltage sockets would be really frustrating for people living there.

Is the danger of 230V actually significant in this day and age? I wonder how many fires/injuries are caused by 230V that wouldn't happen at a low voltage.


Fires and injuries at 230V aren't caused by the voltage, but by improper installation and handling. Don't stick forks in outlets, turn the power off when you're working with it, etc.

Also, it's not the voltage that kills you, it's the amperage. With regards to the 12V discussion, there's also AC vs DC to consider; AC will flip voltage 60x a second, making your heart go haywire, whereas DC is a continuous jolt, meaning your heart and other muscles will freeze in place until the power is released again, like how a defibrillator works.


> Also, it's not the voltage that kills you, it's the amperage.

A bit off-topic, but I never liked that saying; it is kind of right but also so wrong on so many levels. Amperage is not a thing that happens on its own, it is always a result of voltage. Voltage is the driving force, so it is the voltage that actively kills you, by forcing amperage through your internals. When the killing happens, the voltage is the real murderer; the amperage is just the murder weapon.


My hypothesis has been that the saying is due to how we generally think about power supplies. The voltage is a fixed quantity but the current fluctuates based on what the circuit asks for and what the power supply can give you. Something could be rated at 100V, but if the supply can’t deliver any significant amps it doesn’t matter much


I=V/R


I have this vision of a future where everything just runs off of USB-C and USB-PD. Even today, except in the kitchen and the bathroom (refrigerator, washing machine, ...), you could totally run most things off of the 240W max. Much more so in an office. It would be so much more efficient than what we have now, and safer. No idea how realistic it is though...


I pulled the specs for a dual-compressor fridge this morning for an unrelated reason and it only draws 180W... maybe the fridge can run on USB-C too! https://www.bosch-home.com/us/productslist/refrigerators/fri...


Something about a giant fridge being powered by some small Apple brick gives me shivers. I guess it's doable but it feels wrong.


Office equipment doesn't really have "transformers" anymore; everything uses switched-mode power supplies. The way they work is that they have an output buffer (a capacitor or inductor) and rapidly turn a switch on and off to keep that output buffer at the right level: if the output buffer is below the target voltage, the switch closes and the voltage will start to rise. Once the output buffer rises above the target voltage, the switch opens and the voltage will drop as power is being used from the buffer. This happens thousands or even millions of times per second. This principle works exactly the same for modern AC/DC power supplies as it does for DC/DC ones.

The problem with your plan is Ohm's Law, and more specifically the fact that wires aren't perfect and have some resistance (for now!). Ohm's Law gives us V=I*R, where V is the voltage in volts (V), I is the current in amperes (A), and R is the resistance in ohms (Ω). In a wire, the resistance is constant, and V is the voltage loss across the wire. So how do we reduce the loss? We reduce I. Luckily we only actually care about the total power, which is given by P=V*I. If we want the power to stay the same and reduce the current, we have to increase the voltage.

Let's say the two wires from our central transformer to the computer are 14-gauge copper, and they are 100 feet long. Their resistance is about 0.5Ω combined. We want to power a 120W computer. If we transfer that at 12V (the normal voltage computers use internally), we'd have to transfer 120W/12V=10A. The voltage loss across our wires is 10A*0.5Ω = 5V! So we put in 12V, but get out only 7V as we burned 50W in the cable itself. To get out the desired 12V we'd have to put in 17V at 10A instead, or 170W to power a 120W computer. It would also mean supplying way too high of a voltage to a computer connected with a 3-foot cable.

If we increase the voltage across the wires to 120V and down-convert that to 12V at the computer we'd only need to conduct 1A and the wire loss would be 0.5V, which at 1A is a power loss of 0.5W. That's completely acceptable, and because the computer down-converts anyways we don't really have to care about it getting 119.5V instead of 120V either.

But now we are back with a power supply at each individual computer, so in the end we didn't really gain anything. Instead of an AC/DC power supply in every computer we now have a virtually identical DC/DC power supply, so what's the point? You might have some small gains by doing the initial AC/DC conversion centrally, but in practice it probably isn't enough to care. It is only really worth it when your power comes from DC anyways, like an office with rooftop solar.

Alternatively we can use way thicker cables, but to get that same 0.5W loss at 10A would mean a wire with a resistance of 0.005Ω. To illustrate, that means using two 0000 AWG wires in parallel.
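
The same arithmetic as above in a few lines, reusing the comment's own assumptions (2 x 100 ft of 14-gauge copper, ~0.5 ohm round trip, a 120W load, first-order calculation at the nominal voltage):

    R_CABLE_OHM = 0.5   # round-trip resistance of the run (assumption from the comment above)
    LOAD_W = 120

    for supply_v in (12, 48, 120):
        amps = LOAD_W / supply_v
        drop_v = amps * R_CABLE_OHM
        loss_w = amps * drop_v
        print(f"{supply_v:>3} V feed: {amps:5.2f} A, {drop_v:4.2f} V drop, {loss_w:6.2f} W lost in the wire")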


Yes, that's the tradeoff. High voltage for the long haul, low voltage locally is common.

There are computer equipment racks where distribution within the rack is at 12 VDC. These often have big busbars in the back, and a power supply in the base. Facebook's OpenRack started at 12VDC, but a later rev is at 48 VDC. That's just within the rack; there's a power supply in the rack base running off something like 3-phase 220VAC. There are advantages to running off 3-phase power; there's always power available from at least one phase, and the capacitors needed to smooth DC are far smaller.

Telephone central offices have run the whole office at 48VDC for a century, with a big battery for backup power. Big bus bars carry that around the building. (Do telco offices still do that?)

Much industrial control gear runs at 24VDC. So do many military vehicles. It's a reasonable voltage to send a few meters, but not hundreds.

(The extreme case is ultra-high voltage DC power transmission, where power is sent thousands of kilometers at a million volts.)


For telco gear - at least the local distribution stuff in a box on the side of a road - a common setup is AC for normal operation, but 24VDC (or is it 48VDC?) between UPS and device. Because of the telco heritage those devices come with both AC and DC power inputs, and using both protects against power supply failures and allows the UPS to last a bit longer.


Not op but thanks for the very detailed explanation!


> Office equipment doesn't really have "transformers" anymore, everything uses switched-mode power supplies. The way they work, is that they have an output buffer (a capacitor or inductor) and rapidly turn a switch on and off to keep that output buffer at the right level...

The overwhelming majority of mains-powered devices do, in fact, have a transformer. It is part of a switched-mode power supply, though. The only devices that do not have one are those that are fully enclosed in plastic, where the user cannot under any circumstances come into contact with any of the conductive parts. Typical examples would be an LED light bulb or a wall-socket-powered WiFi repeater (without an RJ45 port).

It is pretty hard to create a transformer-less device using a plain rectifier/switch/capacitor topology. The biggest issues obviously are the high voltage before the switch and the dead time when the mains voltage drops near zero 100 times per second. Most switch ICs made in the "west" cannot go up to 325V and are thus usually used with a transformer to step the voltage down below 60V. If you get a Chinese chip that can work off-line (as in directly with the 325V mains voltage), such as KP1063 <https://datasheet.lcsc.com/lcsc/2103171532_Kiwi-Instruments-...>, it still needs an inductor to bridge the mains dead time. This is incidentally done by the transformer in traditional power supplies. A capacitor would have to be huge to smooth over those dead times for any significant power draw.

> If we increase the voltage across the wires to 120V and down-convert that to 12V at the computer we'd only need to conduct 1A and the wire loss would be 0.5V, which at 1A is a power loss of 0.5W. That's completely acceptable, and because the computer down-converts anyways we don't really have to care about it getting 119.5V instead of 120V either.

We could use 48V, which would bring the cable losses to about 3.3W (for those 100 ft). Or maybe instead of wiring from the mains box, we would place transformers along the wall sockets. Some new installations already do that and install charging USB ports. USB-PD is now specified up to 48V / 5A.

I think that we are already seeing devices abandoning the traditional DC connectors in favor of USB. A lot more people now have a large bank of charging USB-A and USB-C ports on their desks. Some vendors already integrate them into extension cords. It won't take long for LED lamps to come with a USB cord and an optional mains/USB transformer SMPS ("phone charger"). Laptops are already charging (and docking) via USB-C with PD; it might not take long for screens to follow.

Stepping 48V DC down to 12V, 5V, 3.3V is way easier and can actually work the way you have described above.


At best I've seen USB ports in wall and desk outlets, but I suspect they have transformers inside the housing.


Transmission losses in DC are higher, so you need a lot more wire. You actually can get off the shelf low voltage lighting, though, but it tends to be high end designer stuff.


> Transmission losses in DC are higher

For the same voltage and wire size, transmission losses in DC are the same or lower (due to skin effect and capacitive/inductive losses). They are only higher when comparing low-voltage DC with higher-voltage AC, but the key difference is the voltage, not DC versus AC.

(As a bonus, DC has lower peak voltage for the same RMS voltage, so the wires need less insulation.)


Sure, but GP and OP are talking about supplying power at the voltages devices (such as LEDs) currently consume.

AC is associated with higher voltage in this context even if that isn’t intrinsic.


Nah 12V MR16 lights are quite common.


> Thanks to the power efficiency of Apple Silicon, the M1 Mac mini was the perfect hardware to test out PoE, as on idle, the device only consumes 6W. When some load is applied to the internals, that power draw can go up to 40W. After some thorough research, we found out that the maximum throughput of Power over Ethernet was 15.4W and that too over varying voltages, which are details that Ivan had left out when showing off his findings on Twitter.

The last sentence has enough typos that I'm not able to follow what they're trying to say. What happens when the machine requires more than 15.4W? If the thing isn't actually usable or stable in real-world scenarios, this becomes a lot less exciting.

It'd also be more interesting if the full components list of what was added to the inside of the machine to make this possible was shared.

Edit: Thanks to @ravetcofx for revealing how more power can be delivered over PoE https://news.ycombinator.com/item?id=36962808


There are more details in the twitter thread, including a table of PoE standards:

https://twitter.com/Merocle/status/1686093369322176512

> What happens when the machine requires more than 15.4W?

The voltage will sag and the machine will likely crash!


> The voltage will sag and the machine will likely crash!

USB-C has negotiated power. If they did things right, it should act like a 15W usb adapter, where macOS will gracefully handle the limited power.


It’s a desktop device though.


The article simply assumes that the creator only supports the original POE when they say "POE". There's a good chance they used a PoE+ or PoE++ adapter that supports more wattage.


The actual source referenced by the article appears to be this tweet thread: https://twitter.com/Merocle/status/1686093369322176512


Having inadvertently plugged a live 24V passive PoE connection into an old 2012 Mac mini and immediately fried it, I'd say this is welcome news!


I thought most modern devices had isolation!


Indeed, surprising.

From IEEE 802.3 (revision 2012), section "32.6 PMA electrical specifications":

> The PHY shall provide electrical isolation between the DTE or repeater circuits, including frame ground and all MDI leads.

> This electrical separation shall withstand at least one of the following electrical strength tests:

> [...]

> b) 2250 Vdc for 60 s, [...]

Non-compliant Ethernet PHY?


Passive PoE is always on

Active PoE is negotiated with handshake


Yes, but passive PoE is almost universally at 24V, and per the Ethernet spec (as quoted above) an Ethernet PHY should tolerate 24V fine. This is important as transients from nearby lightning or occasionally even coupling to power cables can produce this kind of voltage. Ethernet connectors are magnetically coupled for protection from these transients.

The problem with passive PoE in these cases is, I think, not the voltage so much as the current. The continuous 24V supply may overheat the magnetic coupling transformer and cause it to fail. Some Ethernet interfaces, usually on telecom equipment and quality switches, have overcurrent protection to prevent this. Unfortunately consumer devices usually don't.

It's important to understand this because 802.3af etc. does provide power without being asked - as a test for a characteristic resistance on the receiver. Otherwise it wouldn't know if a PoE-capable device was connected. Up to 20V can be applied during this process, but it is time limited. In general, 802.3 PoE supplies must monitor the current usage of the powered device and cut off power if it is too high or even too low for more than a short period of time. This is in part to prevent this overheating problem on devices that might, for some coincidental reason, fall into the appropriate resistance range to activate PoE.

In other words, 24V or even hundreds of volts for a few seconds is perfectly safe. 24V for minutes is likely to cause damage to devices without better protection than the spec requires. Old Ethernet equipment used to make the non-isolated components relatively easy to replace so that repairs after a problem like this were easier, but now the isolation is a tiny surface-mount part and replacing it requires tools and skill.
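
For reference, roughly how the 802.3af/at startup sequence goes as I remember it (voltages are approximate; check IEEE 802.3 clause 33 before relying on them):

    # Approximate 802.3af/at power-up stages (figures from memory, not quoted from the spec).
    POE_STARTUP = [
        ("detection",      "probe at ~2.7-10 V, look for the ~25 kOhm signature resistance"),
        ("classification", "apply ~14.5-20.5 V briefly to read the requested power class"),
        ("power-up",       "apply the full 44-57 V only after a valid signature"),
        ("maintain power", "cut power if the device stops drawing a minimum current"),
    ]

    for stage, what_happens in POE_STARTUP:
        print(f"{stage:>14}: {what_happens}")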


Looks like PoE uses pins 4 and 5 for +48V, and 7 and 8 for GND, which respectively are pairs C and D of GbE. On ISDN, the pinout could vary, but one I could find said 1-2, 4-5 for V+ and 3-6, 7-8 for V-. In either case, both sides of the isolation magnetics are connected to pins of the same potential, and current is only proportional to the voltage imbalance within each pair, which should be minuscule.

I wonder if the problem is that 1:1 signal transformers for decoupling are being replaced with a simple DC blocking circuit. That can be most simply done with a spare 0.1uF cap and a 10k resistor per pin, which by the way generates a load of I = V/R = 48V/10kΩ ≈ 5mA, and 48V * 5mA ≈ 230mW > 1/6W. That could cause the resistor to burn out if phone-sized components were used. Or if the cap is only rated for 10V, it could burn open. I have nothing to support these hypotheses though; I could be entirely off.

Also I suspect the "passive PoE" mentioned above could be cheap injectors with a lot of corners cut and an always-active 48V. Those are widespread for surveillance cameras and other neckbeardy applications.


> Looks like PoE uses pin 4 and 5 for +48V, and 7 and 8 for GND, which respectively are pair C and D of GbE.

There are three common pinouts. You've described 'alternate B' -- the unused pairs of 10/100; you can also use 'alternate A', using the pairs used for 10/100. Or 4-pair, which uses both.


Most people don't realize the huge transient voltages in wires of any significant length that can be induced by nearby electrical storms.

Granted, CAT 5-8 cables are supposed to be twisted pairs and sometimes shielded, but it's imperfect.

PS TIL: https://en.wikipedia.org/wiki/Litz_wire


Note that there are real 48V 802.3af/at Power over Ethernet devices, where the power source has to verify that the powered device wants the power before it powers up, and fake "passive" 24V systems where the line is always energized - probably where the GP got the injector from.

My personal involvement with this was several years ago when I was going to buy several UniFi access points. It turns out you have to be careful, because many of the models advertise as being PoE but in reality are janky 24V passive systems. I have not kept up with the current UniFi lineup, but at the time you had to make sure to get the "AC Pro" to have real 802.3af compatibility.


I have a load of Unifi stuff here and it all supports active PoE, but works just fine with passive. The PoE outputs all autonegotiate.


Eh, it is not like they are outright lying; when you dig into the specs they do say 24V passive. But when someone says PoE I expect 48V 802.3af-compatible PoE.

I don't know how true it is, but I heard horror stories where Ubiquiti 24V gear would activate, then fry when hooked to a real PoE source. Personally I suspect it was a 48V passive source.

I did a quick informal survey of the UI store and it looks like the newest generation (U6) all handle 48V, while in the prior generation (AC) the Pro models were 48V and the Lite and Long Range models were 24V passive.


Some of the older Unifi kit was non standard

By older I mean you could still buy it three / four (?) years ago


It should be that all devices have isolation. Ethernet is capacitively coupled which keeps you from having to coordinate your power/ground between devices. PoE is the recognition that if the DC levels don't matter because of that required isolation, we might as well not isolate in a power stage before the magnetics and deliver some DC power.


> Ethernet is capacitively coupled

It's usually magnetically coupled. I've only seen capacitive coupling when both devices are on the same PCB


Merocle is also the creator of the Raspberry Pi Blade.

https://www.kickstarter.com/projects/uptimelab/compute-blade


I have been thinking that PoE is seriously underutilized for what it's capable of. There are so many use cases where one needs both networking and power, yet if you want a one-cable solution, you are forced to contend with wireless which, while very reliable in a lot of scenarios nowadays, can still suffer from interference and complicated setup. Sonos and other wireless speaker systems are one of my main examples of obvious PoE use cases that have yet to be realized.

Hats off to Ivan for their amazing project; the reliability and seamless switch between AC and PoE is especially impressive [0].

[0] https://twitter.com/Merocle/status/1686093369322176512


PoE makes sense if you have a house or building with ethernet cables everywhere, but using your Sonos example, 99.9% of people or more do not have ethernet cables everywhere. They do have regular electricity from an outlet and wifi though.

That said, I'm all for installing ethernet cabling and the like in every room. But it'll likely only happen for new builds.


This seems to simply ignore the max continuous power rating of 150W. The mini can output 15W on each thunderbolt port and another 15W on its two USB ports, which is already nearly the limit of 51W assured to each PoE++ endpoint, not even counting the mini's own requirements.
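
Adding up that worst case with the figures quoted in this thread (the ~40W peak from the article plus fully loaded ports; figures as quoted, not measured):

    # Worst-case draw if the mini and all of its ports are loaded.
    mini_peak_w = 40            # M1 mini under load, per the article
    thunderbolt_w = 2 * 15      # 15 W per Thunderbolt port
    usb_w = 15                  # 15 W across the two USB-A ports
    total_w = mini_peak_w + thunderbolt_w + usb_w

    print(f"Worst case: {total_w} W vs. 51 W (Type 3) or 71 W (Type 4) assured at a PoE++ device")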


PoE is a mature technology. Curious that laptops have not been on board.


An electrical outlet or USB-C connection is more readily available to me than an Ethernet port with POE. Laptops aren't onboard because the problem of powering laptops is largely solved. Also...not many laptops have Ethernet ports.


Historic energy consumption is probably higher than what classic PoE supported.

Also, any length of PoE run gets voltage drop, and PoE switches and injectors often have tedious modal configuration based upon the length of the run, and are designed with non-standard limitations such as maximum draw limits shared across multiple ports, which in aggregate will cause no end of issues. For verification, ask any experienced CCTV installer. These are exactly the sort of issues that cause users to take products back to their distributors.

So it's a case of "works in theory, PITA in reality, probable support and brand image impact huge, resulting priority zero".


It's really popular in factories etc, because it reduces cable runs I guess.


That's right, but factories have other problems too: especially EMI due to huge loads being switched, which twisted pairs, shielded twisted pair, and Ethernet termination magnetics are designed to resist. Also, industrial cable routing can get pretty hairy; this means the ability to round a corner at a relatively tight radius can be important. Using standard Ethernet cables for low-current DC power keeps these concerns generally within acceptable realms of normality.


The more direct problem in my experience is that all kinds of network devices that could reasonably be powered that way don't support it. My Fritz!Box router and repeater don't support it, my Sophos firewall doesn't support it (though the Sophos XGS can supply PoE, it doesn't seem to be able to be powered that way either), and of course the Hue hub and Raspberry Pi can't be powered by PoE either.


Is an ethernet port even common on laptops anymore? I thought it was one of the first victims in the great port culling.


I use PoE extractors to power a few different devices in my house, including RPis and some non-PoE switches. It's ridiculously easy to use them, but you generally need to know the voltage you want ahead of time.


PoE splitters with USB outputs are really handy since "everything" plugs-in to USB now.


I got some POE adapters for my Google WiFi points so I can place them around the house and not mess with an extra power cable. Works great. You can get them on Amazon with various outputs (USB, USB-C, barrel connectors of various types).


What's your experience getting gigabit adapters?

In my experience, cheap ones on e.g eBay will give you 10/100, but I don't think they typically support gigabit.


The ones that I got say they are gigabit and are currently $12.50: https://www.amazon.com/gp/product/B07TJ3ZNJ4


Holy moly, didn't know these things existed :O And fairly cheap as well. I always thought PoE was a bit useless since most devices don't support it and it was (is?) quite dangerous.


If you have modern PoE dispensing (?) equipment it's pretty safe as the devices all auto-negotiate.


I really want a bi-directional USB PD <-> PoE adapter. Do those exist?


Any reason he could not just use a $10 off-the-shelf PoE splitter?


a) because he can

b) for the lulz

c) optics


Has anyone accomplished powering an Apple TV with PoE?


That's what I want as well. It will/should always have an Ethernet cable attached anyway, but why do I need to run two wires?

Apple is a little weird, in that they clearly have no love for cables, yet they refuse to adopt PoE. The Apple TV and HomePods are prime candidates for PoE.


I also wish the HomePod supported wired networking in some configuration. It's powered via USB-C but only uses it for power, whereas you could totally use some USB-C + Ethernet dock. It's not like iOS (which HomePods are kinda based on anyway) doesn't support those.


A few days ago I found "Charging My MacBook Air M1 with a Standard Mobile Phone USB Charger" [1]. PoE is in a similar power range.

[1] https://news.ycombinator.com/item?id=36893299


I’d enjoy reading about this more if the page wasn’t riddled with ads.


How about we take it a step further and carry data over power line? Look ma no wires.


Cue snarky comments about how it's 2023 and no one uses CAT cables anymore in 3...2...


>power over Ethernet instead of AC

Did they used to get air conditioning over Ethernet? This doesn't make any sense to me.


AC in the context of electricity has meant "alternating current" for literally more than a hundred years. This is a you problem, not OP's, so turn down the snark. https://en.wikipedia.org/wiki/Alternating_current


Could be read as humor, more charitably.


AC, or alternating current, is a type of power. Usually available as a wall plug in your house.

DC, or direct current, is another type. For example a battery. Or in this case, PoE.


Your comment may inspire someone to make an air conditioner that runs on PoE...



