ACE: Apple Type-C Port Controller Secrets (t8012.dev)
277 points by aunali1 on Dec 30, 2020 | 125 comments



Apart from the processors, I'm amazed/bewildered that USB is now complex enough that the power component of it has its own serial communication protocol (USB-PD), which also has a vendor registry and is extensible with proprietary messages - and which, by the looks of it, is already in widespread use for all sorts of things that have nothing to do with power delivery.

Isn't this sort of reinventing USB on top of USB?


When I lost the charger for my Nintendo DS, I was able to MacGyver one out of a piece of cardboard and two strips of aluminium foil.

A quick look at the USB-A port spec and the pin locations on the similar-to-USB-but-proprietary DS charging port indicated it was a very simple construction.

Amazingly, this hack worked successfully and I was able to have my yearly Elite Beat Agents binge. The hardest part was actually not building the "cable" but balancing and bending it all just right so that the pins actually maintained contact.

Now, in the days of complex calculated power delivery negotiation, a hack like this would be impossible.

And for good reason. Devices are safer now, more compatible with each other's chargers, and can charge faster. And the things we can do with USB-C/USB-PD are fantastic. I discovered yesterday that if for some reason I really want to, I can charge my MacBook Pro from my Android phone.

But my little foil hack was fun at the time.


I forgot to take my MacBook Pro's charger with me but could still charge it using my iPad's charger (USB-A to USB-C, mind you) - albeit very slowly. Still better than nothing, though. If only iPhones had USB-C…


Yes, Apple seems to be very confused in this regard. They’ve committed to USB-C for all of their computers, but then are continuing to push lightning for all the other portable devices.

It’s hard not to be annoyed at them about this. Enough to not buy Apple again? No, not yet, but I’d be lying if I said it’s not a reason why I’ve delayed purchasing a new device. It leaves a bad taste in my mouth, and makes me grumpy. So I’ll wait until the last moment to upgrade.

On a similar note, it seems Nintendo has hobbled USB-C on the Switch in some way so that standard equipment won’t work with it, like charging over a generic USB-C charger. Is this accurate? Has Sony similarly hobbled the spec for the PS4/PS5?


As I understand it, Nintendo didn't intentionally hobble their USB-C port, they just screwed up the Power Delivery implementation. (And didn't care enough to fix it.)

Some chargers are tolerant of Nintendo's mistakes, others aren't.


The dock uses a non-standard implementation of USB-PD; the Switch itself can charge just fine with just about any charger, though. As I understood it, the dock randomly switches between different voltages depending on load, while portable mode uses plain old USB-PD with a consistent charge. While you can use the dock with Apple's 61W charger, it's not really recommended, since it can still get bricked by a software update. The 87W/96W charger isn't safe for the dock at all.


> Apple... committed to USB-C for all of their computers, but then are continuing to push lightning for all the other portable devices.

The iPad Pro and the new iPad Air have a Type-C connector. My phone and watch use cables that plug into the same Type-C brick I use for my Mac.

But nitpicking examples aside, I agree that they should ditch Lightning. It's still entrenched across all the other devices.


USB-C is now a requirement for me to upgrade my iPhone. I'm holding onto my XS until they switch to that.


I had to clean the Lightning socket on my iPhone XS as it had some pocket fluff inside that prevented the cable from fully clicking in. Cleaning it was trivial, just some cardboard and a well positioned light source.

On the flip side, cleaning out USB-C sockets from the same types of fluff is a pain because the data pins are on an island in the middle of the socket. That gives you both less room (requiring you to use a thinner implement) and a greater risk of damaging the socket.

USB micro had the same problems as well.

I really wish USB would address this because, with the best will in the world, devices sometimes get dirty.


> That gives you both less room (requiring you to use a thinner implement) and a greater risk of damaging the socket.

A flattened wood toothpick works great for this. Take a regular round pointy toothpick and flatten the end 5mm or so by squeezing it in a pair of pliers or some such.

I was ready to replace my phone until I learned this trick. I had already tried cleaning the port out, but obviously not well enough.


I think we’re more likely to see a port-less (read MagSafe only) iPhone sooner than a USB-C version. I will say the iPad Pro with USB-C was a revelation for usability though. Not to mention companies like Flir being able to create one product line.


I don’t know about that. The physical port is still useful for hard resets and management of a “bricked” device (granted I’ve only needed that on an older device, and never on the model I currently have).


Does Apple still get a cut from Lightning licensing? If so, I think they will move engineering heaven and earth to avoid losing that revenue stream, so I also agree with the above poster: I’d put money on them moving to wireless.


That's absurd.

There is zero possibility that third-party licensing income for the Lightning port standard is a consideration in any engineering decisions at Apple.


Apple could keep around a hidden diagnostic port, like they do on Apple Watch and HomePod.


The future of iPhone charging is wireless. The current iPhone 12 with MagSafe is the first of the new system. I predict future iPhones will have no physical ports at all, 100% wireless.


The lack of USB-C is the reason I didn't replace my Airpods with more Airpods when their battery died.

(Still looking for a decent replacement -- I unfortunately cannot use the kind that has those rubber seal plug things, I must have the hard plastic style like the non-pro Airpods)


Just got the Soundpeats TrueAir 2; they're a very similar design to AirPods. The case is even slightly smaller; they feel much lighter/cheaper but are well built enough, work reliably, and have USB-C.


I think it is because they had already built out a Lightning ecosystem years before USB-C came along and mostly solved the same problems Lightning did.

This left Apple in a weird spot. They could keep the iPhone on Lightning and users could continue using the same cables and peripherals they have been since 2012. Or, they could switch. This would require many users to buy new accessories, but they could be generics that work with any phone.

The switch from Lightning to USB-C is not as advantageous as the switch from the old 30-pin connector to Lightning was. To users already invested in the Apple/Lightning ecosystem, a switch to USB-C would not bring much benefit.


Not for all their portable devices; the iPad Pros have been USB-C since 2018, and the iPad Air since September this year.


Ah, thanks. I actually totally forgot about that. Maybe I should get one now (2 years later) to help spur them on.


What's funnier is that Apple already has an iOS device that has USB-C for its main port, the iPad Pro. So I'd presume most of the engineering work to get rid of Lightning completely has already been done at this point.

On Switch, I did charge mine from a powerbank with a USB-A to USB-C cable, it did work but there clearly wasn't enough power. The battery kept draining but very slowly.


All these complex theories/complaints about lightning. Giving them the benefit of the doubt, one good reason for not shifting to USB-C is to keep IP ratings high.

Most USB-C sockets I can find are waterproof only with a matching cable plugged in.


> All these complex theories/complaints about lightning. Giving them the benefit of the doubt, one good reason for not shifting to USB-C is to keep IP ratings high.

Also accessories. Sure, there's a sunk-cost fallacy component to it, but still: it wasn't that long ago that Apple users needed to replace all their 30-pin dock connector accessories with Lightning ones.


There’s such a long tail of it, too: I still see 30-pin dock connectors in hotel room phone docks, etc.


All of the Android phones with USB-C and high IP ratings make that seem unlikely.


Before USB-C I would have agreed about giving them the benefit of the doubt. But as a sibling comment pointed out, they have an iPad model that is USB-C; my hope is that this means the entire line of products will switch over.

I’m actually guessing that they have a long tail of hardware designs that all need to be changed, and that process takes a while. Even their new headphones are lightning based, though they were under development for 4 years... that’s as far as I’ll give them.


I think they're keeping Lightning to help their 'eco' brand.

They just removed chargers from the iPhone box because 'everyone has one' - if they suddenly changed to USB-C on iPhone it would create a LOT of ewaste.

I still think they should switch, but I reckon it's because they'd get accused of hypocrisy if they deprecated it.


Even apple chargers are regular USB at the charger-end, right? Just include a cable then, problem solved.

Or the apple way: an adapter.


Most of them, anyway.

I've been putting an Apple Watch SE through its initial paces over the past week and was mildly disappointed to find that only the Series 6 ships with a Type-C cable in the box; the SE uses Type A.

The same split is true for iPhone 12 Pro, so far as I can tell, even though (curiously) my 11 Pro from last year did include a 20W Type C Power Adapter in the box.


They could just as easily spin it as “Now you can use the charger from your macbook, meaning only one cable!”

Or, if they want to get really wild: they can openly talk about the other usb-c devices that you might have.


Part of me thinks they may intend to get rid of the physical ports altogether on the iPhone. This would serve to differentiate the "productive" iPad from iPhone, and allow them to scratch that minimalist itch they get every once in a while.


I can charge my Switch on random USB-C power sources.

(No clue whether it's as effective as the included charger, but certainly works well enough.)


That’s good to hear? I think I was seeing issues with that on some of my chargers, maybe I need to double check the power ratings on the bricks.


Huh. I think that's a good thing. Slow charging is better than a dead battery and unusable device.

My work HP laptop has USB-C charging but won't charge at all without PD. I tried it with a USB-A - C cable and a fairly hefty adapter without PD.


> When I lost the charger for my Nintendo DS, I was able to MacGyver one out of a piece of cardboard and two strips of aluminium foil.

I did the same thing for my 2DS XL, but instead modded the device with a non-PD USB-C breakout board. Works great with both USB-PD and regular USB-C chargers and cables. https://nullsum.net/posts/usbc-mod-new-2ds-xl/

> Now, in the days of complex calculated power delivery negotiation, a hack like this would be impossible.

The only devices I've seen that _require_ USB-PD are PCs. All the modern phones, headphones, and gaming handhelds I've seen with USB-C do not require USB-PD and charge just fine with a USB-A to USB-C cable.

Better yet, these devices all use a standard USB-C port instead of a proprietary one. The only gotcha is that some USB-C devices break spec and don't work with some USB-C to USB-C cables. Both the Pi 4 and RG351P suffer from this.


Ah, that's a good point, I suppose many devices will still slow-charge from an old style connection with USB-A on the other end.


If USB-C retains backward compatibility with the existing USB PD implementations, you’ll be able to signal that you want power by permanently affixing a shunt resistor across your two power lines. If you need more control, then you upgrade that to something like a voltage-controlled resistor so that you can do more complex signaling. But USB PD was always designed to support dumb AND CHEAP devices, so if you didn’t need the extra functionality you didn’t need to implement any complexity.
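
(As an aside: on USB-C specifically, that "dumb resistor" signaling lives on the CC pin rather than across the power lines. A sink advertises itself with an Rd pull-down and can read the source's current offer straight from the CC voltage, no PD chatter required. A minimal sketch, with adc_read_cc_millivolts() as a hypothetical helper and thresholds taken from the commonly cited Type-C detection bands:)

    /* Sketch of the "dumb sink" side of USB-C current advertisement.
     * Assumptions (not from the thread): a 5.1 kohm Rd pull-down from CC to
     * GND, and adc_read_cc_millivolts() as a hypothetical HAL helper that
     * returns the voltage the source's Rp pull-up develops across that Rd.
     * Thresholds follow the commonly cited Type-C sink detection bands. */
    #include <stdint.h>
    #include <stdio.h>

    static uint32_t adc_read_cc_millivolts(void) { return 900; } /* stub value */

    typedef enum {
        SRC_NOT_ATTACHED,
        SRC_DEFAULT_USB,   /* default USB power: 500 mA (USB 2.0) / 900 mA (USB 3.x) */
        SRC_1A5,           /* 1.5 A advertised */
        SRC_3A0            /* 3.0 A advertised */
    } src_advert_t;

    static src_advert_t classify_cc(uint32_t mv)
    {
        if (mv < 200)  return SRC_NOT_ATTACHED;
        if (mv < 660)  return SRC_DEFAULT_USB;
        if (mv < 1230) return SRC_1A5;
        return SRC_3A0;
    }

    int main(void)
    {
        /* No PD traffic needed: the sink just reads a DC level on CC. */
        printf("advertised level: %d\n", (int)classify_cc(adc_read_cc_millivolts()));
        return 0;
    }

(Anything smarter than that, such as higher voltages or alternate modes, is where the actual PD protocol comes in.)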


USB-PD is almost always in place on top of normal USB 5V, 500mA current delivery as a fallback. Your hack would have worked if you had a type-C plug you could have hacked apart and soldered to, and charging would have been dreadfully slow.


Yes, it seems like you're absolutely right, thanks for the correction!

I should have realised this was obvious - new phones still charge just fine off any old USB-A charger, provided you have a USB-A to C cable -- they'll just slow charge.


> Isn't this sort of reinventing USB on top of USB?

It's not on top of USB, it's a completely independent side channel. And it cannot be on top of USB, since its second main use (besides negotiating the voltage and current on the main power pins) is to negotiate the alternate modes used for the pins which normally carry USB (not only the USB 3.x pins, but also the USB 2 pins in the VirtualLink alternate mode). You could think of USB-PD as being below USB, except that's also not true, since you can use USB-C without USB-PD (using only the resistors on the CC pins to detect cable orientation and maximum current).
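
(To make the "negotiating the voltage and current" half concrete: when PD is present, a source sends a Source_Capabilities message containing 32-bit Power Data Objects. A hedged sketch of decoding a Fixed Supply PDO, using the field layout from the published PD spec; an illustration only, nothing to do with Apple's actual ACE firmware:)

    /* Sketch: decoding a Fixed Supply PDO from a USB-PD Source_Capabilities
     * message. Field positions follow the published USB-PD spec (bits 31:30
     * = 00 for fixed supply, voltage in 50 mV units at bits 19:10, maximum
     * current in 10 mA units at bits 9:0). Illustration, not firmware. */
    #include <stdint.h>
    #include <stdio.h>

    static void decode_fixed_pdo(uint32_t pdo)
    {
        if ((pdo >> 30) != 0) {                    /* 00b = fixed supply */
            printf("not a fixed-supply PDO\n");
            return;
        }
        uint32_t mv = ((pdo >> 10) & 0x3FF) * 50;  /* 50 mV units */
        uint32_t ma = (pdo & 0x3FF) * 10;          /* 10 mA units */
        printf("source offers %u mV at up to %u mA\n", mv, ma);
    }

    int main(void)
    {
        /* Example value constructed by hand: a 5 V / 3 A fixed PDO. */
        uint32_t pdo = (100u << 10) | 300u;        /* 5000/50 = 100, 3000/10 = 300 */
        decode_fixed_pdo(pdo);
        return 0;
    }

(The alternate-mode negotiation mentioned above rides the same link as Vendor Defined Messages, which is also roughly where the proprietary traffic described in the article lives.)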


I remember thinking USB 2.0 was overcomplicated; this is absolutely insane.


It was, and this is. You were right.


"Oh, 'USB 3.2 Gen 2' (aka 'USB 3.1' or 'USB 3.1 Gen 2' OR 'USB 3.2 Gen 2')?"

"There's your problem, you need 'USB 3.2 Gen 2 × 2.' You idiot!"


Just wait for USB4!


That's just Thunderbolt 3 without the Intel trademark "Thunderbolt" name, isn't it?


No, there are some differences; AFAIK, USB4 can also tunnel USB 3.x (Thunderbolt 3 uses the PCIe tunneling plus a PCIe xHCI USB host on the device), and its compatibility with Thunderbolt 3 is optional.


The good thing is that I can now fast charge my Pixel 4a with my MacBook charger, as the protocol allows the charger to pick the right output.


Not any different than Ethernet's add-on methods of OOB signaling for link integrity testing and auto-negotiation.


And it still hasn't got error correction?


The problem is that with short-haul serial links there is a very small operating range where error correction works. Basically, the difference between a working link (very low BER) and a link where even ECC is broken (high BER) is quite small, and adding ECC overhead is deemed an overall loss. These links have quite different characteristics from long-haul links that are dominated by ISI. One way to see this is to look at a waterfall plot and note how steep it is on short interconnects. There is a very small range between working fine and completely broken.


It does. (Checksums)


Aren't checksums for error detection and not error correction? As I understand it, error correction is having enough information to correct the error while error detection is only knowing an error has occurred.
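
(Right: USB relies on CRCs for detection plus retransmission rather than forward error correction. For illustration, here's a CRC-16 with the parameters usually documented for USB 2.0 data packets (polynomial 0x8005, reflected, init 0xFFFF, complemented output); a sketch, not a claim about any particular controller's implementation:)

    /* Sketch: CRC-16 used for detection (not correction). Parameters match
     * the commonly documented USB 2.0 data-packet CRC (poly 0x8005,
     * reflected, init 0xFFFF, output complemented). Illustration only. */
    #include <stdint.h>
    #include <stddef.h>
    #include <stdio.h>

    static uint16_t crc16_usb(const uint8_t *data, size_t len)
    {
        uint16_t crc = 0xFFFF;
        for (size_t i = 0; i < len; i++) {
            crc ^= data[i];
            for (int b = 0; b < 8; b++)
                crc = (crc & 1) ? (crc >> 1) ^ 0xA001 : (crc >> 1);
        }
        return (uint16_t)~crc;
    }

    int main(void)
    {
        uint8_t payload[] = { 0xDE, 0xAD, 0xBE, 0xEF };
        /* The receiver recomputes this; a mismatch means "please retransmit".
         * It carries no information about *which* bits flipped. */
        printf("CRC16 = 0x%04X\n", crc16_usb(payload, sizeof payload));
        return 0;
    }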


Error correction only makes sense if the latency of the read/write is long (i.e. spinning physical media, co-shared radio channels).

FEC isn't free (~2x overhead per recovered bit, if I remember right), so if your errors are infrequent then it's worse to use FEC than to just resend/reread.
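
(A toy way to see that tradeoff: retransmission only costs you when a frame is actually corrupted, while FEC pays its overhead on every frame. Under some blunt assumptions: independent bit errors, perfect detection, an FEC that fixes everything, latency ignored:)

    /* Toy goodput comparison: ARQ (resend on error) vs. a rate-1/2 FEC.
     * Assumes independent bit errors, perfect error detection, an FEC that
     * corrects everything, and ignores latency/buffering entirely. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double frame_bits = 8000.0;   /* hypothetical 1 kB frame */
        const double fec_rate   = 0.5;      /* the ~2x overhead mentioned above */
        const double bers[]     = { 1e-12, 1e-9, 1e-6, 1e-4 };

        for (unsigned i = 0; i < sizeof bers / sizeof bers[0]; i++) {
            /* ARQ goodput = fraction of frames that arrive clean. */
            double p_ok = pow(1.0 - bers[i], frame_bits);
            printf("BER %.0e: ARQ goodput %.4f, FEC goodput %.4f\n",
                   bers[i], p_ok, fec_rate);
        }
        return 0;
    }

(At the error rates a healthy short link actually sees, the ARQ column stays essentially at 1.0, which is the whole argument.)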


I guess checksums are ok for many applications if USB implements resending of data at the protocol level. For high-speed cameras it might be problematic though.


Reed-Solomon is most likely used here. It’s the tech in CDs and ECC RAM. Checksums that also have enough data for recovery.


Once again I'm amazed at what ALSO has a processor in it. I'm wondering if at some point I'll just see it as normal that nearly every device, no matter how static it may seem, has more processing power than my first computer.


Almost everything has at least an 8-bit MCU now. 32-bit MCUs are also extremely common now, and even simple ones like the ARM Cortex-M0 are competitive with a 286/386 (albeit with less memory / no MMU).

A rather surprising number of devices run very powerful application processors. An amusing example is Apple's Lightning to HDMI adapter, which has an ARM SoC with 256MB of RAM and boots a Darwin kernel in order to decode an H.264-compressed video protocol. Depending on what exactly they put into it (wouldn't be surprised if they borrowed the Apple TV chip for a relatively low-volume product like this), it may be more powerful than a fairly recent computer.


The first time I saw that adapter and realized it was basically an external video card, not a pin breakout, was very eye-opening.


Has anyone tried to run Linux on that Apple adapter?


Obligatory “can it run DOOM?”


Looks like this dev has patched the HW and has shell access https://twitter.com/nyan_satan/status/1322329047779713024?s=...


That’s delightful!

Anyone want to buy a few and make the first cluster of HDMI adapters?


Obligatory “can it run DOOM?”

Imagine a beowulf cluster of them.


Pretty much every microSD card has had an ARM CPU core in it since forever.

The Bluetooth radio chip in your phone and the RF chip in your car's key fob already have multicore CPUs inside.

Even your optical mouse has a multicore CPU in it to handle image processing and translate the optical feed into motion packaged into USB HID frames.

Your fast phone charger has a CPU in it to monitor and negotiate power delivery so as to not burn down your house.

Even your basic budget electric toothbrush has a 4-bit CPU inside it made by Swatch.

CPU cores are in everything these days [happy NSA noises]
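
(On the optical-mouse point: after all that image processing, what actually goes over the wire is tiny. A sketch of the classic 3-byte HID boot-protocol mouse report that the firmware packs its deltas into; illustrative, not any particular vendor's code:)

    /* Sketch: the 3-byte HID boot-protocol mouse report that mouse firmware
     * packs its processed sensor deltas into. Illustrative only; real mice
     * usually add a wheel byte and use a custom report descriptor. */
    #include <stdint.h>
    #include <stdio.h>

    typedef struct {
        uint8_t buttons;   /* bit 0 = left, bit 1 = right, bit 2 = middle */
        int8_t  dx;        /* relative X motion since last report */
        int8_t  dy;        /* relative Y motion since last report */
    } hid_boot_mouse_report_t;

    int main(void)
    {
        /* Pretend the optical flow engine measured this much motion. */
        hid_boot_mouse_report_t r = { .buttons = 0x01, .dx = 5, .dy = -3 };
        printf("report: %02X %02X %02X\n",
               r.buttons, (uint8_t)r.dx, (uint8_t)r.dy);
        return 0;
    }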


> Pretty much every microSD card has had an ARM CPU core in it since forever

A lot of them use 8051, presumably because it's cheaper and smaller: https://www.bunniestudios.com/blog/?p=3554


I've worked with TI BLE chips which were 8051 variants. Took me back to my original CS classes, as it is a very simple core with limited registers but can still be used to perform all the Bluetooth functions.


IIRC, the BLE stack on those was a separate (undocumented?) core; the 8051 was for user software that gave high-level commands to the BLE block through magic registers. Still fun to program, for sure.


The older ones for sure, but all the modern ones have moved to ARM7-TDMI then to Cortex-Mx cores.


Some of the more, ah, cost-reduced Bluetooth chips have 8051s or all kinds of weird, proprietary processors, presumably because the per-chip licensing fee for ARM is too much at that end of the market or something.


I am right now making a backlight with RGB LEDs for my monitors. It is going to have a Cortex-M4 ARM in it.

I put one into just about everything I build these days, regardless of how small it is.


Now I'm curious about the things you're building. They wouldn't happen to be on a blog somewhere?


No, I don't have a blog. These are just my private projects that I make for my own use. I am working as a software developer and learning electronics as a hobby.

The backlight is going to be a box supplied directly from AC, connecting up to 6 strips of individually addressable WS2812 RGB LEDs, providing up to 10A at 5V (50mA per LED == 200 LEDs at full power). It will be connected using galvanically isolated Full-Speed USB to the PC.

For now I will have some pre-programmed sequences, but I plan to make a piece of software that will make it possible to match LEDs to the borders of the image on the screen, though I have no idea how to do that at the moment.
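
(On matching LEDs to the screen border: one common approach, sketched here under the assumption that you already have a captured RGB frame in memory (how you grab it, and how the colors reach the WS2812 strip, are separate problems), is to split each edge into one segment per LED and average the pixels in each segment:)

    /* Sketch of border-color averaging for an ambilight-style setup.
     * Assumes a captured frame in a plain RGB888 buffer (row-major); frame
     * capture and pushing colors to the WS2812 strip are out of scope. */
    #include <stdint.h>
    #include <stddef.h>

    typedef struct { uint8_t r, g, b; } rgb_t;

    /* Average one horizontal strip of the frame (e.g. the top edge) into n LEDs. */
    static void average_edge(const rgb_t *frame, int width, int strip_height,
                             rgb_t *leds, int n_leds)
    {
        for (int i = 0; i < n_leds; i++) {
            int x0 = (i * width) / n_leds;
            int x1 = ((i + 1) * width) / n_leds;
            uint32_t r = 0, g = 0, b = 0, count = 0;

            for (int y = 0; y < strip_height; y++)
                for (int x = x0; x < x1; x++) {
                    const rgb_t *p = &frame[(size_t)y * width + x];
                    r += p->r; g += p->g; b += p->b; count++;
                }

            if (count)
                leds[i] = (rgb_t){ (uint8_t)(r / count), (uint8_t)(g / count),
                                   (uint8_t)(b / count) };
        }
    }

    int main(void)
    {
        enum { W = 8, H = 2, N = 4 };
        rgb_t frame[W * H], leds[N];
        for (int i = 0; i < W * H; i++) frame[i] = (rgb_t){ 200, 30, 30 };
        average_edge(frame, W, H, leds, N);   /* each LED ends up ~(200,30,30) */
        return 0;
    }

(The same function works for the bottom edge; for the left/right edges you swap the roles of x and y, or transpose the strip region before averaging.)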


I've also been playing with WS2812 LEDs as ambient/halo lighting around my monitor. I'm using a cheap ESP-32 dev board to allow control from my phone via WiFi even when my computer is asleep. I went much smaller though - only 16 pixels, powered by an old phone charger.

This works for now because my primary use-case is ambient lighting when the room would otherwise be dark. I'm planning to build some larger-scale higher-density light panels to provide more illumination for those dark winter days.


Is there a specific dev board with a Cortex-M4 that you like to use, or does it vary by project?


I have a bunch of STM32 Nucleo and Discovery boards that I use and which one I use will depend on circumstances.

For prototyping I use both breadboards and perfboards. I use breadboards for small fast prototypes and perfboards when I know I am going to develop it over a longer time or when I have some special requirements (like AC power on board or a component that has 2.54mm pitch but is not breadboardable) that exclude or make it more difficult to use on a breadboard.

For breadboards I default to the STM32L432 Nucleo-32, which is breadboardable and doesn't use much space.

For perfboard I default to either the STM32L452 Nucleo-64 or the STM32F303 Discovery. I don't solder them to the board but instead just insert them into the board and then put on a couple of pieces of plastic from a 2.54mm pitch header with the metal pins removed. This mounts the board securely in place without having to solder it. I use DuPont jumper wires to connect it to the rest of the board, where I would typically solder the rest of the components (unless I also don't want to solder them in for some reason).

I would typically solder in things that are disposable to me and that I don't want flapping around.


This fascinates me. Is there a website anywhere that is collecting these sorts of use cases? It seems truly mind blowing something as simple as an electric toothbrush would have a processor in it.


Nothing mind-blowing. A processor is cheaper than building analog circuitry.

Think about your toothbrush. All important timing parameters are configured digitally and you can easily change it. You can technically do the same with resonators but it would take much more board space, be less precise, require inductors which you want to avoid in the circuit, etc.


And cheaper/more flexible than custom digital circuitry too. Even if something is large-scale enough to justify a custom chip, that'll often be some components around a mask-programmed 8051 clone. Straightforward, well understood, and relatively easy to make variations of by just changing the program.


You still need to provide the CPU with a clock signal though.


No, not really. The STM32s I use (I mainly work with Cortex-M4) have internal resonators that are enough for just about anything unless you need precise timing. Certainly good enough for a toothbrush.


How would you explain this part of your argument then:

> You can technically do the same with resonators but it would take much more board space, be less precise, require inductors which you want to avoid in the circuit, etc.


Many microcontrollers (PIC, STM32, atmega8 etc) include an internal RC oscillator - which is literally inside the chip itself. Zero external components required.

Not only do you save the costs of using a crystal, you also save two pins - which was useful in the days of 8-pin microcontrollers like the ATtiny85.

As internal RC oscillator drift rates can be as much as 10% (and vary with temperature) they're not precise enough to run a serial connection, let alone a USB connection. That's why products like Arduino tend to go directly to using a proper crystal (which gives you a 0.01% drift rate for a few pennies).
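
(A quick worked example of why the drift matters for async serial: the UART receiver resynchronises on the start bit and then free-runs, so clock error accumulates across the roughly 10 bit times of a frame. Assuming the usual rule of thumb that only a few percent of combined error is tolerable:)

    /* Toy illustration of how clock error accumulates over a UART frame.
     * Assumes the common rule of thumb that an async UART only tolerates a
     * few percent of combined clock error before the sample point drifts
     * out of the bit cell; exact budgets depend on the UART. */
    #include <stdio.h>

    int main(void)
    {
        const double drifts[] = { 0.0001, 0.01, 0.10 };  /* crystal, trimmed RC, worst-case RC */

        for (unsigned i = 0; i < sizeof drifts / sizeof drifts[0]; i++) {
            double err_pct = drifts[i] * 100.0;
            /* After ~10 bit times the sample point has slid by 10x the error. */
            printf("clock off by %6.2f%% -> sample point off by %5.1f%% of a bit at the stop bit\n",
                   err_pct, err_pct * 10.0);
        }
        return 0;
    }

(Which is why the 10% case is hopeless for UART, a well-trimmed internal oscillator can squeak by, and a crystal doesn't even register.)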


I know next to nothing about microcontrollers, so this might be a dumb question, but how do they do the initial flashing if they can't run a serial port? Are they hooked up to something external before going on the final PCB? Do they not need a precise clock to read the onboard program from whatever's storing it?


Even if you've got a really inaccurate clock, you can still accept synchronous protocols like SPI and I2C where the bus master provides a clock signal.

Every chip brand would have their own protocol and provide their own programming hardware that could speak it.
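
(That's the key property: with a synchronous interface like SPI, the programmer drives the clock line, so the target just latches bits on clock edges and its own oscillator accuracy barely matters. A sketch of the programmer's side, bit-banged, with gpio_write()/gpio_read() and the pin names as hypothetical stand-ins for whatever GPIO layer you have:)

    /* Sketch of a bit-banged SPI master (mode 0) as a programmer might use to
     * talk to a freshly blank chip. gpio_write()/gpio_read() and the pin names
     * are hypothetical stand-ins; the point is that the master drives SCK, so
     * the target needs no accurate clock of its own. */
    #include <stdint.h>

    enum { PIN_SCK, PIN_MOSI, PIN_MISO };

    static void gpio_write(int pin, int level) { (void)pin; (void)level; } /* stub */
    static int  gpio_read(int pin)             { (void)pin; return 0; }    /* stub */

    static uint8_t spi_transfer(uint8_t out)
    {
        uint8_t in = 0;
        for (int bit = 7; bit >= 0; bit--) {
            gpio_write(PIN_MOSI, (out >> bit) & 1);  /* present data bit            */
            gpio_write(PIN_SCK, 1);                  /* rising edge: target samples */
            in = (uint8_t)((in << 1) | gpio_read(PIN_MISO));
            gpio_write(PIN_SCK, 0);                  /* falling edge                */
        }
        return in;
    }

    int main(void)
    {
        /* e.g. many vendor ISP protocols start with some "enter programming" opcode */
        (void)spi_transfer(0xAC);
        return 0;
    }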


yeah, almost all microprocessors in that price/power bracket have on-board oscillators, because it saves power, money, and space if you don't need a precise reference.


The Parallax Propeller has an on-chip RC oscillator. For obvious reasons it can only be used in low-clock scenarios, but many simple human-interface applications are fine with this. The Propeller is not a very common MCU, but it is fun to work with and has many technical merits.


> important timing parameter

You can also, well, use a regular toothbrush. What is actually important in a toothbrush has nothing to do with electronics.


Actually I switched to an electric toothbrush years ago as it cleans better. Now when I am deprived of my brush and have to rely on a normal brush I don't feel like I did a good job cleaning my teeth. Part of this may be subjective, but various tests show that an electric toothbrush cleans better than a regular one in most cases.

No, I don't need any special functionality other than to clean my teeth but if you were to design a toothbrush you would most likely be asked to implement those.


>Is there a website anywhere...

I doubt it; pretty much anything that has electronic circuitry has got a microprocessor. It's an off-the-shelf component, well understood, and much easier to change/modify and test than custom-built analog circuitry. For example, what's the option for saving end-user settings with analog devices? Knobs/potentiometers... Compared to non-volatile memory like NAND, the cost (and space and weight) differences are orders of magnitude.


Another thing to consider: more and more single-use items also have processors and electronics in them. At what point will disposable electronics have more processing power than, say, the Apollo flight computer?

A couple of examples that I've noticed:

There's a sports good store near here where they attach some form of RFID tag to each item (including individual protein bars), to automate the scanning. That means these are RFID tags intended to be scanned exactly once.

There's also the case of "digital" pregnancy tests, which consist of a regular paper pregnancy test, a processor to read out the result, and an e-ink display to show it. All of this is included in the single-use disposable predictor stick.


> At what point will disposable electronics have more processing power than, say, the Apollo flight computer?

Already way way way way past that point. There's an article that does the rounds on here about it, but I can't remember the name.


Those RFIDs make it possible to not just scan them when the article gets bought, but also to very easily scan the inventory of the whole store by just walking through the store with a scanner.

Just as useful, although currently still more likely to be done with barcode scanners, is identifying the products in the logistics chain. Right now you need scanning ports with at least 4 cameras/scanners, or humans manually scanning each barcode; with RFIDs the port would be simplified further.


Those RFID tags are pretty minimal though. Basically just a thin trace antenna and silicon die directly molded into plastic.

https://www.youtube.com/watch?v=KAm7qAKAXwI

I still agree that it's amazing that our society can make functional structures with feature sizes in the nm/µm ranges so cheaply that we can afford to throw them away.


I think the insight is that designing and manufacturing bespoke silicon logic is much harder than slapping an off the shelf ARM core into a design.


Design is relatively easy. Manufacturing is the hard part.


Virtually every power tool has a microprocessor inside. LED dimmers have microprocessors. Christmas LEDs (that can blink) have microprocessors in the control block. Compared to an Apple II, they might have more computation power but usually not 48KB of memory.


I mean, nobody would ever need more memory than that anyway


It’s been said before, about ROM chips for sure (though I don’t know if history necessarily agrees regarding other chips), that there’s a point at which the opportunity cost of “just enough” silicon is too high, and so you end up with a choice between way more than you need for simple tasks, or doing without (e.g., embedding that responsibility in some other chip).

I think a lot of projects will be targeting ARM CPUs in the next era of computing, and I hope one day we will see entire processes moving off the cpu and onto peripherals.

Give me a RAID controller that can run Postgres directly on it, or an SSD that can run SQLite. Give me a network card that runs eBPF, or even nginx.


You're basically surrounded by 8051 processors at all times. Tiny controllers, many probably OTP or mask-ROM, doing all sorts of basic management things.


USB-C cables that support high current (> 3A) have a chip inside that communicates with the power source to let it know it can carry high current. Only then does the source advertise the higher power profiles to the consumer.


I'll start freaking out when radios start appearing in places like this.


The real problem I see is when IoT crap starts getting 4G/5G modems with their own network plans so they can spy on you even while not connected to WiFi.


They already do.

I was looking at a sensor package (light, temp, humidity, particulates) earlier and it comes with a cellular modem and a free SIM with a modest data allowance for 2 years to send readings to the cloud.


I wish Z-Wave/Zigbee were more practical to use. I'm not too worried about someone local pulling readings out of the air, but giving all of this crap access to my WiFi, relying on cloud services that could go down at any time, and even adding a bridge straight in (your SIM comment) is just ridiculous. For things like cameras or other high-bandwidth applications WiFi makes sense, but for the usual sensor/switch stuff, there's no need for a full IP network.

Plus, all of the stuff being controlled locally means the latency is sooooooooo much lower. It's awesome being able to hit a switch in the Home Assistant app and have the corresponding plug or light turn on instantly. It's like you're flipping a physical switch.


> I wish Z-Wave/Zigbee were more practical to use.

That is essentially what Thread is:

https://en.wikipedia.org/wiki/Thread_(network_protocol)

It takes the Zigbee protocol, but adds encryption and makes devices IP-addressable. All of the major players are committed to supporting it, too!


That's not entirely what I mean though. The real issue is having people like my parents get used to using a hub.


Totally. I believe that one of the goals of Thread is to integrate the “border router” capability into newer routers.


Can you recommend specific parts? I'm hoping to build out a smart home lighting and sensor solution soon using local protocols. Part of it will be an IoT wifi network & VLAN without internet access but I'd like to experiment with Zigbee/Zwave as well.


Not OP, but I’ve been happy with everything made by Aeotec. My whole house is outfitted with their products on a raspberry pi + homeassistant setup. Entire process was really simple and easy, no need to “experiment”. You’ll be done in half an afternoon.

https://aeotec.com/


There should really be a database of such devices.


The FCC probably has that data, though probably not in a form that you can filter by "IoT" and it's conceivable that only an internal radio module is registered for cheap devices that didn't undergo regulatory testing.


I don't think you even have to register anything if you use a pre-approved module, so all of the IoT stuff would not show up and only the module that implements 4G will.


Every single SD card or microSD card has an ARM CPU onboard to track flash wear and to handle physical-to-logical block mapping. Everything has a CPU these days.
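
(A toy version of the mapping such a controller maintains, grossly simplified; real flash translation layers also deal with garbage collection, bad blocks, ECC, and power-loss safety:)

    /* Grossly simplified flash-translation-layer sketch: logical blocks are
     * remapped to fresh physical blocks on every write so erases get spread
     * out. Real SD controllers add garbage collection, bad-block handling,
     * ECC and power-loss protection on top of this idea. */
    #include <stdint.h>
    #include <stdio.h>

    #define N_LOGICAL  8
    #define N_PHYSICAL 12   /* spare blocks give wear leveling room to work */

    static uint8_t  l2p[N_LOGICAL];            /* logical -> physical block */
    static uint32_t erase_count[N_PHYSICAL];   /* wear tracking per block   */
    static uint8_t  in_use[N_PHYSICAL];

    static uint8_t pick_least_worn_free(void)
    {
        uint8_t best = 0;
        uint32_t best_wear = UINT32_MAX;
        for (uint8_t p = 0; p < N_PHYSICAL; p++)
            if (!in_use[p] && erase_count[p] < best_wear) {
                best = p;
                best_wear = erase_count[p];
            }
        return best;
    }

    static void write_logical_block(uint8_t lblock)
    {
        uint8_t old   = l2p[lblock];
        uint8_t fresh = pick_least_worn_free();

        in_use[old] = 0;            /* old copy becomes garbage, erased later */
        erase_count[old]++;         /* (erase folded in here for simplicity)  */
        in_use[fresh] = 1;
        l2p[lblock] = fresh;        /* host still sees the same logical block */
    }

    int main(void)
    {
        for (uint8_t l = 0; l < N_LOGICAL; l++) { l2p[l] = l; in_use[l] = 1; }
        for (int i = 0; i < 1000; i++)
            write_logical_block(0);            /* hammer one "file" repeatedly */
        printf("logical block 0 now lives in physical block %u\n", l2p[0]);
        return 0;
    }

(The host keeps addressing "logical block 0" while the controller quietly rotates which physical block backs it, which is where the CPU and RAM go.)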


What bothers me is the prevalence of counterfeit and just plain bad-quality, cheap chips.

These become so low-cost that corporate buyers pretty much have no choice but to use them.

The result is things like expensive Bluetooth headsets crapping out after a year. I have been through quite a few.

I/O ports see a lot of use (even virtual ones). Those are some of the hardest-working chips in a device. Not a good place for cheap dross, but they are also present in almost every device out there. Lots of money to be made.

To give Apple credit, I think they are fairly serious about keeping the quality of their internals up to snuff.

> cursed looking trampolines

I like that.

BTW: I do think this is an awesome write-up, and things like it are why I like this place.


> Those are some of the hardest-working chips in a device.

(Mis)quoting Randy Fromm, "the things that work the hardest fail the most."

I've always liked that quote, even though at the time he was referring to CRTs and the color output transistors (well, mostly). It generalizes to most anything electronic.


My brand new M1 MacBook Pro has only one working port. I didn't notice it until I tried to save something to a USB drive. Neither charging nor USB works on that port. Talking to Apple tomorrow. Hopefully still under warranty.


Me too. Well, one port allows charging, but no USB or phone access.


> Hopefully still under warranty

1 year iirc, but I still recommend extending it via AppleCare (and also other methods like credit card warranty extensions.)


Debugging hardware for the protocol described in the article:

https://github.com/AsahiLinux/vdmtool


One thing this article reminded me of is just how damned prevalent simple ARM-based CPUs are in the world. Piles and piles of ARM chips running Linux, or in Apple's case some kind of Darwin-derived OS. It doesn't talk about processing power, but these tiny embedded CPUs are far more powerful than a Commodore 64 or most early CP/M or DOS-based systems that many small businesses relied on.

Just crazy to think about.


So, basically, Apple copied Samsung's debug hacks to "hide" data in the power lines of the USB plug?

The "end result" of the Apple feature is exactly the same as the Samsung one described in this paper: https://ntnuopen.ntnu.no/ntnu-xmlui/bitstream/handle/11250/2...


Can someone summarize for us what this is used for? What features or functionality does it enable above and beyond what's in the common standard?


Reading this makes me wonder whether there is a case for non-open hardware and software but with a forced-open API. With modular hardware, the effort turned into various other components and companies, including HP, etc. Can the private market survive this? But at the same time, if a few firms control all of this, can we end up as slaves to those firms and countries? No good answer.


Apple's USB-C controllers essentially implement a standard API (USB) but add proprietary extensions on top.

I think the only thing to do is require open implementations and open standards.



