
yes. bsder's comment should only be taken to represent his own reality.

As an embedded firmware developer for 18 years, I've seen that every project has its own constraints, just like most engineering. Design constraints on microcontrollers include: power, clock speed, FPU, number of digital GPIOs, number of analog inputs, quality of IDE and debugger, price, package size, available operating systems, longevity of the part, volatile and non-volatile memory, quality of compilers, peripherals and so on.

IME, Atmel had a great 8-bit series with good documentation and few bugs that scaled from 8 pins to 40 very well. With analog inputs and PWM outputs, they were well-equipped for analog input -> processing -> output tasks. They really blew it when they moved to 32-bit, as others have said: the parts weren't very fast, they changed all the peripherals around, and they didn't improve on the clumsy fuse system. ARM Cortex-M cores (the "M" is for microcontroller) like the STM32F4, Freescale Kinetis, etc. have all but obliterated Atmel.

Finally, differentiation within the Cortex-M family includes Ethernet (F27/F29), a flexible static memory controller (FSMC), FPU, low power (STM32L0), extreme low power (MSP432), low price (STM32F0) and so on.



> yes. bsder's comment should only be taken to represent his own reality.

No argument.

> Design constraints on microcontrollers include: power, clock speed, FPU, number of digital GPIO, number of analog GPIO, quality of IDE and debugger, price, package size, available operating systems, longevity of part, volatile and non-volatile memory, quality of compilers, peripherals and so on.

True. But most of the embedded work I have been dealing with over the last 5? years seems very different from what came before. In the last 5 years, the choice of microcontroller moved from contentious to almost an afterthought: 8-bit, 16-bit, 32-bit? Motorola/Atmel/Microchip/ST/Renesas? gcc vs. proprietary? Sufficient frequency? GPIB/RS-232/USB? All gone.

I get asked about memory and footprint (BGA/CSP is starting to become popular ... bangs head on wall) and that's about it. Even cost just doesn't come up much anymore--I don't know if it's that things are cheap enough or that everybody now actually knows what microcontrollers cost.

Pretty much things seem to be splitting into two bands: internal memory only (M0, M3/M4 class--generally an RTOS) and external memory (A-class+, runs Linux). And, given some of the higher end M-series, I suspect Linux is going to hit there shortly.


Which problem spaces are you working in, btw? I've been doing medical, microwave radio, infrared LEDs, bike light LEDs, musical instruments, architecture and some other small stuff.


Lately: monitoring and sensing across a vast spectrum of industries.

Lots of people want data about business processes, and throwing some coin-cell-sized boards with BLE and sensors into a product as it goes from start to end customer can be enlightening ... sometimes too much so for some businesses :). The real trick is having the infrastructure and team to receive and analyze the data (which we do now that we've done it enough).

Industrial automation is in the mix. A lot of that has been moving from combinations of IEEE-488, RS-232, USB or other non-differential links to CAN and Ethernet. Customers are often amazed how many fewer problems they have when they use a connection that is actually electrically differential.
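A toy sketch of why the differential links win (all voltages here are made-up, illustrative numbers, not measurements from any real installation): a single-ended receiver compares the wire against its own local ground, so a ground offset between two machines shifts the reading; a differential receiver compares the two wires against each other, so a common-mode shift cancels out.

```python
# Illustrative only: hypothetical threshold and offset values.
V_THRESHOLD = 1.5   # single-ended logic threshold (V), assumed for the sketch

def single_ended_read(tx_level, ground_offset):
    """Receiver sees the driven level shifted by the ground difference."""
    return (tx_level + ground_offset) > V_THRESHOLD

def differential_read(d_plus, d_minus, ground_offset):
    """Both wires shift together, so the offset subtracts out."""
    return (d_plus + ground_offset) - (d_minus + ground_offset) > 0

# Transmitter sends a logic 0 (0 V single-ended; D+ below D- differentially)
offset = 3.0  # a 3 V spike between machine grounds, plausible on a factory floor

print(single_ended_read(0.0, offset))        # True  -> the 0 is misread as a 1
print(differential_read(0.0, 2.5, offset))   # False -> still read correctly
```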

It's gotten to the point that when I hear "reliability issues" I look for USB and just wipe it out. Generally there will be a network of PCs connected via USB to "random industrial machine X" or, worse, PC -> USB adapter -> RS-232/IEEE-488/etc. (To be fair, the old systems with actual RS-232 ports or IEEE-488 cards generally worked fine, as they had more than enough signal margin and shielding to deal with industrial electrical spikes--it's the USB interface that disconnects and causes Windows to crap its pants.)

Replacing a bunch of those with something that has a CAN interface generally fixes industrial problems. I finally sat down and wrote my own CANopen stack because I was using it so frequently.
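To give a flavor of what a CANopen stack deals with, here is a minimal sketch of two basic frame builders. The layouts (NMT on COB-ID 0x000, expedited SDO download on 0x600 + node ID) follow the CiA 301 spec; the node ID and object index in the usage line are made-up examples, not anything from the commenter's actual stack.

```python
import struct

def nmt_start(node_id: int) -> tuple[int, bytes]:
    """NMT command frame: COB-ID 0x000, byte 0 = command specifier
    (0x01 = start remote node), byte 1 = target node (0 = all nodes)."""
    return 0x000, bytes([0x01, node_id])

def sdo_expedited_download(node_id: int, index: int, subindex: int,
                           value: int, size: int) -> tuple[int, bytes]:
    """Expedited SDO download (client -> server) on COB-ID 0x600 + node.
    The command specifier encodes how many of the 4 data bytes are unused:
    0x2F/0x2B/0x27/0x23 for 1/2/3/4 data bytes."""
    cs = 0x23 | ((4 - size) << 2)
    payload = struct.pack("<BHB", cs, index, subindex)   # little-endian index
    payload += value.to_bytes(size, "little") + bytes(4 - size)
    return 0x600 + node_id, payload

# Example: write a 1000 ms heartbeat time (object 0x1017:00) to node 5
cob, data = sdo_expedited_download(0x05, 0x1017, 0x00, 1000, 2)
print(hex(cob), data.hex())   # -> 0x605 2b171000e8030000
```

A real stack adds the response state machines, segmented transfers, PDO mapping, heartbeat monitoring and so on--this is just the wire format for two message types.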

Medical and bio come up occasionally, but they haven't been big movers for me lately. The people who cut checks and the people who actually know what their problems are sit too far apart in the hierarchy right now. My big problem in bio is that I could do quite a bit using full-custom VLSI to get ultra-low-power functionality, but nobody wants to pay for that. So, I'll continue to use off-the-shelf microcontrollers and FPGAs and use way too much power.

Most of my grief these days isn't electrical; it's mechanical. I use 3D printers and silicone molds a lot. Very messy and annoying to work with, but it saves us having to cut molds for injection molding.

I can 3D print a lot of parts for the cost of even a single injection mold. And my margins aren't so tight that I need the cost savings. And I can change the part if something is wrong. And I don't have to scream at the mold maker when (not if) he screws my mold up.
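The back-of-envelope version of that tradeoff, with entirely made-up numbers (the comment gives no real costs): the mold only pays for itself after enough parts to amortize its up-front cost against the cheaper per-part price.

```python
# Hypothetical costs for illustration -- not from the comment above.
mold_cost = 12_000.0     # one injection mold, up front
molded_unit = 1.50       # per-part cost once the mold exists
printed_unit = 8.00      # per-part cost on a 3D printer

# Break-even volume: mold_cost / (printed_unit - molded_unit)
break_even = mold_cost / (printed_unit - molded_unit)
print(round(break_even))  # ~1846 parts before the mold wins
```

Below that volume (and with the ability to revise the part at will), printing is the obvious choice, which is the point being made.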


Isn't USB data lines differential?


Yes, but the detection that a cable has been plugged in isn't. Whoops.

To be fair, this isn't really an issue unless your device grounds are likely to have more than a couple of volts difference due to spikes. And that happens very rarely in the home or office.

In industrial settings, however, that's not true.


Apart from this, there are other issues to consider when you have to go through the different certifications required for different domains. When you put MCUs in automotive, industrial, medical and military (invasive instruments) applications, it is better to go with safety-critical MCUs with built-in core protection to pass the various certifications. Thermal considerations are also a big factor when you go through the different qualification tests.



