

Integrated Circuit design for the IoT - MootWoop
https://blog.synflow.com/integrated-circuit-design-for-the-internet-of-things-iot/

======
pjc50
Why not design your own integrated circuit? Now you too can have the thrill of
extra capital and technical risk in your project! It has the added benefits of
setting your architecture in stone and preventing any kind of pivot while
hindering future expansion.

Seriously, you need a good reason to believe you can do better than the chip
companies before designing your own ASIC as a commercial project. Low power by
itself is not a good enough reason. The interesting case is when you need
moderate computing power for a special task at better compute/watt than
general-purpose hardware. That's what provoked the wave of bitcoin miner
startups.

~~~
MootWoop
You're right: you should only consider designing an ASIC if it's clearly more
advantageous than the other options. Note that the capital and risk are not so
high if you use an older, proven process node (like 90nm), which should be
more than enough for IoT-like devices.

Low power alone may not be sufficient to require an ASIC, but as you say, it
all depends on the computing power that is required. Which, for a temperature
sensor, is close to nothing... I'll need a better example for the next
article!

~~~
pjc50
You're the original author? Sorry about the sarcasm :)

A couple of years ago I was involved with a project doing verification of a
custom 6502-based CPU design for IoT purposes. There were a lot of frustrated
discussions in the breakroom, as we couldn't really see the point of all the
customisation. Should we tell the client? In the end we didn't, and the client
went bust before paying our invoice.

Yes, the thing about temperature sensors and the like is that they're just a
wireless peripheral. Eventually one of the competing standards will win
(6lowpan?) and they can be as commoditised as bluetooth headsets. The
important thing about IoT is turning a demo gimmick into a value proposition
with satisfactory UX. Home automation has been around as a concept for _years_
and remained a niche.

There might be a market in custom hardware for "security done right" for IoT.
Never mind changing the batteries, I don't want to have to update the firmware
in my lightbulbs (or doorlocks!) every few weeks due to exploits.

(I could write a whole other post agreeing with you about how HDLs are
universally awful)

~~~
MootWoop
Yes I'm the author, and no problem :-)

If you find HDLs awful, I'd be curious to know what you think of our language,
Cx. Maybe we can continue this conversation somewhere else?

~~~
pjc50
Running short on time. I have a laundry list of Verilog replacement
suggestions on another machine somewhere. Brief observations on Cx:

Based on C. Two objections: (1) why not SystemC? (2) why C when the non-HDL
world is trying to get as far from C as possible?

My personal prejudice would be for functional rather than imperative style.
AFAIK nobody is seriously attempting this outside of academia:
[http://essay.utwente.nl/59482/](http://essay.utwente.nl/59482/)

Saw an example with "new" and "import" which are kinda Java flavoured. "new"
feels wrong for hardware: no allocation or gc!

Plus points for ngDesign. Another plus point for trying to get hardware
designers to use git rather than clearcase or other abominations. Good that
you're leaning on the verification angle.

I spent a decade in an EDA startup. It was a tough market but interesting
work. Good luck!

~~~
MootWoop
Thanks for your feedback and encouragement! I'll try to shed some light on
Cx. This comment has become a bit of a rant as a consequence; I hope you'll
forgive me :)

Cx is not so much C-based as C-like, and the difference is subtle but real.
It's a dedicated language that looks like C (and yes, a bit like Java) rather
than something built on top of C, because bending an existing language
backwards and using a small subset of it just doesn't feel right. Same reason
we didn't go with SystemC: it's your typical design-by-committee/enterprise
horror, it's verbose, and it's just a tiny, weird subset of C++ with a lot of
templates. How do you know what you're supposed to use to get something
"synthesizable"? And SystemC shares an even bigger problem with almost every
HDL: they are mainly _simulation languages_.

We made Cx C-like so that most people would find it easy to start using; the
C-like "curly" syntax is familiar to any developer (even JavaScript uses it).
Same reason we went with imperative rather than functional: it's the paradigm
most people are comfortable with (not to mention that functional programming
is not really a good metaphor for what happens in hardware; after all, a state
machine is a sequence of things that modify state). The article you linked
shows an interesting approach, but I believe it will remain a niche, kind of
like functional programming in software (possibly even more so, given the
large difference in abstraction level).

Yes the "new" may seem surprising at first, yet it has the same meaning it has
in other languages: it creates a new instance of an object, it's just that
instances are created and connected at compile time rather than at runtime. We
could have done without it, but I've found myself confused about what goes
first in VHDL and Verilog enough times that I felt it was needed :-)
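
A loose analogy in plain C, if it helps (this is not Cx syntax, just an
illustration of the runtime vs. compile-time distinction):

    #include <stdio.h>
    #include <stdlib.h>

    /* A stand-in for a hardware block; again, this is C, not Cx. */
    typedef struct { int depth; } Fifo;

    /* Software-style "new": the instance appears at runtime. */
    static Fifo *fifo_create(int depth) {
        Fifo *f = malloc(sizeof *f);
        f->depth = depth;
        return f;
    }

    /* What Cx's "new" means, by analogy: the instance already exists
     * before anything runs; it is fixed at compile/elaboration time. */
    static Fifo wired_fifo = { 16 };

    int main(void) {
        Fifo *f = fifo_create(16);
        printf("runtime: %d, compile time: %d\n", f->depth, wired_fifo.depth);
        free(f);
        return 0;
    }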

I think I will write a blog post on this, and link it on Hacker News one of
these days! And if you want to have a chat, just send me an email or a PM on
our forum.

------
kabouseng
Developing an ultra-low-power design is not a trivial task. This is where you
discover that the manufacturer's datasheet somehow achieves figures you, for
the life of you, cannot reproduce, even with their reference designs and
evaluation kits.

You can spend an entire month just adjusting the state of the various pins on
your device to shave off µAs, and just when you have hit your power
consumption target, you realise your product now doesn't always boot up, or
suffers from latch-up under certain circumstances / temperatures.

Also, this article does not take into consideration that the little CR2032 has
internal leakage, that your circuit has leakage currents, and that
transmitting suddenly pulls a lot more current out of the battery, so it won't
deliver the full 230mAh (but he did say it is hypothetical). Getting even 2
years of operation out of any of the CRxxxx range of coin cells is already a
feat.
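
To make that concrete, a back-of-the-envelope lifetime estimate (every number
here is an illustrative guess, not from any datasheet):

    #include <stdio.h>

    int main(void) {
        /* Illustrative figures only -- not from any datasheet. */
        double nominal_mAh    = 230.0; /* CR2032 headline capacity          */
        double derating       = 0.70;  /* pulse loads, leakage, temperature */
        double avg_current_uA = 10.0;  /* sleep + wakeups + self-discharge  */

        double usable_mAh = nominal_mAh * derating;
        double hours = usable_mAh * 1000.0 / avg_current_uA; /* uAh / uA */
        printf("usable: %.0f mAh, lifetime: %.0f h (~%.1f years)\n",
               usable_mAh, hours, hours / (24.0 * 365.0));
        return 0;
    }

Even with a fairly optimistic 10µA average, derating eats you down to under
two years.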

-edit typo

~~~
joezydeco
The Jack Ganssle article linked in his post (see below) does an excellent job
of covering the issues of CR2032 battery performance, parasitic power loss,
etc.

------
leppie
Author misinterpreted specs...

"Each packet has 20 bytes of useful payload and consumes 49 μA at 3 V"

That is a manufacturer average for some profile, mostly sleeping 99% of the
time.

The real consumption during broadcast is on the order of 15-20mA for most
chips I have seen.
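
Averaged over a profile like that, a 15-20mA burst collapses into a figure
like the quoted 49 μA. Back-of-the-envelope (all values illustrative):

    #include <stdio.h>

    int main(void) {
        /* Illustrative profile -- not from any datasheet. */
        double tx_mA    = 18.0;  /* current during the radio burst      */
        double tx_ms    = 2.5;   /* duration of one advertising event   */
        double sleep_uA = 1.5;   /* deep-sleep current                  */
        double period_s = 1.0;   /* one broadcast per second            */

        double tx_s   = tx_ms / 1000.0;
        double avg_uA = (tx_mA * 1000.0 * tx_s + sleep_uA * (period_s - tx_s))
                        / period_s;
        printf("average current: %.1f uA\n", avg_uA); /* ~46 uA */
        return 0;
    }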

Here is a very informative article dealing with power demands of low power
devices: www.ganssle.com/reports/ultra-low-power-design.html

~~~
Gurkenmaster
Considering the 1 minute interval, wouldn't it be better for the battery to
use a clock implemented in hardware that boots the chip every time it's
needed?
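
I mean something like this pattern (the HAL function names below are invented
for illustration, and stubbed out so it compiles):

    #include <stdio.h>
    #include <stdint.h>

    /* Stubbed-out HAL -- function names invented for illustration. */
    static void rtc_set_alarm_seconds(uint32_t s) { (void)s; }
    static void mcu_deep_sleep(void) { /* core off, RTC keeps counting */ }
    static uint16_t sensor_read_temp(void) { return 231; /* 23.1 C */ }
    static void radio_send(uint16_t v) { printf("sent %u\n", (unsigned)v); }

    int main(void) {
        for (int i = 0; i < 3; i++) {    /* forever, on real hardware */
            rtc_set_alarm_seconds(60);   /* wake again in one minute  */
            mcu_deep_sleep();            /* only the RTC draws power  */
            radio_send(sensor_read_temp());
        }
        return 0;
    }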

~~~
MootWoop
I'm not an expert on power optimization, and I don't know how this is done in
microcontrollers. But I would think the hardware clock would be implemented
with integrated analog components (a capacitor + resistor circuit) rather than
digital logic, to avoid repeatedly switching even just a few transistors on
and off.

~~~
BostonEnginerd
The power needed to keep an analog comparator going will probably be higher
than that needed to keep the 32kHz clock running.

------
quarterwave
The impedance of free space is (unfortunately) a few hundred ohms. On top of
this, the minimum signal voltage of analog integrated circuits is set by
unsystematic (random) offsets, which get worse as transistor sizes shrink.
It's possible to mitigate these offsets by circuit techniques, but they cannot
be eliminated.

Near-field radio can break free of the impedance constraint (which applies
only to far-field TEM waves), but antenna area sets signal level, as in
flux=intensity*area. Why make a tiny chip when the antenna needs to be the
size of a quarter?

It's not easy to design radio chips for either of these scenarios. Pushing the
radio burden onto the DSP consumes power on the digital side, so no easy way
out.

[Aside: Referring to a recent thread on the measurement of the Planck
constant, the impedance of free space divided by twice the von Klitzing (Hall)
resistance turns out to be the dimensionless fine structure constant, alpha.
This alpha sets the coupling strength of the electron and photon in the
quantum theory of electrodynamics, and the perturbation series for the
self-energy of an electron can be truncated usefully because alpha is much
less than unity. Feynman diagrams are a way to keep track of terms in that
series, to ensure that the Schrodinger equation is kept consistent with
special relativity, etc.]
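
A quick numerical check of that aside, with Z0 = mu0*c and alpha = Z0/(2*R_K):

    #include <stdio.h>

    int main(void) {
        /* Standard SI constants. */
        double pi  = 3.14159265358979;
        double mu0 = 4e-7 * pi;      /* vacuum permeability, H/m     */
        double c   = 299792458.0;    /* speed of light, m/s          */
        double R_K = 25812.807;      /* von Klitzing constant, ohms  */

        double Z0    = mu0 * c;          /* impedance of free space  */
        double alpha = Z0 / (2.0 * R_K); /* fine structure constant  */
        printf("Z0 = %.2f ohm, alpha = %.6f = 1/%.1f\n",
               Z0, alpha, 1.0 / alpha);  /* 376.73 ohm, 1/137.0 */
        return 0;
    }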

------
poseid
Hm.. as hardware becomes a commodity, I don't think custom designs can beat
the commodity price point. What would be helpful is more web-based,
interactive tools to make hardware more accessible. For example, simple power
calculators that compare the power consumption of an Arduino with dedicated
low-power devices, or show how a 5V system compares to a 3V or 1.2V one.
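
Even the textbook dynamic-power rule, P ≈ C·V²·f, would be a decent starting
point for such a calculator. A sketch (the capacitance and frequency are
made-up example values):

    #include <stdio.h>

    int main(void) {
        /* Made-up example values for switched capacitance and clock. */
        double cap_F = 100e-12;  /* 100 pF effective switched capacitance */
        double f_Hz  = 16e6;     /* 16 MHz clock                          */
        double volts[] = { 5.0, 3.3, 1.2 };

        for (int i = 0; i < 3; i++) {
            double v    = volts[i];
            double p_mW = cap_F * v * v * f_Hz * 1000.0; /* P = C*V^2*f */
            printf("%.1f V: %5.1f mW (%3.0f%% of the 5 V case)\n",
                   v, p_mW, 100.0 * v * v / 25.0);
        }
        return 0;
    }

Because power scales with V², dropping from 5V to 1.2V alone cuts dynamic
power to about 6% of the original.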

~~~
timthorn
The economics all depends on volumes, of course.

~~~
poseid
But what would be really interesting would be to see open-source hardware
development, similar to the Linux kernel.

~~~
pjc50
There are serious obstacles to this. The "freedom to redistribute copies" and
"freedom to modify" aspects of Free software really don't translate well to
hardware. There is unavoidable work and cost associated with copying a board,
and much more with copying an ASIC (modified or otherwise). It's simply not
going to be within the reach of the average user in the foreseeable future.

What you _might_ get is crowdfunded hardware, but that requires realistic
demand from the target market.

See this discussion: [http://electronics.stackexchange.com/questions/61873/why-is-open-hardware-so-rare/61875](http://electronics.stackexchange.com/questions/61873/why-is-open-hardware-so-rare/61875)

~~~
poseid
good points. maybe it is more about datasheets, examples, demo's, kits, ...
that suits to open sharing. Not for nothing TI is becoming a major open-source
contributor. Same for Atmel via Arduino.

------
pjmlp
Quite right.

Maybe the best approach is to use a modern IoT board to prototype the design
and then port it to a PIC microcontroller/ASIC for the production design.

