Why not design your own integrated circuit? Now you too can have the thrill of extra capital and technical risk in your project! It has the added benefits of setting your architecture in stone and preventing any kind of pivot while hindering future expansion.
Seriously, you need a good reason to believe you can do better than the chip companies before designing your own ASIC as a commercial project. Low power is not the right reason to do this. The interesting case is when you need moderate computing power for a special task at better compute/watt than general-purpose hardware. That's what provoked the wave of bitcoin miner startups.
You're right, you should only consider designing an ASIC if it's guaranteed to be more interesting than the other options. Note that the capital and risk are not so high if you use an older, proven technology (like 90 nm), which should be more than enough for IoT-like devices.
Low power alone may not be sufficient to justify an ASIC, but as you say it all depends on the computing power that is required. Which for a temperature sensor is close to nothing... I'll need a better example for the next article!
You're the original author? Sorry about the sarcasm :)
A couple of years ago I was involved with a project doing verification of a custom CPU design for IoT purposes based on the 6502. There were a lot of frustrated discussions in the breakroom because we couldn't really see the point of making this technology custom at all. Should we tell the client? In the end we didn't, and the client went bust before paying our invoice.
Yes, the thing about temperature sensors and the like is that they're just a wireless peripheral. Eventually one of the competing standards will win (6lowpan?) and they can be as commoditised as bluetooth headsets. The important thing about IoT is turning a demo gimmick into a value proposition with satisfactory UX. Home automation has been around as a concept for years and remained a niche.
There might be a market in custom hardware for "security done right" for IoT. Never mind changing the batteries, I don't want to have to update the firmware in my lightbulbs (or doorlocks!) every few weeks due to exploits.
(I could write a whole other post agreeing with you about how HDLs are universally awful)
Updating firmware wirelessly is also a possibility (not for ASICs, of course), and BT low energy can make it feasible if you do something like a mesh network. Of course, that means if one device is susceptible, others on the network may be too.
I don't really have a solution, but if you are interested in playing around with that kind of concept, here's something you can probably start with. [1]
The battery lasts about a year or more, supposedly, with a typical watch-kind battery. It isn't too hard to update and it has a few sensors on it already.
Running short on time. I have a laundry list of Verilog replacement suggestions on another machine somewhere. Brief observations on Cx:
Based on C. Two objections: (1) why not SystemC? (2) why C when the non-HDL world is trying to get as far from C as possible?
My personal prejudice would be for functional rather than imperative style. AFAIK nobody is seriously attempting this outside of academia: http://essay.utwente.nl/59482/
Saw an example with "new" and "import" which are kinda Java flavoured. "new" feels wrong for hardware: no allocation or gc!
Plus points for ngDesign. Another plus point for trying to get hardware designers to use git rather than clearcase or other abominations. Good that you're leaning on the verification angle.
I spent a decade in an EDA startup. It was a tough market but interesting work. Good luck!
Thanks for your feedback and encouragement! I'll try to shine some light on Cx. This comment has become a bit of a rant as a consequence, I hope you'll forgive me :)
Cx is not so much C-based as C-like, and the difference is subtle but real. It's a dedicated language that looks like C (and yes, a bit like Java) rather than something based on C, because bending an existing language backwards and using a small subset of it just doesn't feel right. Same reason why not SystemC: SystemC is your typical design-by-committee/enterprise horror, it's verbose, and it's just a tiny, weird subset of C++ with a lot of templates. How do you know what you're supposed to use to get something "synthesizable"? Another big problem with SystemC, as with almost every HDL, is that it is mainly a simulation language.
We created something C-like to have something that most people would find easy to start using; the C-like "curly" syntax is familiar to any developer (even JavaScript uses that). Same reason to go with imperative rather than functional: it's the paradigm that most people are comfortable using (not to mention that functional programming is not really a good metaphor for what happens in hardware - after all, a state machine is a sequence of things that modify state). The article you linked shows an interesting approach, but I believe it will remain a niche, kind of like functional programming in software (possibly even more so given the large difference in abstraction level).
Yes the "new" may seem surprising at first, yet it has the same meaning it has in other languages: it creates a new instance of an object, it's just that instances are created and connected at compile time rather than at runtime. We could have done without it, but I've found myself confused about what goes first in VHDL and Verilog enough times that I felt it was needed :-)
I think I will write a blog post on this, and link it on Hacker News one of these days! And if you want to have a chat, just send me an email or a PM on our forum.
I think a nice approach to describing HW + visual systems is some kind of patterns, similar to what is done in this project to describe waveforms: http://wavedrom.com/
Developing an ultra-low-power design is not a trivial task. It is here that you discover the manufacturer's datasheet somehow achieves figures you, for the life of you, cannot reproduce, even with their reference designs and evaluation kits.
You can spend an entire month just adjusting the state of the various pins on your device to shave off microamps, and just when you have hit your power consumption target, you realise your product now doesn't always boot up or suffers from latch-up in some circumstances/temperatures.
Also, this article does not take into account that the little CR2032 has internal leakage, your circuit has leakage currents, and when transmitting you are suddenly pulling a lot more current out of the battery, so it won't deliver that full 230 mAh (but he did say it is hypothetical). Getting even two years of operation out of any of the CR range of batteries is already a feat.
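To put rough numbers on that, here's a back-of-the-envelope sketch in C. Every figure below is an illustrative assumption, not a measurement: derate the nominal capacity, add a duty-cycled transmit burst on top of the sleep current, and see what's left of the headline battery life.

    /* Rough battery-life estimate for a coin-cell sensor node.
     * All numbers are illustrative assumptions, not measured values. */
    #include <stdio.h>

    int main(void)
    {
        const double nominal_mah = 230.0;  /* CR2032 nominal capacity */
        const double derating    = 0.70;   /* usable fraction after leakage,
                                              pulse loading, temperature */
        const double sleep_ua    = 2.0;    /* deep-sleep current, uA */
        const double tx_ma       = 15.0;   /* radio burst current, mA */
        const double tx_ms       = 5.0;    /* burst duration, ms */
        const double period_s    = 60.0;   /* one measurement per minute */

        /* Average current in uA: sleep plus the duty-cycled transmit burst */
        double tx_duty = (tx_ms / 1000.0) / period_s;
        double avg_ua  = sleep_ua + tx_ma * 1000.0 * tx_duty;

        double usable_uah = nominal_mah * derating * 1000.0;
        double hours      = usable_uah / avg_ua;

        printf("average current: %.2f uA\n", avg_ua);
        printf("estimated life : %.1f years\n", hours / (24.0 * 365.0));
        return 0;
    }

With those assumed figures the average is a few microamps and the life comes out well short of what the raw 230 mAh would suggest; bump the sleep current or the transmit duty cycle a little and it shrinks fast.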
The Jack Ganssle article linked in his post (see below) does an excellent job of covering CR2032 battery performance, parasitic power loss, and so on.
Thanks for the link, indeed it looks quite technical and very complete on the subject. Truth be told I'm far from an expert on power, hence the mistake :-)
Considering the 1-minute interval, wouldn't it be better for the battery to use a clock implemented in hardware that boots the chip every time it's needed?
That is typically what you do, microcontrollers have timers and low-power mode(s) which keep the timers running even though the processor core isn't executing any instructions.
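For what it's worth, here's a rough sketch of that pattern assuming an ATmega328P-class AVR: the watchdog timer runs off its own low-power oscillator, wakes the core roughly every 8 seconds, and the firmware counts ticks up to about a minute. The register settings follow that part's datasheet, and read_and_transmit() is only a placeholder:

    /* Sketch: wake roughly once a minute using the watchdog timer as a
     * low-power wake source, assuming an ATmega328P-class AVR. */
    #include <avr/io.h>
    #include <avr/interrupt.h>
    #include <avr/sleep.h>

    static volatile uint8_t wdt_ticks;

    ISR(WDT_vect)
    {
        wdt_ticks++;                  /* one tick roughly every 8 s */
    }

    static void wdt_interrupt_8s(void)
    {
        MCUSR &= ~(1 << WDRF);                    /* clear watchdog reset flag */
        WDTCSR = (1 << WDCE) | (1 << WDE);        /* timed change sequence */
        WDTCSR = (1 << WDIE) | (1 << WDP3) | (1 << WDP0); /* IRQ only, ~8 s */
    }

    static void read_and_transmit(void)
    {
        /* placeholder: sample the sensor, fire the radio, etc. */
    }

    int main(void)
    {
        wdt_interrupt_8s();
        set_sleep_mode(SLEEP_MODE_PWR_DOWN);
        sei();

        for (;;) {
            sleep_enable();
            sleep_cpu();              /* core stops until the WDT fires */
            sleep_disable();

            if (wdt_ticks >= 8) {     /* 8 ticks x ~8 s ≈ one minute */
                wdt_ticks = 0;
                read_and_transmit();
            }
        }
    }

The watchdog oscillator is not very accurate; for tighter timing you'd clock an async timer from a 32 kHz crystal instead, but the structure is the same.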
I'm not an expert on power optimization, and I don't know how this is done in micro-controllers. But I would think that the hardware clock would be implemented using integrated analog components (capacitor + resistor circuit) rather than digital logic, to avoid repeatedly switching on and off even just a few transistors.
Apparently an external xtal is marginally best (http://www.microchip.com/forums/m341592.aspx), which makes intuitive sense, given that it's exactly the same scenario as a digital watch. 32 kHz crystal; driver; counter; comparator. A few hundred tiny transistors. Consumption probably less than the battery's self-discharge.
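That last claim is easy to sanity-check with assumed numbers: a CR2032's self-discharge of roughly 1% per year works out to a few hundred nanoamps of equivalent average draw, which is the same order of magnitude as what a dedicated 32 kHz oscillator plus counter typically consumes. A quick calculation, with all figures being assumptions rather than datasheet values:

    /* Sanity check: compare a CR2032's self-discharge, expressed as an
     * equivalent average current, with an assumed 32 kHz RTC oscillator draw.
     * All figures are illustrative assumptions. */
    #include <stdio.h>

    int main(void)
    {
        const double capacity_mah      = 230.0;
        const double self_discharge_py = 0.01;   /* ~1% of capacity per year */
        const double rtc_draw_ua       = 0.25;   /* assumed 32 kHz osc + counter */

        const double hours_per_year = 24.0 * 365.0;
        double self_discharge_ua =
            capacity_mah * self_discharge_py * 1000.0 / hours_per_year;

        printf("self-discharge ~ %.2f uA equivalent\n", self_discharge_ua);
        printf("RTC draw       ~ %.2f uA (assumed)\n", rtc_draw_ua);
        return 0;
    }

Both come out in the 0.2-0.5 uA ballpark, so the timekeeping hardware really is down in the noise compared with everything else in the power budget.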
The impedance of free space is (unfortunately) a few hundred ohms. On top of this, the minimum signal voltage of analog integrated circuits is set by unsystematic (random) offsets, which get worse as transistor sizes shrink. It's possible to mitigate these offsets by circuit techniques, but they cannot be eliminated.
Near-field radio can break free of the impedance constraint (which applies only to far-field TEM waves), but antenna area sets signal level, as in flux=intensity*area. Why make a tiny chip when the antenna needs to be the size of a quarter?
It's not easy to design radio chips for either of these scenarios. Pushing the radio burden onto the DSP consumes power on the digital side, so no easy way out.
[Aside: Referring to a recent thread on measurement of the Planck constant, the ratio of the impedance of free space to the Hall resistance turns out to be the dimensionless fine structure constant, alpha. This alpha sets the coupling strength of an electron and photon in the quantum theory of electrodynamics, and the 'Taylor series' for the self-energy of an electron converges because alpha is much less than unity. Feynman diagrams are a way to keep track of terms in that Taylor series, to ensure that the Schrodinger equation is kept consistent with special relativity, etc.]
Hm... as hardware becomes a commodity, I don't think custom designs can beat the commodity price point. What would be helpful is more web-based/interactive tools to make hardware more accessible. For example, simple power calculators that would compare the power consumption of an Arduino with dedicated low-power devices, or show how a 5 V system compares to a 3 V or 1.2 V one.
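A toy version of such a calculator is easy to sketch. Dynamic CMOS switching power goes roughly as C*V^2*f, so the supply voltage alone already explains a lot of the gap; the capacitance and frequency below are made-up illustrative values:

    /* Toy comparison of dynamic switching power P ~ C_eff * V^2 * f
     * at different supply voltages. Numbers are illustrative only. */
    #include <stdio.h>

    static double dynamic_power_mw(double c_eff_nf, double vdd, double f_mhz)
    {
        /* C in nF and f in MHz, so C*V^2*f comes out in milliwatts */
        return c_eff_nf * vdd * vdd * f_mhz;
    }

    int main(void)
    {
        const double c_eff_nf = 1.0;   /* assumed effective switched capacitance */
        const double f_mhz    = 8.0;   /* assumed clock frequency */
        const double volts[]  = { 5.0, 3.3, 1.2 };

        for (unsigned i = 0; i < sizeof volts / sizeof volts[0]; i++)
            printf("Vdd = %.1f V -> ~%.1f mW dynamic\n",
                   volts[i], dynamic_power_mw(c_eff_nf, volts[i], f_mhz));
        return 0;
    }

Going from 5 V to 1.2 V cuts the dynamic term by roughly 17x before you even touch the process, the clock, or the sleep modes, which is why the supply rail is usually the first knob to turn.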
There are serious obstacles to this. The "freedom to redistribute copies" and "freedom to modify" aspects of Free software really don't translate well to hardware. There is unavoidable work and cost associated with copying a board, and much more associated with copying an ASIC (modified or otherwise). It's simply not going to be within the reach of the average user in the foreseeable future.
What you might get is crowdfunded hardware, but that requires reasonable demands from the target market.
Good points. Maybe it is more about datasheets, examples, demos, kits, ... that suit open sharing. It's not for nothing that TI is becoming a major open-source contributor. Same for Atmel via Arduino.
Exactly, and there are some open-source hardware designs out there (on sites like OpenCores), but the problem is that with existing hardware description languages very few people are able to reuse and contribute to them. This is why we've created the Cx language: to make hardware design easier for developers, not just hardware designers.
We've open-sourced the compiler, and we've designed some open-source hardware in Cx; see for example our Ethernet MAC: https://github.com/synflow/ethernet-mac
Interesting how you generate blocks from the language: http://cx-lang.org/documentation/structure - I guess visual feedback would make the language attractive to non-HW designers. I gave a small talk once on sharing HW projects in the browser: https://speakerdeck.com/mulderp/sharing-hardware-with-javasc... - of course this is a different level, but still, having web/SVG kinds of HW representations would be interesting. I guess that for HW, good representations of schematics/waveforms are at least as important as the actual HDL, but I might be wrong.
Just to clarify, the diagrams on that page are actually made with Synplify from the generated HDL. As a matter of fact, I agree with you, and we do think that visualization is important. In our tool we have two views: if you're in a task, we show its state machine (kind of required since the input is structured code); and if you're in a network, we show a block diagram. This is how I made the pipeline in another post there: https://blog.synflow.com/hardware-acceleration/
No waveform yet because of limited manpower, but this is on the TODO list :-)