
Michigan Micro Mote – World's Smallest Computer - saticmotion
http://www.eecs.umich.edu/eecs/about/articles/2015/Worlds-Smallest-Computer-Michigan-Micro-Mote.html
======
RobertoG
Very cool.

The pessimist (realist?) in me can't avoid remembering Vernor Vinge's "A
Deepness in the Sky" novel, where the perfect surveillance device is a network
of "dust computers".

~~~
vidarh
That may be so, but imagine when they get cheap enough that e.g. NSA
installations must be kept at ridiculous cleanroom conditions because every
speck of dust may be conducting counter-surveillance, or automatically dump
footage onto Youtube.

~~~
RobertoG
Actually, you can see that scene in another brilliant SF novel: David
Marusek's "Counting Heads" (even better than "A Deepness in the Sky", in my
opinion).

~~~
ceequof
Ehhh. Marusek makes several rather large conceptual leaps in order to make his
plot hang together: notably, both human clones and intelligent machines have
no civil rights, at all, to the point that either can be killed at the will of
their owners.

Most ludicrously, there's no hint of political opposition or a protest
movement. At no point does any character say anything like "Hey, wait, maybe
clones are humans too?" or "Maybe sentient machines should have rights?"

~~~
lione
If there's anything Sci-Fi has taught me, it's that sentient machines will
either be our salvation or our doom, and in both of those cases, treating them
like they have no rights isn't good for our health in the long run, so doing
so is stupid.

As for clones, considering how often our bodies replace all our cells, you
aren't remotely close to the same person you were even a year ago, which
proves that it's our minds and experiences/memories that make us who we are.
With that in mind, and knowing many other people would agree, the idea that
everyone would be ok with a clone slave force is absurd. Maybe if they were
brainless chunks of lobotomized flesh incapable of learning and totally empty
of sentient thought, but otherwise?

~~~
ethbro
I'm reading through Iain Banks's Culture series currently, and enjoy the way he
treats it.

In a post-scarcity culture, where energy and information are more or less the
only resources, what argument is there against agreeing to give sufficiently
advanced AIs rights?

------
unwind
This is really cool.

The article is pretty light on the technical details about the actual
processor architecture.

I think their choice of naming is unfortunate, confusion with ARM's Cortex-M3
is quite likely. Or are you supposed to pronounce this "M to the third" or
something?

 _Update_ : D'oh, of course it's "M-cubed". I knew that, I blame not being a
native speaker for not thinking about it. :)

~~~
xgbi
Processor description is here:
[http://www.ee.columbia.edu/~mgseok/pdfs/phoenix_isscc_dac_de...](http://www.ee.columbia.edu/~mgseok/pdfs/phoenix_isscc_dac_design_contest.pdf)

I'm not sure it's an ARM...

~~~
unwind
Thanks! No, it's definitely not ARM, that was my point.

It seems to consist of "8-bit CPU, a 52x40-bit DMEM, a 64x10-bit IMEM, a
64x10-bit IROM". Those are some _seriously_ small memory sizes, a total of 64
instructions (the IROM holds the code) really isn't a lot to play with.
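For a sense of scale, here's a back-of-envelope tally of those figures, assuming each "AxB-bit" spec means A words of B bits each (my reading of the paper's notation):

```python
# Back-of-envelope memory totals for the Phoenix core, assuming each
# "AxB-bit" figure means A words of B bits each.
memories = {
    "DMEM": (52, 40),  # data memory
    "IMEM": (64, 10),  # instruction memory
    "IROM": (64, 10),  # instruction ROM
}

for name, (words, bits) in memories.items():
    total_bits = words * bits
    print(f"{name}: {words} x {bits}-bit = {total_bits} bits "
          f"({total_bits / 8:.0f} bytes)")

total_bits = sum(w * b for w, b in memories.values())
print(f"Total: {total_bits} bits ({total_bits / 8:.0f} bytes)")
# Total: 3360 bits (420 bytes)
```

So the whole machine has roughly 420 bytes of storage, of which only 80 bytes are writable instruction memory. That's cramped even by 8-bit micro standards.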

~~~
ansible
Yes.

Though if the silicon process technology is as good as they claim, then
scaling that up a little bit (to be on-par with conventional 8-bit micros)
won't be too bad. It will be interesting to see if they can maintain the kind
of leakage they're talking about with a smaller process size.

------
krapht
Smaller, lower power computers are great. But the problem is, and remains, the
power consumption of the RF channel. It's all well and good that your
processor consumes microamps, but one second of WiFi traffic at a reasonable
transmit power takes hundreds of milliamps. A smart sensing system's power
consumption is dominated by the energy cost of wireless communication.
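A quick worked example makes the point. The figures below are illustrative assumptions (a 10 µA average MCU draw, 200 mA WiFi transmit current, one second of radio time per hour, 3 V supply), not measurements from the article:

```python
# Illustrative duty-cycled sensor node energy budget at 3 V.
# All current figures are assumed round numbers, not measured values.
VOLTAGE = 3.0                # volts
MCU_CURRENT_A = 10e-6        # 10 uA average for processor + sensing
WIFI_TX_CURRENT_A = 200e-3   # 200 mA while the WiFi radio transmits
TX_SECONDS_PER_HOUR = 1.0    # one second of radio time per hour

mcu_energy_j = VOLTAGE * MCU_CURRENT_A * 3600                # whole hour
radio_energy_j = VOLTAGE * WIFI_TX_CURRENT_A * TX_SECONDS_PER_HOUR

print(f"MCU energy per hour:   {mcu_energy_j * 1000:.0f} mJ")
print(f"Radio energy per hour: {radio_energy_j * 1000:.0f} mJ")
print(f"Radio / MCU ratio:     {radio_energy_j / mcu_energy_j:.1f}x")
```

With these numbers, one second of radio per hour costs about 600 mJ against roughly 108 mJ for an entire hour of MCU operation, so the radio dominates by a factor of five even at a 1/3600 duty cycle.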

~~~
ethbro
_> It's all well and good that your processor consumes microamps, but one
second of WiFi traffic at a reasonable transmit power takes hundreds of
milliamps._

Early cell phones took an exceptional amount of power as well. Not simply
because they were earlier tech, but because the nearest tower might be 20
miles away.

Mesh networking is going to be the new black if Io(miniature)T really takes
off.

------
ayuvar
And I thought it was hard to find my phone now.

But seriously, ambient light harvesting and (eventually) a 20 meter
communication range are the coolest parts of all this.

Ignoring the "smart swarm" applications I could see these being generally
useful as an alternative to JTAG for other really small devices.

------
ithkuil
I'm having trouble finding the actual specs (e.g. size of RAM, ROM, etc.).
Does anyone have more info?

~~~
snops
From the pdf walterbell linked to, section "System Overview":

> ...two ARM® Cortex-M0 processors are located in separate layers with
different functionality as follows:

> The DSP CPU efficiently handles data streaming from the imager (or other
sensors), thus is built in 65nm CMOS (Layer 3) with a large 16kB
non-retentive SRAM (NRSRAM).

> The CTRL CPU manages the system using an always-on 3kB retentive SRAM
(RSRAM) to maintain the stored operating program, and is built in low
leakage 180nm CMOS.

For ROM, the CTRL CPU just always keeps its SRAM powered. I can't seem to find
the CPU frequencies, but I would imagine they are very low, as a previous
slightly larger version in 2010 had a Cortex M3 working at 1MHz max[1].

[1][http://blaauw.eecs.umich.edu/getFile.php?id=394](http://blaauw.eecs.umich.edu/getFile.php?id=394)

------
grandalf
Is this named in homage to this?
[http://www.qsl.net/wb5ude/kc6wdk/transmitter.html](http://www.qsl.net/wb5ude/kc6wdk/transmitter.html)

------
giodamelio
This is really cool. Running off of ambient light! I am looking forward to
what they can do with swarms of them.

~~~
Intermernet
Sorry, I just epiphanised (don't worry, I'll clean it up later).

My "amazing thought":

One day, these things will be like Lego. You _may_ care that your younger
sister ate one, or the dog buried _that part_ in the garden, but you won't
"really care". _not like a computer, or a phone!_

If standards for mesh networking and cluster computing become a tiny bit
better (and more widely adopted), these things _are the internet of things_.
Almost everything else is obsolete.

Self-powered, independent micro-modules that can somewhat autonomously join
and depart from processing pools mean that upgrading your "home computer"
means buying more Lego (or a new table, or light-bulbs, or a car) and moving
it within range of the other stuff in your house. _There will be no computer_
, just interfaces to your local compute cluster, which you probably won't
personally own much of, due to shared processing power agreements with your
neighbours and friends.

I know none of this is original, and I even get that this is the "big goal" of
the IoT, but _it's actually happening, and we get to see it happen!_ (along
with a wonderful new bag of problems relating to super-distributed trust,
cost, control etc.)

Sorry, over-excited, taking myself off to bed now.

~~~
walterbell
Latency between compute "molecules" will be a limiting factor.

------
fit2rule
Dust-computing is upon us. All hail the holy grey goo!

Seriously though, I wonder what the setback is with using energy harvesting in
general - or are we just at the beginning of the wave of energy harvesting
revolution?

[http://www.st.com/web/en/press/p3498](http://www.st.com/web/en/press/p3498)
[http://www.linear.com/products/energy_harvesting](http://www.linear.com/products/energy_harvesting)

~~~
minthd
I think that in most cases, using a rechargeable battery is cheaper - and
cheaper leads in most areas of embedded systems.Also most embedded systems
need power for something like motor/lcd/relay/etc and energy harvesting cannot
supply enough energy for that - and the issue of wireless connectivity of such
networks is only partially solved and not yet mature i think. And even the
micro-controllers are only half way there.

Also, the places that could use energy harvesting are sensor networks in
industry and buildings, and those are quite conservative industries (for good
reasons) that care greatly about reliability. The field of industrial sensor
networks is not fully developed either, with applications like predictive
maintenance being quite new. The same goes for the medical industry, where
such sensors could be valuable.

------
saganus
Stretching this a bit: when someone marries this with the MIT things that
auto-assemble... we risk getting some sort of Stargate Asuran race, which is
based on nanites. Self-assembling nanobots [0].

Of course this is just sci-fi... but nonetheless...

[0][http://stargate.wikia.com/wiki/Asuran](http://stargate.wikia.com/wiki/Asuran)

------
trynumber9
In the linked paper, they describe their choice of 180nm as optimal for power.
Given the tools they used, they could make an even smaller computer with 130nm
or 90nm technology. But the power draw would be higher according to the graph.
Or they could keep the same die size and add more memory instead.

The CPU is absolutely tiny, not much larger than the temperature sensor.

------
IgorPartola
So there would literally be a microchip in my brain?! All kidding aside, this
is cool.

~~~
Raphmedia
"So there could literally be a microchip in my brain?!"

Fixed that for you.

~~~
Raphmedia
Can't edit, so here's an explanation.

I didn't mean to say that I fixed the grammar. I meant it as in there could
actually be a microchip in your brain right now. The tech exists.

~~~
sfeng
The limitation is effective communication to/from the organic parts of your
brain.

------
nsxwolf
Can't wait for the Arduino equivalent of one of these.

