LittleD – SQL database for IoT, can run queries in 1KB (github.com)
244 points by pfalcon on Nov 24, 2015 | 103 comments



I am the author of LittleD.

I am more than happy to answer any questions. It has mostly been an academic project that I worked on during my undergrad, but I am now looking at continuing it as part of a Ph.D. Of course, I would love to coordinate a broader development of this project. :)

You may also be interested in another project I and some lab mates have been working on over the last couple of years, IonDB: https://github.com/iondbproject/iondb.

EDIT: You might also be interested in the initial paper, which can be found here: https://people.ok.ubc.ca/rlawrenc/research/Papers/LittleD.pd...

I investigated query precompilation in another paper I am waiting to hear back on, and once everything has been a little better tested, that code will get pushed out as well. :)

And seriously, if anybody is interested in contributing, I would love to have some help. Get in contact with me!


How did you learn to write a database from scratch? What books, courses, tutorials do you recommend?


I took the same course @geedy is speaking of (with him, in fact) and there are two good comparable resources I'm aware of:

(1) Ramon's (excellent!) lecture notes themselves are currently (but not always) available on his website: https://people.ok.ubc.ca/rlawrenc/teaching/404/notes/index.h...

(2) Garcia-Molina, Ullman and Widom's book and courses taught on their work: http://www.amazon.ca/Database-System-Implementation-Hector-G... http://infolab.stanford.edu/~ullman/dbsi.html


I took a computer science course at the University of British Columbia's Okanagan campus called COSC404 under Professor Ramon Lawrence (who is/was also my research supervisor at UBC), which was on database implementation. It was here that I learned most of the material.


Any reason you targeted "IoT" devices specifically? If it's a good DB, and has a small footprint, I'm sure more than just "IoT" devs will take interest.


I did not target "IoT" specifically, if I am honest. It's one of the use cases I envision. But sensors and motes could find just as much use, as could anything that needs to collect and store data to make decisions over time.


What is the computational complexity of looking up, adding and deleting a single row?

Is it comparable with binary trees [O(n log n)] and hashtables [O(1) on average]?


That is one area where LittleD needs some improvement. We only support CREATE TABLE, INSERT, and SELECT right now. Inserts are O(1) because we only append.

That said, IonDB was in part motivated to be a storage engine for LittleD. Then you basically get exactly what you described: your choice of underlying dictionary data-structure for the performance characteristics you need. It will also make UPDATE/DELETE a breeze, once we incorporate the two code-bases!


Binary trees are log(n), not n*log(n)


You're right. n*log(n) is for building the BST in Tree sort.


How does this implementation compare to SQLite?


SQLite has many more features. It has more types, more functions, more SQL support. It also requires significantly more resources: they claim 100KiB of heap space and 4KiB of stack space, with at least 300KiB of compiled code when stripped as small as possible. I don't know exactly what that means in practice.

LittleD gives you the bare basics, using well under 100K of compiled code and under 1K of RAM for average queries, period. You will soon be able to strip out the query translator and "precompile" your queries with parameters, bringing the total compiled code down to under 35K on device. This is not always desirable, but in some cases it can be very useful.


Add more usage examples! Or even a tutorial. :)


I know I absolutely could make that README a lot better, and add an example set of code.
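
In the meantime, here is a rough sketch of what running a query looks like (I'm writing the names from memory, so treat them as approximate rather than the exact API):

    #include "dbparser/dbparser.h"

    void example_query(void)
    {
        /* All per-query state lives in this caller-provided buffer. */
        char          memseg[1000];
        db_query_mm_t mm;
        db_op_base_t  *root;
        db_tuple_t    tuple;

        init_query_mm(&mm, memseg, sizeof(memseg));

        /* Parsing builds the operator tree inside memseg. */
        root = parse("SELECT id, temp FROM sensors WHERE temp > 20;", &mm);

        init_tuple(&tuple, root->header->tuple_size, root->header->num_attr, &mm);

        /* Pull result tuples one at a time. */
        while (next(root, &tuple, &mm) == 1)
        {
            int temp = getintbyname(&tuple, "temp", root);
            /* ...act on temp... */
            (void)temp;
        }

        close(root, &mm);
    }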


When would you actually need a relational database running on such a small platform, today?

I understand programming in general and microcontrollers; I was paid to program one. Still, surely nowadays a system with a tiny 8-bit microcontroller is considered auxiliary to some kind of larger machine, which should be far better equipped to do heavy lifting such as running SQL databases?


There are still lots of uses for tiny microcontrollers that don't connect to a larger machine, or only do so on occasion. By having a database backend, you make it a lot easier to query the microcontroller out in the field for a specific subset of data, instead of it needing to dump everything out, wasting time and battery life.


You can easily get just part of your data using a binary tree or a hashtable and simple C constructs like for, while, and if. In most cases there is no need for a full-blown SQL engine.
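
For example, a minimal sketch of what I mean (the struct layout and send_reading() helper here are hypothetical):

    #include <stdint.h>

    /* Hypothetical in-RAM sensor log. */
    struct reading {
        uint16_t id;
        int16_t  temp;
    };

    static struct reading log_buf[64];
    static uint8_t        log_len;

    void send_reading(const struct reading *r);  /* hypothetical transmit helper */

    /* "SELECT * FROM log WHERE temp > threshold", done by hand. */
    void report_hot(int16_t threshold)
    {
        for (uint8_t i = 0; i < log_len; i++)
        {
            if (log_buf[i].temp > threshold)
            {
                send_reading(&log_buf[i]);
            }
        }
    }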


On the other hand, microcontrollers typically have ample Flash storage for code, but very limited amounts of RAM. Instead of making everyone rewrite their own memory-efficient data structures, it might well make sense to have a single, general-purpose implementation -- even if some of the code goes unused in a particular application.


Keep in mind that flash storage has a limited number of writes, so keeping frequently changing data in memory and choosing the right data structure can be crucial for your data's life span. SQL is not a universal solution for all use cases.


You are not wrong, but consider that the problem may be handled a number of ways. Once I move to storing everything through IonDB, you will be able to choose key-value stores that could wear-level, should one be implemented. Such algorithms have been suggested in other works. Furthermore, if the device supports a storage API that already does wear levelling, then an SQL layer on top should not hurt.


Well, if you can really make SQL work well within such constraints and be convenient to use - I see no harm in it.


If all you need is a key-value store, consider IonDB: https://github.com/iondbproject/iondb


I haven't written large amounts of C/C++, so I'm certainly not an expert, but I was impressed by how well the source was commented and structured. Nice to see well-documented, well-thought-out code, congrats.


    /* Advance a pointer num_times times.  Notice, this does no checking, so
    you better know what YOU ARE DOING!!!! */
The source is a pretty interesting read too.


Wow, thank you for the feedback. I take a lot of pride in making the code readable. Maintaining a large C project by oneself for notoriously fickle devices has been challenging, so you need all the help you can get.


I have written large amounts of C/C++ and I too am impressed by how well documented the code is (trust me, most C++ code isn't!)


Thank you! I actually have become more anal about code quality as I've gone along, to the point of annoying people. IonDB represents a much better coding standard, in my mind, than LittleD. Like I say, every bit of documentation and cleanliness helps!


This looks like a very cool project. I cloned it and generated the docs, and while the code is certainly well-documented, I didn't see any type of high-level narrative or public API. I was hoping to see at least a "littleD.h" or something, but it seems you have to do some source diving or look at the tests to figure out how to use this library.


Please go to https://github.com/graemedouglas/LittleD/issues/6 and give it +1 then.


I absolutely need to improve that side of it, there is no doubt. Thank you for the feedback!


Get a link to the paper up front in the readme.


I'd be quite interested to see this on an ESP8266 board. I use the Adafruit huzzah version[1] which comes with a lot of extra fluff for developers, but the core microcontroller is a 32 bit MCU with 64k of ram. It is about the size of a US Quarter Dollar coin and can be had for as low as ~$2 USD.

[1] https://www.adafruit.com/products/2471


We've actually been looking at getting some newer devices, I'll look to add this to our list. Thanks!


What is the advantage of using a database vs. rolling your own data structures on such a small device?


Dynamic queries and data serialization. Imagine sending just a text query (in SQL) around to a sensor network. It would be a fair amount of work to reproduce that on your own, but it seems like it could open up some nice features for distributed IoT.


It still seems pretty inefficient. With 4KB of RAM I'd just store my data in an array of structs and send it in raw binary to the central server. The SQL interface also leaves the potential for running unwanted SQL.

I'm reminded of a commercial home automation platform that had agonizingly slow response times because the LCD controllers were sending raw, unauthenticated Python code to the 70MHz central controller.


The same as on any other device, I guess - you either sit down and roll your own data structures, or just take and use a database.


I guess it depends on what you find more familiar. But at this device scale, one should really be comfortable with C data structures.


As someone who has spent the past year and change developing an automation system using the targeted platform (Megas) I can say that this sounds like a very bad way to do things.

While I applaud the effort, and it will certainly be useful for many people for many other reasons, I don't think IoT is the best use case here. If your data is at all valuable/useful, you don't want it sitting out somewhere on some device that will hopefully/maybe be online when you go to query it. Plus now just to be able to query it remotely you will still need to develop some sort of API that lives on the device that can talk to the LittleD database.

Finally if you really are doing 'IoT' you clearly have a bunch of things that you want to view/control from a centralized platform. When you have the devices talking to a server, you can do this. When you have to ask each device individually what is going on with it, this becomes much harder.


There isn't really a targeted platform. There just only happens to be one of me at the moment, and so I've only managed to compile it for a few devices.

As you noted, it's not really an IoT database. It could be used as such, but that's not the reason I built it.

I have also been working on the data transfer problem, because without the ability to share the data, it is in fact kind of a useless platform in many applications. I have a job manager being developed that will allow for scheduled or ad-hoc execution of functions. I've also written a small library to encode LittleD results for network transmission. Using some basic networking stack, it would be easy to assemble these pieces into something that could be viewed/controlled from a centralized platform. LittleD could even be modified with relative ease to query over other LittleD instances!

EDIT: I would actually like to encourage you to share some specific criticism of the IoT application. Are you speaking specifically about your automation system, or about IoT at large? It seems difficult to predict exactly how any one person might apply any given technology.


We've built a sync-enabled database, and even at 500 KB it's getting a lot of uptake in IoT applications.

Open source databases: http://developer.couchbase.com/mobile

Info about how GE is using it: http://www.couchbase.com/nosql-resources/presentations/offli...

Long story short: when you get the network stuff figured out, it's gonna be an interesting product.

Feel free to contact me (info in profile) if you want to chat about how this can fit into the industry.


When you say IoT applications, do you mean phone apps which are dealing with data which originated from a "thing" in the IoT, but that the data has already gotten to the phone or cloud through some other, non-Couchbase Mobile channel?

Or is it possible to run or communicate/sync with Couchbase Mobile directly from a "thing" itself?


Couchbase Mobile has an optional on-device REST API we use for p2p sync. You can also use it to push data from edge devices to a handset or base station, but typically phones will ping devices.


I don't work with Megas, but I have to say this sounds like a terribly brittle view.

Frequently it is exactly when connectivity is gone that you want some intelligence combined with data upon which to act, or a buffer in which to spool data. Otherwise you've just got a pile of sensors that are slightly easier to deploy than the ones with long extension cords.

Put another way,

> If your data is at all valuable/useful, you don't want it sitting out somewhere on some device that will hopefully/maybe be online when you go to query it.

So what do you do with data on your device when it is offline? If it is at all valuable/useful, hopefully you're not just dropping it on the floor when Daddy-node is unreachable. Sounds like a job for which some sort of structured data store might be useful...


That seems like a narrow-minded and dismissive thing to say unless you know what the intended use case is.

What if these IoT devices aren't just dumb sensors, and need to control some process? Then your argument is reversed; you want the data to be available locally, instead of having to contact a remote server that "will hopefully/maybe be online".


This is really neat. What's a realistic use case for a SQL-like database running on a microcontroller?


Author here.

Any time you want to store historical data, and query that data later.

Very similar to how you might imagine using a database for a website, just at a smaller scale. The project that inspired this work was a water metering project in Kelowna, BC. Basically, a friend of mine took a bunch of micro-controllers with soil moisture sensors, shoved them in the ground, and tried to come up with better demand-based watering schedules. It took them two months to develop the data management code, and they knew it could have been days or hours with a proper database.


Thanks! What would the storage look like for this?

(I have an ulterior motive for asking, which is that I will probably end up shoplifting this code and embedding it into the next set of levels for our CTF game, which is serverside-emulated AVR).


I did a fair amount of testing with arduino-based SD-card storage, but as long as you had some sort of way to store bytes to your preferred medium, that shouldn't be too hard to hack yourself. Check out https://github.com/graemedouglas/LittleD/blob/master/src/dbs...
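
Roughly speaking, all the storage layer needs from the medium is a handful of byte-oriented operations; something like this (illustrative only, not the actual interface in the linked file):

    #include <stdint.h>
    #include <stddef.h>

    /* Illustrative byte-level storage shim: anything that can supply these
       operations against its medium (SD card, raw flash, EEPROM, a file on a
       host OS) can back the relation storage. Names are not the real API. */
    typedef struct storage_dev
    {
        int    (*open)(struct storage_dev *dev, const char *name);
        size_t (*read)(struct storage_dev *dev, void *buf, size_t nbytes);
        size_t (*write)(struct storage_dev *dev, const void *buf, size_t nbytes);
        int    (*seek)(struct storage_dev *dev, uint32_t offset);
        void   *state;   /* medium-specific handle, e.g. an SD file object */
    } storage_dev_t;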

EDIT: Please let me know how/if you end up using it! Or if you run into bugs! :)


If I'm reading you right and extending your example more concretely, you're doing IoT SIMD parallelization, right?

So rather than 1M soil moisture sensors continuously sending records to a central repo, you're giving the mesh network something exciting to do by distributing "select * from db where moisture<200" or whatever then collating all the responses, if any...

There's a classic crypto thought experiment illustrating SIMD for cracking keys where the Chinese distribute 100M boom boxes (well, it's an old thought experiment) that randomly test keys, and when the red light turns on indicating a factor was found, the owner of the radio turns it in for a substantial reward. And that's how SIMD key cracking gets a 27-bit parallelism speedup over one box. In the IoT era I assume it's going to be a normal thing for lightbulbs and toasters to get pwned to mine bitcoins and the like.

You could probably extend both the real story and the old crypto thought experiment to help provide some docs. Then again I read your unit tests and they kind of document the system pretty well. Nice tests.

In true supercomputer fashion, by going SIMD you've taken a formerly CPU bound problem, and turned it into an IO bound problem, but for little microcontrollers not classic supercomputer hardware, which I thought was pretty funny.


In the example given, it was more about needing to collect data over a large area, such as across multiple city parks. They needed to manage data at the source(s) to inform good watering decisions.


Interesting. Hope the author gets a usable subset there in a flexible coding style. Might be ported to other devices and become a general thing.

Love the name, too, haha.

Note: How much of a need is there for an SQL database on 8-bitters, etc.? Can't one do that in a front-end on the client side and just have simple commands sent over the network to the device? That's what I always did for limited or security-critical devices. No way I'd put a whole 4GL on them lol.


My goal has always been to support a wide variety of devices.

As for the name, I cannot take credit. There is a good story behind it though. ;)

As for the need, I sort of explained the motivation in this comment: https://news.ycombinator.com/item?id=10622675. Data management would drastically reduce the development effort associated with data-intensive applications for smaller devices, in an IoT setting or otherwise. I've actually recently travelled to the University of Michigan to talk about this work, and got lots of good feedback. One of the professors there has invited me to study with him, because he feels there is a need. One of his graduate students is already working on integrating LittleD into some of his work!


Interesting. The main drawback I saw was that your paper is one of those paywalled behind ACM. Many end up freely available on academics' sites, Citeseerx, etc. Yet, some portion are stuck where few will ever look at them. Another example recently on here was OcaPIC: Ocaml on PIC MCU's. Recent paper locked up where people can't read and improve it.

Is there a public copy of your paper for people here to read? Might increase interest and contributions.



Excellent design and tradeoffs. I doubt people would've believed you two would've taken it this far down. Keeping a copy in case techniques like this might apply to future 8-bit work. :)


Thank you for the feedback!

I actually have a lot of thoughts of ways to improve this too, which is why I am applying to do a PhD in this area.


Good luck on the PhD!


BOOM! Thanks! For some reason, search didn't give me anything but ACM, ResearchGate, etc. Glad you kept a free copy online. :)


I discovered that my IoT home camera runs SQLite.


Do we have transaction (ACID) support? If not, have you thought about it? Is it feasible with the IoT constraints?


What are the actual memory usage/recommended RAM sizes for LittleD?


LittleD can be run using as little as 1KB of RAM for simple queries involving selections (WHERE clauses), projections (not just SELECT *) and joins. For queries with more than a couple joins, 2KB of RAM should still suffice.


This would be a good learning project in CS classes as well.


Is the size of SQLite really a problem? Is its reliability something that hardware devs are willing to sacrifice?

(In other words, how big is SQLite and what are the size restrictions for IoT devices – I haven't built one before)


Although SQLite's memory overhead is tiny by the standards of modern desktops/servers/phones, it's still much too big for 8-bit microcontrollers like the Arduino. If you want to build something that costs tens of cents instead of tens of dollars, SQLite is way too big.

For example, https://www.sqlite.org/about.html claims that SQLite can run in "very little heap (100KiB)"; this project targets an Atmel ATmega2560 which has only 8KiB of RAM.

(I doubt "5 times smaller than SQLite" is a useful comparison; the limiting factor is probably RAM consumption, as opposed to executable code which can be stored in cheaper non-volatile memory.)


I thought hard about what to put in the subject (limited length, you know) - code size or working RAM size - but figured that if I wrote "can run queries in 1KB", people wouldn't get it ;-).


Consider putting some comparisons on the readme!


That's what I'm telling LittleD's author too! I posted here on HN just to show him that it's not my whim, and other people think the same ;-).


Since the size comparison to SQLite in the title has made the thread mostly be an argument about size comparisons and SQLite, let's try the other phrase and see if we fare better.

(Submitted title was "LittleD – SQL database for IoT, 5 times smaller than SQLite").


I actually saw the new title without SQLite, and my first question was still how it compared to SQLite. :)


IMHO, for that kind of 8-bit device/size, the app's storage requirements are most likely limited.

Defining a few C structs with standard read()/write() function calls is most likely enough, and it is easy to design/test/debug.

One has a lot more control over RAM/ROM/code space with that approach. You can easily find out where/how every single byte/bit of memory is used.
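
Something like this is all I have in mind (a sketch; on a bare-metal part you'd substitute your flash or SD driver's calls for the POSIX read()/write() used here):

    #include <stdint.h>
    #include <unistd.h>   /* read()/write(); swap in your storage driver's calls */

    /* One fixed-size record per sample; the struct itself is the "schema". */
    struct sample {
        uint32_t timestamp;
        int16_t  moisture;
        int16_t  temp;
    };

    /* Append one record to the log. */
    static int log_sample(int fd, const struct sample *s)
    {
        return write(fd, s, sizeof *s) == (ssize_t)sizeof *s ? 0 : -1;
    }

    /* Read the next record; returns 1 on success, 0 at end of log. */
    static int next_sample(int fd, struct sample *s)
    {
        return read(fd, s, sizeof *s) == (ssize_t)sizeof *s;
    }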


128K/256K of built-in flash is midrange for microcontrollers now. A 16Mb SPI flash chip costs $0.50, and many newer devices have some external flash. Of course, nothing precludes you from managing all that space manually, bit by bit.


In our lab, we have used both SD-cards and flash chips built in to devices to make storage a possibility. When you need or want to do anything complicated, such as a projection or a join, IImplementedItDB tends to become a headache.


LittleD targets devices the size of an Arduino, while SQLite requires at least some sort of operating system offering a filesystem interface. The feature set of LittleD is also reduced as it supports only simple queries.

The corresponding paper [1] claims "SQLite requires at least 200 KB of code space and cannot run on popular platforms such as the Arduino." For comparison, Arduinos based on ATmega microcontrollers only offer up to 256 KB of flash memory at maximum.

[1] https://dl.acm.org/citation.cfm?id=2554891


SQLite definitely does not require a filesystem interface, or even standard OS interface. It does require some kind of storage, though, but that could be direct flash, RAM, etc.

I have spent a lot of time with it on ARM Cortex-M-class devices on small RTOSes or even bare metal, and you can adapt it to almost anything by plugging in new VFS interfaces (http://www.sqlite.org/vfs.html).
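
For anyone curious, "plugging in a VFS" boils down to filling in an sqlite3_vfs struct and registering it; a bare skeleton (most of the required callbacks omitted, and the xOpen body is only a placeholder) looks roughly like this:

    #include <sqlite3.h>

    /* Skeleton only: a real port must implement xOpen fully (wiring the
       sqlite3_file to its own I/O methods) and supply the other callbacks
       (xDelete, xAccess, xFullPathname, xRandomness, xSleep, xCurrentTime, ...). */
    static int flash_xOpen(sqlite3_vfs *vfs, const char *name,
                           sqlite3_file *file, int flags, int *out_flags)
    {
        (void)vfs; (void)name; (void)file; (void)flags; (void)out_flags;
        return SQLITE_IOERR;  /* placeholder */
    }

    static sqlite3_vfs flash_vfs = {
        .iVersion   = 1,
        .szOsFile   = sizeof(sqlite3_file),  /* really sizeof your file subclass */
        .mxPathname = 32,
        .zName      = "flash",
        .xOpen      = flash_xOpen,
        /* remaining members omitted in this sketch */
    };

    int register_flash_vfs(void)
    {
        return sqlite3_vfs_register(&flash_vfs, 1 /* make it the default VFS */);
    }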

It does require at least 10s of kb of RAM as a heap to do pretty much anything, as others have stated.


"You can adapt", or you actually have adapted it? I looked at SQLite internals and decided that I'd skip that adaptation for now, to see if someone else would do it. LittleD's author says he looked into adapting SQLite, and that's how he decided to write LittleD instead.


How little memory have you managed? They claim 100KiB, but if it could be even half that then there are some tremendous opportunities to use it. I'd love to hear more about your work adapting SQLite for Cortex-M-class devices.


The ATmegas only go up to 16 KB of RAM, with 4 KB or less being usual. As a Harvard-architecture device, none of your binary is memory-mapped, which can make porting existing code a bit annoying. sizeof(int)==2, which doesn't help either (however, sizeof(void *)==2 as well, so there is at least that).

It's rather unusual by today's standards. Certainly quite different from an ARM.
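
For instance, constants kept in flash can't be read through a normal pointer on AVR; you have to mark them PROGMEM and fetch them explicitly (a small example):

    #include <stdint.h>
    #include <avr/pgmspace.h>

    /* Stored in flash (program space), not copied into the scarce SRAM. */
    static const char greeting[] PROGMEM = "hello";

    char greeting_char(uint8_t i)
    {
        /* A plain greeting[i] would read from the wrong address space on AVR. */
        return (char)pgm_read_byte(&greeting[i]);
    }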


...though still the latest and sanest of that crop of micros, IMHO. Designed in 1996, it was once marketed as "the first new 8-bit architecture designed in 20 years", I believe :-)



LittleD is designed to run on devices the size of (higher-end) Arduinos. So it's the difference between not being able to use a database at all and being able to use one. Reliability is a different problem, which is addressed on deeply embedded devices in slightly different ways than on bloated servers (e.g. by exploiting flash memory properties, or monitoring the power supply in real time).


In the project's defense, I didn't see the size comparison anywhere other than this submission.


Yes, currently you need to build it yourself to do size comparisons. Generally, this HN post is an argument to LittleD's author that people care about his project, and that he should care about its potential users (and contributors!) and provide better disclosure of LittleD's capabilities and traits, not requiring everyone to dig deep to get basic facts.


You've been very good at nudging me pfalcon. Thank you!


It's not a tradeoff you are making; the fact is that your Arduino only has program space measured in dozens of KiB. If it doesn't fit, it doesn't fit.

But that's not the biggest problem. SQLite, like most programs, relies on having a C library like glibc available. This is pretty much the intermediary to the operating system - you can ask it for memory and it will allocate memory for you; you can ask it to open a file and it will facilitate that. Most programs assume it's always present, so the dependency isn't properly abstracted. This is a problem when you don't have an operating system.

Notice also that you can't just go and implement or stub out everything that glibc does. To do that, you would just end up having to add an operating system.

(This particular rabbit hole goes much deeper. A lot of programs sadly don't depend on a C library; they depend on glibc in particular. This causes massive problems for systems that run Linux yet can't accommodate glibc, like OpenWRT. They use an alternative C library that works fine 99% of the time, and for the remaining 1% programs will silently misbehave or crash because of an implicit dependency on whatever glibc does. I had this problem with the hwclock utility, which started segfaulting when OpenWRT switched to musl libc.)


As others have noted, many of the most cost-effective 8/16/32-bit AVR or ARM devices simply do not have the resources to run SQLite. I even looked at making it smaller before starting LittleD, but I deemed that a Herculean task.

SQLite itself requires at least 100KiB of heap space, which is sometimes many times more memory than these devices have available: http://www.sqlite.org/about.html


Back in 2004, I wrote code for the Intermec CK1 barcode scanner. It ran uClinux and had a stack size limited to 4K. That thing could only very barely run SQLite 2 (after applying a few patches so it would work with the small stack size), and I could never get SQLite 3 to work.

Granted, that was 10 years ago, but it was also quite a beefy device for its time. So while devices have naturally become more powerful in the last 10 years, current IoT devices are also much more compact than the old CK1.

As such I would say it's conceivable that you don't have the memory to easily run current SQLite on any of these devices.


Also mind that software doesn't stay the same either. As an example, LMDB (a smaller, key-value DB) bragged about 30K of object size in its initial docs and slides. Build the current version and you'll get 60K, and there doesn't seem to be a config option to make it smaller. So either use an older version, which is of course scary (who knows how many bugs were fixed later), or spend quite a lot of time tearing it apart, with unclear maintenance prospects. Having something ROM- and RAM-optimized from the beginning seems like a pretty good solution (if someone is brave enough to do it).


In all honesty, keeping the ROM size small is the challenge.


+1

SQLite is probably one of the most well-tested pieces of software on this globe. I do not know if this space saving is worth it.


It is if the space savings get your product sales due to a lower unit price, with big $$$ resulting. That's why there's a market for 8/16-bit processors. Most developers don't want it so much as need it, because getting the job done with a $1 CPU is better than using a $10-$100 one. ;)

EDIT to add two links by Ganssle to explain it better:

http://www.ganssle.com/rants/8bitsisdead.htm

http://www.ganssle.com/articles/8and16bit.htm

I think Microchip's 8-bitters doing $1+ billion in business with better profits and dividends vs troubles of many in 32-64 bit markets says a lot.


Those articles are over a decade old. The difference in silicon area between an 8-, 16-, or 32-bit ALU in an MCU is minuscule. Most of the space is taken up by peripherals, and this is where Microchip has historically thrived, but this is no longer so. There are a number of 32-bit MCUs that are cost-competitive with 8-bit parts.


And many of these still don't have enough RAM to make SQLite workable. That said, I look forward to the day when SQLite is the smallest database we will need; it's just up for debate when that will occur.


You think that's difficult? Here's the next challenging target for DB work or whatever after your PhD is done:

http://www.embeddedinsights.com/channels/2010/12/10/consider...

Bet you didn't know they were even still around, eh? Gets better:

http://www.electro-tech-online.com/threads/mc14500b-a-1-bit-...

With manuals and Verilog included. :)


I was recently invited to give a talk about LittleD at the University of Michigan. While there, I got to talk with one of the lead researchers working on the world's smallest computer, this thing: http://www.eecs.umich.edu/eecs/about/articles/2015/Worlds-Sm...

The basic computer has so little memory (~3KB total for code/data, IIRC) that anything sophisticated is impossible, but there are extensions that keep the thing much smaller than a typical embedded device (orders of magnitude smaller) while providing enough for there to be interesting work.


That's really neat. So tiny a fly could trip over it haha. Many of the advances in process nodes are making complex hardware small & efficient enough to do stuff like this. However, I still think there's worthwhile gains in R&D where we just throw more limited hardware in there w/ dedicated functions. We're seeing that with 8-bits + smart peripherals but I think there's more to do.

My last idea was several hundred 8-bit processors in a multi-core config that could handle many data-processing problems. Turns out it wasn't entirely original:

https://en.wikipedia.org/wiki/Kilocore

Doubted its marketability, but it supports my assertion that these little CPUs & such can go much further on modern nodes. Just gotta experiment. Your work is another example.


How about this month with a market survey? ;)

http://www.edn.com/electronics-blogs/embedded-insights/44408...

Interesting details on how 8-bit companies are doing that from John Donovan the year before:

https://www.digikey.com/en/articles/techzone/2014/feb/is-the...

We'd need a price list for the other thing. Can you get 32-bitters with decent memory & low watts for a few bucks a CPU in low volume? Aside from that, do you still think they're dead and useless given the new articles?

I'll add that many chips from old nodes survived for decades due to stability. The new process nodes have all kinds of issues and break faster. So there is that to consider if the application is long-term, safety-critical, or security-critical. Many in high-assurance design stayed with old stuff (esp. on SOI) because manufacturing bugs were fewer and interference/wear issues are lower.


Yes, the tiny Freescale, Cypress, etc. parts can be obtained for less than a dollar in reasonable quantity [1]. The microcontroller market is changing pretty rapidly of late, with the range of products getting broader and broader. For your high-reliability stuff, you'd probably want to look at the Cortex-R series, which is built for the type of use cases you describe.

[1] http://bit.ly/1NbLYov redirects to digikey search.


Appreciate the confirmation. Outside embedded, one use for these that people aren't appreciating is offloading interrupts or security checks of I/O devices, along with sensor/management stuff you can trust more. The cost being so low lets one do physical partitioning of domains instead of software.

Also, thanks for the mention of Cortex-R, as I hadn't heard of it. A quick glance at the description looks good. About $8 on the low end to $50 on the high end. Getting quite cheap indeed for what 8-bits would be required for. Way cheaper than the old champ (RCA 1802 MCU) is today ($150 w/ 1k units).

http://www.intersil.com/content/intersil/en/products/space-a...

OK. So I'll probably not have to go to 8-bit or drop large $$$ if I take on certain projects. Good to know. Dirt-cheap chips with a large ecosystem, lock-step, real-time, and MPUs... it's like the golden era in tech for embedded start-ups, eh? :)


The charts on the EDN article are flipped, 8 bit usage is dropping. Time is from right to left. ;)

My argument is that the economic cost of production between 8- and 32-bit MCUs is negligible. There is a _need_ for market segmentation, etc., but the cost of the package makes up for the cost in gates. It really is a wash. Much of what we still see in 8-bit designs is because of (no insult intended) 8-bit designers. I love me some 8-bit RISC, but AVR is having to fully embrace ARM even though it has its own AVR32 ISA.

I don't think they are dead and useless, but they are definitely on the out for most designs. Totally agree on old stable designs. Engineering tolerances are "better" now. Something to be said for the Brooklyn Bridge and DC3s.

Cheap and low volume? Yes.

http://www.mouser.com/Semiconductors/Embedded-Processors-Con...


"The charts on the EDN article are flipped, 8 bit usage is dropping. Time is from right to left. ;)"

I saw that. They still had almost 10% of a huge market, though. My point was that Ganssle's point about them not being gone from most cost-sensitive markets still seemed accurate.

"My argument is that the economic cost of production between 8 and 32 bit MCUs is negligible. There is a _need_ for market segmentation, etc. But the cost of the package makes up for the cost in gates."

That does seem to be true. Especially package vs gates.

"I don't think they are dead and useless, but they are definitely on the out for most designs."

I won't argue there. 32-bit space has improved significantly on cost and efficiency. Dropping down process nodes helped a lot, there. ;)


No. But running SQL where you never could run it before (where SQLite couldn't fit even remotely) is definitely interesting.



