Arm unveils 1mm x 1mm 32bit chip: "years of battery life" (bbc.co.uk)



ARM Cortex M0+ processor specs:

http://www.arm.com/products/processors/cortex-m/cortex-m0plu...

Advantages over 8-bit and 16-bit processors for embedded solutions, according to ARM:

http://www.arm.com/products/processors/cortex-m/cortex-m0plu...


Interesting that they only make comparisons with PIC and 8051. AVR still blows them away in many areas; particularly energy efficiency, community, ease of use on non-Windows platforms, and price. (MSP430 too, for efficiency and price.)

I'm fooling around with the lower-power ARM chips on weekends, and they seem like a great option when you really want to shoehorn a full operating system in somewhere.


Can you please write a word or two about what you see as the advantages of AVR? I admit I don't have any experience, but I'd really like to know. Thanks.

For other readers, AVR:

http://www.atmel.com/products/microcontrollers/avr/default.a...


From an engineer's perspective, the AVR 8-bit instruction set is far nicer to work with than the 8-bit PIC. It's been a few years since I've been an embedded electronics engineer, but I seem to recall PIC not having an easy way to implement a stack, and hence a (proprietary) compiler which didn't support reentrant functions.

AVR OTOH has instructions which allow one to easily manipulate a stack, and since GCC targets AVR, you don't have to deal with Microchip's crappy compiler and can use whatever fun C constructs you wish and still get decent code.
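
To make that concrete, here's a minimal sketch (my own illustration, not from the parent): an ordinary recursive C function, which is reentrant because avr-gcc uses a real stack in SRAM for return addresses and spilled locals. No proprietary keywords or compiler tricks needed.

  /* Minimal sketch: plain reentrant C on an AVR.
     Build with something like: avr-gcc -mmcu=atmega328p -Os fact.c
     Return addresses and spilled locals live on the SRAM stack, so
     the function can safely be called from main code and from an ISR. */
  #include <stdint.h>

  uint32_t factorial(uint8_t n)
  {
      /* each invocation gets its own frame on the stack */
      return (n <= 1) ? 1 : (uint32_t)n * factorial(n - 1);
  }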


It's pretty apples to oranges to say something as general as "8-bit PIC"; there are several architectures within that family, four last I knew. The low-end ones are not intended to be programmed in C anyway (although there are some poor compilers that try). The PIC18 series isn't bad to work with in C; I used the C18 compiler at the time and it's 95% ANSI C.

I believe there is GCC support for the PIC24/dsPIC (16-bit) and the PIC32 (32-bit).

All that said, last I knew Microchip still led in global 8-bit MCU market share.


I don't understand why we need such an efficient chip in devices which are plugged in.

If the economic benefits of connecting those devices were so strong, that should have already happened.

What am I missing?


You really don't care as much when it's got mains power, but not all of these devices are plugged in. You will see them a lot in simple handheld consumer electronics like garage door openers, remote controls, wireless mice, kitchen timers and so on. In battery-powered devices, saving a mA makes a world of difference.


mA may matter for something like a phone, but really the war in the 8-bit space is over uA and nA.

Many 8-bit MCUs end up in places where their battery is expected to have a lifetime measured in years. Last I looked, I think Microchip was leading the way here in terms of sleep power: their PIC XLP series draws only ~20nA while sleeping.
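
For anyone wondering how firmware actually gets down into the uA/nA regime: the usual pattern is a short burst of work followed by the deepest available sleep mode, waiting on an interrupt. A rough sketch using avr-libc (the wake-up source - watchdog, pin-change interrupt, etc. - is device-specific and omitted here):

  /* Rough sketch of the burst-then-sleep pattern (avr-libc).
     Average current is dominated by sleep current when the duty
     cycle is low; wake-up source configuration is omitted. */
  #include <avr/sleep.h>
  #include <avr/interrupt.h>

  static void do_work(void) { /* read sensor, update state, ... */ }

  int main(void)
  {
      sei();                                   /* enable wake-up interrupts */
      for (;;) {
          do_work();                           /* short active burst        */
          set_sleep_mode(SLEEP_MODE_PWR_DOWN); /* deepest AVR sleep mode    */
          sleep_enable();
          sleep_cpu();                         /* halts until an interrupt  */
          sleep_disable();
      }
  }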

There is more competition in the 16-bit space, with the MSP430 from TI.


You seem to be missing that it is always a good idea to save power.


I don't think of AVRs as being particularly cheap relative to their competition, but I haven't tried to buy any in quantity lately, so what do I know. Maybe things have changed.


ARM is a brilliant example (if not the poster child) of why the UK government should be investing in new technology companies rather than in car manufacturing and banking.


I suppose you are too young to remember "picking winners", Harold Wilson, British Leyland, Red Robbo... Or more directly, Grundy and their NewBrain vs Acorn... Historically, government meddling in industry in the UK has been a disaster.


The US government picked winners like Intel, saving the US semiconductor industry through protectionist policies. (http://www.forbes.com/2003/10/10/1010grovepinnacor.html) In fact, the US technology industry was developed through enormous subsidy, from transistors to the internet.

Economist Ha-Joon Chang argues that this is how powerful nations develop. The entrepreneurial Japanese government's first forays into the auto industry were failures, but of course it kept at it, learning from failure. That's the nature of investment, as any HN reader should know.


My experience of government "investment" (both UK and EU) in the technology sector is that it has been the most frightful waste of money imaginable.

It's not down to a lack of money, but to the way it has been spent.


You have to remember how many failures it takes to have a big success. For every Intel there's a wasteland of shoulda-been companies that are dead and forgotten, with billions invested having gone up in smoke.

Of course, when it's private venture capital that's behind this, people don't seem to complain as much.


Honestly, I've been in a couple of startups (including one that got a lot of VC funding and went public) and I've never seen anything like the inefficiency that was normal in the public sector projects I worked on.

Not that I'm against public funding of research - I'm absolutely not. But my experiences of public sector research funding on "large" projects completely put me off working in that sector (so much so that I left academia to co-found a start-up).


Publicly funded projects are such an inefficient mess, in part, because of all the shit that is done to make sure that tax dollars aren't "wasted."


That was probably the single biggest issue, although there were others. There was such a huge administrative overhead on all of the projects that it was difficult to spot where any actual research was being done.


True that.

I've worked on publicly funded projects and in the private sector. The publicly funded projects were the worst for pointless reporting, accounting, and paperwork.

In the private sector, the boss pops his head in and asks what I've done today and what I'll finish by the end of the week, and we adjust goals. It takes about 5 minutes a week to do this.

When I was working on a publicly funded project, there was a report to be filed every day that took an hour of my time, a report to be filed every week that took about three hours over the course of the week, meetings with management to make sure that the first two reports were complete, accurate, and not fudged, which took an additional two hours a week, and an end-of-month assessment of both spending and time clocked that took about twelve hours over the course of a month.

So guess where I'm more productive.


I would speculate that you're probably more selective about the kinds of companies you work for than a VC is in funding them.

There have been some legendary turkeys over the last ten years, but for every pets.com or hairdryersbyemail.com there are dozens of others so misguided you never heard a thing; they just went nowhere fast.


> Of course, when it's private venture capital that's behind this, people don't seem to complain as much.

That's because private venture capital is people spending their own money, rather than other people's money.


No it isn't. It's usually a fund manager spending other people's money; if s/he strikes out too many times, the money leaves and the fund closes down.


"Taxpayers money" versus "Taxpayers retirement money" is a subtle distinction here, isn't it?


Japan's record isn't that good - MITI tried to strangle both Japanese auto exports and the semiconductor industry in their cradle because they didn't conform to Japan's development plan.


Sony's initial plan was rice cookers.


Serious question. Was it a disaster or did they just pick the losing team?

Sometimes the government picking a loser can still cause a market to expand.


Well, they picked one winner - Leyland - and force-merged it with all the losers, in the hope that Leyland's magic would somehow infuse them. Instead, it destroyed them all. Some of the brands, like Jaguar, Daimler, and Land Rover, live on. But not as British companies.


> Was it a disaster or did they just pick the losing team?

The government always picks the losing team. The winning team doesn't want anything to do with the government.


^dogma


My karma ran over your dogma.


The winning team is the one that pays the government, not the other way round.


What went wrong with Leyland? It's a very respected brand here in Uruguay; we still have some Leyland buses in operation, including 1949 Leyland Olympics (!!!)

It's even on Wikipedia http://en.wikipedia.org/wiki/Leyland-MCW_Olympic

"The Olympic was popular in Montevideo, with 240 entering service in the 1950s and 1960s. 50 of these were new to the Montevideo local authority, most of which passed to major independent and the other customer CUTCSA on privatisation. Some of the 240 were still in use as late as 2001 including a 1951 EL40 in use as a driver trainer."

Edit: kind-of-answered simultaneously here http://news.ycombinator.com/item?id=3698064 while I was writing the question :)

"they picked one winner - Leyland - and force-merged them with all the losers"


BL was responsible for some of the most ghastly cars ever created, including:

http://en.wikipedia.org/wiki/Morris_Marina

http://en.wikipedia.org/wiki/Austin_Allegro


I'm not quite sure why the government's choices back then would have any bearing on their choices 30 years later? It's not the same people. It's not even the same generation.


The poor state of car manufacturing and banking would seem to argue against inflicting "government investing" on new technology companies.


The banking industry managed to wreck itself without serious government investment. The trouble was an absence of effective deterrents to bankers seeking dangerously high levels of leverage.


I think many people are assuming this means direct investment, but maintaining a high-quality, accessible education system is arguably more important.


Especially as ARM originated from Acorn, whose success came from manufacturing the BBC Micro, for a publicly funded organisation (the BBC).


Ahem, Acorn paid the BBC a fee for every Beeb sold, as a licensing fee on use of the logo.

Acorn were also almost overlooked in favour of a company called Grundy, which had government backing, but inferior technology.

http://en.wikipedia.org/wiki/Grundy_NewBrain


I actually had a NewBrain; I think it's still around in my parents' attic somewhere.

I did quite a bit of real paying work on BBC Micros back in the day, and think they were great - but reading the way people tell the story today, one would be given the impression that the NewBrain was some kind of grim socialist piece of junk. It was actually a pretty decent bit of kit, quite a bit better quality than many of the other micros on the UK market at the time.

In an ironic turn, the company I worked for doing the BBC stuff later moved into Grundy's old premises in Teddington after they went bust.


On the other hand, the BBC Micro would almost certainly not have been the success it was without the Beeb and the lift it gave to Acorn in the educational market.


Look at it as very well-priced advertising; after all, the BBC effectively promoted their product. They nearly lost out to Sinclair Research as well, but that's a well-known story...


The BBC were careful never to say "Acorn" on-screen (same as they never said Sellotape, it was always "sticky-backed plastic", to the amusement of the entire population).


ARM employs about 2,000 people [1]. They're definitely a huge boon for the British tech community and competitiveness, but justifying government support for such a small employee base would be untenable.

[1] http://www.arm.com/about/company-profile/index.php?setcookie...


Yes. There's a temptation, because they both "make CPUs", to view ARM as the "next Intel", but it's not like that at all. Intel is an industrial manufacturing company. They happen to manufacture CPUs that they've designed, but "what they do" (or at least, where their money comes from and where the bulk of it goes) is manufacturing. In the ARM world, all that cash flow goes through TSMC, Samsung, TI, Global Foundries, Fujitsu, etc... Basically none of it goes through the UK.

ARM's model, of course, is absolutely the future of semiconductor design. But it's not going to bring an Intel-sized industry to the UK.


Most of the 2,000 may be highly skilled, or at least a far larger proportion than at a bank; increased funding for such a company may increase the number of people taking up science and engineering disciplines in education.


There's also all the other small companies that benefit from being near ARM, for instance the chip design team in Britain my company picked up.


Northern Rock employed 5000.


Bank bail-outs aren't about employment; they are about protecting people and other businesses from losses on assets held in the banks.


How many of those 2000 are in the UK?


Most of them.


To be fair, via TSB, the UK is investing in new technology companies... not sure how that compares to their investment in car manufacturing, though. However, afaik the main "investment" in banking was to save the banks that were going bust.

Disclaimer: I might be completely wrong about the above!


And yet, according to: http://www.raspberrypi.org/archives/509

"If a British company imports components, it has to pay tax on those (and most components are not made in the UK). If, however, a completed device is made abroad and imported into the UK – with all of those components soldered onto it – it does not attract any import duty at all. This means that it’s really, really tax inefficient for an electronics company to do its manufacturing in Britain"


Meh..

ARM only employs ~2000 people, not all of whom are in the UK. That's hardly a mass employer like the manufacturing sector is.


Seems like a fairly straightforward evolution; they are claiming 11.2uW/MHz vs. 16 for the non-plus M0. Sounds like the instruction set is the same. I'm curious whether the M0 has been getting significant design wins over the established low-power microcontrollers like PIC, MSP430, etc.
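
For a rough sense of what 11.2uW/MHz means in practice, here's a back-of-the-envelope estimate (my own numbers, not ARM's): assume the core runs continuously at 1MHz from a CR2032-class coin cell, and ignore peripherals, leakage, and self-discharge.

  /* Back-of-the-envelope battery-life estimate; every figure here
     is an assumption for illustration, not vendor data. */
  #include <stdio.h>

  int main(void)
  {
      double uw_per_mhz  = 11.2;          /* quoted M0+ figure          */
      double mhz         = 1.0;           /* assumed clock              */
      double core_mw     = uw_per_mhz * mhz / 1000.0;  /* ~0.011 mW     */
      double battery_mwh = 225.0 * 3.0;   /* CR2032: ~225 mAh at ~3 V   */
      double hours       = battery_mwh / core_mw;
      printf("~%.0f hours, ~%.1f years (core only, always on)\n",
             hours, hours / (24.0 * 365.0));
      return 0;
  }

That works out to roughly 6-7 years of continuous 1MHz operation on a coin cell, before even considering sleep modes, which is presumably where the "years of battery life" headline comes from.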


It looks like the trade-off is peripherals: the M0+ (as far as I can tell) doesn't have the goodies a generic PIC at the same price-point does.


Replying to myself: judging by the Freescale chips out there, the peripherals will probably get added by IP integrators. Should be interesting.


Yeah, that's how the ARM microcontroller market works. I imagine NXP's peripheral set will be quite similar to the ones on their existing Cortex-M0 chips. Freescale will presumably base theirs on what's in their M4 parts, albeit scaled down. For another point of reference you could look at STMicro's Cortex-M0 implementation as well.


Well, strictly speaking the Cortex A9 is supposed to be 1mm x 1.5mm, but that's:

1. According to ARM's promotional materials. No corroboration from licensees, as far as I'm aware.

2. Without caches. If you look at those pretty pictures of Intel chips, those huge swaths of regular rectangular stuff are caches, and they take up space.


Hope that when the "smart grid" or whatever you want to call it does come around, it comes with the electricity equivalent of net neutrality.


Well, ZigBee is what most governments are pushing for smart grids, and while it isn't totally free, it is an open standard that anyone can join. http://en.wikipedia.org/wiki/ZigBee

</shameless plug>


2.4GHz doesn't sound like a great frequency for smart grids if the nodes are spaced far apart.


Yes, if I were appointed dictator of the world I would certainly arrange things so that another frequency could be used.


At least in Europe it appears 868MHz is gaining ground and now 169MHz is being considered.


Yeah, there are lots of frequencies that are good locally, but not many that are good globally. If there were frequencies that were even just good in three of the US, Europe, India, and China that would probably be good enough.


Wow. Let's have government bureaucrats control not only how we use the Net, but also when we can use our appliances!

Why not just use nuclear to produce abundant energy?


The idea behind a smart grid is usually some form of the local utility company taking money off your electric bill if your appliances are smart about not using energy at times of peak load. Nothing about the concept means the government has to be involved.
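
As a toy illustration of that idea (all names and numbers here are made up, not a real API): a deferrable appliance just compares a broadcast price or peak signal against a threshold and shifts its load to off-peak hours, with no government in the loop.

  /* Toy demand-response sketch: defer a flexible load (water heater,
     dishwasher, EV charger) when the utility's price signal is high.
     current_price_per_kwh() and the threshold are hypothetical. */
  extern double current_price_per_kwh(void);  /* e.g. from a smart meter */
  extern void   run_heating_cycle(void);

  void maybe_run(void)
  {
      const double peak_threshold = 0.30;     /* assumed tariff boundary */
      if (current_price_per_kwh() < peak_threshold)
          run_heating_cycle();                /* off-peak: run now       */
      /* otherwise defer and try again at the next wake-up */
  }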


> Hope that when the "smart grid" or whatever you want to call it does come around, it comes with the electricity equivalent of net neutrality.

The post I was responding to was specifically calling for it to be government controlled.


Because it's expensive.


Link to non-mobile site: http://www.bbc.co.uk/news/technology-17345934

I'm a bit surprised this hasn't been done before.

Can this chip be solar powered (I mean, obviously, with a small solar cell)?


> I'm a bit surprised this hasn't been done before.

It kind of has been. This is not the first tiny low-power microprocessor out there -- it's just that ARM keeps pushing the envelope. The last "most power-efficient tiny 32-bit CPU" was the Cortex-M0, of which this is an evolutionary improvement.

Most of the embedded industry is still using really simple 8- or 16-bit microcontrollers. ARM's business strategy in this space is to push more advanced and complex CPUs down into it and grab market share by providing chips that are easier to program. It also helps that the size of the chip itself is a complete non-issue (the article referred to 1mm², while in reality the area of the core itself is less than 0.01mm² on a 40G process), and ARM can sell code density: because they have more powerful instructions than the typical microcontroller, their CPU is bigger, but the memory it needs is smaller.


No reason why it should not be solar powered. Around here all the parking meters are solar powered, purely for installation-cost reasons. Wiring electricity into 50bn devices would be a pain, unless they are electric anyway.


I've seen solar-powered road photoradars, and road lights at some crossings between cities. The solar cells sit on a 2-3 meter pole, tilted at roughly 45 degrees, with something like 0.5 square meters of surface, and a box the size of a car battery sits under the cell.

I guess it's more cost-effective than running wires a few kilometers from the city.

EDIT: something like this: http://www.google.pl/imgres?q=%C5%9Bwiat%C5%82a+baterie+slon...


Just curious, wouldn't a place that needs parking meters already be on the grid? Assuming we use parking meters to earn back the costs of building the parking lot or to stimulate the use of other forms of transportation than a car in that area.

On the other hand, having autonomous off-the-grid parking meters might be easier to install and maintain by a third party. Plus it allows you to start growing parking meters everywhere, even in the middle of nowhere. Let them pay!


The ones I've seen have been installed as higher-tech replacements for traditional mechanical parking meters. Solar power means the old meter head can just be removed from the pole and the new one attached, with no need to worry about running underground electric service where none was before. I'm sure the savings in installation cost more than make up for the addition of the solar panel.


> wouldn't a place that needs parking meters already be on the grid

Sure, but not having to dig additional holes in the ground is a huge cost saver.


It has been done before. At least according to the article, there is already competition in this market:

  Arizona-headquartered Microchip Technology designs and builds a rival range of 32-bit "Pic" microcontroller, while California-based Atmel offers 32-bit "Avr" products.
But doing a bit of fact-checking, it might not be so simple.

The Microchip website gives 404 errors when I try to read their press release http://www.microchip.com/pagehandler/en-us/press-release/mic... - not really a good sign; it's hard to verify any claims.

The Atmel TinyAVR devices are small (2mm x 2mm) but appear to be only 8-bit and aren't really comparable to the ARM offering.


32-bit uCs are nothing new. However, PIC32s are large, high-performance devices with AES, Ethernet, USB, etc. built in, depending on the model [1]. They operate in a totally different area. You can use one to run a web server and drive an LCD panel, for example.

Atmel also offers 32-bit uCs in the form of AVR32 and ARM-based parts.

[1] http://www.microchip.com/pagehandler/en-us/family/32bit/appl...


Improvements in "Ultracapacitors" will help with solar-powering devices.

(http://en.wikipedia.org/wiki/Electric_double-layer_capacitor)


Pair it with Bluetooth 4.0 and implant it inside my body. I will charge it myself once in a while using an inductive charger.


Why not just use energy harvesting? You move and you're warm.


But the warmth isn't across a temperature gradient, so it's difficult to exploit.


Quick question: under specs for the M0+ where they say,

> Enhanced Instructions
> Hardware single-cycle (32x32) multiply option

Does that mean their hardware can multiply 32-bit by 32-bit numbers in a single clock cycle?? I took a computer organization course where I implemented a simple hardware multiplier and it took a lot more cycles than that, so I was curious.


You can create a HW multiplier using only NOR and AND gates. It will consume a lot of silicon, but it will work fast enough.

BTW, most FPGAs have prebuilt HW multiplier blocks, because you will lose lots of flip-flops or LUTs (or both) implementing one yourself.
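
To connect the two answers: the multi-cycle multiplier from a typical computer-organization course does one shift-and-add step per clock, roughly like the loop below, so a 32x32 multiply costs on the order of 32 cycles. The single-cycle option instead spends silicon on a combinational array that produces all the partial products at once. (A sketch for illustration only.)

  /* Software analogue of an iterative shift-and-add multiplier:
     one partial product per loop iteration, i.e. roughly one per
     clock in a simple sequential hardware implementation. */
  #include <stdint.h>

  uint32_t mul32_shift_add(uint32_t a, uint32_t b)
  {
      uint32_t product = 0;
      for (int i = 0; i < 32; i++) {   /* ~32 "cycles" */
          if (b & 1u)
              product += a;
          a <<= 1;
          b >>= 1;
      }
      return product;                  /* low 32 bits of a*b */
  }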


Probably means throughput, not latency.


How does a 1mm x 1mm chip actually get used? I'm guessing that the actual package would have to be much bigger to enable mounting on a board?


Generally the package is big enough to get all the connections out at a scale that won't irritate manufacturing.

There are tiny packages though. NXP (one of the ARM builders) has a 2mm x 2mm package with a similar ARM in it, but they can only get 16 pins out at that scale; 4 are power and ground, one is clock, leaving you 11 pins to rule the world. (data sheet: http://www.nxp.com/documents/data_sheet/LPC1102.pdf)


These are not sold as physical chips, but as logic macros, which someone implementing an SoC pastes in with all the other stuff they want. The total chip size will of course be bigger so that it can have pins.



This is mostly aimed at the MSP430, I feel. The MSP430 is 16-bit and kind of strange to deal with; I'm happy to see the M0+ coming along to replace it.


I don't need years of battery life: I need 32-36 hours (of use, not standby). Then I can go from charging daily - like in the nineteenth century, where you had to wind your watch up every night, and Sherlock Holmes could tell a watch belonged to a drunkard by the scratches around the winding hole; how different is fumbling with the power adapter jack? - to charging whenever it is CONVENIENT for me. It still wouldn't quite last through a long weekend at a lodge with just one inconvenient power outlet if you use it a lot EVERY day, but most people don't, and if you do, you still only have to find a quiet period to charge it ONCE during the long weekend, like whenever the social activity dies down.

How many people keep multiple chargers at the different places (home/office) they frequent, or even take one with them ALL the time, just so they don't get left stranded? All this is fixed if the machine has a longer cycle than you do over a typical 1-2 day period. Then you can pick the most convenient time to charge up from 20 or 40 or 60 percent or wherever it's at by then; you can work on it at a cafe without plugging it in after working somewhere else you couldn't plug in the night before, or go away for the weekend without any charger at all, if you know you're back in the office Monday and are sure to use it less than that.

In the meantime you don't worry, you aren't inconvenienced.


This is not about being able to go for long periods without bothering to charge, but about energy usage, and being able to get a long period of actual usage from a machine without having to charge it from a reliable mains supply.

You do need years of battery life. Not by improving the capacity of batteries, but by improving the rate at which energy is drawn from them.

As the article quotes: "Every developed nation country has a graph showing electricity demand is going to outstrip supply at some point in the next 20 years unless we do something different,"

You are inconvenienced, just not in the manner you highlight.


I need years of battery life. Recharging means not being late for a charge cycle for 50+ years lest I suffer ventricular synchronization failure, and replacement means cutting a hole in my shoulder - both of which rather inconvenience me.


These chips are not going to be used in phones.


They might be used in phones, just not as the CPU/GPU. Think, flash memory controller etc.


I was talking about my laptops -- terribly sorry this wasn't clear! I thought the cafe/home/office example made it clear. Who works all day on their phone?


They aren't for laptops either.

Think about something more like a toaster or a temperature sensor.


Or a pacemaker - something drawing power for a very long time, and where recharging/replacing is literally painful and power failure is deadly.


But just because you don't need years doesn't mean nobody else does. And just because you immediately link ARM to smartphones and music players doesn't mean that's the only use for a power-efficient CPU. Think a little harder :-)


I was talking about my laptop. Terribly sorry I didn't make this clear; too late to edit! I mean that nobody even thinks of making a laptop with the battery life mentioned. For office tasks, who cares if it's an ARM?


I'm not sure it'd be possible given how much power would be drawn by the peripherals: screen, wifi, hard disks / SSDs, USB ports if used... although I do wonder if there are computers with netbook processing power and laptop batteries/form factor.


Right, but my point was that nobody even bothers to serve this laptop market. If there are laptops with netbook processing power (in both senses of the word: power usage and processing power), an SSD, and a massive battery that will push that sucker close to 20 hours (on email, word processing, office tasks), then I don't know about them. I just think nobody even considers that a priority. Instead, they focus on "a full day's charge" -- like a wind-up watch in the 19th century.



