I suppose you are too young to remember "picking winners", Harold Wilson, British Leyland, Red Robbo... Or more directly, Grundy and their NewBrain vs Acorn... Historically, government meddling in industry in the UK has been a disaster.
Well, they picked one winner - Leyland - and force-merged them with all the losers, in the hope that Leyland's magic would somehow infuse them. Instead, it destroyed them all. Some of the brands, like Jaguar, Daimler, Land Rover, etc live on. But not as British companies.
The US government picked winners like Intel, saving the US semiconductor industry through protectionist policies. (http://www.forbes.com/2003/10/10/1010grovepinnacor.html) In fact, the US technology industry was developed through enormous subsidy, from transistors to the internet.
Economist Ha-Joon Chang argues that this is how powerful nations develop. The entrepreneurial Japanese government's first forays into the auto industry were failures, but of course they kept at it, learning from failure. That's the nature of investment, as any HN reader should know.
You have to remember how many failures it takes to have a big success. For every Intel there's a wasteland of shoulda-been companies that are dead and forgotten, with billions invested having gone up in smoke.
Of course, when it's private venture capital that's behind this, people don't seem to complain as much.
Honestly, I've been in a couple of startups (including one that got a lot of VC funding and went public) and I've never seen anything like the inefficiency that was normal in the public sector projects that I worked on.
Not that I'm against public funding of research - I'm absolutely not. But my experiences of public sector research funding on "large" projects completely put me off working in that sector (so much so that I left academia to co-found a start-up).
That was probably the single biggest issue, although there were others: there was such a huge administrative overhead on all of the projects that it was difficult to spot where any actual research was being done.
I've worked on publicly funded projects and in the private sector. The publicly funded projects were the worst for pointless reporting, accounting, and paperwork.
In the private sector, the boss pops his head in and asks what I've done today and what I'll finish by the end of the week, and we adjust goals. It takes about 5 minutes a week to do this.
When I was working on a publicly funded project, there was a daily report that took an hour of my time, and a weekly report that took about three hours over the course of the week. On top of that came a meeting with management to make sure the first two reports were complete, accurate, and not fudged, which took an additional two hours a week, plus an end-of-month assessment of both spending and time clocked in that took about twelve hours over the course of a month.
I would speculate that you're probably more selective about the kinds of companies you work for than a VC is in funding them.
There have been some legendary turkeys over the last ten years, but for every pets.com or hairdryersbyemail.com there's dozens of others so misguided you never heard a thing, they just went nowhere fast.
"The Olympic was popular in Montevideo, with 240 entering service in the 1950s and 1960s. 50 of these were new to the Montevideo local authority, most of which passed to major independent and the other customer CUTCSA on privatisation. Some of the 240 were still in use as late as 2001 including a 1951 EL40 in use as a driver trainer."
ARM employs about 2,000 people. They're definitely a huge boon for the British tech community and competitiveness, but justifying government support for such a small employee base would be untenable.
Yes. There's a temptation, because they both "make CPUs" to view ARM as the "next Intel", but it's not like that at all. Intel is an industrial manufacturing company. They happen to manufacture CPUs that they've designed, but "what they do" (or at least, where their money comes from and where the bulk of it goes to) is manufacturing. In the ARM world, all that cash flow goes through TSMC, Samsung, TI, Global Foundries, Fujitsu, etc... Basically none of it goes through the UK.
ARM's model, of course, is absolutely the future of semiconductor design. But it's not going to bring an Intel-sized industry to the UK.
Most of those 2,000 employees are probably highly skilled, or at least a far larger proportion than at a bank. Increasing funding in such a company may increase the number of people taking up science and engineering disciplines in education.
I actually had a NewBrain, think it's still around in my parents' attic somewhere.
I did quite a bit of real paying work on BBC Micros back in the day, and think they were great - but reading the way people tell the story today, one would be given the impression that the NewBrain was some kind of grim socialist piece of junk. It was actually a pretty decent bit of kit, quite a bit better quality than many of the other micros on the UK market at the time.
In an ironic turn, the company I worked for doing the BBC stuff later moved into Grundy's old premises in Teddington after they went bust.
To be fair, via TSB, the UK is investing in new technology companies... not sure how that compares to their investment in car manufacturing, though. However, afaik the main "investment" in banking was to save the banks that were going bust.
Disclaimer: I might be completely wrong about the above!
"If a British company imports components, it has to pay tax on those (and most components are not made in the UK). If, however, a completed device is made abroad and imported into the UK – with all of those components soldered onto it – it does not attract any import duty at all. This means that it’s really, really tax inefficient for an electronics company to do its manufacturing in Britain"
Interesting that they only make comparisons with PIC and 8051. AVR still blows them away on many angles; particularly energy efficiency, community, ease of use on not-windows, and price. (MSP430 too, for efficiency and price.)
I'm fooling around with the lower power ARM chips on weekends, and they seem like a great option when you really want to shoehorn in a full operating system somewhere.
From an engineer's perspective, the AVR 8-bit instruction set is far nicer to work with than the 8-bit PIC. It's been a few years since I've been an embedded electronics engineer, but I seem to recall PIC not having an easy way to implement a stack, and hence a (proprietary) compiler which didn't support reentrant functions.
AVR OTOH has instructions which allow one to easily manipulate a stack, and since GCC targets AVR, you don't have to deal with Microchip's crappy compiler and can use whatever fun C constructs you wish and still get decent code.
It's pretty apples to oranges to say something as general as '8-bit PIC': there are many architectures within that family, four last I knew. The low-end ones aren't intended for coding in C anyway (although there are some poor compilers that try). The PIC18 series isn't bad to work with in C; I used the C18 compiler at the time and it's 95% ANSI C.
I believe there is gcc support for the pic24/dsPIC (16bit) and the PIC32(32bit).
All that said, last I knew Microchip still leads in 8-bit MCU global market share.
You really don't care as much when it's got mains power, but not all of these devices are plugged in. You will see them a lot in simple handheld consumer electronics like garage door openers, remote controls, wireless mice, kitchen timers and so on. In battery powered devices, saving a mA makes a world of difference.
mA may matter for something like a phone but really the war in the 8bit space is over uA and nA.
Many 8bit mcus end up in places where their battery is expected to have a lifetime measured in years. Last I looked, I think Microchip is leading the way here in terms of sleep power. Their PIC XLP series only draws ~20nA while sleeping.
There is more competition in the 16bit space with the MSP430 from TI.
Seems like a fairly straightforward evolution, they are claiming 11.2uW/MHz vs. 16 for the non-plus M0. Sounds like the instruction set is the same. I'm curious whether the M0 has been getting significant design wins over the established low-power microcontrollers like PIC, MSP430, etc.
Yeah, that's how the ARM microcontroller market works. I imagine NXP's peripheral set will be quite similar to the ones on their existing Cortex-M0 chips. Freescale will presumably base theirs on what's in their M4 parts, albeit scaled down. For another point of reference you could look at STMicro's Cortex-M0 implementation as well.
Yeah, there are lots of frequencies that are good locally, but not many that are good globally. If there were frequencies that were even just good in three of the US, Europe, India, and China that would probably be good enough.
The idea behind a smart grid is usually some form of the local utility company taking money off your electric bill if your appliances are smart about not using energy at times of peak load. Nothing about the concept means the government has to be involved.
> I'm a bit surprised this hasn't been done before.
It kind of has been. This is not the first tiny low-power microprocessor out there -- it's just that ARM keeps pushing the envelope. The last "most power-efficient tiny 32-bit cpu" was the Cortex-M0, of which this is an evolutionary improvement.
Most of the embedded industry is still using really simple 8- or 16-bit microcontrollers. ARM's business strategy in this space is to push more advanced and complex CPUs down into it and grab market share by providing chips that are easier to program. It also helps that the size of the chip itself is a complete non-issue (the article referred to 1mm², while in reality the area for the core itself is less than 0.01mm² on a 40G process), and ARM can sell code density: because they have more powerful instructions than the typical microcontroller, their CPU is bigger but the memory it needs is smaller.
No reason why it should not be solar powered. Around here all the parking meters are solar powered, just for cost of installation reasons. Wiring electricity into 50bn devices would be a pain, unless they are electric anyway.
I've seen solar powered road photoradars, and road lights at some crossings between cities. The solar cells are on a 2-3 meter pole, tilted at roughly 45 degrees, with something like 0.5 square meters of surface, and a big box the size of a car battery under the solar cell.
I guess it's more cost effective than drawing wires a few kilometers from the city.
Just curious, wouldn't a place that needs parking meters already be on the grid? Assuming we use parking meters to earn back the costs of building the parking lot or to stimulate the use of other forms of transportation than a car in that area.
On the other hand, having autonomous off-the-grid parking meters might be easier to install and maintain by a third party. Plus it allows you to start growing parking meters everywhere, even in the middle of nowhere. Let them pay!
The ones I've seen have been installed as higher tech replacements for traditional mechanical parking meters. Solar power means the old meter head can just be removed from the pole and the new one attached, with no need to worry about running underground electric service where none was before. I'm sure the savings in installation cost more than makes up for the addition of the solar panel.
32 bit uCs are nothing new. However PIC32s are large high performance devices with AES, Ethernet, USB etc built in depending on model. They operate in a totally different area. You can use one to run a webserver and drive an LCD panel for example.
Atmel also offer 32 bit uCs in the form of AVR32 and ARM based parts.
Does that mean their hardware can multiply 32-bit by 32-bit numbers in a single clock cycle? I took a computer organization course where I implemented a simple hardware multiplier, and it took a lot more cycles than that, so I was curious.
Generally the package is big enough to get all the connections out at a scale that won't irritate manufacturing.
There are tiny packages though, NXP (one of the ARM builders) has a 2mm x 2mm package with a similar ARM in it, but they can only get 16 pins out at that scale, 4 are power and ground, one is clock leaving you 11 pins to rule the world. (data sheet: http://www.nxp.com/documents/data_sheet/LPC1102.pdf)
These are not sold as physical chips, but as logic macros, where someone implementing a soc pastes this on with all the other stuff he wants. The total chip size will of course be bigger so that it can have pins.
I don't need years of battery life: I need 32-36 hours (of use, not standby). Then I can go from charging daily - like in the nineteenth century, where you had to wind your watch up every night, and Sherlock Holmes could tell a watch belonged to a drunkard by the scratches around the winding-key hole; how different is it to fumble with the power adapter jack? - ahem, to charging whenever it is CONVENIENT for me. It still wouldn't quite last through a long weekend at a lodge with just one inconvenient power outlet if you use it a lot EVERY day, but most people don't, and if you do, you still only have to find a quiet period to charge it ONCE during the long weekend, like whenever the social activity dies down.
How many people keep multiple chargers at the different places they spend time (home/office), or even take one with them ALL the time, just so they don't get left stranded? All this is fixed if the machine has a longer cycle than you do, over a typical 1-2 day cycle. Then you can pick the most convenient time to charge up from 20 or 40 or 60 percent or wherever it's at by then; you can work on it at a cafe without plugging it in after working somewhere else you couldn't plug it in the night before, or go away for the weekend without any charger at all, if you know you're back in the office Monday and are sure to use it less than that.
In the meantime you don't worry, you aren't inconvenienced.
This is not about being able to go for long periods without bothering to charge, but about energy usage, and being able to get a long period of actual usage from a machine without having to charge it from a reliable mains supply.
You do need years of battery life. Not by improving the capacity of batteries, but by improving the rate at which energy is drawn from them.
As the article quotes:
"Every developed nation country has a graph showing electricity demand is going to outstrip supply at some point in the next 20 years unless we do something different,"
You are inconvenienced, just not in the manner you highlight.
I need years of battery life. Recharging means not being late for a charge cycle for 50+ years lest I suffer ventricular synchronization failure, and replacement means cutting a hole in my shoulder - both of which rather inconvenience me.
But it's not because you don't need years that nobody else would. And it's not because you immediately link ARM to smartphones and music players that that's the only use for a power-efficient CPU. Think a little harder :-)
I was talking about my laptop. Terribly sorry I didn't make this clear, too late to edit! I mean that nobody even thinks of making a laptop with the battery life mentioned. For office tasks, who cares if it's an arm?
I'm not sure it'd be possible given how much power would be drawn by the peripherals: screen, wifi, hard disks / SSDs, USB ports if used... although I do wonder if there are computers with netbook processing power and laptop batteries/form factor.
Right, but my point was that nobody even bothers to serve this laptop market. If there are laptops with netbook processing power (in both senses of the word: power usage and processing power), an SSD, and a massive battery that will push that sucker close to 20 hours (on email, word processing, office tasks), then I don't know about them. I just think nobody considers that a priority. Instead, they focus on "a full day's charge" -- like a wind-up watch in the 19th century.