Maybe not a microprocessor. Building something like that requires moderately deep pockets.
But hardware startups happen all the time. They just don't get anywhere near the kind of press coverage we do. I am working on a couple of embedded projects on the side, and I can tell you the barrier to entry there is pretty high. Web companies, for all their scalability problems, are still by and large low-hanging fruit compared to any hardware startup. There is also little VC activity there, and you won't get slots for pitches at conferences.
One of the things I see while working with embedded developers (I currently work with some very good ones): C is literally the bread and butter, so there are no language wars. The highest priority is stability (read: quality in general), followed by efficient code, and then maintainable code. Your survival largely depends on your ability to RTFM properly; you have to internalize and thoroughly understand the specs, the hardware interfaces you are trying to program, and the way C lets you program them, in an interplay with the best practices there.
The internet isn't always much help, as the platforms you are targeting aren't famous, and there are no hipsters breathlessly blogging every little development in that area.
Of the many things I've learned (and enjoyed learning), one is to go to great lengths to work things out on paper: mind maps, diagrams, graphs, doodles, checklists, etc. First prove a degree of correctness on paper, then begin to program. It has greatly increased my ability to break problems down and look at them with clarity. In turn, I am able to write stable, quality code.
In my opinion, everyone should work on an embedded project, even if it's not a serious one. And please work in plain C, without an OS.
Additionally, writing a Python program on Linux, running on something like a Raspberry Pi, is not hardware hacking or low-level programming.
After reading the transcripts about the code in the Toyota car that had the accident, it doesn't seem so. Maybe electrical engineers honestly try, but they don't seem to have the education (software engineering), the tools (languages that prevent errors, high-quality libraries, test tools), or the processes (TDD) to get there.
However, the key word there is "seem."
My anecdotal experience with electrical engineers makes me think otherwise. One engineer I worked closely with was given the assignment to produce the embedded code in a single day. She completed the task, including learning the platform and the buggy vendor compiler.
Another engineer didn't care and couldn't be bothered to learn it. When he did code, he hammered it out only until it appeared to work and then signed off on it. Most engineers (any type) would never sign their name to a project finished like that.
The same thing happens all the time with software, but software or hardware, the people I've worked with definitely have the ability and can easily produce great work as long as management is doing their job.
I think firmware (like Toyota's) gets neglected precisely because nobody can see it or improve on it later. I am hopeful that more open hardware initiatives will indirectly result in open-sourced firmware becoming more common, resulting in better code quality.
I realize I'm blaming management here. Effectively managing engineers is hard, and it's a topic I enjoy when it comes up on HN.
Yeah, seriously. The explanation I always give to software people is to look at how much complex high level systems programming relies on debuggers, IDEs, frameworks, compiler warnings, etc.
Now tear all that away (yes, there are JTAG debuggers and whatnot, but it's a different ball game). If you just dive in and start coding you will get lost, especially in cases where your system involves significant interplay between fields. For example, you are writing C code on an SoC hard core, which is talking to IP blocks and custom logic on the accompanying FPGA, which is talking over various interfaces to some high speed PCB you've made, which is talking to some proprietary PCB you didn't make. Okay, now where is the bug? Print statements ain't gonna cut it. You have to have a very clear idea of the principles at play: signal integrity, digital systems, E&M, communication protocols, HDL synthesis, and more.
That's why hardware is hard.
I also work with a guy who has been hacking hardware since the vacuum tube days, and has built his own computer literally from the ground up. He's an amazing resource, so I got really lucky.
As for barriers, I got started for less than $60. I bought an MCU, breadboard, programmer, iron, some LEDs and other components and started messing around. I haven't put together a PCB yet, because I want to make sure I've got everything right. Like the article says, the real world is unforgiving. I don't want to dump $50 on having a board etched and then find out I messed up the design.
With the MSP430 you have the LaunchPad board, and as far as I know your local electronics market will for dead sure have PIC starter boards. They shouldn't cost more than 800 rupees. If you go with PIC, you get a free IDE from Microchip (MPLAB X) and tons of free examples online, so on that end you won't have many problems.
So using a PIC should be the easiest way to get started. If you are in Bangalore, there are two shops at the end of SP Road (NSK Electronics and Sri Lakshmi Electronics) which stock eval boards for the PIC and MSP430. They will also have everything you need to get started and to move to an intermediate level, as they primarily stock parts that students use for college projects.
You will also need some resistors, LEDs, and probably a USB-to-serial converter cable. Take time to study the PIC tutorials online. The best place to start is this:
Tutorials at : http://embedded-lab.com/blog/?page_id=26
You can work through the hardware and software examples. It will require both patience and a bit of persistence to get through them, but with time you will only get better.
As for the cost, the net total will not be more than, say, 1500 rupees, which is pretty cheap for what you will learn.
As with these things, going in with a little research helps: spend 3-4 days going through the tutorials to figure out what parts, and in what quantities, you need to buy, so that you don't have to go there again and again.
BTW, Texas Instruments has recently upgraded the MSP430 to 128K.
Arduino gets you quickly up and running, without spending days fiddling with compilers and settings just to get something working.
The development costs are higher, but with some investment it's certainly possible for a small startup to create new CPUs and GPUs. Gaining market share is more difficult, which often necessitates having a huge established chip company as a backer.
So it is not just the high-performance hardware space; even relatively simple hardware projects suffer due to lack of experience.
I like how they blamed the users.
I can't remember the project either.
Not on Hacker News, but this does happen. Microprocessors get less of the spotlight because the costs involved in developing and producing them are huge. You also don't hear too often about a team successfully making a high-performance compiler.
Well, last week there was this post: http://blogs.msdn.com/b/ashleyf/archive/2013/09/21/chuck-moo... about a 144-core, ultra-low-power chip.
So important stuff gets posted ;)
NRE (one-off engineering cost) on ASIC design is very large. PCB manufacturing costs have gone down, thanks to companies like OSHpark, but you still have to assemble the boards. Then there are CE and UL requirements to actually sell into the market (somewhat evadable if you're doing "kits" or "prototypes").
As a result, you can't "fail early, fail often" without very deep pockets. And in some product categories, "failure" gets people killed (Toyota, passim).
Besides, the software market is much less saturated than the hardware market. Just as there was once an electronic-device explosion, when a single person could invent such fundamental things as the light bulb, the radio, or the transistor, today is the time when one person can "invent" yet-unseen software.
Besides, I think their comparison with faster microprocessors is unfair, because it requires actual innovation to push the current best hardware further; it is not enough to just reassemble parts found in different microprocessors to make something new, which is kind of the case with software.
Not as dramatic as the transistor, but then neither is a lot of software.
Edit: not to mention the continuous background delivery of radical innovation keeping Moore's law going. Now down to a feature size of ~200 silicon atoms and still going.
I'm sure you didn't intend to put down software development as being less complex or less difficult than hardware. The challenges are different and it's true that people from one field tend to underestimate the issues of the other.
Are you concerned that at some point it will be possible for people to buy a tiny but fully fledged computer and let software developers create gadgets that do their job but are not real hardware? Does this mean that people will appreciate the real work behind creating the actual device less?
The advent of complete operating systems makes it possible for people to easily create, debug, and test programs without sweating.
Certainly, lowering the bar made it possible for more incompetent people to produce lower-quality products.
Working in the layer that enables other people to build on a solid foundation and create stuff more easily without having to reinvent the wheel every time is more difficult but rewarding, and it's easily seen by people in each field as the "true $field".
So it's not software vs hardware, it's about hard vs soft problems, core vs peripheral issues, enabling vs business logic.
Replace $field_a and $field_b with (guitar, sitar), (piano, violin), (digital design, rf), (software, hardware), (python script, kernel programming), .....
Yet in every $field_a there are masters whose work is at least as exciting and praiseworthy (if not more so) than that of most who master $field_b, although it's true that $field_b requires more skill on average to enter.
The point the OP in the linked article makes is that a few guys with a few years of modest programming experience can go create a company worth 8 to 9 figures on the market, with few resources other than time, luck, and connections.
He then states that this isn't possible in the semiconductor world due to the amount of experience required in that field. I've worked in the semiconductor industry and I agree 100%. I've also worked in RF, and it's the same. There's no successful equivalent in those realms that compares to something like Twitter or most of the acquisition bait out there. The amount and complexity of the knowledge is insane. (Not to mention the manpower...)
And from a guy who's worked the whole stack from component level to user applications: Python scripting is a couple of orders of magnitude easier than debugging firmware, dealing with DSP code/math, or testing semiconductors/antennas. It doesn't require a doctorate to build websites or develop CRUD applications.
I'd wager that it's quite easy to get into hardware: there are tons of materials on building easy, fun projects in no time. By combining practical exercises with "The Art of Electronics" or the freely available "Lessons in Electric Circuits," one can gain enough knowledge to experiment on one's own in relatively short time.
Sauce: I'm an EE.
(This assumes chip design work (ASICs etc.); for general hardware design (uC work) the same principle applies, and then you don't really need the calculus and physics knowledge.)
Sauce: I'm an EE too... :D
That's also true about software. Once you get past the usual "comment your code carefully and make sure you're not redundant in it", 99% of the stuff that is useful in practice is stuff you derive from practice.
Software is full of thin ice where you can walk out on it, jump around, then, mistakenly thinking it's frozen over, drive your truck on, only to have it fall through. It looks easy. But there's a reason 90% of the Rails apps I've looked at had no tests.
 - http://ocw.mit.edu/courses/electrical-engineering-and-comput...
 - https://6002x.mitx.mit.edu/
I've seen some physicists put together mind blowing electrical systems.