Why Hardware Development is Hard, Part 2: The Physical World is Unforgiving (danluu.com)
95 points by mmastrac on Nov 11, 2013 | 40 comments



>> Hardware is different. You never hear about a new team successfully making a high-performance microprocessor.

Maybe not a microprocessor. Building something like that requires moderately deep pockets.

But hardware startups happen all the time. They just don't get anywhere near the kind of press coverage we do. I am working on a couple of embedded projects on the side, and I can tell you the barrier to entry there is pretty high. Web companies, for all their scalability problems, are still by and large low-hanging fruit compared to any hardware startup. There is also little VC activity in the space, and you won't get slots for pitches at conferences.

One of the things I've noticed while working with embedded developers (I am currently working with some very good ones): C is literally the bread and butter, so you have no language wars. The highest priority is stability (read: quality in general), followed by efficient code, and then maintainable code. Your survival largely depends on your ability to RTFM properly; you have to internalize and thoroughly understand the specs, the hardware interfaces you are trying to program, and the way C lets you program them, in interplay with the best practices there.
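
To make that concrete, here is a minimal sketch of what "the way C lets you program them" usually boils down to: memory-mapped registers accessed through volatile pointers. The base address and bit positions below are invented for illustration; the real values come from the part's datasheet, which is exactly why RTFM matters so much.

    #include <stdint.h>

    /* Hypothetical UART registers; the real addresses come from the datasheet. */
    #define UART0_BASE   0x40001000u
    #define UART0_STATUS (*(volatile uint32_t *)(UART0_BASE + 0x00))
    #define UART0_DATA   (*(volatile uint32_t *)(UART0_BASE + 0x04))
    #define TX_READY     (1u << 5)   /* hypothetical "transmit FIFO has room" bit */

    static void uart_putc(char c)
    {
        while (!(UART0_STATUS & TX_READY))
            ;                        /* busy-wait until the peripheral can accept a byte */
        UART0_DATA = (uint32_t)c;
    }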

The internet isn't always of much help, as the platforms you are targeting aren't very well known, and there are no hipsters breathlessly blogging every little development in the area.

One of the many things I've learned (and enjoyed) is to go to great lengths to work things out on paper: mind maps, diagrams, graphs, doodles, checklists, etc. First prove a degree of correctness on paper, then begin to program. It has greatly increased my ability to break problems down and look at them with clarity, which in turn lets me write stable, quality code.

In my opinion, everyone should work on an embedded project, even if it's not a serious one. And please, work in plain C without an OS.

Additionally, writing a Python program on Linux running on something like a Raspberry Pi is not hardware hacking or low-level programming.


>> The highest priority is stability (read: quality in general)

After reading the transcripts about the code in the Toyota car that had an accident, it doesn't seem so. Maybe electrical engineers honestly try, but they don't seem to have the education (software engineering), the tools (languages that prevent errors, high-quality libraries, test tools), or the processes (TDD) to get there.


Without commenting on the Toyota situation, yes, it is common to find embedded code written by an electrical engineer who does not seem to have the necessary skills.

However, the key word there is "seem."

My anecdotal experiences with electrical engineers make me think otherwise. One engineer I worked closely with was given the assignment to produce the embedded code in a single day. She completed the task, including learning the platform and the buggy vendor compiler.

Another engineer didn't care and couldn't be bothered to learn it. When he did code, he hammered it out only until it appeared to work and then signed off on it. Most engineers (any type) would never sign their name to a project finished like that.

The same thing happens all the time with software, but software or hardware, the people I've worked with definitely have the ability and can easily produce great work as long as management is doing their job.

I think firmware (like Toyota's) gets neglected precisely because nobody can see it or improve on it later. I am hopeful that more open hardware initiatives will indirectly result in open-sourced firmware becoming more common, resulting in better code quality.

I realize I'm blaming management here. Effectively managing engineers is hard, and it's a topic I enjoy when it comes up on HN.


> go to great lengths to work things out on paper

Yeah, seriously. The explanation I always give to software people is to look at how much complex high-level systems programming relies on debuggers, IDEs, frameworks, compiler warnings, etc.

Now tear all that away (yes, there are JTAG debuggers and whatnot, but it's a different ball game). If you just dive in and start coding, you will get lost, especially in cases where your system involves significant interplay between fields. For example, you are writing C code on an SoC hard core, which is talking to IP blocks and custom logic on the accompanying FPGA, which is talking over various interfaces to some high-speed PCB you've made, which is talking to some proprietary PCB you didn't make. Okay, now where is the bug? Print statements ain't gonna cut it. You have to have a very clear idea of the principles at play: signal integrity, digital systems, E&M, communication protocols, HDL synthesis, and more.

That's why hardware is hard.
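
For what it's worth, when printf can't help, the usual fallback is something crude like the sketch below: log events into a small RAM ring buffer that you can dump over JTAG (or toggle a spare GPIO and watch it on a logic analyzer). The names and sizes here are illustrative, not from any particular vendor SDK.

    #include <stdint.h>

    #define TRACE_DEPTH 64u              /* power of two keeps the wrap-around cheap */

    static volatile uint32_t trace_buf[TRACE_DEPTH];
    static volatile uint32_t trace_idx;

    /* Record a 16-bit tag and a 16-bit value; later, halt the core with the
     * debugger and read trace_buf/trace_idx to reconstruct what happened. */
    static inline void trace(uint16_t tag, uint16_t value)
    {
        trace_buf[trace_idx & (TRACE_DEPTH - 1u)] = ((uint32_t)tag << 16) | value;
        trace_idx++;
    }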


Can you advise on how one can get started in embedded programming? What are the barriers, in terms of cost and hardware, to begin? I can't think of a real project that I would be fond of making, but even a set of blinking lights would be an achievement for me.


I've also just started learning hardware development in my spare time. It's a very different challenge than software, and I'm already having a ton of fun with it. To get started, I'd recommend taking a look at this tutorial: http://www.newbiehack.com/MicrocontrollerTutorial.aspx

I also work with a guy who has been hacking hardware since the vacuum tube days, and has built his own computer literally from the ground up. He's an amazing resource, so I got really lucky.

As for barriers, I got started for less than $60. I bought an MCU, breadboard, programmer, iron, some LEDs and other components and started messing around. I haven't put together a PCB yet, because I want to make sure I've got everything right. Like the article says, the real world is unforgiving. I don't want to dump $50 on having a board etched and then find out I messed up the design.
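
If it helps, the canonical first project looks something like the AVR "blink" below (a sketch assuming avr-gcc, an LED on PB0, and a 1 MHz clock; adjust F_CPU and the pin to match your own wiring).

    #define F_CPU 1000000UL          /* assumed clock; change to match your part's fuses */

    #include <avr/io.h>
    #include <util/delay.h>

    int main(void)
    {
        DDRB |= (1 << PB0);          /* configure PB0 as an output */

        for (;;) {
            PORTB ^= (1 << PB0);     /* toggle the LED */
            _delay_ms(500);
        }
    }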


I don't think you will need to go to the PCB level right now. You can do a good deal of work on a breadboard, and for things that are a little complicated, building reusable modules and learning a bit of soldering works most of the time.


Just checked your profile; nice to know you are from Jaipur. The easiest way is to start with an Arduino; the other options are PIC or MSP430.

With the MSP430 you have the LaunchPad board, and as far as I know your local electronics market will for dead sure have PIC starter boards. They should not cost more than 800 rupees. If you go with PIC, you will get a free IDE from Microchip (MPLAB X) and tons of free examples online, so on that end you won't have many problems.

So a PIC should be the easiest way to get started. If you are in Bangalore, there are two shops at the end of SP Road (NSK Electronics and Sri Lakshmi Electronics) which stock eval boards for PIC and MSP430. They will also have everything you need to get started and to move to an intermediate level, as they primarily stock parts that students use for college projects.

You will also need some resistors, LEDs, and probably a USB-to-serial converter cable. Take time to study the PIC tutorials online. The best place to start is this:

http://embedded-lab.com/

Tutorials at : http://embedded-lab.com/blog/?page_id=26

You can work through the hardware and software examples. It will require both patience and a bit of persistence to get through them, but with time you will only get better.

As for the cost, the total will not be more than, say, 1500 rupees, which is pretty cheap for what you will learn.

As with these things, a little research up front helps: spend 3-4 days going through the tutorials to figure out what parts, and in what quantities, you need to buy, so that you don't have to go back to the shop again and again.


I went the MSP430 route; it was essentially free (from memory, about US$5 including postage). If you are on a very tight budget, this might be better. There are a ton of free tutorials, forums, and development tools available. The caveat is that setting up the dev environment on Linux is not that easy, whereas on Windows it is essentially plug-and-play.

BTW, Texas Instruments has recently upgraded the MSP430 to 128K.


An Arduino kit is a good starting point. You can get some starter kits with excellent tutorials for $50 or less.

Arduino gets you quickly up and running, without spending days fiddling with compilers and settings just to get something working.


There are plenty of hardware startups; you just don't hear about them a lot. The AVR (used in the Arduino) was designed by a few guys from Norway, who sold their idea to Atmel.

The development costs are higher, but with some investment it's certainly possible for a small startup to create new CPUs and GPUs. Gaining market share is more difficult, which often necessitates having a huge established chip company as a backer.


As another data point: a couple of days back there was an article about a hardware Kickstarter product where the designers did not add reverse-polarity protection to their power socket. (For the life of me I can't find it now...)

So it is not just the high-performance hardware space; even relatively simple hardware projects suffer from lack of experience.


You're probably thinking of this article: https://news.ycombinator.com/item?id=6689004

I like how they blamed the users.


Yes that was it, thank you makomk!!


I think it was a lack of polarity protection on the battery connection. The fact that the battery could be inserted the wrong way was interesting, too.

I can't remember the project either.


> You never hear about a new team successfully making a high-performance microprocessor.

Not on Hackernews, but this does happen. Microprocessors do get less spotlight because the costs involved in developing and producing them are huge. You also don't hear too often about a team successfully making a high-performance compiler.


"Not on Hackernews"

Well, last week there was this post: http://blogs.msdn.com/b/ashleyf/archive/2013/09/21/chuck-moo... about a 144-core, ultra-low-power chip.

So important stuff gets posted ;)


Another important factor: bringing it to market doesn't scale up or down like software does. You can make a program and test it on a single user; or, at the other end, with a bit of pre-planning you can scale upwards by buying more EC2/AWS capacity on a credit card.

NRE (one-off costs) for ASIC design is very large. PCB manufacturing costs have gone down, thanks to companies like OSHpark, but you still have to assemble the boards. Then there are CE and UL requirements to actually sell into the market (somewhat evadable if you're doing "kits" or "prototypes").

As a result, you can't "fail early, fail often" without very deep pockets. And in some product categories, "failure" gets people killed (Toyota passim).


> Another important factor: bringing it to market doesn't scale up or down like software does.

Besides, the software market is much less saturated than hardware. Just as there was once an explosion of electronic devices, when a single person could invent such fundamental things as the light bulb, the radio, or the transistor, today is the time when one person can "invent" software nobody has seen before.

Besides, I think their comparison with faster microprocessors is unfair, because pushing the current best hardware further requires actual innovation; it is not enough to just reassemble parts found in different microprocessors into something new, whereas with software that kind of recombination often is enough.


I don't agree that the market is saturated; people are building new gadgets all the time, especially "internet of things" applications. New applications of techniques like capacitive sensing or innovative radio modulation keep appearing.

Not as dramatic as the transistor, but then neither is a lot of software.

Edit: not to mention the continuous background delivery of radical innovation keeping Moore's law going. Now down to a feature size of ~200 silicon atoms and still going.


Yes, I should have said "more saturated", because, first, electronic devices have been around for longer and, second, physical devices have more limitations than software, so it is more difficult to make something new and useful.


My background is in hardware, especially RF, and it's fun to talk to pure software guys who "understand" RF and radios just because they set up a wireless network at their apartment or made a cantenna. I invite them to hang out with some of the world-class engineers I've worked for in the past who design microwave networks, test semiconductors, build cell systems, or do technical work of that caliber. You really aren't experienced in the field without 30,000 hours of dedicated effort.


"My background is in software, especially programming, and it's fun to talk to pure hardware guys that "understand" programming and programs just because they installed a linux box at their apartment or made a web page. I invite them to hang out with some world class engineers that I've worked for in the past that create compilers, test distributed systems, maintain and evolve widely used libraries or technical ilk of that caliber."

I'm sure you didn't intend to put down software development as being less complex or less difficult than hardware. The challenges are different and it's true that people from one field tend to underestimate the issues of the other.

Are you concerned that at some point it will be possible for people to buy a tiny but fully fledged computer and let software developers create gadgets that do the job but are not "real" hardware? Does this mean that people will have less appreciation for the real work behind creating the actual device that makes it possible?

The advent of complete operating systems makes it possible for people to easily create, debug, and test programs without breaking a sweat. Certainly, lowering the bar has made it possible for more incompetent people to produce lower-quality products.

Working in the layer that enables other people to build on a solid foundation and create stuff more easily without having to reinvent the wheel every time is more difficult but rewarding, and it's easily seen by people in each field as the "true $field".

So it's not software vs hardware, it's about hard vs soft problems, core vs peripheral issues, enabling vs business logic.


I can't speak for xradionut obviously, but he specifically said RF, not "hardware", so your statements are not equivalent. RF is a different beast, and its peculiarities do make just about any RF problem more complex and more difficult than most software problems. Hell, they are more complex and more difficult than most other EE problems as well and the original statement could have been applied to digital design engineers without losing any meaning.


Sure, but my point is that it's not that $field_a is easier than $field_b; rather, because $field_a lets even inexperienced people hack something up quickly, there is a feeling that $field_b is harder and more worthy just because it's difficult to get anything done in it without working hard.

Replace $field_a and $field_b with (guitar, sitar), (piano, violin), (digital design, rf), (software, hardware), (python script, kernel programming), .....

Yet in every $field_a there are masters whose work is at least as exciting and praiseworthy (if not more so) than that of most who master $field_b, although it's true that $field_b requires more skill on average to enter.


I'm not belittling software, since that's my current field of employment. And there's a large number of tough software issues that require years of experience.

The point the OP in the linked article makes is that a few guys with a few years of modest programming experience can go create a company worth 8 to 9 figures with few resources other than time, luck, and connections.

He then states that this isn't possible in the semiconductor world due to the amount of experience required in that field. I've worked in the semiconductor industry and I agree 100%. I've also worked in RF and it's the same. There's no successful equivalent in those realms that compares to something like Twitter or most of the acquisition bait out there. The amount and complexity of the knowledge is insane. (Not to mention the manpower...)

And from a guy who's worked the whole stack, from component level to user applications: Python scripting is a couple of orders of magnitude easier than debugging firmware, dealing with DSP code/math, or testing semiconductors/antennas. It doesn't require a doctorate to build websites or develop CRUD applications.


Maybe I am looking at this differently, but I would say that by definition what you describe makes $field_b harder than $field_a. I don't think the difficulty of the field has any bearing on its worthiness, though.


Hardware is fun (when you get it working) and intellectually challenging. I'm even younger than the "FPGA guy"; I guess I fall into the Arduino camp he mentions (although I have long since moved on to bare microcontrollers).

I'd wager that it's quite easy to get into hardware; there is a ton of material on building easy and fun projects in no time. By combining practical exercises with "The Art of Electronics" or the freely available "Lessons in Electric Circuits", one can gain enough knowledge to experiment on one's own in a relatively short time.


Not helping is the lack of good educational material - unlike software, there aren't all that many books around dealing with hardware, and more importantly, there aren't any people teaching others how to do stuff (via writing, videos, etc.). The grizzled veterans do their thing without bothering to share the knowledge...


There are plenty of excellent books, but the entry barrier is somewhat higher. You simply need at least a basic understanding of calculus if you are to actually understand any of the processes that occur in semiconductors, and some basic knowledge of electromagnetism is required to grok issues like signal integrity.

Sauce: I'm an EE.


Yes, but then there is also a ton of best practices (what works and what doesn't, basic circuit design principles, etc.) which to my knowledge isn't really condensed into any one book, but rather spread across lots of trade articles and the like.

(This assumes chip design work (ASICs, etc.); for general hardware design (uC work) the same principle applies, but then you don't really need the calculus and physics knowledge.)

Sauce: I'm an EE too... :D


> Yes, but then there is also a ton of best practices (what works and what doesn't, basic circuit design principles, etc.) which to my knowledge isn't really condensed into any one book, but rather spread across lots of trade articles and the like.

That's also true about software. Once you get past the usual "comment your code carefully and make sure you're not redundant in it", 99% of the stuff that is useful in practice is stuff you derive from practice.


I'm learning how to test right now. You can read three books about testing, but you won't really get what they're saying until you actually TDD an app. Domain concepts are really malleable (read: hard to nail down) and deadlines are really hard; it would not surprise me if only 1-3% or so of the people who work with software seriously test their code. I count myself lucky to be able to throw enough work hours in to figure it out.

Software is full of thin ice: you can walk out on it, jump around, and then, mistakenly thinking it's frozen solid, drive your truck onto it, only to have it fall through. It looks easy. But there's a reason 90% of the Rails apps I've looked at had no tests.


Speaking of EE educational material, for the basics this MIT OCW course[0] is really good. There's also an MITx version[1] that's super interactive and even features Gerald Sussman of SICP fame.

[0] - http://ocw.mit.edu/courses/electrical-engineering-and-comput...

[1] - https://6002x.mitx.mit.edu/


Get a degree in EE?


Or physics. I find it's quite often the knowledge of the underlying principles that spurs my problem solving and creativity when working with electronics, rather than the (rightly) engineering-oriented outlook you get coming out of EE.

I've seen some physicists put together mind blowing electrical systems.


He's totally right that designing microprocessors is hard. However, you can design a reasonably decent microprocessor in a modern ASIC. You're only using perhaps 4% of the true power of the silicon, but it's great fun, it's iterative, and you can download working designs already.


As a "young" FPGA guy, I love his posts. Anyway any idea who the other group was? Cyrix?


Transmeta


A painful truth. [Deviating a bit: it is extremely difficult in the mechanical world. For the moment, smart sensors offer some hope of understanding mechanics better and seem to suggest the evolution of a new breed of machines which could be truly revolutionary.] I guess the mistake we all make is that we see hardware as chip design and other deep stuff. But look at the pioneers of hardware design and see what they are doing. Folks like Tony Fadell and Elon Musk are playing the Jobs role against the IBM-like incumbents of their respective categories (home sensors, automobiles). Very few hardware startups so far seem to get this point. [It's changing.] But you have to be very smart to be a good hardware engineer, I have to admit that. Life's unfair :)




