Does anyone know of a list of projects I should do to at least be able to pick up KiCad/Eagle, start laying out boards, turn them into Gerber files, and get them manufactured?
One suggestion I got was to design and implement one's own Atari Punk Console: http://www.makershed.com/product_p/mkjr2.htm
I also found this book: http://www.amazon.com/Build-Your-Printed-Circuit-Board/dp/00...
I tried modifying some open source designs, as well as designing a board from scratch, using Eagle and Kicad last year. I found the software so cumbersome and maddening, I gave up rather quickly.
I've stumbled upon some better tools since then. I hope one of these will get me from idea to gerber with less heartburn.
Those are both free. On the commercial side, there are a lot of options, but I'm tempted to just buy Diptrace, which seems to have a good array of features and reasonably-priced licenses.
For instance, recently I got some supercapacitors and had a few ideas for a simple circuit (consisting of resistors, diodes and capacitors). I did an EE unit when I was at school a long time ago but wanted to get a better understanding of how my circuit would behave, particularly voltage levels and discharge rates over time.
I was expecting there would be some software which would allow me to draw a simple circuit, specify input voltages and press 'play', allowing me to see how the circuit acted as a live simulation and see charts of voltage at particular nodes over time.
However, despite googling around, I couldn't find such software geared towards visualisation of simple analog circuits...
Happy to look at OSS or proprietary software; I was expecting to find something used in academia, but no such luck.
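For what it's worth, the specific behaviour described (a supercap discharging through a load) can be sketched numerically in a few lines of Python. The component values here are made up purely for illustration:

```python
import math

# Hypothetical values: a 1 F supercapacitor charged to 5 V,
# discharging through a 100-ohm load resistor.
C = 1.0      # capacitance, farads
R = 100.0    # load resistance, ohms
V0 = 5.0     # initial voltage, volts
tau = R * C  # time constant in seconds (100 s here)

# Capacitor voltage during discharge: V(t) = V0 * exp(-t / RC)
for t in range(0, 601, 100):
    v = V0 * math.exp(-t / tau)
    print(f"t = {t:3d} s  V = {v:.3f} V")
```

That gives you the voltage-vs-time chart numerically; a plotting library on top gets you the "press play and watch" experience.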
Designing a complete system from scratch has three components: the system design, verification of the design, and then bring-up. It's very frustrating there in the middle. If you want to practice, take a built system, and the data sheets for the parts, and write an operating system for it. A lot of chicken/egg problems. Things like "Ok, see if the HW bootloader is working by loading a loop which toggles this line which I can watch using an oscilloscope." Real, starting-from-nothing-into-something stuff.
Designing shields (Arduino) and capes (BeagleBone) gives you a "known" working system so that you can prototype various peripherals you want in the final system. This is what manufacturer evaluation boards are actually for. They have the ability to be configured into all of the supported modes so that you can try them out prior to building them. A number of systems just lift the schematic from the Evaluation board, delete the parts they don't want, add the parts they do, and voila system "design."
Staying below 20 MHz is a good idea until you get a handle on what happens to signals as they travel across a circuit board. Working with LVDS (low voltage differential signalling) lines only makes sense when you see what a 50 MHz line looks like 4" from the processor on an unterminated 3 mil trace. :-)
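There's a common back-of-the-envelope check for when a trace starts behaving like a transmission line. The numbers below are rough assumptions, not gospel: FR4 microstrip propagation delay of around 160 ps/inch, and the rule of thumb that reflections start to matter once the one-way trace delay exceeds about one sixth of the rise time:

```python
# Rough rule-of-thumb calculation; both constants are assumptions.
prop_delay_ps_per_in = 160.0  # assumed FR4 microstrip propagation delay
rise_time_ps = 1000.0         # a fairly leisurely 1 ns edge

critical_length_in = (rise_time_ps / 6.0) / prop_delay_ps_per_in
print(f"Critical length: {critical_length_in:.2f} in")
```

Even with a slow 1 ns edge the answer comes out around an inch, which is why that unterminated 4" trace looks so ugly on a scope.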
Also, if you make useful shields/capes and re-sell them, then you can earn some money to buy the tools to make the more complex systems. You will also discover that a 500 MHz oscilloscope that is 15 years old is nearly as expensive as a 500 MHz oscilloscope that is new, because if it works and is calibrated, then it works. Kind of like machine tools, they have intrinsic value.
Heh... forget making sense, you can't even tackle some LVDS systems until you start thinking about your "observer effect" when you try to measure signals, i.e. the impedance of your probes...
My first goal is a FPGA dev board, not altogether unlike the FPGA module on bunnie's laptop. The board I envision has a USB interface (for FPGA configuration and communication with host computer), a few peripherals and high speed IO header. I broke the project into three steps: a microcontroller board with USB interface not unlike the Arduino Leonardo (built and working), a board with a small (= cheap) FPGA to experiment with soldering BGAs (board just back from the fab), and the final design.
The ultimate chip I want to use is a BGA, which needs to be soldered with a process called reflow. (Basically, you bake the board in an oven.) So as a side project, I'm modifying a toaster oven into a reflow oven with the first microcontroller board as the controller. Lots of people have done this; you can find tutorials online. I started with an AVR chip for the USB, but I'm planning to switch to a ARM Cortex series M chip (STM32F4, only a few dollars more but way more capable) in later boards.
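The control loop for a toaster-oven reflow conversion can be surprisingly simple. Here's a bang-bang sketch of the idea; `read_temp()`, `set_heater()` and `tick()` are hypothetical stand-ins for whatever thermocouple, relay and timing interfaces the microcontroller exposes, and the profile numbers are only roughly shaped like a leaded reflow curve (follow the solder paste datasheet for real values):

```python
# Hypothetical reflow profile: (phase name, target deg C, duration s).
PROFILE = [
    ("preheat",  150, 90),
    ("soak",     180, 60),
    ("reflow",   225, 30),
    ("cooldown",  50, 120),
]

def run_profile(read_temp, set_heater, tick):
    """Walk the profile, switching the heater on whenever we're below target."""
    for phase, target, duration in PROFILE:
        for _ in range(duration):
            set_heater(read_temp() < target)  # heater on below target
            tick()                            # wait ~1 s per control step
```

Real builds usually graduate to PID control for tighter tracking, but bang-bang with a thermally sluggish oven gets many people through their first boards.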
(EDIT: fixed wrong link)
High Speed Digital Design: A Handbook of Black Magic by Johnson and Graham:
and RTL Hardware Design using VHDL by Chu:
(the latter for designing for FPGAs)
Fast forward to today: I'm working on my dream to create a laptop that doesn't stink, is good quality, very Linux friendly, and easy to use (by making an easy-to-use Linux distro). I also want to bring privacy and security by default by leveraging open source tools (including encrypted email and chat). There is a lot of room for growth, and the niche markets of developers, hackers and future hackers could one day extend into other markets.
Alas, my skills in PCB design extend only to motor controllers/drivers. I want to learn how to do high-frequency bus design, say to make ARM or i7 motherboard designs. If anyone here knows what kind of software is used for that sort of thing, I'd love to know.
- KiCAD: http://www.kicad-pcb.org/
- Eagle: http://www.cadsoftusa.com
Those are the most popular entry-level ones. You can make an ARM or i7 board with either. I think Bunnie is using Altium, which is much pricier.
- DipTrace: http://www.diptrace.com/ (probably best bang/buck in terms of capabilities/learning time)
- gEDA: http://www.geda-project.org/ (not a beginner tool)
- Fritzing: http://fritzing.org/download/ (super beginner tool)
Software-wise, Altium is one popular choice (I've used this and I like it; bunnie uses it, as does Dave of eevblog) and another is Cadence Orcad (I haven't used this myself, but the SABRE schematics come in this format). These are both expensive; Cadence has a range of options, from a standard version 1-year license costing $1,300 to an advanced version perpetual license costing $9,995. For Altium, 1 year is $1,750 and a perpetual license is $5,495.
Piracy protection used to be weak, and I'm not up-to-date on how easy it is to bypass. Like Photoshop, people used to say the companies turned a blind eye to non-professional users who were learning. This may have changed recently, I don't know.
Anyway, on to my plan to learn this stuff. Bunnie's planned laptop uses a Freescale i.MX 6 processor. It seems pretty feature-packed as a processor. Among other things, Freescale seem to offer some good design documentation; they have evaluation boards where they give out not just PDF schematics and Gerbers but also the CAD software files.
They also have extensive design documentation: public datasheets, a Hardware Development Guide, a Processor Reference Manual and so on. I've also discovered that getting prototypes assembled in China is cheaper than I expected, a few hundred rather than a few thousand dollars.
My provisional plan is to read all that Freescale documentation, take one of the Freescale board reference designs, produce a cut-down version mostly by deleting components I'm not fussed about, then get that made and see if it works.
That's my plan. Whether it will work, or be possible for the amount of money I have, remains to be seen :) And needless to say, if an expert comes along to reply to you, maybe they'd have feedback for me too :)
Altium is one popular choice; another is Cadence Orcad.
I will say that I like working with Freescale's reference material better than, say, TI's OMAP and AM series. Of course I work for a larger company that has easy access to their FAEs, but at least they have an online forum where you can get answers...sometimes.
You've sniffed out one big problem with the Cadence Orcad layout. Freescale recommends you just cut-and-paste the DDR3 lines directly from their layout. Partially because the timing is so hard to get right and partially because they don't have the bandwidth to help you debug your board layout. So that's probably the biggest hurdle in layout. TI went a different way with package-on-package RAM mounting, but that's another set of headaches when it's time to build boards.
Another nice part is that you can get working boards today if you want them. I'm a big fan of Boundary Devices (http://www.boundarydevices.com), you can get a SABRE-based board for $99. Pick up the kernel patches from FSL and you can have your own system building and running in an afternoon.
The i.MX6 is also a nice evolution of their SoC family, especially now that you can opt out of their power-management IC. But that means a bit more schematic and layout work if you choose to go that route. I'd leave the PMIC in, you also get some nice additional features if you use it.
Marvell . . . hoo boy. Let's just say that everyone telling us "If you go with Marvell, you're in for a ride" was correct. Just enough detail to get you into trouble. Cheap, though. . . . :-)
The Marvell platform was just a trainwreck. Even simple questions were met with blank stares.
Us: "We want to know the delay period between freezing the timer and reading it. The documentation says 'three clocks', but we have strong evidence this is not accurate."
Marvell: "We are happy to help with your question. You need to wait three clock cycles."
Us: "That doesn't work. (insert mountain of evidence here)."
Marvell: "Let us get back to you."
Us: "Remember that question about the clock register?"
Marvell: "What question?"
(repeat above, several times)
Much later, we get a phone session together.
Obviously new Marvell chip designer: "It says three clock cycles here."
Us: (lots more evidence that that is not the case)
Slightly terrified (but still new) Marvell chip designer: "Let us get back to you."
(much, much later ...)
Marvell Chip designer, who clearly has his act together, but designed the circuit four years ago and can't remember, so he digs into the design files: "Oh yeah. Three clocks, but they're internal, so you can't see them. Use this clock over /here/ and write this, then read that, and you should be good."
Repeat for each major chip feature, and don't get me started on DRAM controllers or USB PHY registers.
(I've become more than a little discouraged from my few submissions that have sunk through the front page like a stone -- so I'll just leave this comment here :-)
edit: Be sure to also read the linked "interview that focuses on one of my recent failures.":
This has come-up on HN a few times. As an EE with plenty of experience designing all manner of hardware I'll tell you that, while not impossible, there's quite a road to travel from having no formal training in electronics to designing and laying out reliable working boards. Not to say it can't be done. Anyone with enough drive and motivation these days can learn almost anything with the myriad of resources out there. Possible? Yes. Plausible? Only for the really driven.
There's a demarcation line somewhere in the 25 to 50 MHz range beyond which the design of digital electronics isn't very tolerant of amateur design practices. You really start to need a solid understanding of the underlying science. Unlike software development, it is frustrating and expensive to blindly poke around mid to high speed design in hopes of finding a solution.
Digital circuits in many ways become very analog as frequency increases and clock edges get sharper. They can and do radiate RF, and they can and do bounce around the copper traces, mucking things up if either the electrical design or the trace layout and board stack-up are off.
Something as seemingly "simple" (in quotes because it isn't simple) as implementing a DDR2 or DDR3 memory interface requires quite a range of information in your mental design database.
For example, you need to start to consider the time it takes for signals to leave the FPGA (travelling at nearly the speed of light) and arrive at the chip. These signals must arrive within a very precise window of acceptance. A window which is measured in nanoseconds. You need to model the signal path from the die through the interconnect, BGA ball, land and via stack, trace and then back up the chain into the memory chip. And you need to do this for every single connection.
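The flight-time arithmetic itself is straightforward; it's doing it for every net, through every via and ball, that hurts. A back-of-the-envelope version (all values illustrative):

```python
import math

# On FR4 (relative permittivity around 4.3), a signal travels at c / sqrt(er).
c = 2.998e8            # speed of light in vacuum, m/s
er = 4.3               # assumed effective dielectric constant
v = c / math.sqrt(er)  # propagation velocity on the board, m/s

trace_mm = 50.0        # a 50 mm memory trace
delay_ps = trace_mm * 1e-3 / v * 1e12
print(f"Flight time over {trace_mm:.0f} mm: {delay_ps:.0f} ps")

# At DDR3-1600, one bit period is only 625 ps, so a few millimetres of
# length mismatch between lines already eats a visible slice of the window.
mismatch_mm = 5.0
skew_ps = mismatch_mm * 1e-3 / v * 1e12
print(f"Skew from a {mismatch_mm:.0f} mm mismatch: {skew_ps:.0f} ps")
```

That's why memory layouts serpentine every trace to the same length: the router is equalizing delays measured in tens of picoseconds.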
While you do the above you also need to look after the integrity of the signals themselves. In fact there's a discipline called "Signal Integrity" (SI) that deals with this. Most large companies have dedicated Signal Integrity engineers. The field is somewhat complex and requires a good deal of experience and study, particularly for mass production. Anyhow, at higher speeds everything is a transmission line. Failing to design with this in mind pretty much guarantees that a circuit that looks absolutely perfect on paper will not work.
SI has a close cousin called "Power Integrity". As operating frequencies increase and signal edges become sharper, the power distribution system (PDS) becomes very critical. Once again, not understanding the subject can result in a board that looks great but does not work, or worse, is hopelessly unreliable. PDS design issues can cause resonance and oscillations on the power planes, increase clock jitter, destroy signal integrity and timing, and more. This too is a subject that is usually the domain of specialists in larger organizations.
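A common first-cut rule of thumb in PDS work is the "target impedance": the power network should look, to the chip, like an impedance below allowed-ripple divided by worst-case current step, across the frequency range of interest. All numbers below are illustrative:

```python
# Target impedance rule of thumb: Z_target = allowed ripple / current step.
vdd = 1.0          # core rail voltage, volts
ripple_pct = 0.05  # 5% allowed ripple (assumed budget)
di = 2.0           # assumed worst-case transient current step, amps

z_target_ohm = (vdd * ripple_pct) / di
print(f"Target impedance: {z_target_ohm * 1000:.0f} mOhm")
```

Keeping a plane-plus-decoupling network below a few tens of milliohms out to hundreds of MHz is what drives all those capacitor farms around big BGAs.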
SI and PDS design alone can mean that you can have a schematic that looks absolutely perfect and a circuit board that does not work, is unreliable or has a "personality" such as weird things happening, random resets and other hair-pulling afflictions. SI and PDS design knowledge is of paramount importance as design operating frequencies rise.
That's not the end of it. Digital design with FPGA's also needs to consider such topics as the number of average simultaneous outputs switching per side/bank on the device. This has both SI and PDS implications.
As you approach the GHz range things get even more interesting. Examples of this might be doing HDMI or DVI transmitter/receiver designs or Thunderbolt/PCIe interconnects. Now you are solidly in RF territory, a land with its own set of voodoo to learn.
As I said above, I draw this imaginary demarcation line somewhere in the 25 to 50 MHz range. Below this line one can hack around and build stuff that works well or reasonably well most of the time. I'll generalize and say that's the Arduino and below territory, the hobby embedded territory. Nothing too critical. Timing windows are comfortable. SI issues are relatively minimal. SSO is mostly not of concern. Transmission line or impedance controlled design is rarely needed, etc.
The minute you start talking about something like a laptop board or high performance FPGA based, well, anything, you enter a domain that is either reserved for very experienced EE's or requires a ton of time and dedication to master as a non-EE. This is also a domain where hobbyist tools tend to fail miserably. I use Altium Designer for electronic design along with FEA tools for SI and PDS designs on high speed boards. I also use a number of custom software tools created over time in order to simplify and facilitate the process. Most companies or designers develop guidelines, rules, tools and procedures over the years in order to improve design yields. Like I said, a perfect schematic does not directly equate to a working board.
I haven't even covered topics such as thermal design and regulatory testing. That's a whole other layer of knowledge that is critical to have depending on what you are doing.
Making hardware is very different from writing software in that the iterations per unit time are both slower and very costly. You can sit in front of your computer and hack for hours and hours, fail, edit, compile, learn and it costs nothing. Not so with hardware. You really can't learn that way. Even if you had money to burn, at one point you really need to understand the science behind it all in order to not burn cubic time learning by trial and error.
Even freshly graduated EE's are not necessarily qualified to successfully approach high-speed design. It takes years of learning and dedication beyond school to really master the art. This is not a matter of software. Even if I gave you a license to $100K worth of EE design software (yes, it can get that expensive) and fully trained you on its use, you would not be prepared to design high speed boards with it.
Again, none of this is to say this is an impossible domain for non-EE's to approach. Just pointing out that it is an iceberg. What you see is what's visible above the surface.
Where to start? College Physics and Calculus. Basic Electronics. Transistors. AC, DC and RF Circuit theory. Buy kits and breadboard-based trainers and build circuits. Make lots of boards. Learn. Look at the set of courses required for an EE degree and learn as much of that as you can (and in the order they are presented). Reading books about signal integrity right now without the required foundation isn't a good idea. It's like trying to read a Chinese book on Chinese literature without having learned the Chinese language first.
I started laying out a "small" Zynq board, and running through the DDR3 timing calculations for each signal line was insanely complex. I was like "seriously?" It explained board layouts where the memory traces are all made the exact same length even if that wastes board space.
That said, if you're interested in pursuing this, be prepared to spend some time taking online training and building a number of exemplar circuits to test out your understanding. Start at 10 MHz and work your way forward to 200 MHz.
Also, get some PCB quotes before you try to jump feet-first into learning high-speed design via rapid prototyping. The cheapest boards I've ever ordered were 2 for $80; the last board I made with even the slightest eye to density/PDS/SI I think was 4 for $500. That was a dinky 2"x2" or 3"x3" 4-layer board too, none of the crazy 24 or 32 layers that laptops have.
What I really like, though, is the trend that the Raspberry Pi is part of (and other boards/SoCs before it). It would be awesome to get some reasonably cheap, completely open designs: both boards with I/O and RAM, and something like a few tested FPGAs running something like the LEON CPU.
I'm not sure how feasible it is -- and I'm sure it would be great for learning and experimentation. Not everything needs to be high speed: the Amiga could do a lot of cozy graphics and audio, and a snappy GUI, all with a 7.5 MHz CPU. There should be some room to play in the now "ancient"/"slow" space...?
Basically, I'd love to see some mini/nano-ITX-like boards with a passively cooled FPGA, BIOS/init via coreboot, running various open OSs. The Raspberry Pi ticks a lot of these boxes, of course, but it still has a few proprietary parts in there (then again, it is dirt cheap too).
The main thing I see missing, is something that is accessible from the ground up (maybe something even slower, along the lines of the C64/VIC20 might be even better).
As a programmer who's had to go through a lot of the learning processes you describe, I can't agree strongly enough with your informative cautions on the challenges in PCB design. Some here might be interested in this document, which is merely an indicator of some of the challenges you mention.
What tests could you design to verify a hacker's proficiency and acknowledgement of the product's quirkiness that are practical to verify but can't easily be cheated?
Perhaps when you initiate an inquiry you could be given a unique function that makes a set of randomly generated, non-destructive transformations to a string. Your task is to submit a string that will be transformed by that function into your email address, or failing that, into the output string "I will not ask Bunnie for help, I will fix problems myself and share my fixes".
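A sketch of that idea in Python; everything here (function names, the transform set) is made up for illustration. The server composes a few random but reversible string transforms and publishes the composed function; the applicant has to work out the preimage of their email address:

```python
import random

# Each entry is a (forward, inverse) pair of non-destructive transforms.
TRANSFORMS = [
    (lambda s: s[::-1],        lambda s: s[::-1]),          # reverse
    (lambda s: s.swapcase(),   lambda s: s.swapcase()),     # flip letter case
    (lambda s: s[1:] + s[:1],  lambda s: s[-1:] + s[:-1]),  # rotate left by one
]

def make_challenge(seed, n=5):
    rng = random.Random(seed)
    steps = [rng.choice(TRANSFORMS) for _ in range(n)]

    def forward(s):
        """The published challenge function."""
        for f, _ in steps:
            s = f(s)
        return s

    def solve(target):
        """Apply the inverses in reverse order (what the applicant must figure out)."""
        for _, inv in reversed(steps):
            target = inv(target)
        return target

    return forward, solve
```

The applicant who understands the published transforms submits `solve("you@example.com")`, and the server checks that `forward` of it yields the address (or the pledge string).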
does it really matter? the goal is just to discourage mass market consumers. if somebody wants one so badly they're willing to cheat on a test to get one, that isn't really mass market.
and isn't cheating sort of the spirit of hacking that projects like this want to encourage? if you want something, you make it happen without worrying about the "right way" to do it. nothing wrong with a bit of cheating - this is reality, not school.
What I have seen called cheating by hackers are things like exceedingly clever shortcuts. Cheating by using someone else's answer in a qualification test is not in any sort of spirit of hacking that I'm familiar with.
And probably, it will be priced in accordance with what you'd expect to pay for a bespoke digital oscilloscope...
Heck, there are oscilloscope probes that cost $5000.
But test equipment isn't exactly high volume stuff.
- Webserver/standalone operation.
- Integration with other standard comm buses.
- Option for better A/D (a typical o'scope is 8 bits).
- More/better output options. You could potentially use it to simulate some hardware (in the loop).
- Ability to add COTS hardware.
- Open design insures the owner against planned obsolescence.
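On the A/D point above: bit depth translates directly into dynamic range. The standard result is that an ideal N-bit converter's signal-to-noise ratio (quantization noise only) is roughly 6.02*N + 1.76 dB:

```python
# Ideal SNR of an N-bit ADC, quantization noise only: 6.02*N + 1.76 dB.
for bits in (8, 12, 16):
    snr_db = 6.02 * bits + 1.76
    print(f"{bits:2d}-bit ADC: ~{snr_db:.1f} dB")
```

So jumping from a scope's usual 8 bits to 12 buys roughly 24 dB of extra dynamic range before real-world noise and front-end limits eat into it.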
Depth of the sample buffer. Tradeoffs between sample speed and sample resolution. Triggering. Math (anything from sums/differences to FFTs). Bandwidth, including tricks like equivalent-time sampling. Various kinds of display. Various kinds of human interface. Various kinds of machine interface. Extra features like waveform generators.
Honestly you can figure this out yourself: go to some scope makers' websites (Tektronix, Agilent, B&K, Fluke, Instek, Teledyne Lecroy, etc etc) and see how their marketing department distinguishes their product lines.
The processor, for example, has a 5,000 page technical reference manual. That's just describing how to use the chip. It doesn't go into detail about how each part works internally. But to the layman it's just a 2cm^2 chip so how complicated can it be?
Everyone wants the sausage but nobody wants to learn (or even know) what goes into it.
This is something he is right to be concerned about. No matter how many warnings you slap on something that it is currently only suitable for advanced users, you will always get a bunch of people who want some shiny new thing and buy it and then bitch and moan that it doesn't have Apple-style end-user polish.
It is all being developed in the open (in all senses of the word), so if some other entity who does want to get into the mass manufacturing and support side thinks doing so is worth it, that can always happen.
In the long run this benefits everyone, of course, but it's only the techies who can actually use the source, and that's why it's mostly the techies who care about it being open.