Incidentally, the guy creating this is Bunnie Huang -- he designed the Chumby, made his name with Xbox hardware hacking, and did an awesome (open-hardware) MITM attack on HDCP at CCC [1,2].
Sean "xobs" Cross is collaborating on this as well. He's more low-key than Bunnie when it comes to publicity but still worth following if you're into the "Internet of Things" (I kind of hate this phrase, but it gets the idea across effectively enough).
Sean was also at chumby industries, by far the greatest collection of smart people I've ever worked with (sadly, on a product idea that ultimately failed).
The chumby was sort of a precursor to the tablet life most of us live now. It's sad that it failed, but the concept of chumby no doubt influenced a lot of really handy app design.
bunnie's history with open hardware design at Chumby leaves me confident that this will be a really fun system to work with. I've hacked around on various Chumby devices (I've got an Infocast 8" running Debian, for instance), and they've consistently been fun, reasonable systems to work with.
I was wondering this - I've been following this blog but have been lazy about reading up on his background. I found his posts on the manufacturing/electronics culture/subculture around Shenzhen fascinating...
not to mention the regular posts on odd circuit boards....
I'd be cautious about the LiPo battery choice and the potential fire risk.
They're absolutely fine when treated with care (I've had many for years for RC planes and helis and never had more than minor issues), but I'd be concerned that people are used to not having to care about their batteries.
If you discharge them below ~1.1 V per cell (higher for cheaper ones), they don't quite explode, but it's not a slow burn either: http://www.youtube.com/watch?v=hcwOwf55Rtc
There are other options like Li-ion, or A123 cells: slightly lower voltage per cell, but you can pretty much abuse them and they won't blow up.
EDIT: for clarity, discharging to 1 V alone shouldn't cause a fire; it's the act of charging them from that state that does.
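Since the danger window is that specific (a deep discharge followed by a charge attempt), here's a minimal sketch of the cutoff logic a pack-protection circuit implements. The Python is purely illustrative, and the thresholds below are the commonly cited conservative ones rather than the ~1.1 V figure above -- check your cells' datasheet.

    # Illustrative per-cell thresholds; consult the cell datasheet.
    DISCHARGE_CUTOFF_V = 3.0   # stop discharging below this
    CHARGE_REFUSAL_V = 2.5     # never charge a cell that has dipped below this

    def may_discharge(cell_voltages):
        """Keep discharging only while every cell is above the cutoff."""
        return all(v > DISCHARGE_CUTOFF_V for v in cell_voltages)

    def may_charge(cell_voltages):
        """Refuse to charge a pack with any deeply over-discharged cell;
        as noted above, charging from that state is what starts fires."""
        return all(v > CHARGE_REFUSAL_V for v in cell_voltages)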
Nah, just use lithium iron phosphate (LiFePO4) batteries. They're not quite as energy dense as LiPo, but they have longer lifetimes, better power density, and are WAY safer.
You'll see a lot of roboticists using these already (for safety reasons). For example, you can grab some nice Turnigy LiFePO4's at most hobby shops.
One suggestion: it could be made cheaper and maybe more interesting by removing the screen altogether (there's an HDMI port!), making it a C64-like computer with a tiny smartphone-sized internal screen.
We all have multiple screens already - such as tablets or smartphones we carry.
Presumably the board will also be offered on its own, without anything connected to the LVDS interface. One of Bunnie's previous projects, the Chumby, is available this way.
SparkFun is awesome, one of my favorite places to buy hardware-hackery stuff along with Adafruit and Newark, but if you want to hack on this board for cheaper you can buy Insignia Infocast 3.5 units off eBay for around $40, half the price. The board inside is virtually the same as the Chumby Hacker Board (different arrangements of some of the headers and ports, IIRC) and it comes with a wifi module, 3.5 inch LCD w/resistive touchscreen, speakers, etc.
At $80 for the Chumby Hacker Board, most people are better off just buying a Raspberry Pi: its CPU and memory crush the Falconwing board, and there's a much larger active community hacking on it, so you get easy access to newer Linux kernel versions, distros, etc. But at $40, including the LCD/touchscreen, wifi, speakers, etc., the [Infocast|Chumby One] is still a great deal just for hacking on. It's still slower than the Pi, but the LCD touchscreen/accelerometer/speakers/bend-sensor buttons/wifi/etc. give you a lot to hack on out of the box that you'd have to pay extra for on the Pi.
Imagine a battery-powered C64-like device (i.e., "all in one", the motherboard just below the keyboard) with a tiny LCD screen (no bigger than your smartphone's) and a full-sized keyboard.
Now imagine doing RDP or VNC to "display" the screen on your tablet or your phone - there are browser-based VNC clients that are easy to use, so you wouldn't even have to install a VNC client.
Unless you want to play back movies, IMHO there is no need for an HDMI output.
No big LCD = longer runtime (or throw in an e-ink display - frame rate is not that important).
I would be willing to pay a premium if it also included a breadboard connected to the various internal ports. That would be a real hacky laptop, a workbench for geeks.
It's great to see that dedicated hackers can get their hands on the hardware and datasheets needed to design and manufacture these kinds of high-performance projects. There still remains one huge barrier before we'll see a large number of projects like this: the cost of the equipment used to validate signal integrity on high-speed digital systems like this is immense. I'm hoping that soon enough Moore's law will work its magic on the prices of signal-validation equipment, and at least a few hackerspaces will be able to gather enough cash to pony up for a 10 GHz scope and a nice logic analyser.
Actually it's been easily possible for years. Just forget about buying new equipment and buy older Tektronix or HP sampling scopes on eBay. For example, the HP 54121T, a 20 GHz scope with TDR, can characterize trace impedance perfectly workably for modern motherboard applications.
I got a perfectly functioning one on eBay for $600.
Also you can get the service manuals, which you can't for more recent equipment.
You don't need that kind of gear to build a board like this. You just need a good set of EDA tools and to read all the datasheets carefully. Stuff like trace impedance and the performance of high-speed lines should be done in simulation tools before you ever send out the board to be fabbed. It's only in cases where people are really pushing the limits (networking gear, RF gear, for example) that high-speed physical measurement hardware is necessary.
A board like that, with a Spartan6 and ARM processor, is actually fairly easy to make work "out of the box" with minimal attention to pre-build simulation, as long as you carefully follow the datasheets' recommendations for signal routing, bypass, etc.
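To give a flavor of what those pre-fab checks look like, here's a quick controlled-impedance estimate using the well-known IPC-2141 approximation for surface microstrip. The stackup numbers are just an example, not anything from Bunnie's design:

    import math

    def microstrip_z0(h_mm, w_mm, t_mm, er):
        """IPC-2141 approximation for surface microstrip impedance, in ohms.
        Reasonable for 0.1 < w/h < 2.0 and 1 < er < 15."""
        return 87.0 / math.sqrt(er + 1.41) * math.log(
            5.98 * h_mm / (0.8 * w_mm + t_mm))

    # Example: a 0.35 mm trace in 1 oz copper over 0.2 mm of FR-4 (er ~ 4.5).
    print(microstrip_z0(h_mm=0.2, w_mm=0.35, t_mm=0.035, er=4.5))  # ~48 ohms

If the EDA tool and a hand calculation like this disagree wildly, that's the cue to re-check the stackup long before anyone reaches for scope money.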
Bunnie is taking advantage of the massive scale of these SoC systems like the i.MX. There really isn't a lot of exotic signal line design needed to make these chips run. From my own knowledge of board design with this Freescale part, the only tricky layout would be the DDR3 RAM lines. The rest is wiring up I/O to the necessary connectors and discretes.
TI even made it easier by mounting their DRAM in package-on-package ball grid. You can find the TI OMAP4 Pandaboard schematics online and see how simple their system design is.
This is an absolutely awesome project; mad props to the guy for envisioning it and pulling it off. The cost must work out to something terrible, but I can totally see why someone would do this.
Having an FPGA on there is a very clever idea; it adds flexibility to the design the way a prototyping area would, but much more cleanly. The one corner with all the PWM and other connectors is the most interesting part. I really wonder what kind of plans he has for it, but it goes way beyond 'just another laptop'.
If someone did a 'Bunnie Huang uses this' post I'd be all over it, the tool collection to create a design like this would be extremely interesting reading.
Interesting: he mentions that he'd love to have a triple-screen-capable laptop, but when he has the chance to make his own, it has only two possible monitor attachments. Or did I miss something? Anyway, with USB->HDMI adapters this may no longer be as much of an issue.
I wonder what computing would look like if every motherboard had an FPGA on it (especially one that could be rapidly re-programmed).
The only ideas that immediately come to mind would be games offloading some of their logic to a custom FPGA-based co-processor, or being able to have a hardware decoder of all the latest audio/video codecs. But maybe with the speed of current hardware, all of that might be moot anyway.
At the worst, there'd probably be a lot more people out there that can speak Verilog.
More importantly, people will be using and modifying these, both in software and hardware, for the next ~30 years. During that time, how many other devices will get left behind because we can't extend and modify their software to meet our new use cases?
I would assume that building your own lightsaber, given its form factor, would involve a significant amount of laptop-like small, unfriendly electronics (not to mention fiddling with force fields and other advanced physics) :)
You would think, but based on the literature they don't actually seem to be that finicky. Just stick some kind of crystal in some kind of case, with a couple other off-the-shelf components, and that's it.
Nothing at all. But I can't think of any other community where we'd get into a discussion of the build process of a lightsaber, can you? I love participating in such a geeky crowd, even though I was never a big Star Wars guy myself.
Don't know whether this fits, but here's something I'd like from the power system: automated battery maintenance. I'm plugged in a lot, and I forget to (or blow off) periodically cycling my battery. (Admittedly, maybe I'm unusual, an edge case, in this, and my thoughts here are of little general value.)
I'd like a power system that can be made to periodically and contextually choose to run from the battery, even while plugged in, so that the battery can be cycled in a manner that maintains its capacity.
Then, when I do need to run from the battery every few weeks, it's still in decent shape -- without my having to manually ensure this on an ongoing basis.
I doubt it's a priority for Bunnie, but what the heck, I'll throw the idea out there. When else do I have even a chance of having any input into a laptop's design?
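To make the idea concrete, here's a hypothetical policy sketch in Python -- the power-system interface (on_ac, charge_percent, run_from_battery, run_from_ac) is invented for illustration, not anything in Bunnie's design:

    import time

    CYCLE_INTERVAL_S = 21 * 24 * 3600   # exercise the pack every ~3 weeks
    DISCHARGE_FLOOR_PCT = 40            # end the maintenance discharge here

    def maintenance_loop(power):
        """Periodically run from battery even on AC, then recharge."""
        last_cycle = time.monotonic()
        while True:
            overdue = time.monotonic() - last_cycle > CYCLE_INTERVAL_S
            if overdue and power.on_ac():
                power.run_from_battery()          # ignore the charger
                while power.charge_percent() > DISCHARGE_FLOOR_PCT:
                    time.sleep(60)
                power.run_from_ac()               # resume normal charging
                last_cycle = time.monotonic()
            time.sleep(300)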
Oh, and thank goodness for an(other) open alternative to "secure boot" (maybe nice in principle, but very potentially malicious in current execution).
The best thing for most lithium batteries is to be kept cool and neither fully charged nor fully discharged. Many phones already try to keep the battery at about 90% or so for this reason; it might be nice if that were configurable on this laptop (see the sketch below).
Keeping a laptop on and plugged in all the time, especially if it runs hot, is probably the fastest way to decrease its capacity, short of extremely aggressive charge/discharge cycles (normal use is probably better, since the average state of charge is then lower). I completely killed a battery in a few months when I made a habit of leaving my laptop turned on on my bed during most of the day. Now I either remove the battery or keep the laptop asleep when it's not being used, if it's going to stay plugged in for extended periods.
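On some Linux laptops the threshold mentioned above is already exposed through sysfs; a minimal sketch, assuming the kernel provides charge_control_end_threshold for your battery (the path varies by machine, and not every battery exposes it):

    from pathlib import Path

    # Assumed path; varies by machine and firmware.
    THRESHOLD = Path("/sys/class/power_supply/BAT0/charge_control_end_threshold")

    def set_charge_limit(percent):
        """Ask the firmware to stop charging at `percent` (needs root)."""
        if not 50 <= percent <= 100:
            raise ValueError("pick a sane threshold")
        THRESHOLD.write_text("%d\n" % percent)

    set_charge_limit(90)  # hold the battery near 90% while on AC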
Only, what you're suggesting isn't actually battery maintenance. In fact, it will decrease the capacity of your battery (by as much as a normal drain-recharge cycle, of course).
Cycling the battery will, however, appear to improve the capacity by recalibrating the internal circuitry. That's still somewhat useful to a user, but much less so than real battery maintenance would be.
PS: From what I understand this only applies to modern Li-Ion batteries, if you're using different battery technology your mileage may vary. Some battery technologies do benefit from being periodically cycled.
Maybe I'm not up to speed on the latest developments, but I currently have two laptops whose batteries have essentially been reduced to short-duration UPSes.
I should be more pro-active, but I forget, or I worry that I'll discharge right before I need to be charged up. And one sits on a spare table, doing its thing all by itself, much of the time.
They tend to stay plugged in and on overnight. It would be nice if, e.g., once a week or so -- whatever's best for the battery at hand -- the power system could run a discharge/recharge cycle "in the middle of the night" or similar.
In my experience, batteries still have memories (that shrink over time and with lack of cycling), and from what I read periodic cycling is the solution.
If Bunnie can find an acceptable fuel cell option, of course I'm all for that. ;-)
P.S. I think I do at least partway understand what you're saying: rechargeable batteries have a limited number of cycles and tend to lose capacity as the cycle count increases.
However, in my case, it appears to be a lack of cycling that is catching up with me well before the above does.
That's very weird: my laptops stay plugged in and on 24/7 for months at a time, and the batteries still last around 2.5 hours. These are four-year-old Li-ion batteries I'm talking about (they used to last 3 hours when they were new).
Same experience here. I kept my laptop plugged in almost every day for two years, only taking it out maybe once a month. The battery was as good as new until I left it completely discharged for two weeks, after which its two-hour capacity dropped overnight to more like two minutes.
Hmm. Then maybe I'm ascribing my problem to the wrong behavior. I wonder whether I inadvertently left my problem children discharged for too long, at some point.
I've gotten reliable service out of my mobile phones, but laptop Li-ion batteries continue to feel like black magic to me (for lack of a more consistent and rational/analytical approach on my part).
The ultimate hacker's laptop would be one without a wide-screen! Bring back the 4:3 displays we had in the old days, please. They were good for reading and editing text/code.
You can't buy any decent laptop without a wide-screen, so you will have no competition in this area.
Yes. For starters, you can put in a T61 motherboard (14" 4:3 model only, and avoid Nvidia GPUs - they're all defective) and 8GB of RAM. Minor alterations to the frame are required.
I've been looking for a laptop-appropriate ARM board that takes DIMMs for quite some time, as all the SOCs are very tight on RAM. Designing a motherboard is a bit beyond what I know how to do, but I really want to build a laptop with a custom composite shell, an ARM CPU, a high-res 4:3 screen and lots of battery.
Moore's law is not decelerating - transistor density is doubling on schedule, with process shrinks happening like clockwork. Smartphone performance has actually been doubling yearly - better than Moore's Law.
If there are fewer transistors, it's because we haven't worked out how to use them effectively, or because customers aren't demanding them. And if they're not in demand, other ways of improving performance won't be in demand either.
It's this pattern of improvements overshooting demand that Clayton Christensen wrote about.
People think Moore's Law is slowing at Intel, but Intel has just been committing more and more die area in their chips to onboard graphics. I wonder how Ivy Bridge-E will perform with 450 mm wafers, though Sandy Bridge-E was kind of a letdown (I'd imagine that's more an architecture shortfall, with a quarter of the cores shut off).
But yeah, consumer demand just mandates something that can run a web browser as fast as I/O bandwidth allows. You don't even really need CPU video decoding, since you can offload that to hardware acceleration.
People should realize we are at another threshold with Moore's Law: in a few years we should have transistor density high enough (I'd imagine during the 14 nm era) to embed a full compute environment -- say a dual-core Cortex-A15 with a gig of RAM and the necessary chipset fixings for an integrated 802.11ac NIC -- in something the size of a penny, maybe off one fab line, maybe even on one die. Full computers with 500 Mbit of wireless bandwidth, able to stream video, printers, data, etc. in parallel. It would probably run on a watt or two, too. You could probably run that off solar.
Think about it: you could have traffic cameras doing realtime video-analysis processing on solar power. Solar-powered wireless routers you can just glue on top of lamp posts, run by the sun. You could have a full computer that you wear, powered by the kinetic energy of your motion or maybe some novel heat-leech tech that uses ambient body heat. And you think phones were a big deal!
I wouldn't be surprised if we could do something similar with a single-core Cortex-A9 (maybe around 600 MHz) with 128 MB of RAM and 802.11b (maybe even g or n) right now, in a similar power package (maybe 4-5 watts).
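Back-of-envelope, the solar claim is tight but not crazy. A quick sketch with assumed round numbers (every constant here is illustrative, and a real deployment would also need a battery to ride through the night):

    # All constants are assumed round figures -- just a sanity check.
    PANEL_CM2 = 100            # a 10 cm x 10 cm panel glued to the lamp post
    PEAK_SUN_W_PER_CM2 = 0.1   # ~1000 W/m^2 of peak insolation
    EFFICIENCY = 0.15          # ordinary polycrystalline cell
    DAILY_DUTY = 0.2           # average output over 24 h vs. peak sun

    peak_w = PANEL_CM2 * PEAK_SUN_W_PER_CM2 * EFFICIENCY   # 1.5 W at noon
    avg_w = peak_w * DAILY_DUTY                            # ~0.3 W averaged
    print("peak %.2f W, average %.2f W" % (peak_w, avg_w))

So a watt-or-two load wants a panel several times this size, or aggressive duty-cycling -- doable on a lamp post, harder on a penny.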
"Solar-powered wireless routers you can just glue on top of lamp posts run by the sun."
Stross & Doctorow, "Unwirer" (2003):
She handed him the Motorola batarang he'd glimpsed earlier. The underside had a waxed-paper peel-off strip and when he lifted a corner, his thumb stuck so hard to the tackiness beneath that he lost the top layer of skin when he pulled it loose. He turned it over in his hands.

"How's it powered?"

"Dirt-cheap photovoltaics charging a polymer cell -- they're printed in layers, the entire case is a slab of battery plus solar cell. It doesn't draw too many amps, only sucks juice when it's transmitting. Put one in a subway car and you've got an instant ad-hoc network that everyone in the car can use. Put one in the next car and they'll mesh."
This is incredible. In a time when it seems the big PC designers are shifting their appeal to the Facebook-heavy consumer market, developers need a new source. I'd love to see this snowball into open-source phones, tablets, glasses, etc., and an open, ubiquitous wireless network.
Perhaps RMS will either contract Bunnie to create a replica for him, or he'll use Bunnie's open schematics to create his own. Then he can upgrade from his Yeeloong laptop or at least have an alternative with a bigger screen.
From what I recall, one of RMS's concerns was open firmware, which Bunnie has as one of the project goals.
The integration of an FPGA on the motherboard is one of the most interesting features differentiating it from regular laptops. The author points out that it can be used "for your bitcoin mining needs".
I couldn't find exactly which model of Spartan-6 it is. However, if it is an LX150 (the one with the most logic units, used by the entire Bitcoin mining community), there is no way he can fit a decent heatsink for proper thermal dissipation. A typical mining implementation generates so much heat that heatsinks this big need to be used (actual picture of one of the first dual-LX150 boards built specifically for mining):
http://dl.dropbox.com/u/13472215/forum/angle_a_heatsinks_600...
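For context on why the heat is unavoidable: the mining workload is just double SHA-256 over an 80-byte block header, and an LX150 bitstream replicates that inner loop many times in parallel. A toy CPU version of the loop, purely for illustration (hashlib and struct are standard Python; the function itself is made up):

    import hashlib
    import struct

    def mine(header76, target):
        """Try every 32-bit nonce until the double-SHA-256 digest of the
        80-byte header, read as a little-endian integer, is below target."""
        for nonce in range(2**32):
            block = header76 + struct.pack("<I", nonce)
            digest = hashlib.sha256(hashlib.sha256(block).digest()).digest()
            if int.from_bytes(digest, "little") < target:
                return nonce
        return None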
Unless things have changed in the last few years, HDMI requires a non-trivial license fee of $5,000 to $15,000, if I remember correctly. There's also a per-unit fee. If you are not a signatory of the contract, you can't buy HDMI chips.
This is pure speculation, but my guess is that signal spec licensing sits on shaky legal ground and that the big manufacturers are more than happy to pay up to avoid fighting a lawsuit that they may or may not win. Similarly, those that own the IP for the HDMI spec are unwilling to sue a small fish like the Raspberry Pi foundation, because even if they win, there's not much money to be made. And if they lose, then Samsung and Apple and Dell will realize they don't have to pay anymore, which would amount to a lot of money.
DisplayPort is a free spec and can be easily converted to HDMI, so I don't really understand why anyone uses HDMI. HDMI has more bullet points, but I've never seen a device that does anything more than carry audio and video through the same cable, and DisplayPort does that fine.
Yup, $10k/year and $0.15/unit. It doesn't look like it's mandatory, though; the language on the site[0] is really hard to decipher. I imagine they're not going to prosecute one guy for using a chip in his personal build. Obviously the Raspberry Pi guys are getting around this somehow, though; I'd be curious to know how.
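Taking those figures at face value, the annual fee swamps the per-unit royalty at hobbyist volumes and disappears into the noise at scale, which is presumably part of why the big players just pay:

    # Per-unit cost of the quoted $10k/year adopter fee plus $0.15 royalty.
    for units in (1_000, 100_000, 10_000_000):
        print("%10d units -> $%.3f each" % (units, 10_000 / units + 0.15))
    # 1k units: $10.15 each; at 10M units it's effectively just the royalty.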
Aren't the chipmakers like Broadcom (for rPi) and Freescale (the i.MX in Bunnie's design) taking the burden of this licensing fee? It would be a lot easier for them to roll the $0.15 into their part costs than hassle their customers with taking care of this licensing.
I ran into this about five years ago when doing a design that required HDMI I/O. Distributors in the US such as Avnet and Arrow wanted to see confirmation of being under contract to be able to buy chips. I don't remember all the details. I think you get access to sample quantities just for signing the contract but anything else required payment of the big fee. Also, you couldn't get access to some data sheets without proof of being under contract.
I can easily imagine that things in Asia could be very, very different due to, shall we say, how loosely IP rights are enforced there.
A few friends and I were discussing at work how nice it would be to be able to make your own custom laptop or smartphone -- something pluggable, where more than just the RAM and HDD were swappable. One thing that came up, however, was what to do about "evolving" wireless standards. Well, what if you could get a completely open-source software-defined radio? It might be bloody expensive because of the miniaturization, and it would have certain physical limits (mostly due to antenna sizes), but it could go a long way toward solving the problem of having to "upgrade" your phone or laptop just to get higher connection speeds, or to get NFC/Bluetooth 3/etc.
(the above are ramblings of a code monkey who dabbles in simulating RF; negative mutterings about how unrealistic or stupid the above ideas are will be politely ignored).
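For a taste of what "radio in software" means once the front end hands you samples, here's a toy FM demodulator over complex-baseband IQ data, written with numpy and not tied to any particular SDR hardware (all names and numbers are illustrative):

    import numpy as np

    def fm_demodulate(iq, deviation_hz, sample_rate_hz):
        """Approximate instantaneous frequency via the angle between
        consecutive complex samples, normalized so full deviation = 1."""
        phase_diff = np.angle(iq[1:] * np.conj(iq[:-1]))
        return phase_diff * sample_rate_hz / (2 * np.pi * deviation_hz)

    # Self-test: synthesize an FM signal at baseband, then recover the tone.
    fs = 250_000                                 # sample rate (Hz)
    t = np.arange(fs) / fs                       # one second of samples
    message = np.sin(2 * np.pi * 1_000 * t)      # 1 kHz test tone
    deviation = 75_000                           # broadcast-style deviation
    phase = 2 * np.pi * deviation * np.cumsum(message) / fs
    iq = np.exp(1j * phase)
    recovered = fm_demodulate(iq, deviation, fs)  # ~= message, one sample short

Swap the demodulator and you've swapped the "standard" -- that's the appeal. The hard, expensive part is the wideband front end and the antennas, as you say.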
Shame there's no external high-speed bus for that FPGA. If there were, this would be an interesting platform for software-defined radio and high speed DAQ products.
Analogue meters have uses that you don't get from digital readouts. Watching a signal slowly fluctuating on an analogue meter is "nicer" than seeing some numbers go up and down.
Great comment. Wow, steam engines are great, and at small scale they are even better: double-acting cylinders, those centrifugal governors, and the smell of that lubricant oil mixed with hot water. I'd never before considered messing about with small steam engines to be hacking... that used to waste many an afternoon. Don't fuse the safety valve shut, though; that's a bad way to get power, and eyebrow regrowth takes ages and annoys one's mother.
Quad-core GHz+ CPUs are "good enough" for everyday code development? Seriously? We really do live in a world of bloatware if we need four cores to support an IDE or a classic editor + compiler/interpreter setup. This really reminds me of May's Law, and that it needs to be addressed.
FPGA development has been one of those things that I've really wanted to get into, but each time, I spend days trying to get a toolchain set up (using the free Xilinx tools on Linux and/or Windows), and by then I'm tired, and decide to take a break. By the time I get back to it, the toolchain doesn't work anymore (because of some licensing crap) so I have to start all over.
I don't think I've been this excited about hardware in a long time. I'd kickstart this thing in a heartbeat.
[1] http://events.ccc.de/congress/2011/Fahrplan/events/4686.en.h...
[2] video: http://www.youtube.com/watch?v=37SBMyGoCAU