New Xilinx Virtex-7 2000T FPGA provides equivalent of 20 million ASIC gates (eetimes.com)
42 points by rbanffy on Oct 27, 2011 | hide | past | favorite | 30 comments



Bear in mind that the current generation of these, the Virtex 6, starts at about $800 for the smallest parts and I believe goes well over $20k for the larger ones. Expect the Virtex 7 to be priced in this ballpark.

edit: I was off by an order of magnitude on the high end. $2k -> $20k


It's a monster chip. I don't see a die area number, but at 6.8G transistors that would place it at about twice the size of the biggest Geforce dies. So yields are going to be lower due to simple scaling constraints (though FPGAs no doubt have lots of redundancy by their nature, so some failures probably don't kill the chip). And it's 28nm, which is a brand new process and won't yield as well. AND big FPGAs are a low volume market.

So yes, I wouldn't expect these to be cheap. They're not consumer SoC parts.


You're quite right, a traditional die with 6.8 billion transistors on it would likely have serious yield issues. This is why Xilinx is using a silicon interposer to combine multiple dies into a single package. So it's not actually one 6.8-billion-transistor die but four roughly 1.7-billion-transistor dies. Still huge, but in line with the level of integration that others (Intel, IBM) are using.


Yes, initially. But as the manufacturing processes mature, these prices reduce dramatically over time. Price reductions of an order of magnitude are not unusual over the course of two or three years.


I meant this more as a data point for realistic mental comparison to an ASIC, for anyone thinking they might use one. It's cool that it's achievable, and saying "This (big-n) gate count part is available today!" is nice and gee-whizzy for the press release and trade mags, but divorced from its cost it doesn't mean much to someone who would put it to work, as I'd expect at least a few here on HN to do.

High-end Virtex 4 parts from 2004 still cost over $10k. These aren't the devices you design into products unless your market is low volume, high margin, and long life-cycle. This is the part you buy to prototype your own ASIC design.


Considering my current rhythm, by the time I finish learning VHDL, it'll be available for US$10 a piece ;-)


I'd consider spending time with SystemVerilog.


Sounds intriguing. I'm not sure where to start, however.

http://en.wikipedia.org/wiki/SystemVerilog


I wonder if that's enough to build a pocket Alto or Lisp Machine...


It'd be shocking if it weren't, given that there are a number of FPGA-based implementations of more recent and far more complex machines (in terms of gate count, at least) than the Alto and early-generation Lisp machines, such as various incarnations of the Commodore Amiga.

Some of them use external CPUs, but newer-generation ones such as the FPGA Arcade can hold pretty much the entire machine, including the CPU, in the FPGA (the usual exceptions are RAM, and a microcontroller to bootstrap).

See FPGA Arcade, Natami, Minimig, C-One, Chameleon and others. Natami is the most capable of the bunch (aiming to be a faster, better Amiga), while the Chameleon is probably the smallest: it's the size of an old-style Commodore 64 cartridge and can plug into a real Commodore 64, since it started as a C64 expansion on steroids, but it can run standalone too. The FPGA Arcade is a nice middle ground (it fits in Mini-ITX form-factor boxes, though it isn't deep enough to fill nearly all of the box).

FPGA Arcade: http://www.fpgaarcade.com/ Chameleon: http://www.vesalia.de/e_chameleon.htm


So, what would be a good FPGA starter kit, just to learn how these things work?


To learn how FPGAs work, I'd recommend starting with a cheap Digilent board; see http://www.digilentinc.com/choosing.cfm - You can get a simple BASYS 2 board for under $100, possibly under $50. This board is the Arduino of the FPGA world. There are tutorials from many schools, lots of good documentation, it's a proven design... you can't go wrong with the Basys board. It does sacrifice performance for simplicity, though: notably, there's no RAM except for what's on the FPGA. Go up to the Nexys for that, but if you're just interested in learning Verilog/VHDL, the Basys is a fine place to start.

To learn how these things work, get a job at a company which uses FPGAs. You'll probably never see one of these parts as a hobbyist.

If you're interested in stuff that's actually like this, and want to start right away, then look for a board packaged as a PCIe card. This will be significantly more expensive; think $500 on the extreme low end. NetFPGA (http://netfpga.org/) is a good starting point if you're trying to start at a higher level. (Note that the old board is 33MHz PCI and the processor is obsolete; you want the 4x10GbE Virtex 5 part if you're looking for modern tools.)


+1 for Digilent. They are a local company for me; they do really great hardware stuff at an affordable price.

The founder of the company is a pretty spiffy speaker as well; if you have a chance to see him, it's worth it.


I had pretty decent success with the products from KNJN (http://www.knjn.com/). It's affiliated with FPGA4Fun (http://www.fpga4fun.com/) which will help with the learning.

Recently I've been looking at the small, low-cost boards by XESS - http://xess.com/prods/prod048.php

Keep in mind that writing logic is quite different from programming. It's quite the mind-shift.
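To give a flavor of that mind-shift, here's a Python sketch (not real HDL) of the semantics behind Verilog's nonblocking assignments: on a clock edge, every register's next value is computed from the pre-edge state, and then all registers update at once. That's why two hardware registers can swap values with no temporary, which plain sequential assignments can't do.

```python
# Simulate one clock edge: every register's next value is computed
# from the *current* state, then all registers update together --
# the semantics behind Verilog's nonblocking "a <= b; b <= a;".
def clock_edge(state):
    nxt = {}
    nxt["a"] = state["b"]   # a <= b
    nxt["b"] = state["a"]   # b <= a
    return nxt              # both update simultaneously; no temp needed

state = {"a": 1, "b": 2}
state = clock_edge(state)
print(state)  # {'a': 2, 'b': 1}
```

In sequential software, `a = b; b = a` would leave both variables equal; in clocked hardware (and in this simulation of it), the swap just works.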


If you're new to FPGAs or digital design in general, it'd probably be worth your while to pick up a good digital design book. I always liked Wakerly's Digital Design Principles and Practices. I recall that my edition (either 2nd or 3rd) was VHDL-focused, but it looks like the 4th edition has examples in both VHDL and Verilog.

There's also the matter of working through the FPGA toolchains, which isn't exactly a walk in the park. Unfortunately I don't have a good tutorial to reference off the top of my head...


My interest in FPGAs is more in designing processors than in interfacing with electronics. I like the products from Opal Kelly [http://www.opalkelly.com/] for this, as they provide a USB interface with C libraries on the computer side and HDL on the FPGA side to abstract the host->FPGA communication, allowing you to just focus on your design.


Adafruit has an FPGA for sale: http://www.adafruit.com/category/products/451

I haven't tried it, as I have a much cheaper but far less documented CPLD sitting around, gathering dust. =p


I haven't tried this particular development board but it looks pretty inexpensive and would be a good starter board. http://www.easyfpga.com/


I've heard good things about the Papilio: http://papilio.cc/


Yes. A $10 Spartan 3 could run a simple core and interpret Lisp code. It's been done before; check OpenCores.


Been there a couple of times. Emulating a real Lisp Machine would be a little more complicated than just a Lisp-running processor; you have miscellaneous hardware around it. I was imagining something software-compatible with a Symbolics or LMI, with a USB port for keyboard/mouse, an SD card slot and an HDMI output.


I believe so. There have been some efforts to make a softcore[1] Lisp Machine, but when I last looked, it appeared that they weren't seriously pursued.

I would love to put together a softcore Lisp Machine. It would be a tremendous learning experience and I think it could help spur some really interesting results in OS architecture. Particularly useful in that arena would be a multicore Lisp Machine.

It would be a dream come true if I got a job building such a device and development was open-sourced from day 0.

[1] Softcore, that is, a processor on an FPGA.


Check this out, by Hans Huebner:

http://zslug.wordpress.com/2011/02/09/meeting-1-report-audio...

I had a few of the 3600 Lisp machines; they wouldn't be hard to fit into an FPGA at all (size-wise, of course :). They had huge boards filled with chips, which were all... hand-wirewrapped CMOS chips (IIRC).


Trivial. Those are comparatively tiny machines. This is at the level where you could simulate a Pentium or 21064 Alpha or the like.


As a comparison, the Pentium 3 had about 10 million transistors. Does that mean that in theory you could implement a P3 CPU on this FPGA?


You could do a lot more than that, as there are multiple transistors for each gate (assuming static CMOS).

My company uses FPGAs for verification during development (because it takes months to get silicon back after tapeout, and it costs millions of dollars even if you only want one single test chip). We make chips that are much more complex than a P3. Friends of mine at other microprocessor companies have told me that they have a similar process, though some of them at larger companies use their own custom hardware for accelerated simulation, since you can't fit anything like a POWER7 or even a Core i3 on a commercially available FPGA.


I don't know if that would be practical, but both Intel and IBM can build wafer-sized FPGAs if they really need to.

The standard wafer they use is around 30 cm, right?


There is a difference between transistors and gates. FPGAs implement gates (i.e. "and", "or", "xor", "not", though really they can all be built from NAND). On a standard processor, one 2-input gate is approximately 4 transistors (assuming a static CMOS process; a 2-input NAND is 4 transistors). So 10 million transistors ~= 2.5 million gates, and theoretically this 20-million-gate FPGA could hold roughly 8 P3-sized designs.

However, the process of implementing circuits in an FPGA isn't that efficient. (Custom processors can do a lot of tricks to beat the transistors-per-gate estimate, and other tricks to do calculations with raw transistors instead of gates, which FPGAs can't.) Even allowing for that overhead eating most of the headroom, I would say yes: theoretically an Intel engineer with a full spec and some really good VHDL or Verilog (the hardware description languages used with FPGAs) skills could implement a P3 on this.
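The back-of-envelope arithmetic, spelled out (a Python sketch; the 4-transistors-per-gate figure is the usual rule of thumb for a 2-input static CMOS NAND, and everything here is order-of-magnitude only):

```python
# Rough conversion: a 2-input static CMOS NAND is 4 transistors,
# so gate count ~= transistor count / 4 (order of magnitude only).
TRANSISTORS_PER_GATE = 4

p3_transistors = 10_000_000      # Pentium 3, roughly
fpga_equiv_gates = 20_000_000    # Virtex-7 2000T marketing figure

p3_gates = p3_transistors // TRANSISTORS_PER_GATE
print(p3_gates)                      # 2500000 gates
print(fpga_equiv_gates // p3_gates)  # 8 P3-sized designs, before mapping overhead
```

FPGA mapping overhead (LUTs, routing) then eats a large chunk of that factor of 8, which is why "one P3, comfortably" is the more realistic takeaway.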


FPGA parts don't model gates, actually. Their basic unit is a giant array of SRAM lookup tables (6 address bits on the 7 series, fracturable into two 5-input LUTs) plus some interconnect logic that you can "wire up" by setting more bits in other SRAM cells. The vendors translate this into "gates" with some marketing cookery, as the number of logic gates that would be required to duplicate the most complicated configuration.

So when they say "20 million gates", they don't actually mean that any 20M-gate circuit can be built. It depends on what you're doing; some things (e.g. big logic functions) map better onto FPGA logic than others (e.g. complicated wiring and bus architectures).
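A LUT really is just a small truth-table memory: any n-input boolean function becomes 2^n stored bits, addressed by the inputs. A Python sketch of the idea (a 4-input LUT for brevity; the real 7-series LUTs have 6 inputs, and `make_lut` is a hypothetical helper, not any vendor API):

```python
# A 4-input LUT stores 16 bits; the input signals form the address.
# "Programming" the LUT means filling the table -- which is exactly
# what the configuration bitstream does in the SRAM cells.
def make_lut(fn, n_inputs=4):
    # Precompute the truth table for the target function.
    table = [fn(*(((i >> b) & 1) for b in range(n_inputs)))
             for i in range(2 ** n_inputs)]

    def lut(*inputs):
        addr = sum(bit << pos for pos, bit in enumerate(inputs))
        return table[addr]
    return lut

# Any 4-input function fits, e.g. ((a AND b) XOR c) OR d:
lut = make_lut(lambda a, b, c, d: ((a & b) ^ c) | d)
print(lut(1, 1, 0, 0))  # 1
print(lut(1, 1, 1, 0))  # 0
```

This is also why "big logic functions" map well: an arbitrarily messy n-input function costs exactly one n-input LUT, while wide buses and irregular wiring burn the scarcer routing resources instead.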


There are some neat cross-sections of the 28 nm process here: http://www.electroiq.com/blogs/chipworks_real_chips_blog/201...



