I have what I imagine to be a fairly simple project for a talented FPGA coder. While I know of some marketplaces for finding generic software developers, I don't know whether there are any good ones in the embedded space. Any suggestions? Thanks in advance.
Many great comments that I appreciate! Just as a bit of extra data, here is my design problem:
My lab uses an existing Opal Kelly solution which translates a semi-custom SPI interface to USB. We've built some open-source closed-loop software on top of this DAQ system, and I'd like to lower the USB latency by implementing the same interface on a Zynq dev board and streaming via UDP/GigE. The boards I've looked at have SYZYGY interfaces, and I'm happy to spin the adapter PCBs myself, as well as write the streaming code (which I'm hoping will run in userspace on Linux).
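For the streaming side, something like this minimal userspace receiver is roughly what I have in mind. The port number and frame layout (one datagram = 64 channels of 16-bit little-endian samples) are placeholders for illustration, not a decided packet format:

```python
import socket
import struct

PORT = 5005                  # hypothetical port the FPGA would stream to
CHANNELS = 64
FRAME_BYTES = CHANNELS * 2   # 16 bits per channel

def unpack_frame(data):
    """Decode one datagram into a tuple of 64 unsigned 16-bit samples."""
    # '<64H' = 64 unsigned 16-bit little-endian values
    return struct.unpack(f"<{CHANNELS}H", data)

def receive_frames(n_frames):
    """Block until n_frames datagrams have arrived; return decoded frames."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))
    frames = []
    for _ in range(n_frames):
        data, _addr = sock.recvfrom(FRAME_BYTES)
        frames.append(unpack_frame(data))
    return frames
```

The real version would need sequence numbers to detect dropped datagrams, but the shape of the fast path would look about like this.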
The custom bits of the SPI mean the standard PetaLinux SPI interface doesn't work, so I basically want to mimic the standard SPI interface but with the needed tweaks. The issues are:
(1) 2 MISO lines rather than 1
(2) "Double data rate" (meaning MISO data are transmitted on both edges of the clock)
(3) I need to compensate for arbitrary cable lengths. The cable is long enough that we can see delays of 8-16 clock cycles between when a clock edge is sent and when the corresponding data from the device finally gets back to us, so the data-in clocking needs to be delayable relative to the data-out clocking.
For (3), I almost convinced myself that I could do this in SW, but implementing it in hardware would make my life much easier.
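To convince myself the bookkeeping for (1)-(3) actually closes, I sketched a toy software model of the capture path. Everything here is an assumption for illustration (16-bit words MSB-first, both lanes sharing one delay setting, lane 0 carrying even channels), not FPGA code:

```python
WORD_BITS = 16  # assumed word size per channel

def deserialize(edge_samples, delay_edges):
    """edge_samples: the bit seen at the FPGA on each sampling edge
    (rising and falling edges both count, since the link is DDR).
    The first delay_edges entries predate the device's response and are
    discarded -- that discard is the whole cable-length compensation."""
    aligned = edge_samples[delay_edges:]
    words = []
    for i in range(0, len(aligned) - WORD_BITS + 1, WORD_BITS):
        w = 0
        for bit in aligned[i:i + WORD_BITS]:
            w = (w << 1) | bit   # shift in MSB-first
        words.append(w)
    return words

def capture(lane0, lane1, delay_edges):
    """Two MISO lanes deserialized independently with a shared delay,
    then interleaved (assuming lane 0 = even channels, lane 1 = odd)."""
    w0 = deserialize(lane0, delay_edges)
    w1 = deserialize(lane1, delay_edges)
    out = []
    for a, b in zip(w0, w1):
        out += [a, b]
    return out
```

In fabric this would just be a shift register per lane plus a counter that suppresses the first N sampling edges, with N set from a register at runtime.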
Anyway, I realize that this is probably more than a week of work for someone talented, but I have the sense it's closer to 1 week than 1 month, and definitely less than 1 year. I'm also hoping for code that I can share freely, so I'd obviously prefer to start with something well thought out on that front.
Arrow Electronics has been involved with freelancer.com for many years, so it might be OK for FPGA work. I think an old colleague used it for some embedded stuff (not FPGA) and was pretty happy with it.
Upwork also has an FPGA category, but I know even less about it.
The open hardware community (e.g. https://fossi-foundation.org/events/archive and http://www.enjoy-digital.fr) works with FPGAs. Contributors can be found on social media and video playlists from past conferences. If your project can be done by modification of existing open hardware, there would be less cost and risk.
It does depend on the project, but I suspect this is far too low. The only thing you're likely to get in this price range is a small addition or tweak to an existing design.
A very minimal design from scratch that uses existing off-the-shelf hardware, with no interfacing to anything remotely complex and no follow-on support (pretty much the developer hands it over, shows some smoke test works, and you're on your own from there), could also fit into this budget.
Make up some design documents and put up a splashy crowdfunding campaign. Collect, but don't spend, the deposits. Trickle out some progress content over about a year. Look for knock-offs on AliExpress, then return the deposits.
So you want to pay for less than 1 week of dev time? Almost nothing can be done on an FPGA for that little money beyond the smallest of changes to an existing design. Heck, just getting your completed circuit board with the FPGA tested by an engineering firm for the FCC certification required before you can ship your product starts at $10k. Realistically you need a bare minimum of $50k to bring a simple commercial hardware project to market, even when you understand the niche and can already deliver the firmware and software without significant external (paid) effort.
I took the time (and it takes time) to learn a bit about FPGAs (playing around with 1 Gbps and 10 Gbps Ethernet as well as PCIe 1.0 x1 and 3.0 x4), and if you want to actually do something new, there is a lot of time to spend learning the various protocols a given design will touch or implement. Most of the sample code on $vendor's website is shoddily thrown together and breaks as soon as you breathe on it wrong. When you do anything interesting with networking protocols, the constraints of clock speed and bandwidth make things extremely complicated, with oodles of corner cases as you do things like use wide buses at low clock speeds where packets can start at multiple offsets within the bus. You can build on top of something like the NetFPGA project, but that only works if what you're building is close to an existing well-known niche. If you're just flinging packets, there are far less costly (especially in time) frameworks to play around with.
I'd stay away from building custom FPGA-based hardware unless the performance benefit will increase sales by a factor of 100-1000x (depending on complexity) to cover the cost of the hardware. But keep in mind that it's harder and harder to compete with commodity x86 servers. In the past, every cell phone base station used FPGAs for the signal processing. Today's 5G networks do most of the signal processing on servers, because you can do cool things like noise cancellation across nearby antennas when all the data is in the same place, and the software developers who produce those changes will cost far less than a team of the size needed to sustain the same rate of change on an FPGA.
They never mentioned building custom hardware anywhere and tbh I don't see why you would need to when reasonably cheap FPGA hardware already exists.
Alinx provides a bunch of cheap FPGA development platforms including SOMs that you can use with custom boards without needing to design the whole thing from scratch.
I'm willing to bet that unless they are doing something particularly exotic, they can probably complete their project using COTS parts without a single custom piece of hardware.
Yes, absolutely. My preference is a dev board, to be honest. My goal is something that I can share for ~free with other labs, who'd need to order the parts themselves. I'll likely have to spin an adapter PCB, but ideally a quite trivial one. So just firmware, and I'm happy to write the associated code on Linux.
Adapter PCBs still have to be emission- and safety-tested before you can start selling them. I have family in Germany who had been making and selling bee scales across Europe for almost 25 years, up until compliance became extremely costly due to new regulations. Even minor PCB changes (a swap to an alternative part footprint due to EOL?) trigger re-certification, and that is part of the lifecycle challenge in the embedded world. Of course, if people are making and assembling the boards themselves, that takes a lot of overhead out of the equation.
I am quite curious what space you are doing this in... FPGA hacking can be a lot of fun. And it can be mind-numbingly tedious when you miss a subtle detail in a datasheet that makes the difference between an Ethernet link working 100% of the time versus merely 98% of the time. (Marvell gigabit Ethernet PHYs are quite sensitive to the exact sampling time of data sent over GMII relative to the clock, in a way that National Semiconductor gigabit PHYs are not; it takes setting a delay on the FPGA's I/O buffer to make gigabit Ethernet work reliably on the old Xilinx SP605.) And multi-hour synthesis times for moderately complex designs slow development way down.
If you're doing anything with high-speed SERDES (i.e. >= 10 Gbps Ethernet), know that Xilinx transceivers require insanely complicated witchcraft, with incantations that only the most seasoned FPGA developers have learned. Clock routing between adjacent transceiver quads, with the right wires going to the right PLL, is bonkers to figure out on your own. Altera is a lot simpler, but the PolarFire transceivers were probably the easiest to get working.
Let us know how deep down the rabbit hole you venture.
Totally! Another reason why sticking with a pre-populated dev board (like the ZUBoard) makes life easier on many levels. My application is basic neuroscience, so we're looking at slow signals: each channel is ~25 kSps, so we'd be at bleeding-edge electrode technologies long before we got anywhere near 10 Gb.
I actually teach a microcontroller class to sophomores/juniors. I'd love to build a multi-semester curriculum where we'd end up at crazy speeds, but it takes a lot of work to cover all of the theory (E&M and system) required...
25 kSps sounds like it would work well with something like the RP2040. The PIO blocks would let you implement the sampling triggers with decent realtime accuracy, and it's an order of magnitude cheaper than most FPGA boards. External muxes are fairly cheap and easy at that relatively low sample rate. Sounds like a fun project!
I've spent several evenings pondering the PIOs and whether I could adapt them to this project. I think there's a reasonable chance the answer is yes for the new RP2350. For the aforementioned 25 kSps per channel, I actually need something like a 25 MHz clock (64 channels at 16 bits). With the original RP2040 I convinced myself it would be hard without overclocking, but with the new one it's quite possibly possible. Handling variable cable delays of 8-10 bits also requires buffering in a way that I couldn't quite wrap my head around, but I bet someone could!
That's 25 MHz if you're doing everything serially, but you could probably do it with an 11- or 12-bit parallel bus. There are ADCs on the market with parallel rather than serial interfaces. The only reason I'm suggesting this is that 25 kSps is low enough that the headaches of using a parallel bus are minimal.
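Quick back-of-envelope with the numbers from the thread (64 channels, 16 bits, 25 kSps per channel):

```python
channels, bits_per_sample, sample_rate = 64, 16, 25_000

# If every bit crosses a single serial line, the bit clock must be
# channels * bits * rate -- the ~25 MHz figure mentioned above.
serial_clock_hz = channels * bits_per_sample * sample_rate
assert serial_clock_hz == 25_600_000

# With a parallel ADC bus, each transfer moves a whole conversion,
# so the bus only needs channels * rate transfers per second.
parallel_word_rate_hz = channels * sample_rate
assert parallel_word_rate_hz == 1_600_000
```

1.6 M transfers/s is comfortable territory for a PIO-driven parallel read, which is why the bus width trade looks attractive at this sample rate.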
Deskewing isn't all that hard to do, and there are plenty of designs you can steal ideas from. FPGAs generally have FIFOs in the SERDES that are used to deskew multilane protocols like PCIe or 40/100 Gbps Ethernet (in QSFPs). You bit/byte-slip during training until things sync up, then throw an error and retrain if the sync bits don't match what's expected (to keep the check in the fast path cheap). That should work in software as well, though it could well be challenging at 25 MHz. Those are but a couple of the reasons serializers/deserializers are giant, complicated bits of intellectual property found on virtually every chip these days.
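The training loop is simple enough to sketch. This is a toy model with a made-up 16-bit sync word, not production logic: slip one bit at a time until the sync word lines up, and the slip you land on is the lane's skew:

```python
SYNC = 0b1011_0101_0010_0110   # hypothetical 16-bit training word
SYNC_BITS = 16

def find_slip(rx_bits, max_slip=32):
    """Return the bit offset at which SYNC appears in the received
    stream, or None if training fails within max_slip positions."""
    for slip in range(max_slip):
        w = 0
        for bit in rx_bits[slip:slip + SYNC_BITS]:
            w = (w << 1) | bit   # assemble a candidate word MSB-first
        if w == SYNC:
            return slip          # locked: this lane is `slip` bits late
    return None
```

In hardware the same idea is a barrel shifter or FIFO read-pointer nudge per lane; once every lane reports a lock, the differences between the slips are exactly the per-lane skews to absorb.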
Right.
FPGAs shine if you have high data rates (multi-gigabit on single/dual pins), low latency, lots of similar computation (multiply-accumulate), a lot of complex pipelined logic, and a need to be really compact at the same time.
The effort to get this running is also crazy huge. Large designs "compile" for hours; simulation and in-situ testing can take months. If your FPGA has 1000 pins, even the PCB design, manufacturing, and debugging are hard, if you even have a sufficient scope at hand.
I would guess 10x the effort compared to solving a problem on an embedded CPU, which is itself 10x compared to solving it on a standard PC. Expect another 10x if you go from FPGA to custom ASICs.
I'm curious what could be expected / possible for 5-10k. Tweaking an existing design on proven, available hardware? A testbench for a single module? Maybe "just" VHDL/Verilog coding (instead of dealing with the FPGA)?
Many of the common use cases, like camera input, will have a dedicated peripheral available in a system-on-chip or microcontroller, which would then generally be preferred over an FPGA because it requires less custom development, quality assurance, etc.
If you post a short description of what you need and your contact details, your search may already be over, as there are some freelancers with FPGA and/or embedded experience on this site.
I would recommend searching for terms of art such as VHDL and Verilog on the normal freelancer sites.
Most embedded folks don't do FPGAs, and the subset that does is mostly employed in situations that don't permit freelancing.