Hacker News new | past | comments | ask | show | jobs | submit login
Ask HN: Is it feasible to do high-frequency trading as an individual?
91 points by antichaos on July 15, 2010 | hide | past | favorite | 57 comments
My stock broker offers a programming API. The trading fee is close to $10/order. With that price point, it seems pretty expensive to do high-frequency algo trading. Are there any much cheaper options for me to experiment with hi-freq trading? Or is it a dumb idea that I should give up?

I'll share some experience a friend had with this stuff. I don't have the low-level details, but this should be enough to make an informed decision.

He was working on what he termed "high speed automated trading" -- basically all in this realm of picking pennies up in front of bulldozers.

He's a very smart and talented guy -- Harvard Physics/Comp. Sci double major, etc. So we're not talking about an amateur throwing around some code.

He spent about 6 months and over $50k setting up his system -- he had developed some algorithms to trade spreads between different securities (on the CBOE, the NYSE, and NASDAQ). He built a relationship with a clearing agent, a direct broker, etc. He got machines in colo facilities as close to the exchanges as possible (one data center in Chicago, one in Jersey City, etc.), SLAs on low-latency DS3 lines, etc. The whole nine yards.

I'd call it a "pro-sumer" level setup -- everything done by one guy, but done basically as well as a bigger firm would set it all up.

Here's what he discovered after about a week of trading: He didn't have even remotely a shot at competing. Like, not even close. If he had 10ms pings to the exchanges, someone else had 5ms. If he got down to 2ms, someone else who was physically at the exchange itself had 1ms. He got killed with commissions -- even if he did make a few $$ on some trades, it vanished with commissions.

Why? Because he was competing against guys who paid NO commissions. The broker-dealers and clearing companies themselves had internal automated trading setups. They had deeper relationships and deals that traded commissions for a cut of the profits, etc.

In other words, he figured out pretty quickly that this is not an algorithm game or speed game - it, like many things on Wall Street - is a 'who you know' game. No matter how smart, how fast, how sophisticated you are as an outsider, the likelihood is that someone on the inside has a similar trade idea and can do it faster and cheaper than you.

Now, the corollary to that: My friend is smart, but he also tends to give up too easily ;) Clearly if you've got an idea for a better/smarter/more innovative trade, then you'll make money on it. But that's a trade idea, not a speed advantage.

So just recognize that there's potential to make money, but "same idea, just faster" won't cut it.

Yeah, he learned the hard way that you can't expect to compete with the pros without pro-level access. The thing is, though, the pros are dealing in milliseconds or seconds; he should have been looking at timescales outside their range, like minutes, hours, or possibly even days.

The currency market is even more cut throat (hey, insider trading is encouraged! :) ), but I have a friend who works with a single other guy managing about 10M leveraged out close to 100M. All they do is trade currency. On a typical day they make 1-2 trades for their clients and the rest of the time play golf and hang out. Pretty much the ideal job :)

They got started years ago while in grad school by writing an algo to analyze currencies. After showing it worked they rounded up a bit of funding and have gone on from there. Ever since then they have just honed and tweaked the algorithm and both make a good living off the commissions from working a couple hours/day.

Stay away from HFT, big players will eat your lunch. HFT isn't even a game of milliseconds anymore, it's a game of microseconds. You can still do well with algorithmic trading where speed of execution isn't an issue. Work on your strategy, test it on historical data, execute it manually, if it works then automate part or all of it.

Yup. NASDAQ will even tell you what the current latencies are for their coloc: http://www.nasdaqtrader.com/trader.aspx?id=inet Microseconds indeed.

Alternatively, you might consider going overseas if you aren't prepared to pay coloc fees and do custom FPGA or realtime work. Some of the exchanges, particularly in Asia (except Japan) still have clearing times measured in seconds.

Why doesn't the government step in and quantise trading into whole-day (or similar) intervals? It seems wrong to me that, with potentially equal performance, two traders can get different returns based on a microsecond difference in latency between their servers and the exchange's central server.

Does anyone still buy the whole line about exchanges being there to create liquidity in the market?

It all seems a big scam; the more I learn about it, the more it seems the highest returns go to those who perform the greatest scams, skimming off value created elsewhere (manufacturing, services, ...).

It's not a dumb idea, but you certainly should give up. The high frequency game is one you will lose, because many of the large investment banks have relationships with the stock exchanges allowing them to get information faster - and respond to that information faster - than other traders.

See this NYT article (including the graphic) for more: http://www.nytimes.com/2009/07/24/business/24trading.html

Attempting to do this through (presumably) a retail broker's public API is doomed and dumb. The only serious way to do this is with sponsored access via a tier-1 participant. This is quite obviously not something for dabblers.

That article is 85% hype. It reads like an editorial, not a news article. What exactly do you mean by "relationships with the stock exchanges"? If by that, you mean they pay the exchanges for colocation (which anyone can do), then yes, I would agree. If you mean the banks and exchanges are cronies and that the exchanges give the banks free perks because they're golf buddies, then that's wrong.

The edge you are talking about in terms of "previewing orders" seems to refer to flash orders. Those no longer exist (which I think is good), but I doubt they were a high source of revenue or edge, given their low volume relative to the rest of market activity.

There are several problems with this article; some errors (or simply incorrect terminology) and some things that are no longer true.

There is a difference between "algo" trading, also known as "program trading," and HFT. The algorithms in HFT kind of boil down to "get it there fast".

The thing that is no longer true is flash orders, which (as pointed out elsewhere in these comments) are no longer allowed; that is what the article is referring to.

The trick to being an HFT firm is to be big enough and fast enough that the exchanges pay you to trade. Thus, you don't necessarily have to make a profit on these trades. Then you are really a liquidity provider. Being an HFT firm is mostly about size, speed, and execution, and not so much about ideas. My advice is to not go after that market. It is likely that you are three to five orders of magnitude too small.
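As a toy illustration of that rebate point -- the rates here are hypothetical, modeled on typical maker-taker fee schedules rather than any specific exchange:

```python
# Hypothetical maker-taker schedule: earn a rebate when your resting limit
# order is filled (adding liquidity), pay a fee when you cross the spread
# (removing liquidity). Rates are illustrative, not any exchange's actual fees.
REBATE_PER_SHARE = 0.0025    # credited for adding liquidity
TAKE_FEE_PER_SHARE = 0.0030  # charged for removing liquidity

def scalp_pnl(shares, buy_px, sell_px, add_both_sides=True):
    """P&L of buying at buy_px and selling at sell_px on one round trip."""
    gross = shares * (sell_px - buy_px)
    fees = 2 * shares * (REBATE_PER_SHARE if add_both_sides else -TAKE_FEE_PER_SHARE)
    return gross + fees

# Buying and selling at the SAME price still nets the rebate:
print("flat scalp, 1000 sh: $%.2f" % scalp_pnl(1000, 20.00, 20.00))
```

A zero-edge round trip that rests on both sides still collects the rebate twice, which is why pure liquidity providers can trade "at a loss" on price and profit anyway.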

Other strategies are more interesting for the small guy. Maybe become a MFT (medium frequency trader), whatever that is.

I meant colocation - the costs of which I assume to be too high for individuals - and flash orders - which I did not know were no longer operating.

My general point was just that the playing field is not exactly level - the banks have resources the individual does not have, if speed is the game you're playing. I didn't mean anything at all to do with personal relationships.

It seems like you're describing flash trading, in which some customers pay to receive market data a split second before everyone else. In particular, this is not the same as co-locating to reduce latency: it's programmatically implemented on the exchange side.

Flash trading no longer exists on any major exchange. AFAIK only the relatively minor exchange Direct Edge still offers it.


If I understood that article correctly it looks like the "high frequency" traders pay the exchanges to get access to trades before everyone else.

So it's not really the fact that these guys are trading at high frequency that gives them the edge (although that is necessary) it's the fact that they get to operate a few milliseconds ahead of the rest of the market.

I wonder how much you have to pay to get that access?

The old metaphor of 'picking up pennies in front of a steamroller' still holds in high-frequency trading.

Except that without (at least) a few million in cash deposited with a prime broker, you need to pick up a lot more pennies than a bank does. (You'll be giving a lot back on a day-to-day basis.)

Since everyone has said no so far, I'll go ahead and say it's possible, but very difficult.

HFT algo commissions are much lower than $10/trade. IBKR offers $0.008/share, and it can get much lower if you trade in volume.

Like others said, it is a game of milliseconds. So you need to own a server near the exchanges in New York (e.g. http://www.ubiquityservers.com/data-center/new-york.php).

The best way to start learning is to develop algos that are not HFT. Instead of milliseconds, think about 15-minute to daily holds. If your account is under $25k, you're only allowed 3 day trades in any rolling 5-business-day period.

So find a broker that has a FIX API, buy some data for backtesting, code something up using R/Python, and forward-test your strategy with paper trading for about a month -- then put real money to work.
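A minimal sketch of that backtesting step in Python/pandas. The synthetic prices stand in for purchased historical data, and the moving-average crossover is purely illustrative, not a recommended strategy:

```python
import numpy as np
import pandas as pd

# Synthetic daily closes stand in for purchased historical data.
rng = np.random.default_rng(0)
close = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, 500))))

# Toy signal: long when the 10-day MA is above the 30-day MA, flat otherwise.
fast = close.rolling(10).mean()
slow = close.rolling(30).mean()
position = (fast > slow).astype(int).shift(1).fillna(0)  # next-bar entry, no lookahead

# Equity curve, ignoring commissions and slippage (a real test must include both).
returns = close.pct_change().fillna(0)
strat_curve = (1 + position * returns).cumprod()
print("strategy return: %.2f%%" % (100 * (strat_curve.iloc[-1] - 1)))
```

The `shift(1)` is the important line: it keeps the backtest from trading on information it wouldn't have had yet. A real forward test would add commissions, slippage, and out-of-sample data before committing capital.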

IB's unbundled commission structure is cheap, but I wouldn't recommend them for HFT. HFT, as defined by the industry, is basically very heavy on the order flow with the majority of those orders going un-executed. IB charges an insane cancel fee for orders that are direct routed. If you let IB route the order with their SMART algorithm then you have no idea where it might land. I'd speculate that IB charges such a crazy cancel fee because they don't want high volume limit order traders competing with their Timber Hill market making outfit.

Agree with everything else you said re: time frame, etc.

IB's cancellation fee on options is pretty absurd. There are much more specialized firms for what's needed here; still, starting off, IB is a pretty decent choice.

I would ignore the "HFT" moniker and focus on the algorithmic part. HFT is a big-money game for companies who can afford to build a data center next to the exchange. But a longer-term strategy based on technical indicators can still do very well.

My main recommendation would be to take things slowly, browse through the COTS options in the field (which are plentiful), and make some of your own trades in order to learn and develop your strategy.

Someone on reddit a few months back was successful at this:

"I used to work as a software engineer and started developing and trading automated strategies in my spare time in 2006. I went full time in 2007 and have been profitable every quarter since. AMAA"


automated != high frequency


I know a few algorithmic position traders.... and it's got sweet nothing to do with high-frequency, or even day-trades usually. They have their algorithms, software they've developed to handle analysis, and get them in and out according to plan - I believe they research their target sector a bit, fire up their algorithm machines, and then manually execute (or at least approve their software to execute) the plan they had in place - and it's all position based. They know when they're getting out of any position, up or down - whether that happens the same day or in days or weeks, or months.

Lots of neat software, but not day-trading and not HFT.

True high-frequency is very hard. Most of these guys got started in the 1999/2000 time frame and didn't have to deal with many of the start-up issues that new entrants face. For example, just consider the data you need to trade that quickly. Not only is the real-time feed expensive, but if you want to source the data from the execution venue then you have a lot of code to write. The amount of data is also quite large. For example, the US equities market executes approximately 45 million trades per day and there are about 700 million quotes per day, not including the non-top of book quote activity in the various ECNs. I collect between 2 and 4 GB of data per day, if you were to go direct to the execution venues you'd be looking at >50 GB per day, probably.
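A back-of-envelope check on those figures -- the 64 bytes per record is my assumption, not a quoted number:

```python
# Rough sizing from the figures above; bytes-per-record is an illustrative guess
# (symbol, timestamp, price, size, venue, flags).
trades_per_day = 45_000_000
quotes_per_day = 700_000_000
bytes_per_record = 64

raw = (trades_per_day + quotes_per_day) * bytes_per_record
print("raw feed:  %.1f GB/day" % (raw / 1e9))
print("per year:  %.1f TB (252 trading days)" % (raw * 252 / 1e12))
```

At that record size the quote feed alone dominates, and the daily total lands in the same ballpark as the >50 GB/day estimate for going direct to the venues.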

So, anyway, the "true" high-frequency game is very tough today because we're already 10+ years into it. The markets have changed and the edge has gotten smaller, but there is still plenty for a lowly individual automated trader to scratch away at.

Instead of looking for millisecond opportunities, look for second or minute opportunities. Go where the big guys can't because there isn't enough capacity. Can you find an edge that, on average, keeps you in a trade 30 seconds, for example? There's still plenty of alpha left, just don't step onto their playground and expect to get onto the swing set.

A few practical notes:

* You need to look into unbundled or cost-plus commission structures. These fee structures charge a per share commission and pass through all fees and rebates from the executing venue. This is required to do any sort of size with reasonable cost.

* Most "retail" brokers are not sufficient for any sort of high-volume algorithmic trading. Interactive Brokers is barely ok if you are in any way interested in limit order trading because they have fairly large cancel fees for direct routed orders. If you're model doesn't rely heavily on strictly offering liquidity or you're ok with letting IB route your order then IB is ok and offers an unbundled commission structure. Lightspeed Trading and Lime Brokerage are two that cater to active individual and institutions.

* Data storage is a big deal. Effective storage of regular (evenly spaced) and irregular time series will require you to engineer something. There are commercial solutions, but you can't afford them. When you're dealing with high-volume intraday trading this is one of the first issues you'll face. How do you store, query, and manipulate data that includes 45 million new rows per day? The relational DBs fall apart pretty quickly and even if they didn't they won't give you the time series operations you need/want.

* Data feeds are expensive, but required. Look at DTN NxCore. It is a full market feed that will give you the best you can get w/o going direct to the exchange. Some brokers will give you access to raw exchange feeds, but you'll need to engineer feed handlers and a ticker plant for them. This is a non-trivial task, but not impossible. Once you've done that, you'll need to figure out how to get all that lovely data off your co-located server and back to your home base for analysis. Network engineering will be required because your broker doesn't want you pushing 10-20GB per day through their network connection, so you'll need a circuit from an on-premise carrier.

* The banks are players in HFT, but not the original or best. Most of the guys that started it are still independent. Look at GETCO, RGM Advisors, etc.

* Flash orders, what most folks in this thread are referring to when they say the exchange gives the HFT firm a first look, are no more. That edge existed, and I'm sure HFT firms took advantage, but no HFT firm was built on flash orders. When they started, flash orders didn't exist.

It is possible to be a successful, independent, automated trader. It is even possible to do it on a purely intraday, high-volume basis. Don't get caught up in the hype of needing to be high-frequency or not.
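On the data-storage bullet above: one minimal way to "engineer something" is a flat columnar binary file per symbol per day, e.g. a NumPy structured array. The field layout here is my assumption, not a standard:

```python
import os
import tempfile
import numpy as np

# One flat binary file per symbol per day: append-only writes, column-sliceable
# reads, no relational DB in the hot path. Field layout is an illustrative guess.
tick_dtype = np.dtype([
    ("ts_ns", "u8"),   # timestamp, nanoseconds since epoch
    ("price", "f8"),
    ("size", "u4"),
    ("venue", "S4"),
])

ticks = np.zeros(3, dtype=tick_dtype)
ticks["ts_ns"] = [1_000, 2_000, 3_000]
ticks["price"] = [10.01, 10.02, 10.00]
ticks["size"] = [100, 200, 50]
ticks["venue"] = [b"ARCA", b"NSDQ", b"ARCA"]

# "XYZ" is a hypothetical symbol; one such file per symbol per session.
path = os.path.join(tempfile.gettempdir(), "XYZ.20100715.ticks")
ticks.tofile(path)
back = np.fromfile(path, dtype=tick_dtype)
print("mean price:", back["price"].mean())
```

Scanning one column of 45 million fixed-width records is a sequential read; the equivalent query against a relational table pays per-row overhead, which is roughly why the DBs "fall apart" at this volume.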


EDIT: The other thing I forgot to mention is that naked sponsored access is likely going away. This is a near-requirement for "true" HFT. Any future regulation won't affect the existing players because they're all grown up now and most have their own broker-dealers. Some form of sponsored access will likely still exist, but pre-trade risk checks will probably be required and will therefore still leave your broker between you and the market.

NPR's podcast Planet Money has an interesting show on this subject called "The Million Dollar Microsecond":


(very off topic) "How do you store, query, and manipulate data that includes 45 million new rows per day?"

Perhaps one shouldn't? I've always been a bit fascinated by the analytical tools in brokerage software - with a pretty good understanding of DSP and an appreciation for the fact that asset prices are somewhat periodic, it's hard to overlook the fundamental similarities between stock graphs and audio waveforms. Once you start performing FFTs or wavelet transforms and get a 'feel' for dealing with signals, patterns become very seductive...possibly too seductive: http://en.wikipedia.org/wiki/Pareidolia and http://en.wikipedia.org/wiki/Tetris_effect

Might there be another approach? You wouldn't prepare to go to the store by reviewing and analyzing the 1287 individual footsteps of your previous trip, or try to predict the content of a HN thread by textual analysis of all previous threads. Do we do so at a subconscious level, then? Not really - or rather, our subconscious tends to forget about things as soon as they cease to be important, which in the case of things like walking is a period of seconds or less. Processing large volumes of data is computationally expensive, but it turns out that simple rules can yield results that are both complex and useful, as in flocking and swarming behavior: see http://en.wikipedia.org/wiki/Boids and http://en.wikipedia.org/wiki/Swarm_Intelligence, plus everything from the wisdom of crowds to nonlinear dynamic systems (aka chaotic ones) like Newton's basin or the logistic equation.

I feel there are two other fundamental problems with the massive dataset + analysis approach. One is that you're not working in a closed system, and there's no sensible way to quantify unexpected events. 'Bigcorp CEO in Sex Scandal!' might cause the price of Bigcorp to tank if it's a major distraction or their largest customer base is among rural conservatives. If Bigcorp makes racing cars, it might just be good publicity! Now you can do some kinds of interesting posthoc analysis (eg for news stories that contain a stock symbol, measure the correlation between # of textually similar stories and stock volume/prices using a distributed windowing function) but we're a long way from having a browser plugin that trades based on the contents of your RSS feed.

Another problem is that of feedback. As you've discussed so ably above, people who spot an arbitrage opportunity will mine the hell out of it. And as we all know, traders are extremely subject to herding behavior even though all their training suggests otherwise. Sure, there are systematic contrarians, but I bet that if you just wanted to do academic analysis you could find a contrarian coefficient and quantify its damping effect on price or volume movements.

So rather than crunching vast quantities of stored data, I wonder if it might be better to treat price movements not as absolutes which you hope will reach a particular ceiling or floor, but as differential vector data with a short half-life. So far AI and modeling approaches seem to have focused on prediction (surprise) and don't perform especially well. I think it would be more interesting to map correlation variations for as large a number of nodes (listed securities) as possible - think how we intuitively appreciate the dynamics of a school of fish when watching a nature documentary, without performing any detailed analysis of individual fish trajectories.

Of course this still involves processing a lot of data, but storing it is less important because we are only seeking to become more familiar with high-level behaviors inside that system. There's more to fishing than running trawlers!

Good thoughts, and I don't particularly disagree. Regardless of any derivative you might obtain from the data, most folks want to retain high-fidelity historical data for two reasons:

(a) Future analysis techniques are unknown. Today you might be using method X, but tomorrow you might want to try method Y which calls for an entirely different massaging of the raw data.

(b) Backtesting and replay. Simulations and replays of previous trading events are valuable not only for testing a model but also testing the particulars of new "infrastructure" code.

Both require high-fidelity source data. This stuff is so hard to come by and costs so much money that most people want to be safe and save it forever.

To your point on analysis: it is rare that one would directly analyze the source data itself. You'll almost always want to transform the data into something more manageable for model development. The crudest form of this is the idea of "bars". Instead of looking at tick data, traders reduce the noise by looking at arbitrary aggregations of that data: 1 minute, 1 hour, 1 day, etc. Technical analysis using moving averages and other indicators is another example.
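For example, collapsing ticks into 1-minute OHLCV bars with pandas -- the synthetic ticks below stand in for a real feed:

```python
import numpy as np
import pandas as pd

# Synthetic ticks at ~1-second spacing stand in for a real market-data feed.
rng = np.random.default_rng(1)
idx = pd.date_range("2010-07-15 09:30", periods=600, freq="s")
ticks = pd.DataFrame({
    "price": 100 + np.cumsum(rng.normal(0, 0.01, 600)),
    "size": rng.integers(100, 500, 600),
}, index=idx)

# Collapse the ticks into 1-minute open/high/low/close/volume bars --
# the crudest form of noise reduction described above.
bars = ticks["price"].resample("1min").ohlc()
bars["volume"] = ticks["size"].resample("1min").sum()
print(bars.head())
```

Ten minutes of ticks become ten rows; moving averages and other indicators are then cheap to compute over `bars` instead of the raw stream.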

Seems like hosting a large hifi dataset in the cloud (eg rolling last 5 years) and charging a small fee to crawl it might be a good opportunity. Or maybe there's no margin in it when people are prepared to pay $$$ as you describe even if they're reinventing the wheel in the process.

Not that I'm a mathematical or economic genius of any kind, but I continue to be surprised at how primitive financial analytics seem. When people do find something interesting (eg Li's Gaussian Copula) they almost invariably make a fetish out of it and hurl themselves off the nearest cliff shortly afterwards. Economics faculties are as much to blame as anyone, I feel.

It is likely that you can't share that tick data--there are redistribution restrictions.

Stock exchange data is closed? I find that rather strange.

It's not closed, but to purchase re-distribution rights is very expensive.

Interesting thoughts. I had an acquaintance years ago who did a type of HFT. He built and ran a team of people for 5 years or so before 'retiring' with his FU money and traveling the world.

With respect to your news about a CEO, what he told me is that any stocks that had news or upcoming news were simply pulled off the table from trading. I always thought that there could be some value in parsing and understanding news, but from his standpoint it simply took too long. His other point is why deal with other variables when you don't have to. Find low news stocks and you find something that is more easily predictable.

That leads me to the next point he made: time. I asked him about using exotic analysis techniques, and again he said most of their algos were variations on regression. Now, he didn't go into how they figured out the variables in their regressions, but he said the reason they stuck with those variations is that they were fast, which let them make decisions before the window of profit closed.

Last I heard, he retired at 30, traveled the world with his wife, got bored, and now runs a hedge fund. I don't know if how he was doing it 10 years ago would apply today.

Cheaper? How? $10/order makes no sense at all if you are ordering 10-20 shares at a time. It makes a lot of sense if your orders are 100,000-200,000 (or more) shares at a time. Individual traders (most often) only do 10-20 shares or so... Check whether your broker has limits on the number of shares per order.

High-frequency algo trading, as presently practiced by the big banks, relies (as others have noted) on advance notification (if only, at times, by milliseconds), which, as an individual, you probably won't get. But that just means you can't use the same algorithms as the banks. You'll want to derive your own algorithms to exploit specific patterns that don't rely on advance word. It's possible, but unless you already have some wealth, it's unlikely you'll be able to design, implement, and tweak your algorithm to profitability in anything like a year. I certainly wouldn't try it as anything less than a full-time occupation... certainly not something that can be done as an 'experiment'.

The only person likely to get rich off that is the broker...

Sadly "that" likely describes far more than HFT. I've been wondering for a while if performance was inversely correlated with portfolio turnover in actively managed portfolios.

Without having proved it out, I am almost certain it is.

And as I think about it, the reason for that might actually be that portfolio turnover mitigates concentration risk if it is not excessive.


Your ping time will be > 20ms. Factor in a bit of latency in the data feed and call it 30ms. Nyquist says you'll get aliasing unless you sample at twice the signal frequency, so the shortest period you can possibly trade on is 60ms. Not at all high frequency.

And I really doubt you'll get a 20ms ping time. I get 100ms to my broker. Can you do mid-frequency trading? Absolutely. But you don't have the money or access to the resources required for high-frequency trading.

Also, $10/trade is VERY high. I'm using interactive brokers, which is $2/side+some costs which are small enough I don't bother accounting for them.
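The Nyquist arithmetic above, spelled out (numbers taken from the comment, not measured):

```python
# Round-trip latency bounds the fastest signal you can react to: to avoid
# aliasing you must sample at twice the signal frequency, so the shortest
# tradable period is twice your effective latency.
ping_ms = 20   # network round trip to the exchange
feed_ms = 10   # extra latency in the data feed
effective_ms = ping_ms + feed_ms
min_period_ms = 2 * effective_ms
print("shortest tradable period: %d ms" % min_period_ms)
```

At a more realistic 100ms ping, the same arithmetic pushes the floor past 200ms -- mid-frequency at best.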

What about low-frequency algorithmic trading? As in, you run a program that looks to make money holding a position for hours to days, so the exact buy price isn't as big a deal. Or does it become too much of a game of chance then?

Trouble is that you still need to guess what kind of market you are in. Such systems often only work in certain trending markets, and you can easily lose what you have gained when the market goes against you.

You should give up... the reason is that HFT costs for the big houses are just a fraction of what you are paying -- less than a penny per trade. You could not hope to gain back in margin what you lose in transaction costs.

High-frequency trading is really meant for people or companies that have their servers on the same network as the exchange they are trading on. True HFers buy and sell in fractions of a second, with some positions lasting seconds or minutes.

I've researched this and tried doing the same, but in the end, trade fees and commissions kill the idea dead. HF trading falls under day trading, which requires that you have the legal minimum of $25,000 in equity on hand to day trade... that is the main thing that killed it for me. Read for yourself: http://en.wikipedia.org/wiki/Day_trading

If you'd like to try it out before you trade live (here comes the shameless plug), you can use the trade simulator I made (buy/sell/sell short) and set your own trade fees and commissions, totally free, at https://algxchange.com

"In other words, he figured out pretty quickly that this is not an algorithm game or speed game - it, like many things on Wall Street - is a 'who you know' game. No matter how smart, how fast, how sophisticated you are as an outsider, the likelihood is that someone on the inside has a similar trade idea and can do it faster and cheaper than you."

For every Wall St. question posted to HN this is the answer. It's a lot of money in somebody else's backyard. The games played in that backyard are always rigged against the outsider.

Check out Interactive Brokers at http://www.interactivebrokers.com/. Their brokerage fees are substantially lower than that.

As others have said, I think focusing on HFT is a bad idea. You are going to be in competition with other players in the market. What are your weaknesses compared to them:

1) You are alone; you are competing with teams mixing 3-4 math/physics PhDs and 3-4 programmers who might implement their algos on the latest GPUs (and this is very time-consuming).

2) You have no experience. You are competing with very smart people that have been playing this game for years.

3) You (probably) have little money. So you want to minimize the cost of historical data acquisition, data storage, data analysis, hardware, colocation …

IMHO, these points pretty much rule out HF trading.

You also have some advantages compared to the competition. One of them is that you have little money, so you can invest in assets that are too illiquid for someone who wants to move a lot of cash.

I think it is possible to trade as an individual; hard but possible. And if you are successful at some non-crazy frequency -- a few days, a few hours, a few minutes -- then, maybe, maybe have a look at HF.

And another thing: if you can, go shopping for a broker with lower fees. $10/trade? It looks as if you did not do basic homework -- but maybe you live in a country where that's the cheapest you can get.

As already said both here and in previous threads: HFT is for big players. It requires:

- hosting on the exchange's premises to cut round-trip delay (we're talking microseconds). Also, very often, you'll be throttled depending on how much you pay. At entry-level fees, you'll be limited to a ridiculous rate such as 20 msgs/sec.

- man-years of development. Basically, when starting from scratch, we develop Direct Market Access gateways (i.e., custom/proprietary access to exchanges) in 3 to 5 man-months (using an in-house framework). And those sit on top of other products that have dedicated teams working full-time for years.

- upkeep: exchanges update their systems once or twice a year. Migrations take anywhere from a few hours to weeks, or require starting from scratch when they rewrite their complete API. If you can't keep up with the updates, you won't be able to trade.

And here, we're only speaking about trading. As highlighted, you also need market data, both retrieval and processing. Then and only then you'll be able to seriously start trading. (and then you'll want to do some back office stuff)

> The trading fee is close to $10/order

Deal breaker, especially for HFT.

HFT is a game of speed and low fees. Not a single individual's broker can offer sufficient speed (microseconds count) and fees (cents count). It is a game for prime brokers and big institutions. Moreover, it is true that some exchanges (like NYSE) offer lower latencies for a serious monthly fee (not one an individual can afford). Thus I don't think one person can solve all these complexities; however, this topic is a good startup field with lots of inefficiencies and unsolved problems.

Algorithmic trading is an option, though it needs careful research on broker setup (InteractiveBrokers should be a good starting point; ThinkOrSwim is another good one; both have rather tolerant programming APIs) and on software for time-series analysis and trading (among the most popular are Marketcetera and WealthLab, plus lots of libraries like Incanter and Weka). On some of these topics the http://elitetrader.com forum should be helpful.

Successful HFT desks are either large broker-dealers or sophisticated hedge funds, both of which are well capitalized, with expensive infrastructure, great locations (close to exchanges to minimize lag), ultra-low trading commissions, etc. Furthermore, only the hedge funds are pure prop groups. Most of the broker-dealer desks act as market makers and aggregators for internal flows (almost free money).

So it's going to be very hard for you to compete with them, even if you come up with better algorithms.

I recommend looking at mid-frequency or low-frequency algorithms, which will offset the impact of your higher trading commissions and don't get affected much by infrastructure and location issues.

And yes, $10 per order is too much unless you're talking really big money. Check out IB, FXCM, etc.

As an individual it is unlikely. Your best bet would be to develop an algorithm, backed by massive amounts of recent historical data, that shows you could hypothetically be profitable. Shop that to trading firms or angel investors and use their capital to get a direct feed. You probably won't be able to do this with a major exchange, but there are lots of other options. If you find a smaller market you can also potentially get a market-maker agreement. This will mean that you have to keep certain positions open at all times, but you can place any orders you want in addition to those, and all of them will have $0 fees.

All that said, I would suggest you stick with vanilla algo trading to start, and leave HFT to the big boys (until you are one).

Your internet-based API is nowhere near fast enough to compete with legit HFT firms. Low-to-medium frequency with longer holding periods, sure, but real HFT via your broker's API? Not gonna happen.

If you're not connecting directly to the exchanges, no.

I like lrm242's comments. I started my own high-freq algo firm back in 2007 and crushed it for a while. In the end, though, the governing dynamics of the marketplace can change very rapidly, so any machine-learning or neural-network models you build can become obsolete fairly quickly. Unless you've developed HAL, you're probably going to have a rough go of it alone... hope this helps!

High-frequency, low-latency trading is a fascinating area. If you do decide to make a go of it and have the algorithm, the colocation site, the relationships, and the money, you might want to work with Maxeler Technologies, http://www.maxeler.com, which has the acceleration technology needed to make it happen.

There is a talk being given by Andrew Sheppard on O'Reilly right now talking about using GPU programming to speed up number crunching, specifically in the financial arena.


I believe the talk will be recorded and made available for later viewing.

A little OT: People here are always talking about trading stocks, anyone dabble in forex?

A caveat emptor on FX trading worth reading: http://tickerforum.org/akcs-www?post=23331

Yeah, there are lots of bucket shops.

The claims about costs are stated in a weird way. The absolute cost of the trade doesn't change based on your leverage, just the cost as a percentage of the margin requirement. I guess they're just trying to say you can blow up your account faster with high leverage.

You can also find brokers that don't do roll over and will even pay you interest on open positions for certain pairs.

I can't even get real time (meaning not delayed by 15 minutes) stock quotes for my startup here in Brazil.

Can't sign up for an IB account?

etrade is about to come out with an API and programmers could use it to do some really cool HF trading.
