
How do you plan to get around the Twitter syndication rules and regulations once this site takes off?



This is a really good question, and it applies to all APIs and their limits. This was my approach: create a parser that lives off the APIs, staying just under the posted limits. Although we had a few thousand visitors today, there was only one direct line to each data source. A viewer server fetches data every so often depending on your desired speed (I think it's set to somewhere between 1 and 2 seconds) and broadcasts it to clients. So if the parser goes down, or tags change, or a service gets mad, you can fix and restart it without requiring clients to refresh. I'm still not in love with the solution, but it worked nicely today. Thanks!
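To make the decoupling concrete, here is a minimal sketch (not the author's actual code) of the pattern described above: one background poller keeps the single direct line to the data source, staying under the rate limit and caching the latest payload, while a small server hands that cached payload to any number of clients. The upstream URL, poll interval, and port are illustrative assumptions.

    # Minimal sketch in Python, stdlib only; all names are hypothetical.
    import threading
    import time
    import urllib.request
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    UPSTREAM_URL = "https://example.com/api/latest"  # hypothetical data source
    POLL_INTERVAL_SECONDS = 1.5  # "somewhere between 1 and 2 seconds"

    latest_payload = b"{}"        # shared cache of the most recent response
    cache_lock = threading.Lock()

    def poller() -> None:
        """The single direct line to the data source; clients never hit the API."""
        global latest_payload
        while True:
            try:
                with urllib.request.urlopen(UPSTREAM_URL, timeout=5) as resp:
                    body = resp.read()
                with cache_lock:
                    latest_payload = body
            except Exception:
                # If the upstream breaks (tags change, service gets mad), the
                # cached payload keeps serving; fix and restart the poller only.
                pass
            time.sleep(POLL_INTERVAL_SECONDS)

    class ViewerHandler(BaseHTTPRequestHandler):
        def do_GET(self) -> None:
            with cache_lock:
                body = latest_payload
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        threading.Thread(target=poller, daemon=True).start()
        ThreadingHTTPServer(("", 8080), ViewerHandler).serve_forever()

Because clients only ever read the cache, a few thousand visitors still translate to one upstream request every poll interval, and a poller restart never forces a client refresh.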


Thanks for the explanation. I worked for one of the eleven companies that still had rights to access the Firehose after the debacle, and I saw a lot of great projects stymied by the new rules.



