The only serious API that isn't a corporate API is the Web. (There are of course some pre-Web Internet APIs that are still around but they're a lot less relevant.)
Write for the Web. Don't write for a platform that belongs to a single vendor. Look what just happened to that cool Face.com API. Yep, even "good guys" can get bought and their third-party developers get screwed.
You just have to make sure the interests of the company that owns the API and your own interests are aligned. Take two examples:
Twitter: as a client developer, your interests are competing -- you want control of the stream, Twitter wants control of the stream. Guess who wins?
PayPal: You want to process payments, they want to make money taking fees for processing payments. That API isn't going anywhere.
I think the first case can be mitigated if there's a "developer account" that you pay for, and I've predicted that Twitter and Netflix will both come out with (paid) developer accounts in a short time. We'll see.
Ok, I apologize for the tone of my previous post, it was pretty flippant.
Yes, I don't think you should even go into business without either:
1) A reasonable expectation of getting screwed (maybe reasonable is less than 1%, maybe it's 50%)
2) Money to pay at least some legal fees
In order to ensure fairness, you have to play some kind of game of attrition http://en.wikipedia.org/wiki/War_of_attrition_(game) against others who would seek to gain by harming you disproportionately. In the U.S., we usually play that game with lawsuits. Perhaps in Russia (I am broadly generalizing and stereotyping) you might play it with bribes and cronies. In Somalia, it might be with actual guns and attrition.
You can enforce a contract against a larger company, size does not have a 1-to-1 correspondence with court case outcomes.
But that's not "making sure" at all - you're still at the mercy of others. Just because you pay them doesn't mean they can't discontinue the product, for example.
Also, PayPal seems to have a horrid history of freezing accounts willy-nilly. Sure, their API might not go anywhere (as long as PayPal doesn't ;) but what's the use when your access to it might go away at any moment?
"Making sure" may have been too strong of a phrase; what I'm talking about is minimizing risk. Sometimes you can form relationships or partnerships with the 3rd party, sometimes you can shovel them money, and at all times you should do your best to make sure you're not using a 3rd party to compete with itself.
I'm not sure I get exactly what you mean by that. But I was thinking about APIs recently, and I'll explain how I think it should be done in the best of all worlds; it may have some relation to this assertion.
First, a note: I don't mean that APIs can actually be like what I describe, nor that businesses should do it this way, nor that it is easy or problem-free. It is just a reference point drawn in the plane, to make distances easier to measure.
When I point my browser to http://someurlfoo.bar/user/A, the server knows I am using a browser, knows who I am, and answers with a mix of HTML, JavaScript and other resources.
Now, if you check almost any available API, there is an equivalent GET http://api.someurlfoo.bar/user/A, and the server will answer with some XML or JSON blob containing the required information (and often more).
Setting authorization, optimization and bandwidth issues aside for five minutes, one could just wonder why the ### those two interfaces are different. Why not read the Accept header and answer with JSON, XML or HTML accordingly, serving apps, direct users and other clients through the same dumbwaiter?
Why not have just one interface?
The benefits would be interesting:
- user A would be able to present itself ("A, the name's A") in different languages: HTML, XML, JSON, whatever.
- the information about a given topic would have the same content in every language. Just take the data from the store and filter it through a JSON-izer, an XML-izer or an HTML-izer.
- API client developers would not have to read awful API docs that are obviously written by robots for robots: they would just point their browser to the data they want, change the Accept header to JSON, and get their answer (a rough sketch of this follows).
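To make that concrete, here is a minimal sketch of the single-interface idea, assuming a Flask app and a made-up in-memory user store; the route and field names are invented for illustration, not taken from any real API:

    # Content negotiation sketch: one URL, one data store, several representations.
    from flask import Flask, request, jsonify, render_template_string

    app = Flask(__name__)

    # Pretend user store; in reality this would be a database.
    USERS = {"A": {"name": "A", "followers": 42}}

    @app.route("/user/<user_id>")
    def user(user_id):
        data = USERS[user_id]
        # The Accept header picks the representation.
        best = request.accept_mimetypes.best_match(
            ["application/json", "text/html"], default="text/html")
        if best == "application/json":
            return jsonify(data)  # for apps and other robots
        return render_template_string(  # for browsers and direct users
            "<h1>{{ name }}</h1><p>{{ followers }} followers</p>", **data)

A plain browser visit to /user/A would get the HTML, while something like curl -H "Accept: application/json" http://localhost:5000/user/A would get the JSON blob.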
It seems to me that, because APIs historically grew up after normal web pages, they have been conceived as a separate service, when they are really the same core thing.
Now for some possible objections:
- Authorization: this should be handled the same way, but the server may need to know which mobile app or web app you are coming from. That is easy to do with double authentication: the app authenticates itself and passes along the end user's authentication token. This way the server knows who it is speaking to, and through which intermediary (a rough sketch follows after this list).
- Bandwidth: an important use for APIs nowadays, from the server's point of view, is the ability to avoid being flooded by bogus requests. An API key comes with limits on the maximum number of requests per day or the like. If both the end user and the client app authenticate themselves, then there should be no difficulty in giving priority to some clients over others (usually direct browser access is preferred).
- Optimization: this is maybe a bigger issue. The APIs I have seen are carefully crafted to reduce the number of round trips required. For instance, a GET request for user/A might answer with A's name, number of followers, list of recent followers, same for followings, A's recent posts, creation date, last dog-walking time and culinary preferences. All of this makes no sense if all you need is A's full name. I think this optimization issue should be solved differently, even if we were to keep the current isolated status of APIs, but that is for another post.
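As a rough sketch of the double-authentication idea, here is what a client call might look like from the app's side; the header names and token are invented for illustration, and I'm assuming the Python requests library:

    # Hypothetical client call: the app identifies itself with its own key and
    # forwards the end user's token, so the server sees both identities and can
    # rate-limit or prioritize per app, per user, or both.
    import requests

    resp = requests.get(
        "http://someurlfoo.bar/user/A",
        headers={
            "Accept": "application/json",          # same URL the browser uses
            "X-App-Key": "my-registered-app",      # authenticates the client app
            "Authorization": "Bearer USER_TOKEN",  # authenticates the end user
        },
    )
    print(resp.json())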
This has been tried; the big idea of XHTML+XSL was that your server would answer any request with an XML API response plus some XSLT that transformed it into a webpage. Now, admittedly, part of the reason that failed is that XSLT is a pile of crap, but even if it hadn't been, the approach is wholly unsuitable for building modern, responsive UIs. The behavior you want from an API - consistency, separation of concerns - is completely at odds with what you want on a page, and there are a whole lot of concerns you want to handle server-side (e.g. pagination) that don't make sense for APIs. It's not that no one's thought of this; it's that the current approach, where you have a thin but distinct server-side presentation layer (or even client-side JavaScript - the point is that you need a full-featured program to go from API calls to good UI), works better in practice.
That works (though you might have trouble with older browsers), but you need to treat it as a "real" program, and you wouldn't want to serve it from the same endpoint as your API using accept headers like you're suggesting. And it's not really any different from the current way of doing things; it's always been good practice to build your web tier on top of your API.
"I hope if anyone is getting started here, in bootstrapping an open alternative to Twitter, that they will be compatible with these two pieces of software." He's talking about his software (that nobody uses), and that's the agenda behind the piece. Just in case the "open Twitter" takes off, while he has no intention of working on it, he wants to inject the idea that developers have some kind of moral responsibility to make it compatible with his stuff. Oh, and because he forgot it this time, I'll put this in: he thinks he invented RSS.
“Eventually Tim came around, and gave me credit for making RSS happen. Thanks.” [1]
“a long-standing issue I have with Wikipedia, that I don’t get any mention in articles on [...] RSS [...] people who want to know about blogging or RSS get the idea that other people did the work and took the risks that I did.” [2]
Seems to be along the lines of what Simmons is advocating. May not be the ideal Twitter replacement, as it relies on servers (rather than being decentralized like email) and is written in PHP.
But email does rely on servers; it's just that the servers are distributed. And so are StatusNet ones: I can run my own server and subscribe to some feed on yours and they'll talk directly without relying on any other.
And all the protocols are open: Atom, PubSubHubbub, Salmon, etc.
I defer to you; I just tried to figure out how the system worked by looking at a few websites, and don't really know anything about it. If each user can run his own server then that's more interesting.
Twitter is traveling a road that the industry has already been down before, and as an outside observer, it would appear that they've failed to learn from history.
AIM, MSN, and the other IM networks gradually attempted to generate revenue by adding advertisements to their official clients, and additionally, did not provide clients for operating systems other than Windows (and sometimes Mac OS/Mac OS X).
In addition to that, there was a significant user demand for IM clients that could connect to multiple networks at once, which was strictly contrary to the business goals of the commercial IM networks.
None of the IM network protocols were public, but this didn't stop developers from reverse engineering the IM protocols and producing alternative clients, often to the chagrin and frustration of the IM network providers. There was, for a time, a small arms race between AOL and implementors of unofficial AIM clients, but invariably the 3rd party developers won. This despite the fact that the protocols were complex binary-only affairs. The modern reliance on HTTP and self-describing encodings (eg, JSON) is a significant difference from that time, and makes reverse engineering of closed protocols vastly easier.
This arms race also wasn't constrained to open-source; there were a number of shareware third-party IM clients.
If Twitter continues to lock down their APIs for 3rd party clients, while simultaneously 'monetizing' their own client and almost invariably making it less attractive to users, then it is almost certain that we'll see the emergence of clients using the unsupported API, especially on the desktop, and most likely on mobile. My biggest concern on mobile is whether Apple, at the behest of Twitter, would pull unofficial clients from the App Store.
This already happened to Pandora, who is engaged in a minor arms race with third-party developers. Pandora provides a poor Air-based desktop client, resulting in a number of native clients being produced for Linux, Mac OS X, and Windows[1]. Pandora occasionally produces updates that break these clients, and the clients are then updated to fix the issue.
"Why is it a bad idea to develop on the Twitter API, but a good idea to develop on Apple's? [...] Apple is far bigger therefore far more dangerous than Twitter"
...I think, and sorry in advance for the pun, that's an apples to oranges comparison there.
Firstly, the author already says he's more comfortable with companies like Dropbox where the 'customer is the customer', and not the product. Most people would agree that's true of Apple.
But more importantly, there's a huge difference between a web-based API and an OS-level API. I can think of many examples where companies have pulled their REST APIs overnight or on short notice (Face.com being the most recent). A company like Twitter or LinkedIn can kill an app that's using their APIs instantly.
Now, you could argue that Apple can do this through the app store, but I think that's a rather different discussion. Apple's motivations and Twitter's motivations are very different here. Apple will, in fact, pay you to use their APIs if you choose to charge for your apps.
Some people will say it's a 'good' idea to work with Apple, others will say it's 'bad'. Whatever your opinion, I'm still not totally convinced you can really compare Apple with Twitter in this situation.
I wrote about this in a recent blog post: "App Stores are the new record labels"
As much of software is about disintermediation, of making the world run more smoothly through removal of middlemen, it is interesting that we software developers are now driving ourselves to a world full of middlemen. A world where we suddenly have to ask for a permission to do something new.
In a world where everything must go through the rules and regulations of an app store without any oversight, we, the developers, will suddenly be in the same abused position as artists are with their labels. We take all the risk and put in all the effort of building software for our users. The middleman can then invalidate all our hard work by arbitrarily making it impossible for their ecosystem to run the app. And even if they do accept the software, they'll take a hefty cut of the proceeds.
How can this make sense to an independent developer?
It's worrying, because with little freedom comes little room to innovate. The web as we know it has become the huge, fast-moving, innovative industry that we know it to be thanks to the freedom it has.
Anything that gets the words "twit" and "corporate" in the same sentence gets my vote.
Twitter is a user-driven content system, and closing and limiting the API the way they are doing potentially hints that they're going to somehow charge for its use down the line.
Google Maps' API was completely free at one stage; they introduced charges and then reduced those charges. Twitter is trying a different approach - not saying their approach is better, as I personally believe it is not. But they're certainly trying something that will eventually boil down to MONEY.
Long term isn't the only kind that exists. Short and mid terms are fine as well, as long as you find a way to monetize fast and early. Just be aware that the party may be over at any moment and, when it is, exit gracefully and before the crowd starts yelling for the already absent DJ to play just one more song. Use that time to find a different party to join, one that's just starting.
When email dawned with the Internet, nobody ran into these problems because email communication didn't care which client you were on. The communication happened irrespective of where you were.
The same initial mechanism of email communication still persists today, except perhaps someone innovated a little to make group communication a little easier (the listserv, etc.).
We have come far in our journey, but people need to realize that no matter the innovation through Facebook, Twitter, etc., the truth is that underneath all the glittering services, the basic communication need is still the same as it was in the initial email days.
So there should be multiple Twitters and multiple Facebooks, and the average user shouldn't care which Twitter or which Facebook you are on, as long as you can communicate easily. The problem is that people are not realizing this basic fact; hence we have created one walled garden after another.
I don't understand why Dave objects to Feedburner in the "2a" point of his post. He seems to be saying that relying on Feedburner as a corporate API can or will burn you if Feedburner changes their policies.
But Feedburner lets you serve your feeds off your own domain (e.g. feeds.mattcutts.com) using a CNAME in DNS. Here's an article from 2007 that talked about how Feedburner made that functionality free after Google bought Feedburner: http://www.mydigitallife.info/how-to-activate-and-use-feedbu...
So if Feedburner ever makes you angry or disappoints you, people are still coming to your domain to get the feeds, and you can just swap Feedburner out with a different system. As far as commercial APIs go, that seems pretty developer-friendly.
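If you ever want to check where such a feed domain actually points, a quick sketch with the dnspython library (an assumption on my part; any DNS lookup tool would do) might look like this, using the feeds.mattcutts.com domain mentioned above:

    # Inspect the CNAME that fronts Feedburner (dnspython >= 2.0).
    import dns.resolver

    for rdata in dns.resolver.resolve("feeds.mattcutts.com", "CNAME"):
        # If this prints a feedburner host, the feed is served via Feedburner,
        # but subscribers point at your own domain, so you can swap it out later.
        print(rdata.target)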
Of course you have to do that long before Google breaks your feed. I don't really object to Feedburner in this piece, btw, I just say people are going to regret it. In my opinion (clearly labeled as such). And of course this is your opinion (not so clearly labeled, esp since you work at Google, assuming you still do).
Yup, I still work at Google. I set up feeds.mattcutts.com years ago, and my hope is that anyone who cares enough to configure Feedburner in the first place would click around to find the MyBrand feature and set it up.
I'm not trying to take away from your fundamental point about corporate APIs, by the way. At this point, any time I use a product/feature I'm thinking about how I would export my data. If the answer on data liberation isn't what I hoped, I often avoid that product.
Twitter simply doesn't make sense as a business -- not for other companies, and not even for Twitter itself. But it would be fantastic as a protocol, alongside SMTP and such. I realise that this is far easier said than done -- but what would it actually take to do it?
It's already done: StatusNet uses two core protocols for sending messages around - Atom + PubSubHubbub - and a few extra for describing the messages: Salmon + ActivityStreams.
These are all open, distributed and already power a Twitter-like social network.
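As a rough illustration of how open those pieces are, a subscriber could discover a feed's PubSubHubbub hub with a few lines of Python; the feed URL below is made up, and I'm assuming the feedparser library:

    # Hub discovery sketch: a PubSubHubbub subscriber looks for rel="hub" links
    # in the Atom feed, then sends its subscription request to that hub.
    import feedparser

    feed = feedparser.parse("http://statusnet.example/api/statuses/user_timeline/alice.atom")
    hubs = [link.get("href") for link in feed.feed.links if link.get("rel") == "hub"]
    topic = [link.get("href") for link in feed.feed.links if link.get("rel") == "self"]
    print("hub:", hubs, "topic:", topic)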
Really interesting -- I hadn't heard of StatusNet/oStatus. Looks like it might have the right idea -- but after several minutes of searching, I'm unable to find a single client or a public frontend (as in, I don't have to give away my personal information to view it) for the network. Why do you suppose this is?
If you built applications for IBM VM/CMS or GECOS systems, you used proprietary APIs. If you built for VMS, DG/UX or HP-UX, you used proprietary APIs. Same for applications built for Windows or OS X.
Yes, open protocols and APIs and standards are preferable and "open" is great. In theory. In practice, things always get messy. "Open" doesn't solve many of the problems that the customers (the folks with the money) have.
Linux - while not proprietary - has its own hassles with application portability. Try deploying with GNU autoconf/configure anywhere other than a GNU/bash system, for instance. Linux is inherently very open, but can also be surprisingly constrained.
For larger and more complex and longer-lived applications, the only viable long-term approach I can see involves a mix of open (eg: writing most of the code in portable C, Python or Lua or...) where that's possible (such as in the application kernel), and then using vendor interfaces where it's not.
Certainly presenting web interfaces or connecting to Twitter where that's appropriate.
For the smaller and "throw-away" applications, or when specific customer requirements such as gonzo-level I/O or graphics performance are involved (to have a product that's viable for, you know, your customers), you're using the vendor APIs; you're writing non-portable code. The whole discussion of "open" is moot for these applications. (And I've seen many 4GL application generators come and go, too; you can be portable and chained to one vendor at the same time.)
Application portability and controlling your stack is a very old topic of debate, and one filled with trade-offs. One of the trade-offs a commercial developer makes involves determining what customers will pay for. And with most customers, "open" just isn't a priority.
This is a very old discussion and debate. Look at your history. And more importantly, look at your customers' needs and expectations.
The debate over corporate APIs is not new. That's not Dave's point. A lot of people are just starting to think of Twitter in the same light, though, and that is definitely new.
He prefers companies like Dropbox where the customer is the customer, yet he's skeptical of Soundcloud -- Soundcloud's content creators are the customer too -- they have paid plans. As long as serious users are paying and developers are using their API to make it easier for content creators to distribute their content, Soundcloud remains distinctly different from Twitter. I happen to think Soundcloud is fantastic; their freemium model allows for user growth, and those users can upgrade when their needs increase. Soundcloud is much closer to Dropbox than they are to Twitter; the author seems to be ignorant about Soundcloud's model despite him being so cool as to have had a conversation with the CEO.
Dropbox monetizes input through storage subscriptions, and Twitter is trying to monetize output through promotions and (generally) advertising.
Soundcloud is kind of in the middle: they are monetizing storage, but positioning themselves as a content provider, particularly illustrated by their recent upshift in design and focus on sharing features.
This reminds me of Google shutting down a lot of their APIs, starting to charge for some, or limiting usage with ridiculously low thresholds (1,500 geocoder calls per day, for example). Larry P. called this a consolidation and a re-focus...
I agree with the general sentiment of the article.
On a somewhat related topic: is there an alternative to Amazon's affiliate API?
I've built a project that uses it, but I don't want commission-based product sales to be limited to just Amazon in case they decide some day down the road that the API is no longer worth it.
APIs are never perfect, and they are only there to sustain ecosystems beneficial to the company supporting them. They will inevitably change, for better or worse, but everything changes. Just don't rely entirely upon things that are not constant, or at least have a backup plan.
I haven't seen a web API that I really like... ever.
Remember del.icio.us? Their API was a roach motel, aimed to get data into del.icio.us, but offering no real access to the huge pool of shared bookmarks inside.
Any commercial API takes more than it gives, that's a fact.
The Web is the platform.