
Don't Build APIs - atularora
http://ceklog.kindel.com/2012/04/18/dont-build-apis/
======
cek
OP here.

When I wrote this post, I didn't imagine so many people would react by saying,
essentially

"Screw the customers. Just break 'em."

One of the reasons I have my panties in a wad over this topic is that the Web
has
made it far easier for people to create APIs that get used by others. Back in
the olden-days you HAD to be Microsoft or similar to get the kind of traction
a kid in his parents' basement can get with a little Python and a Heroku
account.

But something else has changed along the way: Composability.

Yes, I worked on COM, OLE, ActiveX and all that crap. I was thinking deep
thoughts about composable software back in the stone ages (and even then, it
had all already been done by others smarter than I in the 70s & 80s).

But today it is REAL. The Web technology stack has actually, finally, enabled
massive distributed systems composed of loosely coupled components from
multiple independent parties! The dream of my computing youth has become
reality!

These things are connected by APIs.

(Which, by the way, are not just function calls. An API can be a REST endpoint,
a file format, or a database schema, amongst other things.)

Yes, you as an API provider can choose to deprecate your API anytime you want.
Use kill dates, warning messages, etc... You can even create new, independent &
parallel versions. It will help.

But you will find that someone else has built something that uses your API,
that exposes its OWN API and now there's a 3rd party indirectly dependent on
you. Be prepared for it.

APIs are actually easy to build. That is the problem. The solution is to
realize that statement is actually false.

~~~
ssanders82
OP, I really don't want to be that guy, and I don't know why I'm being that
guy tonight, but regarding your sentence, "I then spent hours pouring over..."
- come on man, it's poring.

~~~
cek
I don't mind you being that guy at all.

I consider myself to be an ok writer with a reasonable vocabulary.

I never knew it wasn't pouring.

Consider me educated. Thanks. :-)

~~~
alexchamberlain
Poring and pouring have different definitions: pouring is what you do with a
liquid, and poring is what you do over a document.

------
tmurray
This post is very accurate. I build APIs for a living (CUDA), and this lines
up pretty well with my experience. Writing APIs is very tough, you will get a
lot of things wrong, and the fixes available to you after you realize your
mistake are all ugly at best.

One quick example:

In CUDA, you have to explicitly copy memory to and from the GPU. We have two
basic kinds of memcpy functions--synchronous and asynchronous. Asynchronous
requires some additional parameter validation because the GPU has to be able
to DMA that particular piece of memory, etc. After we had been shipping this
for a release or two, we noticed that our parameter checking for the
asynchronous call was missing one very particular corner case and would
silently fall back to synchronous copies instead of returning an error. We
thought, okay, let's just fix that by returning an error because surely no one
managed to hit this.

Absolute carnage. Tons of applications broke. This particular case was being
used _everywhere_. It provided no benefit whatsoever in terms of speed; in
fact, it was just a more verbose way to write a standard synchronous memcpy.
People did it anyway because... they thought it must be faster because it had
async in the name? I don't know.

In the end, we made the asynchronous functions silently fall back to
synchronous memcpys in all cases when the stricter parameter validation
failed.
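The general hazard here can be sketched outside of CUDA. The Python toy below
(all names hypothetical, not the real CUDA API) models an "async" call that
silently degrades to a synchronous copy when its parameters don't qualify;
once callers come to depend on that degradation, tightening the validation
breaks them:

```python
# Hypothetical sketch (not the real CUDA API) of the hazard described
# above: a permissive API that silently degrades cannot later be made
# strict without breaking callers who relied on the degradation.

class CopyError(Exception):
    pass

def copy_async_v1(src, dst, pinned):
    """Original behavior: silently fall back to a synchronous copy when
    the memory cannot be DMA'd (modeled here by the `pinned` flag)."""
    if not pinned:
        return "sync"          # silent fallback -- callers never notice
    return "async"

def copy_async_v2(src, dst, pinned):
    """'Fixed' behavior: reject the corner case with an error."""
    if not pinned:
        raise CopyError("memory is not DMA-capable")  # breaks old callers
    return "async"

# A caller written against v1 "works" -- just not asynchronously:
assert copy_async_v1(b"a", bytearray(1), pinned=False) == "sync"

# The identical call against v2 is now carnage:
try:
    copy_async_v2(b"a", bytearray(1), pinned=False)
except CopyError:
    print("existing application just broke")
```

Which is presumably why the final resolution was to make the silent fallback
the documented behavior rather than the bug.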

~~~
bwood
Thanks for your work on CUDA, it's really a great tool! My one hope is that
Nvidia decides to make it a direct competitor to OpenCL by allowing it to
target different platforms (though I recognize that it's not completely up to
Nvidia and requires cooperation from others).

~~~
exDM69
What would be the value in competing with OpenCL? CUDA and OpenCL ship pretty
much an identical feature set with small changes in the API. The biggest
difference is that OpenCL requires you to use buffers+offsets where CUDA
allows "pointers" to GPU memory.

It's nice that Nvidia has CUDA so they can go ahead and expose functionality
in new GPUs without having to (first) deal with OpenCL standardization.
However, for the long term, it would be better if we'd stick to OpenCL so at
least parts of source code can be shared between CPUs and GPUs of different
vendors.

~~~
bwood
Competition is a good thing, and CUDA isn't a direct competitor to OpenCL
because it only targets Nvidia's platform. OpenCL is obviously the long-term
winner because nobody in their right mind would want to lock themselves into a
single vendor. I'm saying that I would like Nvidia to challenge that, which
will drive both CUDA and OpenCL to become better.

------
sophacles
At some point, we need to kill the myth of backwards compatibility. It causes
more problems than it fixes. Further, at this point in history, app updates
are trivial and built into everything, so retaining backwards compatibility is
not so much of a necessity.

When designing APIs, use versions and have a kill date in place. Even if you
don't change the API, release the same one under a new version number. Kill
access to the old version on the kill date. Keep N versions accessible at a
time, to reduce the burden on app writers, but don't slack on the kill date.
This will give you a timeline and procedure to avoid hacking in crazy
backwards compatibility, and targets for total rewrites.
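A minimal sketch of this scheme, with illustrative version names and dates
(nothing here comes from a real framework): every version carries a kill date,
and requests to a dead version are refused outright rather than quietly
supported.

```python
# Illustrative kill-date version gate: each API version has a hard
# retirement date; past it, the version is gone, not degraded.

from datetime import date

KILL_DATES = {
    "v1": date(2012, 1, 1),   # already dead
    "v2": date(2013, 6, 1),
    "v3": date(2014, 6, 1),   # same API re-released under a new number
}

def dispatch(version, today=None):
    today = today or date.today()
    kill = KILL_DATES.get(version)
    if kill is None:
        return (404, "unknown version")
    if today >= kill:
        return (410, f"{version} was retired on {kill.isoformat()}")
    return (200, f"handled by {version}")

# As of mid-2012, v1 is dead and v2/v3 are live:
assert dispatch("v1", today=date(2012, 6, 1))[0] == 410
assert dispatch("v2", today=date(2012, 6, 1))[0] == 200
```

The point of re-releasing an unchanged API under a new version number is that
the rollover machinery gets exercised on a schedule, not only in emergencies.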

Yes, people will still complain. It's OK though if you provide a reasonable
balance.

~~~
mattmanser
Myth? MS built a very successful business on it.

You know what happens when you get kill dates? One day, all of a sudden, half
the web will stop working. There's a reason people bend over backwards to
support out-of-date calls.

Customers don't care why your software just broke or whose fault it was; all
they care about is that it broke.

~~~
georgemcbay
I'm actually quite fond of the way Microsoft historically bent over backwards
for backward compatibility, but OTOH Facebook is also very successful and has
a very successful ecosystem of third-party users of their APIs. Facebook is
like the anti-Microsoft in that they randomly change and break their APIs at
such an alarming rate that you have to wonder if they are intentionally
fucking with you if you code against their APIs.

Obviously the needs for a web API and the needs for an API with a specific
binary ABI on a local OS are quite different but I think for either
environment, most API developers can comfortably fit somewhere between those
two extremes where they don't continue to support inherently unsupported API
usage, but they don't break something randomly every week.

~~~
planckscnst
I think developers tend to love backward compatibility while users hate it.

As a developer, it means we don't have to keep digging out the old projects
and porting them to the new API.

As a user, it means we are left with systems that carry many years of cruft,
and with features being held back because they couldn't be made
backward-compatible.

I think most users are willing to pay any higher price that will result from
developers being required to work harder to port their software.

~~~
anonymoushn
As a developer, I would rather have my code break than continue to use a
terrible API.

~~~
dvhh
How about having to fix your application because the terrible API it depends
on was deprecated, for an application that is quite old but still under
support? (And for free, of course, because clients don't expect to pay for a
compatibility fix.)

------
btilly
The key problem here is that "backwards compatibility" means "backwards
compatibility for people abusing the API in undocumented ways".

Don't do that. That way lies insanity. Be very, very clear up front that you
_will_ break backwards compatibility for those folks. Don't sweep it under the
table, be very vocal about having done it. There will be short term pain as
important customers (eg Adobe) learn the hard way that you really mean it. And
long term relief as you don't have that legacy headache growing so quickly.

One estimate is that a 25% increase in requirements results in a 100% increase
in software complexity. (See _Facts and Fallacies of Software Engineering_ for
a source.) That means that the complexity grows as the number of requirements
cubed. Therefore the key to a good API is that it be short and simple. When
you start to add implicit requirements on top of explicit ones, it quickly
becomes long and complex, and the necessary software to support it and make
future modifications becomes much worse.
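The cited figure does imply a roughly cubic law; the quick check below works
out the exponent (the 25%/100% numbers are from the estimate above, the rest
is just arithmetic):

```python
# If a 25% increase in requirements doubles complexity, then complexity
# ~ requirements**k where 1.25**k = 2, so k = ln 2 / ln 1.25 -- roughly 3.

import math

k = math.log(2) / math.log(1.25)
print(round(k, 2))         # prints 3.11 -- close to cubic

# Sanity check: cubing the requirements roughly doubles complexity.
assert 1.9 < 1.25 ** 3 < 2.0
```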

This does not mean that designing APIs is not hard. But don't let your API
become what is published and quirks that are not. Just don't.

~~~
cek
OP here.

I think this is very naive/utopian.

Yes, be super focused in your design.

Yes, only expose APIs you have clear use cases for.

Yes, keep the surface area as small as possible.

Yes, have very focused requirements.

Yes, document the hell out of things.

Yes, implement strong parameter validation and other things to try to reduce
the chance people do bad things.

Do all these things and more (these are all part of what makes exposing APIs
hard work, that many people don't do).

But do not, for one second, believe that someone still won't do it wrong or
abuse your perfectly designed API eventually... especially if it is
successful.

~~~
btilly
I agree with all of what you just said. But I think that the post is not a
good demonstration of that.

Microsoft failed at keeping the surface area as small as possible. When their
surface area expanded to a hack to deal with Adobe's hack of replacing code
behind Microsoft's back, they went into very dangerous territory.

~~~
jbri
If an end-user upgrades their operating system, and Adobe WhateverIUse stops
working, who do they blame?

~~~
mtts
It's not as simple as that. If it's just Adobe WhateverIUse that breaks, sure,
people will blame Adobe. But if it's Adobe WhateverIUse that breaks AND
WordPerfect WhateverSomeoneElseUses and maybe also Borland
SomethingSomeoneElseYetAgain uses word will get around that the new operating
system "breaks things" and people will stay away from it.

Even if no one will ever use all three of the broken apps.

In this sense, maintaining backwards compatibility is simply a matter of
reputation management.

~~~
masklinn
> If it's just Adobe WhateverIUse that breaks, sure, people will blame Adobe.

Not a chance. People will blame Microsoft, because WhateverIUse worked
perfectly before. The only parameter which changed is that they updated the
OS, therefore the OS is to blame.

------
3pt14159
Upvoted because I find it interesting, not because I agree with it.

FreshBooks has a very _significant_ amount of their usage/profit from their
API. These are just their endorsed/vetted add-ons, let alone all the ones out
there in the wild: <http://community.freshbooks.com/addons/?header_addons=1>
and they clearly built an app AND an API.

The API for FreshBooks was a major portion of their (very successful)
strategy, so I can't see why people can't do both, provided they do it
_intelligently_.

------
blantonl
This post addresses many of the issues we've dealt with at RadioReference.com.
We version our primary APIs which has worked very well, but occasionally we
have to abstract and write translations for backwards compatibility. We also
need at times to deprecate versions and features.

But, this quote in the post is important: "_When exposing APIs be absolutely
certain the value you get from doing so is worth it._"

In our case, our APIs are a significant revenue driver and are worth it. Don't
let OP discourage you from exposing your platform's data via APIs. Instead,
let this post warn you what to look out for when exposing APIs.

------
stickfigure
This boils down to a simple question:

Are most of your API users ahead of you, or behind you?

If you have an immature API with a smallish number of users, and you think
that incompatible changes in the API will improve adoption with the vast bulk
of your (not-yet-on-board) potential market, then go ahead, break the API.

If the only reason people use your API is because of its legacy -- if you
think that by changing your API people will wake up from their inertial
slumber and investigate your superior competitors, then don't break your API
under any circumstances.

Obviously there are grey points in between, and a series of small breaking
changes will be worse than occasional large breaking changes. But by and
large, the success of your platform will depend on its utility, not legacy
compatibility. This is why stripe and wepay will eventually conquer paypal,
why python will continue to be a vibrant community, why java will eventually
fade to cobol-like obscurity, and why Microsoft Windows is fundamentally
doomed. It's hard to look backwards and forwards at the same time.

------
jwwest
This reminds me of a Joel Spolsky post from years ago:

<http://www.joelonsoftware.com/articles/APIWar.html>

Essentially, Microsoft shoots itself in the foot by trying to stay extremely
backwards compatible, even to the detriment of making their products better.

------
instakill
Public APIs are forever – you get one chance to get it right. Unless nobody's
heard of your API and thus nobody uses it, in which case you're safe, but also
screwed.

------
asif
Build the best API you know how to build and ask questions later. If your
business is doing well, then supporting your API forever is a high quality
problem.

~~~
gfodor
This is often good advice, but not in this case. Versioning your API is a
trivial thing to do from the beginning and has _massive_ consequences down the
road. Some things _are_ worth getting right in the beginning.

------
hartror
My biggest issue with web APIs is that the speed at which they change means
documentation often goes unmaintained. The speed of change compounds the
effect: discussions of API usage that are only months old have lost relevance
in non-obvious ways.

Facebook and Paypal are or have been guilty of this in the past though
Facebook is a lot better than it was.

------
tvaughan
The OP's enemy is not APIs, but rather proprietary software.

------
ilaksh
Open source APIs and libraries improve this situation to some degree.

At least you wouldn't be stuck debugging assembly code.

------
paulhauggis
Netsuite has a different url for each new version of their API (and they keep
the old one live as well).

This was great for me because my code never broke, which was important because
it was running the back-end of an e-commerce site. It gave me more than enough
time to upgrade when I wanted bug fixes/features.
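The scheme described here is easy to picture as a routing table; this sketch
uses made-up endpoint and handler names (nothing from the actual Netsuite
API):

```python
# Illustrative URL-per-version routing: each version lives at its own
# URL, and old versions stay live, so clients upgrade on their own schedule.

def orders_v1(request):
    return {"orders": request["items"]}                  # original shape

def orders_v2(request):
    return {"orders": request["items"],
            "count": len(request["items"])}              # new field added

ROUTES = {
    "/api/v1/orders": orders_v1,   # kept live after v2 ships
    "/api/v2/orders": orders_v2,
}

def handle(path, request):
    handler = ROUTES.get(path)
    if handler is None:
        return {"error": "no such endpoint"}
    return handler(request)

# A client pinned to v1 keeps working unchanged after v2 ships:
assert handle("/api/v1/orders", {"items": ["a"]}) == {"orders": ["a"]}
```

The cost, of course, is that every live version is code you must keep
maintaining, which is where kill dates come back into the picture.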

~~~
ageektrapped
Yes! This wasn't what the OP was referring to, though: he had shipped,
shrink-wrapped software with APIs in mind. But still, you're totally right,
and this occurred to me as I was reading the post, for web APIs: keep
versioning the URL for every shipped revision of your API. You'll have to
document like crazy, of course. You can even detect stragglers on old versions
and help them get on the new version.

