
How I Took an API Side Project to 250M Daily Requests - coderholic
https://blog.ipinfo.io/api-side-project-to-250-million-requests-with-0-marketing-budget-bb0de01c01f6
======
Tloewald
I'd like to point out that the things he says he's doing instead of marketing
are, in fact, marketing. It's "guerrilla" marketing, and it's being paid for
with the writer's time. Nothing wrong with that, just don't confuse
"marketing" with "advertising".

------
smokybay
The author does not say how much maintaining the service costs or what the
long-term plan is. As others have already mentioned, a similar service existed
before and ended for a simple reason: there was no point in maintaining it at
a constant loss with no clear revenue plan.

[https://news.ycombinator.com/item?id=11010856](https://news.ycombinator.com/item?id=11010856)

~~~
coderholic
Oh it generates revenue, and we have a lot of big customers! See
[https://ipinfo.io/about](https://ipinfo.io/about)
[https://ipinfo.io/customers](https://ipinfo.io/customers) and
[https://ipinfo.io/pricing](https://ipinfo.io/pricing)

~~~
sscarduzio
Yes but what's your revenue? Please consider going transparent (i.e.
[http://www.transparentstartups.com/](http://www.transparentstartups.com/))
and making an interview with one of my fav websites indiehackers.com

~~~
packetslave
I'm confused why you think you're entitled to this information?

~~~
blackoil
Revenue and financial metrics really get eyeballs, as this info is rarely
available for early-stage companies. Balsamiq used this very well: early on
they released monthly financial status blog posts, which almost all managed to
reach the front page/top of HN. So take it as another guerrilla marketing
channel.

~~~
always_good
Though the difference between Balsamiq and an IP address lookup service is
that HNers don't think they can make Balsamiq in a weekend.

------
westoque
Good strategy! That's also what I did to get my side project (Cookie
Inspector, a Google Chrome cookie editor) to 80,000 daily users.

[https://chrome.google.com/webstore/detail/cookie-inspector/j...](https://chrome.google.com/webstore/detail/cookie-inspector/jgbbilmfbammlbbhmmgaagdkbkepnijn?hl=en)

I marketed it solely on Stack Overflow, where it kept getting upvotes; that
was all my marketing.

[https://superuser.com/questions/244062/how-do-i-view-add-or-...](https://superuser.com/questions/244062/how-do-i-view-add-or-edit-cookies-in-google-chrome)

Good reviews are also a big factor. When users like your project/product,
they will market it for you.

~~~
zschuessler
HN users clicking through: take a look at reviews before installing. Appears
there may be an issue with adware.

westoque - might want to check your code base, if the adware is unintentional.

~~~
westoque
Thanks. Yes, it has been an issue and the adware is definitely unintentional.

Will fix immediately :)

Adding to my answer above: users are more likely to install your product too
if you provide good customer support.

~~~
julien_c
Post-mortem would be appreciated. As someone who has Chrome extensions in the
store, how did you get malware-infected?

------
davidivadavid
I'm not sure why people are proud to do things without spending money on
marketing.

What if spending money on marketing had made you grow twice larger? Twice
faster?

When people say "I didn't spend money on marketing", the only translation is
"I knowingly overlooked massive growth opportunities."

~~~
nostrademons
I wouldn't be so quick to make that translation.

I think a more useful way to think about it is "Know your users, and your
channel." There are some user populations - and developers are usually one of
them - that are virulently anti-advertising. Any paid channel usually earns
instant distrust from them. Putting money into marketing spending can have a
negative ROI for them, because it has a signaling effect on the brand that
says "Our product quality isn't good enough for us to get users without paying
for them."

Then there are other user populations - most e-commerce is like this - where
there is no such signaling effect, and they are happy to check out new
products, regardless of how they hear about the product. For these markets,
it's silly to ignore paid channels; you're just leaving money on the table.

~~~
sturgill
I think Apple's paid marketing efforts are a counterexample to your argument.
Ineffective advertising is ineffective; as a tautology, that statement is
pretty unremarkable.

I first found out about ipinfo from a Google search looking to solve this
problem. Haven't pulled the trigger on using it, but it's been on the back of
my mind for awhile now.

For me, their marketing happened to be their placement on Google. I don't
really know how you'd pursue paid channels on this one (outside of SEM and
SEO).

It's a solution to a known problem. Unlike, say, the Apple Watch, which first
has to convince you that you have a problem... (This coming from a guy who's
very interested in buying a watch as I train for a marathon.)

~~~
nostrademons
For developer tools? Apple certainly spends a ton on brand advertising for
consumers, but most developers I know develop for Apple because they want to
reach Apple consumers. (I spent a year doing smartwatch apps before giving up
on the platform... the _only_ reason I was developing for Apple Watch is that
that's where the users were.)

~~~
sturgill
As a single example that immediately came to mind: WWDC. If that isn't a
massive marketing channel directly targeted at developers I don't know what
is.

------
rickduggan
This is super cool. I use a similar API to provide a client-side service
called IP Request Mapper ([https://chrome.google.com/webstore/detail/ip-request-mapper/...](https://chrome.google.com/webstore/detail/ip-request-mapper/ghhmhholmphdnpndngkhmpknekgicmbp)).
Coming soon to a Show HN near you.

What it does is show where every asset on a web page is loaded from. It allows
you to visualize how many different requests go into building just one web
page. While it's gotten much better, the Houston Chronicle
([https://chron.com](https://chron.com)) used to make about 500 individual
requests to build its home page. It's down to about 125.

It's best to run it across two different monitors, with IP Request Mapper on
one monitor and your "normal" browser window on another. Then enter any URL
and watch the map populate as it geolocates every request made by the page.

But it's projects like ipinfo.io that make these other things possible.
Standing on the shoulders of giants and all that...kudos to you, coderholic.

~~~
heipei
That's the same motivation that led me to build
[https://urlscan.io](https://urlscan.io). I wanted an easy way for everyone to
visualise the amount, size and destinations of the various HTTP requests that
a single page-load triggers. Incidentally I also created a tool that is being
used by a lot of Security / Phishing researchers. If you ever want some
inspiration for additional IP / domain annotation sources, check it out. I
should really do a "Show HN" soon ;)

Meanwhile, this is a scan for a particularly noisy German newspaper website
(faz.net):
[https://urlscan.io/result/f23e2e7e-e1eb-4591-9794-92f97957dd...](https://urlscan.io/result/f23e2e7e-e1eb-4591-9794-92f97957dd75)

 _This website contacted 35 IPs in 7 countries across 24 domains to perform
302 HTTP transactions. Of those, 51 were HTTPS (17%) and 35% were IPv6. The
main IP is 92.123.94.227, located in the European Union and belongs to AKAMAI-
ASN1. In total, 4 MB of data was transferred, which is 9 MB uncompressed. It
took 2.51 seconds to load this page. 16 cookies were set, and 42 messages were
logged to the console._

~~~
random_rr
Very cool tool. Thanks for posting this!

~~~
rickduggan
Agreed. heipei, love what you did here.

------
reacharavindh
Happy user here. My GF came to me and asked if I could somehow get country
names for the IP addresses of her survey respondents. I Googled and found
this neat little API. True, I could have downloaded the raw databases from
elsewhere and worried about the SQL I'd need, and whether the data was recent
or ancient or even correct. I decided that was overkill for my need, and just
used this API in a throttled (1 req/s) mode and left it overnight. If I have
this IP-to-location need again, I'd happily pay for this API.

------
babuskov
I'm baffled why anyone would use this when you can import the data into a
database and run it on your own server.

I mean, you might spend 20 minutes more to set it up, but you are safe from
having to rely on a 3rd-party service.

Anyway, kudos to coderholic for creating this and sharing the story.

~~~
gfodor
- Database needs to be purchased

- Database needs to be distributed to your servers

- Database can become out of date easily

- Database lookup requires going to local disk and having a relatively fast
access path/cache for lookups

- In general, a local database requires a large amount of effort compared to
just running a curl in your PHP code.

If you are actually going to use a database, the proper solution does not look
like "put it on your webservers" anyway, it looks like "put it on a separate
service with a fast caching layer" etc etc. So in other words, the proper
solution to decouple yourself from a 3rd party API is to... build a 1st party
API.

In other words, not a 20 minute job. For small shops, a quick curl during the
page load _is_ a 20 minute job.
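
For concreteness, a minimal sketch of that "quick curl during page load"
approach, in Python rather than PHP. The `/json` endpoint matches ipinfo.io's
public API, but treat the helper names and the (absent) error handling as
illustrative only:

```python
import json
import urllib.request

def ipinfo_url(ip):
    """Build the ipinfo.io JSON endpoint for a given IP."""
    return "https://ipinfo.io/{}/json".format(ip)

def lookup(ip, timeout=5.0):
    """Fetch geolocation details for an IP: one network round trip."""
    with urllib.request.urlopen(ipinfo_url(ip), timeout=timeout) as resp:
        return json.load(resp)

# Usage (makes a network call):
#   details = lookup("8.8.8.8")  # dict with fields like "city" and "country"
```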

~~~
heipei
There are many ways to quickly get and use a GeoIP DB like MaxMind, most
requiring way less than 20 minutes. This is just one example of something I
started using recently:

    docker run -p 8080:8080 -d fiorix/freegeoip
    curl localhost:8080/json/1.2.3.4

(Repo:
[https://github.com/fiorix/freegeoip](https://github.com/fiorix/freegeoip))

~~~
callumjones
Where are you going to host it, how will you cycle the container to keep it up
to date, and will you need to load balance it?

Running a Docker container in production is not a 20 minute job.

~~~
deno
Just put HAProxy in front of it. It’s not rocket science.

[https://www.haproxy.com/solutions/microservices.html](https://www.haproxy.com/solutions/microservices.html)

~~~
callumjones
Setting up HAProxy in a production environment with discovery for the docker
container may not be rocket science but it’s certainly not 20 minutes.

------
Scirra_Tom
Where did you get the IP DB from? My understanding is that most don't allow
you to resell access?

~~~
maddyboo
Do you think free IP DB providers insert "fictitious entries" [1] to identify
breach of TOS like this, similar to what happened between Google/Bing a few
years ago [2]?

1:
[https://en.wikipedia.org/wiki/Fictitious_entry](https://en.wikipedia.org/wiki/Fictitious_entry)

2:
[http://www.cnn.com/2011/TECH/web/02/02/google.bing.sting/ind...](http://www.cnn.com/2011/TECH/web/02/02/google.bing.sting/index.html)

------
ribrars
Great overview here on how you solved a problem and built a business around
that.

I read that you use Elastic Beanstalk for your server config, but I wanted to
ask:

1. What programming language did you use?

2. What, if any, configuration did you have to do to the Elastic Beanstalk
config to deal with network spikes and autoscaling?

Thanks!

~~~
skynode
I was going to ask why _AWS Lambda_ was not used instead. Or any other
serverless offering from the major cloud providers.

~~~
ribrars
Yeah, that would be totally hands-off. But I believe you'd have to ensure
that your requests didn't time out (3 seconds in Lambda), and with the 10ms
response times in this example I can't see any issue there. If you're into
Python, check out Chalice; it's being built as a Flask-like interface on top
of AWS Lambda.
[https://github.com/awslabs/chalice](https://github.com/awslabs/chalice)

------
jacquesm
That's great. Question: does it make money? The words 'profit', 'money',
'income' and 'revenue' do not appear in the article.

~~~
coderholic
Yes, it's profitable. See
[https://ipinfo.io/pricing](https://ipinfo.io/pricing) for details of our paid
plans.

~~~
jacquesm
Neat, congratulations! I know a few people that were active in that space and
none of them managed to make it profitable and they all faded out again. It's
important that services like these exist and even more important that they are
viable businesses otherwise you are building on quicksand.

------
larsnystrom
Ipinfo seems to have the exact same logo as Podio
([https://podio.com](https://podio.com)), a service owned by Citrix.

~~~
IAmGraydon
That's interesting. It's almost exactly the same. I did a trademark search,
and although "Podio" is registered as a word mark, the logo design is not
registered. So IPInfo is probably in the clear, but they may want to consider
a new logo.

------
SirLJ
I see the author is posting the same thing every 20 days or so, so much for
the "0 marketing"...

------
WA
I use ipinfo.io mostly to see my own public-facing IP address, and for me,
there are 2 reasons:

- I can somehow remember that domain. I don't have to google "my ip" and dig
through weird domains that change all the time.

- The design is clean and simple. Not too much information, no ads, loads
fast.

~~~
secure
When I google “my ip”, I get a onebox showing me my IP. Does that not appear
in your search results?

~~~
synapse0
Ok have a look at canihazip.com then! Love it!

~~~
penagwin
Or the super easy to remember [http://ip4.me](http://ip4.me) (ip6.me is a
thing too)

------
unchaotic
Crowded space. A quick Google search for any of these keywords ("ip address
location api", "ip lookup API", "geolocation API by IP", etc.) shows:

- [https://db-ip.com/api](https://db-ip.com/api)
- [https://ipapi.co](https://ipapi.co)
- [https://freegeoip.net](https://freegeoip.net)
- ipinfodb.com
- [https://www.iplocation.net](https://www.iplocation.net)
- [http://neutrinoapi.com](http://neutrinoapi.com)
- [http://www.ip2location.com](http://www.ip2location.com)
- [https://www.telize.com](https://www.telize.com)

and a few dozen more. I wonder if collectively they are serving over a few
billion requests per day. Microservices & API culture FTW!

~~~
geuis
I've been running [https://jsonip.com](https://jsonip.com) for years. Been
serving millions of requests a day for most of that time. Doesn't really show
up in searches well because it's just an API.

~~~
voltagex_
How do you pay for it?

~~~
buro9
I run a few small services, nothing of this scale, but one thing to bear in
mind is that it's easy to pay for a lot of little side projects if they have
virtually no costs.

The one cited simply echoes back your IP. That's it. How cheaply could you do
that and how many requests per second could you handle on one small VPS?

For example, I recently ran
[https://www.tactical2017.com/](https://www.tactical2017.com/), a tactical
voting website for the UK general election. The cost of serving that whole
website to 2.6 million people over 5 weeks, 650k of them in the last day and
a half, was $20.

Just push costs down.
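
To make that concrete, here is a hedged sketch of an echo-your-IP service
using only Python's standard library. This is an illustration, not
jsonip.com's actual stack (which is unknown); a real deployment would sit
behind a reverse proxy that sets X-Forwarded-For:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class IPHandler(BaseHTTPRequestHandler):
    """Respond to any GET with the caller's IP as JSON."""

    def do_GET(self):
        # Behind a proxy the real client IP arrives in X-Forwarded-For;
        # otherwise fall back to the TCP peer address.
        xff = self.headers.get("X-Forwarded-For")
        ip = xff.split(",")[0].strip() if xff else self.client_address[0]
        body = json.dumps({"ip": ip}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# To serve: HTTPServer(("", 8080), IPHandler).serve_forever()
```

A single small VPS running something like this can handle a surprising number
of requests per second, which is the whole point about costs.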

------
firloop
Related: some other adventures from running an API to retrieve IP addresses.

[https://news.ycombinator.com/item?id=11010856](https://news.ycombinator.com/item?id=11010856)

------
mrskitch
I'm employing a similar strategy for my library
[https://github.com/joelgriffith/navalia](https://github.com/joelgriffith/navalia)
as I couldn't find any solution to manage headless chrome (plus the API for
that protocol is pretty heavy).

Building what folks want, even developers, is so obvious that I think we
often forget about it. It's also not as glamorous as self-driving cars or
rockets, so it gets dismissed easily.

Sound points though

------
fusionflo
Kudos to you guys for building this. There is always a lot of scepticism from
people asking "why would anyone pay for this". The reality is that not
everyone has the time or resources to build their own kit. There are
literally thousands of businesses on the internet that are in the business of
selling "time": timesavers that remove the risk of maintenance and ongoing
support.

Keep improving this and with the rise of web personalization, the demand will
continue to grow.

------
kasbah
Does anyone know how ipinfo compares to running your own instance of
[https://freegeoip.net](https://freegeoip.net)?

~~~
justboxing
I was wondering about this too.

The SO question
[https://stackoverflow.com/q/409999/325521](https://stackoverflow.com/q/409999/325521)
that OP refers to (in his blog post) also has another answer using
freegeoip.net =>
[https://stackoverflow.com/a/16589641/325521](https://stackoverflow.com/a/16589641/325521)

From the comments on this answer (not OP's answer linking to his API), it
seems like freegeoip is not all that reliable (i.e. it's down a lot).

Funnily enough, one of the comments on this answer links to another free
service called "freegeoip2", which seems to work just fine as of right now.

~~~
kasbah
That was my experience with freegeoip.net actually, which is why I host my own
instance of it.

------
diminish
Congrats. I'm not sure, but ipinfo could be very interesting to startups and
programmers. So a good idea could be writing attractive articles and posting
them on HN, /r/programming, and some other subreddits. That would bring more
customers with zero marketing.

See also:
[https://news.ycombinator.com/from?site=ipinfo.io](https://news.ycombinator.com/from?site=ipinfo.io)

------
craigmi
Pretty cool man, I use your site all the time for ASN lookups, although I
find your carrier information wildly conflicts with Digital Element's DB.

------
motyar
Such stories don't let me stay focused on my freelance work.

I got inspired and started researching and building. (Btw, failing so far.)

------
niko001
This has worked well for me, too. I saw an influx of "How to offer a
time-based trial version on Android" questions on SO and developed a trial
SDK as an answer: [https://www.trialy.io/](https://www.trialy.io/)

~~~
aphextron
Very neat. How successful have you been with this?

------
kpsychwave
Given the fast lookup time, it would be useful if you could provide a JS API
for synchronous loading.

Essentially, a blocking script in the DOM (<script src="...api.js">) that
prepopulates the window object. With clever error handling, this could
improve perceived performance significantly.

A few questions:

1. What differentiates you from ip-api.com and other providers?

2. Do you use MaxMind?

3. Is there an option for no throttling? 100s of simultaneous requests?

I aggregate multiple IP databases for my SaaS
([https://www.geoscreenshot.com](https://www.geoscreenshot.com)) and I need
highly performant / reliable IP look ups.

~~~
Etheryte
Why would you _want_ a synchronous, blocking script in the first place?

~~~
kpsychwave
For conditionally loading other blocking assets in JS.

For example, if the IP is in China, use a local fallback for the Google CDN,
as it will fail.

~~~
Etheryte
You can also do all of these things asynchronously, without blocking the whole
page.

~~~
kpsychwave
Yes, but like all async ops, not without cost to the UX.

------
kevan
>90% of our 250 million daily requests get handled in less than 10
milliseconds.

Minor nit, but with that level of traffic I'd expect you to be bragging about
P99.99 latency, not P90.

------
GordonS
> Our servers use latency based DNS routing to serve over 90% of all requests
> handled in less than 10ms.

What _exactly_ does that mean, though?

Does it mean that processing time at your server is 10ms, or 10ms to time to
first byte, or something else?

Giving it a quick test, I generally get the actual JSON result in around
400ms. The lowest I got was 200ms, the highest around 1000ms. It didn't seem
to make any difference if I used the HTTP endpoint instead of the HTTPS one.

------
drej
I see it's still a thing. Back in high school, some ten+ years ago, I coded up
an 'ip2country' website. Not sure why, there were dozens of those. I guess I
had a free domain and a lot of time on my hands. I put some Google AdSense on
it and let it go. I checked my AdSense account some six months later and found
out I was cashing $20/month. Easiest money I've ever made.

------
pier25
So what's your stack? Still running PHP?

------
merb
Well, currently my location is basically totally wrong. From
[https://www.iplocation.net/](https://www.iplocation.net/) I've only seen one
service that gets my location 100% correct (the right village); all the
others are 200 or more km away from my real location.

~~~
XCSme
It just depends on the database they are using. Anyway, nowadays it's getting
harder and harder to detect accurate location from the IP address (many users
are on 4G or behind a proxy).

~~~
unclebucknasty
Yeah, and features like Data Saver on mobile Chrome will quietly proxy your
requests through Google servers. Not sure if G forwards the proper
X-Forwarded-For headers (or if all IP detection services read them).

~~~
coderholic
Yeah, Google does, and we take the first public IP from X-Forwarded-For, so
we'll show your actual details in that case.
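
That "first public IP from X-Forwarded-For" step can be sketched like this,
using Python's stdlib `ipaddress` module. This is an illustration, not
ipinfo's actual code:

```python
import ipaddress

def first_public_ip(xff):
    """Return the first globally routable address in an X-Forwarded-For
    header value, skipping private/reserved ranges and junk entries."""
    for part in xff.split(","):
        try:
            addr = ipaddress.ip_address(part.strip())
        except ValueError:
            continue  # not a parseable address; skip it
        if addr.is_global:
            return str(addr)
    return None

# e.g. first_public_ip("10.0.0.1, 8.8.8.8") picks "8.8.8.8"
```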

------
elnygren
The author says _I took_ even though this was pure luck and coincidence.
Attribution bias is strong in this one.

However, it is important to acknowledge that he did put himself into a
position where he was available to become lucky (= he built the API and linked
to it).

------
rodionos
Checked one of our static IPs: the country is correct, but the city is 500
miles off.

------
ge96
That's crazy, details like lat/long. What if it's a proxy, and where does
that data even come from? The ISP? Or do you take the time to build it out,
i.e. research? At any rate, cool.

------
reefoctopus
Are you just using the MaxMind data?

What percentage of those 250,000,000 is from paid plans? Even if it's only 20%
you'd be doing $xx,xxx per day. Is that in the ballpark?

------
erikb
How much money do you make per api req?

~~~
justboxing
See [https://ipinfo.io/pricing](https://ipinfo.io/pricing)

Although I must admit, I'm a bit surprised that anyone would pay for this, as
several HN readers have listed at least 5 other free (and some self-hosted)
alternatives here...

~~~
nu11p01n73R
Several HN readers have also pointed out that some of the free solutions have
shut down because of lack of funding. When money is on the table, users may
expect the provider to be "more" answerable than a free service.

------
martin_hnuid
Thanks for sharing.

I am ready to launch a startup and currently trying to figure out what to
focus on (so many ideas!).

I posted an "Ask HN" earlier today. Wondering if anyone might have some
thoughts or advice on this:

[https://news.ycombinator.com/item?id=14677939](https://news.ycombinator.com/item?id=14677939)

------
imaginenore
Just some rough calculations. Assuming the worst-case scenario (everyone on
the highest tier, the cheapest per request), 250M daily requests means he
makes

400 * 250M / 320K = $312,500 per month.

Or $3.75M per year.

Not counting the expenses.

~~~
venning
The cheapest per request is $0. There is a free tier where, presumably, the
long tail lives.

------
kalal
You are great! My karma goes down, please!

