I use https://freegeoip.net/, which gives you output in JSON, CSV, or XML and has a limit of 10,000 requests per hour. A limit of 1,000 per day, as some services impose, would be too low.
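As a rough sketch, the JSON output from a service like this can be consumed with nothing but the standard library. The payload below is made up for the example; the field names mirror the freegeoip style but aren't a guaranteed schema:

```python
import json

# Illustrative freegeoip-style response; not captured from the live service.
sample = """{"ip": "8.8.8.8", "country_code": "US",
             "country_name": "United States", "city": "Mountain View",
             "latitude": 37.386, "longitude": -122.0838}"""

def parse_geo(payload: str) -> dict:
    """Extract the fields most lookups care about."""
    data = json.loads(payload)
    return {
        "country": data.get("country_code"),
        "city": data.get("city"),
        "coords": (data.get("latitude"), data.get("longitude")),
    }

print(parse_geo(sample)["country"])  # US
```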
If you are geolocating, I would suggest spending $370 to buy the MaxMind GeoIP City database. There's an nginx module for it, and every major language has libraries for querying it.
So something along the lines of tracking incoming requests, then doing a geolocation lookup on the incoming IP? Isn't that something Google Analytics could do for you? And in that case, would this be more for people who are avoiding GA?
Google Analytics can give you this data in its reports, but it won't give you access to it in real time in your application.
Imagine you want to redirect users to the correct country page on your site. In that case you need the IP geolocation in real time, and GA won't help you there.
If your goal is geo-redirection, then a third-party service is usually a bad choice (speed + downtime). An in-memory geo-IP database is the best way to go, but that's usually also part of what you pay for when you buy the non-free version.
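A minimal sketch of the redirect side of that, assuming you already have the visitor's ISO country code from an in-memory lookup; the site URLs and the mapping are my own invented examples:

```python
# Hypothetical mapping of country codes to localized site URLs.
COUNTRY_SITES = {
    "DE": "https://example.com/de/",
    "FR": "https://example.com/fr/",
    "US": "https://example.com/us/",
}
DEFAULT_SITE = "https://example.com/en/"

def redirect_target(country_code: str) -> str:
    """Pick the country page, falling back to a default site."""
    return COUNTRY_SITES.get(country_code, DEFAULT_SITE)
```

The important property is that this is a dictionary lookup in your own process, so it adds effectively no latency and no external point of failure to the request path.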
It's less accurate than their commercial offering, but good enough in many cases. For ease of use, there's a nice C API (https://github.com/maxmind/geoip-api-c) and wrappers like pygeoip for Python.
But freegeoip's limit of 10,000 requests per hour is better than ipinfo's $200/mo plan (6,667/hr). Plus there are no limits if you run freegeoip on your own server.
Looks like there's no immediate fix either, which is a shame, as I wanted to use it client-side but (a) I need SSL and (b) I can't ignore Chrome (it fails on the latest Chrome for Android too).
Thanks for that! I'm building a small script which may evolve into a service, and I was retrieving IP addresses using curl; this is much cleaner.
IP addresses should only be announced by one ASN at a time (regardless of whether that AS is multihomed or single-homed). If that's not the case, it's usually a hint of prefix hijacking.
It gets my location wrong-ish here on the Gold Coast, but not too far out. We have huge suburbs, though; all of the geo-IP DBs seem to think I'm in the wrong suburb, despite my having a static IP on Telstra cable.
I just finished integrating geo-IP lookups into an app I'm working on, using a library that queries a number of the free IP lookup services out there. Something I've learned from this is just how hard it is to stay on top of IP geolocations. Most of the free services use incomplete or outdated databases, and if you want decent accuracy you probably need to opt for a paid package from a company that has the resources to track IP address location changes and keep its database up to date.
So, I'm wondering, does ipinfo.io maintain their own database, or do they sub-license someone else's database? If they're maintaining their own, how good are they at keeping it valid and updated and what is the coverage like?
I was trying to figure out which database they were using as well. I've checked their results for my IP address against the results from MaxMind's GeoIP City and GeoIP2 City services, and these return the closest results relative to the latitude/longitude displayed on ipinfo.io; it also has a matching ZIP code. The network name differs since I'm on U-verse, but I think they are just displaying the company name from the ASN. I also checked Quova/Neustar, and the results are vastly different.
It uses one of the free databases that are available.
IP address 200.7.52.1 is a good way to check these services: it should resolve to the island of Sint Maarten, and a serious offering would know that. The free ones always place it in Curaçao, about 900 NM away...
Usually though I just use http://www.moanmyip.com for the weirdness of it. Or pretty much any search engine includes that up top when you search for "ip address".
It provides several response formats (XML, JSON, CSV, newline-separated, serialized PHP) and enforces a limit of 240 requests per minute (that's 14,400 per hour, for the lazy).
Fun fact: if you're running tor (tor-0.2.4.17 as a proxy, not the browser) you'll see the tor exit node under "IP:" and your actual IP under "Real IP:", thanks to gdns' EDNS (https://gdns.re/edns-demo/)
Fun fact: if you are running tor and you see both the tor exit node and your own IP, you are doing it wrong! You are leaking DNS requests. Tor goes to great lengths to inform you in the logs that something is wrong. This is not thanks to EDNS; it is thanks to your inability to use the software correctly. Please see:
>it is thanks to your inability to use the software correctly
I disagree, but had you said it was down to my inability to explain what the problem is, then I'd agree.
I'm aware of the DNS leaks; I obviously wasn't clear enough and failed to explain what the problem was.
The leak only happens when you run the tor proxy daemon and use your own browser with the appropriate proxy settings.
It's down to the way Firefox uses the defined proxy for the initial DNS and HTTP requests, but then bypasses it when doing DNS lookups for JavaScript within the initially loaded page.
It doesn't leak DNS lookups made by JS scripts if you use the tor bundle that includes a modified Firefox browser.
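For completeness, the relevant Firefox setting is a real about:config preference that makes Firefox hand hostnames to the SOCKS proxy instead of resolving them locally:

```
// about:config (or user.js): resolve DNS through the SOCKS proxy
user_pref("network.proxy.socks_remote_dns", true);
```

With that set, DNS queries for proxied requests go through tor rather than your system resolver.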
I don't mean to be a huge jerk - I think the service is cool - but IPv4 exhaustion is months away! All of the major GeoIP vendors have IPv6 support, yet here it might as well not exist, which is simultaneously saddening and frustrating.
For ARIN (where most of us will be getting our IPs), projected runout (assuming no "bank run") is closer to a year out. [1]
Now, people /have/ been predicting some crazy run on IPv4 for some time now. I was pretty certain it was going to happen in 2011, and while a run is still quite possible, it hasn't happened yet. There are also a bunch of outstanding /8s that could very well be returned; the DoD has been returning blocks, and it still has a bunch more it could return. Based on my own previous expectation that runout would occur in 2011, I'd be surprised if we run out before 2015.
Now, if you are dealing with network admins and infrastructure types? IPv6 is very important. I give my customers an IPv6 address by default, and will give a /64 upon request. It's pretty important for the sort of people I have as customers.
You see, we're the ones who have to deal with this 'NAT hell' - and make no mistake about it, it will be hellish.
However, from a business perspective? If you are going after business types? The internet is still entirely IPv4.
This is interesting. IP address databases are typically bought and sold for figures in the thousands. This service presumably purchases one or some of these and attempts to make up the cost by providing it on an as-needed basis.
I wonder if there are any additional data sources that can be bought wholesale and sold in pieces? Think of all the applications that need very precise IP address data but can't afford the whole dataset. They can now exist!
I could also be wrong and this isn't at all the approach this service takes...
> IP address databases are typically bought and sold for figures in the thousands.
And presumably, therefore, issued under licenses that forbid you from starting a query service? I'm pretty sure most of these IP data providers offer their own on-demand query services.
That's for your own use, however; the license says:
> Access to the data is restricted to employees and contractors of the license holder. With contractors, the license holder is liable should the contractors violate the terms of the agreement.

> Data may not be stored in a way that is publicly accessible.
This really looks like MaxMind city-level data, so until we get confirmation from the dev on which DB it is, assume this service could disappear at any moment due to license issues.
About 8 miles less accurate than MaxMind: 11 miles off versus MaxMind's 3. I'm always curious what the provenance of services like this one / MaxMind is. My ISP? Some past inadvertent GPS-enabled requests that leaked out?
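Accuracy comparisons like this are easy to quantify yourself: take the latitude/longitude each service returns and compute the great-circle distance to your real location. A standard haversine sketch:

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/long points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))
```

Divide by 1.609 for miles, or by 1.852 for nautical miles.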
And presumably this service is just repackaging such an existing system. My old company had a tool to figure out which widely used geo DB was behind services like these (to properly judge their value), but I no longer have access to it.
I'm on a campus network that reverse proxies all HTTP traffic. The service shows my internal 10.x.x.x IP. Just a heads-up that you might want to fix your handling of X-Forwarded-For headers.
Thanks for reporting! You should see that we correctly detect this as a bogon, but we should definitely be pulling the correct IP from the headers. I'll look into this.
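The usual fix is to take the first public address from X-Forwarded-For and ignore private ranges. This is only a sketch of that idea, not the service's actual code, and note that the header is forgeable, so it should only be trusted when set by your own reverse proxy:

```python
import ipaddress

def client_ip(forwarded_for: str, remote_addr: str) -> str:
    """Return the first public address in X-Forwarded-For, falling back
    to the socket's remote address when none is found."""
    for part in forwarded_for.split(","):
        part = part.strip()
        try:
            addr = ipaddress.ip_address(part)
        except ValueError:
            continue  # not a valid IP literal
        if addr.is_global:  # skips 10.x, 192.168.x, loopback, etc.
            return part
    return remote_addr
```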
Recommendation: I have the feeling your sales would get a significant boost if you added some sort of tiered service plans on top of your free tier, instead of requiring the high-friction process of reaching out to a sales address.
My go-to for shell scripts is http://echoip.org/, though I recently set up my own such script as well. It's a one-liner in PHP, though having some JSON output alternatives would be useful too.
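The PHP one-liner has an equally small Python equivalent. A WSGI sketch with an optional JSON mode; the `format=json` query parameter is my own choice, not what echoip.org does:

```python
import json

def app(environ, start_response):
    """Minimal WSGI endpoint that echoes the caller's IP address.
    Behind a reverse proxy you'd want X-Forwarded-For instead."""
    ip = environ.get("REMOTE_ADDR", "")
    if environ.get("QUERY_STRING") == "format=json":
        body = json.dumps({"ip": ip}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
    else:
        body = (ip + "\n").encode()
        start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]
```

Run it with any WSGI server (e.g. `wsgiref.simple_server.make_server`), then `curl` it from a shell script.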
Has anyone done research on the quality of these geo DBs? I personally prefer MaxMind because of the format of the DB, but NetAcuity seemed to be giving more up-to-date results.
My first guess would be that the domain uses a custom nameserver that logs all dns queries and serves back a different IP to each user. Then when the http request comes in to that specific IP, it can check the dns logs to identify the resolver.
But surely that would either require (a) a massive IP space (I guess that works if the site is IPv6-only), or (b) assume a pretty short time between the DNS lookup and the HTTP request, and hence give possibly false results if the DNS lookup is cached somewhere and the IP has since been reused.
I'd love to know if there's a better way of implementing this.
A better way would be to hand each client a unique hostname (sessionidhere.whatsmyresolver.stdlib.net) and match that DNS query to the HTTP client, for example via an Ajax call from the main website.
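A minimal sketch of the unique-hostname half of that idea; the DNS-server side, which has to log which resolver asked for each name, is omitted, and the base domain is just the one mentioned above:

```python
import uuid

BASE_DOMAIN = "whatsmyresolver.stdlib.net"

def session_hostname() -> str:
    """Generate a unique, never-before-seen hostname for one client
    session. Because it is unique, no resolver can have it cached, so
    the authoritative DNS server is guaranteed to see the query and can
    match its log entry back to this session id."""
    return "{}.{}".format(uuid.uuid4().hex, BASE_DOMAIN)
```

The page would request this hostname via Ajax; the server then joins the HTTP session id against the DNS query log to identify the client's resolver.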
This thinks I'm in Colchester; FreeGeoIP can only say that I'm in the UK; Google in incognito mode thinks I'm in Sheffield. Correct answer: Manchester :)
Paid plan for API access, I guess - hoping one won't use the same environment for both. I meant this for the Stripe frame which kicks in; you can't really trust an HTTPS frame that starts from an HTTP page.
I was at one point using https://github.com/tjfontaine/node-dns for the hostname lookups, but it was significantly slower than the built-in dns module, so I ended up wrapping that with my own timeout logic.
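The same wrapping trick works in any language. Here's a sketch in Python rather than Node, since the stdlib resolver there also lacks a timeout parameter:

```python
import socket
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as FutureTimeout

_pool = ThreadPoolExecutor(max_workers=8)

def resolve(hostname: str, timeout: float = 2.0):
    """Resolve a hostname with a hard timeout. gethostbyname has no
    timeout argument, so run it in a worker thread and abandon the
    future if it takes too long (the thread may finish in the
    background; we just stop waiting for it)."""
    future = _pool.submit(socket.gethostbyname, hostname)
    try:
        return future.result(timeout=timeout)
    except FutureTimeout:
        return None
```

Abandoned lookups still occupy a worker thread until they return, so size the pool for your expected rate of slow queries.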