Google was 97% accurate, at $4 per 1,000 requests.
Mapbox was 95% accurate, at $0.50 per 1,000 requests.
TomTom was 73% accurate, at $0.42 per 1,000 requests.
Location IQ (one of many providers simply running the open-source Nominatim project) was 12% accurate, at $0.03 per 1,000 requests.
To be fair to Location IQ / Nominatim, they had the right street the vast majority of the time, but were usually wrong about the house number, because they interpolate between the address ranges defined at cross streets in Census data. If you need exact addresses, Nominatim probably isn't for you, but if you need a general location, it might work fine.
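The interpolation failure mode described above is easy to sketch: TIGER-style Census data stores only a house-number range per block face, so the geocoder places an address by linear interpolation along the street segment. A toy illustration (the coordinates and ranges here are invented):

```python
def interpolate_address(house_number, range_start, range_end, seg_start, seg_end):
    """Place a house number along a street segment by linear interpolation.

    seg_start/seg_end are (lat, lon) tuples at the block's cross streets.
    Real parcels are rarely spaced this evenly, which is why the result is
    often on the right street but at the wrong house number.
    """
    frac = (house_number - range_start) / (range_end - range_start)
    lat = seg_start[0] + frac * (seg_end[0] - seg_start[0])
    lon = seg_start[1] + frac * (seg_end[1] - seg_start[1])
    return lat, lon

# Block face covering 100-198 Main St between two cross streets:
lat, lon = interpolate_address(150, 100, 198, (38.000, -92.000), (38.001, -92.000))
```

Number 150 lands roughly halfway along the block, regardless of where the parcel actually sits.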
Also - this was one county of test data, so take it with a grain of salt.
Nevertheless, it gave us confidence that we could move away from the Google APIs, save 90% of our costs, and still have a high level of accuracy.
 Link to example -> https://developer.here.com/api-explorer/rest/geocoder/revers...
geocode.earth founder here, happy to answer questions.
We have great reverse geocoding with OSM and OpenAddresses data.
Feel free to shoot me an email for an invite: email@example.com
probably not worth the effort
The real catch is this, though: if you actually pulled it off, you would have just built a nearly flawless reverse geocoder, and you wouldn't need to use an external API at all. You could just look locations up in your huge property data set.
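If you did have that parcel-level property dataset, the core of the in-house reverse geocoder would just be a nearest-point lookup. A toy sketch with a linear haversine scan (a real one would use a spatial index like PostGIS or a k-d tree; the sample records are invented):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def reverse_geocode(lat, lon, parcels):
    """Return the parcel record nearest to (lat, lon) -- O(n) scan."""
    return min(parcels, key=lambda p: haversine_km(lat, lon, p["lat"], p["lon"]))

parcels = [
    {"address": "100 Main St", "lat": 38.0000, "lon": -92.0000},
    {"address": "102 Main St", "lat": 38.0002, "lon": -92.0000},
]
hit = reverse_geocode(38.00019, -92.00001, parcels)  # nearest parcel wins
```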
it doesn’t need to be an accurate decider, just “good enough” in a majority of cases.
The issue is the rendering is actually somewhat resource intensive and browser support is thus far incomplete. But we’re getting there!
Server side rendering can be less intensive for the client, since it just has to show an image.
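With server-side rendering, the client only ever requests a finished image; a sketch of what that looks like from the client's side (the endpoint and parameter names here are hypothetical, check your provider's static-maps docs for the real ones):

```python
from urllib.parse import urlencode

def static_map_url(lat, lon, zoom=14, size=(600, 400)):
    """Build a static-map image URL.

    The host and parameter names are made up for illustration; the point is
    that the browser just loads <img src="..."> and does no tile or vector
    rendering work itself.
    """
    params = urlencode({
        "center": f"{lat},{lon}",
        "zoom": zoom,
        "size": f"{size[0]}x{size[1]}",
    })
    return f"https://maps.example.com/static?{params}"

url = static_map_url(38.0, -92.0)
```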
Forward Geocoding (Converting the addresses to latitude/longitude): https://gist.github.com/MiniCodeMonkey/13c02d45089478182c3d1...
Reverse Geocoding (Converting the lat/lon to addresses):
Based on the above definition, geocod.io's accuracy for reverse geocoding of this list is 95.2%
Putting this together with dbatten's original findings, the list would look like this:
Google: 97% ($4 per 1,000)
Geocod.io: 95.2% ($0.50 per 1,000)
Mapbox: 95% ($0.50 per 1,000)
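The accuracy figures above boil down to a simple match rate against a ground-truth list. A sketch of that comparison (the addresses are invented; real evaluations also need a tolerance for formatting differences):

```python
def accuracy(results, truth):
    """Fraction of geocoder results that exactly match the ground truth."""
    hits = sum(1 for r, t in zip(results, truth) if r == t)
    return hits / len(truth)

truth   = ["100 Main St", "102 Main St", "7 Oak Ave", "9 Oak Ave"]
results = ["100 Main St", "102 Main St", "7 Oak Ave", "11 Oak Ave"]
rate = accuracy(results, truth)  # 3 of 4 match
```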
When I got the notification in May that the rates were increasing, I didn't take it that seriously. We have a profitable company and I would have been more than happy to pay Google twice their rate for the premium services they offer. Nobody ever expects a sudden rate increase to be more than 20%, 30% or 40%, right???!
It was in July that the gravity of the situation hit me. I was seeing tweets and articles lamenting the rate increase. I thought to myself "huh, I better check this out". I did some quick math using the new rate card and nearly had a heart attack! Our bill was estimated to be ~$14,500 and the CLOCK WAS TICKING. We were facing a 2600% rate increase and had only 2 months to figure out a game plan. Our business was on the line!
I immediately determined that we were eligible for bulk pricing. However, Google will not sell you the bulk rates directly. You have to contract with an authorized 3rd party (re-seller, basically) to get the rates. So, we found a reseller and locked in the bulk rates. That brought our estimate down to around $12,000/mo. Better, but still a huge shock.
The next step was optimization. There was no way for us to reduce dynamic maps usage because it is such a core part of our products. So, we cut off almost all of our Places API usage and started using other services. Our estimated bill was now down to $9,000.
That's where we are today. We just got our first full month bill. It was a huge hit to our business, wiping out a significant portion of our profitability.
To be frank, I am pissed about this. I was more than happy to pay Google far more than they were charging us. We were always under the impression that rates would increase one day. But, to force an increase of this magnitude with such a short amount of lead time is pretty f&^%ing sh*&$tty coming from a billion dollar giant.
We're starting to test other mapping services. I met with a team member from Mapbox last week and am planning on testing their platform in December. Their quote for our usage needs? $550/mo.
This is exactly why I'll always be wary to depend on Google's services. And it isn't the first time they've done this, either.
First they monopolize the market via cheap rates and branding, then they make it exorbitantly expensive.
Wrong. It's completely acceptable. It's their business, and this is the risk you take when you make your business dependent on another business.
>This is exactly why I'll always be wary to depend on Google's services. And it isn't the first time they've done this, either.
>First they monopolize the market via cheap rates and branding, then they make it exorbitantly expensive.
Google isn't the only company that does stuff like this. Almost any company would do this if they had the opportunity.
Very few businesses would do well in such an environment.
If Google raising prices obscenely "isn't acceptable", then what are YOU going to do about it? Nothing, right? Is Google going to change their ways because a few people proclaim "this isn't acceptable!"? No. Therefore, it IS acceptable.
I'm waiting to be proven wrong.... which will only happen if this price-raising is canceled, which I don't think it will be.
We'd be happy to discuss how we can meet your needs, as well, if you're interested. Feel free to reach-out directly: firstname.lastname@example.org.
E.g. I'd put up an experimental site that uses only the Google Maps free quota. I don't want to pay anything, since it's only an experiment; I'd be happy to have my access blocked automatically for the rest of the month once the free quota is exhausted.
But AFAIK it cannot be done. You have to track your spending and shut down the site manually when you are near the end of the free quota.
Google used to help people experiment with its platform. That's no longer the case with Google Maps.
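A client-side guard like the one you're forced to build yourself might look like this (the quota number is illustrative; this only protects calls routed through it, which is exactly why a server-side cap from Google would be better):

```python
class QuotaExceeded(Exception):
    pass

class QuotaGuard:
    """Hard-stop wrapper: refuse to issue API calls past a monthly budget."""

    def __init__(self, monthly_limit):
        self.monthly_limit = monthly_limit
        self.used = 0

    def request(self, call, *args, **kwargs):
        if self.used >= self.monthly_limit:
            raise QuotaExceeded("free quota exhausted; blocking until next month")
        self.used += 1
        return call(*args, **kwargs)

guard = QuotaGuard(monthly_limit=2)
guard.request(lambda: "map tile")
guard.request(lambda: "map tile")
# A third call now raises QuotaExceeded instead of incurring charges.
```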
I provided my credit card for the trial. I was notified of the impending end of the trial, and was billed no further.
I do agree that the inability to set hard limits on usage is frustrating and somewhat concerning, however. Especially as they're apparently able to do so through the aforementioned trial termination mechanism.
And I further agree that, outside of this trial period, this policy does discourage experimentation. And that it is somewhat un-Google-like; somewhat surprising as I believe they're considered an underdog relative to AWS; and a bit disappointing.
I think that's ridiculous. It should be a deal-breaker for anyone running a side project.
What has become abundantly clear, however, is that CC gating is a fast way to drastically reduce your malicious traffic; and especially if you're enterprise focused, it barely impacts your core customer base.
As a non-corporate-techie in my off-work-time, I absolutely understand the frustration at this inflexibility. During my day job where I'd have to clean up the fires that result from a fully open door policy however? I'd have a somewhat different take.
It's not 100% clear cut "the CC is all just to add spurious friction", is all I mean to say :)
I totally can't tell how much this could cost me; I'm pretty sure I'm still under the free quota limit, but I can't see how far into it I am. I see requests being logged in the API metrics, but the billing pages show 0 usage across all the same metrics.
You would not be charged more, but you might then be in breach of your contract with Google (?)
We're here to help on that one. :) We strongly believe development and educational use should be painless, so testing locally (just use the right link!) or signing up for the free tier is as simple as possible.
[Added some more caching, site works again slowly]
This attitude is something I truly appreciate. So often posts like these are angry rants about 'how dare they do this to me'. You seem to presume good intentions, which should be the default reaction in life.
> we haven't been slashdotted for a long time, sorry!
The MapExample.png file is 1.1 MB; it shouldn't be more than 100 KB given how little detail that picture has. Also, 4.1 MB of CSS for a page that has barely any styling?
Whatever you are doing, you might be losing potential customers who are unable to reach that page through HN because of poor optimization and scaling.
 something on my agenda
So what were you paying for?
Will add this clarification and your links to the article once the site works again though, thanks.
[Also currently not online are maps with search results, we removed to cut costs but will add back again]
You're not processing customer data there, are you?
That wouldn't solve today's traffic spike, and there are probably some lower-hanging fruits that should at the very least keep the site online. Anything sold in the last five years with an ethernet port should easily be able to handle a static site even during rare spikes in traffic.
But the Cloudflare DNS configuration isn't something to be scared of: I used to feel similarly nervous when fiddling around with DNS, but haven't had a single problem configuring 20+ sites with Cloudflare. (I would, however, be slightly more nervous about google's reaction to the move to a subdomain this would require.)
This surprised me. I'm completely ignorant with respect to geoinformation providers, but for some reason I expected vendor lock-in to be much tighter.
>Just pass the coordinates as a latitude and a longitude, separated by either a comma or a (URL encoded) space and the API will automagically work out that you want to reverse geocode.
Is the reverse geocoding based on
1. government-source boundary data, or
2. proximity to tagged points from OSM or other non-authoritative sources?
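The query format quoted above ("latitude, longitude, separated by either a comma or a (URL encoded) space") is trivial to construct; a sketch, assuming a `q` query parameter as in the quote:

```python
from urllib.parse import urlencode

def reverse_query(lat, lon, sep=","):
    """Build the q=lat,lon (or lat lon) parameter the quoted docs describe.

    urlencode handles the escaping, so the space-separated form comes out
    URL-encoded automatically.
    """
    return urlencode({"q": f"{lat}{sep}{lon}"})

comma_form = reverse_query(38.9002, -76.9990)
space_form = reverse_query(38.9002, -76.9990, sep=" ")
```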
The competitive set is differentiated in some of their niche work (like specific-device-embedded maps, such as the ones in Tesla), but the first principles for the common developer are all the same.
Were they vastly under pricing it before? Did they want to stop providing it so they priced it at a point that would make almost everyone go away?
I'm just curious about their motivation and surprised that this doesn't seem to have been figured out or even discussed.
It's not directly related to the topic (dynamic maps, online geocoding), but it's somewhat relevant.
With about $300 of Amazon EC2 time, I geocoded 18 million addresses in 3 months. You could probably do it on your own PC in similar time (the only real hardware requirements are RAM and an SSD).
You can't expect too much resolution, because street numbers are usually interpolated, and the TIGER database definitely won't be as good as the best commercial databases. But you don't have many other options if you have millions of addresses to geocode.
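For scale, the throughput above works out to only a couple of addresses per second sustained, which is why commodity hardware is enough; back-of-envelope:

```python
addresses = 18_000_000
seconds = 90 * 24 * 60 * 60   # ~3 months of wall-clock time
rate = addresses / seconds    # sustained addresses geocoded per second
```

Roughly 2.3 lookups per second, well within reach of a single local database server.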
We customized the CartoCSS a bit to get the maps closer to the way we wanted them. For example, we took out building outlines, and reduced the frequency of highway shields, which required removing exit numbers.
It's a good solution for us, but we already had some mapping experience and lots of system deployment experience, so YMMV.
* Provided you're familiar with PostgreSQL, and PostGIS.
I have a small webapp that shows my elevation based on my lat/lon. I built it with Google's API because I've used it before and it's what I know. It doesn't get much traffic, so it's unlikely to ever cost much once my free credit expires, but I'd like to explore other options.
What alternative (preferably free) services are there out there? I need to be able to pass lat/lon and get back the elevation above sea level of that point on earth.
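Open-Elevation is one free option worth a look; assuming its public lookup endpoint (it's a community-run service, so verify availability and rate limits before depending on it), a sketch:

```python
import json
from urllib.parse import urlencode

def elevation_url(lat, lon):
    """Build an Open-Elevation point-lookup URL (endpoint shape per its
    public docs; verify before relying on it)."""
    return ("https://api.open-elevation.com/api/v1/lookup?"
            + urlencode({"locations": f"{lat},{lon}"}))

url = elevation_url(51.4778, -0.0014)  # Greenwich

# A typical response body shape (this sample is hand-written, not fetched):
sample = json.loads(
    '{"results": [{"latitude": 51.4778, "longitude": -0.0014, "elevation": 45}]}'
)
meters = sample["results"][0]["elevation"]
```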
Using Mapbox's own backend would have been too expensive for us, since our app does asset tracking, a use case almost all of the bigger mapping providers (we checked Google, Bing, HERE, Mapbox) want to earn an extra buck on.