Interesting situation. One point to make: CakePHP cannot cause browser compatibility issues. CakePHP is a backend framework and does not itself cause client-side issues. I know this is splitting hairs, but many things can contribute to client-side problems. If a design is in bad shape (i.e., browser considerations were not made during planning), then CakePHP wouldn't be a relevant factor in front-end issues.
That's all, carry on. I hope everyone's bickering and finger-pointing goes well on this fine Friday.
I've used pretty much all of the big geocoding services, and here are the problems I've run into.
1. Rate limiting: I get it, you have to make money and/or limit your freeloading, but rate limiting has killed things I've built in the past, especially Google's hard rate limit. A soft rate limit, or an alternate way to monetize, would be huge.
2. Accuracy: Mapbox's geocoder is not good. Aside from inaccurate map tiles, their geocoder misses entire US ZIP codes. PLEASE at least include helpful error messages and a path for reporting incorrect results.
3. A solution for shared IPs and rate limiting: I have helped several small websites that come nowhere near Google's daily rate limit, but because their IP was shared with someone else who had used up the quota, they were not allowed to make geocoding calls. This forced us to use a different service.
Honorable mention: It would be nice to be able to specify what data I get back from a call. If all I need is lat/lng, I don't need another kilobyte of neighborhood/city/time zone info in my result.
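To make that suggestion concrete, here's a hypothetical sketch of what a field-selection parameter could look like on the request side. The endpoint, the `fields` parameter name, and the field names are all invented for illustration; no real geocoding API is implied.

```javascript
// Hypothetical sketch: a `fields` query parameter that asks the geocoder
// to return only the listed fields (e.g. just lat/lng), instead of the
// full neighborhood/city/time-zone payload. Endpoint and parameter names
// are invented for illustration.
function buildGeocodeUrl(address, fields) {
  const url = new URL("https://geocoder.example.com/v1/geocode");
  url.searchParams.set("q", address);
  if (fields && fields.length) {
    // Omitting `fields` would mean "give me everything", preserving
    // backward compatibility for existing clients.
    url.searchParams.set("fields", fields.join(","));
  }
  return url.toString();
}
```

A response trimmed this way saves the provider bandwidth too, so it could plausibly be a win on both sides.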
...rate limiting has killed things I've built in the past, especially Google's hard rate limit.
Sometimes it's possible to shift queries to the client, and then build in enough intelligence to: run only ten queries at a time, delay queries by a period that backs off, save results in localStorage, etc.
This won't solve all problems, and perhaps it annoys users to see the first ten locations pop up immediately while subsequent locations have some random delay the first time they visit a particular resource, but it does make some things possible that would not be otherwise.
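The approach above (small batches, a backing-off delay, cached results) can be sketched roughly like this. The `geocode` function and the cache store are stand-ins: in a browser you might back the cache with localStorage, while here it's a plain `Map` so the sketch stays self-contained.

```javascript
// Sketch of client-side throttling for geocoding requests, assuming a
// caller-supplied async `geocode(address)` function. Runs only
// `batchSize` queries at a time, delays between batches with exponential
// back-off, and caches results so repeat visits skip the network.
async function geocodeAll(addresses, geocode, {
  batchSize = 10,     // run only this many queries at once
  baseDelayMs = 500,  // initial delay between batches
  cache = new Map(),  // swap in a localStorage-backed store in a browser
} = {}) {
  const results = new Map();
  // Serve anything we already have from the cache.
  const pending = addresses.filter((a) => {
    if (cache.has(a)) { results.set(a, cache.get(a)); return false; }
    return true;
  });
  let delay = baseDelayMs;
  for (let i = 0; i < pending.length; i += batchSize) {
    if (i > 0) {
      await new Promise((resolve) => setTimeout(resolve, delay));
      delay *= 2; // back off a little more before each later batch
    }
    const batch = pending.slice(i, i + batchSize);
    const coords = await Promise.all(batch.map(geocode));
    batch.forEach((a, j) => {
      cache.set(a, coords[j]); // saved results skip the network next time
      results.set(a, coords[j]);
    });
  }
  return results;
}
```

As noted, the first ten locations appear immediately and the rest trickle in on a first visit, but cached results make later visits instant.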
* Offer unlimited access for more money (popular)
* Offer a cheap, simple way to batch requests of very large and/or unlimited sizes (e.g., CSV upload)
The latter is nice, because it's not worth it for me to upload a CSV file with 1 address in it -- I might as well use your regular API. And it's not worth it for just 25 addresses, either. There's some threshold where it becomes more useful for me to submit my addresses in bulk, and that's where the CSV files come in. It should be way cheaper for you to process a file with 1 million rows than it would be to process 1 million API requests, so if you were to do that, it would be a gold mine for businesses like mine that require geocoding capabilities for millions of addresses at a time.
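That threshold can be made concrete with a toy break-even calculation. All prices here are invented placeholders, expressed in the same integer unit (say, tenths of a cent) to keep the arithmetic exact; the point is only the shape of the trade-off.

```javascript
// Toy break-even calculation for per-request API pricing vs. a flat-fee
// bulk CSV upload plus a (cheaper) per-row cost. All prices are invented
// and share one integer unit.
function breakEvenRows(perRequestCost, csvFlatFee, perRowCost) {
  // Bulk wins once: csvFlatFee + rows * perRowCost < rows * perRequestCost.
  // Solving for rows gives the smallest batch where CSV upload is cheaper.
  return Math.ceil(csvFlatFee / (perRequestCost - perRowCost));
}
```

Below the returned row count, the regular API is the better deal; above it, bulk upload wins, and for a million-row file the gap is enormous.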
I really don't want to hijack the thread, but I couldn't help noticing that the company I work for [1] recently added a lot of these features, which might be of interest.
We support both bulk CSV upload and an API endpoint for batch geocoding. We are also starting to introduce unlimited access for a flat monthly fee (with no limit on requests per second); please contact us if you're interested [2].