Ask HN: Do you use an elevation API?
2 points by bckygldstn 35 days ago | 3 comments
I've been working on building an alternative to the Google Maps Elevation API, making use of the high-quality open elevation data released over the last few years: the 30m Copernicus global dataset, improved 1m coverage of the US and England, EU countries releasing national datasets under open licenses.

I'd love to know more about how HN uses elevation data and APIs. I have a 9 question survey here: https://forms.gle/1EhX4c2mLHuRTR1C9 and will share the results with the community.

It would also be great for people to share their use cases in this thread!

Some of the use cases for elevation data I've worked on or have seen:

• Raster data for flood modelling

• Point queries comparing elevation to flood models

• Using elevation to improve accuracy of hyper-local weather modelling

• Elevation profiles of activities: like Strava, but for various niches

• Flight planning for various aerial activities: drones, general aviation, hang gliding, paragliding, normal gliding

Compared to self-hosting, APIs add latency and remove control. But people seem to use them to avoid dealing with multi-TB datasets, to abstract away a lot of geospatial complexity (projections, tiling, geoids), and to avoid juggling lots of different datasets from different sources.
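
To make that complexity point concrete, here's a minimal sketch of what a single point query reduces to once a tile is in hand: invert the tile's affine georeferencing to turn (lon, lat) into a pixel index, then read the cell. The grid, transform, and all numbers here are hypothetical stand-ins for what a raster reader would give you.

```python
# A hypothetical 4x4 elevation tile (metres); row 0 is the northernmost row.
GRID = [
    [10.0, 12.0, 14.0, 16.0],
    [11.0, 13.0, 15.0, 17.0],
    [12.0, 14.0, 16.0, 18.0],
    [13.0, 15.0, 17.0, 19.0],
]

# North-up affine transform: lon = X0 + col*DX, lat = Y0 + row*DY (DY < 0).
X0, DX = -120.0, 0.25   # west edge, pixel width in degrees
Y0, DY = 45.0, -0.25    # north edge, pixel height in degrees (negative)


def elevation_at(lon: float, lat: float) -> float:
    """Nearest-neighbour elevation lookup on a north-up grid."""
    col = int((lon - X0) / DX)
    row = int((lat - Y0) / DY)
    if not (0 <= row < len(GRID) and 0 <= col < len(GRID[0])):
        raise ValueError("point outside tile")
    return GRID[row][col]


print(elevation_at(-119.9, 44.9))  # row 0, col 0 -> 10.0
```

A real service adds the parts this sketch skips: picking the right tile, reprojecting the query point into the tile's CRS, interpolating between cells, and correcting between ellipsoidal and geoid heights.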

And with the Google Maps API in particular, people struggle with the high cost of course, but also with the lack of provenance for the underlying data and the accuracy reduction of batch queries.
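
On the batch accuracy point: as I understand it, batched locations go through the encoded-polyline format, which rounds coordinates to five decimal places. Here's a quick stdlib sketch of the position error that rounding alone can introduce (the haversine formula is standard, nothing Google-specific, and the test point is hypothetical):

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (spherical Earth, R = 6371 km)."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


# A point chosen to round badly at 5 decimal places.
lat, lon = 47.0000049, 8.0000049
error_m = haversine_m(lat, lon, round(lat, 5), round(lon, 5))
print(f"rounding error: {error_m:.2f} m")  # sub-metre, but nonzero
```

Sub-metre sounds small, but on a 1m DEM near a cliff edge or levee crest it can move the query onto a different cell entirely.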

abstreet.org has an offline import process that combines data from OpenStreetMap, city-specific GIS datasets, and elevation into a single file. The process has to be deterministic: given the same input and code, it must produce the same output, so calling out to an external API is a non-starter. We use https://github.com/eldang/elevation_lookups, a Python library that downloads missing LIDAR or SRTM data and uses GDAL to handle batch queries. Two issues with it: it has a lot of dependencies (so we run it in Docker), and lookups can't be parallelized without blowing up memory, due to some GDAL caching internals.
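
If the blow-up comes from GDAL's raster block cache, one knob worth knowing is the real `GDAL_CACHEMAX` config option (interpreted as megabytes by default). A minimal sketch, assuming it's set per worker process before GDAL or rasterio is first imported:

```python
import os

# GDAL sizes its raster block cache from the GDAL_CACHEMAX config
# option (megabytes by default). Capping it per process, before GDAL
# or rasterio is first imported, bounds cache memory when many
# parallel workers each hold datasets open.
os.environ["GDAL_CACHEMAX"] = "64"

print("GDAL_CACHEMAX =", os.environ["GDAL_CACHEMAX"])
```

Whether 64 MB is the right cap depends on tile size and access pattern; the point is just that the default cache scales with system RAM per process, which multiplies badly under parallelism.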

Cool, I learned about abstreet the other day from HN! Thanks for the feedback.

These days I run everything geospatial in Docker containers; the dependencies in geo are tricky.

Interesting about the memory/caching issues. I was going to suggest rasterio, which I use for batch queries in https://github.com/ajnisbet/opentopodata and which comes bundled with its own GDAL binary, but it looks like you're already going through GDAL.
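
For reference, the chunked, streaming shape of a batch lookup looks roughly like this stdlib-only sketch; the raster read here is a fake stand-in (a real version would be a rasterio read, e.g. `dataset.sample`), and the elevation formula is made up just to keep it deterministic:

```python
from itertools import islice


def chunked(points, size):
    """Yield successive lists of up to `size` points; the full input is
    never materialised, so memory stays bounded for huge batches."""
    it = iter(points)
    while chunk := list(islice(it, size)):
        yield chunk


def read_elevation(lon, lat):
    """Stand-in for a raster read; a real version would hit GDAL/rasterio."""
    return round(100.0 + lon + lat, 3)  # fake, but deterministic


def batch_lookup(points, chunk_size=1024):
    """Memory-bounded batch lookup: results stream out chunk by chunk."""
    for chunk in chunked(points, chunk_size):
        # A parallel version would open one dataset handle per worker
        # here, keeping each worker's block cache independent.
        for lon, lat in chunk:
            yield read_elevation(lon, lat)


elevations = list(batch_lookup([(-120.0, 45.0), (2.35, 48.86)], chunk_size=1))
print(elevations)  # [25.0, 151.21]
```

Because everything is a generator, peak memory depends on `chunk_size` rather than on how many points the caller sends.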

I've also used zarr + tifffile for GeoTIFFs in particular; it's faster and avoids a lot of GDAL's warts, but you still need something like rasterio to read the geospatial metadata and handle projections.
