Beat me to it. I was working on this last night but had to sleep before writing it up. I was trying to use parcel as a bundler and was thrown because whatever dev server it uses doesn't seem to support HTTP range requests properly. I was getting "invalid magic number" on the client side.
That's really cool. Could be really useful for local businesses. They often embed Google Maps, but I feel like having a map of the local area would be sufficient, and cheaper.
It depends what you use it for. I plan to use it for blog posts where I show the track of where I walked. I actually like that the map won't update. It wouldn't make sense to show me walking across a map of the future, would it? It's a record of the past and the map should be frozen just like the writing.
Sure, but in GP's context of local businesses, they certainly will want to keep up-to-date with any changes to make sure customers can still find them.
Protomaps generates daily world files, I believe [1], but I haven't confirmed that they're used by the download tool yet. In theory you can download the data every day to pick up updates.
I did the same and got a map of Gran Canaria which is 15MB! And it's not like a user is ever going to download the entire file. It's really quite cool and I'm quite excited to put maps on my static site.
I realise this isn't tech support but I have no idea why it doesn't work for me. The pmtile file you linked works, but creating my own smaller one doesn't.
But the concept is great for getting a map for a project.
I've been an avid Mapbox user since getting alpha access to TileMill. From my perspective, this is the most important contribution to the open mapping space since the introduction of vector tiles and Mapbox GL. Mapbox, in my opinion, left the door open in how they approached tile baking with MTS: while powerful, it was way too confusing and expensive. And while you could always bake tiles with Tippecanoe, you still had to struggle with vector tile hosting. With PMTiles, vector tile hosting is arguably easier, more convenient, and an order of magnitude cheaper than going with Mapbox (especially if you don't need fancy projection / 3D support and are OK with MapLibre). Felt is one of the major services that I believe uses the PMTiles approach behind the scenes, and they won't be the last. (Notably, they have continued to invest in Tippecanoe, which is awesome.) With Mapbox's focus shifting, it appears, to car navigation, it's great to see disruptive innovation coming from new places. Even more amazing that it has come from a one-person team.
Also of note: the emergence of PMTiles, coinciding with DuckDB's spatial query support, will unlock a lot of innovation. The ability to query and process large GeoParquet files and quickly stream them into baked PMTiles enables a lot of really compelling use cases.
Mapbox is still the best choice where a polished suite of mapping APIs is a better fit for a project.
Mapbox is a venture-backed company with a SaaS business model, and has never been open source in total - it used to be open core with a FOSS frontend and proprietary backend. This SaaS model is absolutely the best way to fund huge companies and give investors a return. Mapbox has also done the bulk of innovation in open source web mapping over the past 10 years - the Protomaps project uses MapLibre (fork of Mapbox GL 1.0) and the MVT tile format. Both required teams of full-time developers - easily tens of millions in salaries and stock - and they have given away version 1.0 for free as a gift, even though 2.0 is not open source.
The ideal software economy is one in which innovators capture a good portion of the wealth they create. This is why it's important for Protomaps to focus on use cases underserved by SaaS, instead of just being a cheaper map API. The sibling comment on wildfire mapping https://news.ycombinator.com/item?id=37989059 is a good example of the applications I want the project to support.
> The ideal software economy is one in which innovators capture a good portion of the wealth they create.
Beg to differ: the ideal software economy maximally empowers end-users at the absolute minimal cost. Innovators can and should leave substantial cash on the table. They should see themselves as stewards of a public good.
The ideal software economy would be all free (as in freedom) software. The current economy is hugely wasteful, with lots of redundant work and generally bad outcomes.
A private-ownership economy doesn't work for zero-marginal-cost products even in theory. It's a huge waste of our resources and a big hindrance to progress.
I lurk in an online community of journalist coders, and some folks there are excited about PMTiles as a cost-saving way to host your own custom-styled map tiles.
Most of these piggyback on OSM for the data tier, which $T companies work on. Afaict the discussion is more about the compute on top: tile generation, interactive renderers, etc.
I asked the question because it's an interesting question. In our world flooded in misinformation and cheap data, there's often little accurate, high-quality data.
I'm not sure what answer you are looking for / what alternative you are suggesting?
OSM is the biggest community effort - NGO, volunteer, corporate, etc - to solve data quality in GIS. The participants do everything to improve quality, from individuals walking around with GPS devices to companies launching low-earth-orbit satellites into space and putting self-driving cars on the ground with AI error detection.
There is a more corporate and, afaict, anti-Google effort more recently by TomTom and Google competitors (Microsoft, Meta) called Overture, which seems to be attempting a more closed, big-corporate-governed fork and ecosystem replacement of OSM, even if they phrase it as complementary. Your questioning of OSM-as-misinformation seems interesting in the comparative context of alternatives like Google Maps, Apple Maps, and Overture.
It is an interesting question. I've worked with a fair bit of geo-data, and found that pretty much any sufficiently large geo-dataset, regardless of whether it's crowd-sourced or purchased from a commercial provider, tends to have a fair number of data accuracy issues.
If you spend any time on either OpenStreetMap or Google Maps in the rural US, you are very likely to find both missing streets and streets that have been completely hallucinated out of nothing -- mostly because they were both originally sourced from the US Census TIGER data, a US government dataset that tends to be full of errors.
OpenStreetMap contributors often improve these issues over time (though it largely depends on whether the area you care about has a sufficiently dedicated mapper).
Because nothing meets that standard, it's not an applicable measure of quality. Also, it tends to lead to binary thinking: 100% or null.
But we live in a world of less than 100%, and there is a lot of variation: Five nines, 80%, datasets with just a few valuable records, etc. That's where the real-life questions are.
The ordering applies across tiles, not within them - it should be efficient to diff two archives because there is a single, well-defined ordering over the entire archive, and writing algorithms over the tileset becomes all 1D operations instead of 2D (a single zoom-level plane) or 3D (multiple planes).
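For context, that 1D ordering can be sketched as a global tile ID: all tiles of lower zooms come first, then tiles within a zoom follow a Hilbert curve. This is a sketch based on my reading of the PMTiles v3 spec; treat the details as illustrative:

```python
def hilbert_d(order, x, y):
    # Map (x, y) on a 2^order grid to its distance along a Hilbert curve
    # (classic iterative xy-to-d algorithm)
    d = 0
    n = 1 << order
    s = n >> 1
    while s > 0:
        rx = 1 if (x & s) else 0
        ry = 1 if (y & s) else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:  # rotate the quadrant so sub-curves connect
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s >>= 1
    return d

def zxy_to_tileid(z, x, y):
    # All tiles of zooms 0..z-1 precede zoom z; within a zoom, Hilbert order
    base = sum(4 ** i for i in range(z))
    return base + hilbert_d(z, x, y)

print(zxy_to_tileid(1, 1, 0))  # 4: last of the four z=1 tiles
```

With this scheme, diffing two archives or iterating "all tiles in order" is a plain 1D scan over tile IDs.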
I assume that would require BitTorrent v2 (and, I think, more specifically something along the lines of BEP 46, Updating Torrents Via DHT Mutable Items), or your map is gonna be out-of-sync and sad after a while.
Are there torrents in the wild that use this? And are there BitTorrent clients that implement it? How do they surface the mutable aspect to the user?
(Is it like your torrent client will silently update files, and if you don't want to update anymore you need to pause? Can you still seed if you don't want to update?)
I've been testing a setup that automatically generates OpenStreetMap/OpenMapTiles-formatted pmtiles and mbtiles, then makes them available by torrent. The torrents get placed in an RSS feed at https://planetgen.wifidb.net/
Basically, what I did was set up the qBittorrent client to watch the OpenStreetMap weekly torrent RSS feed. When it downloads a new PBF file, it runs this script I made at https://github.com/acalcutt/PlanetilerTorrent , which was based off the OSM torrent creation process.
The script creates pmtiles and mbtiles using Planetiler, then makes torrents and starts seeding them in qBittorrent.
In qBittorrent I have options like how long I want to seed for (30 days), what to do when done seeding (delete the files), and also speed controls so I can limit bandwidth during my working hours.
I have this running on an old laptop with a 2TB NVMe drive and 64GB of memory. It seems to work pretty well so far. It would be nice if my internet speeds were a little better for initial seeding, but at least the torrents share the load with other people who are downloading.
I don't think so, because it's not updating the same torrent; it's making new ones. I am transmitting the torrents over BitTorrent/DHT like they mention, not HTTP, but without it being the same file I'm not sure that counts as the same thing.
> The Google Maps API has a generous free tier, but high fees past 30,000 map loads. Protomaps is designed from the ground up to be a system you run yourself - as a 100% static backend on cloud storage - so you only incur the usage costs of storing and serving that data, which can be pennies for moderate usage.
It would be interesting to see a cost comparison: recommended hosting setup vs Google Maps, and at what point the line intersects.
I haven't found this info in the FAQ, but it would be interesting to know how large that PMTiles file is for a medium-sized country.
It seems to be a really attractive way of self-hosting maps that don't have to be 100% up to date and are only used in a specific region. "Recompiling" that file once a year and uploading it to a static file host would be an easy process with very few external dependencies.
Edit: Found the official download link[1] for the whole world, which is a bit over 100 GB. So I guess the answer to my question would be "hundreds of MB to a few GB", which seems totally fine.
That's true - I suppose the cost changes depending on the size of the area you're mapping. Maybe a representative set? A city (e.g. Johannesburg), a small country (e.g. UK), and a large country (e.g. USA)?
Cloudflare: 100GB hosting is $1 and $.36 for 10 million requests.
If you read the Protomaps docs it explains how to cache the requests on Cloudflare CDN so you only have to pay for each tile request once per cache period. It's quite cheap.
Edit: previous submission title for Protomaps: Serverless maps at 1/700 the cost of Google Maps API
Okay, but the question was what it actually costs, not what our favorite MITM provider offers as a commercial service. (They have the worldwide hardware and sufficiently deep pockets to offer this at a rate that guarantees gaining market share, below the cost price of normal hosting providers.)
> PMTiles is a single-file archive format for pyramids of tiled data. A PMTiles archive can be hosted on a storage platform like S3, and enables low-cost, zero-maintenance map applications.
> PMTiles is a general format for tiled data addressed by Z/X/Y coordinates. This can be cartographic basemap vector tiles, remote sensing observations, JPEG images, or more.
> PMTiles readers use HTTP Range Requests to fetch only the relevant tile or metadata inside a PMTiles archive on-demand.
> The arrangement of tiles and directories is designed to minimize the amount of overhead requests when panning and zooming.
PMTiles is amazing. We discovered it when we were building a Canadian wildfire hotspot map this summer.
We were taking daily CSV files, turning them into vector tiles, syncing them to S3, and displaying them in a mapbox-gl map. The biggest cost was S3 file operations. We had to cluster aggressively to bring our costs down, and got to a point where the tiles numbered ~75,000. At the frequency we wanted to update, this was costing $500/mo in S3 ops.
Now we process the files into a single PMTiles file, deploy that one file and cut the costs down to pennies.
For a map that we make public, it was incredibly enabling. You can check out the Canadian hotspot data for all of 2023 if you're curious: https://lens.pathandfocus.com
PMTiles is the vector counterpart to Cloud Optimized GeoTIFF (COG), allowing clients to use mapping data efficiently when the server supports HTTP range requests.
Previous iterations of this on the vector side have been either a ton of small files (pbf) or a large file that needed a front end to serve the individual tiles (mbtiles).
At this point, you can take an OSM dump and convert it to a country-level basemap in minutes on a stout machine, or hours for a continent, which you can then self-host on a plain old web server with a custom style sheet.
This can take seconds to minutes for small-medium areas and doesn't need a powerful computer at all. The drawbacks are that the build is only daily, and the lower-zoom tiles aren't clipped so include extra information beyond your specified area.
PMTiles is only the counterpart to the "overviews" part of COG; it doesn't give you analysis-ready data. For that, look to FlatGeobuf or GeoParquet (once it gets a spatial index in v1.1 of the spec).
Yeah, I'm aware of that. I love where you're going with GeoParquet; it's certainly quite nice for analysis and some viewing. But there are still going to be a lot of datasets and clients that can't handle that, where the tiled approach will work.
I'm looking at this currently for a couple of clients, and I think that for what we're doing, we're going to wind up serving a view version and an analysis version separately.
So, could PMTiles be used as a cheap/dirty/fast way to host other forms of geospatial data that isn't strictly "map tiles"? Such as a large-ish, infrequently updated POI database?
If the geometry is points/lines/polygons/multi* plus properties on the features, yes (in other words, if it's representable as GeoJSON), with the caveat that it's more for viewing and less for analysis.
That's essentially how the base layer works internally: there are land use, buildings, transit, physical, and border feature layers that are combined with a stylesheet to make the basemap tiles.
There are some subtleties around packing the overview data for the wider zooms, and around what happens when you have tiles that are too big because of the number of features or metadata size, where you can drop or coalesce features based on density.
Maplibre[1] + PMTiles + Felt's "tippecanoe"[2] (it can output .pmtiles) are an awesome combination for self-hosted web maps if you're ok with being locked into a Web Mercator projection
for pretty much any geospatial source you can convert to .pmtiles via GDAL[3] and tippecanoe
(.shp .gpkg ...) | ogr2ogr -> .geojson | tippecanoe -> .pmtiles
for OpenStreetMap data there's planetiler[4], and openmaptiles[5] styles that work with MapLibre
with those combinations you've got a great start to something you can host for pennies on AWS S3+CloudFront or Cloudflare R2, with an open source data pipeline
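Spelled out, that pipeline might look like the following (filenames are illustrative; assumes GDAL's ogr2ogr and Felt's tippecanoe are installed):

```shell
# Reproject any OGR-readable source (.shp, .gpkg, ...) to lon/lat GeoJSON
ogr2ogr -f GeoJSONSeq -t_srs EPSG:4326 features.geojsonl source.shp

# Bake a .pmtiles archive of vector tiles from it
# (-zg guesses a maxzoom; --drop-densest-as-needed keeps tiles under size limits)
tippecanoe -o features.pmtiles -zg --drop-densest-as-needed features.geojsonl
```

The resulting `features.pmtiles` can then be uploaded as-is to S3/R2 and pointed at from MapLibre via the PMTiles protocol plugin.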
I'm a student trying to upskill in geospatial/compsci. What kinds of projects could you make with all of this? Any good starting points you'd recommend?
Not OP, but almost anything where "a map" is the output will be a huge learning opportunity for you. Static maps, print-quality maps, interactive web maps -- the experience of building any of those will bring lots of learning. (That's how I got started.)
I’ve contributed to this project, and we use it in production at work! It’s great. Brandon is very nice, and I really appreciate all the work that has been put into this.
We've had quite some success with deploying "serverless" maps leveraging PMTiles. Have a look at https://github.com/serverlessmaps/serverlessmaps if you're interested in deploying this on AWS CloudFront/Lambda@Edge/S3...
I wonder if Github Pages supports HTTP range. Obviously putting a multi-GB blob on there would probably be abuse, but a small blob limited to a small region with limited detail would be really nice for having a map to accompany a blog post or something.
It would be cool to have an automatic tool that generates a blob of any size, including as much detail as possible within your given blob size limit...
--dry-run will output the total size of the final archive without downloading any tiles, so you can adjust the maxzoom from the default of 15 (for basemaps) until it fits under your limit
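For instance (the source URL, bbox, and zoom here are illustrative placeholders):

```shell
# Print the would-be archive size without downloading any tiles;
# lower --maxzoom from the default of 15 until it fits your budget
pmtiles extract https://example.com/world.pmtiles out.pmtiles \
  --bbox=-0.5,51.3,0.3,51.7 --maxzoom=12 --dry-run
```

Re-run with decreasing `--maxzoom` values, then drop `--dry-run` once the reported size is under your limit.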
I am curious about maps and the software to generate and play with them, but when I tried - a few years ago, admittedly - to dabble in it (I wanted to generate those artsy, nice city map pictures for my hometown [1]), I was overwhelmed by the ecosystem, all the steps involved, all the options, etc.
Does anyone have a good introduction to paint a high level picture of this, so someone like me can navigate?
The Protomaps basemap is also built on planetiler: https://github.com/protomaps/basemaps - and Brandon is one of the main contributors to planetiler!
We actively use this at Street Art Cities (https://streetartcities.com). Really helped keep hosting costs down for a project that is more community than commercial, and created a more cohesive experience.
Natural Earth is generalized (low zoom) data and has hand-curated scaleranks on features to determine what features appear at low zooms. None of that is in OSM.
In theory you could do that yourself with a vector basemap. Wouldn't it be great if the browser's minimum font size applied to absolutely everything (i.e. minimum font size actually meant "I cannot read anything below this size, so never render it like that")?
That is a constant problem on Google's and Apple's phone maps. The names are almost unreadable to many people I've encountered whose vision has issues in the normal range (e.g., they wear glasses).
OpenStreetMap is a database that you can use to produce maps from. OpenStreetMap, the organization, does not produce maps as a service directly. They do provide a map editor, which of course also renders tiles, and they host the central database that people contribute to with that editor. But their tiles are not otherwise distributed via APIs or dumps. The only thing they provide is a database dump in various formats.
Mostly, people either generate their own tiles from this (using a variety of OSS and proprietary tools) or use companies such as Mapbox, MapTiler, etc. that do it for them. Typically the commercial options bill using a metered per-request model, which can make using maps quite expensive. Ever since Google raised their prices substantially a few years ago, alternatives like Mapbox have become a lot more popular. Mapbox and similar products are also not cheap.
What PMTiles does is dramatically lower the price and complexity of self-hosting your own maps, down to basically the raw CDN network cost plus a small amount of overhead for storage and computation. The tiles are generated, e.g. by Lambda functions, from a single PMTiles file on demand and then cached in the CDN. Any subsequent loads of the same tile are cache hits. So, especially for busy websites and apps, the savings can be substantial.
I really wish there were a PMTiles implementation for MapLibre GL Native like there is on the JS side (I believe the JS side actually has a way to add arbitrary data protocols). Of course I could stand up some sort of worker or function to sit in the middle, but having this all work client-side on Android and iOS would be a game-changer for me!
A PMTiles archive is basically a single giant file that can sit on a CDN somewhere, and the client can ask for a certain part of that file using the byte-range header. It eliminates having to either run a server that serves tiles, or pay through the nose for I/O when serving static files (uploading literally millions of tiles costs a lot, and not just because of their size).
If you want to embed a map file in your app, you can already use MBTiles for that. An MBTiles file is conceptually like PMTiles, in a way, but it's a SQLite database containing the tiles. Mapbox/MapLibre already support local MBTiles. You can serve them online, but of course you need something taking in the z, x, y coordinates and serving back the tiles.
Most of, or the entire, value proposition of PMTiles is that you can dump a single multi-GB file on a CDN somewhere, and clients can just request a little slice of it. No servers or logic required (besides a CDN that supports byte-range headers).
I am hoping that MapLibre Native will one day support PMTiles sources: `source: pmtile://my.cdn.com/bigOleFile.pmtile`, and then it just works as if pointing at a traditional slippy tile source.
You download it within the app, or drop it into iCloud and tell the app where to find it. I have 200 gigs of offline map data on my phone with a custom native map app.
Not a big deal with terabyte storage on mobile these days.
These PMTiles don't seem to work well for users behind corporate proxies. The proxy must have to download the entire file before releasing it, which does not make for a nice user experience. I wonder if there is a way around this?
If not, then it could probably be done with an additional proxy that converts range headers to GET params. But TBH, a corporate proxy that behaves like this doesn't seem to be working properly in the first place; for example, all other header-based authentication should be broken if the proxy doesn't respect additional headers.
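The header-to-params translation in such a shim would be trivial. A sketch (the query-param scheme and names are hypothetical; the backend would have to understand them and slice the file itself):

```python
def range_to_query(range_header):
    # Translate an HTTP Range header like "bytes=0-16383"
    # into hypothetical GET query params a backend could honor
    unit, _, spec = range_header.partition("=")
    assert unit == "bytes", "only byte ranges handled in this sketch"
    start, _, end = spec.partition("-")
    return {"offset": int(start), "length": int(end) - int(start) + 1}

print(range_to_query("bytes=0-16383"))  # {'offset': 0, 'length': 16384}
```

The backend would then read that slice of the archive and return it as a normal 200 response, sidestepping the proxy's Range handling entirely.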
I'm currently self-hosting a planet OpenMapTiles mbtiles file with several styles on top of it. I've been meaning to look into Protomaps closer to see if I could use it to reduce costs. It looks like a very promising project!
OpenStreetMap is the data they use. Other software usually fetches tiles (images) to display, but here they built a single, much smaller file from OSM data which is rendered directly by clients.
For a while I've wanted an SVG map with a JS API where you can input latitude and longitude and show the point on it. I hope this will do that!
The reason I want that is that almost all humans have a GPS device in their pocket. You could easily get your approximate location from this, even without cell service. The implementation could be a PWA.
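The rendering half of that idea is simple once you have coordinates: project lon/lat into the SVG's coordinate space. A minimal equirectangular sketch (assumes the SVG viewBox spans the whole world; in a PWA the inputs would come from the device's geolocation API):

```python
def lon_lat_to_xy(lon, lat, width, height):
    # Equirectangular projection: map lon [-180, 180] and lat [90, -90]
    # onto a width x height viewport (SVG y grows downward)
    x = (lon + 180.0) / 360.0 * width
    y = (90.0 - lat) / 180.0 * height
    return x, y

print(lon_lat_to_xy(0, 0, 360, 180))  # (180.0, 90.0): map center
```

Placing a `<circle>` at that (x, y) inside the SVG would mark the user's position.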
Your best bet for this is something like OsmAnd or Organic Maps, which download countries' worth of OSM data and make it available offline.
The amount of map data for any reasonable area is going to be too large for a PWA, though it would technically be possible with existing map viewers and this format (e.g. MapLibre or Leaflet).
Thanks. I'm OK with a single layer just showing the outlines of coastlines and regions, including the names of regions. I think a suitable simplified world map could fit all this in SVG.
edit: Initially underestimated so did some calculations below
Key Data Points for Estimate of SVG Size
- Radius of Earth: 6371 km
- Surface Area of Earth: 510e6 km^2
- Approximate Number of Countries: 250
- Land Area of Earth: 29% of Surface Area = 150e6 km^2
Corrections and Estimations Based on Average Country Size
- Average Land Area per Country: 150e6 km^2 / 250 = 6e5 km^2
- Country Closest to Average Size: Ukraine, with 603550 km^2, or roughly 6e5 km^2
- World Resources Institute Measure of Ukraine Coastline: 4953 km, or around 5e3 km
Assumptions for SVG Complexity
1. Only coastlines matter for the paths.
2. Resolution set at 10 meters.
SVG Data Calculations and Estimates
- Coastline Points for Average Country:
- 5e3 km = 5e6 m
- Equals 5e5 X,Y points (at 10m resolution)
- Total Points for All Countries:
- 250 countries * 5e5 points = 1.25e8 X,Y points
- Storage Requirement for Each Point:
- 6 figures per coordinate + delimiters = roughly 15 bytes
- Total Storage = 1.25e8 points * 15 bytes = ~1.875e9 bytes or about 1.875 GB
Even with a simplified approach, we're looking at an SVG that'd be almost 2 GB, just for coastlines at a 10m resolution. Definitely some hefty complexities in play here.
What about 1000m resolution? 20 megabytes. Much more achievable.
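The arithmetic above is easy to parameterize by resolution (figures taken straight from the estimate; the helper name is ours):

```python
def svg_bytes(resolution_m, countries=250, coastline_km=5e3, bytes_per_point=15):
    # points per country = coastline length / resolution;
    # ~15 bytes per X,Y pair (6 figures per coordinate + delimiters)
    points_per_country = coastline_km * 1000 / resolution_m
    return countries * points_per_country * bytes_per_point

print(svg_bytes(10))    # 1875000000.0 bytes, ~1.9 GB at 10 m
print(svg_bytes(1000))  # 18750000.0 bytes, ~19 MB at 1000 m
```

So the estimate scales linearly with resolution: every 10x coarsening cuts the size by 10x.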
You probably get a better estimate by using Ukraine to estimate the scaling constant 4953 km ÷ √603,550 km², and then using a list of land areas to estimate coastline lengths.
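That refinement, sketched out (Ukraine figures from above; real coastlines are fractal, so treat the constant as very rough):

```python
import math

# Calibrate coastline ≈ k * sqrt(land_area) on Ukraine,
# then apply the same constant to other countries' land areas
k = 4953 / math.sqrt(603_550)   # ~6.38 km of coast per sqrt(km^2)

def est_coastline_km(area_km2):
    return k * math.sqrt(area_km2)
```

By construction this reproduces Ukraine's 4,953 km exactly; for other countries it is only an order-of-magnitude guide.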
I like your refinements! I may take a look at reviewing the estimate later! Thank you for making this contribution, it's really valuable that you share your intuitive statistical knowledge here :)
Countries with coastlines, especially with fjords, have disproportionately longer borders. Scaling constants aren't going to do it for something like Chile.
SVG is kind of terrible for maps, but you can get pretty small with GeoPackage (read: SQLite). I recently spent a bit too long on exactly this problem and ended up with the following.
116KB - 5MB for country borders
16MB - 52MB for ~50K city/county level borders based on geoBoundaries
The range of sizes depends on how much custom compression/simplification you put into it. The source files are about 10x bigger, but that's already pretty small.
Well, mostly that it's text/XML you usually have to parse in full, and boundaries are data-heavy, so if you have anything more than a world map, I don't see that working very well.
In contrast, with the OP's format or GeoPackage, you can query by tiles/range and extract only the boundaries you need if you're zoomed in somewhere. If you use Tiny Well-Known Binary compression, you can also shrink the data quite a bit while keeping the querying capabilities.
But if you only ever need to render the whole thing, TopoJSON is probably the winner, as it cleverly encodes each unique border only once, so it tends to be a lot smaller.
And of course if SVG works for your case, go ahead and use it, it's surely the easiest way to render vector gfx in the browser :)
I'm a total noob at all this, so I totally get it if you can't help; it's not your problem, no worries! But would you be able to share any code to convert this, or any of the data?
I can't share this data, but there are a couple of ways to get started:
1) Just use the overview tiles from the global protomaps dump (OSM source). The getting started guide can walk you through some of it: https://docs.protomaps.com/guide/getting-started but you'd want to use the pmtiles tool to extract the widest zooms from the global dataset.
Then see the intro to pop it into a viewer: it's a few dependencies and a stylesheet, and it pops into OpenLayers, Leaflet, or MapLibre. There may be some extra layers there, but at wide zooms there shouldn't be much data (e.g. buildings).
2) If you have/find your own data, use Felt/tippecanoe to convert it.
LLCs in this context signal "This company has no outside investors", because almost every venture-backed technology company will be organized as a C Corporation instead.
I am choosing to run this open source project through a commercial company - that enables me to have a bank account, use GitHub Sponsors, pay others for work, and enter into support and development contracts with other companies and the public sector. Non-profit foundations like a US 501(c)(3) aren't practical for solo developers to create.
Not OP, but I believe the processes associated with LLCs are far better understood and streamlined than those for SPCs. If I were DIY-founding a legal entity myself, I'd try to follow the path of least surprises too.
> Usually .com domains were used by commercial entities
I must disagree. Even in the early days, although that was the intended use of the .com TLD, it was actually always used as the default TLD instead of .net.
I really hate it, but whenever I see a .com, I never have and never will assume it has some sort of commercial angle.
I'd go so far as to say "especially in the early days". When I got my first domain in ~'92, it was just a personal domain, and they wouldn't let me get a .net because I wasn't an ISP, or a .org because I wasn't a non-profit. And .edu was right out.
The head-scratcher is more like: If it's a non-profit, they (and I) would have expected a .org, which, unlike .net, has been used consistently over the years. But I get your point in general.
At the bottom of the frontpage there's a remark on "revenue"
A 100% independent software project
Protomaps is a self-funded, solo developer project with a mission to make interactive cartography accessible to hobbyists and organizations of all sizes. An essential part of that mission is publishing open source software under commercial-friendly licenses.
You can support my full-time work on Protomaps in a few ways:
* Downloading the open source world basemap tileset with a support plan on GitHub Sponsors.
* Paid development of open source features.
From what I can tell, it's a for-profit company for a single dev that is attempting to get by through nothing but GitHub Sponsorships.
Said sponsorship is a requirement for using the pre-hosted files for commercial uses in a SaaS-like manner.
But some of the wording in the FAQ is a bit confusing, e.g. there's a whole section on "Why don't you just sell plans for the hosted API?" right after detailing how to sign up for said plan for the hosted API.
That is correct; in addition, I have contracts with companies for support as well as development of open source features.
The hosted API is "free" in the sense that there are no tiers. The API costs me money to run, which is why I require a GitHub sponsorship for commercial use. This is ideal for use cases where deploying your own tileset is impractical.
If you are using the hosted API heavily then it makes sense for you to "graduate" to deploying it yourself.
I'd like to correct that impression - Protomaps is an explicitly commercial venture.
The difference from other commercial vendors is the focus on a complete, easily deployable solution - the hosted API is meant for non-commercial and light commercial use, instead of being the main product offering, to avoid the incentive trap of locking-in paying users to the API.
Cool, but basemaps are a very tricky thing from a geographer's perspective. There are several border conflicts, and when it's open source there could be some manipulation.
Immediately getting downvoted. All I'm saying is that official map boundaries should be used these days.
Since I'm posting too fast to reply again, my clarification: I mean official, not commercial or proprietary.
I think the downvotes may be because people take "When it is open source there could be some manipulation" to mean that you think proprietary alternatives are somehow safe from such manipulation—but of course they aren't, least of all Google Maps.
The Protomaps project encompasses the generation of basemap PMTiles and the ecosystem for delivering them to clients. It supplies a standard daily build from OpenStreetMap, and since the generation is open source it can be customized to conform with your specific requirements.
A commercial data provider can also be showing a manipulated border, and is at least making some kind of editorial decision about what they are showing (whether they think so or not).
Why isn't my time zone highlighted on the world map?
> In early 1995, a border war broke out between Peru and Ecuador and the Peruvian government complained to Microsoft that the border was incorrectly placed. Of course, if we complied and moved the border northward, we’d get an equally angry letter from the Ecuadorian government demanding that we move it back. So we removed the feature altogether.
> The time zone map met a similar fate. The Indian government threatened to ban all Microsoft software from the country because we assigned a disputed region to Pakistan in the time zone map. (Any map that depicts an unfavorable border must bear a government stamp warning the end-user that the borders are incorrect. You can’t stamp software.) We had to make a special version of Windows 95 for them.
>"There are several border conflicts. When it is open source there could be some manipulation."
Manipulation can and does happen with non-open-source data; just ask any politician. You should be thankful to developers who did a great service for free. If something is incorrect because of conflicts, the world will sort it out eventually. It is not the developers' job.
PS - I did not downvote, as I never downvote any posts.
I grabbed the latest macOS Go binary from https://github.com/protomaps/go-pmtiles/releases
I found a rough bounding box using http://bboxfinder.com/#37.373977,-122.593346,37.570977,-122....
Then I ran this:
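With the go-pmtiles CLI, an extract invocation takes the source archive, an output filename, and the bounding box. Roughly (the build filename and the truncated second longitude are placeholders, not from the original post):

```shell
# <build> and <max_lon> are placeholders; bbox order is min_lon,min_lat,max_lon,max_lat
pmtiles extract https://build.protomaps.com/<build>.pmtiles hmb.pmtiles \
  --bbox=-122.593346,37.373977,<max_lon>,37.570977
```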
The result was a 2MB hmb.pmtiles file which, when dragged onto the "drop zone" on https://maps.protomaps.com/ showed me a detailed zoomable map, with full OpenStreetMap metadata (airport polygons etc) - all the way down to the level of individual building outlines. All that in 2MB! I'm really impressed.
My file is here: https://gist.githubusercontent.com/simonw/3c1a5358ca8f782f13...