Not an expert (lol), but even if the recommendations here are solid, this doesn't feel like a particularly well-reasoned comparison.

IMHO maintaining a huge number of files on disk (be it 16M or 1.5B) should be a non-starter - not to mention that zipping them to slightly reduce the count removes one of the benefits of having them in the first place. A database makes sense - as does separating generation from distribution if it brings benefit - but immediately writing off SQLite because of write performance[1]? Similarly, the arguments against pmtiles feel outdated and ignore the developing nature of the protocol (appreciate it's a two-year-old post!).

OSM deciding on a format here has the potential to significantly shape things to come. Hopefully Paul's thinking has progressed a bit since this was written.

[1] https://blog.wesleyac.com/posts/consider-sqlite




Most of the development on pmtiles has happened in the last 2 years, including a maturing server implementation (http://github.com/protomaps/go-pmtiles), but some key parts are still missing, like the ability to decode an archive in native mobile applications. SQLite (MBTiles) already has 10+ years of integration into the mapping ecosystem, so it still works better if you want to move tilesets around to desktop applications and mobile devices.
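For anyone unfamiliar, an .mbtiles file is just a SQLite database with a well-known schema, which is part of why that integration comes cheap. A minimal read sketch in Python (the filename and coordinates are made up; note MBTiles stores rows in TMS order, so the XYZ y coordinate has to be flipped):

    import sqlite3

    def read_tile(path, z, x, y):
        # MBTiles uses a TMS y-axis, so flip the XYZ row number.
        tms_y = (2 ** z - 1) - y
        con = sqlite3.connect(path)
        try:
            row = con.execute(
                "SELECT tile_data FROM tiles"
                " WHERE zoom_level=? AND tile_column=? AND tile_row=?",
                (z, x, tms_y),
            ).fetchone()
            return row[0] if row else None  # e.g. gzipped MVT or PNG bytes
        finally:
            con.close()

    data = read_tile("planet.mbtiles", 14, 8185, 5449)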

The situation Paul is addressing is unique to OpenStreetMap itself: minute-level updates of a global-scale tileset. This is a use case pmtiles is explicitly designed not to address; a database is a better fit there.


Appreciate you weighing in directly, Brandon! To be clear though, I didn't mean to say that protomaps was potentially a better solution (of course an uncompressed DB makes much more sense), simply that the developing nature of the protocol means its inclusion within the "ecosystem" shouldn't be a closed case :)

Which is to say, distribution of OSM data feels like a large part of the process. Of course there are various bottlenecks and considerations around edits / writes, but in practice reads are surely the bigger load. I wonder how many OSM-external use cases actually rely on minutely updates or need the full fidelity of the raw data source. Feels like there is a solid case for providing less frequent (hourly? daily?) official updates via official single-file formats that could be widely distributed to the benefit of all - and that could, e.g., allow OSM to loosen up its hot-linking policies and ensure continued investment in the chosen protocols.

But mainly I was questioning how a proven format like SQLite, with its many benefits (interoperability, ease of distribution, etc.), could be dropped from consideration so easily without a single test having been run. Just my thoughts, of course!


There is nothing stopping the OSM Foundation from, say, offering a complete SQLite (or PMTiles) tileset download on planet.openstreetmap.org - technically, legally or otherwise. A complete archive of the tiles shown on osm.org would run to at least a couple of terabytes once you get down to around zoom level 16.

The key distinction is "official" - the only "official" data products of OpenStreetMap are its XML and PBF data dumps at planet.openstreetmap.org. The frequently updated tiles you see on osm.org are "quasi-official": they're created by a separate project called OpenStreetMap Carto. These tiles have a special status within the OSM ecosystem for historical reasons; since OSM is map data, the website should probably show something for human eyeballs.

The design goal of OSM Carto is to show feedback to map editors; the linked vector tiles project is intended as a successor, or at least a complement, to OSM Carto. Consumption of the tiles by third-party sites is a side effect tolerated by OSMF; the general consensus within OSM seems to be that a consumable tile service for third parties is outside the scope of the project.


Great context + points. Thanks!

I'm definitely not in a position to question the status quo, let alone the OSM community consensus - other than to say that, even if the focus is currently on internal software and editors alone, it still feels like a massive opportunity. And one that aligns with a number of OSMF's strategic goals[1], specifically around growing the community and extending the core developer base (and more), by fostering a closer relationship between OSM data-derived products and the wider (non-editor) community.

Let it be known I love scope creep.

[1] https://osmfoundation.org/wiki/Strategic_Plan


Yeah, we are considering serving tileset downloads there. It should make it easier to get OpenStreetMap data for your offline or on-premises map.


The SQLite limitations might also be solvable by sharding the databases (so each sees 1/N of the write volume), tiering them (so tiles that are written most recently are in file A, then file B, then file C), or a combination of these approaches.

I'm assuming the requirement is not really to build a single .mbtiles file, but to have some way for a process to serve the correct tile when given a z/x/y tuple.
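A rough sketch of the sharding idea under those assumptions, in Python with N pre-created .mbtiles shards and a deterministic routing function (the filenames, shard count and hash are illustrative; creating the schema is omitted):

    import sqlite3

    N = 8  # shard count: each file sees roughly 1/N of the write volume

    def shard_path(z, x, y):
        # Deterministic routing: the same tile always lands in the same
        # file, so a reader can resolve any z/x/y by opening one shard.
        return "tiles-%02d.mbtiles" % ((x * 31 + y) % N)

    def write_tile(z, x, y, data):
        con = sqlite3.connect(shard_path(z, x, y))
        try:
            con.execute(
                "INSERT OR REPLACE INTO tiles"
                " (zoom_level, tile_column, tile_row, tile_data)"
                " VALUES (?, ?, ?, ?)",
                (z, x, (2 ** z - 1) - y, data),
            )
            con.commit()
        finally:
            con.close()

The tiering variant would route by write recency instead, with readers checking file A, then B, then C until they get a hit.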


What does the pmtiles protocol provide to help update the tiles in place?

That's the challenge with the osm.org feedback map: you need to quickly update the small fraction of tiles that very recently changed, while ideally not touching the remainder.
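In practice that "small fraction" usually comes from a dirty-tile list - osm2pgsql, for example, can emit one via --expire-output, one z/x/y per line - so you re-render only those. A sketch, with render() standing in for whatever actually produces a tile:

    def rerender_expired(expiry_path, render):
        # One "z/x/y" per line; only recently changed tiles are listed,
        # so the rest of the planet is never touched.
        with open(expiry_path) as f:
            for line in f:
                z, x, y = map(int, line.strip().split("/"))
                render(z, x, y)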


It doesn't; pmtiles is designed for tilesets updated at a daily-to-quarterly frequency, by replacing the entire archive on cloud storage. For minute-level updates, serving tiles out of PostGIS is the best current approach, and it's the approach Paul is taking for the OSMF project.
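For reference, "serving tiles out of PostGIS" typically means rendering each requested tile straight from the live tables, so an edit is visible as soon as the row changes. A sketch using PostGIS 3's MVT functions against a hypothetical osm2pgsql-style planet_osm_line table in EPSG:3857 (real setups add caching and many more layers):

    import psycopg2

    SQL = """
    SELECT ST_AsMVT(mvt.*, 'roads') FROM (
      SELECT name,
             ST_AsMVTGeom(way, ST_TileEnvelope(%(z)s, %(x)s, %(y)s)) AS geom
      FROM planet_osm_line
      WHERE way && ST_TileEnvelope(%(z)s, %(x)s, %(y)s)
    ) AS mvt;
    """

    def render_tile(conn, z, x, y):
        # Reads live rows on every request, so minutely edits show up
        # in the very next tile fetch, with no archive rebuild step.
        with conn.cursor() as cur:
            cur.execute(SQL, {"z": z, "x": x, "y": y})
            return bytes(cur.fetchone()[0])  # raw MVT protobuf

    conn = psycopg2.connect("dbname=osm")
    tile = render_tile(conn, 14, 8185, 5449)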



