I checked in with Jed (the guest blogger) and he told me that Band 8 of the data has the highest resolution (15 m). Per the post at https://www.mapbox.com/blog/putting-landsat-8-bands-to-work/, you can use it to sharpen the images from the other bands.



I wrote that Mapbox post and helped with some of the processing that gets this data onto AWS. Yep, you get multispectral resolution of 30 m, and with pansharpening (http://en.wikipedia.org/wiki/Pansharpened_image, etc.) you can get visually acceptable quality at 15 m in RGB.
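If you want to poke at the data yourself, each band is just a GeoTIFF, so something like this works (the filenames are placeholders for whatever scene you've downloaded; band 4 is red at 30 m, band 8 is the pan band at 15 m):

    import rasterio  # reads each band as a plain numpy array

    with rasterio.open("scene_B4.TIF") as src:   # red, 30 m
        red = src.read(1)
    with rasterio.open("scene_B8.TIF") as src:   # panchromatic, 15 m
        pan = src.read(1)

    # Same footprint, finer grid: the pan band has about twice as many
    # pixels in each direction as the 30 m bands.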

Landsat is basically intended for science about seasonal/annual/decade-scale changes in Earth’s land surface. When you see an estimate of how a city’s built-up area has grown since 1980, or how the Everglades are changing, it probably has Landsat as one source. This explains a lot of design decisions that might seem weird to a layperson who wants to use it for everyday RGB imagery. Most use of Landsat imagery is basically off-label. It’s just very good data in terms of accuracy, precision, and general ease of use. And if I say so myself, it looks real pretty: https://www.mapbox.com/blog/landsat-live-live/


Which pansharpening method is used in the example image?


In the images in the blog post and the live map? Those aren’t pansharpened at all. If we do add pansharpening in a later version, it’ll likely be naïve, without spatially aware modeling of the multispectral data. (Specifically, it’ll probably be a cleaned-up, rasterio-based, null-aware, parallelized descendant of this sketch of the Brovey transform in numpy: https://gist.github.com/celoyd/2e7beed82951d22b9b90 .)
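For anyone who wants the flavor of it without reading the gist, here's a bare-bones version of the same idea in numpy/scipy (a sketch only: equal band weights, bilinear upsampling, no nodata handling, and it assumes the bands are already scaled to comparable float values):

    import numpy as np
    from scipy.ndimage import zoom

    def brovey(r30, g30, b30, pan15):
        # Upsample the 30 m bands onto the 15 m pan grid (bilinear).
        factors = (pan15.shape[0] / r30.shape[0], pan15.shape[1] / r30.shape[1])
        rgb = np.stack([
            zoom(b.astype("float64"), factors, order=1)[: pan15.shape[0], : pan15.shape[1]]
            for b in (r30, g30, b30)
        ])
        # Brovey: rescale each band by pan / intensity. Plain mean here;
        # a tuned version would weight the bands to better match the pan
        # band's spectral response.
        intensity = rgb.mean(axis=0)
        ratio = np.divide(pan15.astype("float64"), intensity,
                          out=np.zeros_like(intensity), where=intensity > 0)
        return rgb * ratio  # (3, H, W) stack of sharpened R, G, B

With Landsat 8, r30/g30/b30 would be bands 4/3/2 and pan15 would be band 8.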

From what I’ve seen – and I haven’t tested it carefully yet, so I could be wrong – the more elaborate methods are severe overkill on Landsat 8. It has only 4 pan px per multi px (where some commercial data has 9 or 16), and the pan band is almost exactly R+G+B (without NIR). So my gut and some simple experiments suggest that doing PCA-or-whatever is overthinking it.
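(The pixel-count arithmetic, for concreteness; the 2 m / 0.5 m figures are just an illustrative commercial-style pairing, not any particular product:)

    def pan_pixels_per_multi(multi_gsd_m, pan_gsd_m):
        # How many pan pixels fall inside one multispectral pixel.
        return (multi_gsd_m / pan_gsd_m) ** 2

    pan_pixels_per_multi(30, 15)   # Landsat 8: 4.0
    pan_pixels_per_multi(2, 0.5)   # 2 m multi with 0.5 m pan: 16.0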


In the blog post:

> Pansharpened Malibu, 15 m (50 ft) per pixel. Notice the wave texture in the water.

Ugh, Brovey. There are better options available, like MMP (really low spectral distortion, but can be slow: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6677587) or even affinity/guided filtering (my own paper; more spectral distortion than MMP, but a lot faster, and you can sharpen hyperspectral with multispectral or RGB: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=7008094).
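For a taste of what guidance-based sharpening looks like (this is not the algorithm from either paper, just a bare-bones joint upsample using a plain guided filter with the pan band as the guide; the radius and eps are arbitrary and assume values scaled to roughly [0, 1]):

    import numpy as np
    from scipy.ndimage import uniform_filter, zoom

    def guided_filter(guide, src, radius=8, eps=1e-4):
        # Classic guided filter: the output is locally a linear function of
        # the guide, so it picks up the guide's fine-scale edges.
        size = 2 * radius + 1
        mean_g = uniform_filter(guide, size)
        mean_s = uniform_filter(src, size)
        var_g = uniform_filter(guide * guide, size) - mean_g * mean_g
        cov_gs = uniform_filter(guide * src, size) - mean_g * mean_s
        a = cov_gs / (var_g + eps)
        b = mean_s - a * mean_g
        return uniform_filter(a, size) * guide + uniform_filter(b, size)

    def sharpen_band(pan15, band30, radius=8, eps=1e-4):
        # Upsample a 30 m band to the 15 m grid, then let the pan band guide
        # it; pan-scale detail transfers where the band correlates with pan.
        factors = (pan15.shape[0] / band30.shape[0], pan15.shape[1] / band30.shape[1])
        band_up = zoom(band30.astype("float64"), factors,
                       order=1)[: pan15.shape[0], : pan15.shape[1]]
        return guided_filter(pan15.astype("float64"), band_up, radius, eps)

The attraction is speed: everything above is box filters, so it scales linearly with pixel count.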



