
700 megapixels served in under 700 milliseconds - idvix
http://mag.prodibi.com/2016/11/10/700-megapixels-through-the-interwebs-in-under-700-miliseconds/
======
yread
There is an open-source component that does this: OpenSeadragon
[https://openseadragon.github.io](https://openseadragon.github.io)

You pass a big image to "vips dzsave"
[http://www.vips.ecs.soton.ac.uk/index.php?title=VIPS](http://www.vips.ecs.soton.ac.uk/index.php?title=VIPS)
and it creates a .dzi file describing the tiles, plus a set of directories of
tiles at each zoom level. Then you add a div to your HTML and execute this JS:

    viewer = new OpenSeadragon({
        id: "openseadragondiv",
        prefixUrl: "/Scripts/openseadragon/images/"
    });
    viewer.open(pathToDZI);
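The tiling step above is a one-liner on the command line. A minimal sketch, with hypothetical filenames: given an input `huge-photo.jpg` and an output basename `myimage`, dzsave writes `myimage.dzi` plus a `myimage_files/` directory of tiles.

```shell
# Slice huge-photo.jpg into a Deep Zoom pyramid:
# produces myimage.dzi and myimage_files/<level>/<col>_<row>.jpeg
vips dzsave huge-photo.jpg myimage
```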

It also has plenty of useful plugins, e.g. a scale bar
[https://pages.nist.gov/OpenSeadragonScalebar/](https://pages.nist.gov/OpenSeadragonScalebar/)
or Annotations (to draw on the image)
[https://github.com/Emigre/openseadragon-annotations](https://github.com/Emigre/openseadragon-annotations)

~~~
idvix
Yeah, pretty much the same idea, but not the same packaging. If you just want
to quickly add a big pic to your blog and are not a dev (noob here), this is
everything you need. Same choice as embedding via YouTube/Wistia/Vimeo versus
hosting and streaming it yourself, I guess.

------
ricardobeat
Misleading title. This uses zoomable tiles, which have been done for a decade
already. It is most definitely _not_ serving 700 MP in 700 ms.

------
nautical
Only the parts of the image you are looking at are requested, based on where
you zoom. The image is divided into a number of slices, which are fetched on
demand.
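A sketch of that on-demand fetch: given a tile size and a viewport in pixel coordinates, only the tiles intersecting the viewport need to be requested (the 254-pixel tile size and the `(col, row)` indexing follow the Deep Zoom convention; the function itself is illustrative, not from the article).

```python
def visible_tiles(viewport, tile_size=254):
    """Return (col, row) indices of the tiles intersecting the viewport.

    viewport = (x, y, width, height) in pixel coordinates at one zoom level.
    """
    x, y, w, h = viewport
    first_col, last_col = x // tile_size, (x + w - 1) // tile_size
    first_row, last_row = y // tile_size, (y + h - 1) // tile_size
    return [(col, row)
            for row in range(first_row, last_row + 1)
            for col in range(first_col, last_col + 1)]

# A 1000x800 viewport at the origin touches only 4x4 = 16 tiles,
# no matter how large the full image is.
tiles = visible_tiles((0, 0, 1000, 800))
```

This is why the cost scales with the viewport, not with the image: panning or zooming just changes which tile indices fall in the range.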

~~~
nautical
My guess is it works well for very large images. For smaller, real-world
images, I am not sure how effective it is (apart from the usual
optimizations).

~~~
oolongCat
I think this is something that should be present everywhere, especially when
serving customers from countries with unreasonable data caps enforced by ISPs.

I have seen some sites load 5-6 MB files when a 100-200 kB image would have
done the job.

If we developers had an easy way to do this, I think it would really make the
internet a lot faster.

------
ameesdotme
Quite misleading indeed, and definitely done before. NASA released a 46,000 MP
picture of the Milky Way in 2015, which is also available online [1] using the
same technique.

[1] [http://gds.astro.rub.de/](http://gds.astro.rub.de/)

~~~
idvix
Mea culpa on the title... Tech is not new, but the convenience layer on top
is. Everything is taken care of.

------
lomereiter
Here's an open-source time-proven project for huge image visualization over
web: [http://iipimage.sourceforge.net/](http://iipimage.sourceforge.net/)

------
dbalan
Can somebody explain how this works?

~~~
gressquel
It works like Google Maps. The high-resolution image is downsampled into
several layers of lower resolution; you can call them levels.

Every level is split into quadrants.

So when someone zooms in, it fetches only the quadrants visible in the browser
viewport. This way the browser doesn't have to load the rest of the image and
thus saves a lot of bandwidth!
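The level pyramid described above can be sketched numerically. In the Deep Zoom format each level is half the size of the one above it, down to a single pixel; the 30000x23000 image dimensions below are hypothetical, chosen to give roughly 700 megapixels.

```python
import math

def pyramid(width, height, tile_size=254):
    """List (level, width, height, tile_count) from full size down to 1x1."""
    levels = []
    max_level = math.ceil(math.log2(max(width, height)))  # full-size level
    for level in range(max_level, -1, -1):
        scale = 2 ** (max_level - level)
        w = math.ceil(width / scale)
        h = math.ceil(height / scale)
        tiles = math.ceil(w / tile_size) * math.ceil(h / tile_size)
        levels.append((level, w, h, tiles))
    return levels

# The top level is the full ~690 MP image split into ~10k tiles;
# each step down halves both dimensions until a single 1x1 tile remains.
# A viewer only ever downloads the few tiles covering the viewport.
for level, w, h, tiles in pyramid(30000, 23000):
    print(level, w, h, tiles)
```

Because every zoom step quarters the pixel count, the whole pyramid costs only about a third more storage than the original image, while letting the viewer pick a level that matches the screen.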

~~~
idvix
Exactly. It's part of the same logic as streaming video: not 100% of the
frames are loaded at once, but they come in as needed.

------
anc84
Stupid clickbait about NOT serving 700 megapixels in under 700 milliseconds
but just about 5 megapixels in about 24 images of < 50 kilobytes each. Big
whoop...

------
nire
Would be awesome if Flickr or 500px would use this kind of tech.

