700 megapixels served in under 700 milliseconds (prodibi.com)
10 points by idvix on Nov 10, 2016 | hide | past | favorite | 21 comments


There is an open-source component that does this: OpenSeadragon https://openseadragon.github.io

You pass a big image to "vips dzsave" http://www.vips.ecs.soton.ac.uk/index.php?title=VIPS and it creates a .dzi file describing the tiles plus a set of directories with the zoomed-in tiles. Then you add a div to your HTML and execute this JS:

    viewer = new OpenSeadragon({
        id: "openseadragondiv",
        prefixUrl: "/Scripts/openseadragon/images/"
    });
    viewer.open(pathToDZI);
It also has plenty of useful plugins, e.g. a scale bar https://pages.nist.gov/OpenSeadragonScalebar/ or annotations (to draw on the image) https://github.com/Emigre/openseadragon-annotations
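For reference, here is a sketch of how dzsave's output pyramid is laid out (the level numbering follows the Deep Zoom convention; the base name "photo" and the helper function are just illustrative, not part of any library):

    // "vips dzsave photo.tif photo" writes photo.dzi plus a photo_files/
    // directory with one subfolder per pyramid level. The deepest level is
    // ceil(log2(max(width, height))); tiles are named "<col>_<row>.jpeg".
    function dziTilePath(base, width, height, level, col, row) {
      const maxLevel = Math.ceil(Math.log2(Math.max(width, height)));
      if (level < 0 || level > maxLevel) throw new Error('no such level');
      return `${base}_files/${level}/${col}_${row}.jpeg`;
    }

    // For a 40000x25000 px image the deepest level is ceil(log2(40000)) = 16:
    console.log(dziTilePath('photo', 40000, 25000, 16, 0, 0));
    // -> photo_files/16/0_0.jpeg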


Yeah, pretty much the same idea, but not the same packaging. If you just want to quickly add a big pic to your blog and aren't a dev (noob here), this is everything you need. Same choice as embedding YouTube/Wistia/Vimeo versus hosting and streaming it yourself, I guess.


Misleading title. This uses zoomable tiles, which have been done for a decade already. It is most definitely not serving 700 MP in 700 ms.


Only the parts of the image you zoom to are requested. The image is divided into a number of slices and fetched on demand.


My guess is it works well for very large images. For smaller, real-world images, I am not sure how effective it is (apart from the usual optimizations).


I think this is something that should be present everywhere. Especially when serving customers from countries with unreasonable data caps enforced by ISPs.

I have seen sites load 5-6 MB files when a 100-200 KB image would have done the job.

If we developers had an easy way to do this, I think it would really make the internet a lot faster.


Of course, but the average smartphone is now doing 20 MP, and semi-pro cameras around 50 MP...


You can probably do this by hooking Leaflet.js up to a custom tileserver ;P
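For the curious: Leaflet's L.tileLayer expects the server to answer a "/{z}/{x}/{y}" URL template, so the custom-tileserver part mostly means generating files at those paths. A tiny sketch (the server URL and helper are hypothetical; only the template convention comes from Leaflet):

    // Build the tile URL Leaflet will request for zoom z, column x, row y.
    function tileUrl(base, z, x, y) {
      return `${base}/${z}/${x}/${y}.png`;
    }

    // In the browser you would then wire it up roughly like:
    //   L.tileLayer('https://tiles.example.com/{z}/{x}/{y}.png', { maxZoom: 8 }).addTo(map);
    console.log(tileUrl('https://tiles.example.com', 3, 5, 2));
    // -> https://tiles.example.com/3/5/2.png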


A bit more work then ;)


Quite misleading indeed, and definitely done before. NASA released a 46,000 MP picture of the Milky Way in 2015, which is also available online[1] using the same technique.

[1] http://gds.astro.rub.de/


Mea culpa on the title... Tech is not new, but the convenience layer on top is. Everything is taken care of.


Here's an open-source, time-proven project for huge image visualization over the web: http://iipimage.sourceforge.net/


Can somebody explain how this works?


It works like Google Maps. The high-resolution image is downsampled into several layers of lower resolution; you can call them levels.

Every level is split into quadrants.

So when someone zooms in, it fetches only the quadrants that are visible in the browser viewport. This way the browser doesn't have to load the rest of the image and thus saves a lot of bandwidth!
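The tile selection described above can be sketched in a few lines (illustrative only; the 256 px tile size, the level numbering, and all names are my assumptions, not any particular library's API):

    // Assume 256x256 tiles and a pyramid where level 0 is full resolution,
    // level 1 is half size, and so on.
    const TILE_SIZE = 256;

    // Which tiles does the browser need for a viewport at a given level?
    function visibleTiles(viewport, level) {
      const scale = 1 / Math.pow(2, level); // level 0 = full resolution
      // Map the viewport (in full-resolution pixels) into this level's space.
      const x0 = Math.floor((viewport.x * scale) / TILE_SIZE);
      const y0 = Math.floor((viewport.y * scale) / TILE_SIZE);
      const x1 = Math.floor(((viewport.x + viewport.width) * scale - 1) / TILE_SIZE);
      const y1 = Math.floor(((viewport.y + viewport.height) * scale - 1) / TILE_SIZE);
      const tiles = [];
      for (let ty = y0; ty <= y1; ty++)
        for (let tx = x0; tx <= x1; tx++)
          tiles.push({ level, col: tx, row: ty });
      return tiles;
    }

    // A 1024x768 viewport fully zoomed in needs only 4x3 = 12 tiles,
    // no matter how large the whole image is.
    console.log(visibleTiles({ x: 0, y: 0, width: 1024, height: 768 }, 0).length); // 12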


Exactly. Same logic as when you stream video: not 100% of the frames are loaded at once, but they come in as needed.




By lying about what actually gets served and by reinventing on-demand requesting of image tiles. This is ancient tech.


Ancient, but in a convenient, all-included package, yes. It's like serving videos yourself versus embedding YouTube/Wistia/Vimeo... The quick and performant way has value for some.


Stupid clickbait: it is NOT serving 700 megapixels in under 700 milliseconds, but just about 5 megapixels in about 24 images of < 50 kilobytes each. Big whoop...


Would be awesome if Flickr or 500px would use this kind of tech.



