
Sometimes you fetch a large dataset and only show one page at a time in the DOM, or render it as a line in a chart or something. At a previous workplace we had CSV responses in the hundreds of megabytes.


70 GB CSV files aren't uncommon at my work. It's not really a problem since CSV streams well.
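Streaming is what makes files that size workable: you process one row at a time, so memory use stays flat regardless of file size. A minimal Python sketch of the idea (the commenter's actual stack isn't stated, so the names here are illustrative):

```python
import csv

def stream_csv(path):
    """Yield rows one at a time as dicts; memory stays flat no matter the file size."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # first row is the header
        for row in reader:
            yield dict(zip(header, row))

# Aggregate a huge file without ever loading it whole, e.g.:
#   total = sum(float(r["amount"]) for r in stream_csv("huge.csv"))
```

The same pattern applies over the network: iterate the response body line by line instead of buffering the whole payload.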


That sounds incredibly inefficient.

What was the rationale for such enormous single payloads?


Without knowing more about the application, I'd guess caching and/or scaling. If you only need one payload, it can be statically generated and cached in your CDN. That in turn reduces your dependence on the web servers, so fewer nodes are required and/or you can scale your site more easily with demand. Compute time is also more expensive than CDN bandwidth, so there may well be some cost savings there too.


This was basically it. The dataset was the same across users, so caching was simple and efficient, and the front-end had no difficulty handling that much data (paging client-side was also snappier than requesting a new page each time).
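Client-side paging over an already-fetched dataset is just slicing, which is why it feels instant compared to a network round trip per page. A sketch of the idea in Python (the actual front-end was presumably JavaScript; names here are illustrative):

```python
def page(rows, page_num, page_size=50):
    """Return one page of an in-memory dataset; no request to the server needed."""
    start = page_num * page_size
    return rows[start:start + page_size]

data = list(range(103))  # stand-in for the cached dataset
first = page(data, 0)    # rows 0..49
last = page(data, 2)     # partial final page, rows 100..102
```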



