
> How else do you test and optimize SQL queries that are only slow with production-size data?

With something like this https://www.getsynth.com/docs/blog/2021/03/09/postgres-data-... (disclaimer: no affiliation with them, I've not used their product but it appears to be fully open source)



I agree that this is one possible approach. The main difficulty with generated data is its quality and structure: namely, how closely the artificial data corresponds (quantitatively and qualitatively) to the real data.

Purely random data may produce misleading results when optimizing a query.
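As a quick illustration of why (not taken from the linked tool; the function names and the Zipf exponent are my own assumptions), here's a minimal Python sketch comparing uniformly random IDs against a skewed distribution closer to what production data usually looks like. The planner's cost estimates hinge on selectivity, which differs hugely between the two:

```python
import random

random.seed(42)

def uniform_ids(n, n_customers):
    """Uniformly random customer IDs -- what naive synthetic test data often looks like."""
    return [random.randrange(n_customers) for _ in range(n)]

def skewed_ids(n, n_customers, s=1.2):
    """Zipf-like skew: a few 'hot' customers own most rows, as in many production tables."""
    weights = [1 / (rank + 1) ** s for rank in range(n_customers)]
    return random.choices(range(n_customers), weights=weights, k=n)

def top1_share(ids):
    """Fraction of rows belonging to the single most frequent customer."""
    counts = {}
    for i in ids:
        counts[i] = counts.get(i, 0) + 1
    return max(counts.values()) / len(ids)

uniform = uniform_ids(100_000, 1_000)
skewed = skewed_ids(100_000, 1_000)

# A filter like WHERE customer_id = <hot customer> matches ~0.1% of rows
# under the uniform data but a large fraction under the skewed data, so
# the planner may pick an index scan in testing and a seq scan in production.
print(f"uniform top-1 share: {top1_share(uniform):.3f}")
print(f"skewed  top-1 share: {top1_share(skewed):.3f}")
```

So a query tuned against uniformly random data can end up with a completely different plan than the one the production distribution would produce.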


Yep! The idea there is to generate data that isn't just random but has the right structure. Whether they actually deliver on that, I don't know.



