
Right, but my point is that your numbers don't indicate anything about its actual potential as a compression format. It is certainly possible to use it that way, but it is still uncertain whether a neural network can do better than hand-crafted compression formats. So it's a little early to call it "next generation." And it is meaningless to point out that "his neural network compresses each image down to 200 floats," since it doesn't actually work well with such a small latent space.
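
For readers unfamiliar with the setup, here is a minimal sketch of what "compresses each image down to 200 floats" means in practice, assuming a PyTorch-style autoencoder with a fully connected bottleneck. The 64x64 resolution and layer sizes are illustrative guesses, not taken from the linked work:

    import torch
    import torch.nn as nn

    class Autoencoder(nn.Module):
        def __init__(self, latent_dim=200):
            super().__init__()
            # Encoder: flatten a 64x64 RGB image and project it down to the latent code
            self.encoder = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * 64 * 3, 1024),
                nn.ReLU(),
                nn.Linear(1024, latent_dim),
            )
            # Decoder: project the latent code back up to pixel space
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 1024),
                nn.ReLU(),
                nn.Linear(1024, 64 * 64 * 3),
                nn.Sigmoid(),
            )

        def forward(self, x):
            z = self.encoder(x)  # the "200 floats" per image
            return self.decoder(z).view(-1, 3, 64, 64)

    model = Autoencoder()
    x = torch.rand(1, 3, 64, 64)
    print(model.encoder(x).shape)  # torch.Size([1, 200])

The dispute in this thread is exactly about that bottleneck: storing z is cheap, but whether decoder(z) is a usable reconstruction is a separate question.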

It's not exactly unknown that autoencoders can reconstruct images, so I don't see why it's "impressive that it works at all."



> So it's a little early to call it "next generation."

Please don't ignore the words other people write, or at least reread before hitting post. I did say "possibly."

> I don't see why it's "impressive that it works at all."

I thought it was implicit from my explanation that "works at all" includes the fact that the data sizes are already reasonable. If we had the same result but the intermediate data were 18 gigabytes, there would be nothing impressive about it. As it is, we're well below that even before further compression, so it is impressive.
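
For scale, assuming 32-bit floats (the thread doesn't specify the precision): 200 floats × 4 bytes = 800 bytes per image before any entropy coding, versus the 18 gigabytes in the hypothetical above.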

Remember: Prototype, Proof of Concept. You're looking at the first step, not the last step. You're looking at a motor carriage, not a Tesla.


> I thought it was implicit from my explanation that the "works at all" includes the fact that the data sizes are reasonable already.

Right, but why is that impressive if it doesn't actually result in a good reconstruction? I can take any collection of numbers and summarize it with its mean value, but that doesn't make averaging a good compression method.
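
To make the analogy concrete, a quick sketch in numpy (the dataset and metric are invented for illustration): summarizing by the mean achieves an extreme "compression ratio" while saying nothing about reconstruction quality, which is why the compressed size alone isn't the interesting number.

    import numpy as np

    # "Compress" a dataset to a single number (its mean), then "reconstruct"
    # every value as that mean. The rate is tiny, but the distortion is large.
    data = np.random.rand(10_000)
    mean = data.mean()  # the entire "compressed" representation
    reconstruction = np.full_like(data, mean)
    mse = np.mean((data - reconstruction) ** 2)
    print(f"compressed size: 1 float, reconstruction MSE: {mse:.4f}")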

If you consider this a proof of concept, then what concept, exactly, does it prove? That statistics can represent a dataset?



