
> Hash tables win the interview O(n) vs O(n log n),

If the original table has n entries, the hash must have at least log(n) bits to produce distinct hashes most of the time. I don't know the current state of the art, but I think the recommendation was to fill the array to at most ~95%, so there are not too many collisions. So both methods are O(n log n), one under the hood and the other explicitly.
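To make the comparison concrete, here's a minimal sketch of the two dedup strategies being discussed (names and the toy input are mine, not from the thread). The hash version does O(n) expected word operations, but each hash implicitly touches O(log n) bits of the key; the sort version does O(n log n) comparisons explicitly:

```python
def dedup_hash(xs):
    # Expected O(n) table operations, but each hash must span
    # at least log2(n) bits to keep collisions rare, and the
    # table is typically kept well under full load.
    return list(set(xs))

def dedup_sort(xs):
    # O(n log n) comparisons, each on log(n)-bit keys.
    out = []
    for x in sorted(xs):
        if not out or out[-1] != x:
            out.append(x)
    return out
```

Both return the same distinct elements; the difference is only where the log factor hides.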



I guess you have to distinguish between the number of elements and the maximum bit-width of an element. But if you're leaving the constant-memory-access model (where you can assume each element fits in a machine word and all arithmetic on it is constant time), then doesn't it take O(n log k) (where k is the numeric value of the maximum element and n is the total number of elements) just to write out the array? You physically can't do O(n) [ignoring the k].
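A toy cost model for that point (my own illustration, assuming each element of value at most k needs about ceil(log2(k+1)) bits): merely emitting the output already costs O(n log k) bit operations once you stop counting in whole words.

```python
import math

def output_bit_cost(xs):
    # Bits needed to write out every element at fixed width:
    # n * ceil(log2(k + 1)), where k is the maximum value.
    k = max(xs)
    width = max(1, math.ceil(math.log2(k + 1)))
    return len(xs) * width
```

For four elements with maximum value 255, that's 4 * 8 = 32 bits, and the n log k growth is visible as k increases.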

Maybe relevant comments [1, 2, 3]

[1] https://news.ycombinator.com/item?id=44854520 [2] https://news.ycombinator.com/item?id=43005468 [3] https://news.ycombinator.com/item?id=45214106


n bits doesn't mean n operations.


Probably n/64 operations, depending on word size; perhaps fewer if you can use SIMD, but I'm not sure that's possible here.



