
Ok, here's an example of a master:

We're all talking about how to optimize our Lucene queries and cache intelligently and so on, because the memory overhead is getting big (mind you, we're practically replacing our database with Lucene).

After about 30 minutes of hypotheticals and complexity, this one guy (previously sitting there listening) just says "Why not run 64-bit Java and set the max heap to 20 gigs or whatever is appropriate, and let the operating system page for you". KARATE CHOP! I think if we could, we would re-create the scene from "The Mask" where his mouth just drops to the table.
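
(For anyone who hasn't done this: it's just a JVM startup flag. A minimal sketch, assuming a 64-bit JVM is installed and a made-up app.jar:

    java -Xmx20g -jar app.jar

-Xmx raises the maximum heap ceiling; once the heap outgrows physical RAM, the OS's virtual memory does the paging instead of any hand-rolled caching layer.)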



Better karate chop: Do the above and shell out 500 bucks for another 16 GB of RAM in the server while you're at it. That way you won't wind up navigating an object graph that's been paged to disk (Lucene terms are all loaded into memory while the index is open, which could be 20,000 terms or closer to a million if you have a whole bunch of unique IDs).
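
If you do go the big-heap route, it's worth a quick sanity check of how much heap the application actually occupies versus how much physical RAM the box has, before trusting the OS to page gracefully. A minimal sketch using only java.lang.Runtime (the class name is made up):

    // Prints JVM heap figures to compare against physical RAM, so you can
    // tell whether the index data will stay resident or end up paged out.
    public class HeapCheck {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long mb = 1024 * 1024;
            System.out.println("max heap (MB):  " + rt.maxMemory() / mb);   // the -Xmx ceiling
            System.out.println("committed (MB): " + rt.totalMemory() / mb); // heap reserved so far
            System.out.println("used (MB):      " + (rt.totalMemory() - rt.freeMemory()) / mb);
        }
    }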


Exactly. The point was, the deadline was coming up in a few weeks. We had work up to our eyeballs with everything else; if we could punt this issue until it really, REALLY became a problem, we should. And we did. And it hasn't been a problem.

It was a commercial product, but honestly the people who ran this would have no problem paying an extra 2 grand or so for RAM for major performance gains (remember, this ain't your standard SDRAM).


That seems like it could easily end up being very slow. The OS doesn't know which data will be needed next, so it could page out the wrong thing. It's probably worth trying, but it doesn't seem like a masterful solution.


Of course it was. It was better than what we wanted to do: simpler, no development time (cheap, and it met our deadline), and it worked well enough. Sure, you can get a bit more bang for your buck with a custom-tailored swapping scheme, but in the end it doesn't matter.



