>The biggest bottleneck was consistently the communication with MongoDB, something I had very little control over.
Interesting you say that, because one of Mongo's few redeeming features is that it's supposed to be pretty fast for both reading and writing. Exactly which part was the bottleneck? Was your app more read-heavy or write-heavy?
The problem I ran into was that I needed to perform roughly 20 interdependent queries in a short period of time. At most, I could do 5 queries in parallel due to the interdependent nature, but I still ran into trouble with the database being slow.
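For anyone curious what that looks like in practice, here's a minimal sketch (not the original code) of running interdependent queries in stages: queries within a stage run concurrently via `Promise.all`, while each stage waits on the results it depends on. `fetchDoc` is a hypothetical stand-in for a real MongoDB driver call like `collection.findOne`.

```javascript
// Hypothetical stand-in for a real MongoDB driver call.
async function fetchDoc(id) {
  return { id, value: id * 2 };
}

async function runStages() {
  // Stage 1: independent queries run concurrently.
  const [a, b] = await Promise.all([fetchDoc(1), fetchDoc(2)]);
  // Stage 2: these depend on stage-1 results, but not on each other,
  // so they can also run concurrently.
  const [c, d] = await Promise.all([fetchDoc(a.value), fetchDoc(b.value)]);
  return [a, b, c, d].map(doc => doc.id);
}

runStages().then(ids => console.log(ids)); // → [1, 2, 2, 4]
```

With 20 queries and a maximum fan-out of 5, you'd end up with at least four such stages, so per-query latency gets multiplied by the stage count rather than hidden by concurrency.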
In the end, I added Redis as a caching layer, which shaved a few milliseconds off each query, and that was significant in this case.
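The usual shape for that is cache-aside: check the cache first, and only hit MongoDB on a miss. A rough sketch, with a plain `Map` standing in for Redis (a real version would use a Redis client's get/set with a TTL) and `queryDb` as a hypothetical placeholder for the actual query:

```javascript
// Map stands in for Redis here; swap in a Redis client for real use.
const cache = new Map();

// Hypothetical placeholder for the actual MongoDB query.
async function queryDb(key) {
  return `value-for-${key}`;
}

async function cachedQuery(key) {
  if (cache.has(key)) return cache.get(key); // cache hit: skip MongoDB
  const value = await queryDb(key);          // cache miss: query, then store
  cache.set(key, value);
  return value;
}
```

The first call for a key pays the full database round trip; subsequent calls are served from the cache, which is where the per-query savings come from.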
I can only imagine how bad it could've been if I had blocked on each request instead of using Node's asynchronicity.
Right, caching is usually the easiest way to solve problems in those situations.
>I can only imagine how bad it could've been if I had blocked on each request instead of using Node's asynchronicity.
All modern web servers/frameworks are going to be asynchronous in one way or another, whether they spawn a new thread or process for each request (and for each database query), or whether they use polling and an event loop like Node. Node and HTTP servers like it are only a boon when you expect to have so many concurrent requests that threads/processes will begin to hog too many resources.
So unless your server was really being hammered with queries for hundreds or thousands of concurrent users every second, I imagine using Node was neither a significant advantage nor a disadvantage. And if it's the database or database driver that can't handle concurrency, then Node would do absolutely nothing to help there.
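That last point is easy to demonstrate with a toy example: if the database layer can only process one query at a time, issuing queries "concurrently" from Node changes nothing about how long they take overall. This sketch (all names hypothetical) models a driver that chains every query behind the previous one:

```javascript
// Toy "driver" that serializes all queries behind one promise chain,
// modeling a database layer with no real concurrency.
let chain = Promise.resolve();
const completed = [];

function serializedQuery(id) {
  // Each query waits for every previously issued query to finish.
  chain = chain.then(async () => {
    await new Promise(r => setTimeout(r, 10)); // simulated query latency
    completed.push(id);
  });
  return chain;
}

// Issued "concurrently" from Node's point of view...
Promise.all([serializedQuery(1), serializedQuery(2), serializedQuery(3)])
  .then(() => console.log(completed)); // ...but they still ran strictly in order: [1, 2, 3]
```

The event loop happily interleaves the callbacks, but total wall-clock time is still the sum of the individual query times, exactly as if the caller had blocked.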