These web framework tests have been really interesting to look at, and each time I've been saddened to see that Rails/Ruby, the framework/language I program with most days, is consistently near the bottom. With Adequate Record now merged into master, I'm hoping we start climbing in these speed tests.
But a question keeps coming up in my mind: there are metrics that would be much harder to compare, but that might be more useful in my book.
For example, I'd love to see a "framework olympics" where different developers build an agreed upon application/website on an agreed upon server using their favorite framework. The application has to be of some decent complexity, and using tools that an average developer using the framework might use.
In the end, you could compare the complexity of the code, the average page response time, maintainability/flexibility, and the time it took to actually develop the app. The results would let developers know what they sacrifice or gain by using one framework over another. I know many of these metrics would reflect the developers themselves as much as the frameworks, but it might also be a tool to show what an average developer, given a weekend, might be able to produce. It would also help me to see an application written a ton of different ways -- so I can make good decisions about which framework to choose based on my needs.
In the end, speed only tells us so much -- and speed is not the only metric we consider when we write applications -- otherwise most developers would be coding their web apps in Gemini.
Readers should definitely evaluate the complexity of the code necessary to implement the tests. To that end, I plan to eventually enhance the results web site to allow you to readily view the related source for each test implementation (perhaps using an iframe pointing at GitHub).
For the very first round, we included the relevant code snippets in the blog entry, and I think that added a lot of context. With nearly 100 frameworks, the volume of code has become too large to simply embed directly into the blog entry, but as it stands we put too much burden on the reader to sift through the GitHub repo to compare code complexity.
Things we aim to do:
* Pop-up iframe with relevant code from GitHub.
* Use a source lines of code (sloc) counter to render sloc alongside each result in the charts.
* Render the number of GitHub commits each test implementation has seen at our repository. At the very least, this would show whether a test has seen a lot of review.
* Introduce more complex test types [1].
And as the other reply has mentioned, we have also discussed the possibility of a larger test type that might include a multi-step process. I'd love to eventually get to that point.
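The SLOC-counter idea above is simpler than it might sound. Here's a minimal sketch in Ruby of what such a counter could look like — a hypothetical toy, not the tool the benchmarks project actually uses; real counters (cloc, sloccount) also handle block comments and per-language syntax:

```ruby
# Toy SLOC counter: counts non-blank lines that aren't line comments.
# Hypothetical sketch -- a real tool handles block comments, strings
# containing '#', and comment syntax that varies per language.
def sloc(source, comment_prefix: "#")
  source.each_line.count do |line|
    stripped = line.strip
    !stripped.empty? && !stripped.start_with?(comment_prefix)
  end
end

snippet = <<~RUBY
  # A sample controller action
  def index
    @items = Item.all
  end
RUBY

puts sloc(snippet)  # => 3
```

Even something this crude would let readers compare the relative weight of each test implementation at a glance alongside the throughput numbers.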
A matrix rank calculator might be helpful as well to filter results for users, though I am impressed with the present and new(ish) filtering available.
I actually have been working on a super ridiculous ruby benchmark for every ruby framework and server I could find. It should be out in the next month. It's been quite the undertaking.
Rails' slowness just means one thing: you'll have to spin up more server instances for the same task that would have been faster in Java and would have needed fewer servers (though Java eats a LOT of memory, so raw speed is not everything). So it's a trade-off: spend more money on devs or on the infrastructure?
But don't worry, it's still way faster than most PHP frameworks, which have ridiculous performance, yet their core developers don't seem to care.
I think it really depends on the kind of app one is building. A video site like YouTube can use caching to the max; most of the hits won't touch any server-side code, only cached pages. On the other hand, a webapp that actually does something and needs realtime capabilities might not be the right use case for Rails (like Twitter, though Rails helped them build their MVP quite fast; same for Iron.io).
It's a tradeoff: do you want to develop fast, at the cost of raw performance, or get good performance from the beginning, without scalability issues in the first place?
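The "most hits won't touch any server-side code" point can be illustrated with a toy cache in plain Ruby — a deliberately simplified stand-in, not Rails' actual caching API (Rails provides `Rails.cache.fetch` plus fragment and HTTP caching for this):

```ruby
# Toy page cache: the expensive render runs once per path; every
# subsequent hit is served from the store without re-rendering.
# Simplified illustration -- no expiry, invalidation, or size limits.
class PageCache
  attr_reader :renders

  def initialize
    @store = {}
    @renders = 0
  end

  def fetch(path)
    @store[path] ||= begin
      @renders += 1
      "<html>rendered #{path}</html>"  # stand-in for expensive work
    end
  end
end

cache = PageCache.new
3.times { cache.fetch("/videos/42") }
puts cache.renders  # => 1
```

For a read-heavy site, that one render amortized over millions of hits is why the framework's raw speed can matter far less than the cache hit rate.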
I might just do that! At the very least, I'd be willing to represent the "average" Rails developer.
Getting an agreed upon application might be tough, but I'll try setting some stuff up to make it happen, as long as you agree you'll be a part of it :)