Each of these tools provides real value.
* Bundlers drastically improve runtime performance, but it's tricky to figure out what to bundle where and how.
* Linting tools and type checkers detect bugs before they happen, but they can be arbitrarily complex, and they benefit from type annotations. (TypeScript won the type-annotation war in the marketplace against competing systems, including Meta's Flow and Google's Closure Compiler.)
* Code formatters automatically ensure consistent formatting.
* Package installers are really important, and they solve a hugely complex problem in a performance-sensitive and security-sensitive area (managing dependency conflicts/diamonds, caching, platform-specific builds…); see the sketch after this list for the diamond case.
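For a concrete picture of the diamond case (all package names here are made up): two direct dependencies each require a different major version of the same transitive dependency, and an installer that insists on a single version per package has to surface a conflict, which is why real installers nest copies or deduplicate instead.

```ts
// Toy dependency diamond: app -> a -> c@1.x and app -> b -> c@2.x.
// A resolver that allows only one version of "c" must report a conflict;
// real installers instead nest copies or deduplicate compatible ranges.
type Requirement = { from: string; pkg: string; major: number };

const requirements: Requirement[] = [
  { from: "a", pkg: "c", major: 1 },
  { from: "b", pkg: "c", major: 2 },
];

function resolveSingleVersion(reqs: Requirement[]): Map<string, number> {
  const chosen = new Map<string, number>(); // pkg -> picked major version
  for (const r of reqs) {
    const prev = chosen.get(r.pkg);
    if (prev !== undefined && prev !== r.major) {
      throw new Error(
        `diamond conflict on "${r.pkg}": already picked v${prev}.x, but "${r.from}" needs v${r.major}.x`,
      );
    }
    chosen.set(r.pkg, r.major);
  }
  return chosen;
}

try {
  resolveSingleVersion(requirements);
} catch (e) {
  console.log((e as Error).message); // the conflict the installer must untangle
}
```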
As long as developers benefit from using bundlers, linters, type checkers, code formatters, and package installers, and as long as it's possible to make these tools faster and/or better, someone's going to try.
And here you are, incredulous that anyone thinks this is OK…? Because we should just … not use these tools? Not make them faster? Not improve their DX? Standardize on one and then staunchly refuse to improve it…?
I'm being a little coy because I do have a very detailed proposal.
I want the JS toolchain to stay written in JS, but I want to unify the design and architecture of all the tools you mentioned so that they can all use a common syntax-tree format and share data, e.g. between the linter and the formatter, or between the bundler and the type checker.
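A minimal sketch of what I mean, with everything hypothetical (the `Node` shape, `parse`, and the passes are illustrations, not any existing library's API): the file is parsed once, a "type checker" pass writes annotations onto the tree, and a "linter" pass reads them back off the same tree.

```ts
// One shared syntax-tree shape that every tool consumes. A file is parsed
// once; each tool is then a pass over the same tree, and passes can leave
// annotations in `meta` for other tools to pick up.
interface Node {
  type: string;               // e.g. "Identifier"
  start: number;              // source offsets (the formatter needs these)
  end: number;
  children: Node[];
  meta: Map<string, unknown>; // cross-tool annotations (inferred types, etc.)
}

// Toy parser: every whitespace-separated token becomes an Identifier.
function parse(source: string): Node {
  const children: Node[] = [];
  for (const m of source.matchAll(/\S+/g)) {
    children.push({
      type: "Identifier",
      start: m.index!,
      end: m.index! + m[0].length,
      children: [],
      meta: new Map(),
    });
  }
  return { type: "Program", start: 0, end: source.length, children, meta: new Map() };
}

// "Type checker" pass: writes an annotation onto each node.
function annotate(root: Node, source: string): void {
  for (const n of root.children) {
    const text = source.slice(n.start, n.end);
    n.meta.set("inferredType", /^\d+$/.test(text) ? "number" : "unknown");
  }
}

// "Linter" pass: reads the type checker's annotations off the same tree.
function lint(root: Node): string[] {
  return root.children
    .filter((n) => n.meta.get("inferredType") === "unknown")
    .map((n) => `no type inferred at ${n.start}..${n.end}`);
}

const src = "foo 42 bar";
const tree = parse(src); // parsed exactly once, shared by every pass
annotate(tree, src);
console.log(lint(tree)); // [ "no type inferred at 0..3", "no type inferred at 7..10" ]
```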
Hasn't that already been tried (10+ years ago) with projects like https://github.com/jquery/esprima, whose usage has since dropped dramatically for performance reasons?
Yeah, you're correct. But that means I have the benefit of ten years of development in the web platform, as well as hindsight on the earlier effort.
I would say the reason the perf costs feel bad there is that the abstraction was unsuccessful. Throughput isn't that big a deal for a parser if you only need to parse the parts of the code that have actually changed.
All I can say for sure is that the reason the old tools were slow was not that the JS runtime is impossible to build fast tools with.
And anyway, these new tools tend to have a "perf cliff": you get all the speed of the new tool as long as you stay away from the JS integration API used to support the "long tail" of use cases. Once you fall off the cliff, though, you're back to the old slow-JS cost regime...
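Here's a toy model of that cliff in plain TypeScript (not any real tool's plugin API; the boundary cost is simulated with a JSON round-trip per node, which stands in for the serialization tax a native-core tool pays to hand AST nodes to JS plugins):

```ts
// Toy model of the "perf cliff": the fast path keeps the traversal on one
// side of the boundary; the "plugin" path copies every node across a
// simulated boundary before the user callback sees it.
interface AstNode { type: string; value: number }

const nodes: AstNode[] = Array.from({ length: 200_000 }, (_, i) => ({
  type: "Literal",
  value: i,
}));

// Fast path: a tight loop, no boundary crossings.
function fastVisit(ns: AstNode[]): number {
  let sum = 0;
  for (const n of ns) sum += n.value;
  return sum;
}

// "Plugin" path: each node is serialized across a boundary (modeled here
// with a JSON round-trip) before the callback runs.
function pluginVisit(ns: AstNode[], cb: (n: AstNode) => number): number {
  let sum = 0;
  for (const n of ns) sum += cb(JSON.parse(JSON.stringify(n)));
  return sum;
}

for (const [name, run] of [
  ["fast path", () => fastVisit(nodes)],
  ["plugin path", () => pluginVisit(nodes, (n) => n.value)],
] as const) {
  const t0 = performance.now();
  run();
  console.log(`${name}: ${(performance.now() - t0).toFixed(1)} ms`);
}
```

The ratio, not the absolute numbers, is the point: the plugin path's cost scales with how much data has to cross the boundary.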
> […] the reason the old tools were slow was not that the JS runtime is impossible to build fast tools with.
I don't have them at hand right now, but there are various detailed write-ups from the maintainers of Vite, oxc, and others that address this specific argument and point out that the JavaScript runtime was indeed a hard limit on the throughput they could achieve, making Rust a necessity for improving build speeds.
Why do you need high throughput, though? Isn't that a metric of how fast a batch-processing system is?
Why are we still treating batch processing as the controlling paradigm for tools that work on code? If we fully embraced incremental recomputation and shifted the focus to avoiding redoing the same work over and over, batch-processing speed would become largely irrelevant as a metric.
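As a minimal sketch of that framing (the cache layout and names are made up): key each file's parse result by a hash of its contents, and a rebuild only pays the parse cost for files that actually changed, so the parser's batch throughput stops being the headline metric.

```ts
import { createHash } from "node:crypto";

// Minimal incremental-recomputation sketch: cache each file's parse result
// keyed by a hash of its contents. On a rebuild, unchanged files hit the
// cache, so raw parser throughput only matters for the files you touched.
interface ParseResult { tokens: string[] }

const cache = new Map<string, { hash: string; result: ParseResult }>();

function parseSlow(source: string): ParseResult {
  // Stand-in for an expensive full parse.
  return { tokens: source.split(/\s+/).filter(Boolean) };
}

function parseIncremental(path: string, source: string): ParseResult {
  const hash = createHash("sha256").update(source).digest("hex");
  const hit = cache.get(path);
  if (hit && hit.hash === hash) return hit.result; // unchanged: no re-parse
  const result = parseSlow(source);                // changed: pay the cost once
  cache.set(path, { hash, result });
  return result;
}

// First build parses both files; the "edit" only re-parses b.js.
parseIncremental("a.js", "const a = 1");
parseIncremental("b.js", "const b = 2");
parseIncremental("a.js", "const a = 1");     // cache hit, no work done
parseIncremental("b.js", "const b = 2 + 2"); // re-parsed, it changed
```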