I watched the talk linked by Ives, the guy behind CodeSandbox.
So, the exact quote from that talk is:
"... and if you take the existing module map (file), it essentially replaces (function) #1 with this (new) one. And this is how bundlers like Metro work, for example -- they regenerate the file, and over a websocket connection, they send the new file and say 'replace function #1 with this one, and execute it.'"
He then touches on how bundlers do a lot more than just transformation in this step: chunking, tree-shaking, code-splitting, and so on. This makes it incredibly difficult to cache a file properly, because you can't derive a good heuristic for diffing/patching.
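The module-map replacement Ives describes can be sketched in plain JavaScript. This is a rough illustration only -- the message shape, `applyUpdate`, and the use of the `Function` constructor are my own stand-ins, not Metro's actual protocol:

```javascript
// Client-side sketch of Metro-style HMR: the bundle is a map of
// module factories keyed by id, and an update swaps one entry.
const moduleMap = {
  1: () => console.log("old module #1"),
  2: () => console.log("module #2"),
};

// Apply an update message like { id, code }: replace the factory
// for that module id, then re-execute it.
function applyUpdate(update) {
  // new Function(...) is a stand-in for however a real bundler turns
  // the transformed source it received back into an executable module.
  moduleMap[update.id] = new Function(update.code);
  moduleMap[update.id]();
}

// In a real client this message would arrive over the websocket.
applyUpdate({ id: 1, code: 'console.log("new module #1");' });
```

The key property is that only the one entry in the map changes; every other module keeps running as-is.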
And then the quote of the hour:
"But what if bundling had an O complexity of 1? That would mean that if one file changes, we only transform THAT file and send it to the browser." (no others in dependency tree)
Above quotes start here, and go for about 2 minutes: https://youtu.be/Yu9zcJJ4Uz0?t=1018
In this context, I believe Ives is aiming at an idea that's pervasive throughout the talk: this "50ms or second HMR time."
If you overlook the slight variance in update time between files with more or less content, and average it out to "50ms, every file, any file," I think that would qualify as O(1) bundling/hot-reloading, right?
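The O(1) claim amounts to this: the work done per change is proportional to one module, not to the size of the dependency graph. A toy contrast (all names here are hypothetical, and `transform` is a stand-in for real Babel/SWC-style compilation):

```javascript
// Stand-in for a real source transform (Babel, SWC, etc.).
const transform = (source) => source.toUpperCase();

// O(n): a change to any file triggers re-transforming every module
// in the graph and regenerating the whole bundle.
function rebundleAll(files) {
  return Object.fromEntries(
    Object.entries(files).map(([name, src]) => [name, transform(src)])
  );
}

// O(1): a change re-transforms only the changed file; the patch sent
// to the browser is that single module, not a new bundle.
function patchOne(cache, name, src) {
  cache[name] = transform(src);
  return { [name]: cache[name] }; // what goes over the websocket
}
```

With `patchOne`, the cost of an edit stays flat no matter how many other files the project contains, which is exactly the property the "50ms, every file, any file" framing depends on.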
However, I think the intuition and excitement exist around the idea of an ideal bundler, and this project provides hard evidence that we can strive for it with incremental success.