"In short, if you’re in search of generalizable knowledge that compounds exponentially over time, then blub studies looks like the crap you have to wade through to get to the good stuff. So it’s easy to see why people give up on understanding all the blub they’re surrounded by, except what they need to get the job done."
Yes, AND you should be studying advanced topics not included in your blub of choice. Neither alone is sufficient. Too many programmers study only blub, or only the extreme fringes. A healthy mental diet has the programmer studying both in equal measure.
Blub pays the bills today, fringe pays the bills tomorrow.
Yeah, but I think the point of this blog post is to distinguish shallow blub (copy-and-paste info from Stack Overflow) from deep blub (going deep into the source code, instrumenting it, reimplementing it), and to argue that the latter allows for generalization and even practical understanding of fringe ideas.
I dunno. IMO, there is just too much deep blub around, and it loses value too fast, for a significant investment in deep blub to be a practical approach.
There’s a book called CLR via C# that explains the inner workings of the CLR from the standpoint of a C# programmer. I’ve learned more about JIT compilers, static typing, object oriented programming, and language runtimes in general from it than I ever would have expected. I also work in Elixir at my day job, and that book doesn’t describe the BEAM directly, but it gave me a pattern for how to think about how BEAM works that maps better than you might expect.
Blub is greater than fringe because blub exists in objective reality. Fringe will eventually exist in objective reality too, but in such a different context that none of its authors will recognize they're staring fringe right in the face.
This is exactly what happened to Alan Kay when OOP was fringe. He watched all the mainstream programming languages implement OOP and exclaimed, "I don't recognize any of these as OOP" — even though it was exactly the OOP he envisioned, in such a different context that he couldn't see his own idea. The same thing will happen to monads: Simon Peyton Jones will walk into a coding bootcamp, see a new programming language with monads, and exclaim, "I don't know what you're programming in, but these aren't monads." And the same will happen to Bayesian convolutional neural networks, Word2Vec, and gradient descent, if machine learning is your style.
This part rang particularly true for me.