Walter, watch out. You want to talk about dumbing down? California is considering a new law in which every computer language must have a keyword 'quine' which prints 'quine'. And none of this looking under the hood stuff. That's doing your own research. Trust the computer science! :)
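For contrast, an actual quine has to reproduce its whole source, not just print a keyword. A classic Python one-liner (any language capable of string formatting works):

```python
# The line below prints itself exactly: its output is its own source.
s = 's = %r; print(s %% s)'; print(s % s)
```

Feed the output back into the interpreter and it prints itself again, forever.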
Is there really no demand? Or do those of us who like to learn that way just get used to researching these things ourselves, and so quietly get on with it? Many of the existing tutorials are at least a good starting point to teach engineers what topics they need to examine in more detail.
Anecdotally, when I've mentored junior engineers I've had no shortage of people ask me "why" when I've explained concepts at a high level; they'd prefer I start at the bottom and work my way up. So I quite believe there could be an untapped demand out there.
There’s a difference between understanding something and learning how and why it works the way it does. You can understand how a compilation pipeline works without ever working with low-level code or writing a compiler yourself. You can walk across a bridge and understand that it connects point A with point B without understanding how that specific bridge had to be constructed. A concrete implementation is just an implementation detail, and if you focus too much on it you’ll get tunnel-visioned instead of understanding the concept behind it.
EDIT: And I say that as someone who likes both learning and teaching from the ground up. But there’s no demand for it because that’s not how you efficiently learn the concepts and understand the basics so you can take a deeper dive yourself.
> you’ll get tunnel-visioned instead of understanding the concept behind it
You might have gotten tunnel-visioned, but it's not a problem I've suffered from when learning this way. And why do you think I cannot understand the concept behind the code after reading the code? If anything, I take the understanding I've grokked from that reference implementation and then compare it against other implementations. Compare usages. And then compare that back to the original docs. But usually I require reading the code before I understand what the docs are describing (maybe this is due to my dyslexia?)
Remember when I said everyone's brain is wired differently? Well there's a lot of people today trying to tell me they understand how my brain works better than I understand it. Which is a little patronising tbh.
I also have a hard time learning concepts if there are handwavey parts. At times I remember by recreating the higher-level concepts from lower-level ones.
To me, the abstraction is an oversimplification of actual, physical, systemic processes. Show me the processes, and it's obvious what problem the abstraction solves. Show me only the abstraction, and you might as well have taught me a secret language you yourself invented to talk to an imaginary friend.
I don't believe most productive programmers learned the quantum physics required for representing and manipulating 1s and 0s before they learned how to program. Abstractions are useful and efficient.
You're more comfortable with a certain level of abstraction that's different from others'. I can't endorse those who try to criticize your way of understanding the world, but I'd also prefer if some of the people in this thread who subscribe to this "bottom up" approach had a bit more humility.
I think part of it comes from believability, or the inability to make a mental model of what is going on under the hood. If something seems magical and you don't really understand what is going on, it can be hard to work with because you can't predict its behavior in a bunch of key scenarios. It basically comes down to what axiom set people are comfortable with. It gets really bad when the axiom set is uneven as you're teaching it, with some higher abstractions treated as axiomatic / hand-waved while others are filled in. This is probably also an issue for the experienced, because they bring some filled-in abstractions from experience, so their understanding is uneven and that unevenness bugs them.
Like how limits in calculus involve infinity, or dividing by an unspecified number, which seems non-functional or handwavy in itself. How the hell does that actually function in a finite world, then? Why can't you actually specify the epsilon to be a concrete number, etc.? If you hand-wave over it, then using calculus just feels like magic spells and ritual rather than actual understanding. The more that 'ritual' bugs you, the less you're able to accept it, and it becomes a blocker. This can be an issue if you learned math as a finite thing that mostly maps onto reality.
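For what it's worth, the thing the hand-waving hides is the standard epsilon-delta definition, which never actually touches infinity and never fixes a concrete epsilon:

```latex
% Epsilon-delta definition of a limit: no infinity, no single epsilon.
\lim_{x \to a} f(x) = L
\iff
\forall \varepsilon > 0 \; \exists \delta > 0 :
\quad 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon
```

You can't pin epsilon to a concrete number because the claim is precisely that the inequality holds for every epsilon, however small; each epsilon is a finite number, so no infinite quantity ever appears.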
For me to get past the calculus issue, I had to realize that math is basically an RPG, and doesn't actually need to match reality with its finite limits or deal with edge cases like phase changes that might pop up once you reach certain large-number thresholds. It's a game, and it totally, completely does not have to match actual reality. When I dug into this with my math professors, they told me real continuous math starts in a 3rd-year analysis class, and sorry about the current handwaving, and no, we won't make an alternative math degree path that starts with zero handwaving and builds it up from the bottom.
The last time I learned a new programming language (Squirrel), I did so by reading the VM and compiler source code in detail rather than writing code. You get a far more complete picture of the semantics that way! I didn't even read much of the documentation first; it answered far too few of my questions. (Edit:) I want to know things such as: how much overhead do function calls have, what's the in-memory size of various data types, which strings are interned, can I switch coroutines from inside a C function called from Squirrel...
> I want to know things such as: how much overhead do function calls have, what's the in-memory size of various data types, which strings are interned, can I switch coroutines from inside a C function called from Squirrel...
So is that a problem with learning from abstractions, or simply a problem that this stuff isn't mentioned in the manual?
I do. I recommend it as a way to avoid thinking Haskell is magic, which a lot of people seem to be convinced of. GHC has pretty good desugared printing options.
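Concretely (assuming a stock GHC install; adjust the module name to yours), the dump flags I mean are along these lines:

```shell
# Dump the desugared Core for a module, with type/module noise suppressed.
ghc -fforce-recomp -ddump-ds -dsuppress-all MyModule.hs

# Or look at the optimized Core after the simplifier has run:
ghc -fforce-recomp -ddump-simpl -dsuppress-all MyModule.hs
```

`-fforce-recomp` just makes sure the dump is produced even when the module hasn't changed.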
I'm not sure how to view asm for HotSpot or a JavaScript engine though.
By the way, where can I read a D tutorial from the bottom up?