What's the tl;dr of how Julia is solving this? Looking around it seems the answer is "multiple dispatch". Which seems suspect considering many languages have already tried this (Common Lisp, for example).
> Clearly, multiple dispatch, or some other way around the expression problem, is necessary for the kind of fluent composability that I’ve described above—but it is not sufficient. Julia has enjoyed an explosive degree of uptake in the scientific community because it combines this feature with several others that make it very attractive to numericists.
That's incredibly handwavy. So what's the special sauce?
There is no such thing as a free lunch when it comes to dynamic vs. static. It also seems like Julia is trading off expressiveness and ease of use in favor of efficiency, based on comments from people who have used Julia. It's one thing to be faster than an inherently slow language (Ruby, Python, Smalltalk, etc.), but keeping that flexibility while being as fast as C/C++ is a rather bold claim. Most languages hit some middle ground between the two, such as Java. But no one is under the delusion that trade-offs weren't made to get there.
I guess the best way to put it is that Julia encourages a style where 90%+ of code can go through paths that are static.
Personally, I think Julia starts off as easy as Python, but to get C++ or Fortran speed you can't just code naively. Things hit a steep learning curve at that point, and perhaps there isn't yet much know-how about writing "professional Julia". There needs to be a book like Fluent Python or Effective C++ for Julia, or perhaps a condensed version of the Julia manual (see the one-page Zig manual for inspiration).
The other problem I have with Julia right now is the lack of static type checkers. "Modern Python" (i.e., Python in production over the last 5 years) tends to leverage the large ecosystem of tools that hook into mypy (I'm talking about tools like pydantic) to reduce the inherent brittleness of the language. Ruby, PHP, and every other dynamic language has seen the same trend.
Right now I've barely seen that with Julia, and it badly needs it for higher uptake in industry. It's why, for example, you mostly see Julia packages written for people's PhDs right now.
Julia does a few things differently than Common Lisp, though they both offer multiple dispatch.
One of the key things in CL is its metaobject protocol (MOP), which forces a lot of decisions about what gets executed to runtime. There are ways to speed it up, but if you have something like:
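The original snippet was lost here; reconstructing from the description below, it presumably defined primary methods on both number and integer plus a :before method on number, something like this (hypothetical generic function foo):

```lisp
(defmethod foo ((x number))
  (print "primary method on number"))

(defmethod foo ((x integer))          ; more specific: shadows the number primary
  (print "primary method on integer"))

(defmethod foo :before ((x number))   ; :before methods still run for integers
  (print "before method on number"))

;; (foo 42) runs the :before method on number, then the primary method on integer
```

With standard method combination, only the most specific primary method runs, but every applicable :before method does.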
Then CL won't call the foo specialized on number when given an integer, but it will call the foo :before method specialized on number. It determines this at runtime by searching for all applicable methods based on the type (at least as a first pass; you can cache this to speed it up, but then you also need cache invalidation if a definition changes).
Julia doesn't have that aspect of CL's MOP, which simplifies the search for applicable methods and dispatch. Even if Julia did all its dispatch at runtime, it would still be simpler. The other thing Julia does is aggressive JIT compilation. So if you wrote something like (with the Julia equivalent of foo from above):
    function bar(x, y)
        foo(x)
        foo(y)
    end
and, considering only floats and integers, later called it with each of the four float/integer pairings, then Julia would compile a specialized version for each combination. When you call bar it still has to dispatch properly, but once inside bar the search for the correct foo can be bypassed because the types are already known. CL, again thanks to the MOP, doesn't make that as easy to achieve.
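A minimal Julia sketch of that specialization behavior (the foo methods here are hypothetical stand-ins, not from the original comment):

```julia
foo(x::Integer) = x + 1
foo(x::AbstractFloat) = x * 2.0

function bar(x, y)
    foo(x)
    foo(y)
end

bar(1, 2.0)   # compiles bar(::Int64, ::Float64); both foo calls resolve statically
bar(2.0, 1)   # compiles a second specialization, bar(::Float64, ::Int64)
```

After the first call with a given type combination, subsequent calls with those types reuse the compiled specialization with no method search inside the body.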
Your instinct that Julia is making tradeoffs is indeed correct, however I don't think it actually limits the expressiveness of the language. I happen to think that Julia is a more expressive language than Python. However, it does require learning new patterns and paradigms and someone who tries to write Python code in Julia is probably bound to eventually get frustrated.
A huge part of the design considerations for Julia essentially boiled down to: "what sorts of dynamism and language semantics can we disallow while keeping the good parts of dynamism?"
The two biggest things that had to go in order to make Julia fast were:
1) the ability to change the memory layout of a struct in a running session
2) the ability to eval in the local scope (our eval always occurs in the global scope)
These two things are huge performance problems. We might one day solve 1) with Revise.jl (though it'll mean recompiling all your code if you do change the layout), but 2) is basically just a very bad idea and is likely to never happen. Instead of a locally scoped eval, we have macros, multiple dispatch, parametric types, and generated functions. These give an incredibly powerful suite of metaprogramming tools that are beyond anything available in Python.
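A small sketch of what global-only eval means in practice (hypothetical function f):

```julia
function f()
    y = 0               # local variable
    eval(:(y = 1))      # runs in the module's global scope: sets a *global* y,
                        # not the local one
    return y            # returns 0 -- eval cannot reach the local scope
end
```

Because eval can never rewrite a local scope out from under the compiler, a method's compiled code stays valid for the locals it was compiled with.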
There are trade-offs made. This discussion of "why Julia" describes how multiple dispatch + type stability is what gives the speed, along with the trade-offs and edge cases associated with that.
Julia makes a number of (in my opinion) really good tradeoffs here.
1. You can't add fields to a type (struct) after definition. This means that Julia's structs have no overhead and are essentially equivalent to structs in C (although they are parametric).
2. No local eval. Eval in Julia only happens in the global scope and results of eval are only visible the next time you visit the global scope. This may sound kind of unintuitive, but in practice people don't generally use this for good reasons. This allows Julia to never need to de-optimize code. Once a method is compiled that code remains valid.
3. Macros. Julia has really good macros and other code manipulation (since it is basically a Lisp). This makes it possible to generate very complicated but fast code that you would never write yourself. The tradeoff here is that it makes the language more complex, but that's a pretty good tradeoff. (especially compared to the C/Fortran land of using a preprocessor that works on text).
4. Just-in-time (really just-ahead-of-time) compilation. Julia at its core runs as if it were highly templated C++ code. If everything were compiled ahead of time, Julia would generate terabytes of compiled code and never finish compiling. Instead, Julia makes the tradeoff of only compiling for the argument types that are actually used in the program, which means it only compiles a reasonable amount of code. The tradeoff here is that compiling small binaries with Julia is very difficult (not possible to do automatically yet).
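Point 1 can be sketched in a few lines (the Point type here is a hypothetical example):

```julia
struct Point{T<:Real}       # parametric, like a C++ template
    x::T
    y::T
end

p = Point(1.0, 2.0)         # concrete Point{Float64}: laid out like a
                            # C struct of two doubles, no boxing
# p.z = 3.0                 # error: fields are fixed at definition time,
                            # so the layout can never change under the compiler
```

Because the layout is frozen at definition, the compiler can inline field accesses and pack arrays of such structs contiguously.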
The TLDR is that most expressive languages started by offering as much expressiveness as possible, and then looked at how they could be sped up. Julia started as a modern, fast language and looked to see how much expressiveness could be added without slowing it down.