Thanks for that! Some people I work with are constantly asking for ML; they invoke it like it's magic that will figure things out by itself. When I push back and ask how they would make the decisions themselves, their answers tend to be along the lines of "it's ML, it should figure it out by itself", and when I ask about the data to be used, "it should adapt itself and find the data". Getting to a heuristic in the first place is so hard.
Reminds me of the book "Everything Is Obvious", where they ran several experiments showing that in complex systems, advanced prediction models built on many available and seemingly relevant variables are only marginally better (2 to 4% in their experiments) than the simplest heuristics you can use. They interpreted that as a limit of predictability: systems of sufficient complexity behave with a seemingly irreducible random component.