This is similar to the "Lattice Method" that they taught (still teach?) here in the US. I can't stand it. The students simply learn an algorithm, without developing a sense of place value and an intuition for "how numbers work." Sure, they can quickly multiply large numbers and it's a cool "party trick." But they aren't learning math. Might as well use an iPhone.
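(For anyone who hasn't seen it: the lattice method fills a grid with single-digit products and then sums along the diagonals with carries. A rough sketch of the bookkeeping, assuming the usual paper-and-pencil layout:)

```python
def lattice_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers the way the lattice method
    does on paper: one times-table lookup per cell, then diagonal sums
    with carries."""
    xs = [int(d) for d in str(a)]   # digits of a, most significant first
    ys = [int(d) for d in str(b)]
    n, m = len(xs), len(ys)
    # One slot per diagonal, counted from the bottom-right corner.
    diag = [0] * (n + m)
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            p = x * y                        # a single times-table fact
            d = (n - 1 - j) + (m - 1 - i)    # diagonal of cell (i, j)
            diag[d] += p % 10                # units digit on this diagonal
            diag[d + 1] += p // 10           # tens digit spills to the next
    # Sum each diagonal, carrying into the next, exactly as done by hand.
    carry, digits = 0, []
    for total in diag:
        total += carry
        digits.append(total % 10)
        carry = total // 10
    return int(''.join(map(str, reversed(digits))))

print(lattice_multiply(12, 34))   # 408
```

Note that the inner line `x * y` is the whole point of the subthread: every method above the parlor-trick level still bottoms out in memorized single-digit products.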
Fair point that memorizing an algorithm doesn't necessarily advance the student's intuition for how numbers work, but keep in mind that you're replying in a subthread about how learning a general algorithm is an improvement over memorizing multiplication tables.
Ideally, we teach children the intuition for mathematics. Just memorizing arithmetic algorithms isn't ideal, but it's surely better than memorizing finite arithmetic tables.
I don't think it is better to learn some shortcut algo at a young age. Memorizing your times tables up to say 10 * 10 or 12 * 12 is fundamental to understanding multiplication. Remember, the brain is not a Turing machine; it's an associative-memory machine. Facts are the foundation of knowledge.
First memorize the times tables. Then learn the long method. Then learn the shortcuts. IMO, of course.
I agree with you. To do multiplication by any non-parlor-trick method you have to be able to multiply single-digit numbers. If you don't memorize the table then every time you do larger numbers the process becomes that much more of a chore.
Now, that is not to say that you should forgo understanding multiplication. But forcing students to memorize the basics (like forcing them to memorize verb forms) (1) makes their lives easier later on and (2) gives them the chance to spot patterns themselves.