You only need a few thousand error-free qubits to implement Shor's algorithm for 256-bit elliptic-curve discrete log, which would, for instance, break nearly all deployed public-key crypto. The "millions" figure is trying to account for the several orders of magnitude of error-correction overhead.
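To make the "thousands of logical qubits, millions of physical ones" gap concrete, here is a rough back-of-envelope sketch under a standard surface-code error model. The logical-qubit count (~2330 for 256-bit ECDLP), the physical error rate, the threshold, and the logical-error scaling formula are all illustrative assumptions, not measured hardware figures:

```python
# Back-of-envelope estimate of physical qubits needed to run Shor's
# algorithm on 256-bit ECDLP with surface-code error correction.
# All parameters below are assumptions for illustration only.

def surface_code_distance(p_phys, p_logical_target, p_threshold=1e-2):
    """Smallest odd code distance d such that the (assumed) scaling
    p_L ~ 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2)
    pushes the logical error rate below the target."""
    d = 3
    while 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2) > p_logical_target:
        d += 2
    return d

logical_qubits = 2330   # ballpark literature figure for 256-bit ECDLP (assumption)
d = surface_code_distance(p_phys=1e-3, p_logical_target=1e-12)
physical_per_logical = 2 * d ** 2   # rough: data + syndrome qubits per logical qubit
total = logical_qubits * physical_per_logical

print(f"code distance d = {d}")
print(f"physical qubits per logical qubit ~ {physical_per_logical}")
print(f"total physical qubits ~ {total:,}")
```

With these assumed numbers the overhead lands at hundreds of physical qubits per logical qubit and a total in the millions, which is exactly where the "millions" estimate comes from.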
Sure, I just don't think error-free qubits are a thing (or ever will be). I don't think anyone seriously expects quantum computing to work without error correction.
The difficulty of adding qubits increases super-linearly with the number of qubits (especially because communication delays grow relative to the decoherence time), so "only" a few thousand is already very optimistic. Worse, "error-free qubits" are essentially like cold fusion: you can say the words and we understand what you mean by them, but they don't describe anything that can exist in practice.