I only recently learnt C, and -Wall -Wextra -Werror was drummed into me as being "the right way". It certainly does catch a lot of my idiotic mistakes.
Unfortunately, if you're using anyone else's code (especially multi-platform code), -Werror is sometimes not feasible. For one, lots of third-party code is full of warning-level issues; most of them aren't real bugs, and fixing them all up would be a job for a whole separate company.
For another, sometimes warnings exist because what a preprocessor macro expands to varies by configuration, so the compiler can't see the uses that exist in other builds when it checks for things like unused variables/parameters or type ranges. For example, some Linux kernel macros ignore certain parameters on a lot of architectures; on those architectures you get unused-variable warnings if the variable/parameter isn't used anywhere else. In other places, data types change between signed and unsigned depending on architecture, so a generally meaningful >= 0 test will warn on platforms where the variable happens to be unsigned.
Long story short, the preprocessor, while super-useful, is not good for static checking.
Although be aware that -Wall and -Wextra activate different warnings on different compilers, and even on different versions of the same compiler; gcc, for instance, has added lots of warnings to them in recent releases.
So your code might not compile anymore on newer compiler versions. That's not a problem as such, but combined with, say, updates to your continuous integration environment, it can cause build failures.
Yeah, warnings are out of control. New architectures and platforms bring new things to be concerned about, and thus new warnings. But old code stops building solely because of the issue you mention.
My number one takeaway from this is that we should all be using them more.