h.264 and WebM are similar enough that whether a "hardware decoder" can handle one or both depends entirely on the implementation details. Not all hardware decoders simply take in an h.264 bitstream and emit uncompressed a/v signals. Many "hardware" solutions are just a collection of execution units that handle the most time-consuming stages of decoding or encoding, strung together with software, and possibly GPU shaders, to form a complete pipeline. For such devices, it may be possible to accelerate WebM by running most stages through the fixed-function units and handling just a few things, like entropy coding, with compute shaders. Such a solution would be just as deserving of the "hardware accelerated" moniker as most of the decoding engines currently out there.
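To make that concrete, here is a minimal sketch of how such a hybrid pipeline might route stages per codec. The stage names, capability sets, and function names are all hypothetical, not any real driver API; the point is just that a stage falls back to compute shaders when no fixed-function unit covers it:

```python
# Hypothetical sketch of hybrid decode-pipeline routing.
# Stage names and capability sets are illustrative only.

# Which stages this imaginary chip has fixed-function units for.
# VP8 (WebM) is assumed to lack a dedicated entropy-decode unit.
FIXED_FUNCTION_STAGES = {
    "h264": {"entropy", "inverse_transform", "motion_comp", "deblock"},
    "vp8":  {"inverse_transform", "motion_comp", "deblock"},
}

ALL_STAGES = ["entropy", "inverse_transform", "motion_comp", "deblock"]

def build_pipeline(codec):
    """Return (stage, backend) pairs: fixed-function where the hardware
    supports it, compute shaders (or CPU) for everything else."""
    hw = FIXED_FUNCTION_STAGES.get(codec, set())
    return [(stage, "fixed-function" if stage in hw else "compute-shader")
            for stage in ALL_STAGES]

for stage, backend in build_pipeline("vp8"):
    print(f"{stage:18s} -> {backend}")
```

On this imaginary device, VP8 decode would still run three of four stages on fixed-function hardware, with only entropy decoding pushed to shaders.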
Yeah, I wasn't very clear in my first post. What I was getting at is that it is hardware you can't reconfigure on the fly, which means you would have to choose between h.264 and WebM; you can't have both. For the next few years at least, the choice is clearly going to be h.264, not WebM.