All excellent points, and I think you should DM me on Twitter to chat about this more. (I hope you will!)
Sorry, I don't work on neural networks much, and my plate is too full with other projects (and my DSP is a bit rusty) to hold a conversation on this right now. I don't use Twitter much either.
DCT is on my radar, but it has several serious limitations that I think are overlooked. For example, with the DCT, convolution is no longer a simple component-wise multiplication. That seems, to me, like a big deal.
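To make the point concrete, here's a small sketch (mine, not from the thread) of the property the FFT has and the DCT lacks: pointwise multiplication of DFTs equals circular convolution of the original signals.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
x = rng.standard_normal(n)
h = rng.standard_normal(n)

# Circular convolution computed directly in the signal domain.
direct = np.array(
    [sum(x[m] * h[(k - m) % n] for m in range(n)) for k in range(n)]
)

# Same result via simple component-wise multiplication of the DFTs.
via_fft = np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)).real

assert np.allclose(direct, via_fft)
```

Multiplying DCT coefficients component-wise does not give you a convolution of the inputs; the DCT's analogue (symmetric convolution) needs extra structure, which is the limitation being pointed at.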
Complex numbers are tricky to model, but I think most people have given up too easily, or haven't been creative enough in how they model them. Some of my (outdated) ideas: https://gist.github.com/shawwn/c6865fccafac5066e1c7bab672781...
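One common workaround (a hypothetical sketch, not necessarily what the gist proposes): represent a complex tensor as two real channels and implement complex multiplication with four real multiplies, so a real-valued network can still operate on frequency-domain data.

```python
import numpy as np

def complex_mul(a_re, a_im, b_re, b_im):
    # (a_re + i*a_im) * (b_re + i*b_im), expanded into real arithmetic,
    # as a real-valued network would have to compute it.
    return a_re * b_re - a_im * b_im, a_re * b_im + a_im * b_re

# Check against native complex multiplication.
a = np.array([1.0, 2.0]) + 1j * np.array([3.0, -1.0])
b = np.array([0.5, -2.0]) + 1j * np.array([1.0, 4.0])
re, im = complex_mul(a.real, a.imag, b.real, b.imag)
assert np.allclose(re + 1j * im, a * b)
```

The two-channel trick preserves the arithmetic, but it loses the algebraic structure (e.g. phase) unless the architecture is designed around it, which is part of why complex numbers stay hard to model.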
In other words, you're probably right, but I'm focused solely on FFTs on the (very low) chance that people have overlooked something that will work well.