We as developers might have plenty of RAM to waste, but our users usually don't. If it runs well on our machines but not on theirs, what good is it?
Even if it runs well on the users' machines, how many resource-heavy applications can they run concurrently? Will they have to close one to open another?
Unfortunately the attitude is pervasive, and Electron is also used for applications more in line with what your mom would be using. I think many developers would be shocked that a good chunk of their audience has less than 8GB of RAM: http://store.steampowered.com/hwsurvey
It's also a mistake to assume developers have plenty of RAM to waste. A couple of years ago I was coding on the way to work on a potato with only 2GB of RAM, which made Atom a no-go.
Also, what happened to the idea of dog-fooding? It's not dog-fooding if you're on a machine much faster, and with more memory, than your users'. I think the problem with Windows 10 is that all the devs were dog-fooding on top-of-the-line Surface Pros.