I've been thinking about building either a local proxy or a Firefox addon that puts everything into "first-party only" mode unless whitelisted.
It would probably break the web (at least the genuine parts of it) less than disabling JavaScript wholesale, which is just too awkward, but it would vastly cut down on the "bullshit", as this article calls it.
It would need some care: for example, it would probably have to match origins at the root-domain level rather than by subdomain to prevent too much breakage. But the improvement in download times would be astronomical; it's almost always the case that "bloat" is third-party bloat.
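To make the root-domain matching concrete, here's a rough sketch of what I mean. The tiny suffix set is just a stand-in; a real version would use the full Public Suffix List so things like co.uk are handled properly:

```python
from urllib.parse import urlsplit

# Stand-in for the Public Suffix List: multi-part suffixes where the
# registrable domain is three labels, not two.
MULTI_PART_SUFFIXES = frozenset({"co.uk", "com.au"})

def root_domain(host):
    """Naive registrable-domain extraction: last two labels, or three
    when the last two form a known multi-part suffix."""
    labels = host.lower().rstrip(".").split(".")
    n = 3 if ".".join(labels[-2:]) in MULTI_PART_SUFFIXES else 2
    return ".".join(labels[-n:])

def same_party(page_url, request_url):
    """Treat a request as first-party if its host shares a root
    domain with the page that issued it."""
    return root_domain(urlsplit(page_url).hostname) == \
           root_domain(urlsplit(request_url).hostname)
```

So cdn.example.com would count as first-party on www.example.com, but tracker.net would not, which is exactly the breakage-vs-blocking trade-off I'm after.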
Obviously it would need to support a whitelist too, so that payment processors, for example, could continue to work, but in general the blacklist approach of ad-blockers just isn't working for me.
I think some kind of "auto-whitelist" (so I'd have to actively request a domain before requests could be made to it) would be the sweet spot for user experience, but that itself would require substantial browser integration, which I don't think a plugin could provide.
Perhaps a proxy approach would be best from a UX perspective, then. It could inspect headers to figure out whether they're primary requests (using heuristics similar to CORS). Primary requests would (or could) be added to an auto-whitelist for future requests.
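The header heuristic might look something like this sketch (my assumptions, not a spec): trust the browser's Sec-Fetch-Site metadata where it's present, and otherwise fall back to comparing the Referer's root domain against the request's host. The root_domain helper here is a crude stand-in for a proper Public Suffix List lookup:

```python
from urllib.parse import urlsplit

MULTI_PART_SUFFIXES = frozenset({"co.uk", "com.au"})  # PSL stand-in

def root_domain(host):
    """Naive registrable domain: last two labels, or three for
    known multi-part suffixes."""
    labels = host.lower().rstrip(".").split(".")
    n = 3 if ".".join(labels[-2:]) in MULTI_PART_SUFFIXES else 2
    return ".".join(labels[-n:])

def is_primary_request(headers, request_host):
    """Heuristic classification of a proxied request as 'primary'.
    Sec-Fetch-Site is sent by modern browsers: same-origin/same-site
    means the page is talking to itself, none means user-initiated
    (typed URL, bookmark). Otherwise compare Referer root domains."""
    site = headers.get("sec-fetch-site", "").lower()
    if site in ("same-origin", "same-site", "none"):
        return True
    referer = headers.get("referer", "")
    if referer:
        ref_host = urlsplit(referer).hostname or ""
        return root_domain(ref_host) == root_domain(request_host)
    return False
```

The obvious catch is that a plain proxy only sees these headers for HTTP traffic; for HTTPS it would have to terminate TLS locally (i.e. MITM itself with a trusted local CA) to inspect anything beyond the hostname.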