I need to send this to the people who designed the website for one of my favorite local pizza places, Dino's (https://www.dinostomatopie.com/) -- they did _really_ well, generally, but no matter how hard you try, some things give it away: the fonts rendered as text are perfectly crisp at whatever absurd DPI this Chromebook has.
This is better than 50% of websites out there today. I can find all the buttons, the functions of things are clear, and I know how to contact the owner. The menu is text instead of a PDF, and the whole thing appears to run properly without JavaScript enabled.
This is glorious. The only way I could tell with any confidence it wasn't real was the clean, modern HTML and CSS (get with the program, folks -- this should be non-validating XHTML 1.1!)
Using many apps with sound enabled is shocking after long periods of app silence. Social media apps sound like slot machines! So much Pavlovian conditioning... Not sure how all that is still legal.
I'm happy to report that I was able to download the zip file and run the executable on my Ubuntu box with Wine. It was quite surreal to see the Netscape installer running on my 4k monitor. I was even able to open the browser and navigate to Google.
These days you often have to run old browsers through a local proxy you configure yourself if you're really interested in surfing the web with them.
I'm about to go to sleep, but there are proxies that forward https as http, convert webp and png to gif, strip JavaScript to prevent the browser from locking up, etc.
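To give a flavor of what those rewrites look like, here's a minimal sketch of two of them in plain Python (the function names are mine, and a real proxy would also need to fix up headers, content-length, chunked encoding, and do the actual image transcoding):

```python
import re

def downgrade_links(html: str) -> str:
    """Rewrite https:// links to http:// so the old browser never sees
    a TLS URL it can't negotiate; the proxy fetches the real https
    resource upstream on the browser's behalf."""
    return html.replace("https://", "http://")

def strip_scripts(html: str) -> str:
    """Remove <script> blocks that would lock up a vintage JS engine."""
    return re.sub(r"<script\b.*?</script>", "", html,
                  flags=re.IGNORECASE | re.DOTALL)

page = '<a href="https://example.com">hi</a><script>while(1){}</script>'
print(strip_scripts(downgrade_links(page)))
# -> <a href="http://example.com">hi</a>
```

The same regex-replace idea extends to rewriting image URLs at a conversion endpoint, which is how the webp/png-to-gif trick usually works.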
There's a real world use for this in crazy companies where people are forced on some old browser. This makes them not go down in glorious flames at every site.
There's also a nice (though under-documented) scripting interface, so you could probably write a script that does this for all sites in regular proxy mode. I found an example of this here, though I didn't test it: https://groups.google.com/forum/#!topic/mitmproxy/IAJ0-MHVC0...
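For reference, a mitmproxy addon doing this kind of rewriting might look something like the sketch below. The `response` event hook is mitmproxy's documented addon API, but I haven't tested this end to end, so treat the details as an assumption; the import is guarded so the helper still works without mitmproxy installed.

```python
import re

try:
    # Real package; only needed when the file is loaded as an addon.
    from mitmproxy import http  # noqa: F401
except ImportError:
    http = None

def declutter(html: str) -> str:
    """Strip <script> blocks so a vintage browser doesn't lock up."""
    return re.sub(r"<script\b.*?</script>", "", html,
                  flags=re.IGNORECASE | re.DOTALL)

def response(flow) -> None:
    """mitmproxy calls this hook for every completed server response."""
    ctype = flow.response.headers.get("content-type", "")
    if "text/html" in ctype:
        flow.response.text = declutter(flow.response.text)
```

You'd load it with something like `mitmdump -s declutter.py` and point the old browser at the proxy port.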
Back in the day I used RabbIT, which handled both the https/http and the png problems, but it looks like it hasn't been updated in years. It looks like Squid can handle the https problem (https://wiki.squid-cache.org/Features/HTTPS).
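For what it's worth, the Squid side of that is the SSL-Bump feature, and the configuration looks roughly like the fragment below. This is a sketch from memory -- the CA cert path is a placeholder, and you should check the linked wiki page before trusting any directive:

```
# Squid SSL-Bump sketch: terminate TLS at the proxy so the old
# browser only ever speaks plain http to it.
http_port 3128 ssl-bump \
    generate-host-certificates=on \
    dynamic_cert_mem_cache_size=4MB \
    cert=/etc/squid/proxy-ca.pem

# Peek at the SNI first, then bump (decrypt) everything.
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all
```

The proxy's generated certificates are signed by your own CA, which doesn't matter much here since the ancient browser is talking plain http to the proxy anyway.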
This actually doesn't look as easy to do in 2020 as it was in 2009. I also had luck chaining proxies that did different things, so one would apply a certain transformation and pass the result off to another, etc.
I've also done partial implementations just using PHP ... you can preg_replace most things and it more or less works.
But yes, I had assumed things had gotten easier; apparently they've actually become more difficult.
While this is useful for people with out-of-date browsers, a slightly more sophisticated use case is rendering the entire web page remotely in a sandbox for certain classes of links, e.g. email links. This prevents a whole class of phishing attacks.
There are companies who will sell this to you, but essentially it's just headless Chrome running in the cloud somewhere.
It stops a man-in-the-middle attacker from trying to exploit your browser/extensions/password manager by injecting code into that site (once the cert is pinned).
It might also help with SEO? Google's been pushing for https everywhere.