I've been using Deckard since the earliest builds, and I can confirm libghostty was the worst part of it, to the point that I had to go back to my old terminal.
I came back to Deckard a week ago and I've been enjoying building out my startup rmBug in it, running many sessions in parallel and never missing a beat when one of them is ready for my input. I'd go so far as to say it was instrumental in yesterday's launch.
The web was built to be open and available to everyone. Back when you served static HTML from disk, nobody could hurt you because there was nothing to hurt.
We need bot protection now because everything is dynamic, straight from the database with some light caching for hot content. When Facebook decides to recrawl your one million pages in the same instant, you're very much up shit creek without a paddle. A bot that crawls the full site doesn't steal anything, but it does take down the origin server. My clients never call me upset that a bot read their blog posts. They call because the bot knocked the site offline for paying customers.
Bot protection protects availability, not secrecy.
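For what it's worth, most of the availability side is solvable with a few seconds of microcaching in front of the app. Here's a minimal nginx sketch; the upstream address, paths, and cache sizes are placeholders, not anyone's production config:

```nginx
# Cache successful responses for a few seconds so a crawl burst
# hits the cache instead of the database.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=microcache:10m
                 max_size=1g inactive=60s;

server {
    listen 80;

    location / {
        proxy_cache microcache;
        proxy_cache_valid 200 5s;   # even 5s of caching collapses a mass recrawl
        proxy_cache_lock on;        # concurrent misses wait for one upstream fetch
        proxy_cache_use_stale updating error timeout;  # serve stale rather than fall over
        proxy_pass http://127.0.0.1:8080;  # assumed app server address
    }
}
```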
And the real bot problem isn't even crawling. It's automated signups. Fake accounts messaging your users. Bots buying out limited drops before a human can load the page. Like-farming. Credential stuffing. That's what bot protection is actually for: preventing fraud, not preventing someone from reading your public website.
Cloudflare's `/crawl` respects robots.txt. Don't want your content crawled? Opt out. But if you want it indexed and can't handle the traffic spike, this gets your content out without hammering production.
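For anyone who hasn't written one in a while, the opt-out is a couple of lines of robots.txt. I don't know offhand what user-agent token the `/crawl` service sends, so "ExampleCrawler" below is a placeholder; the wildcard form blocks every compliant crawler:

```
# "ExampleCrawler" is a placeholder token, not Cloudflare's real one
User-agent: ExampleCrawler
Disallow: /

# Or opt out of all compliant crawlers:
# User-agent: *
# Disallow: /
```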
As for the folks saying Cloudflare should keep blocking all crawlers forever: AI agents already drive real browsers. They click, scroll, render JavaScript. Go look at what browser automation frameworks can do today and then explain to me how you tell a bot from a person. That distinction is already gone. The hot takes are about a version of the internet that doesn't exist anymore.
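If that sounds abstract, here's roughly what driving a real browser looks like with Playwright. The URL and selector are placeholders, but the click/scroll/render calls are the actual API:

```python
# A real Chromium instance that clicks, scrolls, and executes
# JavaScript like any human visitor. (URL and selector are placeholders.)
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://example.com")  # full JS rendering, not a raw HTML fetch
    page.mouse.wheel(0, 2000)         # scroll like a person skimming
    page.click("a.read-more")         # placeholder selector
    print(page.title())
    browser.close()
```

Nothing in that traffic screams "bot" the way a curl request does.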
Evolving existing solutions that work is always a balancing act -- do you buy into the latest trends wholesale, or do you stop and consider your actual needs before picking the solution that will serve you best? This is a recap of how we successfully containerized workloads without Kubernetes.
The first version was built with Facebook login on purpose, because we offload the social graph building to FB. Destinations are suggested based on where your friends currently live, and where they've previously been or want to go in the future.
It's not just about skipping user registration and passwords; it's about using the data from FB to give users added value.
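As a toy illustration of the idea (not our real implementation; the field names and weights are invented), the suggestion logic is roughly "score destinations by friend signals":

```python
# Toy sketch: score candidate destinations by weighted friend signals.
# Field names and weights are invented for illustration.
from collections import Counter

WEIGHTS = {"lives_in": 3.0, "visited": 1.0, "wants_to_visit": 2.0}

def suggest_destinations(friends, top_n=5):
    """friends: dicts like {"lives_in": "Lisbon",
    "visited": ["Tokyo"], "wants_to_visit": ["Oaxaca"]}"""
    scores = Counter()
    for friend in friends:
        scores[friend["lives_in"]] += WEIGHTS["lives_in"]
        for city in friend.get("visited", []):
            scores[city] += WEIGHTS["visited"]
        for city in friend.get("wants_to_visit", []):
            scores[city] += WEIGHTS["wants_to_visit"]
    return [city for city, _ in scores.most_common(top_n)]
```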
We are in the early stages of building a completely non-social experience, possibly including a fully logged-out mode, which will use even more signals to suggest destinations. We don't have an ETA to announce for this yet.
Ugh, really? Travel for me is almost exclusively about getting away from my "social graph." If I'm taking a vacation, it is hopefully nowhere near anyone I know.
I didn't make it past page three. Enjoy responsibly.