Hacker News | estebank's comments

Social media being bad is partly because of shady business practices, and partly because a lot of people suck (in different ways, at different times, including us).

Having said all of that, have you tried Mastodon?


Mastodon, Bluesky, etc are neat - both in what they're trying to be and their technology. But ultimately these days I reject them in favor of more local socialization (again, not geographically). What this looks like is a constellation of private (or pseudo private) discord communities. If I make friends in one, I often get invited to another. I recognize the merit in broader social forums like Mastodon, but it is not worth the drawbacks to me.

As an aside, I'm not happy with Discord as a platform so I'm working on my own clone with some common identity stuff but with community servers run independently. That is, there are some "federated" identity providers so community servers can agree on identity across servers, then each community server runs its own thing. The trust model is based on the community server - private channels in a community server are not E2E encrypted, you must trust the server. But DMs and DM groups are E2E encrypted and use mutual community servers as relays (with a special class of relay server for people who want to DM but don't have an actual mutual server). I'm having fun with it. Now if only I could figure out why my video has such high latency (even locally!).
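The relay-selection rule described above (use a mutual community server, else a dedicated relay) can be sketched roughly like this; all names here are illustrative, not taken from the actual project:

```rust
use std::collections::HashSet;

// Hypothetical sketch: DMs are routed through a community server both users
// share; if no mutual server exists, fall back to a dedicated relay.
fn pick_relay<'a>(
    mine: &'a HashSet<String>,
    theirs: &'a HashSet<String>,
    fallback_relay: &'a str,
) -> &'a str {
    mine.intersection(theirs)
        .next()
        .map(|s| s.as_str())
        .unwrap_or(fallback_relay)
}

fn main() {
    let mine: HashSet<String> = ["alpha", "beta"].iter().map(|s| s.to_string()).collect();
    let theirs: HashSet<String> = ["beta", "gamma"].iter().map(|s| s.to_string()).collect();
    // One mutual server ("beta"), so it is used as the relay.
    assert_eq!(pick_relay(&mine, &theirs, "relay.example"), "beta");

    let strangers: HashSet<String> = ["delta"].iter().map(|s| s.to_string()).collect();
    // No mutual server: fall back to the special-purpose relay.
    assert_eq!(pick_relay(&mine, &strangers, "relay.example"), "relay.example");
}
```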


Hey! It also had a barely working physics engine.

Then again the dinosaurs were physics entities, so maybe you already mentioned it. :)


If the generated PDFs are stored encrypted on an accessible server with proper access control, then that is a measurable improvement over an email containing medical information that a random citizen would send, which would bounce around unencrypted through at least one third-party SMTP server. Of course, if that person then uses an online fax service, they are sharing that information with at least one other party...

And that's even without considering the security benefit of not receiving files that could be compromised, and instead generating a file from an image stream. (Now I'm trying to picture what daisy chain of exploits would be needed to craft a malicious fax.)


The OP states they've migrated, so the field on their account database entry might be related to that move. The account is older, but when moving countries I've had to do weird dances to get my Google accounts to accept the new locale, and I wouldn't be surprised if their computed account age coincides with them having made that change.

A law can be bad and its implementation can be worse.

I found that I would have enjoyed the movie a bit more if I hadn't read the book, but it was still a solid 8/10. I'm really glad that a movie like this did well in opening weekend.

> Zig vs Rust also shows up with how object destruction is handled.

I often hear the critique that Drop is less efficient for anything arena-like, where batch destruction would be better, held up as the reason defer is a better approach. What goes unmentioned is that there's nothing stopping you from having both: in Rust you can perform batch destruction with additional logic (easiest if you control both the container and its contents' types), while the default behavior remains sane.
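A minimal sketch of the "have both" point, assuming a hypothetical typed arena that hands out indices: per-item cleanup still happens via Drop semantics, but it runs as one batch pass over contiguous storage when the arena itself is dropped.

```rust
// Hypothetical typed arena: allocations are destroyed together when the
// arena goes out of scope, rather than at each allocation's call site.
struct Arena<T> {
    items: Vec<T>,
}

impl<T> Arena<T> {
    fn new() -> Self {
        Arena { items: Vec::new() }
    }

    // Hand out indices instead of references to keep the sketch simple.
    fn alloc(&mut self, value: T) -> usize {
        self.items.push(value);
        self.items.len() - 1
    }

    fn get(&self, idx: usize) -> &T {
        &self.items[idx]
    }
}

impl<T> Drop for Arena<T> {
    fn drop(&mut self) {
        // Batch destruction: one pass over contiguous storage. (Vec would
        // already do this implicitly; it's spelled out here as the hook
        // where custom batch logic would go.)
        self.items.clear();
    }
}

fn main() {
    let mut arena = Arena::new();
    let a = arena.alloc(String::from("hello"));
    assert_eq!(arena.get(a).as_str(), "hello");
    // `arena` (and everything in it) is destroyed together here.
}
```

The default per-value Drop behavior is untouched for everything outside the arena; only values that opt into arena allocation get batched.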


That's fair, since you can leak the box. I will say, though, that it's not as ergonomic as defer: defer handles all exits from the scope, whereas juggling destructors is trickier. Though on further thought, I suppose the arena can have Drop.

EDIT: What you can't really do is this: https://github.com/smj-edison/zicl/blob/ea8b75a1284e5bd5a309...

Here I'm able to swap out std.MultiArrayList's backing to be backed by virtual memory, and correctly clean it up. I'm not sure you can really do that with Rust, barring making custom data structures for everything.


You can look at the discussions in any of the language RFCs to see that increased complexity is one of the recurring themes that gets brought up. RFCs themselves have a "how do we teach this?" section that, IMO, makes or breaks a proposal.

Keep in mind that as time goes on, the features being introduced will be more and more niche: if you could already do things in a reasonable way without a new feature, the feature wouldn't be needed. That doesn't mean that everyone needs to learn about every feature; only the people in that niche even have to know about it, as long as 1) it interacts reasonably with the rest of the language, 2) its syntax is reasonable, in that it is either obvious what's going on or easy to google and memorable enough that you don't have to look it up again, and 3) it is uncommon enough that you won't see it pop up when looking at a random library.


Thanks for the context. That makes a lot of sense! Those three constraints seem pretty important and a useful way to think about the problem.

I think there is: a schism. Another language, inspired by, intelligible to, and interoperable with Rust, but with other goals, likely ease of use or surface simplicity. In my mind it would be pretty much the same as Rust, except that anything that produces a compile error with a suggestion in rustc would instead compile (and be at most a warning in this hypothetical language). Migrating from Rust to this language would be changing a single setting in Cargo.toml; the other way around would be fixing a bunch of compile errors. You could use the entire crate ecosystem in a native way. This language could also serve as a test bed for features that might or might not be suitable for Rust, and it could have a more aggressive evolution schedule: not being perma-1.x, it could be bolder in what it attempts.
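To make "a compile error with a suggestion would instead compile" concrete: rustc today rejects reassignment of an immutable binding, but emits a machine-applicable suggestion to add `mut`. The snippet below shows the suggestion already applied; in the hypothetical dialect, the unfixed original would have compiled with at most a warning.

```rust
fn main() {
    // The original `let x = 5;` fails with "cannot assign twice to
    // immutable variable" plus `help: consider making this binding
    // mutable: `mut x``. Applying rustc's suggested fix:
    let mut x = 5;
    x += 1;
    assert_eq!(x, 6);
}
```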

There is precedent: with type checkers like pyright you can opt into specific checks, or pick a basic, standard, or strict setting, each expanding the set of checks performed.

How would dependencies work in this schism? E.g. if serde starts using named impls, do all dependencies have to use named impls?


I'd take `Rust with a GC and specialization` over current Rust any day.

I mean… Sure, if we’re just making stuff up, a compiler that can magically understand whatever you were trying to do and then do that instead of what you wrote, I guess that’s a nice fantasy?

But out here on this miserable old Earth I happen to think that Rust’s errors are pretty great. They’re usually catching things I didn’t actually intend to do, rather than preventing me from doing those things.


> But out here on this miserable old Earth I happen to think that Rust’s errors are pretty great. They’re usually catching things I didn’t actually intend to do, rather than preventing me from doing those things.

As it happens, you are replying to the person who made Rust's errors great! (it wasn't just them of course, but they did a lot of it)


I bow to them and thank them for their service!

I used to hate semicolons. Then I started working in parser recovery for rustc. I now love semicolons.

Removing redundancy from syntax should be a non-goal, an anti-goal even. The more redundancy there is, the higher the likelihood of making a mistake while writing, but the higher the ability for humans and machines to understand the developer's intent unambiguously.

Having "flagposts" in the code lets people skim code ("I'm only looking at every pub fn") and gives the parser a fighting chance of recovering ("found a parse error inside of a function def; consume everything until the first unmatched }, which would correspond to the closing brace of the fn body, mark the whole body as having failed parsing, and let the rest of the compiler run"). Semicolons allow for that kind of recovery. And the same logic that you would use for automatic semicolon insertion can be used to tell the user where they forgot a semicolon. That way you get the ergonomics of writing code in a slightly less principled way while still being able to read principled code after you're done.
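The "consume everything until the first unmatched }" recovery step can be sketched as a tiny token-skipping routine; this is an illustrative toy over chars, not rustc's actual implementation:

```rust
// Hypothetical sketch of brace-based recovery: after a parse error inside a
// function body, skip forward until the `}` that closes the body (the first
// `}` not matched by a `{` seen during the skip), then resume at the next item.
fn skip_to_unmatched_close(tokens: &[char], start: usize) -> Option<usize> {
    let mut depth = 0usize;
    for (i, &t) in tokens.iter().enumerate().skip(start) {
        match t {
            '{' => depth += 1,
            '}' => {
                if depth == 0 {
                    return Some(i); // closing brace of the enclosing body
                }
                depth -= 1;
            }
            _ => {}
        }
    }
    None // ran out of input: braces were unbalanced
}

fn main() {
    // Error recovery starts just after the fn body's `{`; nested blocks
    // inside the broken body are skipped over correctly.
    let toks: Vec<char> = "bad !! { nested } more }".chars().collect();
    let close = skip_to_unmatched_close(&toks, 0).unwrap();
    assert_eq!(toks[close], '}');
    assert_eq!(close, toks.len() - 1);
}
```

Everything between the error and that brace is marked as a failed-to-parse body, and the rest of the compiler keeps running on the items that did parse.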


Why is ";" different from \n from the perspective of the parser when handling recovery within scopes? Similarly, what's different with "consume everything until the first unmatched }" except substituting a DEDENT token generated by the lexer?
