Hacker News | past | comments | ask | show | jobs | submit | squiggleblaz's comments

Not the greatest fan of Python, but when I've got to run a Python script, I do `nix-shell -p 'python3.withPackages (ps: [ps.requests])' --command 'python3 your-script.py'`. Note that there is one argument to -p and one argument to --command; both are quoted. The argument to -p is a Nix expression that provides a python3 command, referring to a python3 with the requests package. The argument to --command is a bash script that runs python3 with the argument "your-script.py", i.e. it runs your-script.py with the python3 that has the requests package.
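Spelled out over several lines (a sketch, assuming nix is installed and your-script.py is in the current directory):

```shell
# One quoted argument to -p (a Nix expression selecting python3 plus
# the requests package) and one quoted argument to --command (the bash
# command line to run inside that ephemeral environment).
nix-shell \
  -p 'python3.withPackages (ps: [ps.requests])' \
  --command 'python3 your-script.py'
```

Add more packages by extending the list, e.g. `(ps: [ps.requests ps.numpy])`.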

I think there are ways you can autoderive a python3 with specific packages from Python dependency files, but I can't help you there. I do find AI to be reasonably helpful for answering questions like this: it just might sometimes need a nudge that you want to understand the answer rather than receive a perfectly packaged shell.nix file.


Yes, I just replied to your other comment with the same observation. It reminds me of an article by Paul Graham, I forget which one, about the difficulty of explaining to programmers who lack an abstraction just how good the abstraction is. Anything you can do with NixOS, you can do with any distribution, because it isn't magic. But somehow, more stuff becomes possible because it gives you a better way to think.

(As for why the docs are so bad, I think it's because of the lack of good canonical documentation. There are too many copies of it. Search engines ignore the canonical version because it's presented as one giant document. Parts of the system aren't documented at all and you have to work out what you've got by reading the code. The result is that you have no idea what to do if you want to improve the situation - it seems like your best option is to create new documentation. And now you have the same basic level of documentation that didn't help the first hundred times it was rewritten. And I don't really think submitting a PR to nixpkgs is exactly user-friendly, so it probably discourages people from doing the "I'm just trying to understand this, so I'll fix up the documentation as I learn something" thing.)


Yes, I think you've hit the nail on the head. I tend to view NixOS not as a distribution, but as a distribution framework. The system configuration is the source for an immutable distribution as much as it is system configuration.

You're in no way bound by decisions of the nixpkgs contributors: as you say, we can add a patch. Or we can also decide we totally disapprove of the way they've configured such-and-such a service and write our own systemd service to run it.

Anyone can write a local debian package which adds a patch, and build and install it. And anyone can write a systemd service and use it instead of the distribution's systemd service. But on NixOS, these are equal to the rest of the system rather than outside it. Nixpkgs is just a library which your configuration uses to build a system.


While nix might be free of side effects, activating a nixos configuration isn't as free as you imply. As an example, nixos keeps state around regarding user id/username mappings, to avoid giving the same user id to different users across time. So a fresh install of nixos might leave services unable to read their data files, because the file might be owned by a different user id. And if you activate and enable incus, for instance, it will probably create a bridge device: the device will remain in place after you remove incus, which will have implications for how your network/firewall works that your configuration will depend on but will not enforce or be able to reproduce.

Not an argument against using NixOS - I think the bridge device issue could reasonably be regarded as a bug rather than a fundamental design issue, and the user id/username mapping is a totally reasonable design decision which can be taken into account by forcing the user id numbers anyway.
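Forcing the uid looks roughly like this (a hedged sketch of a configuration.nix fragment; `myservice` and the number 990 are hypothetical, but the `users.users` options are the standard NixOS ones):

```nix
# Pin the uid/gid so files on disk stay readable across reinstalls,
# independent of whatever /var/lib/nixos/uid-map would have allocated.
users.users.myservice = {
  isSystemUser = true;
  uid = 990;            # fixed uid instead of an auto-allocated one
  group = "myservice";
};
users.groups.myservice.gid = 990;
```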


> As an example, nixos keeps state around regarding user id/username mappings, to avoid giving the same user id to different users across time. So a fresh install of nixos might leave services unable to read their data files, because the file might be owned by a different user id.

One reason to set `mutableUsers = false`: https://mynixos.com/nixpkgs/option/users.mutableUsers.

> And if you activate and enable incus, for instance, it will probably create a bridge device: the device will remain in place after you remove incus, which will have implications for how your network/firewall works that your configuration will depend on but will not enforce or be able to reproduce.

Impermanence: https://github.com/nix-community/impermanence.

To be clear, I use neither. But you can get NixOS to be almost completely stateless (if that's something you care about) with a few changes. The power is there, but it is disabled by default because it is not the pragmatic choice in most cases.
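For reference, the two options mentioned combine roughly like this (a sketch, not a complete configuration; it assumes the impermanence module is already imported, and "/persist" is a hypothetical mount point):

```nix
# Declarative users only: NixOS regenerates /etc/passwd from the
# configuration instead of letting useradd/usermod mutate it.
users.mutableUsers = false;

# With the impermanence module, only whitelisted state survives reboots:
environment.persistence."/persist" = {
  directories = [
    "/var/lib/nixos"   # e.g. keep the uid-map if you want stable uids
  ];
};
```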


> One reason to set `mutableUsers = false`: https://mynixos.com/nixpkgs/option/users.mutableUsers.

That doesn't help. Mutable users is about the lifecycle of the /etc/passwd file. What I'm referring to is /var/lib/nixos/uid-map.


I'm not OP but that's basically right. With NixOS, nix generates the system configuration as well as making sure the packages are available. If you pin your dependencies using something like nix flakes and rely on git as your source of truth, you can get GitOps for the operating system.

But it isn't necessary. You can certainly make a change and apply it without committing it to git or relying on a CI/CD pipeline to deploy it. And it isn't necessary to use input pinning - though if you don't, rolling back can become archaeological work at best. Most people recommend flakes nowadays, whose input pinning and purity rules should prevent any need for archaeology if you do commit before applying.
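A minimal flake-based GitOps loop might look like this (hedged: `myhost` is a hypothetical hostname, and the flow assumes the configuration repo is the current directory):

```shell
# Commit first, so the activated generation is traceable to a revision.
git add -A
git commit -m "Enable service X"

# Build and activate; flake.lock pins all inputs, so the result is
# reproducible from this commit alone.
sudo nixos-rebuild switch --flake .#myhost

# If something breaks, activate the previous generation again.
sudo nixos-rebuild switch --rollback
```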


Engineers want some kind of regulation because they feel like computer systems, which they nominally control, are out of control because of the business people's demands. They want the right to say no without having to bear the consequences of saying no. But then when regulations come in, they're not about regulating business, they're about regulating interactions between people and business. And whereas the idealist sees a regulation as a chance to change things for the better, a regulator sees a regulation as a chance to preserve things as they were just before they became bad. (It takes a politician, not a regulator, to change things.)


> Fixing this is difficult, not just because people are resistant to change, but also because the variations in accents.

The relevance of accents is greatly overstated. The argument is of the form "we should let the perfect be the enemy of the good, and therefore it's impossible". There are a great many words in English whose spelling is irregular: these are the ones we should fix. For these, accent is irrelevant; you can pronounce your r's hard or your a's broad, and it doesn't matter: "bury" is pronounced to rhyme with "merry" in probably every accent of English that's ever been, from Old English (ic byrge vs myrge) on. You could just fix 100 words like "bury" and "could" and "are" whose spellings are either wrong or etymological but don't reflect extant variants, and the spelling would be reformed, children's lives would be improved, and it wouldn't be a problem from any perspective of accent variation or etymology or anything.


> "bury" is pronounced to rhyme with "merry" in probably every accent of English that's ever been

I've definitely heard speakers for whom "bury" rhymes with "furry", and that's without the "Merry–Murray merger" (i.e., the same person would pronounce "berry" to rhyme with "merry" and quite distinctly from "bury".)

> You could just fix 100 words like "bury" and "could" and "are" whose spellings are either wrong or etymological but don't reflect extant variants, and the spelling would be reformed, children's lives would be improved, and it wouldn't be a problem from any perspective of accent variation or etymology or anything.

In many cases it would take existing homophones and turn them into additional meanings of the same spelling, which would actually reduce clarity and comprehensibility of written text.


> "bury" is pronounced to rhyme with "merry" in probably every accent of English that's ever been

Bury rhymes with hurry around Philadelphia (NJ, Maryland, some parts of NY).


A certificate authority is an organisation that pays good money to make sure that their internet connection is not being subjected to MITMs. They put vastly more resources into that than you can.

A certificate is evidence that the server you're connected to has a secret that was also possessed by the server that the certificate authority connected to. This means that whether or not you're subject to MITMs, at least you don't seem to be getting MITMed right now.

The importance of certificates was quite clear if you were around on the web in the last days before universal HTTPS became a thing. You would connect to the internet, and you would sometimes notice that the ISP you were connected to had modified the website you were accessing.


> pays good money to make sure that their internet connection is not being subjected to MITMs

Is that actually true? I mean, obviously CAs aren't validating DNS challenges over coffee shop Wi-Fi so it's probably less likely to be MITMd than your laptop, but I don't think the BRs require any special precautions to assure that the CA's ISP isn't being MITMd, do they?


A local minimum is a point in the design space from which any change is an improvement (but there's other designs which would be worse, if they make several larger changes). I think it's hard to make that claim about Git. You're probably referring to a local maximum, a point in the design space from which any change makes it better (but there's other designs which would be better, if they make several larger changes).

In my career, I've used SVN, Git and something I think it was called VSS. Git has definitely caused fewer problems, and it's also been easy to teach to newbies. And I think the best feature of Git is that people really really benefit from being taught the Git models and data structures (even bootcamp juniors on their first job), because suddenly they go from a magic-incantation perspective to a problem-solving perspective. I've never experienced any other software which has such a powerful mental model.

That of course doesn't mean that Mercurial is not better; I've never used it. It might be that Mercurial would have all the advantages of git and then some. But if that were so, I think it would be hard to say that Git is at a local maximum.


> something I think it was called VSS

Hmm, maybe Microsoft Visual Source Safe? I remember that. It was notorious for multiple reasons:

* Defaulted to requiring users to exclusively 'check out' files before modifying them. Meaning that if one person had checked out a file, no one else could edit that file until it was checked in again.

* Had a nasty habit of occasionally corrupting the database.

* Was rumored to be rarely, if ever, used within Microsoft itself.

* Was so slow as to be nearly unusable if you weren't on the same LAN as the server. Not that a lot of people were working remotely back then (i.e. using a dial-up connection), but for those who were it was really quite bad.


> it's also been easy to teach to newbies

The number of guides proclaiming the ease of Git is evidence that Git is not easy. Things that are actually easy don't involve countless arguments about how easy they are.

I can teach an artist or designer who has never heard of version control how to use Perforce in 10 minutes. They’ll run into corner cases, but they’ll probably never lose work or get “into a bad state”.


> A local minimum is [...]

Unless you're in ML, in which case it's a minimum of the loss function, not the utility function...


> You're probably referring to a local maximum, a point in the design space from which any change makes it better (but there's other designs which would be better, if they make several larger changes).

I think you meant "worse" for that first "better."


Git being easy to teach to newbies is an uncommon opinion. It wasn't clear whether you meant easier than Subversion, but that would be an even more uncommon opinion.


> I've never experienced any other software which has such a powerful mental model.

I hate to be that guy, but you should spend some time with jj. I thought the same, but jj takes this model, refines it, and gives you more power with fewer primitives. If you feel this way about git, but give it an honest try, I feel like you'd appreciate it.

Or maybe not. Different people are different :)


Reinforcement learning, maximise rewards? Carrots work as rewards because rabbits like carrots. What does an LLM want? Haven't we already committed the fundamental error when we say we're using reinforcement learning and that the model wants rewards?

