Nushell.sh ls | where size > 10mb | sort-by modified (nushell.sh)
295 points by nailer on March 12, 2023 | 203 comments


My problem with alternative shells (anything not bash) is that I have to learn bash anyway, and if I want a scripting language that requires me to first install a runtime, drop into it, doesn't work on colleagues' machines, etc., then there are many non-shell alternatives that are better. I'll learn nushell or whatever once a mainstream or somehow desirable distro ships with it as their default shell.


This is the same reason I eventually learned and stuck with Vi(m). Implementing a project on a massive array of SPARC Solaris boxes and every "easy" tool I'd learned just... wasn't there. I had kind of learned Vi at that point, but wasn't remotely proficient in it, and had only gotten as far as I had due to a bunch of cool plugins I'd installed that definitely weren't available.


That last part is often forgotten. I usually stick with defaults (or as close as I reasonably can) not because they are functionally better but because, for most things, learning and regularly using two instances of ${tool} doesn't have enough return to be worth the extra time. Sometimes it's a pain but it seems to pay off.


And when I'm doing something on my local machine - very often, a more modern tool is required anyway. (Visual Studio in the case of C#/C++, and VS Code is infinitely quicker and simpler to get up and going with when editing other source code).


I used to do this, but then found emacs/tramp and that has worked better ever since.


Assuming everything you connect to has a remote file interface and you only use your computer affords many conveniences. Knowing the default tool first is for when those don't hold, not for when you need to do something from your machine.


That is a good point. But Nu offers something else:

> First and foremost, Nu is cross-platform. Commands and techniques should work across platforms and Nu has first-class support for Windows, macOS, and Linux.

Source: GitHub. No snark intended.


Mercifully, Nano is fairly commonly installed. Micro (which is an actually "normal" terminal editor) is becoming more popular too and is also trivial to install (since it is written in Go).


> I have to learn bash anyway

Do you? You need bash installed, certainly, but you don't need to learn bash in order to merely execute bash scripts. To use an analogy, just because your distro scripts things in Python doesn't mean one needs to learn Python to use the distro. I basically never need to write or read bash scripts these days (I use fish, but I rarely write a real fish script either; just learn the language of your shell and YOLO).


As a fellow fish user, I agree. Occasionally I write fish scripts/functions, but they're for my environment so I don't expect them to run on other people's fish shells, let alone bash. Then when I am in another shell, I can usually get by on the minimum (glob, pipes, redirection), which is mostly fairly universal. And when I'm not SSHd into some other machine (which is 99% of the time), then I get all the superpowers of fish to use.

It's a bit like the equivalent vim argument. Yes, using unmodified vim means you have access to the same tools in every environment. But I spend 99% of my time in one environment, so I'd like to optimise for that one, whether that's by using an IDE or installing a bunch of vim plugins and customising everything. Then when I do need to SSH into a server somewhere, I can get by on the bare minimum - in my case, usually nano. If I'm using nano for long enough that I start missing my IDEA features, then I've been using nano for too long and need to figure out a different way of deploying/debugging changes in the first place.


I have the same problem. The only viable neo-shell in my eyes would have to be 100% Bash compatible anyway, so what's the point?


So, '>' isn't for redirection anymore. How does one do that?

    open foo.txt | append "world" | save --raw foo.txt
Oh.


This reminds me of how Fish handles process substitution without requiring a special syntax.

I love it!

It may seem a bit odd or even cumbersome to experienced shell programmers, but in my experience with Fish, this kind of thing helps a lot with ease of learning.
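For anyone who hasn't seen it, the comparison looks roughly like this (the file names are just placeholders):

    # bash: process substitution via dedicated syntax
    diff <(sort a.txt) <(sort b.txt)

    # fish: psub is an ordinary command that plays the same role
    diff (sort a.txt | psub) (sort b.txt | psub)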


It is much much better to have things be actual commands (which have a uniform syntax, and are `--help`able) rather than obscure sigils like $(#!@$).

Fish is amazing!


Many of these next-gen shells like Nushell are clearly (and sometimes explicitly) inspired equally by Fish and by PowerShell, and it shows. It's a brilliant combination!


I think the problem with this approach is that it breaks down at some point, and then your only option is to rewrite the entire thing in a real scripting language.

I.e., there is no graceful scaling path.


Isn’t this the problem with older shells as well? E.g. you have to use awk or sed to deal with more complicated text processing. My understanding is that the biggest difference between powershell (and nushell) and the older bash-like shells is that they deal with structured, object-like data rather than text. How would things be any different when “things break down” vs an older shell?
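To make the contrast concrete (the 50% threshold is arbitrary, and the awk field numbers assume `ps aux` output):

    # classic shell: scrape columns out of text
    ps aux | awk '$3 > 50 { print $2, $11 }'

    # nushell/powershell style: filter on named fields of structured rows
    ps | where cpu > 50 | select pid name cpu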


One of the nice things about bash is that it interacts very nicely with Perl and Ruby.

I wonder how these other shells work with the backticks and such in Perl/Ruby. Theoretically there is room for very cool interoperability but I wonder how it goes in practice and if any nice tools and patterns are being developed.
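In the simplest case I'd expect it to look something like this from the Ruby side (an untested sketch, assuming `nu -c` and `to json` behave as documented):

    require 'json'

    # run a Nushell pipeline via backticks and pull structured data back out as JSON
    big_files = JSON.parse(`nu -c 'ls | where size > 10mb | to json'`)
    big_files.each { |f| puts f["name"] }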


Since this is nushell, and perl and ruby are now starting to be considered 'old', is that so much of a problem?


It's a path to scaling. You can basically dump a shell script (I assume including nushell) into your .pl/.rb file with a little punctuation and then slowly refactor it to a language that can handle greater complexity.


As opposed to what? Regular shells which break down even quicker?

And there's no hard constraint that would require this to "break down at some point". A well designed shell could also be a "real scripting language", all-in-one.

But the horrible CP/M and POSIX legacy of shell design doesn't help with that.


I suppose I'm more optimistic. From my point of view, structured pipelines and some of the goodies that come with them (like better error handling) can make a shell language into a perfectly capable scripting language. I think the big weakness for these new shells is in the small sizes of their library ecosystems. But PowerShell shows what a language of this kind can do with a sizeable ecosystem and some OS integration.


Should be doing that anyway, bash is terrible for actual programs.


Fish is amazing. And. It was missing critical features like subprocesses for so long (10 years?) that I had to keep passing it by. No subprocesses!


I'm not sure what you mean here. The term 'subprocess' doesn't appear in the fish or bash manuals afaict. (Most web search results for the word 'subprocess' paired with the names of various shells are about Python APIs for shelling out to stuff.) The term does appear in an O'Reilly book about the Korn shell, which identifies command substitution (`$()`) as a form of subprocess. But I think Fish has had command substitution from the start. The same goes for job control, which is the only other thing I could think of that you might mean by the term 'subprocess'.

There are some old job control bugs, job control behaviors that used to not be POSIX compliant, and one outstanding bug having to do with IO redirection in fish functions that seem like they could be sort of relevant here. But there's nothing I can think of that I could interpret as 'lacking subprocesses'.

I only spent a few minutes looking into this, though. I'm curious to know if you can still recall more specifics, so many years later! Maybe there's something from before I started using fish that I've missed here. But fish is only 18 years old, going by the date of its first release, and I've been using it as a daily driver for ~13 years, and it has had everything I might call 'subprocesses' for that entire duration.


Historically we would discourage shell users from

    cat somefile.txt | whatever
and instead tell them to use either

    whatever < somefile.txt
or, if the command accepts input file names as arguments then use those like

    whatever somefile.txt
or

    whatever -i somefile.txt
etc

But perhaps in fish, “useless use of cat” is not so useless and would be recommended?

(I mean aside from the fact that they seem to recommend their “open” command, and so they probably prefer their own “open” over “cat”.)


For some reason I never liked to do it the "right way". First of all it doesn't seem to actually matter.

But most of all the cat way just aligns with my mental model more. Data flows left to right, if you catch my drift.

It also makes it easier to add arguments to the end if re-running it.


Besides it's also a fail-safe way to make sure the program doesn't modify the original file.

    random-app ./myfile.txt 
The app opens the file on its own. It could decide to write to it.

    cat ./myfile.txt | random-app
The app receives chunks of the file from a pipe. It doesn't know where the original file is.

This is especially useful if you aren't sure whether the program will modify the file by default (as formatters often do).


Yeah but you get the same from

    random-app < ./myfile.txt


Only if random-app is well-behaved (and if you knew it was you could just use `random-app ./myfile.txt`).

  $ echo foo > a.txt
  $ <a.txt ( echo bar >/proc/self/fd/0 )
  $ cat a.txt
  bar


That's bizarre and perverse! Surely someone has brought it up as a bug before that stdin is read-write! Is there some Posix standard preventing the standard shells from opening stdin as read-only???


stdin is readonly. `<a.txt ls /proc/self/fd/0` generally gives 'lr-x------' for the permissions. The problem is that having a open file descriptor to a file lets a program get and/or act-as-if-it-had a path to the file; /proc/self/fd/ is just the easiest way to do that.


So the thing is that this is a nothingburger. Giving a pathname to an untrusted app means nothing. You've already trusted the app with your user account. It could already overwrite or vandalize that file no matter how you invoked it. Just because you indicate that file is special to you doesn't change anything in the threat model here. For all you know, the app could just traverse the entire directory tree and trash every file it could possibly write to, or just confine the damage to your $HOME.

There's no reason IMHO to avoid using a file as an argument, or directly as stdin. If you don't trust an app, don't run it in your user account; you run it in a sandbox, right? This is 2023.

Now a case could be made for defending against misbehavior by an app that might write to an fd by mistake, but as a1369209993 demonstrates, writing to stdin is a very deliberate choice, as you'll need to look up a pathname and deliberately open that file as writable. That's not misbehavior, that's malice, and that doesn't belong anywhere near your user account in the first place.


Ah, I believe I see what's happening there.

So, hiding the pathname from an untrusted tool via an unnamed pipe is a Posix security measure. Still bizarre!

Who was the guy who awarded UUOC prizes? Wasn't he a real dyed-in-the-wool security wonk?


>But most of all the cat way just aligns with my mental model more. Data flows left to right, if you catch my drift.

Using `<` doesn't change that model. You can write `< somefile.txt whatever`. I always write my commandlines as `<in_file cmd1 | cmd2 | cmd3 >out_file`

(Though annoyingly this doesn't work for feeding input to while loops. `< <(echo a; echo b; echo c;) while read -r foo; do echo "$foo"; done` is invalid syntax; it needs to be `while read -r foo; do echo "$foo"; done < <(echo a; echo b; echo c;)`)


> Using `<` doesn't change that model. You can write `< somefile.txt whatever`.

If you showed that without context, it would look like some prefix/lispy notation. So no, you are just showing how bad it is.


"This thing I don't understand reminds me of another thing I don't understand, so it's bad I decided."


[flagged]


I thought it was an informative response. I certainly learned some stuff about shell I didn't know before. I'm still gonna use cat because it's simpler to me.

I think your snark is unfounded here.


doesn't it work if you put it after the while but before the read?


No. That will execute `< <(...) read` anew for every loop iteration, which in this case will cause it to print an endless stream of `a`s.
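i.e. (bash):

    # the condition (including its redirection) is re-evaluated on every iteration,
    # so each pass creates a fresh process substitution and read always sees "a"
    while < <(echo a; echo b; echo c) read -r foo; do echo "$foo"; done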


I share the same opinion. I was made fun of with the "useless use of cat" award, but I find it so convenient to cat | grep, then cat | grep | awk | wc, then whatever, with data flowing left to right, modifying the command sequence as I explore the file content.


While I get your point...

> It also makes it easier to add arguments to the end if re-running it.

I'd like to point out that a redirection doesn't have to be the last thing in a command. E.g. the common 'echo >&2 "some message"' works fine.


Quite. I had occasion to share this[0] in another thread, and it's relevant here, too:

> When I offer a pipeline as a solution I expect it to be reusable. It is quite likely that a pipeline would be added at the end of or spliced into another pipeline. In that case having a file argument to grep screws up reusability, and quite possibly do so silently without an error message if the file argument exists. I. e. `grep foo xyz | grep bar xyz | wc` will give you how many lines in xyz contain bar while you are expecting the number of lines that contain both foo and bar. Having to change arguments to a command in a pipeline before using it is prone to errors. Add to it the possibility of silent failures and it becomes a particularly insidious practice.

[0] https://stackoverflow.com/a/16619430/1040915


> Historically we would discourage shell users from...

For no good reason at all.

It's not less confusing, and when shell programmers start talking about how wasteful it is to fork another process, it's because all the good explanations were already refuted.

Also, it's worth pointing out that this style was never unanimous. There is a very vocal minority that pushes for it, and a wide non-vocal majority that says basically "whatever, it's not like there's any difference" if you go and ask the question.


The use of cat is not useless. Having a "read this file" command be the source of the pipeline makes a lot of sense. It makes it seamless to replace the cat with a pv if you want progress, zcat if you're reading a gzipped file, curl if you want to use a URL, etc.
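E.g. (log names made up), the rest of the pipeline never has to change:

    cat access.log     | grep ' 500 ' | wc -l
    pv access.log      | grep ' 500 ' | wc -l     # same pipeline, now with a progress bar
    zcat access.log.gz | grep ' 500 ' | wc -l     # same pipeline, now reading the rotated log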


Fair point.

But how often do you make those kinds of changes in any of your scripts and not have to change anything? For me, exactly 0 times


Quite frequently? "Oh this operation turned out to take a while, I'll use pv instead of cat", or "oh what I grepped for wasn't in the current log, it's probably in the previous log that's gzipped, I'll replace 'cat current.log' with 'zcat previous.log'".


Sounds like you're describing an interactive shell session, not stuff that happens in a script that was written for a specific purpose.


The 1st scenario and things similar to the 2nd happen in script development.


You'd be surprised


I'm also a big fan of doing it the wrong way. (While I understand there is no practical difference) I find it more natural, because it's symmetrical (or parallel):

  $ < file util1 | util2 | util3 | ...
The first util is the originator of the data pipeline, and it plus the file are the first command

  $ cat file | util1 | ... 
Each utility is its own thing, being fed a data stream that came from cat. I know it's just personal preference but it feels neater.


Or:

  open foo.txt out> bar.txt



`save` seems to be a general serialization command, which lets you save Nu objects in formats like JSON, YAML, and others. The `--raw` option seems like a convenience for enabling that tee-like usage without relying on external programs, but the driving use case is serializing structured data rather than unstructured text or byte streams.
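If I'm reading the docs right, the split looks roughly like this (file names made up):

    ls | where size > 10mb | save big-files.json        # structured rows, serialised as JSON by extension
    open foo.txt | append "world" | save --raw foo.txt  # raw text, no serialisation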


Ah, I failed to realize that nushell deals with rich data.

I think this might hit the sweetspot for me, for dealing with json - I never could get a feel for "jq" like I have with sed/awk, and don't much enjoy powershell.

I don't think I need a new shell (bash is fine - and I would probably move to fish for a new shell if I were to change) - but I certainly need a sane way to wrangle json (hello docker inspect!) and yaml (hello kubectl/k8s and github actions!).
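Something like this is the kind of thing I have in mind (the container name and file are made up, but the field paths are the usual shapes):

    docker inspect my-container | from json | get 0.NetworkSettings.IPAddress
    open deployment.yaml | get spec.template.spec.containers.0.image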


FWIW, I've made basically the same call for now: Fish has been my daily driver for some time, but I have a lot of interest in Nushell and some real affection for Elvish, and I think I'd like to eventually make the switch to one of these newfangled shells with the structured pipelines one day.

The great thing about shell languages is that you get to practice the scripting language with your everyday interactive usage, which is one of the main reasons I'd eventually like to switch, even if I see that perhaps the most exciting use cases are wrangling JSON and YAML in scripts.


I tried to use it as my main shell for a couple days, found it fun and potentially useful, played with its unique features... and eventually uninstalled it and went back to zsh.

For 99.99% of what I use a shell for, Nushell doesn't do anything that I can't do with a regular shell. Maybe just because I'm used to shells/sed/awk/etc.

For the few cases where Nushell could be useful, I can usually do the same thing with Vim/Emacs/VSCode or a trivial Python/Ruby script.


What if someone starts from a clean slate. Which shell is better?

That is the question. Old habits hold us back from progress in usability. Eventually if one system is better it should completely replace the old system. But only if it's better. What one power user deems as "useful" according to his own usage patterns shouldn't be a metric for measurement.

I haven't played with nushell yet, but it actually seems more powerful than Python for shell tasks. All the data structures are catered towards easy visualization and parsing parameters related to a shell. Python does not have such convenience.


> What if someone starts from a clean slate. Which shell is better?

Then they learn the better one and end up having to learn bash anyway coz that's what runs on remote machines.

There is also not that much to be gained. Common easy tasks might be easier to learn but end up just a few percent faster. Harder ones are usually faster as a one-off script in a "real" programming language. Harder repeatable ones should never be written in shell or a shell-like language, because it is just a terrible way to write any real program.

Also if you really spend that much time gnobling in shell you'd be FAR better off investing time into learning configuration management tools.


> coz that's what runs on remote machines

Is that actually still an issue these days with Docker containers and what not? How often do you access a remote server via SSH where you share an account with your colleagues and cannot install another shell next to bash? (So that bash could still be the default but you could switch on the fly.)


I think the problem is more about friction. While it might be possible to install another shell on every remote server you work on, are the features really worth the trouble?

Also, I've sometimes had to work on remote boxes on an internal-only network, with no outside internet access. In that case installing anything new is even more of a pain, so it's easier to work with the defaults as much as possible.

That said, I'm all for learning new tools. Progress couldn't be made if everybody just stuck with the old. But there's always a trade-off to be made between time investment into new tools vs. just getting your work done with the old ones.


If you connect a sidecar to debug, it will most likely have bash.

And why would you need bash-fu just for local work? The most I use it is for old-fashioned servers (we have anything from "classics" to a few k8s clusters).


>How often do you access a remote server via SSH where you share an account with your colleagues and cannot install another shell next to bash?

About... almost every day?


> Then they learn the better one and end up having to learn bash anyway coz that's what runs on remote machines.

unless you are the person to push for it to be installed on the remote machine.


From a clean slate almost everything is better than Bash. I don't think many people seriously say that Bash is good. We use it because of the insane network effects that come from it being preinstalled almost everywhere (except Windows).

The question is not "is nushell better than Bash" - it obviously is. The question is "is nushell better enough that you are willing to install it on every system you use, deal with naysaying colleagues and idiots that have been brainwashed into thinking Bash is great, learn a new tool, deal with all the incompatibilities with software that assumes everyone uses Bash, etc. etc."

As much as I hate Bash, I would say it isn't better enough to justify all that. At least not currently. If you're willing to install a new tool to run your scripts then there are better options, e.g. Deno. If it catches on and gets installed by default in more places in future (hopefully!) then that might change.

Basically I would say for interactive use Bash is not great but fine. For scripting use you shouldn't use Bash anyway.


>If you're willing to install a new tool to run your scripts then there are better options, e.g. Deno

A shell scripting language should also have the sort of convenience that lets the user easily navigate and manipulate the OS via REPL commands. JavaScript, while definitely a more powerful scripting language, lacks the power of a REPL OS command prompt.

Bash at the very least has the property of being good as a UI, even though it is clearly a horrible scripting language.

So nushell in this sense unites the best aspects of both worlds. Powerful UI, powerful scripting.


Fish is that shell for me. I tried nushell and zsh. zsh required a bunch of setup and nushell was too young. Whereas fish has really great defaults and enough support to just go and have fun.


Have you tried oh-my-zsh? It's what got me to install zsh on all my servers, even at work. Just run their setup script, let it change your default shell and set your preferred theme in the config.


I have tried oh-my-zsh, but still fish was easier to get started with. Especially with autocompletions.

However oh-my-zsh was pleasant to use as well!


Unfortunately I felt the same way. I'm probably their prototypical target audience (neovim user, I stay in the terminal 99% of the time) and I couldn't justify switching over to nushell full time. I wish they hadn't changed the syntax so much - there's still tons of scripts and commands that work on zsh or bash and are used in normal workflows.


> Maybe just because I'm used to shells/sed/awk/etc.

I suspect we’re both great at regexes. But scraping is brittle, and trying to use it with object shells defeats the purpose.


I've never seen nushell as a daily driver -- more as a data exploration tool. Have a random export you want to go splunking in? `nu` from your current zsh session and go wild. When you're done, ^D back to your main zsh session, job done. Whereas `jq` is only useful for json, and `xsv` is only useful for CSVs, `nu` offers uniform syntax for exploring many different formats and producing structured data out at the end as well. Neat!
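E.g. (made-up files and columns) the verbs stay the same and only `open` cares about the format:

    open data.csv  | where price > 100 | sort-by price
    open data.json | where price > 100 | sort-by price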


This was precisely my experience as well. It was magnified by the burden of having to rewrite various zsh scripts and parts of my config that I've cultivated over a decade. I thought Nushell was _neat_, but not neat or useful enough to spend time migrating my stuff.


I have nothing but positive things to say about nushell, but I do wish the describe command took a positional argument rather than just being piped into. I was trying to write a pipeline that could give me all the versions from a Cargo.toml and "flatten" any records, but got stuck trying to use describe. Ex:

    open Cargo.toml | get dependencies | transpose | rename dep version | each { |row| let version = $row.version; if (describe $version) =~ "record" { $version.version } else { $version } }


Have you tried opening up a feature suggestion? The maintainers are very receptive/helpful from my experiences.


I would use

    open Cargo.toml | get dependencies | transpose | rename dep version | each { |row| if (($row.version | get -i version | default "" ) != "") { $row.version.version } else { $row.version } }

not sure whether that's the best way.


This has been on HN previously, but it's a couple of years later and I'm pleased to see it's still a very active project (https://www.nushell.sh/blog/) with good documentation: https://www.nushell.sh/cookbook/



I read that several times thinking it was Nutshell. And now I wish it were.


What's the advantage of this over PowerShell?


Recycling my comment from last time this came up. Some reasons I prefer Nushell over PowerShell:

- less verbose syntax

- better cross-platform support (PowerShell is technically cross-platform, but it has some baggage from its Windows-first history)

- way faster to start up

I'm biased (I'm a member of the Nushell core team), but those are all things that drew me to start contributing to Nushell.

On the other hand, Nushell is certainly less mature+stable than PowerShell. And if Windows administration is your main use case, PowerShell can't really be beat.


- Better errors

- Native support for JSON, etc (no more ConvertFrom-* ConvertTo-*)

- Less verbose

- Different visualisation


PowerShell isn’t that verbose, compared to NuShell. The ls example would be (with standard PowerShell aliases):

    gci | where { $_.Length -gt (10 * 1024 * 1024) } | sort LastWriteTime
-gt might seem weird, but it’s better than overloading >.

(Also, I like the fact that PowerShell is built into Windows and has a large corporation behind it.)


The fact PowerShell has this masochistic tendency to have commands that start with a capital letter already puts it outside the realm of consideration for me. It's absolute nonsense. Also I have to say your example is extremely confusing and would probably take quite a bit of time to write IRL. For reference, the equivalent in sh would be:

   find . -maxdepth 1 -type f -size +10M -exec ls -lt {} +
Or a very terse example:

    du -s -h * | sort -r -h | sed 10q
The thing is that if you're already used to POSIX-compliant shells then this is simply natural to write. Nushell has a different proposition of making it closer to natural language, which makes the learning curve much better. PowerShell just makes it different, but not really that much better.

Also this is somewhat tangential, but I hate that PowerShell 5 and 7 are completely different things, and that PowerShell 5 is still the default even in Windows 11.

Oh yeah, and just for lols this is the same thing in zsh:

   ls -lt *(.L+10240k)
By the way, I should note that it's possible to make the PowerShell example much simpler like the following:

   gci | ? Length -gt 10MB | sort LastWriteTime
The thing that really gets me here is the need to know aliases to keep it short, if you were to do it without the aliases it would look like this:

   Get-ChildItem | Where-Object { $_.Length -gt 10MB } | Sort-Object LastWriteTime
Also, again, case-sensitive commands make me mad.


Commands aren't case sensitive. Just try it! Tab completion even works if you start the command lowercase.

As for needing to know aliases... Did the knowledge that 'du' stands for 'disk usage' exist in your brain at birth, or did you have to learn it somehow? That criticism is nonsense.

At least with powershell, commands have a canonical, explicit name with standard verbs and nouns; and a short alias that's equivalent to the initial command in every way. What I also like is that parameters have aliases too: you can write -WhatIf or -wi and it'll work the same. And everything is documented.


PowerShell command names, their parameters, string comparison, and regex matching are all not case sensitive, unless you want them to be.

It really gets you that you need to know aliases to keep it short? So why aren't you writing:

    du --summarize --human-readable | sort --human-numeric-sort --reverse | sed --expression='10q'

`-exec ls -lt {} +` is not good because -exec is not handled by the shell so it behaves differently to running other commands, because `+` is a nonstandard command terminator, because {} is a nonstandard placeholder and because using `find` to run commands is an onion nobody would guess.

In your terse example, `sort -h` depends on you having run `du -h` to trigger the matching "serialise information to text" and "parse information out of text" that they both need to make that one use work, because there's no separation of content and presentation (like PowerShell has) and no structured data to pass between different commands.
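To make that concrete, a rough sketch (the property names are real; listing the ten largest files is just for illustration):

    # Length stays a number end to end; it only becomes "10M"-style text at display time
    Get-ChildItem -File | Sort-Object Length -Descending | Select-Object Name, Length -First 10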

Not only will PowerShell tab-complete the names and the parameters, it will tab complete the properties Length and LastWriteTime because it can tell what objects come out of Get-ChildItem and what their available properties are.

> "Also, again, case-sensitive commands make me mad."

You do understand that du, find, sort, sed are case sensitive commands with case sensitive parameters, right?


PowerShell isn’t case sensitive. Personally I prefer `where` over `?` though.


I think I'm an outlier/odd-one for liking PowerShell, but terseness hasn't been a goal for me when I write in PS. I typically write and maintain long scripts and spelling everything out helps me keep track of the logic.

It's not the best language for anything, but working in a Windows shop, I'd rather bang stuff out in PS than grind my way through Visual Studio/C# for simple tasks.


tbh, I have used way more bash than ps in my life so far, and the following command is the only one I understand without googling anything:

    Get-ChildItem | Where-Object { $_.Length -gt 10MB } | Sort-Object LastWriteTime


Get-Old -Fast


I thought you just ranted but then saw the space between it and you actually wrote something that contributed to the conversation. Well done.


ls is an alias for Get-ChildItem in Powershell as well, btw


Only on Windows. On Linux it's the original `ls`. Same for `cat` and many other commands. That's why using `gci` / `gc` is safer if you want your script to work everywhere.

https://learn.microsoft.com/en-us/powershell/scripting/whats...


Aliases can be redefined in certain situations (such as compiling a LCM configuration into a MOF for use with PowerShell DSC). The only safe way to script is to use the full command names (Verb-Noun).


Command names can be shadowed, you can `function get-content { "hello world" }` and that also breaks gc alias. The only safe way to script is to use the full command names including module names `Microsoft.PowerShell.Management\Get-Content`


For scripts that you intend to be used by other people, yes. My comment was about personal scripts.


Frankly, you lost my interest at LCM, MOF & DSC.


Sorry but that looks terrible.


I don't mind the looks but I find it hard to debug and have a poor instinct for "does this need curly brackets or parens?" That's my real issue with PowerShell, I find it very hard to predict what will work unless I know exactly what the rule is.

Shell has similar issues but I've put 20 years into it so the knowledge is there. Every time the PowerShell people describe PowerShell I think "that sounds awesome" but the ergonomics don't work for me.


Parentheses are for grouping. Curly brackets are for creating a new script block. If you would have wanted to write "function foo() {}" in another language, go for curly brackets. It's really not that complex.


Blocks always use braces. Expressions use parentheses for grouping sub-expressions. It's the same as many C-like languages. I can't imagine any situation where you would be confused which one you need to use.


> I can't imagine any situation where you would be confused which one you need to use.

Why is that test returning a boolean on the GP a block?


Because it's an arbitrary piece of code that needs to be re-evaluated for every input item, like an inline function / lambda in other languages, not an expression that is evaluated once when the commandline is instantiated.


Blocks always use braces, but sometimes you are passing parameters and other times you are passing a block.

If I spent all day as a PowerShell dev I'm sure I would know it, but for something I reach for once or twice a year intuitiveness is a feature I would like to have.


What about it looks terrible? The braces are necessary, because this is a lambda.


I'd say that $_ is really weird and looks horrible. Also having to do (10 * 1024 * 1024) instead of just writing 10MB is a downgrade even compared to POSIX-compliant sh. In my other comment I've added a few examples in sh, zsh, and a better PowerShell example:

   gci | ? Length -gt 10MB | sort LastWriteTime


You don’t need the lambda. Where-Object can just take the Length property.


NuShell is properly MIT-licensed, and is therefore Free Software.

Powershell is kind of MIT-licensed, except for the parts that aren't but are still an integral part of it, so not really, and is therefore not Free Software. At least, it's not DFSG-free.

https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=834756

https://github.com/PowerShell/PowerShell/issues/5869


Just for reference, the PowerShell version of this would be:

    ls | where Length -gt 1mb | sort LastWriteTime


Heads-up: if the current directory contains other directories, this will fail. You'll need to adjust the filter:

    ls | where { -not $_.PSIsContainer -and $_.Length -gt 1mb } | sort LastWriteTime


This is written in Rust

/s


No, that’s not mentioned as an advantage on the website.


I assume the parent comment was not making fun of nushell advertising this, but rather of the rust community advertising itself all the time.


To be honest, they have good reason. Being written in a strongly typed, memory-safe language is a huge advantage. Obviously it's not the primary thing to look out for, but I do prefer tools that are written in one.


There are plenty of strongly typed memory safe languages. In fact I think those are the majority of popular languages.


Yet, almost all the basic tools in a Unix system are written in C.


Of course the person you're replying to is talking about languages without a garbage collector. Systems languages with zero cost abstractions.


But why? There is nothing about a shell that requires zero cost abstractions or no gc. It farms all of its work out to other executables. A shell could be written in literally any language.


I'm not really sure which other popular languages would be considered memory safe AND strongly typed. I know of both C and C++ which I wouldn't consider memory safe. And I know of Javascript, which is not strongly typed... so which do you mean?


Java, Scala, Kotlin, C#, D, Go, and TypeScript to name a few.


Of those only D and Go build self-contained binaries (yes I know about Graal/Kotlin Native/.NET AOT). No snark intended.


This is moving the goal post a bit, since the person I replied to was considering JavaScript, but I don’t really think this distinction matters. You get most programs from a package manager.

I am curious though why you don’t count Graal or .NET AOT? They are valid options to produce an AOT binary, and C# has been able to produce a self-contained non-AOT runtime for a long time.


Even Python fits the bill.


> Being written in a strongly typed, memory-safe language is a huge advantage.

Great point, C# is a strongly typed and memory-safe language if you don't use unsafe. So this really is a huge advantage for PowerShell.


> Nu has great error messages

The example is actually terrible though. It implies that changing the 10 to a string will make division a valid operation.

It’s also incredibly verbose.



    [zsh]$ ls -l *(Lm+10om)
zsh supports extended globs, this says:

    -l give me a long listing
    “*(…)” of any file
    Lm+10 whose size “L” in megabytes “m” is greater than 10
    “om” ordered by modification ascending


Cool but that’s as opaque and hard to understand as sacrificing a goat to order ls output.

It’s of course possible to do with existing shells, but it’s impossible to argue that it’s clear.


Depends what you’re optimising for. Are you optimising for readability - e.g. maybe you share shell history with team mates.

Or are you optimising for getting your thoughts transcribed to the machine as quickly as possible. It’s usually this case for me.

I tend to maintain a huge shell history, so I really like the terse format for quickly filtering down my historical commands.

Writing shell scripts might be a good case for the longer form style, although Bourne-esque syntax is so burned into my fingertips after all these years that the long form is not going to be useful to me.


I also maintain a huge shell history, but in my experience fzf-ing through it becomes a lot harder when it’s terse and unreadable.

But sure, there’s a tension between readability and brain-to-command throughput that’s always been there.

The way you proposed is a bit too write-only for my taste, and the argument that you “had to learn it and now you’re muscle memory bound to it” is applicable to you only and does not preclude a better way from existing.


I use a trick to improve this. I too have a 7-year shell history with fzf and love using it as my second brain.

I put #tags at the end of long, opaque, reusable commands so that I can FZF search it quicker.
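Something like this (the command and tags are made up):

    tar -czf backup.tar.gz --exclude='.git' ./project   #backup #tar
    # later: hit the fzf history binding, type "#backup", and it's straight back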


Wouldn't you want to use 'ls -lt' (sort by time) (or possibly with '-r' if you really want oldest first)? (I.e., does 'om' sort the glob passed to 'ls', which it will then reorder, or does zsh do something magic with the 'ls' command?)


Yeah, if you were typing this case yourself you would use -lt, but I figured it was worth showing that extended globs can be combined.

This globbing functionality in zsh is separate from the ls command, so no ls magic here, but the ls builtin in zsh does have some useful extras in it all the same


I had no idea Zsh could do that. I'm currently reading this article which explains a lot of this wizardry.

https://thevaluable.dev/zsh-expansion-guide-example/


This is fun, but reading the commands[1] reminds me of learning a new language. There are just too many of them, and they are all pipelined in a functional manner, so I need to understand the I/O of each. In that case, why would one bother remembering all of those, when Xonsh or IPython does a similar job on their data, only with a few more lines of (more familiar) code?

[1]: https://www.nushell.sh/commands/


I don't think you need to remember all of Nushell's commands (I certainly don't). You can go far with a handful of important ones; off the top of my head, get/select/where/sort-by/open are the most important.
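For instance, most day-to-day pipelines look something like this (the file and column names here are made up):

    open people.csv | where age > 30 | select name email | sort-by name
    open Cargo.toml | get package.version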

This is something we should communicate better in the docs, I'll see what I can do.


I have written a bunch of nu and: no dot method syntax means you need to browse documentation for literally everything. So much worse than your average programming language, which takes the data type + you enter a dot = it suggests all the relevant methods. The Nu docs are better than they were a few years ago but as far as I can tell they keep changing command names, having subtle changes of behaviour for commands with the same name but apparently different implementations for different data types, having a crucial command listed under a weird name that takes an hour to find, etc etc. If I need to wrangle some JSON again I think I’ll just use Python.


> subtle changes of behaviour for commands with the same name but apparently different implementations for different data types

This was one of the biggest problems with Nu's dataframe functionality; it should be resolved now that we no longer ship with dataframes by default.

Totally agreed on dot method being nice for discoverability, that's something we need to do better on.


That sounds really promising though, you convinced me. Will have a try.


I was given a link to this when I was describing my ideas for an operating system design, since some of the ideas are similar. However, many ideas are different, since the operating system is in many ways fundamentally different from Windows and from POSIX. Some of these differences may allow such a similar idea to even work better in some ways, possibly.


This looks extremely interesting! Is anyone using it as a daily driver on Linux? How has your experience been?


I have been using this on my Linux and Windows machines as a daily driver. Its cross-platform approach to handling commands and overall usability are what made me switch to this completely. It takes a little while to get used to, especially when coming from bash, but if you know powershell already, I'd say just give it a go.


Yes, daily driving it (it's my default shell in zellij), and have ported my use-every-5-mins scripts to it.

It's VERY refreshing, but I still have issues with it not short-circuiting when an error is thrown in a loop... Which I suspect will get fixed in time...


I second this question. I was using fish for quite a while and really liking it, but there was always something that it was incompatible with, and I eventually had to give it up.


I gave up on fish many years ago because of various incompatibilities (e.g. syntax like `FOO=bar some-cmd` or `env FOO=bar some-cmd` etc), but to be fair, they have managed to patch lots/most of those.

Returned back (from zsh) to fish this year and am super happy. Still the best built-in completion / history auto-suggest that I've seen in any shell so far, and super-fast at that.

Most importantly, barely any config needed (no need for oh-my-zsh, prezto and all of those).


I find this project interesting

But doesn't it look too monolithic? Won't that cause problems with maintenance and scaling? And don't its features overlap with many already existing CLI tools?


I think Nu generally plays nicely with other CLI tools and data sources. For example if you've got an external executable that emits JSON, all it takes is `from json` to convert that output to a Nushell table.
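For example, something like this should work (going from memory on the GitHub API field names):

    curl -s https://api.github.com/repos/nushell/nushell | from json | select stargazers_count open_issues_count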


It's so wildly awesome to deal with data natively in nushell rather than doing hacky shit with jq and strings... in bash. No thanks.



That inspired me to give it a quick try; however, the assumption that white on bright white is a readable color conflicts with the need to make them the same color for other software that I use (which only uses eight colors, and I want white to be white). I'm guessing this might be configurable for anyone sufficiently interested.


I like the trend of commands adding JSON output. Then you don't need to get a new shell, just incrementally add other commands that can process JSON.


Plus, having a less cryptic version of jq is a big win considering how often we all work with JSON files these days. It should be a 'primitive' in shells, like strings.


> other commands that can process JSON

Happily, this includes import/conversion functions for new shells, so this plays nice with efforts like Nushell as well.


I’ve been using nu as my main shell for a few years now. Very happy with it.


Seems like a less-asinine take on the ideas underlying PowerShell. Neat!


How well can you use this as a utility from your current shell, without replacing it? Something like "nu <command>" to execute nu commands from inside bash?


It would be neat to be able to add durations to dates. This would let me e.g. get all files that are newer than an hour: ls | where modified > time now - 1hr


This works in Nushell; it just needs parentheses:

    ls | where modified > (date now) - 1hr


Nice!


I'm sure there's a lot more once you get to more advanced stuff, but going through their System examples has me puzzled:

> ls | where type == dir

or in Bash:

`ls -d`

> ps | where cpu > 0 | sort-by cpu | reverse

or in Bash:

`htop` ?

> ps | where name == Notepad2.exe

> # find process, then ...

> ps | where name == Notepad2.exe | get pid.0 | kill -9 $in

or in Bash:

`pkill Notepad2.exe`

> Pipeline content to clipboard

Or on MacOS:

`anything | pbcopy`

The first example is actually the only one that makes sense to me:

> ls | where type == file

Amazingly, there's no straightforward way to tell `ls` to output only files (that I know of):

`ls -p | grep -v /`

I don't mean to knock it, but I think it's better to learn the standard tools available everywhere than invest any time in tools that aim to lessen the learning curve.


I think you've partly proven that nushell is useful. `ls -d` doesn't list directories, it lists only the files given as arguments instead of their children. Also, htop is nothing like a scriptable and composable pipeline from ps, and there's no reason `pkill` couldn't co-exist with the equivalent of `kill $(ps | grep Notepad2 | grep -v grep)`.

Dozens of specific commands and flags for specific usecases aren't always enough, and, dare I say it, counter to the UNIX philosophy.


He’s giving simple examples everyone can quickly understand.

When I show people the same advantage that PowerShell has over bash I pick harder problems where bash/gnu tools have no trivial answers. But then people get sour grapes and say “I don’t need to do that kind of thing anyway.” The reality of course is they avoid those harder problems and solve them manually or with different tools (Python).

Sorting is a good example, because while commands like ‘ps’ have some built-in sorting capability, it is limited and inconsistent with other tools. You can sort by some columns, but not others. Sorting by the same field is often a different option across different tools even if they manipulate the same objects, etc…

If you would like a harder problem: kill every process on the machine run by the user “ash” that has used more than 1 hour of total CPU time but only those of his processes that are in the top 5 in the entire system by memory usage.

Hint: Don’t kill anyone else’s processes by accident even if their user name is “flash” or if their process name is “bash”. Also don’t confuse instantaneous %CPU with total CPU usage.

Did I say total CPU? Oops, I meant total kernel time used! My mistake. Update your script, that should be a simple change… right?
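For reference, this is roughly how I'd sketch it in PowerShell (untested; -IncludeUserName needs elevation, and on Windows UserName may come back domain-qualified):

    Get-Process -IncludeUserName |
        Sort-Object WorkingSet64 -Descending |
        Select-Object -First 5 |
        Where-Object { $_.UserName -eq 'ash' -and $_.TotalProcessorTime.TotalHours -gt 1 } |
        Stop-Process -WhatIf
    # and the "kernel time instead" change really is small:
    # swap TotalProcessorTime for PrivilegedProcessorTime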


As with almost everything, Bash is easy once you learn how to use it ; )

I think this tool is useful for people who need to use the terminal right now, without having to spend quite a bit of time learning, e.g., Bash. Your point about the portability of Bash is, of course, valid. It's just that sometimes you need to get stuff done quickly without parsing the (at times unfriendly) documentation of more mature tools.


> Amazingly, there's no straightforward way to tell `ls` to output only files (that I know of):

`ls -l|grep -e^-`

Unless you meant without pipes/grep, in which case I think no. But you're bumping up against the fact that `ls` is really more of a user-presentation tool than one for scripting. Use `find . -type f`, maybe with a `-maxdepth 1`.


Literally nothing you mentioned is a shell builtin, your choice of shell matters not at all.


`find . -type f -maxdepth 1`


What's the story with not having a deb / flatpak / appimage or snap distribution? Just lack of resources to help package?


Don't think we've had any requests for those, but we'd welcome contributions if someone wants to make it happen.

For now, Homebrew works if you want a cross-distro way to install Nu on x64 Linux.


Looks like it's fairly trivial to build/get via cargo: https://www.nushell.sh/book/installation.html#build-using-cr...

I'm not likely to use brew on Linux (rather stow/xstow if I have to).


Personally, I would love to give Nushell a try but having to install another package manager (Homebrew) first…


You might be covered by one of the other package managers: https://www.nushell.sh/book/installation.html#package-manage...

We also publish binaries: https://github.com/nushell/nushell/releases


I misread this as Nutshell. Slightly disappointed that wasn’t its actual name (perhaps it’s already taken).


What is the advantage over powershell?


It's not batshit infuriating to use?

Idk, I was an OG PowerShell user; I'd take nushell's approach every day of the week.


It's not Microsoft based?


It's cross-platform. Although if you're coming from a powershell background, it is easier to learn.


Powershell is also cross-platform


> Powershell is also cross-platform

I was replying to your "Microsoft based" comment. And yes, powershell is also cross platform and open source.


So there's no support from one of the largest tech companies in the world?


I avoid MS products like the plague nowadays exactly because of their "support", including free overnight OS sabotage.


> one of the largest tech companies in the world

Bigger isn't better my guy


Read this as "Nutshell", which would have been a really cool name for a shell.


same here


Still not 1.0 after all those years?


du -s * | sort -n -r | sed 10q


Okay, but now show me the size of the file in a human-readable format next to the name.


You can use the "-h" (human-readable) options to du and sort to do this:

  $ du -sh * | sort -rh
  20M dist
  6.9M pebble
  1.5M internal
  308K cmd
  ...


du -s -h * | sort -r -h


add: | numfmt --to=iec


it's already in a human readable format next to the name.


"sed 10q" is equivalent to "head"?


I like the idea of piping structured data between processes instead of streams of bytes, but that's what PowerShell is for. I don't think I've ever met another dev who likes PowerShell


I'm of the opinion that structured data is something PowerShell got 100% right; people tend to dislike PowerShell for other reasons (verbosity, startup time, Windows-first history).


> I don't think I've ever met another dev who likes PowerShell

I have, a few! But they are the kind of dev to never publish their work anywhere, not even little shell scripts. Secretive windows power users


I'd actually like the ability to do either raw bytes (traditional) or structured bytes (the NuShell/PowerShell idea) in an "opt-in" scenario. This would also allow a smoother transition/lack of needing to fully commit right off the bat. I've considered writing wrappers for standard utilities in Bash/Zsh that accept and output structured data in JSON (or maybe a denser serialization format that can easily convert to JSON?) instead of a raw byte stream that you could then just use regular old pipes with (a lot of `jq` would likely get called in between...) The "structured" versions of the utilities would have some namespacing convention such as a "struct_" prefix (or perhaps optionally, or aliased, for brevity, "s_") or (another naming idea I just had... oooh, I like this one) they would be boxed in brackets, so "[ls]" or "]ls[" would call "structured ls" (note that brackets without spaces around them are valid name characters)
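As a very naive sketch of the struct_ prefix flavour (bash with GNU stat and jq; no escaping of awkward filenames, just to show the shape):

    struct_ls() {
      # emit one JSON object per entry instead of a formatted listing
      for f in "$@"; do
        printf '{"name":"%s","size":%s,"type":"%s"}\n' \
          "$f" "$(stat -c %s -- "$f")" "$(stat -c %F -- "$f")"
      done
    }

    # three largest entries, via plain pipes plus jq
    struct_ls * | jq -s 'sort_by(.size) | reverse | .[0:3]'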

To handle the transition to/from structured data, an idea I had was to omit one of the brackets in these names, so for example "ls[" would emit structured data but accept (well, assuming "ls" was a command that took stdin) an unstructured bytestream, and something like "]cat" would take in structured data on stdin but emit raw data... "cat[" would take raw data and... interpret it as JSON? or something? and output that as structured data? I don't know, it has to be fleshed out, but this could work maybe!

To get the JSON data back to a visual format like a table, we'd probably have to explicitly do what NuShell implicitly calls when you don't provide it (I forgot the name of it).

anyway, I haven't even begun a POC of this idea, but it was one I had. Anyone else like this idea?


This is similar to how my shell works. It still just passes bytes around but additionally passes information about how those bytes could be interpreted. A schema if you will. So it works as cleanly with POSIX / GNU / et al tools as it does with fancy JSON, YAML, CSV and other document formats.

It basically sits somewhere between Powershell and Bash: typed pipelines like Powershell but without sacrificing familiarity with all the CLI commands you already use day in and day out.

https://github.com/lmorg/murex

As an aside, I’m about to drop a massive update in the next few days that will make the shell even more intuitive to use.


Very interesting! Do you transmit this metadata on a different fd, like 3?

(reading more) Super cool. Since it's Go, does that mean runtime errors will silently fail and keep chugging along? /dig ;)

Can you customize the PS1?


> Very interesting! Do you transmit this metadata on a different fd, like 3?

At the moment there is POC code to transmit over fd 3 but I’m looking into using UNIX sockets instead.

> Super cool. Since it's Go, does that mean runtime errors will silently fail and keep chugging along? /dig ;)

This project predates Rust 1.0, so it was a no-brainer at the time to use Go. But actually I do think Go is well suited for this type of problem because, much as some complain about errors being regular types, the way Go encourages granular handling of errors rather than larger try / catch blocks or exceptions does lead to more tailored error messages. Which ultimately helps explain the problem better for the end user.

Not taking anything away from Rust or any other language. Nor am I saying Go’s error handling doesn’t have its problems. Just that I’d probably choose Go again if I were to start this project tomorrow.

> Can you customize the PS1?

Every part of the shell is customisable. However, rather than having hard-to-discover environment variables that can alter the shell's behaviour, there is a command called config that allows you to inspect every option.

https://murex.rocks/docs/commands/config.html


> I don't think I've ever met another dev who likes PowerShell

I have. But they have never used bash, zsh, Fish, or NuShell.


I’ve been using bash for 25 years and like powershell. I like nushell more though (for the reasons the sibling comment mentioned).


I use Zsh at home, and PowerShell 5 (for DSC 1.1) and 7 at work. I love both. I would almost be tempted to use pwsh as my main shell on Linux, but I am too attached to Zsh’s amazing RPROMPT!


I don’t know about rprompt but check out oh-my-pwsh


Few more decades of shell development and the people involved will maybe discover GUI with hotkeys. /s

Seriously though, for me their front page fails to give a convincing example of how it's better than existing shells for typical shell workflows.

Even the last one, which is supposed to demonstrate better error messages, shows a cryptic error: "change a or b to be the right types and try again". Perhaps type checking could help say which argument is wrong and how to fix it? Now it's literally "one of your arguments is wrong, check man and try again".




