Nushell 0.91.0

Nushell, or Nu for short, is a new shell that takes a modern, structured approach to your command line. It works seamlessly with the data from your filesystem, operating system, and a growing number of file formats to make it easy to build powerful command line pipelines.

Today, we're releasing version 0.91.0 of Nu. This release brings changes to globbing, an overhaul of the plugin protocol, support for piping an external command's stderr, and new commands!

Where to get it

Nu 0.91.0 is available as pre-built binaries or from crates.io. If you have Rust installed, you can install it using cargo install nu.

Note

The optional dataframe functionality is available via cargo install nu --features=dataframe.

As part of this release, we also publish a set of optional plugins you can install and use with Nu. To install, use cargo install nu_plugin_<plugin name>.
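
For example (an illustrative sketch: the gstat plugin name and the default cargo binary path are assumptions, so adjust them to the plugin you actually want):

# build one of the optional plugins from crates.io
cargo install nu_plugin_gstat
# then register it from inside Nushell (path assumed to be cargo's default bin directory)
register ~/.cargo/bin/nu_plugin_gstat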

Table of contents

Themes of this release / New features [toc]

Handling globs for variables [toc]

Breaking change

See a full overview of the breaking changes

Starting with this release, if you pass a string variable to a command that supports glob patterns, Nushell will no longer auto-expand the glob pattern (#11886, #11946).

For example, given let f = "a[ab]c.txt", ls $f will only list a file literally named a[ab]c.txt. But if you do want glob patterns in variables to auto-expand, there are three ways:

  1. Use the glob command with spreading:
    let f = "a*c.txt"
    rm ...(glob $f)
    
  2. Use the into glob command:
    let f = "a*c.txt"
    rm ($f | into glob)
    
  3. Annotate the variable with glob type:
    let f: glob = "a*c.txt"
    rm $f
    

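As a quick illustration of the difference (a minimal sketch; the file names are made up):

let s = "a*c.txt"
ls $s            # string variable: only matches a file literally named "a*c.txt"
let g: glob = "a*c.txt"
ls $g            # glob-typed variable: expands as a pattern, as before
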
With this change, the str escape-glob command is no longer useful, and it has been deprecated (#12018).

You can check the guide and the book for more information.

Plugin protocol overhaul [toc]

Breaking change

See a full overview of the breaking changes

The plugin protocol has been redesigned to support plugins that operate on streams (#11911). A new StreamingPlugin trait is provided for plugins that want to take advantage of this functionality.

The API for existing plugins written in Rust has not changed, but they do need to be recompiled with the latest version of nu-plugin to use the new protocol.

As part of this effort, several other aspects of the protocol have been improved:

  • Plugins will now automatically refuse to run with an incompatible version of Nushell. This should greatly improve error messages in that case.

  • Plugin custom values are now accepted nested within other values and as arguments. The previous protocol treated them as a special case: they could only be used as input to or output from a plugin command, and could not be nested within other values.

  • Plugins are now expected to keep running until Nushell tells them to stop. Currently, we still run a new plugin executable for each command, but this opens the door for that to change in the future.

  • The bidirectional communication is abstract enough to allow for plugins to have much tighter integration with the engine. Expect further improvements to plugins in future releases!

For details about how the new protocol works, please refer to the updated plugins section of the contributor book, as well as the new plugin protocol reference.

As a great example of a streaming plugin in the wild, @cablehead has already created the nu_plugin_from_sse plugin for parsing server-sent events in real time from an http get call. Kudos!

Stderr piping support [toc]

Nushell now supports piping an external command's stderr to another command (#11708).

If you want to pass stderr output to the less command, you can do this:

cmd e>| less

If you want to pipe both stdout and stderr to less, you can do this:

cmd o+e>| less
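
As a small end-to-end example you can try (a sketch that assumes nu itself is on your PATH; print -e writes to stderr):

nu -c 'print -e "hello from stderr"' e>| str upcase
# => HELLO FROM STDERR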

For more information, you can check the guide.

REPL stability and panic recovery [toc]

Thanks to the work of @ayax79 in #11860, #11935, and #11953, the Nushell REPL should no longer crash if a panic occurs and should no longer exit if some other error is encountered. Besides being very convenient, this should also make Nushell safer to use as a login shell, since panics would previously cause Nushell to crash on an attempted login. Similarly, if a panic was triggered when loading a config file, this used to prevent the REPL from starting. Now, the REPL falls back to loading the default config files in this case.

Note that panics are still important errors/bugs, so please open issues and bug reports if you encounter any!

Hall of fame [toc]

Bug fixes [toc]

Thanks to all the contributors below for helping us solve issues and bugs 🙏

author | title | PR
@PanGan21 | fix: process empty headers in to md command | #12012
@devyn | Add Goodbye message to ensure plugins exit when they are no longer needed | #12014
@zhiburt | nu-table: Improve table -a | #11905
@ayax79 | wrapping run_repl with catch_unwind and restarting the repl on panic | #11860
@WindSoilder | make stderr works for failed external command | #11914
@kit494way | separate commandline into subcommands | #11877
@kit494way | Fix panic in seq date | #11871
@kit494way | Fix commandline --cursor to return int | #11864
@dannou812 | Fixed to/from toml date format | #11846
@IanManske | Prevent duplicate keys for lazy make | #11808
@IanManske | Prevent duplicate records keys when decoding from nuon | #11807
@kit494way | Allow comments in match blocks | #11717
@WindSoilder | Fix file completions which contains glob pattern | #11766
@TrMen | Enforce call stack depth limit for all calls | #11729

Our set of commands is evolving [toc]

New commands [toc]

Thanks to the work of @devyn, this release adds two new commands related to streaming!

tee

Inspired by the Unix tee command, this command allows you to send a copy of the stream to a closure in the middle of your pipeline (#11928).

Examples:

# Get the sum of numbers from 1 to 100, but also save those numbers to a text file
seq 1 100 | tee { save numbers.txt } | math sum
# The exact opposite: keep the numbers, but save the sum to a file
seq 1 100 | tee { math sum | save sum.txt }
# Run an external command, and save a copy of its log output on stderr
do { cargo run } | tee --stderr { save err.txt }
# Filter the log output before saving it
do { cargo run } | tee --stderr { lines | find WARN | save warnings.txt }

The closure will run in a background thread in parallel, and it will only get a copy of values that actually make their way to the end of the pipeline. For example, if you cut the stream short:

seq 1 100 | tee { save numbers.txt } | first 5

then "numbers.txt" will only contain the first 5 numbers as well.

interleave

This command supports consuming multiple streams in parallel, and combining the streams into a single stream (#11955).

In contrast to zip, the values are added to the final stream as soon as they're ready, without any regard for fairness. There is no way to know which stream the values came from unless that information is embedded into the values by the closures being executed.

For example, the following zip-based pipeline will always produce output that looks like this:

> seq 1 50 | wrap a | zip { seq 1 50 | wrap b } | flatten
╭────┬────┬────╮
│  # │ a  │ b  │
├────┼────┼────┤
│  0 │  1 │    │
│  1 │    │  1 │
│  2 │  2 │    │
│  3 │    │  2 │
│  4 │  3 │    │
│  5 │    │  3 │
...

Each number from a is always paired with a number from b. However, if interleave is used instead, it is not predictable in which order a values will appear with respect to b:

> seq 1 50 | wrap a | interleave { seq 1 50 | wrap b }
╭────┬────┬────╮
│  # │ b  │ a  │
├────┼────┼────┤
│  0 │  1 │    │
│  1 │    │  1 │
│  2 │  2 │    │
│  3 │  3 │    │
│  4 │    │  2 │
│  5 │    │  3 │
│  6 │  4 │    │
│  7 │  5 │    │
│  8 │  6 │    │
│  9 │  7 │    │
│ 10 │    │  4 │
...

One advantage of this is that it is not necessary for both streams to produce the same amount of output, and whatever output is produced will be immediately available. This is particularly helpful for running external commands and interleaving lines from all of them:

interleave ...(
  (ls projects).name | each { |project|
    {
      cd $project
      make | lines | each { |line| {project: $project, out: $line} }
    }
  }
)

This example would run the make command in every subdirectory of the "projects" directory, and gather the output lines in parallel.

is-not-empty

As a quality of life improvement, we have added the is-not-empty command in #11991. It is the same as the is-empty command but negates the result. This should hopefully eliminate the need for users to hand-roll their own is-not-empty command.
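
For example (a quick sketch):

[] | is-not-empty        # => false
[1 2 3] | is-not-empty   # => true
'' | is-not-empty        # => false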

commandline

In #11877, @kit494way improved the signatures for the commandline command. Instead of using flags to perform different operations, commandline now has subcommands:

  • commandline edit: to append to, insert into, or replace the current command line (returns nothing)
  • commandline get-cursor: to get the cursor position (returns an integer)
  • commandline set-cursor: to set the cursor position (returns nothing)

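As a rough sketch of how these fit together, for example inside a keybinding or hook (the exact flags shown, such as --replace and --end, are our assumptions based on the deprecated flags below):

commandline edit --replace "ls -la"   # overwrite the current command line
commandline set-cursor --end          # move the cursor to the end of the line
commandline get-cursor                # returns an int, e.g. 6
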
These subcommands make certain flags unnecessary, so the following flags have been deprecated:

  • --cursor
  • --cursor-end
  • --append
  • --insert
  • --replace

Changes to existing commands [toc]

zip supports closures

When zip is passed a closure, it will now run that closure and zip its output as a stream (#11924). With this change, it is no longer necessary for the argument to zip to complete before the result is available:

seq 1 10 | zip { 1.. | each { $in * 2 } }

Previously, this would not have completed: a subexpression argument would have been fully consumed (or at least Nushell would have attempted to consume it), so the infinite stream had to be placed on the input side of zip:

# never completes, and probably eats up all of your memory!
seq 1 10 | zip (1.. | each { $in * 2 })
# works fine
1.. | each { $in * 2 } | zip (seq 1 10)

Migrating more commands to use uutils

Continuing the integration with uutils, this release migrates the mv and mkdir commands. In #12022, we renamed the umv command to mv, removing the old implementation of mv. Similarly, we removed the old mkdir command in #12007 and renamed the umkdir command to take its place.

bits supports binary values

Despite their name, the bits commands used to only support integers and did not work with binary values. @astral-l has remedied this in #11854, and binary values now work with the bits family of commands. Where appropriate, some of these commands (bits and, bits or, and bits xor) now have an --endian flag to specify the endianness when operating on binary values of different lengths.
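
A quick sketch of what this enables (the outputs in the comments are our expectation, not taken from the release notes):

0x[ff 00] | bits and 0x[aa aa]             # => 0x[aa 00]
0x[ca fe] | bits xor 0x[0f] --endian big   # --endian controls how the shorter value is extended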

into int --signed

Thanks to @astral-l's work in #11902, the into int command now has a --signed flag for interpreting binary values as signed integers. (The default behavior when converting binary values is to zero-extend.) Additionally, binary values longer than 8 bytes now error, as they could overflow the 64-bit integers Nushell uses internally.
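
For instance (a minimal sketch):

0x[ff] | into int             # => 255 (zero-extends by default)
0x[ff] | into int --signed    # => -1 (interpreted as two's complement)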

List spreading for filesystem commands

With #11858, rm, open, and touch no longer have a required positional parameter and now only have a rest parameter. This should make spreading a list of files/paths to these commands more ergonomic.
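
For example (a small sketch; the file names are made up):

let files = [a.txt b.txt c.txt]
touch ...$files   # create (or update) all three files
rm ...$files      # and remove them again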

Duplicate record keys now error

With #11807, from nuon now errors if duplicate record keys are found. Similarly, with #11808, lazy make now errors if duplicate keys are provided.
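
For example, this now fails instead of silently keeping one of the values (a minimal sketch):

'{a: 1, a: 2}' | from nuon   # now an error: the record key a appears twice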

Removing support for lists of cell paths

select and reject used to allow lists of cell paths as positional arguments like:

let cols = [name size]
ls | select $cols

In #11859, this was removed in favor of the spread operator:

ls | select ...$cols

Or, one can still provide the cell paths directly:

ls | select name size

Deprecated commands [toc]

Removed commands [toc]

Breaking changes [toc]

Full changelog [toc]