# Command Reference

alias

This command allows you to define shortcuts for other common commands. By default, they only apply to the current session. To persist them, add them to your config.

Syntax: alias <name> = <body>

The command expects two parameters:

  • The name of the alias
  • The body of the alias

# Examples

Define a custom myecho command as an alias:

> alias myecho = echo
> myecho "hello world"
hello world

The suggested help command works!

> myecho -h

Usage:
  > myecho {flags}

flags:
  -h, --help: Display this help message
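
Aliases are not limited to wrapping echo; any command can be shortened the same way. A rough, untested sketch using the sort-by command shown elsewhere in this reference:

> alias sb = sort-by
> ls | sb size

The second line should behave exactly like ls | sort-by size.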

append

Append a row to the table.

# Examples

Given the following text file cities.txt containing cities:

Canberra
London
Nairobi
Washington

And getting back a Nu table:

> open cities.txt | lines
───┬────────────
 0 │ Canberra
 1 │ London
 2 │ Nairobi
 3 │ Washington
───┴────────────

Add the city named Beijing like so:

> open cities.txt | lines | append Beijing
───┬────────────
 0 │ Canberra
 1 │ London
 2 │ Nairobi
 3 │ Washington
 4 │ Beijing
───┴────────────

It's not possible to add multiple rows at once, so you'll need to use append multiple times:

> open cities.txt | lines | append Beijing | append "Buenos Aires"
───┬──────────────
 0 │ Canberra
 1 │ London
 2 │ Nairobi
 3 │ Washington
 4 │ Beijing
 5 │ Buenos Aires
───┴──────────────

So far we have been working with a table without a column, which leaves us with plain rows. Let's wrap the plain rows into a column called city and save it as a json file called cities.json:

Before we save, let's check how it looks after wrapping:

open cities.txt | lines | wrap city
───┬────────────
 # │ city
───┼────────────
 0 │ Canberra
 1 │ London
 2 │ Nairobi
 3 │ Washington
───┴────────────

And save:

> open cities.txt | lines | wrap city | save cities.json

Since we will be working with rows that have a column, appending like before won't quite give us back what we want:

> open cities.json | append Guayaquil
───┬────────────
 # │ city
───┼────────────
 0 │ Canberra
 1 │ London
 2 │ Nairobi
 3 │ Washington
───┴────────────
───┬───────────
 4 │ Guayaquil
───┴───────────

We append a row literal directly:

> open cities.json | append [[city]; [Guayaquil]]
───┬────────────
 # │ city
───┼────────────
 0 │ Canberra
 1 │ London
 2 │ Nairobi
 3 │ Washington
 4 │ Guayaquil
───┴────────────
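
To persist the appended row, pipe the result back into save just as we did when creating the file. A minimal sketch (the target file name updated_cities.json is arbitrary):

> open cities.json | append [[city]; [Guayaquil]] | save updated_cities.json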

autoview

Print the content of the pipeline as a table or list. It is the implied or default viewer when none is provided.

When reading a single value, a table or a list, autoview will attempt to view it. When reading a string that originally comes from a source file it will attempt to use textview. When reading a binary file it will attempt to display its content as hexadecimal numbers and the corresponding characters.

-h, --help Display help message.

# Examples

In all following examples autoview can be removed with no change in the output. The use of autoview at the end of the pipeline is implied when no viewer is explicitly used.

> which nu | get path | autoview
/home/me/.cargo/bin/nu
> ls | autoview
────┬────────────────────┬──────┬─────────┬──────────────
 #  │ name               │ type │ size    │ modified
────┼────────────────────┼──────┼─────────┼──────────────
  0 │ README.md          │ File │   932 B │ 19 hours ago
  1 │ alias.md           │ File │  2.0 KB │ 19 hours ago
  2 │ append.md          │ File │  1.4 KB │ 19 hours ago
   ...
 82 │ wrap.md            │ File │  1.8 KB │ 19 hours ago
────┴────────────────────┴──────┴─────────┴──────────────
> echo "# Hi" "## Section" "Some text" | save file.md
> open file.md | autoview
# Hi
## Section
Some text

autoview will use textview to colorize the text based on the file format. The style used by textview can be configured in config.toml.

> open --raw $(which nu | get path) | autoview
...
126d1c0:   64 31 66 37  62 30 31 63  36 2e 31 31  38 2e 6c 6c   d1f7b01c6.118.ll
126d1d0:   76 6d 2e 34  34 38 37 35  37 31 32 34  39 35 33 39   vm.4487571249539
126d1e0:   34 34 30 34  30 39 00 61  6e 6f 6e 2e  30 30 61 63   440409.anon.00ac
126d1f0:   37 32 65 36  37 66 32 31  39 34 62 32  32 61 61 63   72e67f2194b22aac
126d200:   62 35 39 37  33 36 30 62  64 31 39 38  2e 31 36 2e   b597360bd198.16.
...

cal

Use cal to display a calendar.

# Flags

  • -y, --year: Display the year column
  • -q, --quarter: Display the quarter column
  • -m, --month: Display the month column
  • --full-year <integer>: Display a year-long calendar for the specified year
  • --week-start <string>: Display the calendar with the specified day as the first day of the week
  • --month-names: Display the month names instead of integers

# Examples

> cal
───┬────────┬────────┬─────────┬───────────┬──────────┬────────┬──────────
 # │ sunday │ monday │ tuesday │ wednesday │ thursday │ friday │ saturday
───┼────────┼────────┼─────────┼───────────┼──────────┼────────┼──────────
 0 │        │        │         │           │          │      1 │        2
 1 │      3 │      4 │       5 │         6 │        7 │      8 │        9
 2 │     10 │     11 │      12 │        13 │       14 │     15 │       16
 3 │     17 │     18 │      19 │        20 │       21 │     22 │       23
 4 │     24 │     25 │      26 │        27 │       28 │     29 │       30
 5 │     31 │        │         │           │          │        │
───┴────────┴────────┴─────────┴───────────┴──────────┴────────┴──────────
> cal -yqm --full-year 2020
────┬──────┬─────────┬───────┬────────┬────────┬─────────┬───────────┬──────────┬────────┬──────────
 #  │ year │ quarter │ month │ sunday │ monday │ tuesday │ wednesday │ thursday │ friday │ saturday
────┼──────┼─────────┼───────┼────────┼────────┼─────────┼───────────┼──────────┼────────┼──────────
  0 │ 2020 │       1 │     1 │        │        │         │         1 │        2 │      3 │        4
  1 │ 2020 │       1 │     1 │      5 │      6 │       7 │         8 │        9 │     10 │       11
  2 │ 2020 │       1 │     1 │     12 │     13 │      14 │        15 │       16 │     17 │       18
  3 │ 2020 │       1 │     1 │     19 │     20 │      21 │        22 │       23 │     24 │       25
  4 │ 2020 │       1 │     1 │     26 │     27 │      28 │        29 │       30 │     31 │
  5 │ 2020 │       1 │     2 │        │        │         │           │          │        │        1
...
 61 │ 2020 │       4 │    12 │     27 │     28 │      29 │        30 │       31 │        │
────┴──────┴─────────┴───────┴────────┴────────┴─────────┴───────────┴──────────┴────────┴──────────
> cal -yqm --full-year 2020 --month-names
────┬──────┬─────────┬───────────┬────────┬────────┬─────────┬───────────┬──────────┬────────┬──────────
 #  │ year │ quarter │ month     │ sunday │ monday │ tuesday │ wednesday │ thursday │ friday │ saturday
────┼──────┼─────────┼───────────┼────────┼────────┼─────────┼───────────┼──────────┼────────┼──────────
  0 │ 2020 │       1 │ january   │        │        │         │         1 │        2 │      3 │        4
  1 │ 2020 │       1 │ january   │      5 │      6 │       7 │         8 │        9 │     10 │       11
  2 │ 2020 │       1 │ january   │     12 │     13 │      14 │        15 │       16 │     17 │       18
  3 │ 2020 │       1 │ january   │     19 │     20 │      21 │        22 │       23 │     24 │       25
  4 │ 2020 │       1 │ january   │     26 │     27 │      28 │        29 │       30 │     31 │
  5 │ 2020 │       1 │ february  │        │        │         │           │          │        │        1
...
 61 │ 2020 │       4 │ december  │     27 │     28 │      29 │        30 │       31 │        │
────┴──────┴─────────┴───────────┴────────┴────────┴─────────┴───────────┴──────────┴────────┴──────────
> cal -ym --full-year 2303 --month-names | where month == "june"
───┬──────┬───────┬────────┬────────┬─────────┬───────────┬──────────┬────────┬──────────
 # │ year │ month │ sunday │ monday │ tuesday │ wednesday │ thursday │ friday │ saturday
───┼──────┼───────┼────────┼────────┼─────────┼───────────┼──────────┼────────┼──────────
 0 │ 2303 │ june  │        │      1 │       2 │         3 │        4 │      5 │        6
 1 │ 2303 │ june  │      7 │      8 │       9 │        10 │       11 │     12 │       13
 2 │ 2303 │ june  │     14 │     15 │      16 │        17 │       18 │     19 │       20
 3 │ 2303 │ june  │     21 │     22 │      23 │        24 │       25 │     26 │       27
 4 │ 2303 │ june  │     28 │     29 │      30 │           │          │        │
───┴──────┴───────┴────────┴────────┴─────────┴───────────┴──────────┴────────┴──────────
> cal -my --full-year 2020 --month-names | default friday 0 | where friday == 13
───┬──────┬──────────┬────────┬────────┬─────────┬───────────┬──────────┬────────┬──────────
 # │ year │ month    │ sunday │ monday │ tuesday │ wednesday │ thursday │ friday │ saturday
───┼──────┼──────────┼────────┼────────┼─────────┼───────────┼──────────┼────────┼──────────
 0 │ 2020 │ march    │      8 │      9 │      10 │        11 │       12 │     13 │       14
 1 │ 2020 │ november │      8 │      9 │      10 │        11 │       12 │     13 │       14
───┴──────┴──────────┴────────┴────────┴─────────┴───────────┴──────────┴────────┴──────────
> cal -ymq --month-names --week-start monday
───┬──────┬─────────┬───────┬────────┬─────────┬───────────┬──────────┬────────┬──────────┬────────
 # │ year │ quarter │ month │ monday │ tuesday │ wednesday │ thursday │ friday │ saturday │ sunday
───┼──────┼─────────┼───────┼────────┼─────────┼───────────┼──────────┼────────┼──────────┼────────
 0 │ 2020 │       2 │ june  │      1 │       2 │         3 │        4 │      5 │        6 │      7
 1 │ 2020 │       2 │ june  │      8 │       9 │        10 │       11 │     12 │       13 │     14
 2 │ 2020 │       2 │ june  │     15 │      16 │        17 │       18 │     19 │       20 │     21
 3 │ 2020 │       2 │ june  │     22 │      23 │        24 │       25 │     26 │       27 │     28
 4 │ 2020 │       2 │ june  │     29 │      30 │           │          │        │          │
───┴──────┴─────────┴───────┴────────┴─────────┴───────────┴──────────┴────────┴──────────┴────────
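
Because cal produces an ordinary table, it composes with the other commands in this reference. As a rough sketch, counting how many weeks of 2020 contain a Friday the 13th combines the Friday-the-13th example above with count:

> cal -my --full-year 2020 --month-names | default friday 0 | where friday == 13 | count

For 2020 this should return 2, matching the two rows shown above.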

cd

If you didn't already know, the cd command is very simple. It stands for 'change directory' and it does exactly that. It changes the current directory to the one specified. If no directory is specified, it takes you to the home directory. Additionally, using cd .. takes you to the parent directory.

# Examples

/home/username> cd Desktop
/home/username/Desktop> now your current directory has been changed
/home/username/Desktop/nested/folders> cd ..
/home/username/Desktop/nested> cd ..
/home/username/Desktop> cd ../Documents/school_related
/home/username/Documents/school_related> cd ../../..
/home/>
/home/username/Desktop/super/duper/crazy/nested/folders> cd
/home/username> cd ../../usr
/usr> cd
/home/username>

Using cd - will take you to the previous directory:

/home/username/Desktop/super/duper/crazy/nested/folders> cd
/home/username> cd -
/home/username/Desktop/super/duper/crazy/nested/folders> cd

compact

This command allows us to filter out rows with empty columns. Other commands can feed compact their output through a pipeline.

# Usage

> [input-command] | compact [column-name]

# Examples

Let's say we have a table like this:

> open contacts.json
━━━┯━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━
 # │ name     │ email
───┼──────────┼──────────────────
 0 │ paul     │ paul@example.com
 1 │ andres   │
 2 │ jonathan │
━━━┷━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━

compact allows us to filter out rows with an empty email column:

> open contacts.json | compact email
━━━━━━┯━━━━━━━━━━━━━━━━━━
 name │ email
──────┼──────────────────
 paul │ paul@example.com
━━━━━━┷━━━━━━━━━━━━━━━━━━
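
Because compact simply drops rows, it pairs naturally with count (documented below) to measure how many contacts actually have an email address. A small sketch against the same contacts.json:

> open contacts.json | compact email | count

For the table above this should return 1.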

config

Configuration management.

Syntax: config {flags}

# Flags

load <file path shape>
  load the config from the path given

set <any shape>
  set a value in the config, eg) set variable value

set_into <member shape>
  sets a variable from values in the pipeline

get <any shape>
  get a value from the config

remove <any shape>
  remove a value from the config

clear
  clear the config

path
  return the path to the config file

# Variables

| Variable | Type | Description |
| --- | --- | --- |
| path | table of strings | PATH to use to find binaries |
| env | row | the environment variables to pass to external commands |
| ctrlc_exit | boolean | whether or not to exit Nu after multiple ctrl-c presses |
| table_mode | "light" or other | enable lightweight or normal tables |
| edit_mode | "vi" or "emacs" | changes line editing to "vi" or "emacs" mode |
| key_timeout | integer (milliseconds) | vi: the delay to wait for a longer key sequence after ESC |
| history_size | integer | maximum entries that will be stored in history (100,000 default) |
| completion_mode | "circular" or "list" | changes completion type to "circular" (default) or "list" mode |
| complete_from_path | boolean | whether or not to complete names of binaries on PATH (default true) |
| rm_always_trash | boolean | whether or not to always use system trash when no flags are given to rm |
| pivot_mode | "auto", "always", or "never" | "auto" will only pivot single-row tables if the output is greater than the terminal width; "always" will always pivot single-row tables; "never" will never pivot single-row tables |
| plugin_dirs | table of strings | additional directories to search for plugins during startup |

# Examples

> config set table_mode "light"
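
The other subcommands listed above follow the same pattern. As a rough sketch (the exact invocation may vary between Nu versions), reading a value back, removing it, and locating the config file would look like:

> config get table_mode
> config remove table_mode
> config path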

A more detailed description of how to use this command to configure Nu shell can be found in the configuration chapter of the Nu Book.

count

Obtain the row or column count of a table.

# Flags

  • -c, --column: Calculate number of columns in table

# Examples

> ls
────┬────────────────────┬──────┬──────────┬──────────────
 #  │ name               │ type │ size     │ modified
────┼────────────────────┼──────┼──────────┼──────────────
 0  │ CODE_OF_CONDUCT.md │ File │   3.4 KB │ 42 mins ago
 1  │ CONTRIBUTING.md    │ File │   1.3 KB │ 42 mins ago
 2  │ Cargo.lock         │ File │ 113.3 KB │ 42 mins ago
 3  │ Cargo.toml         │ File │   4.6 KB │ 42 mins ago
 4  │ LICENSE            │ File │   1.1 KB │ 3 months ago
 5  │ Makefile.toml      │ File │    449 B │ 5 months ago
 6  │ README.md          │ File │  15.9 KB │ 31 mins ago
 7  │ TODO.md            │ File │      0 B │ 42 mins ago
 8  │ assets             │ Dir  │    128 B │ 5 months ago
 9  │ build.rs           │ File │     78 B │ 4 months ago
 10 │ crates             │ Dir  │    704 B │ 42 mins ago
 11 │ debian             │ Dir  │    352 B │ 5 months ago
 12 │ docker             │ Dir  │    288 B │ 3 months ago
 13 │ docs               │ Dir  │    192 B │ 42 mins ago
 14 │ features.toml      │ File │    632 B │ 4 months ago
 15 │ images             │ Dir  │    160 B │ 5 months ago
 16 │ rustfmt.toml       │ File │     16 B │ 5 months ago
 17 │ src                │ Dir  │    128 B │ 1 day ago
 18 │ target             │ Dir  │    160 B │ 5 days ago
 19 │ tests              │ Dir  │    192 B │ 3 months ago
────┴────────────────────┴──────┴──────────┴──────────────

By default, count will return the number of rows in a table.

> ls | count
20

The -c flag will produce a count of the columns in the table.

> ls | count -c
4
> ls | where type == File | count
11
> ls | where type == Dir | count
9
> ls | where size > 2KB | count
4

debug

debug prints a debugging view of the table data. It is useful when you want to see the specific types of the data or when investigating errors.

# Examples

> ls | first 2 | debug
───┬──────────────────────────────────────────
 # │
───┼──────────────────────────────────────────
 0 │ (name=".azure" type="Dir" size=nothing
   │ modified=2020-02-09T05:31:39.950305440Z(date))
 1 │ (name=".cargo" type="Dir" size=nothing
   │ modified=2020-01-06T05:45:30.933303081Z(date))
───┴──────────────────────────────────────────
> ls | last 8 | get type | debug
───┬───────────────────────
 # │
───┼───────────────────────
 0 │ "Dir"
 1 │ "Dir"
 2 │ "File"
 3 │ "Dir"
 4 │ "File"
 5 │ "Dir"
 6 │ "Dir"
 7 │ "Dir"
───┴───────────────────────
> open --raw Cargo.toml | size | debug
(lines=139 words=560 chars=4607 bytes=4607)
> du src/ | debug
(path="src"(path)
 apparent=705300(bytesize)
 physical=1118208(bytesize)
 directories=[(path="src/utils"(path) apparent=21203(bytesize) physical=24576(bytesize))
  (path="src/data"(path)
   apparent=52860(bytesize)
   physical=86016(bytesize)
   directories=[(path="src/data/config"(path) apparent=2609(bytesize) physical=12288(bytesize))
    (path="src/data/base"(path) apparent=12627(bytesize) physical=16384(bytesize))])
  (path="src/env"(path) apparent=30257(bytesize) physical=36864(bytesize))
  (path="src/plugins"(path) apparent=1358(bytesize) physical=49152(bytesize))
  (path="src/commands"(path)
   apparent=412617(bytesize)
   physical=651264(bytesize)
   directories=[(path="src/commands/classified"(path) apparent=37125(bytesize) physical=49152(bytesize))])
  (path="src/evaluate"(path) apparent=11475(bytesize) physical=24576(bytesize))
  (path="src/format"(path) apparent=15426(bytesize) physical=24576(bytesize))
  (path="src/shell"(path) apparent=81093(bytesize) physical=94208(bytesize))])

date

Use date to get the current date and time. Defaults to local timezone but you can get it in UTC too.

# Flags

--utc
  Returns the current date and time in UTC

--local
  Returns the current date and time in your local timezone

# Examples

> date
──────────┬────────
 year     │ 2020
 month    │ 6
 day      │ 21
 hour     │ 18
 minute   │ 3
 second   │ 43
 timezone │ -04:00
──────────┴────────
> date --utc
──────────┬──────
 year     │ 2020
 month    │ 6
 day      │ 21
 hour     │ 22
 minute   │ 3
 second   │ 53
 timezone │ UTC
──────────┴──────
> date --local
──────────┬────────
 year     │ 2020
 month    │ 6
 day      │ 21
 hour     │ 18
 minute   │ 4
 second   │ 3
 timezone │ -04:00
──────────┴────────
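
Since date returns an ordinary row, individual fields can be pulled out with get (documented below). A small, untested sketch:

> date | get year

For the examples above this should print 2020.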

def

Use def to create a custom command.

# Examples

> def my_command [] { echo hi nu }
> my_command
hi nu
> def my_command [adjective: string, num: int] { echo $adjective $num meet nu }
> my_command nice 2
nice 2 meet nu
def my_cookie_daemon [
    in: path             # Specify where the cookie daemon shall look for cookies :p
    ...rest: path        # Other places to consider for cookie supplies
    --output (-o): path  # Where to store leftovers
    --verbose
] {
    echo $in $rest | each { eat $it }
    ...
}
my_cookie_daemon /home/bob /home/alice --output /home/mallory

Further (and non-trivial) examples can be found in our nushell scripts repo.

# Syntax

The syntax of the def command is as follows. def <name> <signature> <block>

The signature is a list of parameters, flags, and at most one rest argument. You can specify the type of each of them by appending : <type>. Example:

def cmd [
parameter: string
--flag: int
...rest: path
] { ... }

It is possible to add a comment to each of them by appending # Comment text. Example:

def cmd [
parameter # Parameter comment
--flag: int # Flag comment
...rest: path # Rest comment
] { ... }

Flags can have a single-character shorthand form. For example, --output is often abbreviated to -o. You can declare a shorthand by writing (-<shorthand>) after the flag name. Example:

def cmd [
--flag(-f): int # Flag comment
] { ... }

You can make a parameter optional by adding ? to its name. Optional parameters do not need to be passed. (TODO Handling optional parameters in scripts is WIP. Please don't expect it to work seamlessly)

def cmd [
parameter?: path # Optional parameter
] { ... }
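
Putting these pieces together, a custom command can wrap an entire pipeline. A rough, untested sketch (the name biggest is arbitrary):

> def biggest [] { ls | sort-by size | last 1 }
> biggest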

default

This command sets a default value for a row's column if it is missing. Other commands can feed default their output through a pipeline.

# Usage

> [input-command] | default [column-name] [column-value]

# Examples

Let's say we have a table like this:

> open contacts.json
━━━┯━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━
 # │ name     │ email
───┼──────────┼──────────────────
 0 │ paul     │ paul@example.com
 1 │ andres   │
 2 │ jonathan │
━━━┷━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━

default allows us to fill the email column with a default value:

> open contacts.json | default email "no-reply@example.com"
━━━┯━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━
 # │ name     │ email
───┼──────────┼──────────────────────
 0 │ paul     │ paul@example.com
 1 │ andres   │ no-reply@example.com
 2 │ jonathan │ no-reply@example.com
━━━┷━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━
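
default also combines well with other commands. For instance, a sketch that keeps only the contacts that had no email in the first place, by filtering on the placeholder value:

> open contacts.json | default email "no-reply@example.com" | where email == "no-reply@example.com"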

du

du stands for disk usage. It will give you the physical and apparent size of files and folders.

# Examples

> du src/commands
─────────────┬────────────────────────────
 path        │ crates/nu-cli/src/commands
 apparent    │ 655.9 KB
 physical    │ 950.3 KB
 directories │ [table 5 rows]
 files       │
─────────────┴────────────────────────────
> du -a src/commands
─────────────┬────────────────────────────
 path        │ crates/nu-cli/src/commands
 apparent    │ 655.9 KB
 physical    │ 950.3 KB
 directories │ [table 5 rows]
 files       │ [table 118 rows]
─────────────┴────────────────────────────
> du *.rs
───┬──────────┬──────────┬──────────
 # │ path     │ apparent │ physical
───┼──────────┼──────────┼──────────
 0 │ build.rs │     78 B │   4.1 KB
───┴──────────┴──────────┴──────────

echo

Use echo to repeat arguments back to the user.

# Examples

> echo Hello world
───┬───────
 # │
───┼───────
 0 │ Hello
 1 │ world
───┴───────
> echo "Hello, world!"
Hello, world!

empty

Check for empty values. Pass the column names to check for emptiness. Optionally, pass a block as the last parameter to set the contents of empty columns.

# Examples

Check if a value is empty

> echo '' | empty?
true

Given the following meals

> echo [[meal size]; [arepa small] [taco '']]
═══╦═══════╦═══════
 # ║ meal  ║ size
═══╬═══════╬═══════
 0 ║ arepa ║ small
 1 ║ taco  ║
═══╩═══════╩═══════

Show the empty contents

> echo [[meal size]; [arepa small] [taco '']] | empty? meal size
═══╦══════╦══════
 # ║ meal ║ size
═══╬══════╬══════
 0 ║ No   ║ No
 1 ║ No   ║ Yes
═══╩══════╩══════

Let's assume we have a report of totals per day. For simplicity we show just three days: 2020/04/16, 2020/07/10, and 2020/11/16. Like so

> echo [[2020/04/16 2020/07/10 2020/11/16]; ['' 27 37]]
═══╦════════════╦════════════╦════════════
 # ║ 2020/04/16 ║ 2020/07/10 ║ 2020/11/16
═══╬════════════╬════════════╬════════════
 0 ║            ║         27 ║         37
═══╩════════════╩════════════╩════════════

Later on, the report may have many totals logged per day. In this example, we have one total each for the days 2020/07/10 and 2020/11/16, like so

> echo [[2020/04/16 2020/07/10 2020/11/16]; ['' [27] [37]]]
═══╦════════════╦════════════════╦════════════════
 # ║ 2020/04/16 ║ 2020/07/10     ║ 2020/11/16
═══╬════════════╬════════════════╬════════════════
 0 ║            ║ [table 1 rows] ║ [table 1 rows]
═══╩════════════╩════════════════╩════════════════

We want to add two totals (the numbers 33 and 37) for the day 2020/04/16.

Set a table with the two numbers for the empty column:

> echo [[2020/04/16 2020/07/10 2020/11/16]; ['' [27] [37]]] | empty? 2020/04/16 { = [33 37] }
═══╦════════════════╦════════════════╦════════════════
 # ║ 2020/04/16     ║ 2020/07/10     ║ 2020/11/16
═══╬════════════════╬════════════════╬════════════════
 0 ║ [table 2 rows] ║ [table 1 rows] ║ [table 1 rows]
═══╩════════════════╩════════════════╩════════════════

Checking all the numbers

> echo [[2020/04/16 2020/07/10 2020/11/16]; ['' [27] [37]]] | empty? 2020/04/16 { = [33 37] } | pivot _ totals | get totals
═══╦════
 0 ║ 33
 1 ║ 37
 2 ║ 27
 3 ║ 37
═══╩════

enter

This command creates a new shell and begins at this path.

# Examples

/home/foobar> cat user.json
{
    "Name": "Peter",
    "Age": 30,
    "Telephone": 88204828,
    "Country": "Singapore"
}
/home/foobar> enter user.json
/> ls
━━━━━━━┯━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━
 Name  │ Age │ Telephone │ Country
───────┼─────┼───────────┼───────────
 Peter │  30 │  88204828 │ Singapore
━━━━━━━┷━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━
/> exit
/home/foobar>

It also provides the ability to work with multiple directories at the same time. This command will allow you to create a new "shell" and enter it at the specified path. You can toggle between this new shell and the original shell with the p (for previous) and n (for next) commands, allowing you to navigate around a ring buffer of shells. Once you're done with a shell, you can exit it and remove it from the ring buffer.

/> enter /tmp
/tmp> enter /usr
/usr> enter /bin
/bin> enter /opt
/opt> p
/bin> p
/usr> p
/tmp> p
/> n
/tmp>

# Note

If you enter a JSON file whose top level is a list, this will open one new shell for each list element.

/private/tmp> printf "1\\n2\\n3\\n" | lines | save foo.json
/private/tmp> enter foo.json
/> shells
───┬────────┬─────────────────────────┬──────────────
 # │ active │ name                    │ path
───┼────────┼─────────────────────────┼──────────────
 0 │        │ filesystem              │ /private/tmp
 1 │        │ {/private/tmp/foo.json} │ /
 2 │        │ {/private/tmp/foo.json} │ /
 3 │ X      │ {/private/tmp/foo.json} │ /
───┴────────┴─────────────────────────┴──────────────
/>

every

Selects every n-th row of a table, starting from the first one. With the --skip flag, every n-th row will be skipped, inverting the original functionality.

Syntax: > [input-command] | every <stride> {flags}

# Flags

  • --skip, -s: Skip the rows that would be returned, instead of selecting them

# Examples

> open contacts.csv
───┬─────────┬──────┬─────────────────
 # │ first   │ last │ email
───┼─────────┼──────┼─────────────────
 0 │ John    │ Doe  │ doe.1@email.com
 1 │ Jane    │ Doe  │ doe.2@email.com
 2 │ Chris   │ Doe  │ doe.3@email.com
 3 │ Francis │ Doe  │ doe.4@email.com
 4 │ Stella  │ Doe  │ doe.5@email.com
───┴─────────┴──────┴─────────────────
> open contacts.csv | every 2
───┬─────────┬──────┬─────────────────
 # │ first   │ last │ email
───┼─────────┼──────┼─────────────────
 0 │ John    │ Doe  │ doe.1@email.com
 2 │ Chris   │ Doe  │ doe.3@email.com
 4 │ Stella  │ Doe  │ doe.5@email.com
───┴─────────┴──────┴─────────────────
> open contacts.csv | every 2 --skip
───┬─────────┬──────┬─────────────────
 # │ first   │ last │ email
───┼─────────┼──────┼─────────────────
 1 │ Jane    │ Doe  │ doe.2@email.com
 3 │ Francis │ Doe  │ doe.4@email.com
───┴─────────┴──────┴─────────────────

exit

Exits the nu shell. If you have multiple nu shells, use exit --now to exit all of them.

# Examples

> exit
> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼─────────────────────────────────────
 0 │   │ filesystem │ /home/jonathanturner/Source/nushell
 1 │   │ filesystem │ /home
 2 │ X │ filesystem │ /usr
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
> exit
> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼─────────────────────────────────────
 0 │   │ filesystem │ /home/jonathanturner/Source/nushell
 1 │ X │ filesystem │ /home
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
> exit --now
exits both the shells

fetch

This command loads content from a URL into a cell and converts it to a table if possible (append the --raw flag to avoid the conversion).

# Examples

> fetch http://headers.jsontest.com
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━
 X-Cloud-Trace-Context                                 │ Accept │ Host                 │ Content-Length │ user-agent
───────────────────────────────────────────────────────┼────────┼──────────────────────┼────────────────┼─────────────────────────
 aeee1a8abf08820f6fe19d114dc3bb87/16772233176633589121 │ */*    │ headers.jsontest.com │ 0              │ curl/7.54.0 isahc/0.7.1
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━
> fetch http://headers.jsontest.com --raw
{
   "X-Cloud-Trace-Context": "aeee1a8abf08820f6fe19d114dc3bb87/16772233176633589121",
   "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3",
   "Upgrade-Insecure-Requests": "1",
   "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3865.90 Safari/537.36",
   "Host": "headers.jsontest.com",
   "Accept-Language": "en-GB,en-US;q=0.9,en;q=0.8"
}
> fetch https://www.jonathanturner.org/feed.xml
━━━━━━━━━━━━━━━━
 rss
────────────────
 [table: 1 row]
━━━━━━━━━━━━━━━━
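
Because the fetched value is converted to a table, the usual pipeline commands apply to it. A sketch that pulls a single column out of the JSON response shown above:

> fetch http://headers.jsontest.com | get Host

For the response above this should return headers.jsontest.com.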

first

Use first to retrieve the first "n" rows of a table. first has a required amount parameter that indicates how many rows you would like returned. If more than one row is returned, an index column will be included showing the row number.

# Examples

> ps | first 1
─────────┬──────────────────
 pid     │ 14733
 name    │ nu_plugin_core_p
 status  │ Running
 cpu     │ 4.1229
 mem     │ 2.1 MB
 virtual │ 4.8 GB
─────────┴──────────────────

> ps | first 5
───┬───────┬──────────────────┬─────────┬──────────┬─────────┬─────────
 # │ pid   │ name             │ status  │ cpu      │ mem     │ virtual
───┼───────┼──────────────────┼─────────┼──────────┼─────────┼─────────
 0 │ 14747 │ nu_plugin_core_p │ Running │   3.5653 │  2.1 MB │  4.8 GB
 1 │ 14735 │ Python           │ Running │ 100.0008 │ 27.4 MB │  5.4 GB
 2 │ 14734 │ mdworker_shared  │ Running │   0.0000 │ 18.4 MB │  4.7 GB
 3 │ 14729 │ mdworker_shared  │ Running │   0.0000 │  8.2 MB │  5.0 GB
 4 │ 14728 │ mdworker_shared  │ Running │   0.0000 │  8.0 MB │  4.9 GB
───┴───────┴──────────────────┴─────────┴──────────┴─────────┴─────────
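
first pairs naturally with sort-by for "top n" style questions. A rough sketch that shows the three smallest entries returned by ls:

> ls | sort-by size | first 3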

format

Format columns into a string using a simple pattern

Syntax: format <pattern>

# Parameters

  • <pattern>: the pattern to match

# Example

Let's say we have a table like this:

> open pets.csv
━━━┯━━━━━━━━━━━┯━━━━━━━━┯━━━━━
 # │ animal    │ name   │ age
───┼───────────┼────────┼─────
 0 │ cat       │ Tom    │ 7
 1 │ dog       │ Alfred │ 10
 2 │ chameleon │ Linda  │ 1
━━━┷━━━━━━━━━━━┷━━━━━━━━┷━━━━━

format allows us to convert table data into a string by following a formatting pattern. To print the value of a column we have to put the column name in curly brackets:

> open pets.csv | format "{name} is a {age} year old {animal}"
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │
───┼─────────────────────────────────
 0 │ Tom is a 7 year old cat
 1 │ Alfred is a 10 year old dog
 2 │ Linda is a 1 year old chameleon
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

from csv

Converts csv data into table. Use this when nushell cannot determine the input file extension.

# Example

Let's say we have the following file:

> cat pets.txt
animal, name, age
cat, Tom, 7
dog, Alfred, 10
chameleon, Linda, 1

pets.txt is actually a .csv file, but because it has the .txt extension, open is not able to convert it into a table:

> open pets.txt
animal, name, age
cat, Tom, 7
dog, Alfred, 10
chameleon, Linda, 1

To get a table from pets.txt we need to use the from csv command:

> open pets.txt | from csv
━━━┯━━━━━━━━━━━┯━━━━━━━━━┯━━━━━━
 # │ animal    │  name   │  age
───┼───────────┼─────────┼──────
 0 │ cat       │  Tom    │  7
 1 │ dog       │  Alfred │  10
 2 │ chameleon │  Linda  │  1
━━━┷━━━━━━━━━━━┷━━━━━━━━━┷━━━━━━

To ignore the csv headers, use --headerless:

> open pets.txt | from csv --headerless
━━━┯━━━━━━━━━━━┯━━━━━━━━━┯━━━━━━━━━
 # │ Column1   │ Column2 │ Column3
───┼───────────┼─────────┼─────────
 0 │ dog       │  Alfred │  10
 1 │ chameleon │  Linda  │  1
━━━┷━━━━━━━━━━━┷━━━━━━━━━┷━━━━━━━━━

To split on a character other than ',' use --separator:

> open pets.txt
animal; name; age
cat; Tom; 7
dog; Alfred; 10
chameleon; Linda; 1
> open pets.txt | from csv --separator ';'
━━━┯━━━━━━━━━━━┯━━━━━━━━━┯━━━━━━
 # │ animal    │  name   │  age
───┼───────────┼─────────┼──────
 0 │ cat       │  Tom    │  7
 1 │ dog       │  Alfred │  10
 2 │ chameleon │  Linda  │  1
━━━┷━━━━━━━━━━━┷━━━━━━━━━┷━━━━━━

To use this command to open a csv with a separator other than a comma, use the --raw switch of open; otherwise the csv will reach from csv as a table already split on commas rather than as raw text.

> mv pets.txt pets.csv
> open pets.csv | from csv --separator ';'
error: Expected a string from pipeline
- shell:1:16
1 | open pets.csv | from csv --separator ';'
  |                 ^^^^^^^^ requires string input
- shell:1:0
1 | open pets.csv | from csv --separator ';'
  |  value originates from here

> open pets.csv --raw | from csv --separator ';'
━━━┯━━━━━━━━━━━┯━━━━━━━━━┯━━━━━━
 # │ animal    │  name   │  age
───┼───────────┼─────────┼──────
 0 │ cat       │  Tom    │  7
 1 │ dog       │  Alfred │  10
 2 │ chameleon │  Linda  │  1
━━━┷━━━━━━━━━━━┷━━━━━━━━━┷━━━━━━

The string '\t' can be used to separate on tabs. Note that this is the same as using the from tsv command.

Newlines '\n' are not acceptable separators.

Note that separators are currently provided as strings and need to be wrapped in quotes.

> open pets.csv --raw | from csv --separator ;
- shell:1:43
1 | open pets.csv --raw | from csv --separator ;
  |                                            ^

It is also considered an error to use a separator longer than one character:

> open pets.txt | from csv --separator '123'
error: Expected a single separator char from --separator
- shell:1:37
1 | open pets.txt | from csv --separator '123'
  |                                      ^^^^^ requires a single character string input

from ics

Parse text as .ics and create table.

Syntax: from ics

# Examples

Suppose calendar.txt is a text file that is formatted like a .ics (iCal) file:

> open calendar.txt
BEGIN:VCALENDAR
BEGIN:VEVENT
DTSTART:20171007T200000Z
DTEND:20171007T233000Z
DTSTAMP:20200319T182138Z
SUMMARY:Basketball Game
UID:4l80f6dcovnriq38g57g07btid@google.com
...

Pass the output of the open command to from ics to get a correctly formatted table:

> open calendar.txt | from ics
───┬────────────────┬──────────────────┬────────────────┬────────────────┬────────────────┬────────────────┬────────────────
 # │ properties     │ events           │ alarms         │ to-Dos         │ journals       │ free-busys     │ timezones
───┼────────────────┼──────────────────┼────────────────┼────────────────┼────────────────┼────────────────┼────────────────
 0 │ [table 0 rows] │ [table 1 row]    │ [table 0 rows] │ [table 0 rows] │ [table 0 rows] │ [table 0 rows] │ [table 0 rows]
───┴────────────────┴──────────────────┴────────────────┴────────────────┴────────────────┴────────────────┴────────────────
> open calendar.txt | from ics | get events | get properties | where name == "SUMMARY"
─────┬─────────┬───────────────────────────────────────┬────────
 #   │ name    │ value                                 │ params
─────┼─────────┼───────────────────────────────────────┼────────
   0 │ SUMMARY │ Basketball Game                       │

from ini

Converts ini data into table. Use this when nushell cannot determine the input file extension.

# Example

Let's say we have the following .txt file:

> open sample.txt
[SectionOne]

key = value
integer = 1234
string1 = 'Case 1'

This file is actually a ini file, but the file extension isn't .ini. That's okay, we can use the from ini command:

> open sample.txt | from ini | get SectionOne
━━━━━━━┯━━━━━━━━━┯━━━━━━━━━━
 key   │ integer │ string1
───────┼─────────┼──────────
 value │ 1234    │ 'Case 1'
━━━━━━━┷━━━━━━━━━┷━━━━━━━━━━

from json

Parse text as .json and create table. Use this when nushell cannot determine the input file extension.

Syntax: from json {flags}

# Flags

--objects
  treat each line as a separate value

# Examples

> open command_from-json
[
    {
        title: "from json",
        type: "command",
        flags: true
    }
]
> open command_from-json | from json
━━━━━━━━━━━┯━━━━━━━━━┯━━━━━━━
 title     │ type    │ flags
───────────┼─────────┼───────
 from json │ command │ Yes
━━━━━━━━━━━┷━━━━━━━━━┷━━━━━━━
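
The --objects flag is useful when every line of the input is its own JSON value (JSON-lines style data). A hedged sketch, assuming a hypothetical file events.txt in which each line contains one JSON object:

> open events.txt | from json --objects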

from ods

Parses OpenDocument Spreadsheet binary data into a table. open calls from ods automatically when the file extension is ods. Use this command when open is unable to guess the file type from the extension.

# Examples

> open abc.ods
─────────────────
 Sheet1
─────────────────
 [table 26 rows]
─────────────────
> open abc.ods --raw
Length: 4816 (0x12d0) bytes
0000:   50 4b 03 04  14 00 00 00  00 00 00 00  00 00 85 6c   PK.............l
0010:   39 8a 2e 00  00 00 2e 00  00 00 08 00  00 00 6d 69   9.............mi
0020:   6d 65 74 79  70 65 61 70  70 6c 69 63  61 74 69 6f   metypeapplicatio
...
12a0:   00 61 10 00  00 4d 45 54  41 2d 49 4e  46 2f 6d 61   .a...META-INF/ma
12b0:   6e 69 66 65  73 74 2e 78  6d 6c 50 4b  05 06 00 00   nifest.xmlPK....
12c0:   00 00 06 00  06 00 5a 01  00 00 60 11  00 00 00 00   ......Z...`.....
> open abc.ods --raw | from ods
─────────────────
 Sheet1
─────────────────
 [table 26 rows]
─────────────────

from toml

Converts toml data into table. Use this when nushell cannot determine the input file extension.

# Example

Let's say we have the following Rust .lock file:

> open Cargo.lock
# This file is automatically @generated by Cargo.
# It is not intended for manual editing.
[[package]]
name = "adler32"
version = "1.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
...

The "Cargo.lock" file is actually a .toml file, but the file extension isn't .toml. That's okay, we can use the from toml command:

> open Cargo.lock | from toml
─────────┬──────────────────
 package │ [table 459 rows]
─────────┴──────────────────

from tsv

Parse text as .tsv and create table.

Syntax: from tsv {flags}

# Flags

--headerless
  don't treat the first row as column names

# Examples

Let's say we have the following file which is formatted like a tsv file:

> open elements.txt
Symbol        Element
H        Hydrogen
He        Helium
Li        Lithium
Be        Beryllium

If we pass the output of the open command to from tsv we get a correctly formatted table:

> open elements.txt | from tsv
━━━┯━━━━━━━━┯━━━━━━━━━━━
 # │ Symbol │ Element
───┼────────┼───────────
 0 │ H      │ Hydrogen
 1 │ He     │ Helium
 2 │ Li     │ Lithium
 3 │ Be     │ Beryllium
━━━┷━━━━━━━━┷━━━━━━━━━━━

Using the --headerless flag has the following output:

> open elements.txt | from tsv --headerless
━━━━┯━━━━━━━━━┯━━━━━━━━━━━
 #  │ Column1 │ Column2
────┼─────────┼───────────
  0 │ Symbol  │ Element
  1 │ H       │ Hydrogen
  2 │ He      │ Helium
  3 │ Li      │ Lithium
  4 │ Be      │ Beryllium
━━━━┷━━━━━━━━━┷━━━━━━━━━━━

from url

Parse a url-encoded string as a table.

# Example

> echo 'bread=baguette&cheese=comt%C3%A9&meat=ham&fat=butter' | from url
────────┬──────────
 bread  │ baguette
 cheese │ comté
 meat   │ ham
 fat    │ butter
────────┴──────────

from vcf

Parse text as .vcf and create table.

Syntax: from vcf

# Examples

Suppose contacts.txt is a text file that is formatted like a .vcf (vCard) file:

> open contacts.txt
BEGIN:VCARD
VERSION:3.0
FN:John Doe
N:Doe;John;;;
EMAIL;TYPE=INTERNET:john.doe99@gmail.com
...

Pass the output of the open command to from vcf to get a correctly formatted table:

> open contacts.txt | from vcf
─────┬─────────────────
 #   │ properties
─────┼─────────────────
   0 │ [table 8 rows]
> open contacts.txt | from vcf | get properties | where $it.name == "FN" | select value
─────┬──────────────────────
 #   │
─────┼──────────────────────
   0 │ John Doe

from xlsx

Parses MS Excel binary data into a table. open calls from xlsx automatically when the file extension is xlsx. Use this command when open is unable to guess the file type from the extension.

# Examples

> open abc.xlsx
─────────────────
 Sheet1
─────────────────
 [table 26 rows]
─────────────────
> open abc.xlsx --raw
Length: 6344 (0x18c8) bytes
0000:   50 4b 03 04  14 00 00 00  08 00 00 00  00 00 d5 5f   PK............._
0010:   a7 48 68 01  00 00 23 05  00 00 13 00  00 00 5b 43   .Hh...#.......[C
0020:   6f 6e 74 65  6e 74 5f 54  79 70 65 73  5d 2e 78 6d   ontent_Types].xm
...
18a0:   6b 73 68 65  65 74 73 2f  73 68 65 65  74 31 2e 78   ksheets/sheet1.x
18b0:   6d 6c 50 4b  05 06 00 00  00 00 0a 00  0a 00 7f 02   mlPK............
18c0:   00 00 33 16  00 00 00 00                             ..3.....
> open abc.xlsx --raw | from xlsx
─────────────────
 Sheet1
─────────────────
 [table 26 rows]
─────────────────

from xml

Parse text as .xml and create table. Use this when nushell cannot determine the input file extension.

Syntax: from xml

# Examples

Let's say we've got a file in xml format, but the file extension is different, so Nu can't parse it automatically:

> open world.txt
<?xml version="1.0" encoding="utf-8"?>
<world>
    <continent>Africa</continent>
    <continent>Antarctica</continent>
    <continent>Asia</continent>
    <continent>Australia</continent>
    <continent>Europe</continent>
    <continent>North America</continent>
    <continent>South America</continent>
</world>

We can use from xml to read the input like a xml file:

> open world.txt | from xml
━━━━━━━━━━━━━━━━
 world
────────────────
 [table 7 rows]
━━━━━━━━━━━━━━━━

from yaml

Parse text as .yaml/.yml and create table. Use this when nushell cannot determine the input file extension.

Syntax: from yaml

# Examples

> open command_from-yaml
title: from-yaml
type: command
flags: false
> open command_from-yaml | from yaml
━━━━━━━━━━━┯━━━━━━━━━┯━━━━━━━
 title     │ type    │ flags
───────────┼─────────┼───────
 from-yaml │ command │ No
━━━━━━━━━━━┷━━━━━━━━━┷━━━━━━━

from

Converts content (string or binary) into a table. The source format is specified as a subcommand, like from csv or from json.

Use this when nushell cannot determine the input file extension.

# Available Subcommands

The available subcommands are from csv, from ics, from ini, from json, from ods, from toml, from tsv, from url, from vcf, from xlsx, from xml, and from yaml; each is documented in its own section of this reference.

# Example for from csv

Let's say we have the following file:

> cat pets.txt
animal, name, age
cat, Tom, 7
dog, Alfred, 10
chameleon, Linda, 1

pets.txt is actually a .csv file, but because it has the .txt extension, open is not able to convert it into a table:

> open pets.txt
animal, name, age
cat, Tom, 7
dog, Alfred, 10
chameleon, Linda, 1

To get a table from pets.txt we need to use the from csv command:

> open pets.txt | from csv
━━━┯━━━━━━━━━━━┯━━━━━━━━━┯━━━━━━
 # │ animal    │  name   │  age
───┼───────────┼─────────┼──────
 0 │ cat       │  Tom    │  7
 1 │ dog       │  Alfred │  10
 2 │ chameleon │  Linda  │  1
━━━┷━━━━━━━━━━━┷━━━━━━━━━┷━━━━━━

get

Open given cells as text.

Syntax: get ...args

# Parameters

  • args: optionally return additional data by path

# Examples

If we run sys we receive a table which itself contains tables:

> sys
─────────┬─────────────────────────────────────────
 host    │ [row 7 columns]
 cpu     │ [row cores current ghz max ghz min ghz]
 disks   │ [table 4 rows]
 mem     │ [row free swap free swap total total]
 net     │ [table 19 rows]
 battery │ [table 1 rows]
─────────┴─────────────────────────────────────────

To access one of the embedded tables we can use the get command

> sys | get cpu
─────────────┬────────
 cores       │ 16
 current ghz │ 2.4000
 min ghz     │ 2.4000
 max ghz     │ 2.4000
─────────────┴────────
> sys | get battery
───────────────┬──────────
 vendor        │ DSY
 model         │ bq40z651
 cycles        │ 43
 mins to empty │ 70.0000
───────────────┴──────────

There's also the ability to pass multiple parameters to get, which results in output like this:

> sys | get cpu battery
───┬───────┬─────────────┬─────────┬─────────
 # │ cores │ current ghz │ min ghz │ max ghz
───┼───────┼─────────────┼─────────┼─────────
 0 │    16 │      2.4000 │  2.4000 │  2.4000
───┴───────┴─────────────┴─────────┴─────────
───┬────────┬──────────┬────────┬───────────────
 # │ vendor │ model    │ cycles │ mins to empty
───┼────────┼──────────┼────────┼───────────────
 1 │ DSY    │ bq40z651 │     43 │       70.0000
───┴────────┴──────────┴────────┴───────────────

group-by

This command creates a new table with the data from the table rows grouped by the column given.

# Examples

Let's say we have this table of all countries in the world sorted by their population:

> open countries_by_population.json | from json | first 10
━━━┯━━━━━━┯━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━┯━━━━━━━━
 # │ rank │ country or area │ UN continental region │ UN statistical region │ population 2018 │ population 2019 │ change
───┼──────┼─────────────────┼───────────────────────┼───────────────────────┼─────────────────┼─────────────────┼────────
 0 │ 1    │ China           │ Asia                  │ Eastern Asia          │ 1,427,647,786   │ 1,433,783,686   │ +0.4%
 1 │ 2    │ India           │ Asia                  │ Southern Asia         │ 1,352,642,280   │ 1,366,417,754   │ +1.0%
 2 │ 3    │ United States   │ Americas              │ Northern America      │ 327,096,265     │ 329,064,917     │ +0.6%
 3 │ 4    │ Indonesia       │ Asia                  │ South-eastern Asia    │ 267,670,543     │ 270,625,568     │ +1.1%
 4 │ 5    │ Pakistan        │ Asia                  │ Southern Asia         │ 212,228,286     │ 216,565,318     │ +2.0%
 5 │ 6    │ Brazil          │ Americas              │ South America         │ 209,469,323     │ 211,049,527     │ +0.8%
 6 │ 7    │ Nigeria         │ Africa                │ Western Africa        │ 195,874,683     │ 200,963,599     │ +2.6%
 7 │ 8    │ Bangladesh      │ Asia                  │ Southern Asia         │ 161,376,708     │ 163,046,161     │ +1.0%
 8 │ 9    │ Russia          │ Europe                │ Eastern Europe        │ 145,734,038     │ 145,872,256     │ +0.1%
 9 │ 10   │ Mexico          │ Americas              │ Central America       │ 126,190,788     │ 127,575,529     │ +1.1%
━━━┷━━━━━━┷━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━┷━━━━━━━━

Here we have listed only the first 10 lines. In total this table has 233 rows, which is too big to get information out of easily.

We can use the group-by command on 'UN continental region' to create a table per continental region.

> open countries_by_population.json | from json | group-by "UN continental region"
━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━
 Asia             │ Americas         │ Africa           │ Europe           │ Oceania
──────────────────┼──────────────────┼──────────────────┼──────────────────┼──────────────────
 [table: 51 rows][table: 53 rows][table: 58 rows][table: 48 rows][table: 23 rows]
━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━

Now we can already get some information like "which continental regions are there" and "how many countries are in each region". If we want to see only the countries in the continental region of Oceania we can type:

> open countries_by_population.json | from json | group-by "UN continental region" | get Oceania
━━━━┯━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━┯━━━━━━━━
 #  │ rank │ country or area                │ UN continental region │ UN statistical region     │ population 2018 │ population 2019 │ change
────┼──────┼────────────────────────────────┼───────────────────────┼───────────────────────────┼─────────────────┼─────────────────┼────────
  0 │ 55   │ Australia                      │ Oceania               │ Australia and New Zealand │ 24,898,152      │ 25,203,198      │ +1.2%
  1 │ 98   │ Papua New Guinea               │ Oceania               │ Melanesia                 │ 8,606,323       │ 8,776,109       │ +2.0%
  2 │ 125  │ New Zealand                    │ Oceania               │ Australia and New Zealand │ 4,743,131       │ 4,783,063       │ +0.8%
  3 │ 161  │ Fiji                           │ Oceania               │ Melanesia                 │ 883,483         │ 889,953         │ +0.7%
  4 │ 166  │ Solomon Islands                │ Oceania               │ Melanesia                 │ 652,857         │ 669,823         │ +2.6%
  5 │ 181  │ Vanuatu                        │ Oceania               │ Melanesia                 │ 292,680         │ 299,882         │ +2.5%
  6 │ 183  │ New Caledonia                  │ Oceania               │ Melanesia                 │ 279,993         │ 282,750         │ +1.0%
  7 │ 185  │ French Polynesia               │ Oceania               │ Polynesia                 │ 277,679         │ 279,287         │ +0.6%
  8 │ 188  │ Samoa                          │ Oceania               │ Polynesia                 │ 196,129         │ 197,097         │ +0.5%
  9 │ 191  │ Guam                           │ Oceania               │ Micronesia                │ 165,768         │ 167,294         │ +0.9%
 10 │ 193  │ Kiribati                       │ Oceania               │ Micronesia                │ 115,847         │ 117,606         │ +1.5%
 11 │ 194  │ Federated States of Micronesia │ Oceania               │ Micronesia                │ 112,640         │ 113,815         │ +1.0%
 12 │ 196  │ Tonga                          │ Oceania               │ Polynesia                 │ 110,589         │ 110,940         │ +0.3%
 13 │ 207  │ Marshall Islands               │ Oceania               │ Micronesia                │ 58,413          │ 58,791          │ +0.6%
 14 │ 209  │ Northern Mariana Islands       │ Oceania               │ Micronesia                │ 56,882          │ 56,188          │ −1.2%
 15 │ 210  │ American Samoa                 │ Oceania               │ Polynesia                 │ 55,465          │ 55,312          │ −0.3%
 16 │ 221  │ Palau                          │ Oceania               │ Micronesia                │ 17,907          │ 18,008          │ +0.6%
 17 │ 222  │ Cook Islands                   │ Oceania               │ Polynesia                 │ 17,518          │ 17,548          │ +0.2%
 18 │ 224  │ Tuvalu                         │ Oceania               │ Polynesia                 │ 11,508          │ 11,646          │ +1.2%
 19 │ 225  │ Wallis and Futuna              │ Oceania               │ Polynesia                 │ 11,661          │ 11,432          │ −2.0%
 20 │ 226  │ Nauru                          │ Oceania               │ Micronesia                │ 10,670          │ 10,756          │ +0.8%
 21 │ 231  │ Niue                           │ Oceania               │ Polynesia                 │ 1,620           │ 1,615           │ −0.3%
 22 │ 232  │ Tokelau                        │ Oceania               │ Polynesia                 │ 1,319           │ 1,340           │ +1.6%
━━━━┷━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━┷━━━━━━━━
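
Each group is itself a regular table, so the commands above apply to it as well. For example, a small sketch that counts how many countries fall into the Oceania group:

> open countries_by_population.json | from json | group-by "UN continental region" | get Oceania | count

This should return 23, matching the rows shown above.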

headers

Use headers to turn the first row of a table into meaningful column names.

As demonstrated in the following example, it's particularly handy when working with spreadsheets.

# Examples

> open sample_data.ods | get SalesOrders
────┬────────────┬─────────┬──────────┬─────────┬─────────┬───────────┬───────────
 #  │  Column0   │ Column1 │ Column2  │ Column3 │ Column4 │  Column5  │  Column6
────┼────────────┼─────────┼──────────┼─────────┼─────────┼───────────┼───────────
  0 │ OrderDate  │ Region  │ Rep      │ Item    │ Units   │ Unit Cost │ Total
  1 │ 2018-01-06 │ East    │ Jones    │ Pencil  │ 95.0000 │ 1.9900    │ 189.0500
> open sample_data.ods | get SalesOrders | headers
────┬────────────┬─────────┬──────────┬─────────┬─────────┬───────────┬───────────
 #  │ OrderDate  │ Region  │   Rep    │  Item   │  Units  │ Unit Cost │   Total
────┼────────────┼─────────┼──────────┼─────────┼─────────┼───────────┼───────────
  0 │ 2018-01-06 │ East    │ Jones    │ Pencil  │ 95.0000 │ 1.9900    │ 189.0500
  1 │ 2018-01-23 │ Central │ Kivell   │ Binder  │ 50.0000 │ 19.9900   │ 999.4999

help

Use help for more information on a command. Use help commands to list all available commands. Use help <command name> to display help about a particular command.

# Examples

> help
Welcome to Nushell.

Here are some tips to help you get started.
  * help commands - list all available commands
  * help <command name> - display help about a particular command

Nushell works on the idea of a "pipeline". Pipelines are commands connected with the '|' character.
Each stage in the pipeline works together to load, parse, and display information to you.

[Examples]

List the files in the current directory, sorted by size:
    ls | sort-by size

Get information about the current system:
    sys | get host

Get the processes on your system actively using CPU:
    ps | where cpu > 0

You can also learn more at https://www.nushell.sh/book/
> help commands
────┬──────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 #  │ name         │ description
────┼──────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 0  │ alias        │ Define a shortcut for another command.
 1  │ append       │ Append the given row to the table
 2  │ autoview     │ View the contents of the pipeline as a table or list.
 3  │ build-string │ Builds a string from the arguments
 4  │ cal          │ Display a calendar.
 5  │ calc         │ Parse a math expression into a number
...
 83 │ where        │ Filter table to match the condition.
 84 │ which        │ Finds a program file.
 85 │ with-env     │ Runs a block with an environment set. Eg) with-env [NAME 'foo'] { echo $nu.env.NAME }
 86 │ wrap         │ Wraps the given data in a table.
────┴──────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
> help cd
Change to a new path.

Usage:
  > cd (directory) {flags}

Parameters:
  (directory) the directory to change to

Flags:
  -h, --help: Display this help message

Examples:
  Change to a new directory called 'dirname'
  > cd dirname

  Change to your home directory
  > cd

  Change to your home directory (alternate version)
  > cd ~

  Change to the previous directory
  > cd -

histogram

Creates a new table with a histogram based on the column name passed in.

Syntax: histogram <column_name> ...args

# Parameters

  • <column_name>: name of the column to graph by
  • args: column name to give the histogram's frequency column

# Examples

Let's say we have this file random_numbers.csv which contains 50 random numbers.

Note: The input doesn't have to be numbers; it works on strings too. Try it out.

> open random_numbers.csv
────┬────────────────
 #  │ random numbers
────┼────────────────
  0 │ 5
  1 │ 2
  2 │ 0
...
 47 │ 1
 48 │ 1
 49 │ 2
────┴────────────────

If we now want to see how often the different numbers were generated, we can use the histogram function:

> open random_numbers.csv | histogram "random numbers"
───┬────────────────┬─────────────┬────────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────
 # │ random numbers │    count    │ percentage │ frequency
───┼────────────────┼─────────────┼────────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────
 0 │ 0              │ 8           │ 57.14%     │ *********************************************************
 1 │ 1              │ 14          │ 100.00%    │ ****************************************************************************************************
 2 │ 2              │ 9           │ 64.29%     │ ****************************************************************
 3 │ 3              │ 6           │ 42.86%     │ ******************************************
 4 │ 4              │ 3           │ 21.43%     │ *********************
 5 │ 5              │ 10          │ 71.43%     │ ***********************************************************************
───┴────────────────┴─────────────┴────────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────

We can also set the name of the second column or sort the table:

> open random_numbers.csv | histogram "random numbers" probability
───┬────────────────┬─────────────┬────────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────
 # │ random numbers │    count    │ percentage │ probability
───┼────────────────┼─────────────┼────────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────
 0 │ 0              │ 8           │ 57.14%     │ *********************************************************
 1 │ 1              │ 14          │ 100.00%    │ ****************************************************************************************************
 2 │ 2              │ 9           │ 64.29%     │ ****************************************************************
 3 │ 3              │ 6           │ 42.86%     │ ******************************************
 4 │ 4              │ 3           │ 21.43%     │ *********************
 5 │ 5              │ 10          │ 71.43%     │ ***********************************************************************
───┴────────────────┴─────────────┴────────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────

> open random_numbers.csv | histogram "random numbers" probability | sort-by probability
───┬────────────────┬─────────────┬────────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────
 # │ random numbers │    count    │ percentage │ probability
───┼────────────────┼─────────────┼────────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────
 0 │ 4              │ 3           │ 21.43%     │ *********************
 1 │ 3              │ 6           │ 42.86%     │ ******************************************
 2 │ 0              │ 8           │ 57.14%     │ *********************************************************
 3 │ 2              │ 9           │ 64.29%     │ ****************************************************************
 4 │ 5              │ 10          │ 71.43%     │ ***********************************************************************
 5 │ 1              │ 14          │ 100.00%    │ ****************************************************************************************************
───┴────────────────┴─────────────┴────────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────

Of course, histogram operations are not restricted to just analyzing numbers in files, you can also analyze your directories

> ls -la | histogram type | sort-by count
───┬─────────┬─────────────┬────────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────
 # │ type    │    count    │ percentage │ frequency
───┼─────────┼─────────────┼────────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────
 0 │ Dir     │ 5           │ 4.76%      │ ****
 1 │ Symlink │ 28          │ 26.67%     │ **************************
 2 │ File    │ 105         │ 100.00%    │ ****************************************************************************************************
───┴─────────┴─────────────┴────────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────

history

Displays the last 100 commands.

# Example

> history
─────┬────────────────────────────────────────────────────────────────────────
  #  │
─────┼────────────────────────────────────────────────────────────────────────
...
  97 │ date
  98 │ ls
  99 │ ls -la
─────┴────────────────────────────────────────────────────────────────────────

inc

This command increments a value by one. With the --major, --minor, or --patch flag, it instead bumps the corresponding part of a semantic version.

# Examples

> open rustfmt.toml
─────────┬──────
 edition │ 2018
─────────┴──────
> open rustfmt.toml | inc edition
─────────┬──────
 edition │ 2019
─────────┴──────
> open Cargo.toml | get package.version
0.15.1
> open Cargo.toml | inc package.version --major | get package.version
1.0.0
> open Cargo.toml | inc package.version --minor | get package.version
0.16.0
> open Cargo.toml | inc package.version --patch | get package.version
0.15.2

insert

This command adds a column to any table output. The first parameter takes the heading, the second parameter takes the value for all the rows.

# Examples

> ls | insert is_on_a_computer yes_obviously
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━
 # │ name                       │ type │ readonly │ size   │ accessed  │ modified  │ is_on_a_computer
───┼────────────────────────────┼──────┼──────────┼────────┼───────────┼───────────┼──────────────────
 0 │ zeusiscrazy.txt            │ File │          │ 556 B  │ a day ago │ a day ago │ yes_obviously
 1 │ coww.txt                   │ File │          │  24 B  │ a day ago │ a day ago │ yes_obviously
 2 │ randomweirdstuff.txt       │ File │          │ 197 B  │ a day ago │ a day ago │ yes_obviously
 3 │ abaracadabra.txt           │ File │          │ 401 B  │ a day ago │ a day ago │ yes_obviously
 4 │ youshouldeatmorecereal.txt │ File │          │ 768 B  │ a day ago │ a day ago │ yes_obviously
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━
> shells | insert os linux_on_this_machine
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path                           │ os
───┼───┼────────────┼────────────────────────────────┼───────────────────────
 0 │ X │ filesystem │ /home/shaurya/stuff/expr/stuff │ linux_on_this_machine
 1 │   │ filesystem │ /                              │ linux_on_this_machine
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━

last

Use last to retrieve the last "n" rows of a table. last has a required amount parameter that indicates how many rows you would like returned. If more than one row is returned, an index column will be included showing the row number. last does not alter the order of the rows of the table.

# Examples

> ps | last 1
─────────┬─────────────
 pid     │ 167
 name    │ loginwindow
 status  │ Running
 cpu     │ 0.0000
 mem     │ 461.2 MB
 virtual │ 7.2 GB
─────────┴─────────────
> ps | last 5
───┬─────┬─────────────────┬─────────┬────────┬──────────┬─────────
 # │ pid │ name            │ status  │ cpu    │ mem      │ virtual
───┼─────┼─────────────────┼─────────┼────────┼──────────┼─────────
 0 │ 334 │ knowledge-agent │ Running │ 0.0000 │  53.7 MB │  6.7 GB
 1 │ 332 │ UserEventAgent  │ Running │ 0.0000 │  22.1 MB │  6.6 GB
 2 │ 326 │ cfprefsd        │ Running │ 0.0000 │   8.1 MB │  5.6 GB
 3 │ 325 │ coreauthd       │ Running │ 0.0000 │   9.7 MB │  5.0 GB
 4 │ 167 │ loginwindow     │ Running │ 0.0000 │ 461.2 MB │  7.2 GB
───┴─────┴─────────────────┴─────────┴────────┴──────────┴─────────

lines

This command takes a string from a pipeline as input, and returns a table where each line of the input string is a row in the table. Empty lines are ignored. This command is capable of feeding other commands, such as nth, with its output.

# Usage

> [input-command] | lines

# Examples

Basic usage:

> printf "Hello\nWorld!\nLove, nushell." | lines
━━━┯━━━━━━━━━━━━━━━━
 # │ value
───┼────────────────
 0 │ Hello
 1 │ World!
 2 │ Love, nushell.
━━━┷━━━━━━━━━━━━━━━━

One useful application is piping the contents of a file into lines. This example extracts a certain line from a given file.

> cat lines.md | lines | nth 6
## Examples

As in this example, lines can be used to extract portions of, or apply transformations to, data returned by any program that returns a string.
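
For instance, a sketch (assuming a Unix-like system where df is available): the external command's output arrives as a string, lines splits it into rows, and skip drops the header row.

> df -h | lines | skip 1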

math eval

math eval is a command that takes a math expression from the pipeline and evaluates that into a number. It also optionally takes the math expression as an argument.

This command supports the following operations -

operations:

  • Binary operators: +, -, *, /, % (remainder), ^ (power)
  • Unary operators: +, -, ! (factorial)

functions:

  • sqrt, abs
  • exp, ln, log10
  • sin, cos, tan, asin, acos, atan, atan2
  • sinh, cosh, tanh, asinh, acosh, atanh
  • floor, ceil, round
  • signum
  • max(x, ...), min(x, ...): maximum and minimum of 1 or more numbers

constants:

  • pi
  • e

# Examples

> echo "1+2+3" | math eval
6.0
> echo "1-2+3" | math eval
2.0
> echo "-(-23)" | math eval
23.0
> echo "5^2" | math eval
25.0
> echo "5^3" | math eval
125.0
> echo "min(5,4,3,2,1,0,-100,45)" | math eval
-100.0
> echo "max(5,4,3,2,1,0,-100,45)" | math eval
45.0
> echo sqrt(2) | math eval
1.414213562373095
> echo pi | math eval
3.141592653589793
> echo e | math eval
2.718281828459045
> echo "sin(pi / 2)" | math eval
1.0
> echo "floor(5999/1000)" | math eval
5.0
> open abc.json
───┬──────
 # │ size
───┼──────
 0 │  816
 1 │ 1627
 2 │ 1436
 3 │ 1573
 4 │  935
 5 │   52
 6 │  999
 7 │ 1639
───┴──────
> open abc.json | format "({size} + 500) * 4"
───┬──────────────────
 # │
───┼──────────────────
 0 │ (816 + 500) * 4
 1 │ (1627 + 500) * 4
 2 │ (1436 + 500) * 4
 3 │ (1573 + 500) * 4
 4 │ (935 + 500) * 4
 5 │ (52 + 500) * 4
 6 │ (999 + 500) * 4
 7 │ (1639 + 500) * 4
───┴──────────────────
> open abc.json | format "({size} + 500) * 4" | math eval
───┬───────────
 # │
───┼───────────
 0 │ 5264.0000
 1 │ 8508.0000
 2 │ 7744.0000
 3 │ 8292.0000
 4 │ 5740.0000
 5 │ 2208.0000
 6 │ 5996.0000
 7 │ 8556.0000
───┴───────────
> open abc.json | format "({size} - 1000) * 4" | math eval
───┬────────────
 # │
───┼────────────
 0 │  -736.0000
 1 │  2508.0000
 2 │  1744.0000
 3 │  2292.0000
 4 │  -260.0000
 5 │ -3792.0000
 6 │    -4.0000
 7 │  2556.0000
───┴────────────

Note that since math eval uses floating-point numbers, the result may not always be precise.

> echo "floor(5999999999999999999/1000000000000000000)" | math eval
6.0
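
By contrast, an expression like the following may come back with a small floating-point error rather than exactly 0.3:

> echo "0.1 + 0.2" | math eval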

math

Mathematical functions that generally only operate on a list of numbers (integers, decimals, bytes) and tables. Currently the following functions are implemented:

  • math abs: Returns absolute values of a list of numbers
  • math avg: Finds the average of a list of numbers or tables
  • math ceil: Applies the ceil function to a list of numbers
  • math eval: Evaluates a list of math expressions into numbers
  • math floor: Applies the floor function to a list of numbers
  • math max: Finds the maximum within a list of numbers or tables
  • math median: Finds the median of a list of numbers or tables
  • math min: Finds the minimum within a list of numbers or tables
  • math mode: Finds the most frequent element(s) within a list of numbers or tables
  • math round: Applies the round function to a list of numbers
  • math stddev: Finds the standard deviation of a list of numbers or tables
  • math sum: Finds the sum of a list of numbers or tables
  • math product: Finds the product of a list of numbers or tables
  • math variance: Finds the variance of a list of numbers or tables

However, the mathematical functions like min and max are more permissive and also work on Dates.

# Examples

To get the average of the file sizes in a directory, simply pipe the size column from the ls command to the math avg command.

# List of Numbers (Integers, Decimals, Bytes)

> ls
 #  │ name               │ type │ size     │ modified
────┼────────────────────┼──────┼──────────┼─────────────
  0 │ CODE_OF_CONDUCT.md │ File │   3.4 KB │ 4 days ago
  1 │ CONTRIBUTING.md    │ File │   1.3 KB │ 4 days ago
  2 │ Cargo.lock         │ File │ 106.3 KB │ 6 mins ago
  3 │ Cargo.toml         │ File │   4.6 KB │ 3 days ago
  4 │ LICENSE            │ File │   1.1 KB │ 4 days ago
  5 │ Makefile.toml      │ File │    449 B │ 4 days ago
  6 │ README.md          │ File │  16.0 KB │ 6 mins ago
  7 │ TODO.md            │ File │      0 B │ 6 mins ago
  8 │ assets             │ Dir  │    128 B │ 4 days ago
  9 │ build.rs           │ File │     78 B │ 4 days ago
 10 │ crates             │ Dir  │    672 B │ 3 days ago
 11 │ debian             │ Dir  │    352 B │ 4 days ago
 12 │ docker             │ Dir  │    288 B │ 4 days ago
 13 │ docs               │ Dir  │    160 B │ 4 days ago
 14 │ features.toml      │ File │    632 B │ 4 days ago
 15 │ images             │ Dir  │    160 B │ 4 days ago
 16 │ justfile           │ File │    234 B │ 3 days ago
 17 │ rustfmt.toml       │ File │     16 B │ 4 days ago
 18 │ src                │ Dir  │    128 B │ 4 days ago
 19 │ target             │ Dir  │    192 B │ 8 hours ago
 20 │ tests              │ Dir  │    192 B │ 4 days ago
> ls | get size | math avg
───┬────────
 # │
───┼────────
 0 │ 7.2 KB
───┴────────
> ls | get size | math min
───┬─────
 # │
───┼─────
 0 │ 0 B
───┴─────
> ls | get size | math max
───┬──────────
 # │
───┼──────────
 0 │ 113.6 KB
───┴──────────
> ls | get size | math median
───┬───────
 # │
───┼───────
 0 │ 320 B
───┴───────
> ls | get size | math sum
───┬──────────
 # │
───┼──────────
 0 │ 143.6 KB
───┴──────────
> echo [3 3 9 12 12 15] | math mode
───┬────
 0 │  3
 1 │ 12
───┴────
> echo [2 3 3 4] | math product
72
> echo [1 4 6 10 50] | math stddev
18.1372
> echo [1 4 6 10 50] | math variance
328.96
> echo [1.5 2.3 -3.1] | math ceil
───┬────
 0 │  2
 1 │  3
 2 │ -3
───┴────
> echo [1.5 2.3 -3.1] | math floor
───┬────
 0 │  1
 1 │  2
 2 │ -4
───┴────
> echo [1.5 2.3 -3.1] | math round
───┬────
 0 │  2
 1 │  2
 2 │ -3
───┴────
> echo [1 -2 -3.0] | math abs
───┬────────
 0 │ 1
 1 │ 2
 2 │ 3.0000
───┴────────

# Dates

> ls | get modified | math min
2020-06-09 17:25:51.798743222 UTC
> ls | get modified | math max
2020-06-14 05:49:59.637449186 UTC

# Operations on tables

> pwd | split row / | size
───┬───────┬───────┬───────┬────────────
 # │ lines │ words │ chars │ bytes
───┼───────┼───────┼───────┼────────────
 0 │     0 │     1 │     5 │          5
 1 │     0 │     1 │    11 │         11
 2 │     0 │     1 │    11 │         11
 3 │     0 │     1 │     4 │          4
 4 │     0 │     2 │    12 │         12
 5 │     0 │     1 │     7 │          7
───┴───────┴───────┴───────┴────────────
> pwd | split row / | size | math max
────────────┬────
 lines      │ 0
 words      │ 2
 chars      │ 12
 bytes      │ 12
────────────┴────
> pwd | split row / | size | math avg
────────────┬────────
 lines      │ 0.0000
 words      │ 1.1666
 chars      │ 8.3333
 bytes      │ 8.3333
────────────┴────────

To get the sum of the characters that make up your present working directory:

> pwd | split row / | size | get chars | math sum
50

# Errors

math functions are aggregation functions, so empty lists are invalid:

> echo [] | math avg
error: Error: Unexpected: Cannot perform aggregate math operation on empty data

nth

This command returns the nth row of a table, starting from 0. If the number given is less than 0 or more than the number of rows, nothing is returned.

# Usage

> [input-command] | nth <row number>  ...args

# Parameters

  • <row number> the number of the row to return
  • args: Optionally return more rows

# Examples

> ls
────┬────────────────────┬──────┬──────────┬──────────────
 #  │ name               │ type │ size     │ modified
────┼────────────────────┼──────┼──────────┼──────────────
 0  │ CODE_OF_CONDUCT.md │ File │   3.4 KB │ 53 mins ago
 1  │ CONTRIBUTING.md    │ File │   1.3 KB │ 6 mins ago
 2  │ Cargo.lock         │ File │ 113.3 KB │ 53 mins ago
 3  │ Cargo.toml         │ File │   4.6 KB │ 53 mins ago
 4  │ LICENSE            │ File │   1.1 KB │ 3 months ago
 5  │ Makefile.toml      │ File │    449 B │ 5 months ago
 6  │ README.md          │ File │  15.8 KB │ 2 mins ago
 7  │ TODO.md            │ File │      0 B │ 53 mins ago
 8  │ assets             │ Dir  │    128 B │ 5 months ago
 9  │ build.rs           │ File │     78 B │ 4 months ago
 10 │ crates             │ Dir  │    704 B │ 53 mins ago
 11 │ debian             │ Dir  │    352 B │ 5 months ago
 12 │ docker             │ Dir  │    288 B │ 3 months ago
 13 │ docs               │ Dir  │    192 B │ 53 mins ago
 14 │ features.toml      │ File │    632 B │ 4 months ago
 15 │ images             │ Dir  │    160 B │ 5 months ago
 16 │ rustfmt.toml       │ File │     16 B │ 5 months ago
 17 │ src                │ Dir  │    128 B │ 1 day ago
 18 │ target             │ Dir  │    160 B │ 5 days ago
 19 │ tests              │ Dir  │    192 B │ 3 months ago
────┴────────────────────┴──────┴──────────┴──────────────
> ls | nth 0
──────────┬────────────────────
 name     │ CODE_OF_CONDUCT.md
 type     │ File
 size     │ 3.4 KB
 modified │ 54 mins ago
──────────┴────────────────────
> ls | nth 0 2
───┬────────────────────┬──────┬──────────┬─────────────
 # │ name               │ type │ size     │ modified
───┼────────────────────┼──────┼──────────┼─────────────
 0 │ CODE_OF_CONDUCT.md │ File │   3.4 KB │ 54 mins ago
 1 │ Cargo.lock         │ File │ 113.3 KB │ 54 mins ago
───┴────────────────────┴──────┴──────────┴─────────────
> ls | nth 5
──────────┬───────────────
 name     │ Makefile.toml
 type     │ File
 size     │ 449 B
 modified │ 5 months ago
──────────┴───────────────

open

Loads a file into a cell, converting it to a table if possible (avoid this by appending the --raw flag).

# Example

> cat user.yaml
- Name: Peter
  Age: 30
  Telephone: 88204828
  Country: Singapore
- Name: Michael
  Age: 42
  Telephone: 44002010
  Country: Spain
- Name: Will
  Age: 50
  Telephone: 99521080
  Country: Germany
> open user.yaml
━━━┯━━━━━━━━━┯━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━
 # │ Name    │ Age │ Telephone │ Country
───┼─────────┼─────┼───────────┼───────────
 0 │ Peter   │  30 │  88204828 │ Singapore
 1 │ Michael │  42 │  44002010 │ Spain
 2 │ Will    │  50 │  99521080 │ Germany
━━━┷━━━━━━━━━┷━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━
> open user.yaml --raw
- Name: Peter
  Age: 30
  Telephone: 88204828
  Country: Singapore
- Name: Michael
  Age: 42
  Telephone: 44002010
  Country: Spain
- Name: Will
  Age: 50
  Telephone: 99521080
  Country: Germany
> cat user.json
[
    {
        "Name": "Peter",
        "Age": 30,
        "Telephone": 88204828,
        "Country": "Singapore"
    },
    {
        "Name": "Michael",
        "Age": 42,
        "Telephone": 44002010,
        "Country": "Spain"
    },
    {
        "Name": "Will",
        "Age": 50,
        "Telephone": 99521080,
        "Country": "Germany"
    }
]
> open user.json
━━━┯━━━━━━━━━┯━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━
 # │ Name    │ Age │ Telephone │ Country
───┼─────────┼─────┼───────────┼───────────
 0 │ Peter   │  30 │  88204828 │ Singapore
 1 │ Michael │  42 │  44002010 │ Spain
 2 │ Will    │  50 │  99521080 │ Germany
━━━┷━━━━━━━━━┷━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━
> open user.json --raw
[
    {
        "Name": "Peter",
        "Age": 30,
        "Telephone": 88204828,
        "Country": "Singapore"
    },
    {
        "Name": "Michael",
        "Age": 42,
        "Telephone": 44002010,
        "Country": "Spain"
    },
    {
        "Name": "Will",
        "Age": 50,
        "Telephone": 99521080,
        "Country": "Germany"
    }
]

pivot

Pivots the table contents so rows become columns and columns become rows.

# Examples

> ls docs
───┬────────────────────┬──────┬────────┬─────────────
 # │ name               │ type │ size   │ modified
───┼────────────────────┼──────┼────────┼─────────────
 0 │ docs/commands      │ Dir  │ 2.7 KB │ 53 mins ago
 1 │ docs/docker.md     │ File │ 7.0 KB │ 40 mins ago
 2 │ docs/philosophy.md │ File │  912 B │ 54 mins ago
───┴────────────────────┴──────┴────────┴─────────────
> ls docs | pivot
───┬──────────┬───────────────┬────────────────┬────────────────────
 # │ Column0  │ Column1       │ Column2        │ Column3
───┼──────────┼───────────────┼────────────────┼────────────────────
 0 │ name     │ docs/commands │ docs/docker.md │ docs/philosophy.md
 1 │ type     │ Dir           │ File           │ File
 2 │ size     │        2.7 KB │         7.0 KB │              912 B
 3 │ modified │ 53 mins ago   │ 40 mins ago    │ 55 mins ago
───┴──────────┴───────────────┴────────────────┴────────────────────

Use --header-row to treat the first row as column names:

> ls docs | pivot --header-row
───┬───────────────┬────────────────┬────────────────────
 # │ docs/commands │ docs/docker.md │ docs/philosophy.md
───┼───────────────┼────────────────┼────────────────────
 0 │ Dir           │ File           │ File
 1 │        2.7 KB │         7.0 KB │              912 B
 2 │ 53 mins ago   │ 40 mins ago    │ 55 mins ago
───┴───────────────┴────────────────┴────────────────────

Use --ignore-titles to prevent pivoting the column names into values:

> ls docs | pivot --ignore-titles
───┬───────────────┬────────────────┬────────────────────
 # │ Column0       │ Column1        │ Column2
───┼───────────────┼────────────────┼────────────────────
 0 │ docs/commands │ docs/docker.md │ docs/philosophy.md
 1 │ Dir           │ File           │ File
 2 │        2.7 KB │         7.0 KB │              912 B
 3 │ 54 mins ago   │ 41 mins ago    │ 56 mins ago
───┴───────────────┴────────────────┴────────────────────

Additional arguments are used as column names:

> ls docs | pivot foo bar baz
───┬──────────┬───────────────┬────────────────┬────────────────────
 # │ foo      │ bar           │ baz            │ Column3
───┼──────────┼───────────────┼────────────────┼────────────────────
 0 │ name     │ docs/commands │ docs/docker.md │ docs/philosophy.md
 1 │ type     │ Dir           │ File           │ File
 2 │ size     │        2.7 KB │         7.0 KB │              912 B
 3 │ modified │ 55 mins ago   │ 41 mins ago    │ 56 mins ago
───┴──────────┴───────────────┴────────────────┴────────────────────

prepend

This command prepends the given row to the front of the table.

Note:

  • prepend does not change a file itself. If you want to save your changes, you need to run the save command (see the sketch after the examples below)
  • if you want to add something containing a whitespace character, you need to put it in quotation marks

# Examples

Let's complete this table with the missing continents:

> open continents.txt | lines
━━━┯━━━━━━━━━━━━━━━
 # │
───┼───────────────
 0 │ Africa
 1 │ South America
 2 │ Australia
 3 │ Europe
 4 │ Antarctica
━━━┷━━━━━━━━━━━━━━━

You can add a new row at the top by using prepend:

> open continents.txt | lines | prepend Asia
━━━┯━━━━━━━━━━━━━━━
 # │
───┼───────────────
 0 │ Asia
 1 │ Africa
 2 │ South America
 3 │ Australia
 4 │ Europe
 5 │ Antarctica
━━━┷━━━━━━━━━━━━━━━

It's not possible to add multiple rows at once, so you'll need to call prepend multiple times:

> open continents.txt | lines | prepend Asia | prepend "North America"
━━━┯━━━━━━━━━━━━━━━
 # │
───┼───────────────
 0 │ North America
 1 │ Asia
 2 │ Africa
 3 │ South America
 4 │ Australia
 5 │ Europe
 6 │ Antarctica
━━━┷━━━━━━━━━━━━━━━
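
As noted above, prepend does not modify continents.txt itself. A sketch of persisting the result with the wrap and save commands (the column name continent and the target file continents.json are chosen for illustration):

> open continents.txt | lines | prepend Asia | wrap continent | save continents.json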

ps

This command shows information about system processes.

Syntax: ps

# Example

> ps
━━━━┯━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━
 #  │ pid   │ name                                                               │ status  │ cpu
────┼───────┼────────────────────────────────────────────────────────────────────┼─────────┼───────────────────
 50 │ 10184 │ firefox.exe                                                        │ Running │ 0.000000000000000
 51 │ 11584 │ WindowsTerminal.exe                                                │ Running │ 0.000000000000000
 52 │ 11052 │ conhost.exe                                                        │ Running │ 0.000000000000000
 53 │  7076 │ nu.exe                                                             │ Running │ 0.000000000000000
   ...
 66 │  3000 │ Code.exe                                                           │ Running │ 0.000000000000000
 67 │  5388 │ conhost.exe                                                        │ Running │ 0.000000000000000
 68 │  6268 │ firefox.exe                                                        │ Running │ 0.000000000000000
 69 │  8972 │ nu_plugin_ps.exe                                                   │ Running │ 58.00986000000000
━━━━┷━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━

Find the processes with the highest CPU time:

> ps -l | sort-by cpu_time | last 2
 # │ pid │       name       │ status  │  cpu   │   mem    │ virtual │     cpu_time      │ parent │         exe          │       command
───┼─────┼──────────────────┼─────────┼────────┼──────────┼─────────┼───────────────────┼────────┼──────────────────────┼──────────────────────
 0 │ 396 │ Google Chrome    │ Running │ 0.0000 │ 271.6 MB │  5.8 GB │ 6hr 20min 28sec   │      1 │ /Applications/Google │ /Applications/Google
   │     │                  │         │        │          │         │ 173ms 641us 315ns │        │ Chrome.app/Contents/ │ Chrome.app/Contents/
   │     │                  │         │        │          │         │                   │        │ MacOS/Google         │ MacOS/Google
   │     │                  │         │        │          │         │                   │        │ Chrome               │ Chrome
 1 │ 444 │ Google Chrome He │ Running │ 0.0000 │ 398.9 MB │  5.3 GB │ 10hr 36min 17sec  │    396 │ /Applications/Google │ /Applications/Google
   │     │                  │         │        │          │         │ 304ms 66us 889ns  │        │ Chrome.app/Contents/ │ Chrome.app/Contents/
   │     │                  │         │        │          │         │                   │        │ Frameworks/Google    │ Frameworks/Google
   │     │                  │         │        │          │         │                   │        │ Chrome               │ Chrome
   │     │                  │         │        │          │         │                   │        │ Framework.framework/ │ Framework.framework/
   │     │                  │         │        │          │         │                   │        │ Versions/84.0.4147.1 │ Versions/84.0.4147.1
   │     │                  │         │        │          │         │                   │        │ 25/Helpers/Google    │ 25/Helpers/Google
   │     │                  │         │        │          │         │                   │        │ Chrome Helper        │ Chrome Helper
   │     │                  │         │        │          │         │                   │        │ (GPU).app/Contents/M │ (GPU).app/Contents/M
   │     │                  │         │        │          │         │                   │        │ acOS/Google          │ acOS/Google
   │     │                  │         │        │          │         │                   │        │ Chrome Helper (GPU)  │ Chrome Helper (GPU)
───┴─────┴──────────────────┴─────────┴────────┴──────────┴─────────┴───────────────────┴────────┴──────────────────────┴──────────────────────

pwd

Print the current working directory.

-h, --help Display help message.

# Examples

> pwd
/home/me/nushell/docs/commands
> pwd | split column "/" | reject Column1 | pivot | reject Column0
───┬──────────
 # │ Column1
───┼──────────
 0 │ home
 1 │ me
 2 │ projects
 3 │ nushell
 4 │ docs
 5 │ commands
───┴──────────

random

Use random to generate random values.

# bool

  • random bool: Generate a random boolean value

# bool Flags

  • -b, --bias <number>: Adjusts the probability of a "true" outcome

# bool Examples

> random bool
false
> random bool --bias 0.75
true

# dice

  • random dice: Generate a random dice roll

# dice Flags

  • -d, --dice <integer>: The number of dice being rolled
  • -s, --sides <integer>: The number of sides a die has

# dice Examples

> random dice
4
> random dice -d 10 -s 12
───┬────
 0 │ 11
 1 │ 11
 2 │ 11
 3 │ 11
 4 │  5
 5 │  3
 6 │ 10
 7 │  7
 8 │  3
 9 │  1
───┴────
> random dice --dice 1024 --sides 16 | histogram | sort-by occurrences
────┬───────┬─────────────┬────────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────
 #  │ value │ occurrences │ percentage │ frequency
────┼───────┼─────────────┼────────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────
  0 │ 6     │ 57          │ 75.00%     │ ***************************************************************************
  1 │ 12    │ 59          │ 77.63%     │ *****************************************************************************
  2 │ 3     │ 59          │ 77.63%     │ *****************************************************************************
  3 │ 16    │ 60          │ 78.95%     │ ******************************************************************************
  4 │ 13    │ 61          │ 80.26%     │ ********************************************************************************
  5 │ 11    │ 62          │ 81.58%     │ *********************************************************************************
  6 │ 5     │ 62          │ 81.58%     │ *********************************************************************************
  7 │ 9     │ 62          │ 81.58%     │ *********************************************************************************
  8 │ 4     │ 63          │ 82.89%     │ **********************************************************************************
  9 │ 8     │ 64          │ 84.21%     │ ************************************************************************************
 10 │ 10    │ 65          │ 85.53%     │ *************************************************************************************
 11 │ 15    │ 66          │ 86.84%     │ **************************************************************************************
 12 │ 14    │ 67          │ 88.16%     │ ****************************************************************************************
 13 │ 7     │ 69          │ 90.79%     │ ******************************************************************************************
 14 │ 1     │ 72          │ 94.74%     │ **********************************************************************************************
 15 │ 2     │ 76          │ 100.00%    │ ****************************************************************************************************
────┴───────┴─────────────┴────────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────

# uuid

  • random uuid: Generate a random uuid4 string

# uuid Examples

> random uuid
8af4de39-acbc-42f0-94d1-7cfad6c01f8b

# integer

  • random integer: Generate a random integer

# integer Flags

  • -m, --min <integer>: The minimum value to generate
  • -x, --max <integer>: The maximum value to generate

# integer Examples

> random integer
42
> random integer 5000..
8700890823
> random integer ..100
73
> random integer 100000..200000
173400
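
Assuming the -m/--min and -x/--max flags listed above behave as described, the last example could presumably also be written with explicit flags:

> random integer -m 100000 -x 200000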

# chars

  • random chars: Generate a random string of characters

# chars Flags

  • -l, --length <integer>: The number of characters to generate

# chars Examples

Generate a random password of length 15

> random chars -l 15
fWBSbE7QtaoJGeo

reject

This command removes or rejects the columns passed to it.

# Examples

> ls
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━┯━━━━━━━━━━━━━┯━━━━━━━━━━━━━
 # │ name                       │ type │ readonly │ size   │ created     │ accessed    │ modified
───┼────────────────────────────┼──────┼──────────┼────────┼─────────────┼─────────────┼─────────────
 0 │ zeusiscrazy.txt            │ File │          │ 556 B  │ a month ago │ a month ago │ a month ago
 1 │ coww.txt                   │ File │          │  24 B  │ a month ago │ a month ago │ a month ago
 2 │ randomweirdstuff.txt       │ File │          │ 197 B  │ a month ago │ a month ago │ a month ago
 3 │ abaracadabra.txt           │ File │          │ 401 B  │ a month ago │ a month ago │ a month ago
 4 │ youshouldeatmorecereal.txt │ File │          │ 768 B  │ a month ago │ a month ago │ a month ago
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━┷━━━━━━━━━━━━━┷━━━━━━━━━━━━━
> ls | reject readonly
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━┯━━━━━━━━━━━━━┯━━━━━━━━━━━━━
 # │ name                       │ type │ size   │ created     │ accessed    │ modified
───┼────────────────────────────┼──────┼────────┼─────────────┼─────────────┼─────────────
 0 │ zeusiscrazy.txt            │ File │ 556 B  │ a month ago │ a month ago │ a month ago
 1 │ coww.txt                   │ File │  24 B  │ a month ago │ a month ago │ a month ago
 2 │ randomweirdstuff.txt       │ File │ 197 B  │ a month ago │ a month ago │ a month ago
 3 │ abaracadabra.txt           │ File │ 401 B  │ a month ago │ a month ago │ a month ago
 4 │ youshouldeatmorecereal.txt │ File │ 768 B  │ a month ago │ a month ago │ a month ago
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━┷━━━━━━━━━━━━━┷━━━━━━━━━━━━━
> ls | reject readonly accessed
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━┯━━━━━━━━━━━━━
 # │ name                       │ type │ size   │ created     │ modified
───┼────────────────────────────┼──────┼────────┼─────────────┼─────────────
 0 │ zeusiscrazy.txt            │ File │ 556 B  │ a month ago │ a month ago
 1 │ coww.txt                   │ File │  24 B  │ a month ago │ a month ago
 2 │ randomweirdstuff.txt       │ File │ 197 B  │ a month ago │ a month ago
 3 │ abaracadabra.txt           │ File │ 401 B  │ a month ago │ a month ago
 4 │ youshouldeatmorecereal.txt │ File │ 768 B  │ a month ago │ a month ago
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━┷━━━━━━━━━━━━━

rename

Use rename to give columns more appropriate names.

# Examples

> open /etc/passwd | lines | split column ":" | rename user password uid gid gecos home shell
────┬────────┬──────────┬──────┬──────┬────────┬─────────────────┬──────────────────
 #  │ user   │ password │ uid  │ gid  │ gecos  │ home            │ shell
────┼────────┼──────────┼──────┼──────┼────────┼─────────────────┼──────────────────
  0 │ root   │ x        │ 0    │ 0    │ root   │ /root           │ /bin/bash
  1 │ bin    │ x        │ 1    │ 1    │ bin    │ /bin            │ /usr/bin/nologin
  2 │ daemon │ x        │ 2    │ 2    │ daemon │ /               │ /usr/bin/nologin
  3 │ mail   │ x        │ 8    │ 12   │ mail   │ /var/spool/mail │ /usr/bin/nologin
────┴────────┴──────────┴──────┴──────┴────────┴─────────────────┴──────────────────

reverse

This command reverses the order of the rows in a table.

# Examples

> ls | sort-by name
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━
 # │ name                       │ type │ readonly │ size   │ accessed       │ modified
───┼────────────────────────────┼──────┼──────────┼────────┼────────────────┼────────────────
 0 │ abaracadabra.txt           │ File │          │ 401 B  │ 23 minutes ago │ 16 minutes ago
 1 │ coww.txt                   │ File │          │  24 B  │ 22 minutes ago │ 17 minutes ago
 2 │ randomweirdstuff.txt       │ File │          │ 197 B  │ 21 minutes ago │ 18 minutes ago
 3 │ youshouldeatmorecereal.txt │ File │          │ 768 B  │ 30 seconds ago │ now
 4 │ zeusiscrazy.txt            │ File │          │ 556 B  │ 22 minutes ago │ 18 minutes ago
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━
> ls | sort-by name | reverse
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━
 # │ name                       │ type │ readonly │ size   │ accessed       │ modified
───┼────────────────────────────┼──────┼──────────┼────────┼────────────────┼────────────────
 0 │ zeusiscrazy.txt            │ File │          │ 556 B  │ 22 minutes ago │ 19 minutes ago
 1 │ youshouldeatmorecereal.txt │ File │          │ 768 B  │ 39 seconds ago │ 18 seconds ago
 2 │ randomweirdstuff.txt       │ File │          │ 197 B  │ 21 minutes ago │ 18 minutes ago
 3 │ coww.txt                   │ File │          │  24 B  │ 22 minutes ago │ 18 minutes ago
 4 │ abaracadabra.txt           │ File │          │ 401 B  │ 23 minutes ago │ 16 minutes ago
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━
> ls | sort-by size
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━
 # │ name                       │ type │ readonly │ size   │ accessed       │ modified
───┼────────────────────────────┼──────┼──────────┼────────┼────────────────┼────────────────
 0 │ coww.txt                   │ File │          │  24 B  │ 22 minutes ago │ 18 minutes ago
 1 │ randomweirdstuff.txt       │ File │          │ 197 B  │ 21 minutes ago │ 18 minutes ago
 2 │ abaracadabra.txt           │ File │          │ 401 B  │ 23 minutes ago │ 16 minutes ago
 3 │ zeusiscrazy.txt            │ File │          │ 556 B  │ 22 minutes ago │ 19 minutes ago
 4 │ youshouldeatmorecereal.txt │ File │          │ 768 B  │ a minute ago   │ 26 seconds ago
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━
> ls | sort-by size | reverse
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━
 # │ name                       │ type │ readonly │ size   │ accessed       │ modified
───┼────────────────────────────┼──────┼──────────┼────────┼────────────────┼────────────────
 0 │ youshouldeatmorecereal.txt │ File │          │ 768 B  │ a minute ago   │ 32 seconds ago
 1 │ zeusiscrazy.txt            │ File │          │ 556 B  │ 22 minutes ago │ 19 minutes ago
 2 │ abaracadabra.txt           │ File │          │ 401 B  │ 23 minutes ago │ 16 minutes ago
 3 │ randomweirdstuff.txt       │ File │          │ 197 B  │ 21 minutes ago │ 18 minutes ago
 4 │ coww.txt                   │ File │          │  24 B  │ 22 minutes ago │ 18 minutes ago
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━

save

This command saves the contents of the pipeline to a file. Use this in combination with the to json, to csv, ... commands to save the contents in the specified format.

Syntax: save (path) {flags}

# Parameters

  • (path) the path to save contents to

# Flags

--raw
  treat values as-is rather than auto-converting based on file extension

# Example

You can save the names of the files in a directory like this:

> ls | where type == File | select name | save filenames.csv

Or you can format it in supported formats using one of the to commands:

> ls | where type == File | select name | to csv | save filenames

filenames.csv and filenames are both CSV-formatted files. Nu auto-converts the format if a supported file extension is given.
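
If the data is already text, for example after a to csv conversion, you can add --raw so that save writes it as-is instead of converting it again based on the file extension; a sketch:

> ls | where type == File | select name | to csv | save --raw filenames.csv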

select

This command displays only the columns whose names are passed to it.

# Examples

> ls
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━┯━━━━━━━━━━━━━┯━━━━━━━━━━━━━
 # │ name                       │ type │ readonly │ size   │ created     │ accessed    │ modified
───┼────────────────────────────┼──────┼──────────┼────────┼─────────────┼─────────────┼─────────────
 0 │ zeusiscrazy.txt            │ File │          │ 556 B  │ a month ago │ a month ago │ a month ago
 1 │ coww.txt                   │ File │          │  24 B  │ a month ago │ a month ago │ a month ago
 2 │ randomweirdstuff.txt       │ File │          │ 197 B  │ a month ago │ a month ago │ a month ago
 3 │ abaracadabra.txt           │ File │          │ 401 B  │ a month ago │ a month ago │ a month ago
 4 │ youshouldeatmorecereal.txt │ File │          │ 768 B  │ a month ago │ a month ago │ a month ago
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━┷━━━━━━━━━━━━━┷━━━━━━━━━━━━━
> ls | select name
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │ name
───┼────────────────────────────
 0 │ zeusiscrazy.txt
 1 │ coww.txt
 2 │ randomweirdstuff.txt
 3 │ abaracadabra.txt
 4 │ youshouldeatmorecereal.txt
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━

The order in which you put the column names matters:

> ls | select type name size
━━━┯━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━
 # │ type │ name                       │ size
───┼──────┼────────────────────────────┼────────
 0 │ File │ zeusiscrazy.txt            │ 556 B
 1 │ File │ coww.txt                   │  24 B
 2 │ File │ randomweirdstuff.txt       │ 197 B
 3 │ File │ abaracadabra.txt           │ 401 B
 4 │ File │ youshouldeatmorecereal.txt │ 768 B
━━━┷━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━
> ls | select size type name
━━━┯━━━━━━━━┯━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │ size   │ type │ name
───┼────────┼──────┼────────────────────────────
 0 │ 556 B  │ File │ zeusiscrazy.txt
 1 │  24 B  │ File │ coww.txt
 2 │ 197 B  │ File │ randomweirdstuff.txt
 3 │ 401 B  │ File │ abaracadabra.txt
 4 │ 768 B  │ File │ youshouldeatmorecereal.txt
━━━┷━━━━━━━━┷━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━

shells

Lists all the active nu shells with a number/index, a name and the path. Also marks the current nu shell.

# Examples

> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼─────────────────────────────────────
 0 │   │ filesystem │ /home/jonathanturner/Source/nushell
 1 │   │ filesystem │ /usr
 2 │ X │ filesystem │ /home
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
/> shells
━━━┯━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name                                             │ path
───┼───┼──────────────────────────────────────────────────┼─────────────────────────────────────
 0 │   │ filesystem                                       │ /home/jonathanturner/Source/nushell
 1 │ X │ {/home/jonathanturner/Source/nushell/Cargo.toml} │ /
━━━┷━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

shuffle

Shuffles the rows in a random order.

# Examples

Passing the same input to shuffle multiple times gives different results -

> echo [ a b c d ] | shuffle
───┬───
 0 │ a
 1 │ c
 2 │ d
 3 │ b
───┴───
> echo [ a b c d ] | shuffle
───┬───
 0 │ c
 1 │ b
 2 │ d
 3 │ a
───┴───
> echo [ a b c d ] | shuffle
───┬───
 0 │ c
 1 │ b
 2 │ a
 3 │ d
───┴───

size

This command gives word count statistics on any text.

# Examples

> open lalala.txt | size
━━━━━━━┯━━━━━━━┯━━━━━━━┯━━━━━━━━━━━━
 lines │ words │ chars │ bytes
───────┼───────┼───────┼────────────
     4 │    10 │    72 │         72
━━━━━━━┷━━━━━━━┷━━━━━━━┷━━━━━━━━━━━━
> open the_mysterious_affair_at_styles.txt | size
━━━━━━━┯━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━
 lines │ words │ chars  │ bytes
───────┼───────┼────────┼────────────
  8935 │ 62352 │ 349459 │     361771
━━━━━━━┷━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━

skip-while

Skips rows while the condition matches.

# Usage

> [input-command] | skip-while <condition>

# Examples

If we open a file with a list of contacts, we get all of the contacts.

> open contacts.csv | sort-by "last name"
───┬────────────┬───────────┬──────────────────
 # │ first name │ last name │ email
───┼────────────┼───────────┼──────────────────
 0 │ John       │ Abbot     │ abbot@email.com
 1 │ Chris      │ Beasly    │ beasly@email.com
 2 │ Jane       │ Carver    │ carver@email.com
 3 │ Francis    │ Davis     │ davis@email.com
───┴────────────┴───────────┴──────────────────

To skip contacts with last names starting with 'A' or 'B', use skip-while:

> open contacts.csv | sort-by "last name" |  skip-while "last name" < "C"
───┬────────────┬───────────┬──────────────────
 # │ first name │ last name │ email
───┼────────────┼───────────┼──────────────────
 0 │ Jane       │ Carver    │ carver@email.com
 1 │ Francis    │ Davis     │ davis@email.com
───┴────────────┴───────────┴──────────────────

Note that the order of input rows matters. Once a single row does not match the condition, all following rows are included in the output, whether or not they match the condition:

> open contacts.csv | skip-while "last name" < "C"
───┬────────────┬───────────┬──────────────────
 # │ first name │ last name │ email
───┼────────────┼───────────┼──────────────────
 0 │ Jane       │ Carver    │ carver@email.com
 1 │ Chris      │ Beasly    │ beasly@email.com
 2 │ Francis    │ Davis     │ davis@email.com
───┴────────────┴───────────┴──────────────────

See the where command to filter each individual row by a condition, regardless of order.
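
For comparison, a where-based sketch of the same filter, which keeps every matching row no matter where it appears in the input:

> open contacts.csv | where "last name" >= "C"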

skip

Skips the first 'n' rows of a table.

# Usage

> [input-command] | skip (n)

# Examples

If we open a file with a list of contacts, we get all of the contacts.

> open contacts.csv
───┬─────────┬──────┬─────────────────
 # │ first   │ last │ email
───┼─────────┼──────┼─────────────────
 0 │ John    │ Doe  │ doe.1@email.com
 1 │ Jane    │ Doe  │ doe.2@email.com
 2 │ Chris   │ Doe  │ doe.3@email.com
 3 │ Francis │ Doe  │ doe.4@email.com
───┴─────────┴──────┴─────────────────

To ignore the first 2 contacts, we can skip them.

> open contacts.csv | skip 2
───┬─────────┬──────┬─────────────────
 # │ first   │ last │ email
───┼─────────┼──────┼─────────────────
 0 │ Chris   │ Doe  │ doe.3@email.com
 1 │ Francis │ Doe  │ doe.4@email.com
───┴─────────┴──────┴─────────────────

sort-by

The sort-by command sorts the table by one or more chosen columns.

sort-by takes multiple arguments (the names of columns), sorting by each argument in order.

# Flags

  • -i, --insensitive: Sort string-based columns case insensitively

# Examples

> ls | sort-by size
━━━┯━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━
 # │ name │ type │ readonly │ size   │ accessed       │ modified
───┼──────┼──────┼──────────┼────────┼────────────────┼────────────────
 0 │ az   │ File │          │  18 B  │ 4 minutes ago  │ 4 minutes ago
 1 │ a    │ File │          │  18 B  │ 4 minutes ago  │ 38 minutes ago
 2 │ ad   │ File │          │  18 B  │ 4 minutes ago  │ 4 minutes ago
 3 │ ac   │ File │          │  18 B  │ 4 minutes ago  │ 4 minutes ago
 4 │ ab   │ File │          │  18 B  │ 4 minutes ago  │ 4 minutes ago
 5 │ c    │ File │          │ 102 B  │ 35 minutes ago │ 35 minutes ago
 6 │ d    │ File │          │ 189 B  │ 35 minutes ago │ 34 minutes ago
 7 │ b    │ File │          │ 349 B  │ 35 minutes ago │ 35 minutes ago
━━━┷━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━
> ls | sort-by size name
━━━┯━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━
 # │ name │ type │ readonly │ size   │ accessed       │ modified
───┼──────┼──────┼──────────┼────────┼────────────────┼────────────────
 0 │ a    │ File │          │  18 B  │ 4 minutes ago  │ 39 minutes ago
 1 │ ab   │ File │          │  18 B  │ 4 minutes ago  │ 4 minutes ago
 2 │ ac   │ File │          │  18 B  │ 4 minutes ago  │ 4 minutes ago
 3 │ ad   │ File │          │  18 B  │ 4 minutes ago  │ 4 minutes ago
 4 │ az   │ File │          │  18 B  │ 4 minutes ago  │ 4 minutes ago
 5 │ c    │ File │          │ 102 B  │ 36 minutes ago │ 35 minutes ago
 6 │ d    │ File │          │ 189 B  │ 35 minutes ago │ 35 minutes ago
 7 │ b    │ File │          │ 349 B  │ 36 minutes ago │ 36 minutes ago
━━━┷━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━
> ls | sort-by accessed
━━━┯━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━
 # │ name │ type │ readonly │ size   │ accessed       │ modified
───┼──────┼──────┼──────────┼────────┼────────────────┼────────────────
 0 │ b    │ File │          │ 349 B  │ 37 minutes ago │ 37 minutes ago
 1 │ c    │ File │          │ 102 B  │ 37 minutes ago │ 37 minutes ago
 2 │ d    │ File │          │ 189 B  │ 37 minutes ago │ 36 minutes ago
 3 │ a    │ File │          │  18 B  │ 6 minutes ago  │ 40 minutes ago
 4 │ ab   │ File │          │  18 B  │ 6 minutes ago  │ 6 minutes ago
 5 │ ac   │ File │          │  18 B  │ 6 minutes ago  │ 6 minutes ago
 6 │ ad   │ File │          │  18 B  │ 5 minutes ago  │ 5 minutes ago
 7 │ az   │ File │          │  18 B  │ 5 minutes ago  │ 5 minutes ago
━━━┷━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━

Within the Nushell repository...

> ls | sort-by --insensitive name
────┬────────────────────┬──────┬──────────┬──────────────
 #  │ name               │ type │ size     │ modified
────┼────────────────────┼──────┼──────────┼──────────────
  0 │ assets             │ Dir  │    128 B │ 6 months ago
  1 │ build.rs           │ File │     78 B │ 5 months ago
  2 │ Cargo.lock         │ File │ 118.3 KB │ 1 hour ago
  3 │ Cargo.toml         │ File │   5.5 KB │ 1 hour ago
  4 │ CODE_OF_CONDUCT.md │ File │   3.4 KB │ 1 hour ago
  5 │ CONTRIBUTING.md    │ File │   1.3 KB │ 1 hour ago
  6 │ crates             │ Dir  │    832 B │ 1 hour ago
  7 │ debian             │ Dir  │    352 B │ 6 months ago
  8 │ docker             │ Dir  │    288 B │ 4 months ago
  9 │ docs               │ Dir  │    192 B │ 1 hour ago
 10 │ features.toml      │ File │    632 B │ 5 months ago
 11 │ images             │ Dir  │    160 B │ 6 months ago
 12 │ LICENSE            │ File │   1.1 KB │ 4 months ago
 13 │ Makefile.toml      │ File │    449 B │ 6 months ago
 14 │ README.build.txt   │ File │    192 B │ 1 hour ago
 15 │ README.md          │ File │  16.0 KB │ 1 hour ago
 16 │ rustfmt.toml       │ File │     16 B │ 6 months ago
 17 │ src                │ Dir  │    128 B │ 1 week ago
 18 │ target             │ Dir  │    160 B │ 1 day ago
 19 │ tests              │ Dir  │    192 B │ 4 months ago
 20 │ TODO.md            │ File │      0 B │ 1 week ago
 21 │ wix                │ Dir  │    128 B │ 1 hour ago
────┴────────────────────┴──────┴──────────┴──────────────

Within the Nushell repository...

> ls | sort-by --insensitive type name
────┬────────────────────┬──────┬──────────┬──────────────
 #  │ name               │ type │ size     │ modified
────┼────────────────────┼──────┼──────────┼──────────────
  0 │ assets             │ Dir  │    128 B │ 6 months ago
  1 │ crates             │ Dir  │    832 B │ 1 hour ago
  2 │ debian             │ Dir  │    352 B │ 6 months ago
  3 │ docker             │ Dir  │    288 B │ 4 months ago
  4 │ docs               │ Dir  │    192 B │ 1 hour ago
  5 │ images             │ Dir  │    160 B │ 6 months ago
  6 │ src                │ Dir  │    128 B │ 1 week ago
  7 │ target             │ Dir  │    160 B │ 1 day ago
  8 │ tests              │ Dir  │    192 B │ 4 months ago
  9 │ wix                │ Dir  │    128 B │ 1 hour ago
 10 │ build.rs           │ File │     78 B │ 5 months ago
 11 │ Cargo.lock         │ File │ 118.3 KB │ 1 hour ago
 12 │ Cargo.toml         │ File │   5.5 KB │ 1 hour ago
 13 │ CODE_OF_CONDUCT.md │ File │   3.4 KB │ 1 hour ago
 14 │ CONTRIBUTING.md    │ File │   1.3 KB │ 1 hour ago
 15 │ features.toml      │ File │    632 B │ 5 months ago
 16 │ LICENSE            │ File │   1.1 KB │ 4 months ago
 17 │ Makefile.toml      │ File │    449 B │ 6 months ago
 18 │ README.build.txt   │ File │    192 B │ 1 hour ago
 19 │ README.md          │ File │  16.0 KB │ 1 hour ago
 20 │ rustfmt.toml       │ File │     16 B │ 6 months ago
 21 │ TODO.md            │ File │      0 B │ 1 week ago
────┴────────────────────┴──────┴──────────┴──────────────

split row

Splits contents over multiple rows via the separator.

Syntax: split row <separator>

# Parameters

  • <separator> the character that denotes what separates rows

# Examples

We can build a table from a file that looks like this:

> open table.txt
4, 0, 2, 0, 7, 8

using the split row command.

> open table.txt | split row ", "
───┬───
 # │
───┼───
 0 │ 4
 1 │ 0
 2 │ 2
 3 │ 0
 4 │ 7
 5 │ 8
───┴───

str

Applies the subcommand to a value or a table.

# Examples

> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────────────
 0 │ X │ filesystem │ /home/TUX/stuff/expr/stuff
 1 │   │ filesystem │ /
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
> shells | str upcase path
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────────────
 0 │ X │ filesystem │ /HOME/TUX/STUFF/EXPR/STUFF
 1 │   │ filesystem │ /
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
> shells | str downcase path
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────────────
 0 │ X │ filesystem │ /home/tux/stuff/expr/stuff
 1 │   │ filesystem │ /
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
> shells | str substring "21, 99" path
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────────────
 0 │ X │ filesystem │ stuff
 1 │   │ filesystem │
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
> shells | str substring "6," path
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────────────
 0 │ X │ filesystem │ TUX/stuff/expr/stuff
 1 │   │ filesystem │
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
> echo "1, 2, 3" | split row "," | str to-int | math sum
6
> echo "nu" | str capitalize
Nu
> echo "Nu    " | str trim
Nu
> echo "Nushell" | str reverse
llehsuN
> shells | str find-replace "TUX" "skipper" path
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────────────
 0 │ X │ filesystem │ /home/skipper/stuff/expr/stuff
 1 │   │ filesystem │ /
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

start

Opens each file/directory/URL using the default application.

Syntax: start ...args {flags}

# Parameters

  • args: a list of space-separated files to open

# Flags

-a --application <string>
  Specifies the application used for opening the files/directories/urls

# Example

Open index.html in the system's default browser (cross platform):

> start index.html

Open index.html in Firefox (specific path for OSX):

> start index.html -a /Applications/Firefox.app
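
Since args is a list of paths, several files can presumably be opened in one call, each with its default application:

> start index.html README.md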

sys

This command gives information about the system nu is running on.

# Examples

> sys
─────────┬─────────────────────────────────────────
 host    │ [row 7 columns]
 cpu     │ [row cores current ghz max ghz min ghz]
 disks   │ [table 4 rows]
 mem     │ [row free swap free swap total total]
 net     │ [table 19 rows]
 battery │ [table 1 rows]
─────────┴─────────────────────────────────────────
> sys | get host
──────────┬──────────────────────────────────────────────────────────────────────────────────────────────────
 name     │ Darwin
 release  │ 19.5.0
 version  │ Darwin Kernel Version 19.5.0: Tue May 26 20:41:44 PDT 2020; root:xnu-6153.121.2~2/RELEASE_X86_64
 hostname │ Josephs-MacBook-Pro.local
 arch     │ x86_64
 uptime   │ 5:10:12:33
 sessions │ [table 2 rows]
──────────┴──────────────────────────────────────────────────────────────────────────────────────────────────
> sys | get cpu
─────────────┬────────
 cores       │ 16
 current ghz │ 2.4000
 min ghz     │ 2.4000
 max ghz     │ 2.4000
─────────────┴────────
> sys | get mem
────────────┬─────────
 total      │ 68.7 GB
 free       │ 11.1 GB
 swap total │ 0 B
 swap free  │ 0 B
────────────┴─────────

tags

The tags command allows users to access the metadata of the previous value in the pipeline. This command may be run on multiple values of input as well.

As of writing this, the only metadata returned includes:

  • span: the start and end indices of the previous value's substring location
  • anchor: the source where data was loaded from; this may not appear if the previous pipeline value didn't actually have a source (like trying to open a dir, or running ls on a dir)

# Examples

> open README.md | tags
────────┬──────────────────────────────────────────────────
 span   │ [row end start]
 anchor │ /Users/danielh/Projects/github/nushell/README.md
────────┴──────────────────────────────────────────────────
> open README.md | tags | get span
───────┬────
 start │ 5
 end   │ 14
───────┴────
> ls | tags | first 3 | get span
───┬───────┬─────
 # │ start │ end
───┼───────┼─────
 0 │ 0     │ 2
 1 │ 0     │ 2
 2 │ 0     │ 2
───┴───────┴─────

# Reference

More useful information on the tags command can be found in The Nu Book's entry on Metadata.

textview config

The configuration for textview, which autoview uses to display text files, is based on bat. The textview configuration will not use any existing bat configuration you may have.

# Configuration Points and Defaults

The following configuration points are available (all are implemented unless noted otherwise):

  • term_width: The character width of the terminal (default: autodetect)
  • tab_width: The width of tab characters (default: None - do not turn tabs to spaces)
  • colored_output: Whether or not the output should be colorized (default: true)
  • true_color: Whether or not to output 24-bit colors (default: true)
  • header: Whether to show a header with the file name
  • line_numbers: Whether to show line numbers
  • grid: Whether to paint a grid, separating line numbers, git changes and the code
  • vcs_modification_markers: Whether to show modification markers for VCS changes. This has no effect if the git feature is not activated.
  • snip: Whether to show "snip" markers between visible line ranges (default: no)
  • wrapping_mode: Text wrapping mode (default: do not wrap), options (Character, NoWrapping)
  • use_italics: Whether or not to use ANSI italics (default: off)
  • paging_mode: If and how to use a pager (default: no paging), options (Always, QuitIfOneScreen, Never)
  • pager: Specify the command to start the pager (default: use "less")
  • line_ranges: Specify the lines that should be printed (default: all) - not yet implemented
  • highlight: Specify a line that should be highlighted (default: none). This can be called multiple times to highlight more than one line. See also: highlight_range. - not yet implemented
  • highlight_range: Specify a range of lines that should be highlighted (default: none). This can be called multiple times to highlight more than one range of lines. - not yet implemented
  • theme: Specify the highlighting theme (default: OneHalfDark)

# Example textview configuration for config.toml

[textview]
term_width = "default"
tab_width = 4
colored_output = true
true_color = true
header = true
line_numbers = false
grid = false
vcs_modification_markers = true
snip = true
wrapping_mode = "NoWrapping"
use_italics = true
paging_mode = "QuitIfOneScreen"
pager = "less"
theme = "TwoDark"

# Example Usage

> open src/main.rs
> cat some_file.txt | textview
> fetch https://www.jonathanturner.org/feed.xml --raw

# Help

For a more detailed description of the configuration points that textview uses, please visit the bat repo at https://github.com/sharkdp/bat

to csv

Converts table data into CSV text.

# Example

> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────
 0 │ X │ filesystem │ /home/shaurya
 1 │   │ filesystem │ /home/shaurya/Pictures
 2 │   │ filesystem │ /home/shaurya/Desktop
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━
> shells | to csv
 ,name,path
X,filesystem,/home/shaurya
 ,filesystem,/home/shaurya/Pictures
 ,filesystem,/home/shaurya/Desktop
> open caco3_plastics.csv
━━━┯━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━┯━━━━━━━━━━━━━┯━━━━━━━━━━━━━━┯━━━━━━━━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━━━━
 # │ importer     │ shipper      │ tariff_item │ name         │ origin   │ shipped_at │ arrived_at │ net_weight │ fob_price │ cif_price │ cif_per_net_
   │              │              │             │              │          │            │            │            │           │           │ weight
───┼──────────────┼──────────────┼─────────────┼──────────────┼──────────┼────────────┼────────────┼────────────┼───────────┼───────────┼──────────────
 0 │ PLASTICOS    │ S A REVERTE  │ 2509000000  │ CARBONATO DE │ SPAIN    │ 18/03/2016 │ 17/04/2016 │ 81,000.00  │ 14,417.58 │ 18,252.34 │ 0.23
   │ RIVAL CIA    │              │             │ CALCIO TIPO  │          │            │            │            │           │           │
   │ LTDA         │              │             │ CALCIPORE    │          │            │            │            │           │           │
   │              │              │             │ 160 T AL     │          │            │            │            │           │           │
 1 │ MEXICHEM     │ OMYA ANDINA  │ 2836500000  │ CARBONATO    │ COLOMBIA │ 07/07/2016 │ 10/07/2016 │ 26,000.00  │ 7,072.00  │ 8,127.18  │ 0.31
   │ ECUADOR S.A. │ S A          │             │              │          │            │            │            │           │           │
 2 │ PLASTIAZUAY  │ SA REVERTE   │ 2836500000  │ CARBONATO DE │ SPAIN    │ 27/07/2016 │ 09/08/2016 │ 81,000.00  │ 8,100.00  │ 11,474.55 │ 0.14
   │ SA           │              │             │ CALCIO       │          │            │            │            │           │           │
 3 │ PLASTICOS    │ AND          │ 2836500000  │ CALCIUM      │ TURKEY   │ 04/10/2016 │ 11/11/2016 │ 100,000.00 │ 17,500.00 │ 22,533.75 │ 0.23
   │ RIVAL CIA    │ ENDUSTRIYEL  │             │ CARBONATE    │          │            │            │            │           │           │
   │ LTDA         │ HAMMADDELER  │             │ ANADOLU      │          │            │            │            │           │           │
   │              │ DIS TCARET   │             │ ANDCARB CT-1 │          │            │            │            │           │           │
   │              │ LTD.STI.     │             │              │          │            │            │            │           │           │
 4 │ QUIMICA      │ SA REVERTE   │ 2836500000  │ CARBONATO DE │ SPAIN    │ 24/06/2016 │ 12/07/2016 │ 27,000.00  │ 3,258.90  │ 5,585.00  │ 0.21
   │ COMERCIAL    │              │             │ CALCIO       │          │            │            │            │           │           │
   │ QUIMICIAL    │              │             │              │          │            │            │            │           │           │
   │ CIA. LTDA.   │              │             │              │          │            │            │            │           │           │
 5 │ PICA         │ OMYA ANDINA  │ 3824909999  │ CARBONATO DE │ COLOMBIA │ 01/01/1900 │ 18/01/2016 │ 66,500.00  │ 12,635.00 │ 18,670.52 │ 0.28
   │ PLASTICOS    │ S.A          │             │ CALCIO       │          │            │            │            │           │           │
   │ INDUSTRIALES │              │             │              │          │            │            │            │           │           │
   │ C.A.         │              │             │              │          │            │            │            │           │           │
 6 │ PLASTIQUIM   │ OMYA ANDINA  │ 3824909999  │ CARBONATO DE │ COLOMBIA │ 01/01/1900 │ 25/10/2016 │ 33,000.00  │ 6,270.00  │ 9,999.00  │ 0.30
   │ S.A.         │ S.A NIT      │             │ CALCIO       │          │            │            │            │           │           │
   │              │ 830.027.386- │             │ RECUBIERTO   │          │            │            │            │           │           │
   │              │ 6            │             │ CON ACIDO    │          │            │            │            │           │           │
   │              │              │             │ ESTEARICO    │          │            │            │            │           │           │
   │              │              │             │ OMYA CARB 1T │          │            │            │            │           │           │
   │              │              │             │ CG BBS 1000  │          │            │            │            │           │           │
 7 │ QUIMICOS     │ SIBELCO      │ 3824909999  │ CARBONATO DE │ COLOMBIA │ 01/11/2016 │ 03/11/2016 │ 52,000.00  │ 8,944.00  │ 13,039.05 │ 0.25
   │ ANDINOS      │ COLOMBIA SAS │             │ CALCIO       │          │            │            │            │           │           │
   │ QUIMANDI     │              │             │ RECUBIERTO   │          │            │            │            │           │           │
   │ S.A.         │              │             │              │          │            │            │            │           │           │
 8 │ TIGRE        │ OMYA ANDINA  │ 3824909999  │ CARBONATO DE │ COLOMBIA │ 01/01/1900 │ 28/10/2016 │ 66,000.00  │ 11,748.00 │ 18,216.00 │ 0.28
   │ ECUADOR S.A. │ S.A NIT      │             │ CALCIO       │          │            │            │            │           │           │
   │ ECUATIGRE    │ 830.027.386- │             │ RECUBIERTO   │          │            │            │            │           │           │
   │              │ 6            │             │ CON ACIDO    │          │            │            │            │           │           │
   │              │              │             │ ESTEARICO    │          │            │            │            │           │           │
   │              │              │             │ OMYACARB 1T  │          │            │            │            │           │           │
   │              │              │             │ CG BPA 25 NO │          │            │            │            │           │           │
━━━┷━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━┷━━━━━━━━━━━━━┷━━━━━━━━━━━━━━┷━━━━━━━━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━━━━
> open caco3_plastics.csv | to csv
importer,shipper,tariff_item,name,origin,shipped_at,arrived_at,net_weight,fob_price,cif_price,cif_per_net_weight
PLASTICOS RIVAL CIA LTDA,S A REVERTE,2509000000,CARBONATO DE CALCIO TIPO CALCIPORE 160 T AL,SPAIN,18/03/2016,17/04/2016,"81,000.00","14,417.58","18,252.34",0.23
MEXICHEM ECUADOR S.A.,OMYA ANDINA S A,2836500000,CARBONATO,COLOMBIA,07/07/2016,10/07/2016,"26,000.00","7,072.00","8,127.18",0.31
PLASTIAZUAY SA,SA REVERTE,2836500000,CARBONATO DE CALCIO,SPAIN,27/07/2016,09/08/2016,"81,000.00","8,100.00","11,474.55",0.14
PLASTICOS RIVAL CIA LTDA,AND ENDUSTRIYEL HAMMADDELER DIS TCARET LTD.STI.,2836500000,CALCIUM CARBONATE ANADOLU ANDCARB CT-1,TURKEY,04/10/2016,11/11/2016,"100,000.00","17,500.00","22,533.75",0.23
QUIMICA COMERCIAL QUIMICIAL CIA. LTDA.,SA REVERTE,2836500000,CARBONATO DE CALCIO,SPAIN,24/06/2016,12/07/2016,"27,000.00","3,258.90","5,585.00",0.21
PICA PLASTICOS INDUSTRIALES C.A.,OMYA ANDINA S.A,3824909999,CARBONATO DE CALCIO,COLOMBIA,01/01/1900,18/01/2016,"66,500.00","12,635.00","18,670.52",0.28
PLASTIQUIM S.A.,OMYA ANDINA S.A NIT 830.027.386-6,3824909999,CARBONATO DE CALCIO RECUBIERTO CON ACIDO ESTEARICO OMYA CARB 1T CG BBS 1000,COLOMBIA,01/01/1900,25/10/2016,"33,000.00","6,270.00","9,999.00",0.30
QUIMICOS ANDINOS QUIMANDI S.A.,SIBELCO COLOMBIA SAS,3824909999,CARBONATO DE CALCIO RECUBIERTO,COLOMBIA,01/11/2016,03/11/2016,"52,000.00","8,944.00","13,039.05",0.25
TIGRE ECUADOR S.A. ECUATIGRE,OMYA ANDINA S.A NIT 830.027.386-6,3824909999,CARBONATO DE  CALCIO RECUBIERTO CON ACIDO ESTEARICO OMYACARB 1T CG BPA 25 NO,COLOMBIA,01/01/1900,28/10/2016,"66,000.00","11,748.00","18,216.00",0.28

To use a character other than ',' to separate records, use --separator:

> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────
 0 │ X │ filesystem │ /home/shaurya
 1 │   │ filesystem │ /home/shaurya/Pictures
 2 │   │ filesystem │ /home/shaurya/Desktop
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━
> shells | to csv --separator ';'
 ;name;path
X;filesystem;/home/shaurya
 ;filesystem;/home/shaurya/Pictures
 ;filesystem;/home/shaurya/Desktop

The string '\t' can be used to separate on tabs. Note that this is the same as using the to tsv command.
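
For instance, the following produces tab-separated output equivalent to what to tsv emits (a minimal sketch; the output itself is not repeated here):

> shells | to csv --separator '\t'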

Newlines '\n' are not acceptable separators.

Note that separators are currently provided as strings and need to be wrapped in quotes.

It is also considered an error to use a separator greater than one char:

> open pets.txt | from csv --separator '123'
error: Expected a single separator char from --separator
- shell:1:37
1 | open pets.txt | from csv --separator '123'
  |                                      ^^^^^ requires a single character string input

to json

Converts table data into JSON text.

# Flags

  • -p, --pretty <integer>: Formats the JSON text with the provided indentation setting

# Example

> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────
 0 │ X │ filesystem │ /home/shaurya
 1 │   │ filesystem │ /home/shaurya/Pictures
 2 │   │ filesystem │ /home/shaurya/Desktop
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━
> shells | to json
[{" ":"X","name":"filesystem","path":"/home/shaurya"},{" ":" ","name":"filesystem","path":"/home/shaurya/Pictures"},{" ":" ","name":"filesystem","path":"/home/shaurya/Desktop"}]
> open sgml_description.json
━━━━━━━━━━━━━━━━
 glossary
────────────────
 [table: 1 row]
━━━━━━━━━━━━━━━━
> open sgml_description.json | to json
{"glossary":{"title":"example glossary","GlossDiv":{"title":"S","GlossList":{"GlossEntry":{"ID":"SGML","SortAs":"SGML","GlossTerm":"Standard Generalized Markup Language","Acronym":"SGML","Abbrev":"ISO 8879:1986","Height":10,"GlossDef":{"para":"A meta-markup language, used to create markup languages such as DocBook.","GlossSeeAlso":["GML","XML"]},"Sections":[101,102],"GlossSee":"markup"}}}}}

We can also convert formats!

> open jonathan.xml
━━━━━━━━━━━━━━━━
 rss
────────────────
 [table: 1 row]
━━━━━━━━━━━━━━━━
> open jonathan.xml | to json
{"rss":[{"channel":[{"title":["Jonathan Turner"]},{"link":["http://www.jonathanturner.org"]},{"link":[]},{"item":[{"title":["Creating crossplatform Rust terminal apps"]},{"description":["<p><img src=\"/images/pikachu.jpg\" alt=\"Pikachu animation in Windows\" /></p>\n\n<p><em>Look Mom, Pikachu running in Windows CMD!</em></p>\n\n<p>Part of the adventure is not seeing the way ahead and going anyway.</p>\n"]},{"pubDate":["Mon, 05 Oct 2015 00:00:00 +0000"]},{"link":["http://www.jonathanturner.org/2015/10/off-to-new-adventures.html"]},{"guid":["http://www.jonathanturner.org/2015/10/off-to-new-adventures.html"]}]}]}]}

to md

Converts table data into simple Markdown text.

# Flags

  • -p, --pretty: Formats the Markdown table to vertically align items

# Example

> ls | to md
|name|type|size|modified|
|-|-|-|-|
|CODE_OF_CONDUCT.md|File|3.4 KB|2 months ago|
|CONTRIBUTING.md|File|1.4 KB|1 month ago|
|Cargo.lock|File|144.4 KB|2 days ago|
|Cargo.toml|File|6.0 KB|2 days ago|
|LICENSE|File|1.1 KB|2 months ago|
|Makefile.toml|File|449 B|2 months ago|
|README.build.txt|File|192 B|2 months ago|
|README.md|File|15.9 KB|1 month ago|
|TODO.md|File|0 B|2 months ago|
|crates|Dir|896 B|2 days ago|
|debian|Dir|352 B|2 months ago|
|docker|Dir|288 B|1 month ago|
|docs|Dir|256 B|1 month ago|
|features.toml|File|632 B|2 months ago|
|images|Dir|160 B|2 months ago|
|pkg_mgrs|Dir|96 B|1 month ago|
|rustfmt.toml|File|16 B|9 months ago|
|samples|Dir|96 B|1 month ago|
|src|Dir|128 B|2 days ago|
|target|Dir|160 B|1 month ago|
|tests|Dir|192 B|2 months ago|
|wix|Dir|128 B|23 hours ago|

If we provide the -p flag, we can obtain a formatted version of the Markdown table:

> ls | to md -p
|name              |type|size    |modified    |
|------------------|----|--------|------------|
|CODE_OF_CONDUCT.md|File|3.4 KB  |2 months ago|
|CONTRIBUTING.md   |File|1.4 KB  |1 month ago |
|Cargo.lock        |File|144.4 KB|2 days ago  |
|Cargo.toml        |File|6.0 KB  |2 days ago  |
|LICENSE           |File|1.1 KB  |2 months ago|
|Makefile.toml     |File|449 B   |2 months ago|
|README.build.txt  |File|192 B   |2 months ago|
|README.md         |File|15.9 KB |1 month ago |
|TODO.md           |File|0 B     |2 months ago|
|crates            |Dir |896 B   |2 days ago  |
|debian            |Dir |352 B   |2 months ago|
|docker            |Dir |288 B   |1 month ago |
|docs              |Dir |256 B   |1 month ago |
|features.toml     |File|632 B   |2 months ago|
|images            |Dir |160 B   |2 months ago|
|pkg_mgrs          |Dir |96 B    |1 month ago |
|rustfmt.toml      |File|16 B    |9 months ago|
|samples           |Dir |96 B    |1 month ago |
|src               |Dir |128 B   |2 days ago  |
|target            |Dir |160 B   |1 month ago |
|tests             |Dir |192 B   |2 months ago|
|wix               |Dir |128 B   |23 hours ago|

to toml

Converts table data into TOML text.

# Example

> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────
 0 │ X │ filesystem │ /home/shaurya
 1 │   │ filesystem │ /home/shaurya/Pictures
 2 │   │ filesystem │ /home/shaurya/Desktop
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━
> shells | to toml
[[]]
" " = "X"
name = "filesystem"
path = "/home/shaurya"

[[]]
" " = " "
name = "filesystem"
path = "/home/shaurya/Pictures"

[[]]
" " = " "
name = "filesystem"
path = "/home/shaurya/Desktop"
> open cargo_sample.toml
━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━
 dependencies   │ dev-dependencies │ package
────────────────┼──────────────────┼────────────────
 [table: 1 row] │ [table: 1 row]   │ [table: 1 row]
━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━
> open cargo_sample.toml | to toml
[dependencies]
ansi_term = "0.11.0"
directories = "2.0.2"
byte-unit = "2.1.0"
bytes = "0.4.12"
chrono-humanize = "0.0.11"
chrono-tz = "0.5.1"
clap = "2.33.0"
conch-parser = "0.1.1"
derive-new = "0.5.6"
dunce = "1.0.0"
futures-sink-preview = "0.3.0-alpha.16"
futures_codec = "0.2.2"
getset = "0.0.7"
itertools = "0.8.0"
lalrpop-util = "0.17.0"
language-reporting = "0.3.0"
log = "0.4.6"
logos = "0.10.0-rc2"
logos-derive = "0.10.0-rc2"
nom = "5.0.0-beta1"
ordered-float = "1.0.2"
pretty_env_logger = "0.3.0"
prettyprint = "0.6.0"
prettytable-rs = "0.8.0"
regex = "1.1.6"
rustyline = "4.1.0"
serde = "1.0.91"
serde_derive = "1.0.91"
serde_json = "1.0.39"
sysinfo = "0.8.4"
term = "0.5.2"
tokio-fs = "0.1.6"
toml = "0.5.1"
toml-query = "0.9.0"

[dependencies.chrono]
features = ["serde"]
version = "0.4.6"

[dependencies.cursive]
default-features = false
features = ["pancurses-backend"]
version = "0.26.0"

[dependencies.futures-preview]
features = ["compat", "io-compat"]
version = "0.3.0-alpha.16"

[dependencies.indexmap]
features = ["serde-1"]
version = "1.0.2"

[dependencies.pancurses]
features = ["win32a"]
version = "0.16"

[dev-dependencies]
pretty_assertions = "0.6.1"

[package]
authors = ["The Nu Project Contributors"]
description = "A shell for the GitHub era"
edition = "2018"
license = "ISC"
name = "nu"
version = "0.1.1"

to tsv

Converts table data into TSV text.

# Example

> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────
 0 │ X │ filesystem │ /home/shaurya
 1 │   │ filesystem │ /home/shaurya/Pictures
 2 │   │ filesystem │ /home/shaurya/Desktop
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━
> shells | to tsv
  name  path
X filesystem  /home/shaurya
> open caco3_plastics.tsv
━━━┯━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━┯━━━━━━━━━━━━━┯━━━━━━━━━━━━━━┯━━━━━━━━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━━━━
 # │ importer     │ shipper      │ tariff_item │ name         │ origin   │ shipped_at │ arrived_at │ net_weight │ fob_price │ cif_price │ cif_per_net_
   │              │              │             │              │          │            │            │            │           │           │ weight
───┼──────────────┼──────────────┼─────────────┼──────────────┼──────────┼────────────┼────────────┼────────────┼───────────┼───────────┼──────────────
 0 │ PLASTICOS    │ S A REVERTE  │ 2509000000  │ CARBONATO DE │ SPAIN    │ 18/03/2016 │ 17/04/2016 │ 81,000.00  │ 14,417.58 │ 18,252.34 │ 0.23
   │ RIVAL CIA    │              │             │ CALCIO TIPO  │          │            │            │            │           │           │
   │ LTDA         │              │             │ CALCIPORE    │          │            │            │            │           │           │
   │              │              │             │ 160 T AL     │          │            │            │            │           │           │
 1 │ MEXICHEM     │ OMYA ANDINA  │ 2836500000  │ CARBONATO    │ COLOMBIA │ 07/07/2016 │ 10/07/2016 │ 26,000.00  │ 7,072.00  │ 8,127.18  │ 0.31
   │ ECUADOR S.A. │ S A          │             │              │          │            │            │            │           │           │
 2 │ PLASTIAZUAY  │ SA REVERTE   │ 2836500000  │ CARBONATO DE │ SPAIN    │ 27/07/2016 │ 09/08/2016 │ 81,000.00  │ 8,100.00  │ 11,474.55 │ 0.14
   │ SA           │              │             │ CALCIO       │          │            │            │            │           │           │
 3 │ PLASTICOS    │ AND          │ 2836500000  │ CALCIUM      │ TURKEY   │ 04/10/2016 │ 11/11/2016 │ 100,000.00 │ 17,500.00 │ 22,533.75 │ 0.23
   │ RIVAL CIA    │ ENDUSTRIYEL  │             │ CARBONATE    │          │            │            │            │           │           │
   │ LTDA         │ HAMMADDELER  │             │ ANADOLU      │          │            │            │            │           │           │
   │              │ DIS TCARET   │             │ ANDCARB CT-1 │          │            │            │            │           │           │
   │              │ LTD.STI.     │             │              │          │            │            │            │           │           │
 4 │ QUIMICA      │ SA REVERTE   │ 2836500000  │ CARBONATO DE │ SPAIN    │ 24/06/2016 │ 12/07/2016 │ 27,000.00  │ 3,258.90  │ 5,585.00  │ 0.21
   │ COMERCIAL    │              │             │ CALCIO       │          │            │            │            │           │           │
   │ QUIMICIAL    │              │             │              │          │            │            │            │           │           │
   │ CIA. LTDA.   │              │             │              │          │            │            │            │           │           │
 5 │ PICA         │ OMYA ANDINA  │ 3824909999  │ CARBONATO DE │ COLOMBIA │ 01/01/1900 │ 18/01/2016 │ 66,500.00  │ 12,635.00 │ 18,670.52 │ 0.28
   │ PLASTICOS    │ S.A          │             │ CALCIO       │          │            │            │            │           │           │
   │ INDUSTRIALES │              │             │              │          │            │            │            │           │           │
   │ C.A.         │              │             │              │          │            │            │            │           │           │
 6 │ PLASTIQUIM   │ OMYA ANDINA  │ 3824909999  │ CARBONATO DE │ COLOMBIA │ 01/01/1900 │ 25/10/2016 │ 33,000.00  │ 6,270.00  │ 9,999.00  │ 0.30
   │ S.A.         │ S.A NIT      │             │ CALCIO       │          │            │            │            │           │           │
   │              │ 830.027.386- │             │ RECUBIERTO   │          │            │            │            │           │           │
   │              │ 6            │             │ CON ACIDO    │          │            │            │            │           │           │
   │              │              │             │ ESTEARICO    │          │            │            │            │           │           │
   │              │              │             │ OMYA CARB 1T │          │            │            │            │           │           │
   │              │              │             │ CG BBS 1000  │          │            │            │            │           │           │
 7 │ QUIMICOS     │ SIBELCO      │ 3824909999  │ CARBONATO DE │ COLOMBIA │ 01/11/2016 │ 03/11/2016 │ 52,000.00  │ 8,944.00  │ 13,039.05 │ 0.25
   │ ANDINOS      │ COLOMBIA SAS │             │ CALCIO       │          │            │            │            │           │           │
   │ QUIMANDI     │              │             │ RECUBIERTO   │          │            │            │            │           │           │
   │ S.A.         │              │             │              │          │            │            │            │           │           │
 8 │ TIGRE        │ OMYA ANDINA  │ 3824909999  │ CARBONATO DE │ COLOMBIA │ 01/01/1900 │ 28/10/2016 │ 66,000.00  │ 11,748.00 │ 18,216.00 │ 0.28
   │ ECUADOR S.A. │ S.A NIT      │             │ CALCIO       │          │            │            │            │           │           │
   │ ECUATIGRE    │ 830.027.386- │             │ RECUBIERTO   │          │            │            │            │           │           │
   │              │ 6            │             │ CON ACIDO    │          │            │            │            │           │           │
   │              │              │             │ ESTEARICO    │          │            │            │            │           │           │
   │              │              │             │ OMYACARB 1T  │          │            │            │            │           │           │
   │              │              │             │ CG BPA 25 NO │          │            │            │            │           │           │
━━━┷━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━┷━━━━━━━━━━━━━┷━━━━━━━━━━━━━━┷━━━━━━━━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━━━━
> open caco3_plastics.tsv | to tsv
importer        shipper tariff_item     name    origin  shipped_at      arrived_at      net_weight      fob_price       cif_price       cif_per_net_weight
PLASTICOS RIVAL CIA LTDA        S A REVERTE     2509000000      CARBONATO DE CALCIO TIPO CALCIPORE 160 T AL     SPAIN   18/03/2016      17/04/2016    81,000.00        14,417.58       18,252.34       0.23
MEXICHEM ECUADOR S.A.   OMYA ANDINA S A 2836500000      CARBONATO       COLOMBIA        07/07/2016      10/07/2016      26,000.00       7,072.00      8,127.18 0.31
PLASTIAZUAY SA  SA REVERTE      2836500000      CARBONATO DE CALCIO     SPAIN   27/07/2016      09/08/2016      81,000.00       8,100.00        11,474.55      0.14
PLASTICOS RIVAL CIA LTDA        AND ENDUSTRIYEL HAMMADDELER DIS TCARET LTD.STI. 2836500000      CALCIUM CARBONATE ANADOLU ANDCARB CT-1  TURKEY  04/10/2016     11/11/2016      100,000.00      17,500.00       22,533.75       0.23
QUIMICA COMERCIAL QUIMICIAL CIA. LTDA.  SA REVERTE      2836500000      CARBONATO DE CALCIO     SPAIN   24/06/2016      12/07/2016      27,000.00     3,258.90 5,585.00        0.21
PICA PLASTICOS INDUSTRIALES C.A.        OMYA ANDINA S.A 3824909999      CARBONATO DE CALCIO     COLOMBIA        01/01/1900      18/01/2016      66,500.00      12,635.00       18,670.52       0.28
PLASTIQUIM S.A. OMYA ANDINA S.A NIT 830.027.386-6       3824909999      CARBONATO DE CALCIO RECUBIERTO CON ACIDO ESTEARICO OMYA CARB 1T CG BBS 1000   COLOMBIA 01/01/1900      25/10/2016      33,000.00       6,270.00        9,999.00        0.30
QUIMICOS ANDINOS QUIMANDI S.A.  SIBELCO COLOMBIA SAS    3824909999      CARBONATO DE CALCIO RECUBIERTO  COLOMBIA        01/11/2016      03/11/2016    52,000.00        8,944.00        13,039.05       0.25
TIGRE ECUADOR S.A. ECUATIGRE    OMYA ANDINA S.A NIT 830.027.386-6       3824909999      CARBONATO DE  CALCIO RECUBIERTO CON ACIDO ESTEARICO OMYACARB 1T CG BPA 25 NO   COLOMBIA        01/01/1900      28/10/2016      66,000.00       11,748.00       18,216.00       0.28

to url

Converts table data into URL-encoded text.

# Example

> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────
 0 │ X │ filesystem │ /home/shaurya
 1 │   │ filesystem │ /home/shaurya/Pictures
 2 │   │ filesystem │ /home/shaurya/Desktop
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━
> shells | to url
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │ value
───┼───────────────────────────────────────────────────────
 0 │ +=X&name=filesystem&path=%2Fhome%2Fshaurya
 1 │ +=+&name=filesystem&path=%2Fhome%2Fshaurya%2FPictures
 2 │ +=+&name=filesystem&path=%2Fhome%2Fshaurya%2FDesktop
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
> open sample.url
━━━━━━━━━━┯━━━━━━━━┯━━━━━━┯━━━━━━━━
 bread    │ cheese │ meat │ fat
──────────┼────────┼──────┼────────
 baguette │ comté  │ ham  │ butter
━━━━━━━━━━┷━━━━━━━━┷━━━━━━┷━━━━━━━━
> open sample.url  | to url
bread=baguette&cheese=comt%C3%A9&meat=ham&fat=butter

to xml

Converts table data into XML text.

# Flags

  • -p, --pretty <integer>: Formats the XML text with the provided indentation setting

# Example

> open jonathan.xml
━━━━━━━━━━━━━━━━
 rss
────────────────
 [table: 1 row]
━━━━━━━━━━━━━━━━
> cat jonathan.xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
        <channel>
                <title>Jonathan Turner</title>
                <link>http://www.jonathanturner.org</link>
                <atom:link href="http://www.jonathanturner.org/feed.xml" rel="self" type="application/rss+xml" />

                        <item>
                                <title>Creating crossplatform Rust terminal apps</title>
        <description>&lt;p&gt;&lt;img src=&quot;/images/pikachu.jpg&quot; alt=&quot;Pikachu animation in Windows&quot; /&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Look Mom, Pikachu running in Windows CMD!&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Part of the adventure is not seeing the way ahead and going anyway.&lt;/p&gt;
</description>
<pubDate>Mon, 05 Oct 2015 00:00:00 +0000</pubDate>
<link>http://www.jonathanturner.org/2015/10/off-to-new-adventures.html</link>
<guid isPermaLink="true">http://www.jonathanturner.org/2015/10/off-to-new-adventures.html</guid>
</item>

        </channel>

</rss>
> open jonathan.xml | to xml --pretty 2
<rss version="2.0">
  <channel>
    <title>Jonathan Turner</title>
    <link>http://www.jonathanturner.org</link>
    <link href="http://www.jonathanturner.org/feed.xml" rel="self" type="application/rss+xml">
    </link>
    <item>
      <title>Creating crossplatform Rust terminal apps</title>
      <description>&lt;p&gt;&lt;img src=&quot;/images/pikachu.jpg&quot; alt=&quot;Pikachu animation in Windows&quot; /&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Look Mom, Pikachu running in Windows CMD!&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Part of the adventure is not seeing the way ahead and going anyway.&lt;/p&gt;
</description>
<pubDate>Mon, 05 Oct 2015 00:00:00 +0000</pubDate>
<link>http://www.jonathanturner.org/2015/10/off-to-new-adventures.html</link>
<guid isPermaLink="true">http://www.jonathanturner.org/2015/10/off-to-new-adventures.html</guid>
</item>
</channel>
</rss>

Due to the differences between XML and Nu's internal representation, to xml is currently limited; it will:

  • Only process table data loaded from XML files (e.g. open file.json | to xml will fail)
  • Drop XML prolog declarations
  • Drop namespaces
  • Drop comments

to yaml

Converts table data into YAML text.

# Example

> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────
 0 │ X │ filesystem │ /home/shaurya
 1 │   │ filesystem │ /home/shaurya/Pictures
 2 │   │ filesystem │ /home/shaurya/Desktop
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━
> shells | to yaml
---
- " ": X
  name: filesystem
  path: /home/shaurya
- " ": " "
  name: filesystem
  path: /home/shaurya/Pictures
- " ": " "
  name: filesystem
  path: /home/shaurya/Desktop
> open appveyor.yml
━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━┯━━━━━━━┯━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━
 image              │ environment    │ install         │ build │ test_script     │ cache
────────────────────┼────────────────┼─────────────────┼───────┼─────────────────┼─────────────────
 Visual Studio 2017 │ [table: 1 row] │ [table: 5 rows] │       │ [table: 2 rows] │ [table: 2 rows]
━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━┷━━━━━━━┷━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━
> open appveyor.yml | to yaml
---
image: Visual Studio 2017
environment:
  global:
    PROJECT_NAME: nushell
    RUST_BACKTRACE: 1
  matrix:
    - TARGET: x86_64-pc-windows-msvc
      CHANNEL: nightly
      BITS: 64
install:
  - "set PATH=C:\\msys64\\mingw%BITS%\\bin;C:\\msys64\\usr\\bin;%PATH%"
  - "curl -sSf -o rustup-init.exe https://win.rustup.rs"
  - rustup-init.exe -y --default-host %TARGET% --default-toolchain %CHANNEL%-%TARGET%
  - "set PATH=%PATH%;C:\\Users\\appveyor\\.cargo\\bin"
  - "call \"C:\\Program Files (x86)\\Microsoft Visual Studio\\2017\\Community\\VC\\Auxiliary\\Build\\vcvars64.bat\""
build: false
test_script:
  - cargo build --verbose
  - cargo test --all --verbose
cache:
  - target -> Cargo.lock
  - "C:\\Users\\appveyor\\.cargo\\registry -> Cargo.lock"

to

Converts table data into a string or binary. The target format is specified as a subcommand, like to csv or to json.

# Available Subcommands

The subcommands documented in this reference are to csv, to json, to md, to toml, to tsv, to url, to xml, and to yaml. Subcommands without an entry here are currently missing their documentation.

# Example

> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────
 0 │ X │ filesystem │ /home/shaurya
 1 │   │ filesystem │ /home/shaurya/Pictures
 2 │   │ filesystem │ /home/shaurya/Desktop
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━
> shells | to csv
 ,name,path
X,filesystem,/home/shaurya
 ,filesystem,/home/shaurya/Pictures
 ,filesystem,/home/shaurya/Desktop
> open sample.url
━━━━━━━━━━┯━━━━━━━━┯━━━━━━┯━━━━━━━━
 bread    │ cheese │ meat │ fat
──────────┼────────┼──────┼────────
 baguette │ comté  │ ham  │ butter
━━━━━━━━━━┷━━━━━━━━┷━━━━━━┷━━━━━━━━
> open sample.url  | to url
bread=baguette&cheese=comt%C3%A9&meat=ham&fat=butter

touch

Creates one or more files in the current directory or in an already existing directory. It has no effect on existing files: unlike GNU touch, the access time and the modified time are not updated.

-h, --help Display help message.

# Examples

Create a file in an empty folder. Then touch the file and list files again to observe that the modified time has not been updated.

> ls
> touch file.ext; ls
──────────┬─────────────
 name     │ file.ext
 type     │ File
 size     │ 0 B
 modified │ 0 secs ago
──────────┴─────────────
> touch file.ext; ls
──────────┬───────────
 name     │ file.ext
 type     │ File
 size     │ 0 B
 modified │ 10 secs ago
──────────┴───────────

Create a file within an already existent folder.

> mkdir dir
> touch dir/file.ext; ls dir
──────────┬───────────
 name     │ dir/file.ext
 type     │ File
 size     │ 0 B
 modified │ 0 secs ago
──────────┴───────────

Create three files at once

> touch a b c
> ls
────┬────────────────────┬──────┬──────────┬──────────────
 #  │        name        │ type │   size   │   modified
────┼────────────────────┼──────┼──────────┼──────────────
  0 │ a                  │ File │      0 B │ 0 sec ago
  1 │ b                  │ File │      0 B │ 0 sec ago
  2 │ c                  │ File │      0 B │ 0 sec ago
────┴────────────────────┴──────┴──────────┴──────────────

uniq

Returns unique rows or values from a dataset.

# Examples

Given a file test.csv containing:

first_name,last_name,rusty_at,type
Andrés,Robalino,10/11/2013,A
Andrés,Robalino,10/11/2013,A
Jonathan,Turner,10/12/2013,B
Yehuda,Katz,10/11/2013,A
> open test.csv | uniq
━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━━┯━━━━━━
 # │ first_name │ last_name │ rusty_at   │ type
───┼────────────┼───────────┼────────────┼──────
 0 │ Andrés     │ Robalino  │ 10/11/2013 │ A
 1 │ Jonathan   │ Turner    │ 10/12/2013 │ B
 2 │ Yehuda     │ Katz      │ 10/11/2013 │ A
━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━━┷━━━━━━
> open test.csv | get type | uniq
━━━┯━━━━━━━━━
 # │
───┼─────────
 0 │ A
 1 │ B
━━━┷━━━━━━━━━

# Counting

--count or -c is the flag to output a count column.

> open test.csv | get type | uniq -c
───┬───────┬───────
 # │ value │ count
───┼───────┼───────
 0 │ A     │     3
 1 │ B     │     1
───┴───────┴───────

update

Updates an existing column on a table. The first parameter is the column to update and the second parameter is the value to put in it.

# Examples

> ls
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━
 # │ name                       │ type │ readonly │ size   │ accessed  │ modified
───┼────────────────────────────┼──────┼──────────┼────────┼───────────┼───────────
 0 │ zeusiscrazy.txt            │ File │          │ 556 B  │ a day ago │ a day ago
 1 │ coww.txt                   │ File │          │  24 B  │ a day ago │ a day ago
 2 │ randomweirdstuff.txt       │ File │          │ 197 B  │ a day ago │ a day ago
 3 │ abaracadabra.txt           │ File │          │ 401 B  │ a day ago │ a day ago
 4 │ youshouldeatmorecereal.txt │ File │          │ 768 B  │ a day ago │ a day ago
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━
> ls | update modified neverrrr
━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━
 # │ name                       │ type │ readonly │ size   │ accessed  │ modified
───┼────────────────────────────┼──────┼──────────┼────────┼───────────┼──────────
 0 │ zeusiscrazy.txt            │ File │          │ 556 B  │ a day ago │ neverrrr
 1 │ coww.txt                   │ File │          │  24 B  │ a day ago │ neverrrr
 2 │ randomweirdstuff.txt       │ File │          │ 197 B  │ a day ago │ neverrrr
 3 │ abaracadabra.txt           │ File │          │ 401 B  │ a day ago │ neverrrr
 4 │ youshouldeatmorecereal.txt │ File │          │ 768 B  │ a day ago │ neverrrr
━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━
> shells
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼────────────────────────────────
 0 │ X │ filesystem │ /home/username/stuff/expr/stuff
 1 │   │ filesystem │ /
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
> shells | update " " X | update path /
━━━┯━━━┯━━━━━━━━━━━━┯━━━━━━
 # │   │ name       │ path
───┼───┼────────────┼──────
 0 │ X │ filesystem │ /
 1 │ X │ filesystem │ /
━━━┷━━━┷━━━━━━━━━━━━┷━━━━━━

Collect all the values of a nested column and join them together

> version | update features {get features | str collect ', '}
───┬─────────┬──────────────────────────────────────────┬───────────────────────────
 # │ version │               commit_hash                │         features
───┼─────────┼──────────────────────────────────────────┼───────────────────────────
 0 │ 0.20.0  │ fdab3368094e938c390f1e5a7892a42da45add3e │ default, clipboard, trash
───┴─────────┴──────────────────────────────────────────┴───────────────────────────

version

Outputs the nushell version.

# Examples

> version
─────────┬────────
 version │ 0.15.1
─────────┴────────

where

This command filters the content of a table based on a condition passed as a parameter, which must be a boolean expression making use of any of the table columns. Other commands such as ls are capable of feeding where with their output through pipelines.

Where has two general forms:

  • where <column_name> <comparison> <value>
  • where <column_name>

# Where with comparison

In the first form, where is passed a column name that the filter will run against. Next is the operator used to compare this column to its value. The following operators are supported:

  • < (less than)
  • <= (less than or equal)
  • > (greater than)
  • >= (greater than or equal)
  • != (not equal)
  • == (equal)

Strings have two additional operators:

  • =~ (fuzzy match to allow)
  • !~ (fuzzy match to not allow)

Dates can also be compared using the duration types. For example, where accessed > 2w will check the date in accessed to see if it's greater than 2 weeks ago. Durations currently allow these abbreviations:

  • 1s (one second)
  • 1m (one minute)
  • 1h (one hour)
  • 1d (one day)
  • 1w (one week)
  • 1M (one month)
  • 1y (one year)

# Boolean check

Where with the form | where readonly is used to check boolean values. For example, the command ls --long | where readonly will list only those files that are readonly.
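
A minimal usage sketch of this form (output omitted; the columns are the same as in the ls -l example further below):

> ls --long | where readonly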

# Usage

> [input-command] | where [condition]

# Examples

> ls | where size > 4kb
───┬────────────┬──────┬──────────┬─────────────
 # │ name       │ type │ size     │ modified
───┼────────────┼──────┼──────────┼─────────────
 0 │ Cargo.lock │ File │ 113.3 KB │ 53 mins ago
 1 │ Cargo.toml │ File │   4.6 KB │ 53 mins ago
 2 │ README.md  │ File │  15.8 KB │ 2 mins ago
───┴────────────┴──────┴──────────┴─────────────
> ps | where cpu > 0
───┬───────┬──────────────────┬─────────┬────────┬──────────┬─────────
 # │ pid   │ name             │ status  │ cpu    │ mem      │ virtual
───┼───────┼──────────────────┼─────────┼────────┼──────────┼─────────
 0 │ 17917 │ nu_plugin_core_p │ Running │ 4.1678 │ 82.1 MB  │  4.8 GB
 1 │ 14717 │ Discord Helper ( │ Running │ 1.6842 │ 371.9 MB │  8.0 GB
 2 │ 14713 │ Discord Helper   │ Running │ 0.2099 │ 27.8 MB  │  5.8 GB
 3 │ 14710 │ Discord          │ Running │ 0.0883 │ 105.4 MB │  7.0 GB
 4 │ 9643  │ Terminal         │ Running │ 4.0313 │ 266.4 MB │  7.6 GB
 5 │ 7864  │ Microsoft.Python │ Running │ 0.9828 │ 340.9 MB │  8.0 GB
 6 │ 24402 │ Code Helper (Ren │ Running │ 1.0644 │ 337.3 MB │  8.4 GB
 7 │ 24401 │ Code Helper (Ren │ Running │ 1.0031 │ 593.5 MB │  8.6 GB
 8 │ 519   │ EmojiFunctionRow │ Running │ 0.2063 │ 52.7 MB  │  7.5 GB
 9 │ 376   │ CommCenter       │ Running │ 0.1620 │ 30.0 MB  │  6.5 GB
───┴───────┴──────────────────┴─────────┴────────┴──────────┴─────────

> ls -l | where accessed <= 1w
───┬────────────────────┬──────┬────────┬──────────┬───────────┬─────────────┬───────┬──────────┬──────────────┬─────────────┬─────────────
 # │ name               │ type │ target │ readonly │ mode      │ uid         │ group │ size     │ created      │ accessed    │ modified
───┼────────────────────┼──────┼────────┼──────────┼───────────┼─────────────┼───────┼──────────┼──────────────┼─────────────┼─────────────
 0 │ CODE_OF_CONDUCT.md │ File │        │ No       │ rw-r--r-- │ josephlyons │ staff │   3.4 KB │ 52 mins ago  │ 52 secs ago │ 52 mins ago
 1 │ CONTRIBUTING.md    │ File │        │ No       │ rw-r--r-- │ josephlyons │ staff │   1.3 KB │ 52 mins ago  │ 4 mins ago  │ 4 mins ago
 2 │ Cargo.lock         │ File │        │ No       │ rw-r--r-- │ josephlyons │ staff │ 113.3 KB │ 52 mins ago  │ 52 mins ago │ 52 mins ago
 3 │ Cargo.toml         │ File │        │ No       │ rw-r--r-- │ josephlyons │ staff │   4.6 KB │ 52 mins ago  │ 52 mins ago │ 52 mins ago
 4 │ README.md          │ File │        │ No       │ rw-r--r-- │ josephlyons │ staff │  15.8 KB │ 52 mins ago  │ 1 min ago   │ 1 min ago
 5 │ TODO.md            │ File │        │ No       │ rw-r--r-- │ josephlyons │ staff │      0 B │ 52 mins ago  │ 52 mins ago │ 52 mins ago
 6 │ crates             │ Dir  │        │ No       │ rwxr-xr-x │ josephlyons │ staff │    704 B │ 4 months ago │ 52 mins ago │ 52 mins ago
 7 │ docs               │ Dir  │        │ No       │ rwxr-xr-x │ josephlyons │ staff │    192 B │ 5 months ago │ 52 mins ago │ 52 mins ago
 8 │ src                │ Dir  │        │ No       │ rwxr-xr-x │ josephlyons │ staff │    128 B │ 5 months ago │ 1 day ago   │ 1 day ago
 9 │ target             │ Dir  │        │ No       │ rwxr-xr-x │ josephlyons │ staff │    160 B │ 5 days ago   │ 5 days ago  │ 5 days ago
───┴────────────────────┴──────┴────────┴──────────┴───────────┴─────────────┴───────┴──────────┴──────────────┴─────────────┴─────────────
> ls -a | where name =~ "yml"
──────────┬─────────────
 name     │ .gitpod.yml
 type     │ File
 size     │ 866 B
 modified │ 1 month ago
──────────┴─────────────

which

Finds a program file.

Usage:

which {flags}

# Parameters

  • application: the name of the command to find the path to

# Flags

  • --all: list all executables

# Examples

which finds the location of an executable:

> which python
─────────┬─────────────────
 arg     │ python
 path    │ /usr/bin/python
 builtin │ No
─────────┴─────────────────
> which cargo
─────────┬────────────────────────────
 arg     │ cargo
 path    │ /home/bob/.cargo/bin/cargo
 builtin │ No
─────────┴────────────────────────────

which will identify nushell commands:

> which ls
─────────┬──────────────────────────
 arg     │ ls
 path    │ nushell built-in command
 builtin │ Yes
─────────┴──────────────────────────
> which which
─────────┬──────────────────────────
 arg     │ which
 path    │ nushell built-in command
 builtin │ Yes
─────────┴──────────────────────────

Passing the all flag identifies all instances of a command or binary

> which ls --all
───┬─────┬──────────────────────────┬─────────
 # │ arg │ path                     │ builtin
───┼─────┼──────────────────────────┼─────────
 0 │ ls  │ nushell built-in command │ Yes
 1 │ ls  │ /bin/ls                  │ No
───┴─────┴──────────────────────────┴─────────

which will also identify local binaries

> touch foo
> chmod +x foo
> which ./foo
─────────┬────────────────────────────────
 arg     │ ./foo
 path    │ /Users/josephlyons/Desktop/foo
 builtin │ No
─────────┴────────────────────────────────

which also identifies aliases

> alias e = echo
> which e
───┬─────┬───────────────┬─────────
 # │ arg │     path      │ builtin
───┼─────┼───────────────┼─────────
 0 │ e   │ Nushell alias │ No
───┴─────┴───────────────┴─────────

and custom commands

> def my_cool_echo [arg] { echo $arg }
> which my_cool_echo
───┬──────────────┬────────────────────────┬─────────
 # │     arg      │          path          │ builtin
───┼──────────────┼────────────────────────┼─────────
 0 │ my_cool_echo │ Nushell custom command │ No
───┴──────────────┴────────────────────────┴─────────

wrap

Wraps data in a table.

Syntax: wrap <column>

# Parameters

  • column: the (optional) name of the column the data should be stored in.

# Examples

wrap will give a name to a column of <value> data:

> ls | get name
───┬──────────────
 # │
───┼──────────────
 0 │ americas.csv
 1 │ iso.csv
───┴──────────────
> ls | get name | wrap filename
───┬──────────────
 # │ filename
───┼──────────────
 0 │ americas.csv
 1 │ iso.csv
───┴──────────────

wrap will encapsulate rows as embedded tables:

> ls | select name type size
───┬──────────────┬──────┬─────────
 # │ name         │ type │ size
───┼──────────────┼──────┼─────────
 0 │ americas.csv │ File │   317 B
 1 │ iso.csv      │ File │ 20.8 KB
───┴──────────────┴──────┴─────────

> ls | select name type size | each {wrap details}
───┬────────────────
 # │ details
───┼────────────────
 0 │ [table 1 rows]
 1 │ [table 1 rows]
───┴────────────────

wrap will encapsulate a whole table as an embedded table:

> ls | wrap files
───────┬────────────────
 files │ [table 2 rows]
───────┴────────────────
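
Because the column parameter is optional, wrap may also be called without a name, in which case a generic default column name is used (a sketch; the exact default name may vary by version):

> ls | get name | wrap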

split column

Splits contents across multiple columns via the separator.

Syntax: split column <separator> ...args {flags}

# Parameters

  • <separator>: string that denotes what separates columns
  • args: column names to give the new columns. If not specified they will be set to Column1 Column2 ...

# Flags

--collapse-empty
  Removes empty columns

# Examples

If we have a file structured like this:

0.12643678160919541 | 0.6851851851851852 | 0.273972602739726
0.28735632183908044 | 0.09259259259259259 | 0.6986301369863014
0.8045977011494253 | 0.8148148148148148 | 0.7397260273972602
0.28735632183908044 | 0.09259259259259259 | 0.547945205479452
0.6896551724137931 | 0.7037037037037037 | 1.2465753424657535
0.6896551724137931 | 0.8333333333333334 | 0.4657534246575342
0.9080459770114943 | 1.3333333333333333 | 0.4931506849315068
0.9310344827586207 | 1.1296296296296295 | 0.7123287671232876
0.3448275862068966 | 0.018518518518518517 | 0.6575342465753424
1.0459770114942528 | 1.0925925925925926 | 0.6164383561643836

We can build a table from it using the split column command

> open coordinates.txt | lines | split column " | "
───┬─────────────────────┬──────────────────────┬────────────────────
 # │ Column1             │ Column2              │ Column3
───┼─────────────────────┼──────────────────────┼────────────────────
 0 │ 0.12643678160919541 │ 0.6851851851851852   │ 0.273972602739726
 1 │ 0.28735632183908044 │ 0.09259259259259259  │ 0.6986301369863014
 2 │ 0.8045977011494253  │ 0.8148148148148148   │ 0.7397260273972602
 3 │ 0.28735632183908044 │ 0.09259259259259259  │ 0.547945205479452
 4 │ 0.6896551724137931  │ 0.7037037037037037   │ 1.2465753424657535
 5 │ 0.6896551724137931  │ 0.8333333333333334   │ 0.4657534246575342
 6 │ 0.9080459770114943  │ 1.3333333333333333   │ 0.4931506849315068
 7 │ 0.9310344827586207  │ 1.1296296296296295   │ 0.7123287671232876
 8 │ 0.3448275862068966  │ 0.018518518518518517 │ 0.6575342465753424
 9 │ 1.0459770114942528  │ 1.0925925925925926   │ 0.6164383561643836
───┴─────────────────────┴──────────────────────┴────────────────────

And give names to the columns

> open coordinates.txt | lines | split column " | " x y z
───┬─────────────────────┬──────────────────────┬────────────────────
 # │ x                   │ y                    │ z
───┼─────────────────────┼──────────────────────┼────────────────────
 0 │ 0.12643678160919541 │ 0.6851851851851852   │ 0.273972602739726
 1 │ 0.28735632183908044 │ 0.09259259259259259  │ 0.6986301369863014
 2 │ 0.8045977011494253  │ 0.8148148148148148   │ 0.7397260273972602
 3 │ 0.28735632183908044 │ 0.09259259259259259  │ 0.547945205479452
 4 │ 0.6896551724137931  │ 0.7037037037037037   │ 1.2465753424657535
 5 │ 0.6896551724137931  │ 0.8333333333333334   │ 0.4657534246575342
 6 │ 0.9080459770114943  │ 1.3333333333333333   │ 0.4931506849315068
 7 │ 0.9310344827586207  │ 1.1296296296296295   │ 0.7123287671232876
 8 │ 0.3448275862068966  │ 0.018518518518518517 │ 0.6575342465753424
 9 │ 1.0459770114942528  │ 1.0925925925925926   │ 0.6164383561643836
───┴─────────────────────┴──────────────────────┴────────────────────
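
The --collapse-empty flag listed above drops columns that come out empty after the split, for example when the separator also appears at the end of a line. A minimal sketch with hypothetical inline data (not taken from coordinates.txt):

> echo "a|b|" | split column "|" --collapse-empty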

sleep

Delays for a specified amount of time.

Syntax: sleep <time> [additional_time]...

# Flags

-h, --help Display help message.

# Examples

Sleep for 3 seconds

> sleep 3sec

Sleep for 1 minute and 2 seconds

> sleep 1sec 1min 1sec