Unix shell


Terminals & Shells

The de facto tool for managing a server is a terminal emulator (distinct from real terminals, which are a thing of the past).
The terminal emulator is an application through which a user sends commands to, and reads output from, a shell.
Shells are a suite, usually a language plus a command prompt, that transforms user input into OS commands.
The shell allows you to run other applications without the use of a graphical-user-interface (GUI).
The default terminal on macOS is called "Terminal".
The default shell for most Unix/Unix-like OSs is the Bash shell.
If I were on a Mac and opened the application "Terminal", I would see a blinking cursor where I can type text.
If I didn't change the shell (you would know if you did), the commands I enter must be interpretable by the Bash shell.

The shell does very little on its own without the use of applications.
Unless you are in an unusual situation, there are often several applications available.
The names of these commands are typed into the terminal, and they are run if the shell can find them.

Where am I located?

The command "pwd" is usually available.
If you run the "print working directory" application and press return, it should print your current location.
Usually this will be your home directory.

What files are near me?

I can list files in this directory by entering the "list" application:
ls
An application like "ls" is more versatile than this.
You can change its behavior by including flags:
ls -l
This will list files in "long" form. Showing additional details.
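As a sketch of what the long form adds (the file name here is made up, and the directory is a throwaway one):

```shell
cd "$(mktemp -d)"    # hypothetical empty scratch directory
touch notes.txt
ls -l
# each row shows permissions, link count, owner, group, size, modification date, and name,
# e.g.: -rw-r--r-- 1 dambam staff 0 Jan 1 12:00 notes.txt
```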
I can also use ls to list the contents of other directories by including a directory as an argument.
ls /Applications
This command lists the contents of the Applications directory on a macOS computer.
Here the "/" indicates the root, or base, directory.
An "argument" is simply some input that the application is looking for.
With ls, flags and arguments are optional (that is, if I want to list the contents of the current directory).

If I am located in my home directory and there is a Documents directory within that directory I can list the contents of that directory by specifying the full path:
ls /Users/dambam/Documents
Using the current user's home shortcut available in Bash:
ls ~/Documents
Using the current user's home environment variable:
ls $HOME/Documents
We'll talk about environment variables later.
Or by querying the directory relative to my current position:
ls Documents
This last command would only work if I was in my home directory, whereas the others should work no matter where I am.
So if I were located in the root directory, I could list the contents of Documents by using the following:
ls Users/dambam/Documents
Notice the lack of "/" at the beginning.

Getting help

The ls application is very versatile.
To find out some common ways you can use ls, use the "help" flag:
ls --help
This help flag is usually available on most Unix applications in this longer form, or in the short form "-h", or both.
(Beware: for some applications the short flag means something else entirely; for ls, "-h" means human-readable file sizes, not help.)
I can get more detailed information by querying the manual on the application.
man ls
This man application is actually being run here; it searches for, and then prints, the manual for ls when it finds it.
Sometimes manuals for an application have to be installed manually.

Changing directories

I can change my location by using the "change directory" command:
cd Documents
Now I can list the contents of Documents with just:
ls
So my current location right now is /Users/dambam/Documents.
If I want to get back to my home directory I could do one of the following
cd /Users/dambam
cd $HOME
cd ~
cd ..
The first three here should make sense based upon what we've already discussed.
(A bare "cd" also works: if I don't include an argument, cd will just move me to my home directory.)
The ".." in the last command is a shortcut which means parent directory.
By doing
cd ..
I'm simply moving back one directory.
If I wanted to move back two directories (i.e. to /Users),
I could use
cd ..; cd ..
cd ../..
In the first example, I'm separating two commands with a semicolon.
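A quick sketch to convince yourself both forms land in the same place (the directory names are made up):

```shell
mkdir -p /tmp/demo_cd/a/b   # hypothetical nested directories
cd /tmp/demo_cd/a/b
cd ../..                    # up two levels in one step
pwd                         # prints /tmp/demo_cd
```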

Similar to this ".." shortcut is the "." shortcut, which means current directory.
I can use it with ls:
ls .
which works the same as:
ls
If I do:
cd .
I should remain in the current directory.
These can of course be combined
cd ./..
which can be shortened to
cd ..


The Bash language, like other programming languages, lets you assign values to variables.
Try the following commands:
a='some text'
echo $a
echo is similar to a print command here; I can use it to query the value of the variable.
The "$" in front of the variable in the second line is required when calling the variable a after it has been defined.
You should not include the dollar sign when defining a variable.

Variables can also be used to store text that can be run as an application.
If I omit the echo and the variable holds the text "ls", Bash will interpret it as the ls command.
I can store the output of commands by using $(), which calls on a "subshell".
Anything listed within these brackets gets run before the rest of the statement.
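A minimal sketch of both ideas (the variable names a and b are arbitrary):

```shell
a='ls'
$a            # Bash expands $a to ls and runs it
b=$(ls)       # the subshell runs ls first; its output is stored in b
echo "$b"     # prints the stored listing
```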

We saw the $HOME environment variable previously.
Environment variables are global variables that are persistent between shell sessions; when I close my terminal or restart my computer, they will usually still be there, without me needing to define them.
Another important environment variable is $PATH
This defines the directories, separated by ":", where Bash will look for applications.
If my $PATH variable is empty, I won't be able to run any applications, unless the application is in my current directory.
This includes ls.
If this were the situation I could still run ls in the following way, assuming ls is installed in the /bin directory:
cd /bin; ./ls
The "./" here is required if I'm attempting to run an application from the current directory.
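You can inspect this yourself; the exact directories will differ from system to system:

```shell
echo $PATH       # colon-separated list of directories Bash searches for applications
command -v ls    # shows where ls was found on the $PATH, e.g. /bin/ls
```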


Bash uses the # symbol for comments; anything written after the # will not be read by the shell:
ls #this line will run the same as the next line


Although it may not seem like it at first, operating your OS through a shell can be orders of magnitude faster than doing things through a GUI.
One of the more obvious reasons why is that you have the use of what is called globbing or wildcards.
In bash, the most common wildcard is the "*"
If I wanted to list only files in my directory that have the '.txt' extension:
ls *.txt
In essence this wildcard means "anything, as long as it has .txt attached to it".
To list all files that begin with S and end with .py extension
ls S*.py
I can use more than one if I want to:
ls *File*.py
You can see how this can be used to find and list things in a fuzzy way, such as with case insensitivity.
The above example to find SomeFile.py may be better written as:
ls ?omeFile.py
where "?" is a wildcard that matches an individual character rather than a string.
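A sketch you can try in a throwaway directory (all file names are made up):

```shell
cd "$(mktemp -d)"
touch SomeFile.py someFile.py script.py notes.txt
ls *.txt          # notes.txt
ls S*.py          # SomeFile.py
ls ?omeFile.py    # SomeFile.py and someFile.py
```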

Overall, globbing is a simple implementation of text matching that can extend to more applications.
We'll talk more about this when using "find" and "grep" later, to find files and search within.

More advanced pattern matching is possible with Regular Expressions (regexp).
This greatly extends the idea of using symbols to match text:
.*.mat # match anything that ends with .mat
[Ab].+.mat # match anything beginning with "A" or "b", followed by one or more characters, ending with .mat
/[ab].+.mat/i # same as above but case insensitive
/[ab].+[z]{2}.mat/i # same as above but only match if the name has two z's before .mat
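Globbing won't understand these; they need a tool that speaks regular expressions, such as grep with the -E (extended) and -i (case-insensitive) flags. A sketch over a made-up list of names:

```shell
cd "$(mktemp -d)"
printf 'Abc.mat\nAbczz.mat\nxyz.txt\n' > names.txt
grep -E '^[Ab].+\.mat$' names.txt     # Abc.mat and Abczz.mat
grep -E '^[Ab].+zz\.mat$' names.txt   # only Abczz.mat
grep -iE '^[ab].+\.mat$' names.txt    # -i makes the character class case insensitive
```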


rsync -a deb:/MNT/STORAGE/BOOKS > books.txt # list a remote directory and save the listing to a file
rsync -a books.txt deb:~ # copy the listing back to the remote home directory
grep -R "LRSIcameraIPD(1)" ~/Code/mat/projects | grep -i mapgenerator # chain greps to narrow a search


vim filename.sh # create/edit a script
/usr/bin/matlab -nodesktop -r "matlabAnalysisScript; exit;" # an example command to put in the script

chmod u+x filename.sh # grants execute permission to the file's owner
./filename.sh # . is shortcut for current directory

Job control

Most applications will not allow you to run another command in the current shell until the last command completes.
This is because shell commands usually run in the "foreground" by default.
If you want something to run while you do something else, you need to run that first command in the background.
If you know you want to run something in the background, you can simply append a "&" at the end of a command.
For example, if I wanted to run a matlab command or script, I could do something like this:

matlab -nodesktop -r "ls; exit;" &

Here -nodesktop simply indicates that I'm running matlab in the terminal, -r is the "run" flag, and the commands in quotation marks are what is being run.
When the & is included at the end, you would not see matlab start up as you normally would; instead you would see some output like
[1] 2074636
followed by a new Bash prompt.
The [1] in this output is the background job number and the 2074636 is the process id number (pid).
If I want to bring matlab to the foreground, I can move it from background to foreground with
fg %1
Where 1 is the job id
If I do not remember which job number matlab is using, I can run:
jobs
to get an idea.

Now that matlab is in foreground, how do I send it to the background again?
First you pause the program by pressing <Ctrl-z>
Your shell should output something like
suspended bin/os/matlab -nodesktop
If I run:
jobs
I should now see that same line, but with a job number:
[1] + suspended bin/os/matlab -nodesktop
I can then run
bg %1
which will send the application to the background and unsuspend it.
If I instead want to bring it to foreground, I can run
fg %1
This procedure also works if I forgot to include the "&" initially and something is now running in the foreground.

If I do not include the job number when I run bg or fg, it will assume the "current" job; the last job I changed.
The current job is marked by a "+" in the jobs command list, and the previous job is marked by a "-".
These can be used instead of job numbers, e.g.:
fg %+
fg %% # a quicker way to say current job
Another shortcut:
$! expands to the process id of the last background job

If I want to close something running in the background I can close it with
kill %1
Where 1 can be replaced by the appropriate job number

If I want to suspend (as with <Ctrl-Z>) a job running in the background, I can send it a STOP signal:
kill -STOP %1
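The whole cycle can be sketched non-interactively, with sleep standing in for matlab:

```shell
sleep 30 &     # start in the background; Bash prints something like [1] 2074636
jobs           # [1]+ Running    sleep 30 &
kill %1        # terminate job 1 instead of waiting 30 seconds
```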

Standard Streams

Standard streams behave somewhat like sockets: channels connecting a process to its input and output.

One of the biggest attractions of applications for the Unix shell is the standards/conventions they implement.
(For a bigger overview on this, read up on the Unix Philosophy: https://en.wikipedia.org/wiki/Unix_philosophy)
One major convention is how applications are supposed to handle their input/ouput (I/O) streams.
When I run the command
cat file.txt

"file.txt" is the input that the cat application reads.
The contents of the file that cat prints into my terminal are the output stream.
In Unix shells, both inputs and outputs are implemented at a low level in such a way that I can use them very flexibly.
In short, Unix shell inputs and outputs are sent to very specific places:
input is read from the STDIN stream, output is sent to the STDOUT stream, and errors are sent to the STDERR stream.
The standard streams are numbered:
0 STDIN
1 STDOUT
2 STDERR

Streams 3 and above are non-standard. Applications should not send messages through these streams, but they are there for the end user to use.
Text sent through standard streams can be redirected.
I can redirect the text output of my cat command to another file by using the > operator:
cat file.txt 1> copiedfile.txt
For shorthand I can exclude the 1
cat file.txt > copiedfile.txt
If there were any errors when running this command, I can also redirect those to a file:
cat file.txt 2> caterrors.txt
There is no shorthand for 2.

STDOUT has a special operator, the pipe "|".
This redirects the output of one command into the input of another.
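A sketch tying the redirection operators and the pipe together (the file names are made up; the "|| true" only keeps the sketch running past cat's error):

```shell
cd "$(mktemp -d)"
printf 'alpha\nbeta\n' > file.txt
cat file.txt > copiedfile.txt             # STDOUT (1) into a file
cat missing.txt 2> caterrors.txt || true  # STDERR (2) into a file; cat itself fails
cat file.txt | grep alpha                 # STDOUT of cat becomes STDIN of grep
```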


process1 2> >(process2) # send process1's STDERR to process2 via process substitution

exec > file
After this, all output of the current shell session is redirected, as if every command were run as:
[command] > somefile
exec < input
Likewise, all input is redirected, as if every command were run as:
[command] < someinput

exit codes
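Every command finishes with an exit code: 0 means success and any nonzero value means failure. The special variable $? holds the exit code of the last command. A minimal sketch (the "||" keeps the failing ls from stopping a script run with set -e):

```shell
ls / > /dev/null
echo $?                                  # 0: success
ls /no/such/dir 2>/dev/null || echo $?   # nonzero: failure
```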

Dangers of terminal

The terminal is the power-user's tool of choice.
When using the terminal, you are restricted as little as possible in order to increase your productivity.
As such, there are not as many safeguards in place to prevent you from making your files inaccessible, unusable, or permanently deleted.

There are only a few things that you should know starting out.

dangers of rm

On most systems the "rm" command does not have a Recycling Bin system put in place;
Once you rm a file, it is gone forever.
The "rm" command should be viewed as dangerous; rm with globbing, as extremely dangerous;
and rm with the -r and -f (recursive and force) flags, as inevitable suicide.
You can (in theory) delete everything on your computer by simply running:
rm -rf /*
Or a huge part of your system by
rm -rf *
if you are in a different directory than you think you are.
This second command is a legitimate command that could be used to just remove the contents of a directory (which is why this command exists).
But just because it's there, and it's convenient, doesn't mean it should be used.

A good rule is to avoid rm with -r, -f, and * as much as possible.
99% of -r and -f usage can be avoided by simply not using rm to delete directories.
Instead, use rmdir after deleting files with rm.
If you do use rm -r, also include the -i or -I flag, which will prompt you before anything is actually deleted.
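The safer workflow sketched in a throwaway directory:

```shell
cd "$(mktemp -d)"
mkdir scratch
touch scratch/a.txt scratch/b.txt
ls scratch/*.txt     # always check what a glob matches before deleting
rm scratch/*.txt     # delete the files first
rmdir scratch        # rmdir succeeds only because scratch is now empty
```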

You can also use other commands in place of rm (such as del) which do have undelete functionality. These tools are not typically installed by default on most operating systems.

Having a good backup system is highly recommended.
If you do rm something important and you don't have backups, you can often still get it back, but not always, and it is difficult.
When a file gets deleted, the system does not wipe that region of the hard drive, but instead marks it as free to be overwritten.
If this happens to you, turn off your computer immediately and seek help.


If you are running something as the su (super user), or are given access to do so (with sudo),
you are allowed to do virtually everything.
The command from the previous section:
rm -rf /*
will usually delete only your personal files… unless you have su access.

If you make edits to system files, make backups before you edit, especially if you are unsure about the consequences of things breaking.

List of common shell applications

Also see: https://en.wikipedia.org/wiki/List_of_Unix_commands

Most basic

man $application # manual
ls $directory # list contents of directory
cd $directory # change directory
pwd # print current working directory (current location)
echo $variable # print content of a variable/string without running it as an application

Other basic applications

Alongside ls, cd, pwd, and echo are some other useful OS commands that you should be aware of.

mkdir $directory # create a directory
rmdir $directory # remove directory if it's empty
cp $sourcefile $destination # copies one file to another location
mv $sourcefile $destination # moves a file from one place to another. Also used to rename files
cat $file # list contents of a file
rm $file # remove a file. There is no "Recycle Bin" system set up with this command.

Be really careful with the rm command, especially when using the -r and -f flags! You can quite easily delete everything on your computer and not be able to recover it.

Remote access applications

ssh # login to a shell on another computer
rsync # Efficiently move files around locally or remotely

Access control applications

chmod $permissionValue $file # changes who can access a file
chown $user:$group $file # changes which user and group owns a file
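A sketch of chmod in both symbolic and numeric form (the file name is made up; chown usually requires elevated privileges, so it is not demonstrated here):

```shell
cd "$(mktemp -d)"
touch script.sh
chmod u+x script.sh   # symbolic: add execute permission for the owning user
chmod 755 script.sh   # numeric: rwx for owner, r-x for group and others
ls -l script.sh       # the permissions column reads -rwxr-xr-x
```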

job control applications

jobs # list jobs running in the current shell
fg %$jobNumber # bring a job to the foreground
bg %$jobNumber # resume a suspended job in the background
kill %$jobNumber # terminate a job

Pager/editor applicatons

less $filename # pagers like less let you view and scroll through files in a terminal without editing them

nano $file # a simple, terminal based file editing program
vi $file
vim $file
emacs $file
Vi, vim, and emacs are editors for power users. Vi, and most often vim, will usually be installed on any server by default.
If you get stuck in vi or vim, hit escape a couple of times, then enter:
:q!
and press return.
In emacs, you can quit by typing Control-x, then Control-c.

Finding things

find $baseSearchDirectory -name $pattern # find files that match a pattern
grep "$pattern" $file # find lines in a file that match a pattern

Advanced scripting languages built for dynamic usage in bash shell

sed # typically used to mass-replace text matching a pattern in a file
awk # typically used to select text separated into fields by some delimiter
perl # Text processing powerhouse. Has powers of both sed and awk and much more.

Shell Symbols

%1 job number 1
342 pid (process number) 342

$var local variable var
$VAR environment variable (persistent between shell sessions)
$0 filename/command name of the current script or shell being used
$1 first argument/input to a command
$@ all arguments/inputs to a command
${var} variable "var" (braces delimit the variable name)
${var[0]} array variable "var" at first index
${var[@]} all elements of array variable "var"

!! last command entered
$? exit code of last command
#! "shebang." used at top of script to identify which program should run it eg #!/bin/bash

Stream redirection
> redirect output into file, overwriting everything
< redirect input
>> redirect output into file, appending to what already exists
0< redirect standard input (long-hand for <)
1> redirect standard output (long-hand for >)
2> redirect standard error
2>> same as above, but append, not overwrite

Merge Streams
2>&1 merge STDERR into STDOUT
&> in Bash, same as 2>&1, but not available in all shells
>& same as &>, but &> is preferable as a standard
<< "here-document" structure. Used to redirect multiline text to a command's standard input
<<< "here-string". Redirects a single string to a command's standard input
3>&1 1>&2- 2>&3- swap standard output and standard error
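A sketch of the here-document and here-string forms:

```shell
cat <<EOF
line one
line two
EOF
# here-document: everything up to the EOF marker above becomes cat's STDIN
grep -o bar <<< "foo bar"   # here-string: the single string becomes grep's STDIN; prints bar
```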

Process substitution
<() make a command's output readable as if it were a file
>() make a command's input writable as if it were a file

Command substitution
$() run nested command in a subshell

| take output from the last command and make it the input of the next command

difference between <<< and < <()
< <() streams the command's output in parallel; <<< must wait for the full string first

Graphical File Manager

brew install mc # Midnight Commander, a two-pane terminal file manager
brew install ranger # ranger, a vim-like terminal file manager


find ~/Code/mat/ -iname prog.m # case-insensitive search for prog.m by name
man find
man -a find # show all manual pages named "find"



grep -R "LRSIcameraIPD(1)" ~/Code/mat/projects

Preview of advanced usage

lsNonStdDepsRev.m -> finds functions that depend on the given function
command=['grep -r "' fcnName '(" * | awk ''{print $1}'' | cut -d : -f1 | grep -v "/' fcnName '.m"'];

find ~/Code/mat -iname 'lrsi' -exec sed -i 's/variableName/betterVariableName/g' {} \;

crontab -e # automatically run stuff regularly
at -f matlabScript.sh 0100 # schedule matlab code to run at 1:00am
parallel 'matlabScript.sh {1} {2} {3}' # parallelize matlab code over combinations of arguments
renice -n -5 3813 # prioritize processes