- Write, save, and run shell scripts with correct shebangs and permissions
- Use variables, parameters, and command substitution confidently
- Control flow with if, for, while, and case constructs
- Define reusable functions and handle errors properly
- Distinguish between POSIX sh and Bash-specific features
A shell script is a plain text file containing shell commands, saved so that you can run the sequence as a unit. Shell scripting occupies a peculiar niche in the programming world: it is usually considered embarrassing by people who think of themselves as "real" programmers, and yet every real programmer ends up writing scripts constantly. The reason is that nothing else is quite as quick for gluing commands together, automating tedious workflows, and bridging the gap between "I could do this by hand" and "I need to write actual software". A well-chosen hundred lines of Bash can replace a thousand lines of Python in the right circumstances.
Your First Script
Create a file called hello.sh:
#!/bin/bash
echo "Hello, world!"
Make it executable and run it:
chmod +x hello.sh
./hello.sh
# Hello, world!
Three things happened there, and each is worth examining.
The Shebang
The first line, #!/bin/bash, is called a shebang or hashbang. When the kernel is asked to execute a file, it looks at the first two bytes. If they are #!, the kernel reads the rest of the line as the path to an interpreter and runs that interpreter with your script file as an argument. So ./hello.sh effectively becomes /bin/bash ./hello.sh.
This is how the kernel supports scripts written in any language: #!/usr/bin/python3, #!/usr/bin/env node, #!/usr/bin/awk -f, and so on. The env trick — #!/usr/bin/env bash — is a portability idiom: it asks the environment's PATH to locate bash, rather than hard-coding /bin/bash. On some systems, bash lives at /usr/local/bin/bash, and env finds it either way.
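To see the idiom in action, here is a quick sketch that writes a tiny script using the env shebang, marks it executable, and runs it (the filename /tmp/hello-env.sh is just an example):

```shell
# Create a script that uses the env idiom to locate bash via PATH.
cat > /tmp/hello-env.sh <<'EOF'
#!/usr/bin/env bash
echo "Running under bash $BASH_VERSION"
EOF

chmod +x /tmp/hello-env.sh   # set the execute bit
/tmp/hello-env.sh            # env finds bash wherever it lives on PATH
```

Because env resolves bash through PATH, this same script runs unmodified on a Linux box with /bin/bash and a BSD box with /usr/local/bin/bash.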
The Executable Bit
Without chmod +x, the kernel would refuse to execute the file, even though its contents are perfectly valid. This is Unix's distinction between data and program: a file becomes a program by having its execute bit set, and until then it is just bytes.
The Leading ./
Why ./hello.sh rather than hello.sh? Because the current directory . is not in PATH by default, for good security reasons. If it were, an attacker could drop a malicious program called ls in a shared directory, and the next person to cd there and type ls would run it. The ./ is an explicit acknowledgement that you mean the file in the current directory.
Variables
Shell variables are assigned with = and referenced with $. Crucially, there must be no spaces around the equals sign:
name="Alice" # correct
name = "Alice" # error: "name: command not found"
echo "Hello, $name"
# Hello, Alice
Quoting matters. Double quotes allow variable expansion; single quotes take everything literally:
echo "Hello, $name" # Hello, Alice
echo 'Hello, $name' # Hello, $name
When a variable might be empty or contain spaces, always quote it in double quotes. Unquoted variables are word-split and glob-expanded by the shell, which is almost always a bug waiting to happen:
file="my file.txt"
rm $file # tries to rm "my" and "file.txt" separately!
rm "$file" # removes "my file.txt" as a single name
This is rule one of robust shell scripting: quote your variables.
Command Substitution
You can capture the output of a command into a variable:
today=$(date +%Y-%m-%d)
echo "Today is $today"
# Today is 2026-04-09
The $( ... ) syntax runs the command inside and substitutes its stdout. There is an older backtick form — today=`date +%Y-%m-%d` — which still works but is harder to nest and harder to read. Prefer the dollar-parenthesis form in new scripts.
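Nesting is where the dollar-parenthesis form really pays off. A small sketch — the inner substitution runs first, and no escaping is needed, whereas nested backticks would require `\``:

```shell
# Inner $(dirname ...) runs first, yielding /usr/local;
# the outer $(basename ...) then extracts the last component.
parent=$(basename "$(dirname /usr/local/bin)")
echo "$parent"
# local
```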
Arithmetic
For integer maths, use $(( ... )):
count=5
next=$((count + 1))
echo "$next"
# 6
Shell scripting has no native floating-point arithmetic — you must shell out to bc or awk for that.
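A short sketch of both: integer division inside $(( ... )) silently truncates, and awk (chosen here because it is more universally available than bc) handles the floating-point case:

```shell
# Integer division truncates toward zero.
third=$((10 / 3))
echo "$third"
# 3

# For floating point, shell out to awk (bc works too, if installed).
precise=$(awk 'BEGIN { printf "%.2f", 10 / 3 }')
echo "$precise"
# 3.33
```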
Positional Parameters
A script can receive arguments, which are available as $1, $2, $3, and so on. The name of the script itself is $0.
#!/bin/bash
echo "Script: $0"
echo "First arg: $1"
echo "Second arg: $2"
echo "All args: $@"
echo "Number of args: $#"
Running ./script.sh apple banana produces:
Script: ./script.sh
First arg: apple
Second arg: banana
All args: apple banana
Number of args: 2
The special variables $@ and $* are both the list of arguments, but they behave differently when quoted. "$@" expands to each argument as a separate quoted word — which is what you almost always want — whereas "$*" expands to all arguments joined into one string.
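The difference is easiest to see by counting. In this sketch, a helper function reports how many arguments it received — three separate words survive "$@", but "$*" collapses them into one:

```shell
count_args() { echo "$#"; }

show() {
    count_args "$@"   # each argument remains a separate word
    count_args "$*"   # all arguments joined into a single word
}

show one "two words" three
# 3
# 1
```

Note that "two words" stays one argument throughout — the quoting in "$@" preserves the caller's original word boundaries.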
The exit status of the most recent command is $?:
grep foo file.txt
if [ $? -eq 0 ]; then
echo "Found it"
fi
(In practice you would write if grep -q foo file.txt; then, but the $? form shows the general mechanism.)
Conditionals
Bash has two kinds of test brackets. The traditional POSIX [ ... ] is actually a command called test:
if [ "$name" = "Alice" ]; then
echo "Hi Alice"
elif [ "$name" = "Bob" ]; then
echo "Hi Bob"
else
echo "Who are you?"
fi
Bash also provides an extended form, [[ ... ]], which understands pattern matching, regular expressions, and does not require quoting variables:
if [[ $name == A* ]]; then
echo "Starts with A"
fi
if [[ $name =~ ^[A-Z][a-z]+$ ]]; then
echo "Capitalised word"
fi
Prefer [[ ... ]] in Bash-specific scripts for its safer semantics, but use [ ... ] if you need POSIX portability.
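One more [[ ... ]] convenience worth knowing: when =~ matches, Bash stores any parenthesised capture groups in the BASH_REMATCH array. A quick sketch with a made-up version string:

```shell
version="release-2.14.3"

# Capture groups land in BASH_REMATCH: index 0 is the whole match,
# 1 and 2 are the parenthesised groups.
if [[ $version =~ ^release-([0-9]+)\.([0-9]+) ]]; then
    major=${BASH_REMATCH[1]}
    minor=${BASH_REMATCH[2]}
    echo "Major: $major, Minor: $minor"
fi
# Major: 2, Minor: 14
```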
Common test operators:
- -eq, -ne, -lt, -le, -gt, -ge — integer comparisons
- = and != — string equality and inequality
- -z "$str" and -n "$str" — string is empty / non-empty
- -e, -f, -d — file exists / is a regular file / is a directory
- -r, -w, -x — file is readable / writable / executable
Loops
A for loop iterates over a list of words:
for fruit in apple banana cherry; do
echo "I like $fruit"
done
To iterate over files:
for file in *.log; do
gzip "$file"
done
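One subtlety with file globs: if no file matches *.log, the unexpanded pattern itself is passed through as a literal word. A defensive sketch, using a scratch directory so the example is self-contained:

```shell
# Work in a temporary directory with two sample files.
dir=$(mktemp -d)
touch "$dir/a.log" "$dir/b.log"

cd "$dir"
for file in *.log; do
    [ -e "$file" ] || continue   # skip the literal "*.log" if nothing matched
    echo "Compressing $file"
done
# Compressing a.log
# Compressing b.log
```

In Bash-specific scripts, shopt -s nullglob achieves the same effect by making an unmatched glob expand to nothing.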
A while loop runs as long as a condition is true:
count=1
while [ "$count" -le 5 ]; do
echo "Iteration $count"
count=$((count + 1))
done
To read a file line by line:
while IFS= read -r line; do
echo "Line: $line"
done < input.txt
The IFS= and -r are the safe incantation for reading lines without mangling whitespace or backslash characters.
A case statement is useful for multi-way branching:
case "$1" in
start) echo "Starting..." ;;
stop) echo "Stopping..." ;;
status) echo "Status..." ;;
*) echo "Usage: $0 {start|stop|status}"
exit 1 ;;
esac
Functions
Shell functions let you give a block of code a name and call it:
greet() {
local name="$1"
echo "Hello, $name!"
}
greet "Alice"
greet "Bob"
Inside the function, $1, $2, etc., refer to the function's own arguments, not the script's. The local keyword limits a variable's scope to the function, which is almost always what you want.
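A sketch of what local actually buys you — two hypothetical functions touch a variable named counter, but only the one without local affects the caller:

```shell
counter=0

bump_global() {
    counter=$((counter + 1))        # no local: modifies the caller's variable
}

bump_local() {
    local counter=100               # local: a private copy, invisible outside
    counter=$((counter + 1))
}

bump_global
bump_local
echo "$counter"
# 1  -- only bump_global changed the outer variable
```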
Return values are funny in shell scripts. A function's "return value" is its exit status (0-255), not an arbitrary value. To return data, write it to stdout and capture with command substitution:
current_date() {
date +%Y-%m-%d
}
today=$(current_date)
echo "Today is $today"
Error Handling
By default, Bash scripts are dangerously forgiving: they keep running after errors, silently ignore failures, and substitute empty strings for unset variables. The canonical set of safety flags to put at the top of any serious script is:
set -euo pipefail
Each letter means:
- -e — exit immediately if any command fails.
- -u — treat unset variables as an error.
- -o pipefail — a pipeline fails if any command in it fails, not just the last one.
With these set, your script stops loudly on the first problem instead of stumbling onward in a broken state. It is not a silver bullet — edge cases exist, especially around conditional commands — but it is a massive improvement over the default.
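The pipefail flag in particular deserves a demonstration, because the default behaviour is so surprising. In this sketch, false fails but true (the last command in the pipeline) succeeds:

```shell
# Without pipefail: only the LAST command's status counts,
# so the pipeline "succeeds" even though `false` failed.
if false | true; then default_result=success; else default_result=failure; fi
echo "default:  $default_result"
# default:  success

# With pipefail: any failing stage fails the whole pipeline.
set -o pipefail
if false | true; then pipefail_result=success; else pipefail_result=failure; fi
echo "pipefail: $pipefail_result"
# pipefail: failure
```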
You can also trap signals to clean up on exit:
cleanup() {
rm -f /tmp/myscript-$$
}
trap cleanup EXIT
The trap ... EXIT registers a function to run when the script exits for any reason.
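A common shape for this pattern pairs the trap with mktemp, so the temporary file is guaranteed to disappear however the script ends — normal exit, exit 1, or a set -e abort:

```shell
# Create a temp file and register its removal for any exit path.
tmpfile=$(mktemp)

cleanup() {
    rm -f "$tmpfile"
}
trap cleanup EXIT

echo "working data" > "$tmpfile"
echo "Temp file in use: $tmpfile"
# cleanup runs automatically when the shell exits, for any reason
```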
ShellCheck: The Linter Every Script Deserves
If you take nothing else away from this chapter, take ShellCheck (https://shellcheck.net). It is a static analyser for shell scripts, and it is the single most valuable tool for writing robust shell. It catches the bugs that silently eat weekends: unquoted variables that will explode the moment a filename contains a space, [ $x = "foo" ] tests that blow up on an empty $x, broken redirections, accidental subshells in pipelines, wrongly quoted $@, uses of Bash features in scripts marked #!/bin/sh, and dozens more. It is no exaggeration to say that every shell bug I have ever found in a colleague's script is one ShellCheck would have caught in under a second.
Install it with your package manager:
sudo apt install shellcheck # Debian/Ubuntu
sudo dnf install ShellCheck # Fedora
brew install shellcheck # macOS
Then run it on any script:
shellcheck backup.sh
# In backup.sh line 7:
# rm $file
# ^--^ SC2086: Double quote to prevent globbing and word splitting.
Every warning links to a wiki page that explains the rule, shows examples, and suggests a fix. Make a habit of linting every script before you commit it — and better still, wire ShellCheck into your editor so the warnings appear as you type. Plugins exist for Vim, Emacs, VS Code, and every other serious editor. There is no excuse for shipping shell scripts that do not pass ShellCheck clean.
POSIX sh versus Bash
Not every feature discussed here is part of POSIX sh. Bash is a superset: it supports everything POSIX does, plus arrays, the [[ ... ]] extended test, process substitution, brace expansion with step ({1..10..2}), and many other goodies. If you are writing a script that must run on any Unix system — including Alpine with Busybox, Dash on Debian, or ancient Solaris — stick to POSIX features and start your script with #!/bin/sh. If you are writing for Linux with Bash installed, use #!/bin/bash and enjoy the conveniences.
A good rule of thumb: when your script exceeds a hundred lines or starts needing arrays, consider whether a real programming language (Python, Go) would serve you better. Shell scripts shine at short automation; they become painful when the logic gets complex.
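Arrays are a good example of the Bash-only conveniences. A brief sketch — none of this works under a strict POSIX sh:

```shell
# Bash arrays: POSIX sh has no equivalent.
fruits=(apple "blood orange" cherry)

echo "Count:  ${#fruits[@]}"     # number of elements
echo "Second: ${fruits[1]}"      # zero-indexed

# The quoted "${fruits[@]}" expansion keeps each element intact,
# even "blood orange" with its embedded space.
for fruit in "${fruits[@]}"; do
    echo "- $fruit"
done
```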
A Complete Example
Here is a small but realistic script that backs up a directory to a timestamped tarball, with proper error handling:
#!/bin/bash
set -euo pipefail
# Usage: ./backup.sh <source-dir> [destination-dir]
src="${1:?source directory required}"
dest="${2:-$HOME/backups}"
mkdir -p "$dest"
timestamp=$(date +%Y%m%d-%H%M%S)
name=$(basename "$src")
archive="$dest/${name}-${timestamp}.tar.gz"
echo "Backing up $src to $archive..."
tar -czf "$archive" -C "$(dirname "$src")" "$name"
echo "Done. Size: $(du -h "$archive" | cut -f1)"
Read it carefully and note the idioms: the shebang and set -euo pipefail, the parameter expansion ${1:?...} which fails with an error message if the argument is missing, the ${2:-default} which provides a default, the quoting of every variable, and the use of basename and dirname to manipulate paths.
Shell scripting is a craft, not an art. Write scripts like this one, read ones written by others, and within weeks you will find yourself automating in minutes tasks that previously took half an hour. That is the time-saving magic of the shell, distilled into files you can save, share, and run again.