Shell Scripting - Automating the Command Line
Start here: If you are new to shell scripting, begin with the Interactive Shell Scripting Tutorial — hands-on exercises in a real Linux system. This article is a reference to deepen your understanding afterward.
If you have ever found yourself performing the same repetitive tasks on your computer—renaming batches of files, searching through massive text logs, or configuring system environments—then shell scripting is the magic wand you need. Shell scripting is the bedrock of system administration, software development workflows, and server management.
In this detailed educational article, we will explore the concepts, syntax, and power of shell scripting, specifically focusing on the most ubiquitous UNIX shell: Bash.
Basics
What is the Shell?
To understand shell scripting, you first need to understand the “shell”.
An operating system (like Linux, macOS, or Windows) acts as a middleman between the physical hardware of your computer and the software applications you want to run. It abstracts away the complex details of the hardware so developers can write functional software.
The kernel is the core of the operating system that interacts directly with the hardware. The shell, on the other hand, is a command-line interface (CLI) that serves as the primary gateway for users to interact with a computer’s operating system. While many modern users are accustomed to graphical user interfaces (GUIs), the shell is a program that specifically takes text-based user commands and passes them to the operating system to execute. In the context of this course, mastering the shell is like becoming a “wizard” who can construct and manipulate complex software systems simply by typing words.
Motivation: Why the Shell is Essential
As a software engineer, you need to be familiar with the ecosystem of tools that help you build software efficiently. The Linux ecosystem offers a vast array of specialized tools that allow you to write programs faster and debug log files by combining small, powerful commands. Understanding the shell increases your productivity in a professional environment and provides a foundation for learning other domain-specific scripting languages. Furthermore, the shell allows you to program directly on the operating system without the overhead of additional interpreters or heavy libraries.
The Unix Philosophy
The shell’s power is rooted in the Unix philosophy, which dictates:
- Write programs that do one thing and do it well.
- Write programs to work together.
- Write programs to handle text streams, because that is a universal interface.
By treating data as a sequence of characters or bytes—similar to a conveyor belt rather than a truck—the shell allows parallel processing and the composition of complex behaviors from simple parts.
Essential UNIX Commands
Before writing scripts, you need to know the fundamental commands that you will be stringing together. These are the building blocks of any UNIX environment.
1. File Handling
These are the foundational tools for interacting with the POSIX filesystem:
- `ls`: List directory contents (files and other directories).
- `cd`: Change the current working directory (e.g., use `..` to move to a parent folder).
- `pwd`: Print the name of the current/working directory so you don’t get lost.
- `mkdir`: Create a new directory.
- `cp`: Copy files. Use `-r` (recursive) to copy a directory and its contents.
- `mv`: Move or rename files and directories.
- `rm`: Remove (delete) files. Use `-r` to remove a directory and its contents recursively.
- `rmdir`: Remove empty directories (only works on empty ones).
- `touch`: Create an empty file or update timestamps.
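A minimal sketch tying these commands together; the directory and file names here are made up for illustration:

```shell
# Create a sandbox directory, copy a file into it, rename it, then clean up
mkdir -p demo_dir                # -p: no error if it already exists
touch demo_dir/notes.txt         # create an empty file
cp demo_dir/notes.txt demo_dir/notes_backup.txt
mv demo_dir/notes_backup.txt demo_dir/archive.txt   # mv renames in place
ls demo_dir                      # lists archive.txt and notes.txt
rm -r demo_dir                   # remove the directory and its contents
```

Running `pwd` at any point shows where in the filesystem these operations are happening.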
2. Text Processing and Data Manipulation
Unix treats text streams as a universal interface, and these tools allow you to transform that data:
- `cat`: Concatenate and print files to standard output.
- `grep`: Search for patterns using regular expressions.
- `sed`: Stream editor for filtering and transforming text (commonly search-and-replace).
- `tr`: Translate or delete characters (e.g., changing case or removing digits).
- `sort`: Sort lines of text files alphabetically; add `-n` for numeric order, `-r` to reverse.
- `uniq`: Filter adjacent duplicate lines; the `-c` flag prefixes each line with its occurrence count. Because it only compares consecutive lines, you almost always pipe `sort` first so that duplicates are adjacent.
- `wc`: Word count (lines, words, characters).
- `cut`: Extract specific sections/fields from lines.
- `comm`: Compare two sorted files line by line.
- `head`/`tail`: Output the first or last part of files.
- `awk`: Advanced pattern scanning and processing language.
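A small illustration of combining these tools; the input data is made up:

```shell
# Build a sample file with duplicate lines
printf 'apple\nbanana\napple\ncherry\napple\n' > fruits.txt

# Count occurrences of each line.
# sort first, so uniq sees duplicates adjacent; sort -rn puts the
# highest count on top.
sort fruits.txt | uniq -c | sort -rn
# first line of output: "3 apple"

rm fruits.txt
```

This three-command pipeline is the classic shell idiom for "frequency count," and variations of it appear constantly in log analysis.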
3. Permissions, Environment, and Documentation
These tools manage how your shell operates and how you access information:
- `man`: Access the manual pages for other commands. This is arguably the most useful command, providing built-in documentation for every other command in the system.
- `chmod`: Change file mode bits (permissions). Files in a Unix-like system have three primary types of permissions: read (`r`), write (`w`), and execute (`x`). For security reasons, the system requires an explicit execute permission because you do not want to accidentally run a file from an unknown source. Permissions are often read in “bits” for the owner (`u`), group (`g`), and others (`o`).
- `which`/`type`: Locate the binary or type for a command.
- `export`: Set environment variables. The `PATH` variable is especially important; it tells the shell which directories to search for executable programs. You can temporarily update it using `export` or make it permanent by adding the command to your `~/.bashrc` or `~/.profile` file.
- `source`/`.`: Execute commands from a file in the current shell environment.
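A quick sketch of the `PATH` and `chmod` ideas in practice; `~/bin` is just an example directory and `tool.sh` a throwaway file:

```shell
# Inspect the current search path (a colon-separated list of directories)
echo "$PATH"

# Temporarily prepend a directory for this shell session only
export PATH="$HOME/bin:$PATH"

# Grant the owner execute permission on a file, then verify
touch tool.sh
chmod u+x tool.sh
ls -l tool.sh      # the owner's permission bits now include x
rm tool.sh
```

To make the `PATH` change permanent, the same `export` line would go in `~/.bashrc` or `~/.profile`, as noted above.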
4. System, Networking, and Build Tools
Tools used for remote work, debugging, and automating the construction process:
- `ssh`: Secure shell to connect to remote machines like SEASnet.
- `scp`: Securely copy files between hosts.
- `wget`/`curl`: Download files or data from the internet.
- `make`: Build automation tool that uses shell-like syntax to manage the incremental build process of complex software, ensuring that only changed files are recompiled.
- `gcc`/`clang`: C/C++ compilers.
- `tar`: Manipulate tape archives (compressing/decompressing).
The Power of I/O Redirection and Piping
The true power of the shell comes from connecting commands. Every shell program typically has three standard stream ports:
- Standard Input (`stdin`, fd `0`): Usually the keyboard.
- Standard Output (`stdout`, fd `1`): Usually the terminal screen.
- Standard Error (`stderr`, fd `2`): Where error messages go, also usually the terminal.
Redirection
You can redirect these streams using special operators:
- `>`: Redirects `stdout` to a file, overwriting it. (e.g., `echo "Hello" > file.txt`)
- `>>`: Redirects `stdout` to a file, appending to it without overwriting.
- `<`: Redirects `stdin` from a file. (e.g., `cat < input.txt`)
- `2>`: Redirects `stderr` to a file, so you can log errors separately.
- `2>&1`: Redirects `stderr` to the standard output stream. Note: order matters — `command > file.txt 2>&1` sends both streams to the file, whereas `command 2>&1 > file.txt` only redirects stdout to the file while stderr still goes to the terminal.
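A sketch demonstrating the separate streams; the function and file names are arbitrary:

```shell
# A helper that writes one line to each stream
print_both() {
    echo "normal output"        # goes to stdout
    echo "error output" >&2     # goes to stderr
}

print_both > both.log 2>&1      # both streams captured in both.log
print_both 2> errors.log        # stdout still reaches the terminal;
                                # only stderr goes to errors.log
cat errors.log                  # shows: error output
rm both.log errors.log
```

Try swapping the order to `2>&1 > both.log` and you will see the error line escape to the terminal, exactly as described above.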
Piping
The pipe operator | is the most powerful composition tool. It takes the stdout of the command on the left and sends it directly into the stdin for the command on the right.
Example: cat access.log | grep "ERROR" | wc -l
This pipeline reads a log file, filters only the lines containing “ERROR”, and then counts how many lines there are.
Here Documents and Here Strings
Sometimes you need to feed a block of text directly into a command without creating a temporary file. A here document (<<) lets you embed multi-line input inline, up to a chosen delimiter:
cat <<EOF
Server: production
Version: 1.4.2
Status: running
EOF
The shell expands variables inside the block (just like double quotes). To suppress expansion, quote the delimiter: <<'EOF'.
A here string (<<<) feeds a single expanded string to a command’s standard input — a concise alternative to echo "text" | command:
grep "ERROR" <<< "08:15:45 ERROR failed to connect"
Process Substitution
Advanced shell users often utilize process substitution to treat the output of a command as a file. The syntax looks like <(command). For example, H < <(G) >> I allows you to refer to the standard output of command G as a file, redirect it into the standard input of H, and append the output to I.
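A more everyday use is feeding the live output of two commands to a tool that expects two file arguments, such as `diff`; the file names and contents below are illustrative:

```shell
printf 'b\na\nc\n'    > one.txt
printf 'c\na\nb\nd\n' > two.txt

# Compare the *sorted* contents without creating temporary sorted files
diff <(sort one.txt) <(sort two.txt) || true
# diff reports that two.txt has an extra line "d";
# the "|| true" just absorbs diff's non-zero "files differ" exit status

rm one.txt two.txt
```

Each `<(sort ...)` expands to a file-like path (such as `/dev/fd/63`) that `diff` reads as if it were an ordinary file.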
Writing Your First Shell Script
When you find yourself typing the same commands repeatedly, you should create a shell script. A shell script is written in a plain text file (often ending in .sh) and contains a sequence of commands that the shell executes as a program.
Interpreted Nature
Unlike a compiled language like C++, which is compiled into machine code before execution, shell scripts are interpreted at runtime rather than ahead of time. This allows for rapid prototyping. Bash always reads at least one complete line of input, and reads all lines that make up a compound command (such as an if block or for loop) before executing any of them. This means a syntax error on a later line inside a multi-line compound block is caught before the block starts executing — but an error in a branch that is never reached at runtime may go unnoticed. Use bash -n script.sh to check for syntax errors without running the script.
The Shebang
Every script should start with a “shebang” (#!). This tells the operating system which interpreter should be used to run the script. For Bash scripts, the first line should be:
#!/bin/bash
Execution Permissions
By default, text files are not executable for security reasons. Execute permission is required only if you want to run the script directly as a command:
chmod +x myscript.sh
./myscript.sh
Alternatively, you can bypass the execute-permission requirement entirely by passing the file as an argument to the Bash interpreter directly — no chmod needed:
bash myscript.sh
You can also run a script’s commands within the current shell (inheriting and potentially modifying its environment) using source or the . builtin: source myscript.sh.
Debugging Scripts
When a script behaves unexpectedly, Bash has built-in tracing modes that let you see exactly what the shell is doing:
- `bash -n script.sh`: Reads the script and checks for syntax errors without executing any commands. Always run this first when a script refuses to start.
- `bash -x script.sh` (or `set -x` inside the script): Prints a trace of each command and its expanded arguments to `stderr` before executing it — indispensable for logic bugs. Each traced line is prefixed with `+`.
- `bash -v script.sh` (or `set -v`): Prints each line of input exactly as read, before expansion — useful for seeing the raw source being interpreted.
You can combine flags: bash -xv script.sh. To turn tracing on for only a section of a script, use set -x before that section and set +x after it.
Error Handling (set -e and Exit Status)
By default, a Bash script will continue executing even if a command fails. Every command returns a numerical code known as an Exit Status; 0 generally indicates success, while any non-zero value indicates an error or failure. Continuing after a failure can be dangerous and lead to unexpected behavior. To prevent this, you should typically include set -e at the top of your scripts:
#!/bin/bash
set -e
This tells the shell to exit immediately if any simple command fails, making your scripts safer and more predictable.
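A minimal sketch of the difference; `false` stands in for any failing step, and the script name is arbitrary:

```shell
# Write a tiny script to a file, then observe how set -e stops it early
cat > demo_set_e.sh <<'EOF'
#!/bin/bash
set -e
echo "step 1"
false           # fails with exit status 1; set -e aborts the script here
echo "step 2"   # never reached
EOF

bash demo_set_e.sh || echo "script aborted with status $?"
# prints "step 1", then the abort message; "step 2" never appears
rm demo_set_e.sh
```

Without the `set -e` line, the same script would print both steps and exit with status 0, silently masking the failure.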
Syntax and Programming Constructs
Bash is a full-fledged programming language, but because it is an interpreted scripting language rather than a compiled language (like C++ or Java), its syntax and scoping rules are quite different.
5. Scripting Constructs
In our scripts, we also treat these keywords as “commands” for building logic:
- `#!` (Shebang): An OS-level interpreter directive on the first line of a script file — not a Bash keyword or command. When the OS executes the file, it reads `#!` and uses the rest of that line as the interpreter path. Within Bash itself, any line starting with `#` is simply a comment and is ignored.
- `read`: Read a line from standard input into a variable. Common flags: `-p "prompt"` displays a prompt on the same line, `-s` silently hides typed input (useful for passwords), and `-n 1` returns after exactly one character instead of waiting for Enter.
- `if`/`then`/`elif`/`else`/`fi`: Conditional execution.
- `for`/`do`/`done`/`while`: Looping constructs.
- `case`/`in`/`esac`: Multi-way branching on a single value.
- `local`: Declare a variable scoped to the current function.
- `return`: Exit a function with a numeric status code.
- `exit`: Terminate the script with a specific status code.
Variables
You can assign values to variables without declaring a type. Note that there are no spaces around the equals sign in Bash.
NAME="Ada"
echo "Hello, $NAME"
Parameter Expansion — Default Values and String Manipulation
Beyond simple $VAR substitution, Bash supports a powerful set of parameter expansion operators that let you handle missing values and manipulate strings entirely within the shell, without spawning external tools.
Default values:
# Use "server_log.txt" if $1 is unset or empty
file="${1:-server_log.txt}"
# Use "anonymous" if $NAME is unset or empty, AND assign it
NAME="${NAME:=anonymous}"
String trimming — remove a pattern from the start (#) or end (%) of a value:
path="/home/user/project/main.sh"
filename="${path##*/}" # removes longest prefix up to last / → "main.sh"
noext="${filename%.*}" # removes shortest suffix from last . → "main"
The double form (## / %%) removes the longest match; the single form (# / %) removes the shortest.
Search and replace:
msg="Hello World World"
echo "${msg/World/Earth}" # replaces first match → "Hello Earth World"
echo "${msg//World/Earth}" # replaces all matches → "Hello Earth Earth"
Scope Differences
Unlike C++ or Java, Bash lacks strict block-level scoping (like {} blocks). Variables assigned anywhere in a script — including inside if statements and loops — remain accessible throughout the entire script’s global scope. There are, however, several important isolation boundaries:
- Function-level scoping: variables declared with the `local` builtin inside a Bash function are visible only to that function and its callees.
- Subshells: commands grouped with `( list )`, command substitutions `$(...)`, and background jobs run in a subshell — a copy of the shell environment. Any variable assignments made inside a subshell do not propagate back to the parent shell.
- Per-command environment: a variable assignment placed immediately before a simple command (e.g., `VAR=value command`) is only visible to that command for its duration, leaving the surrounding scope untouched.
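All three boundaries can be demonstrated in a few lines; the variable names here are arbitrary:

```shell
X=outer

# Subshell: the assignment is lost when the subshell exits
( X=inner; echo "inside subshell: $X" )   # prints: inside subshell: inner
echo "after subshell: $X"                 # prints: after subshell: outer

# Per-command environment: visible only to that one command
X=temp bash -c 'echo "child sees: $X"'    # prints: child sees: temp
echo "still: $X"                          # prints: still: outer

# Function-local variable: does not leak out of the function
f() { local X=func; }
f
echo "unchanged: $X"                      # prints: unchanged: outer
```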
Arithmetic
Math in Bash is slightly idiosyncratic. While a language like C++ operates directly on integers with + or /, arithmetic in Bash needs to be enclosed within $(( ... )) or evaluated using the let command.
x=5
y=10
sum=$((x + y))
echo "The sum is $sum"
Control Structures: If-Statements and Loops
Bash supports standard control flow constructs.
If-Statements:
if [ "$sum" -gt 10 ]; then
echo "Sum is greater than 10"
elif [ "$sum" -eq 10 ]; then
echo "Sum is exactly 10"
else
echo "Sum is less than 10"
fi
`[` is a shell builtin command: The single bracket `[` is not special syntax — it is a builtin command, a synonym for `test`. Because Bash implements it internally, its arguments must be separated by spaces just like any other command: `[ -f "$file" ]` is correct, but `[-f "$file"]` tries to run a command named `[-f`, which fails. This is why the spaces inside brackets are mandatory, not just stylistic. (An external binary `/usr/bin/[` also exists on most systems, but Bash uses its builtin by default — you can verify with `type -a [`.)
The following table covers the most important tests available inside [ ]:
| Test | Meaning |
|---|---|
| `-f path` | Path exists and is a regular file |
| `-d path` | Path exists and is a directory |
| `-z "$var"` | String is empty (zero length) |
| `"$a" = "$b"` | Strings are equal |
| `"$a" != "$b"` | Strings are not equal |
| `$x -eq $y` | Integers are equal |
| `$x -gt $y` | Integer greater than |
| `$x -lt $y` | Integer less than |
| `! condition` | Logical NOT (negates the test) |
Important: use -eq, -lt, -gt for numbers and = / != for strings. Mixing them produces wrong results silently.
`[` vs `[[`: The double bracket `[[ ... ]]` is a Bash keyword with additional power: it does not perform word splitting on variables, allows `&&` and `||` inside the condition, and supports regex matching with `=~`. Prefer `[[ ]]` in new Bash scripts.
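A short sketch of those three advantages; the file name and input string are made up:

```shell
file="my report.txt"
touch "$file"

# No word splitting inside [[ ]], so the unquoted variable is safe here,
# and && can appear directly in the condition
if [[ -f $file && $file == *.txt ]]; then
    echo "found a text file"
fi

# Regex matching with =~ (POSIX extended regular expressions)
input="user42"
if [[ $input =~ ^[a-z]+[0-9]+$ ]]; then
    echo "letters followed by digits"
fi

rm "$file"
```

The same `-f` test with single brackets would require `[ -f "$file" ]` with quotes, or the space in the filename would split it into two arguments.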
Loops:
for i in 1 2 3 4 5; do
echo "Iteration $i"
done
For numeric ranges, the C-style for loop (the arithmetic for command) is often cleaner:
for (( i=1; i<=5; i++ )); do
echo "Iteration $i"
done
This is a distinct looping construct from the standalone (( )) arithmetic compound command. In this form, expr1 is evaluated once at start, expr2 is tested before each iteration (loop runs while non-zero), and expr3 is evaluated after each iteration — the same semantics as C’s for loop.
Loop control keywords:
- `break`: Exit the loop immediately, regardless of the remaining iterations.
- `continue`: Skip the rest of the current iteration and jump to the next one.
for f in *.log; do
[ -s "$f" ] || continue # skip empty files
grep -q "ERROR" "$f" || continue
echo "Errors found in: $f"
done
Quoting and Word Splitting
How you quote text profoundly changes how Bash interprets it — this is one of the most common sources of bugs in shell scripts.
- Single quotes (`'...'`): All characters are literal. No variable or command substitution occurs. `echo 'Cost: $5'` prints exactly `Cost: $5`.
- Double quotes (`"..."`): Spaces are preserved, but `$VARIABLE` and `$(command)` are still expanded. `echo "Hello $USER"` prints `Hello Ada` (for a user named Ada).
A critical pitfall is word splitting: when you reference an unquoted variable, the shell splits its value on whitespace and treats each word as a separate argument. Consider:
FILE="my report.pdf"
rm $FILE # WRONG: shell splits into two args: "my" and "report.pdf"
rm "$FILE" # CORRECT: the entire value is passed as one argument
Always quote variable references with double quotes to protect against word splitting.
Command Substitution
Command substitution captures the standard output of a command and uses it as a value in-place. The modern syntax is $(command):
TODAY=$(date +%Y-%m-%d)
echo "Backup started on: $TODAY"
The shell runs the inner command in a subshell, then replaces the entire $(...) expression with its output. This is the standard way to assign the results of commands to variables.
Positional Parameters and Special Variables
Scripts receive command-line arguments via positional parameters. If you run ./backup.sh /src /dest, then inside the script:
| Variable | Value | Description |
|---|---|---|
| `$0` | `./backup.sh` | Name of the script itself |
| `$1` | `/src` | First argument |
| `$2` | `/dest` | Second argument |
| `$#` | `2` | Total number of arguments passed |
| `$@` | `/src /dest` | All arguments as separate, properly-quoted words |
| `$?` | (exit code) | Exit status of the most recent command |
When iterating over all arguments, always use "$@" (quoted). Without quotes, $@ is subject to word splitting and arguments containing spaces are silently broken into multiple words:
for f in "$@"; do
echo "Processing: $f"
done
Command Chaining with && and ||
Because every command returns an exit status, you can chain commands conditionally without writing a full if/then/fi block:
- `&&` (AND): The right-hand command runs only if the left-hand command succeeds (exit code `0`). `mkdir output && echo "Directory created"` — only prints if `mkdir` succeeded.
- `||` (OR): The right-hand command runs only if the left-hand command fails (non-zero exit code). `cd /target || exit 1` — exits the script immediately if the directory cannot be entered.
This compact chaining idiom is widely used in professional scripts for concise, readable error handling.
Background Jobs
Appending & to a command runs it asynchronously — the shell launches it in the background and immediately returns to the prompt without waiting for it to finish:
./long_running_build.sh &
echo "Build started, continuing with other work..."
Two special variables are useful when managing background processes:
- `$$`: The process ID (PID) of the current shell itself. Often used to create unique temporary file names: `tmp_file="/tmp/myscript.$$"`.
- `$!`: The PID of the most recently backgrounded job. Use it to wait for or kill a specific background process.
The jobs command lists all active background jobs; fg brings the most recent one back to the foreground, and bg resumes a stopped job in the background.
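A sketch of launching a job, recording its PID with `$!`, and waiting for it; `sleep 1` stands in for any long-running command:

```shell
sleep 1 &                       # run in the background
bg_pid=$!                       # PID of the most recent background job
echo "started background job $bg_pid"

tmp_file="/tmp/myscript.$$"     # unique temp name from the shell's own PID
echo "scratch data" > "$tmp_file"

wait "$bg_pid"                  # block until that specific job finishes
echo "job finished with status $?"
rm "$tmp_file"
```

`wait` with no arguments would instead block until *all* background jobs of the current shell have finished.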
Functions — Reusable Building Blocks
When the same logic appears in multiple places, extract it into a function. Functions in Bash work like small scripts-within-a-script: they accept positional arguments via $1, $2, etc. — independently of the outer script’s own arguments — and can be called just like any other command.
greet() {
local name="$1"
echo "Hello, ${name}!"
}
greet "engineer" # → Hello, engineer!
The local Keyword
Without local, any variable set inside a function leaks into and overwrites the global script scope. Always declare function-internal variables with local to prevent subtle bugs:
process() {
local result="$1" # visible only inside this function
echo "$result"
}
Returning Values from Functions
The return statement only carries a numeric exit code (0–255), not data. To pass a string back to the caller, have the function echo the value and capture it with command substitution:
to_upper() {
echo "$1" | tr '[:lower:]' '[:upper:]'
}
loud=$(to_upper "hello") # loud="HELLO"
You can also use functions directly in if statements, because a function’s exit code is treated as its truth value: return 0 is success (true), return 1 is failure (false).
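A small sketch of a predicate function used this way; the function name is made up:

```shell
# A predicate: the exit status of the last command inside it
# ([ ... ]) becomes the function's own exit status
is_positive() {
    [ "$1" -gt 0 ]
}

if is_positive 5; then
    echo "5 is positive"
fi

if ! is_positive -3; then
    echo "-3 is not positive"
fi
```

No explicit `return` is needed here: the function's status is simply that of the `[ ... ]` test it ran last.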
Case Statements — Readable Multi-Way Branching
When you need to check one variable against many possible values, a case statement is far cleaner than a chain of if/elif:
case "$command" in
start) echo "Starting service..." ;;
stop) echo "Stopping service..." ;;
status) echo "Checking status..." ;;
*) echo "Unknown command: $command" >&2; exit 2 ;;
esac
Each branch ends with ;;. The * pattern is the catch-all default, matching any value not handled by earlier branches. The block closes with esac (case backwards).
Exit Codes — The Language of Success and Failure
Every command — including your own scripts — exits with a number. 0 always means success; any non-zero value means failure. This is the opposite of most programming languages where 0 is falsy. Conventional exit codes are:
| Code | Meaning |
|---|---|
| `0` | Success |
| `1` | General error |
| `2` | Misuse — wrong arguments or invalid input |
Meaningful exit codes make scripts composable: other scripts, CI pipelines, and tools like make can call your script and take action based on the result. For example, ./monitor.sh || alert_team only triggers the alert when your monitor exits non-zero.
Shell Expansions — Brace Expansion and Globbing
The shell performs several rounds of expansion on a command line before executing it. Understanding the order helps you predict and control what the shell does.
Brace Expansion
First comes brace expansion, which generates arbitrary lists of strings. It is a purely textual operation — no files need to exist:
mkdir project/{src,tests,docs} # creates three directories at once
cp config.yml config.yml.{bak,old} # copies to two names simultaneously
echo {1..5} # → 1 2 3 4 5 (sequence expression)
Brace expansion happens before all other expansions, so you can combine it freely with variables and globbing.
Supercharging Scripts with Regular Expressions
Because the UNIX philosophy is heavily centered around text streams, text processing is a massive part of shell scripting. Regular Expressions (RegEx) are a vital tool used within shell commands like grep, sed, and awk to find, validate, or transform text patterns quickly.
Globbing vs. Regular Expressions: These look similar but are entirely different systems. Globbing (filename expansion) uses `*`, `?`, and `[...]` to match filenames — the shell expands these before the command runs (e.g., `rm *.log` deletes all `.log` files). The three special pattern characters are: `*` matches any string (including empty), `?` matches any single character, and `[` opens a bracket expression `[...]` that matches any one of the enclosed characters — e.g., `[a-z]` matches any lowercase letter, and `[!a-z]` matches any character that is not a lowercase letter. Regular Expressions use `^`, `$`, `.*`, `[0-9]+`, and similar constructs — they are pattern languages used by tools like `grep`, `sed`, and `awk`, and also natively by Bash itself via the `=~` operator inside `[[ ]]` conditionals (which evaluates POSIX extended regular expressions directly without spawning an external tool). Critically, `*` means “match anything” in globbing, but “zero or more of the preceding character” in RegEx.
RegEx allows you to match sub-strings in a longer sequence. Critical to this are anchors, which constrain matches based on their location:
- `^`: Start of string. (Does not allow any other characters to come before.)
- `$`: End of string.
Example: ^[a-zA-Z0-9]{8,}$ validates a password that is strictly alphanumeric and at least 8 characters long, from the exact beginning of the string to the exact end.
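That same pattern can be wrapped in a small validation function using `grep -E`; the function name and sample passwords are made up:

```shell
validate() {
    # -E: extended regular expressions; -q: quiet, rely on the exit status
    echo "$1" | grep -Eq '^[a-zA-Z0-9]{8,}$'
}

validate "Secret123"  && echo "accepted"   # 9 alphanumeric chars: passes
validate "short1"     || echo "rejected"   # only 6 chars: fails
validate "pass word1" || echo "rejected"   # contains a space: fails
```

Without the `^` and `$` anchors, `"pass word1"` would be accepted, because the unanchored pattern would happily match an 8-character alphanumeric substring inside it.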
Conclusion
Shell scripting is an indispensable skill for anyone working in tech. By viewing the shell as a set of modular tools (the “Infinity Stones” of your development environment), you can combine simple operations to perform massive, complex tasks with minimal effort. Start small by automating a daily chore on your machine, and before you know it, you will be weaving complex UNIX tools together with ease!
Quiz
Shell Commands — What Does It Do?
Match each shell command to its purpose
What does ls do?
What does mkdir do?
What does cp do?
What does mv do?
What does rm do?
What does less do?
What does cat do?
What does sed do?
What does grep do?
What does head do?
What does tail do?
What does wc do?
What does sort do?
What does cut do?
What does ssh do?
What does htop do?
What does pwd do?
What does chmod do?
Shell Commands Flashcards
Which Shell command would you use for the following scenarios?
You need to see a list of all the files and folders in your current directory. What command do you use?
You are currently in your home directory and need to navigate into a folder named ‘Documents’. Which command achieves this?
You want to quickly view the entire contents of a small text file named ‘config.txt’ printed directly to your terminal screen.
You need to find every line containing the word ‘ERROR’ inside a massive log file called ‘server.log’.
You wrote a new bash script named ‘script.sh’, but when you try to run it, you get a ‘Permission denied’ error. How do you make the file executable?
You want to rename a file from ‘draft_v1.txt’ to ‘final_version.txt’ without creating a copy.
You are starting a new project and need to create a brand new, empty folder named ‘src’ in your current location.
You want to view the contents of a very long text file called ‘manual.txt’ one page at a time so you can scroll through it.
You need to create an exact duplicate of a file named ‘report.pdf’ and save it as ‘report_backup.pdf’.
You have a temporary file called ‘temp_data.csv’ that you no longer need and want to permanently delete from your system.
You want to quickly print the phrase ‘Hello World’ to the terminal or pass that string into a pipeline.
You want to know exactly how many lines are contained within a file named ‘essay.txt’.
You need to perform an automated find-and-replace operation on a stream of text to change the word ‘apple’ to ‘orange’.
You have a space-separated log file and want a tool to extract and print only the 3rd column of data.
You want to store today’s date (formatted as YYYY-MM-DD) in a variable called TODAY so you can use it to name a backup file dynamically.
A variable FILE holds the value my report.pdf. Running rm $FILE fails with a ‘No such file or directory’ error for both ‘my’ and ‘report.pdf’. How do you fix this?
You are writing a script that requires exactly two arguments. How do you check how many arguments were passed to the script so you can print a usage error if the count is wrong?
You want to create a directory called ‘build’ and then immediately run cmake .. inside it, but only if the directory creation succeeded — all in a single command.
At the start of a script, you need to change into /deploy/target. If that directory doesn’t exist, the script must abort immediately — write a defensive one-liner.
You want to delete all files ending in .tmp in the current directory using a single command, without listing each filename explicitly.
Shell Pipelines
Practice connecting UNIX commands together with pipes to solve real tasks.
You want to count how many lines in server.log contain the word ‘ERROR’.
You have a file names.txt with one name per line. Print only the unique names, sorted alphabetically.
You have a file names.txt with one name per line. Print each unique name alongside a count of how many times it appears.
List all running processes and show only those belonging to user tobias.
Print the 3rd line of config.txt without using sed or awk.
List the 5 largest files in the current directory, with the biggest first, showing only their names.
You want to replace every occurrence of http:// with https:// in links.txt and save the result to links_secure.txt.
Print only the unique error lines from access.log that contain the word ‘ERROR’, sorted alphabetically.
Count the total number of files (not directories) inside the current directory tree.
Show the 10 most recently modified files in the current directory, newest first.
Extract the second column from comma-separated data.csv, sort the values, and print only the unique ones.
Convert the contents of readme.txt to uppercase and save the result to readme_upper.txt.
Print every line from app.log that does NOT contain the word ‘DEBUG’.
You have two files, file1.txt and file2.txt. Print all lines from both files that contain the word ‘success’, sorted alphabetically with duplicates removed.
Shell Scripting & UNIX Philosophy Quiz
Test your conceptual understanding of shell environments, data streams, and scripting paradigms beyond basic command memorization.
A developer needs to parse a massive log file, extract IP addresses, sort them, and count unique occurrences. Instead of writing a 500-line Python script, they use cat | awk | sort | uniq -c. Why is this approach fundamentally preferred in the UNIX environment?
A script runs a command that generates both useful output and a flood of permission error messages. The user runs script.sh > output.txt, but the errors still clutter the terminal screen while the useful data goes to the file. What underlying concept explains this behavior?
A C++ developer writes a Bash script with a for loop. Inside the loop, they declare a variable temp_val. After the loop finishes, they try to print temp_val expecting it to be undefined or empty, but it prints the last value assigned in the loop. Why did this happen?
You want to use a command that requires two file inputs (like diff), but your data is currently coming from the live outputs of two different commands. Instead of creating temporary files on the disk, you use the <(command) syntax. What is this concept called and what does it achieve?
A script contains entirely valid Python code, but the file is named script.sh and has #!/bin/bash at the very top. When executed via ./script.sh, the terminal throws dozens of ‘command not found’ and syntax errors. What is the fundamental misunderstanding here?
A developer uses the regular expression [0-9]{4} to validate that a user’s input is exactly a four-digit PIN. However, the system incorrectly accepts ‘12345’ and ‘A1234’. What crucial RegEx concept did the developer omit?
You are designing a data pipeline in the shell. Which of the following statements correctly describe how UNIX handles data streams and command chaining? (Select all that apply)
You’ve written a shell script deploy.sh but it throws a ‘Permission denied’ error or fails to run when you type ./deploy.sh. Which of the following are valid reasons or necessary steps to successfully execute a script as a standalone program? (Select all that apply)
In Bash, exit codes are crucial for determining if a command succeeded or failed. Which of the following statements are true regarding how Bash handles exit statuses and control flow? (Select all that apply)
When you type a command like python or grep into the terminal, the shell knows exactly what program to run without you providing the full file path. How does the $PATH environment variable facilitate this, and how is it managed? (Select all that apply)
A developer writes LOGFILE="access errors.log" and then runs wc -l $LOGFILE. The command fails with ‘No such file or directory’ errors for both ‘access’ and ‘errors.log’. What is the root cause?
A script is invoked with ./deploy.sh production 8080 myapp. Inside the script, which variable holds the value 8080?
A script contains the line: cd /deploy/target && ./run_tests.sh && echo 'All tests passed!'. If ./run_tests.sh exits with a non-zero status code, what happens next?
Which of the following statements correctly describe Bash quoting and command substitution behavior? (Select all that apply)
Arrange the pipeline fragments to build a command that extracts all ERROR lines from a log, sorts them, removes duplicates, and counts how many unique errors remain.
grep 'ERROR' server.log | sort | uniq | wc -l
Arrange the lines to write a shell script that validates a command-line argument, prints an error to stderr if missing, and exits with a non-zero code.
#!/bin/bash
if [ $# -lt 1 ]; then
  echo "Error: no filename given" >&2
  exit 1
fi
echo "Processing $1..."
Arrange the pipeline fragments to find the 5 most frequently occurring IP addresses in an access log.
grep -oE '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+' access.log | sort | uniq -c | sort -rn | head -5
Arrange the fragments to redirect both stdout and stderr of a deployment script into a single log file.
./deploy.sh > output.log 2>&1
Arrange the pipeline to count how many files under src/ contain the word TODO.
grep -rl 'TODO' src/ | wc -l
Arrange the fragments to grant execute permission on a script and immediately run it.
chmod +x script.sh && ./script.sh
After finishing these quizzes, you are now ready to practice in a real Linux system. Try the Interactive Shell Scripting Tutorial!