Redirection and piping are fundamental concepts in Linux that enable powerful command-line workflows. By controlling input, output, and error streams, you can combine simple commands to perform complex data processing tasks efficiently. Mastering these operators is essential for effective system administration and automation.
Standard Streams in Linux
Every Linux process has three standard streams for communication:
Standard Input
Input stream for commands, typically from the keyboard.
File Descriptor: 0
Default Source: Keyboard
Redirection: <
Standard Output
Normal output stream for commands.
File Descriptor: 1
Default Destination: Terminal
Redirection: >, >>
Standard Error
Error output stream for commands.
File Descriptor: 2
Default Destination: Terminal
Redirection: 2>, 2>>
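The separation of the three streams can be seen with a short sketch: one command writes to both stdout and stderr, and each redirect captures only its own file descriptor.

```shell
# One command writing to both streams; each redirect captures one fd.
{ echo "to stdout"; echo "to stderr" >&2; } > out.txt 2> err.txt
cat out.txt   # to stdout
cat err.txt   # to stderr
```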
Output Redirection Operators
Overwrite Redirect
Redirect stdout to a file, overwriting existing content.
Examples:
ls -l > listing.txt
echo "Hello" > greeting.txt
date > current_time.txt
Note: Creates file if it doesn't exist
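Because > silently replaces existing content, bash (and ksh/zsh) offer the noclobber option as a safety net; >| overrides it for a deliberate overwrite. A minimal sketch:

```shell
# noclobber makes > refuse to overwrite an existing file; >| overrides it.
set -o noclobber
echo "first" > data.txt
echo "second" > data.txt   # fails: data.txt already exists
echo "third" >| data.txt   # explicit, intentional overwrite
set +o noclobber
cat data.txt               # third
```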
Append Redirect
Redirect stdout to a file, appending to existing content.
Examples:
echo "New entry" >> log.txt
date >> system_log.txt
who >> users.txt
Use Case: Log files, data collection
Error Redirect
Redirect stderr to a file, overwriting existing content.
Examples:
ls /nonexistent 2> error.txt
grep pattern file 2> grep_errors.log
script.sh 2> stderr.log
Use Case: Error logging
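Just as >> appends stdout, 2>> appends stderr, so error messages accumulate across runs instead of replacing each other:

```shell
# 2>> appends stderr instead of overwriting, so errors accumulate.
ls /missing_one 2>> errors.log
ls /missing_two 2>> errors.log
wc -l < errors.log   # one error line per failed run
```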
Input Redirection Operators
Input Redirect
Use file content as stdin for a command.
Examples:
sort < unsorted.txt
wc -l < document.txt
grep "error" < logfile.txt
Alternative: Often replaced with pipes
Here Document
Use inline text as stdin until a delimiter.
Syntax:
command << DELIMITER
text
DELIMITER
Examples:
cat << EOF
Line 1
Line 2
EOF

sort << END
banana
apple
END
Use Case: Scripts, multi-line input
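One detail worth knowing: the shell expands variables inside a here document by default, and quoting the delimiter suppresses that expansion.

```shell
name="world"
# Unquoted delimiter: variables expand inside the body.
cat << EOF
Hello, $name
EOF
# Quoted delimiter: the body is passed through literally.
cat << 'EOF'
Hello, $name
EOF
```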
Here String
Use a string as stdin for a command.
Examples:
grep "hello" <<< "hello world"
wc -w <<< "count these words"
tr 'a-z' 'A-Z' <<< "lowercase"
Use Case: Quick string processing
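A practical advantage over `echo "$line" | read ...`: a here string feeds `read` without a pipeline subshell, so the resulting variable assignments survive in the current shell.

```shell
# read via a here string runs in the current shell, so first/rest persist.
line="alpha beta gamma"
read -r first rest <<< "$line"
echo "$first"   # alpha
echo "$rest"    # beta gamma
```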
Piping Operator
Command Pipeline Flow: the stdout of each command becomes the stdin of the next.
Pipe Operator
Connect stdout of one command to stdin of another command.
Examples:
ls -l | grep ".txt" - Find text files
ps aux | sort -nrk 3 | head -5 - Top 5 CPU processes
cat logfile | grep "ERROR" | wc -l - Count errors
dmesg | tail -20 | less - Page through last 20 kernel messages
Power: Combine simple commands for complex tasks
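One subtlety when combining commands: a pipeline's exit status is that of its last command, so an earlier failure is invisible to $?. Bash records the status of every stage in the PIPESTATUS array:

```shell
# The failing ls is invisible to $? because wc (the last stage) succeeds.
ls /nonexistent 2>/dev/null | wc -l
s=("${PIPESTATUS[@]}")   # copy immediately; the next command resets it
echo "ls exited ${s[0]}, wc exited ${s[1]}"
```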
Advanced Redirection Techniques
| Operator | Description | Syntax | Example | Use Case |
|---|---|---|---|---|
| &> | Redirect both stdout and stderr | command &> file | script.sh &> output.log | Complete output capture |
| 2>&1 | Redirect stderr to stdout | command > file 2>&1 | ls /root > output 2>&1 | Combine outputs |
| >&2 | Redirect stdout to stderr | echo "error" >&2 | echo "Debug" >&2 | Script error messages |
| \|& | Pipe both stdout and stderr | command1 \|& command2 | script.sh \|& grep "error" | Process all output |
| >/dev/null | Discard output completely | command > /dev/null | ping host > /dev/null | Silent operation |
| 2>/dev/null | Discard error output only | command 2> /dev/null | rm file 2> /dev/null | Suppress errors |
- > - Overwrite (think "greater than" - replaces content)
- >> - Append (double arrow - adds to end)
- 2> - Error redirect (2 for stderr)
- | - Pipe (vertical bar - connects commands)
- < - Input redirect (arrow points to command)
Practical Examples and Workflows
Real-World Redirection Scenarios
# 1. System Administration
# Monitor disk space and save to file
df -h > disk_usage.txt
# Check running processes and filter
ps aux | grep nginx > nginx_processes.txt
# Backup directory with verbose output
tar -czf backup.tar.gz /home/user/ &> backup.log
# 2. Log Analysis
# Count unique IP addresses in web logs
cat access.log | awk '{print $1}' | sort | uniq -c | sort -nr > top_ips.txt
# Extract errors from multiple log files
cat *.log | grep -i "error" > all_errors.txt
# Monitor log file in real-time
tail -f /var/log/syslog | grep "ssh"
# 3. Data Processing
# Sort a file and remove duplicates
sort data.txt | uniq > unique_data.txt
# Count lines, words, and characters
wc -l < document.txt
cat document.txt | wc -w
# Convert text to uppercase
cat file.txt | tr 'a-z' 'A-Z' > uppercase.txt
# 4. File Management
# Find large files and save list
find /home -size +100M -type f > large_files.txt 2>/dev/null
# Compare two directories
diff <(ls dir1) <(ls dir2)
# Create a file with multiple lines
cat > config.txt << EOF
server_name example.com
port 8080
timeout 30
EOF
# 5. Network Operations
# Test connectivity silently
ping -c 3 google.com > /dev/null 2>&1 && echo "Online" || echo "Offline"
# Download file with progress and log
wget http://example.com/file.zip -O file.zip > download.log 2>&1
# Check open ports
netstat -tulpn | grep :80 > http_ports.txt
Common Use Cases
System Monitoring
- Resource Tracking: ps aux --sort=-%mem | head -10 > top_memory.txt
- Disk Monitoring: df -h | grep -v tmpfs > disk_report.txt
- Network Stats: netstat -s | grep "segments" > network_stats.txt
- Service Status: systemctl list-units --type=service | grep running > running_services.txt
Log Management
- Error Extraction: grep -i "error" /var/log/syslog > errors.txt
- Log Rotation: cat /var/log/messages.1 >> /var/log/messages.archive
- Real-time Monitoring: tail -f /var/log/nginx/access.log | grep "404" > 404_errors.txt
- Log Analysis: awk '{print $1}' access.log | sort | uniq -c | sort -nr > frequent_visitors.txt
Data Processing
- Text Processing: cat data.csv | cut -d, -f1,3 | sort > sorted_data.txt
- Data Filtering: grep -v "^\s*#" config.txt > clean_config.txt
- Format Conversion: iconv -f UTF-8 -t ASCII file.txt > file_ascii.txt
- Data Validation: grep -E '^[A-Za-z0-9]+$' usernames.txt > valid_users.txt
Advanced Piping Techniques
Process Substitution:
# Compare outputs of two commands
diff <(ls dir1) <(ls dir2)
# Sort and compare two files
comm <(sort file1.txt) <(sort file2.txt)
Named Pipes (FIFO):
# Create a named pipe
mkfifo my_pipe
# Write to pipe in one terminal
ls -l > my_pipe
# Read from pipe in another terminal
cat < my_pipe
Tee Command:
# Split output to multiple destinations
ls -l | tee listing.txt | wc -l
# Append to file and show on screen
echo "New entry" | tee -a log.txt
# Multiple files
dmesg | tee file1.txt file2.txt > /dev/null
Xargs for Batch Processing:
# Process multiple files
find . -name "*.txt" | xargs wc -l
# Run command for each line
cat urls.txt | xargs -n 1 wget
# Parallel processing
cat files.txt | xargs -P 4 -n 1 gzip
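One caveat with the pipelines above: plain xargs splits its input on whitespace, so filenames containing spaces break into multiple arguments. Pairing find's -print0 with xargs -0 uses NUL delimiters and handles any filename safely (the demo directory below is created just for illustration):

```shell
# Newline-delimited xargs would split "with space.txt" into two arguments;
# NUL delimiters (-print0 / -0) handle any filename safely.
mkdir -p demo
touch "demo/plain.txt" "demo/with space.txt"
find demo -name "*.txt" -print0 | xargs -0 wc -l
```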
- > will overwrite files without warning - always double-check
- Pipes create subprocesses - variable changes won't affect the parent shell
- Some commands buffer output, which can affect real-time piping
- Complex pipelines can be difficult to debug - test step by step
- Be careful with rm commands in pipelines
- Use set -o pipefail in scripts to catch pipeline errors
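The pipefail point is worth seeing in action: without it, a failed stage earlier in the pipeline is silently masked by a successful final stage.

```shell
false | true
echo "$?"   # 0 - only the last command's status counts
set -o pipefail
false | true
echo "$?"   # 1 - any failing stage now fails the whole pipeline
set +o pipefail
```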
Troubleshooting Common Issues
Permission Denied:
# Check file permissions
ls -l output.txt
# Use sudo if needed
sudo dmesg > kernel_log.txt
No Output or Empty Files:
# Command might not produce output
ls /empty_dir > file.txt
# Check if command has output
ls /empty_dir | wc -l
# Verify redirection syntax
echo "test" > file.txt
cat file.txt
Pipeline Hangs:
# Command might be waiting for input
grep "pattern" # Hangs waiting for stdin
# Use Ctrl+C to interrupt
# Provide input or use different approach
echo "text" | grep "pattern"
Error Messages in Wrong Place:
# Redirect errors properly
ls /nonexistent 2> error.log
# Combine stdout and stderr
command > output.log 2>&1
# Discard errors completely
command 2> /dev/null
Key Takeaways
Redirection and piping are powerful tools that transform simple Linux commands into sophisticated data processing pipelines. By mastering operators like >, >>, <, and |, you can efficiently manipulate data streams, automate tasks, and build complex workflows. Remember to handle both stdout and stderr appropriately, use pipes to combine commands creatively, and always test complex pipelines step by step.
Next Step: Explore shell scripting to automate complex redirection and piping workflows, creating reusable solutions for common system administration tasks.