Note: This is a follow-up to my previous article "6 Techniques I Use to Create a Great User Experience for Shell Scripts". If you haven't read it yet, you might want to check it out first!
After publishing my previous article on how I create a great user experience for shell scripts, the Hacker News community provided valuable feedback and suggestions. I've compiled these insights into six additional techniques to further enhance your shell scripts:
1. Make your script interoperable and pipeline friendly
Ensuring your scripts work well in various environments is crucial. Here are four key practices:
a) Use #!/usr/bin/env bash
#!/usr/bin/env bash uses the user's PATH to find the bash executable, which is helpful when bash might be installed in different locations on different systems.
#!/bin/bash directly specifies the bash location, which might not exist on all systems.
b) Use tput for portable color output
Example:
# Only emit color codes when tput exists, stdout is a terminal,
# and the user hasn't opted out via NO_COLOR
if command -v tput &>/dev/null && [ -t 1 ] && [ -z "${NO_COLOR:-}" ]; then
    RED=$(tput setaf 1)
    GREEN=$(tput setaf 2)
    RESET=$(tput sgr0)
else
    RED=""
    GREEN=""
    RESET=""
fi
c) Respect the $NO_COLOR environment variable
As shown above, check for $NO_COLOR before using colors.
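Because the check above already honors $NO_COLOR and non-terminal output, the rest of the script can use the color variables unconditionally. A minimal sketch (the messages are just illustrative):
# The variables are empty strings when color is disabled, so this degrades gracefully
printf '%sError:%s something went wrong\n' "$RED" "$RESET" >&2
printf '%sDone.%s\n' "$GREEN" "$RESET"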
d) Output status messages to stderr
Example:
echo "Error: Invalid input" >&2
2. Use printf instead of echo
echo can behave unexpectedly: its handling of options like -n and of backslash escapes varies between shells (and even between Bash configurations). printf is consistent across systems:
printf "Hello, %s!\n" "$name"
3. Use shellcheck to find bash pitfalls
ShellCheck is an excellent tool for identifying common issues in shell scripts. For example, it found this potential problem in my original evaluate.sh script:
# Before
rm -rf $dir/*
# After (ShellCheck suggestion)
rm -rf "$dir"/*
Always run your scripts through ShellCheck before considering them complete.
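Running it is a one-liner; you can also check every script in a repository at once:
shellcheck evaluate.sh
find . -name '*.sh' -exec shellcheck {} +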
4. Be aware of double bracket vs single bracket behavior
Double brackets [[ ]] are a Bash extension and offer more features than single brackets [ ]. They're generally safer and more flexible. Here are the key differences:
# Pattern matching fails with single brackets
file="my-special-file.txt"
if [ "$file" = *.txt ]; then    # Always false! Literal comparison
    echo "Single brackets don't do pattern matching"
fi
if [[ $file == *.txt ]]; then   # Works! Pattern matching
    echo "Double brackets support pattern matching"
fi
# Word splitting behavior differs
path="/some path/with spaces"
if [ -d "$path" ]; then         # Must quote variables
    echo "Single brackets need quotes"
fi
if [[ -d $path ]]; then         # Quotes optional (but still recommended)
    echo "Double brackets handle spaces"
fi
# Logical operators are different
if [ "$a" = "1" ] && [ "$b" = "2" ]; then     # Shell operators needed
    echo "Single brackets use shell && ||"
fi
if [[ $a == 1 && $b == 2 ]]; then             # Built-in operators
    echo "Double brackets have logical operators"
fi
# Regular expressions only work in double brackets
if [[ $number =~ ^[0-9]+$ ]]; then            # Regex support
    echo "Double brackets support regex with =~"
fi
Double brackets are more forgiving and feature-rich, but they're Bash-specific. If you need POSIX compatibility (to run on other shells like dash or sh), use single brackets. Otherwise, double brackets are generally safer and more convenient.
Note: Even with double brackets, it's still considered good practice to quote your variables to maintain consistency and prevent surprises when scripts are modified later.
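If you do need pattern matching while staying POSIX-compatible, a case statement is the portable workaround; a minimal sketch:
# Portable pattern matching without [[ ]]
case "$file" in
    *.txt) printf 'Matched a .txt file\n' ;;
    *)     printf 'No match\n' ;;
esac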
5. Use exit 2 instead of exit 1 for usage errors
It's a convention to use exit code 2 for command line syntax errors:
if [ $# -eq 0 ]; then
    echo "Usage: $0 <argument>" >&2
    exit 2
fi
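Distinct exit codes let callers react differently to bad invocations versus runtime failures. A minimal sketch of a caller (myscript.sh is a placeholder for a script using the check above):
./myscript.sh
case $? in
    0) printf 'Success\n' ;;
    2) printf 'Usage error - check the arguments\n' >&2 ;;
    *) printf 'Something failed at runtime\n' >&2 ;;
esac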
6. Make use of bash boilerplates
Bash boilerplates can provide a solid foundation for your scripts. One popular option is bash3boilerplate. Here's a simple example:
#!/usr/bin/env bash
set -o errexit   # exit as soon as any command fails
set -o pipefail  # a pipeline fails if any command in it fails
set -o nounset   # treat unset variables as errors

__dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"   # directory containing this script
__file="${__dir}/$(basename "${BASH_SOURCE[0]}")"       # absolute path to this script
__base="$(basename "${__file}" .sh)"                    # script name without the .sh extension
arg1="${1:-}"                                           # first argument, empty if none given

# Your code here
Wrapping Up
If you enjoyed learning these techniques, then check out my previous article, "6 Techniques I Use to Create a Great User Experience for Shell Scripts". It covers: 1) Comprehensive Error Handling and Input Validation, 2) Clear and Colorful Output, 3) Detailed Progress Reporting, 4) Strategic Error Handling with "set -e" and "set +e", 5) Platform-Specific Adaptations, and 6) Timestamped File Outputs for Multiple Runs.
Also, check out the Hacker News discussion of that post, which inspired this follow-up.
Thank you to the Hacker News community for these valuable insights. Happy scripting!