Bashing Bash

Bash is a cryptic, unwieldy language that for some bizarre reason became the standard on many Linux machines. This post will be a bit different from my usual posts. Today it's Bash vs my sanity, with our favorite naïve junior developer, Bobby, taking the driving seat as we discover the wasteland of shell scripting.

We hope, just like in Mad Max, to reach the other side of the wasteland, where a sane world of alternatives to Bash awaits. So let's strap in and …

Rant TIME!!!

Thunder cracks, the smoke clears, and like the Terminator he appears: my personal nightmare, Bash. Bash and its siblings are installed on virtually every Linux machine, yet the language is unpredictable, clunky and just plain weird. Yes, a shell is installed everywhere, but each OS ships different CLI tools: some have curl, others have wget. Even the flavors of the tools don't match; grep can take different arguments depending on the OS, and there is no easy way to check this. The only thing that is certain in Bash is that it is inconsistent.

Round 1: Arithmetics

Bobby is just trying to do some simple math. Arithmetic? This can't be hard, he thought. He was wrong:

x = 1 # x: command not found BOBBY: ?
z= 1 # 1: command not found BOBBY: ??
f =1 # f: command not found BOBBY: ???
y=1 # BOBBY: ah oke so no spaces?

So only y=1 works, because Bash is space sensitive, unlike every sane language. This is a classic landmine: one step wrong and your whole script blows up in your face. The best part: Bash doesn't halt on errors by default, so when you forget to add set -e you might never notice an error.

So now Bobby tries to do arithmetic:

#!/bin/bash
x=1
x+=1
echo $x # 11 Bobby: Uh, I expected 2

Welcome to Bash, Bobby: you are doing string concatenation, not arithmetic. You have to use the builtin called let:

let x=1
let x+=1
echo $x # 2 Bobby: okay, this is what I expected

Bobby: oh cool, what if I do:

let x=1
x+=1
echo $x # 11 Bobby: kinda makes sense, I guess.

Ah, I wonder how it is in my favorite shell, Zsh:

let x=1
x+=1
echo $x # 2 Bobby: um, okay

# how about:

let x=1
y=2
y+=$x
x+=$y
echo " x: $x \n y: $y"
# x: 22
# y: 21

Thank you, Bobby, but let's keep ourselves limited to Bash; we have to pick our battles and I don't want to die on that hill. Bobby: but why? Okay: in zsh, let is sticky and depends on the 'subject' of the operation, so any interaction with y is string based: "2"+"1" => "21". Interactions with x are arithmetic: 1 + 21 => 22.

Bobby: why is this not 2?

let x=1
let y=2
x+=y # Oops, forgot the $ and let
echo $x # Outputs 1y, the string append again

Bash doesn't even tell us that there is an error. It just silently skips it and continues along. Bobby, try enabling error checking with set -e. Nothing. Have you checked the error code? echo $? outputs 0… Bash pretends like everything is fine. Useful, right? There really doesn't seem to be a way to catch this error.

Bobby: I got it to work:

let x+=y # note: no $ needed to mark the variable?
echo $x # Outputs 3
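Another escape hatch, as a sketch: declare -i marks the variable as an integer, so += stays arithmetic even without let or a $.

```shell
declare -i x=1  # the -i attribute makes every assignment to x arithmetic
y=2
x+=y            # evaluated arithmetically: x = x + y
echo $x         # 3
```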

Me: maybe we could try a tool to help us with this: shellcheck. Bobby adds the snippet to a file called test.bash:

$ shellcheck test.bash

In test.bash line 1:
let x=1
^-----^ SC2219 (style): Instead of 'let expr', prefer (( expr )) .


In test.bash line 2:
let y=2
^-----^ SC2219 (style): Instead of 'let expr', prefer (( expr )) .
     ^-- SC2034 (warning): y appears unused. Verify use (or export if used externally).


In test.bash line 3:
let x+=y # Oops, forgot the $
^------^ SC2219 (style): Instead of 'let expr', prefer (( expr )) .

Well, that wasn't very helpful. And really, is this (( )) any clearer than let?

Let's continue with arithmetic; there are alternative commands to let:

expr 1 + 1 # 2 executes and prints an expression
echo $(( 1 + 1)) # 2 executes the expression and fills it in
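One more trap worth flagging here: every one of these forms does integer-only math, so division silently truncates. A short sketch:

```shell
echo $(( 7 / 2 ))   # 3, not 3.5: division truncates
echo $(( 7 % 2 ))   # 1, the remainder
echo $(( 2 ** 10 )) # 1024, exponentiation does work
```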

Bobby, do you remember how to take the size of a string?

size "hello"
# bash: hello: No such file or directory

len "hello"
# Command 'len' not found, but there are 16 similar ones.

echo "hello" | wc -c
# 6? I expected 5... (wc -c also counts echo's trailing newline)

x="hello"
echo ${#x}
# 5 Aha
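That same ${...} expansion syntax hides more string tools; a short sketch (the case-conversion form needs Bash 4+):

```shell
x="hello"
echo ${#x}      # 5, length
echo ${x:1:3}   # ell, substring from index 1, length 3
echo ${x/l/L}   # heLlo, replace the first match
echo ${x^^}     # HELLO, uppercase (Bash 4+ only)
```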

functions

Alright, Bobby, let's dive into Bash functions!

Bobby: okay, I can do it, it's probably like other languages: function print(x) { echo "$x" }

No, Bobby, it isn't like other languages; this is Bash!

function say-hello() { 
  echo $1; 
};

say-hello "world" "i'm ignored" # outputs: world
echo "$?" # 0 no error

Look at that. No warning, no error, just silence. Bash doesn't care about extra parameters; it doesn't even acknowledge their existence. Named parameters? Not in this language.

The best we can do is fail when no parameters are passed:

function say-hello() {
  local msg=${1?} # this requires at least one parameter. more about local later.
  echo $msg  
}

Bobby: what is the scope of the msg variable?

Scope? Yeah, good luck with that.

So how does variable scoping work inside a function? Let's find out:

function f0() {
  echo "$x"
}

function f1() { 
  x=1
  echo $x # outputs: 1
};

function f2() {
  x=2
  echo $x # outputs: 2
  f0 # outputs 2

  f1 # outputs 1
  echo $x # outputs: 1
}

f2

Bobby, see that? Variables aren't local. Bash functions resolve any matching variable and declare new ones global by default. Introducing local:

function f1() { 
 local x=1
 echo $x # outputs: 1
};

function f2() {
  local x=2
  echo $x # outputs: 2
  f1
}

f2

Okay, looks great: the variables are local. But wait, this isn't how locals actually work. local in Bash means creating a shadowed variable that is still visible to every function called from there (dynamic scoping), not an encapsulated one. Check this out:

function f1() { 
 let x+=1
 echo $x # 3
};

function f2() {
  local x=2
  echo $x # 2
  f1
  echo $x # 3
}

f2

Yeah, so no real locals like in other languages. Isn't Bash fun, Bobby?

Can I go home? I'm getting tired of this… No, Bobby, we agreed to get to the bottom of this; we are just getting started!

So how about this?

local x=10 # local: can only be used in a function

Damn you!!!

Higher order functions

Bobby asked: can we do higher-order functions?

function higher-order() {
  $1 "hello"
}

function say() {
  echo $1
}

higher-order say # "hello"

So yes, we can pass functions around, but how do we return from a function? return is reserved for error codes, so we are back to echo and the risk of polluting stdout.

function get-value() {
  echo "This is my return value"
}

result=$(get-value)
echo $result # Outputs: This is my return value
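If you are on Bash 4.3+, a nameref sidesteps the stdout capture entirely; a sketch: the caller passes the name of a variable and the function fills it in.

```shell
function get-value() {
  declare -n out_ref=$1            # nameref: out_ref aliases the caller's variable (Bash 4.3+)
  out_ref="This is my return value"
}

get-value result
echo "$result"  # This is my return value
```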

Sometimes stdout gets polluted by debug messages:

function create-file() {
  echo "creating file $1"
  touch "$1"
}

function create-file-and-add-data() {
  create-file "$1"
  echo "$2" >> "$1"
  cat "$1"
}

result=$(create-file-and-add-data "hello.txt" "hello")
echo $result # Outputs: 
# creating file hello.txt hello

See, Bobby, this isn't elegant. And did I mention that, unlike variables, Bash functions don't get inherited by subshells?

function printLine() {
  echo "line: $1"
}

cat file | xargs -I {} bash -c 'printLine {}' # printLine: command not found
export -f printLine
cat file | xargs -I {} bash -c 'printLine {}' # now it works, the child bash sees the exported function

So, if you want to pass functions to subshells, you’ll have to export -f every single one. This is the joy of Bash: one command forward, three exports back.

passing arguments

Passing arguments in Bash is like playing Russian roulette:

message="I, want to say that i love you"
function text-to-wife() { echo $1; };
text-to-wife $message # I,
text-to-wife "$message" # I, want to say that i love you

When you forget to quote $message, Bash splits it on whitespace, leaving you with a lonely I,. You could also solve this by using "$@", but that has its own quirks. Let's say you want to mark your message as corrected, using the convention of prefixing it with *:

message="* I, want to say that i love you"
function text-to-wife() { echo "$@"; };
text-to-wife $message # send-love-message-to-wife.sh send-love-message-to-barbara.sh I, want to say that i love you

Bash will happily glob-expand that * against your current directory, sniffing through your filesystem and sharing your personal information.

CLI flags? You’ll have to build that yourself:

while [ "$1" != "" ]; do
  case $1 in
  --force)
      SKIP_VERIFICATION=true
      ;;
  *)
      usage
      exit 1
      ;;
  esac
  shift  # remove the current value for `$1` and use the next
done

Writing argument-parsing code often takes longer than the function itself, especially once you want to support flags like --flag x, --flag=x, -f x and -f=x; the above supports none of these. The lack of built-in flag support, apart from the short-flags-only getopts, means reinventing the wheel every time.
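For short flags only, the built-in getopts does save some of that boilerplate; a sketch (the flag names here are just examples, and long --flags remain unsupported):

```shell
while getopts "f:v" opt; do    # f takes a value (the :), v is a boolean
  case $opt in
    f) FILE=$OPTARG ;;
    v) VERBOSE=true ;;
    *) echo "usage: $0 [-f file] [-v]" >&2; exit 1 ;;
  esac
done
shift $((OPTIND - 1))          # drop the parsed flags, keep positional args
echo "file=$FILE verbose=$VERBOSE rest=$*"
```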

Quoting rules from HELL

Bash has two quoting styles: ' for string literals and " for interpolated strings. This is fine until you need a literal " inside an interpolated string:

name="bobby"
echo 'hello: $name' # hello: $name
echo "hello: $name" # hello: bobby

Now imagine embedding JSON:

body="hello"
msg="{
  \"message\": \"$body\"
}"

A better alternative:

body="hello"
msg=$(cat << EOF
  {
    "message": "$body"
  }
EOF
)
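And when you want the opposite, no interpolation at all, quoting the heredoc delimiter turns the block into a pure literal; a sketch:

```shell
# 'EOF' (quoted) disables all interpolation inside the heredoc
cat << 'EOF'
$body stays exactly as written, quotes "and all"
EOF
```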

Bobby, just be glad you weren’t born into Perl scripting.

control flows

Let's talk about control flow. It feels like it comes from a different decade.

# why [[ ]]? (it is essentially a nicer version of the 'test' builtin)
if [[ condition ]]
then # why the then?
  # ...
else
  # why no then here?
fi # why fi?

Why close the block with fi, when functions use {} for their block scope and not noitcnuf?

It seems conditionals always close with the reversed opening keyword. Maybe they thought it was a good idea to reserve double the number of keywords, just to increase the language's complexity score.

case $1 in
    req|reqs|requirements) TASK="Functional Requirements";;
    met|meet|meetings) TASK="Meetings with the client";;
esac

For loops also have their own antique syntax:

# why do?
for x in "$@"; do
  # why not rof? or just end?
done

Still, the inconsistency is maddening.

Looping over arrays? Arrays in Bash are implicit, split on whitespace, so without the "$@" quoting everything breaks down. Need to iterate reliably? Consider switching languages.
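To be fair, Bash does have explicit arrays; they are just easy to get wrong. A sketch of the quoting that keeps elements with spaces intact:

```shell
files=("a file" "b file")     # an explicit array with two elements
for f in "${files[@]}"; do    # quoted [@]: one iteration per element
  echo "-> $f"
done
echo "count: ${#files[@]}"    # count: 2
```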

Error Handling: Bash Style

In Bash, checking errors is tough. Try this:

echo "hello"
echo $? # 0 means success
grep -k "blabla" # -k doesn't exist
echo $? # 2, it failed

The exit code in $? tells you when something failed, but that's about it. If you are lucky, the error is printed to stderr. Worst case it goes to stdout, polluting the return value of your functions. Yay.

Now consider pipes; they are messier:

grep -k "blabla" | echo "hello" # hello
echo $? # Outputs: 0 (even though grep failed)

You'll need pipefail to save you:

set -o pipefail

grep -k "blabla" | echo "hello" # still outputs "hello", polluting stdout
echo $? # 2

Great, now you know the pipe failed, but echo still runs. Every time you want reliable error handling, you have to add the checks yourself. The only real way to handle errors properly is to create a chain of && and ||.

set -o pipefail
# A complicated example
{
  cat "other-file" \
  || { # if failed recover using this:
touch "other-file" && cat "other-file" # use this as the 'recovery' by creating the file
    } 
} \
  | grep "x" | sed 's/x=1/x=2/' || echo "No edit made, which is fine" >&2 # continue, printing to stderr in case of an error

This is tedious and unintuitive. Instead of focusing on your logic, you’re babysitting Bash’s quirks.
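A common mitigation, often called the "unofficial strict mode", at least turns several of these silent failures into loud ones; a sketch:

```shell
#!/bin/bash
set -e            # exit on the first failing command
set -u            # error on undefined variables instead of expanding to empty strings
set -o pipefail   # a pipeline fails if any stage fails, not just the last
IFS=$'\n\t'       # stop word-splitting on plain spaces
```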

IDE Support, Autocomplete

Using Bash in a modern IDE? Even premium IDEs like IntelliJ offer basic syntax checking at best, thanks to tools like shellcheck. Forget about autocomplete or meaningful suggestions; shell scripts aren't first-class citizens here. Even though Bash has autocomplete features in the terminal, IDEs don't seem to do the heavy lifting to offer us the same suggestions. It is often easier to first write a command in the terminal and then backport it to the script file.

No libraries

Bash modules? Libraries? Nope. Namespaces? Forget about it. To reuse Bash code you have to rely on external CLI tools, and each CLI is its own DSL with its own rules. There's no standard library, so you end up cobbling together logic from different command-line tools and reading countless --help manuals.

Bobby: how do I get the help text?

go help build # go is weird
go build -race # notice single -

grep -v --fixed-strings # short flags have -, long flags --
grep -h # no help for -h, but it has --help

find --help # find takes --help, yet its other long flags use a single - go figure...

Quality improvements

As painful as Bash can be, there are ways to make it a bit more bearable. If you have to use Bash, these tips might help you keep it under control. I also recommend adopting a guideline like the Google Shell Style Guide: it recommends against Bash for scripts beyond about 100 lines, because scripts tend to grow, so you have to keep them in check.

use a ‘main’ function

Using a main function converts scripts into reusable modules which can be sourced from other scripts. Mind you, functions are not scoped, so if you use short function names they will eventually collide.

Here is an example:

# script1.sh
# prints output to stderr
function debug() { echo "$@" 1>&2; }

function main() {
  local x=1 # at least this is limited to only people that run the script
}

if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
    main "$@"
fi

Then the above file can be sourced in another file:

source script1.sh

# mind you this main will overwrite the imported main
function main() {
  debug "hello"
}

# sourcing here would call the other script's main

if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
    main "$@"
fi

It isn't perfect, but the Bash script becomes more structured and testable.

A few debug functions to help

Having some basic logging makes scripts easier to troubleshoot and avoids tainting your stdout:

# debug.sh

# log functions to stderr:
function INFO()  { echo "[$(date)] INFO:" "$@" 1>&2;  }
function WARN()  { echo "[$(date)] WARN:" "$@" 1>&2;  }
function ERROR() { echo "[$(date)] ERROR:" "$@" 1>&2; }

Another ace up my sleeve is an alias called breakpoint, which I define in my ~/.bashrc:

# a breakpoint alias
alias breakpoint='
    while read -p "Debugging(Ctrl-d to exit)> " debugging_line
    do
        eval "$debugging_line"
    done'

Why all the hassle to define your own breakpoint? Bash doesn't have a built-in debugger. Since it's an alias, it gets inlined into the function, allowing you to read any local variable or call any function.

Use a tool manager.

If you're using Bash in a CI environment, a tool manager like Aqua or Mise can be a real lifesaver. It ensures consistent tool versions between your local project and the CI system, making it easy to set up the right tools for the project.

Alternatives

Let's be real: Bash was never built to handle complex tasks. Scripts tend to start as a quick patch of a few lines of logic, but there is nothing more permanent than a temporary solution. So a script starts with a few lines and ends up a 200-line behemoth, where it is hard to set up compatible CLI tools and you are stuck parsing JSON with jq or yq.

Thankfully you don't have to stay in the Bash wasteland forever. Here are two programming languages which make scripting not only possible but even enjoyable.

Python

Compared to Bash, Python feels like a luxury. It has a clean syntax, a rich standard library and a very large ecosystem. As long as there isn't a real need for speed or concurrency (like firing many REST calls in parallel), Python is an ideal candidate. It works for a quick-and-dirty script all the way up to a service running in production.

Python advantages

Python has an overall clean syntax and there isn't too much magic involved. The standard library is powerful, and there are many useful libraries, like requests; for scripts there is typer. For testing I prefer pytest over the standard library.

Try out python

import typer

app = typer.Typer()

@app.command()
def hello(name: str):
    print(f"Hello {name}")

@app.command()
def goodbye(name: str, formal: bool = False):
    if formal:
        print(f"Goodbye Ms. {name}. Have a good day.")
    else:
        print(f"Bye {name}!")

if __name__ == "__main__":
    app()

Can be run with:

python main.py hello Alice
# or, using a venv:
.venv/bin/python main.py hello Alice

python quirks

Fragmented tooling: for a long time there have been many competing tools to manage Python dependencies. To keep it simple, I recommend using uv for venvs; it also lets you choose the Python version:

uv venv --python 3.12 # with uv you can select the python version
uv pip install -r requirements.txt

For formatting, install ruff or black. There are many options for type checking, but I would stick to the reference implementation, mypy.

For simple scripts you only need the venv, to avoid spoiling the system Python. But there are many tools to improve the developer environment once the script grows into a program.

The main issue really becomes distributing the program to other people, because you can't easily package it into a single binary.

golang

I have experience with both Python and Go, and I think Go is a fantastic choice when you actually want a script that can scale into a full program. Since it compiles into a single binary, it becomes trivial to distribute and release.

But you will have to step over its quirky syntax. You will think it's missing feature X or Y, but at the end of the day you might not really need them, or there is a simple workaround.

Try out go

Advantages of go

  • Compiled: it produces a single binary, making it easy to distribute
  • Built-in concurrency using goroutines
  • A full ecosystem in the standard go tool
  • Fast compilation speed

Getting started with go scripting:

Here’s a typical setup for a Go scripting project:

mkdir -p project/scripts && cd project # create a project

go mod init project # create a module this is like a python venv
mkdir scripts/hello-world # the script directory

The generated go.mod file looks like this:

module project

go 1.23.2 // sets the go version to use

toolchain go1.23.2 // sets the go toolchain version to use

Create a simple hello-world script in Go:

// in scripts/hello-world/main.go
package main

func main() {
    println("Hello, Go!")
}

Run it from the project root with:

go run ./scripts/hello-world

To install external libraries there is:

go get github.com/stretchr/testify

So each script can live in its own folder; Go only allows a single main package per folder.

I would also recommend the following libraries:

  • conc easy concurrency in go
  • testify easy asserts for testing
  • cobra for more complicated CLI tools.

Go has great tooling

The go command is very good, since most of what you need comes out of the box:

# running scripts
go run script-dir

# building a single binary
go build -o script ./script-dir
go install ./script-dir # installs it on the path

# modules:
go get github.com/stretchr/testify # adds a library dependency
go mod tidy # cleans up the unused dependencies

# testing
go test ./... # runs all the tests (any *_test.go file)

# code generation
go generate ./...

# code formatting
go fmt ./...

# linting
go vet ./...

The only extra tool I recommend is golangci-lint, a package combining many different linting tools.

Go disadvantages

Go has a quirky syntax and a limited feature set. To quote Rob Pike's famous line about Go:

Gofmt’s style is no one’s favorite, yet gofmt is everyone’s favorite.

The main advantage is that, unlike JavaScript, there is no continuous crunch to find the latest tools, features and libraries. Upgrading through multiple major Go versions is trivial, since the Go team promises exceptional backwards compatibility.

Error handling is very verbose, but it is a double-edged sword: there are no unexpected exceptions, yet you will have to write this over and over:

data, err := os.ReadFile("/tmp/data")
if err != nil {
  return err
}

But if you can step over these issues, you will have a good time writing Go.

in my wildest dreams:

In an ideal world we would script in something like the Scala CLI, but with support for building native binaries.

Here we can declare dependencies with special include directives. Later, when we want to make it a real program, we can move all the dependencies into a single dependency file and continue building on the script:

// import a library with version:
//> using lib "com.lihaoyi::pprint::0.6.6"
import pprint

object Maps {
  def main(args: Array[String]): Unit = {
    // pretty printing the map type
    println("Maps in Scala have the shape " + pprint.tprint[Map[_,_]]) // prints: "Maps in Scala have the shape Map[_, _]"
  }
}

The issue with Scala is that it doesn't have a mature native-compilation ecosystem.

Ideally a script would be a single file package with all the information in it to run it.

If only Go had a REPL, it might become trivial to build this and end up with the best of both worlds. Could you imagine this, Bobby?

# without the whole mod files etc:
go install my-script.go

my-script say "hello"

# or 
go repl
> name := "Bobby"
> fmt.Printf("hello world: %s", name)

No answer. Where is Bobby? He is sleeping; it's late, and sadly it's time to go home.

Conclusion: Reach for other tools

Bash has been the default scripting language for decades, but it's no longer suited to the complicated needs of modern development. Although Python and Go are more verbose than Bash, their maintainability is light-years ahead.

Ideally the scripts of the future would be a single file containing all the dependencies they need. Only a single tool would have to be installed on the machine, allowing you to run any script you want. Even the tests would be included in the file, making it really easy to maintain and test.

So whenever you have a Bash script that is growing beyond 20 lines, consider switching over to a modern language like Python, Go or JavaScript.

Inspiration to write this:

Thank you to the many other bloggers and Stack Overflow snippets that inspired me to write this post.