What do you advise for shell usage?
- Do you use bash? If not, which one do you use? zsh, fish? Why do you do it?
- Do you write #!/bin/bash or #!/bin/sh? Do you write fish exclusive scripts?
- Do you have two folders, one for proven commands and one for experimental?
- Do you publish/ share those commands?
- Do you sync the folder between your server and your workstation?
- What should’ve people told you what to do/ use?
- good practice?
- general advice?
- is it bad practice to create a handful of commands like podup and poddown that replace podman compose up -d and podman compose down, or podlog as podman logs -f --tail 20 $1, or podenter for podman exec -it "$1" /bin/sh?
Background
I started bookmarking every somewhat useful website. Whenever I search for something a second time, it'll pop up as the first search result. I often search for the same Linux commands as well. When I moved to atomic Fedora, I had to search for rpm-ostree (a horrible command for me, as a new user, to remember) or sudo ostree admin pin 0. Usually, I bookmark the website and can get back to it. One day, I started putting everything into a .bashrc file. Sooner rather than later I discovered that I could simply add ~/bin to my $PATH variable and put many useful scripts or commands into it.
For the most part I simply used bash. I knew that you could somehow extend it but I never did. Recently, I switched to fish because it has tab completion. It is awesome and I should’ve had completion years ago. This is a game changer for me.
I hated that bash would write the whole path and I was annoyed by it. I added PS1="$ " to my ~/.bashrc file. When I need to know the path, I simply type pwd. Recently, I found starship, which has themes and adds another line just for the path. It colorizes the output and highlights whenever I'm in a toolbox/distrobox. It is awesome.
- Use shellcheck
- Use set -x for debugging
- Use #!/usr/bin/env bash
A folder dotfiles as a git repository and a dotfiles/install script that soft-links all configurations into their places. Two files, ~/.zshrc (without secrets, could be shared) and another for secrets (sourced by .zshrc if the secrets file exists).
dotfiles
Thanks! I'll check them out. I knew the concept existed but so far I didn't dig deep into managing them. This is my start I guess: https://wiki.archlinux.org/title/Dotfiles
This is the way!
why?
#!/usr/bin/env will look in PATH for bash, and bash is not always in /bin, particularly on non-Linux systems. For example, on OpenBSD it's in /usr/local/bin, as it's an optional package. If you are sure bash is in /bin and this won't change, there's no harm in putting it directly in your shebang.
Because bash isn't always in /usr/bin/bash. On macOS the version at /usr/bin/bash is very old (bash 3 I think?), so many users install a newer version with homebrew, which ends up in PATH, which /usr/bin/env looks at.
Protip: I start every bash script with the following two lines:
#!/usr/bin/env bash
set -euo pipefail
set -e makes the script exit if any command (that's not part of things like if-statements) exits with a non-zero exit code.
set -u makes the script exit when it tries to use undefined variables.
set -o pipefail will make the exit code of a pipeline be the rightmost non-zero exit status of the pipeline, instead of always the exit status of the rightmost command.
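The pipefail behavior is easy to see with a throwaway demo (nothing here beyond standard bash builtins):

```shell
#!/usr/bin/env bash
# Without pipefail, a pipeline reports the status of its LAST command,
# so the failing 'false' at the start of the pipe goes unnoticed.
false | true
echo "default: $?"    # prints: default: 0

set -o pipefail
false | true
echo "pipefail: $?"   # prints: pipefail: 1
```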
Nice, thx!
/bin/sh is always /bin/sh.
I use bash and I usually put /bin/bash in my scripts, because that's where I know it works. /bin/sh is only if it works on many/all shells.
I don’t have many such scripts, so I just have one. I don’t really share them, as they are made for my usecase. If I do create something that I think will help others, then yes, I share them in git somewhere.
I do have a scripts folder in my Nextcloud that I sync around with useful scripts.
Some of your examples can probably just be made into aliases with alias alias_name="command_to_run".
thx! Why do you think that aliases are better for it?
I moved away from aliases because I have a neat command management where each command is one script.
I can’t speak for anyone else, but for me, it’s just one file to backup to keep all your custom commands (.bashrc) while it would be many files if you have a script for each.
I can’t see the benefit of having a script for just one command (with arguments) unless those arguments contain variables.
I use Bash for scripts, though my interactive shell is Fish.
Usually I use /usr/bin/env bash as shebang. This has the advantage of searching your PATH for Bash instead of hardcoding it.
My folders are only differentiated by those in my PATH and those not.
Most of my scripts can be found here. They are purely desktop use, no syncing to any servers. Most would be useless there.
For good practice, I'd recommend using set -euo pipefail to make Bash slightly less insane, and use shellcheck to check for issues.
This is personal preference, but you could avoid Bashisms like [[ and stick to POSIX sh. (Use /usr/bin/env sh then.)
With shortened commands the risk is that you might forget how the full command works. How reliant you want to be on those commands being present is up to you. I wouldn't implement them as scripts though, just simple aliases instead. Scripts only make sense if you want to do something slightly more complex over multiple lines for readability.
#/usr/bin/env bash — typo?
thx for the tips!
I prefer single files over aliases since I can more easily manage each command.
You're right, it's #!
- I use bash, because I never had the time to learn anything else.
- Like @jlsalvador@lemmy.ml said, I use the #!/usr/bin/env bash shebang.
- Nope
- Also nope
- Nope. Shell scripts reside in Git repos on Gitlab/Gitea/Forgejo and are checked out using Ansible playbooks onto the servers as necessary.
- For scripts? Python. Read this blog post by the great @isotopp@chaos.social. For interactive use? bash is just fine for me, though I've customized it using Starship and created some aliases to have colored/pretty output where possible.
- Use shellcheck before running your scripts in production, err on the side of caution, set -o pipefail. There are best practices guides for Bash; use those and you'll probably be fine.
- Be prepared to shave yaks. Take breaks, touch grass, pet a dog. Use set -x inside your Bash script or bash -x scriptname on the CLI for debugging. Remember that you can always fall back to the interactive CLI to test/prepare commands before you put them into your script. Think before you type. Test. Optimize only what needs optimization. Use long options for readability. And remember: Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows your address.
- Nope, it's absolutely not bad practice to create aliases to save you some typing in an interactive shell. You shouldn't use them inside your scripts though, because they might/will not be available in other environments.
"I switched to fish because it has tab completion"
Yeah, so does Bash, just install it.
Oh, I also "curate" a list of Linux tools that I like, that are more modern alternatives to "traditional" Linux tools or that provide information I would otherwise not easily get. I'll post it.
Spoiler
Debian-Packages available
mtr iputils-tracepath iproute2 zsh httpie aria2 icdiff progress diffoscope atop powertop ntopng ethtool nethogs vnstat ss glances discus dstat logwatch swatch multitail lynis ncdu (du-clone), alias du=“ncdu --color dark -rr -x --exclude .git --exclude node_modules” nnn (fully-featured terminal file manager. It’s tiny, nearly 0-config and incredibly fast. https://github.com/jarun/nnn) slurm calcurse newsbeuter tig (“ncurses TUI for git. It’s great for reviewing and staging changes, viewing history and diffs.”) qalc ttyrec taskwarrior ttytter ranger ipcalc pandoc moreutils googler weechat pdftk abcde dtrx tload ttyload cockpit sar ht (hte Hex Editor) dhex ack (grep-clone) silversearcher-ag (grep-clone) ripgrep (“recursively searches file trees for content in files matching a regular expression. It’s extremely fast, and respects ignore files and binary files by default.”, https://github.com/BurntSushi/ripgrep) exa (statt ls) https://the.exa.website/ (“replacement for ls with sensible defaults and added features like a tree view, git integration, and optional icons.”) fzf (CLI fuzzy finder), alias preview=“fzf --preview ‘bat --color "always" {}’” fd (simple, fast and user-friendly alternative to ‘find’, https://github.com/sharkdp/fd) entr (watch-clone) csvkit (awk-clone) ccze (log coloring) surfraw hexyl (“hex viewer that uses Unicode characters and colour”, https://github.com/sharkdp/hexyl) jq (“awk for JSON. 
It lets you transform and extract information from JSON documents”, https://stedolan.github.io/jq/) pass (“password manager that uses GPG to store the passwords”, https://github.com/lunaryorn/mdcat) restic (“backup tool that performs client side encryption, de-duplication and supports a variety of local and remote storage backends.”, https://restic.net/) mdp (Markdown Presentation on CLI) grepcidr qrencode caca-utils (show images on the CLI) fbi ( & fbgs) (show images in Framebuffer device) fbcat (take screnshot on framebuffer device) nmap micro (CLI Text Editor, ab Debian 11, https://micro-editor.github.io) masscan (https://github.com/robertdavidgraham/masscan) socat (Nachfolger von netcat, https://www.heise.de/select/ix/2017/11/1509815804306324) dc3dd (patched version of GNU dd with added features for computer forensics) smem (memory reporting tool) free (Show Linux server memory usage) mpstat (Monitor multiprocessor usage on Linux, part of sysstat package) pmap (Montor process memory usage on Linux, part of the procps) monit (Process supervision) oping & noping saidar (Curses-basiertes Programm für die Anzeige von Live-Systemstatistiken) reptyr (Tool for moving running programs between ptys) gron (https://github.com/tomnomnom/gron, makes JSON greppable, kann HTTP-Requests absetzen) jc (https://github.com/kellyjonbrazil/jc, CLI tool and python library that converts the output of popular command-line tools and file-types to JSON or Dictionaries. This allows piping of output to tools like jq and simplifying automation scripts.) bat (cat-clone), alias cat=‘bat’ (“alternative to the common (mis)use of cat to print a file to the terminal. It supports syntax highlighting and git integration.”, https://github.com/sharkdp/bat) ioping (https://github.com/koct9i/ioping, simple disk I/0 latency measuring tool, auch für disk seek rate/iops/avg) vd (Visidata, multipurpose terminal utility for exploring, cleaning, restructuring and analysing tabular data. 
Current supported sources are TSV, CSV, fixed-width text, JSON, SQLite, HTTP, HTML, .xls, and .xlsx) pdfgrep duf https://github.com/muesli/duf (combined df and du, ncurses-based) nala (apt-alternate, https://gitlab.com/volian/nala, https://christitus.com/stop-using-apt/) iprange tldr rmlint nvtop (https://github.com/Syllo/nvtop, GPUs process monitoring for AMD, Intel and NVIDIA) lf (lf (as in “list files”) is a terminal file manager written in Go with a heavy inspiration from ranger file manager)
**no Deb pkg avail**
oh-my-zsh (http://ohmyz.sh) webmin observium cheat (https://github.com/cheat/cheat, create and view interactive cheatsheets on the command-line.) bropages ipbt / its-playback-time todo earthquake suplemon Newsroom unity ired wpe prettyping (ping), alias ping=‘prettyping --nolegend’ diff-so-fancy (diff-clone) q (query CSV Files with SQL) https://harelba.github.io/q/ gping (ping with a graph in CLI) http-prompt (install via pip) alt (“finding the alternate to a file. E.g. the header for an implementation or the test for an implementation. I use it paired with Neovim”, https://github.com/uptech/alt) chars (“shows information about Unicode characters matching a search term.”, https://github.com/antifuchs/chars) dot (“dotfiles manager. It maintains a set of symlinks according to a mappings file”, https://github.com/ubnt-intrepid/dot) dust (“alternative du -sh. It calculates the size of a directory tree, printing a summary of the largest items.”, https://github.com/bootandy/dust) eva (“command line calculator similar to bc, with syntax highlighting and persistent history.”, https://github.com/NerdyPepper/eva) hyperfine (“command line benchmarking tool. It allows you to benchmark commands with warmup and statistical analysis.”, https://github.com/sharkdp/hyperfine) mdcat (“renders Markdown files in the terminal”, https://github.com/lunaryorn/mdcat) podman (“alternative to Docker that does not require a daemon. Containers are run as the user running Podman so files written into the host don’t end up owned by root. The CLI is largely compatible with the docker CLI.”, https://podman.io/) skim (“fuzzy finder. It can be used to fuzzy match input fed to it. 
I use it with Neovim and zsh for fuzzy matching file names.”) z (“tracks your most used directories and allows you to jump to them with a partial name.”, https://github.com/rupa/z) alias wetter_graph=‘finger dresden@graph.no’ alias wetter_color=‘curl wttr.in’ alias maps_cli=‘telnet mapscii.me’ https://github.com/say4n/crappybird https://asciicker.com cbonsai https://gitlab.com/jallbrit/cbonsai GNU poke binary editor http://www.jemarch.net/poke / https://git.savannah.gnu.org/cgit/poke.git gdu GoDiskUsage https://github.com/dundee/gdu Cirrus CLI https://github.com/cirruslabs/cirrus- tuxi https://github.com/Bugswriter/tuxi personal CLI assistant ngrep https://github.com/jpr5/ngrep topgrade https://github.com/r-darwish/topgrade ndiff https://nmap.org/ndiff/ compare nmap scans natlas https://github.com/natlas/natlas sift https://sift-tool.org grep-alternative xplr https://github.com/sayanarijit/xplr (hackable, minimal, fast TUI file explorer, stealing ideas from nnn and fzf) croc https://github.com/schollz/croc (allows any two computers to simply and securely transfer files and folders, great for forensics) slidev https://sli.dev (HTML5 presentations) lfs https://github.com/Canop/lfs (df alternative) vtop (https://github.com/MrRio/vtop) gtop (https://github.com/aksakalli/gtop) up (Ultimate Plumber https://github.com/akavel/up) ttyd (https://github.com/tsl0922/ttyd, Share your terminal over the web) nms (no more secrets, https://github.com/bartobri/no-more-secrets, A command line tool that recreates the famous data decryption effect seen in the 1992 movie Sneakers.) xsv (https://github.com/BurntSushi/xsv, A fast CSV command line toolkit written in Rust.) fx (https://github.com/antonmedv/fx, Terminal JSON viewer) ccat (https://github.com/owenthereal/ccat, colorized cat mit Syntax Highlighting) elta (https://github.com/dandavison/delta, A syntax-highlighting pager for git, diff, and grep output. 
CAUTION: under Debian Bullseye a package of a different software with the same name is available!) dyff (https://github.com/homeport/dyff, /ˈdʏf/ - diff tool for YAML files, and sometimes JSON) skim (https://github.com/lotabout/skim, Fuzzy finder in Rust) choose (https://github.com/theryangeary/choose, A human-friendly and fast alternative to cut and (sometimes) awk) sd (https://github.com/chmln/sd, like sed, intuitive find & replace CLI, with regex) map (https://github.com/soveran/map)
___
Rest of the list:
Tools pt. 2
- skim (https://github.com/lotabout/skim, Fuzzy finder in Rust)
- choose (https://github.com/theryangeary/choose, A human-friendly and fast alternative to cut and (sometimes) awk)
- sd (https://github.com/chmln/sd, like sed; intuitive find & replace CLI, with regex)
- map (https://github.com/soveran/map, Map lines from stdin to commands, a comfortable variant of xargs with simpler syntax and a smaller feature set)
- crush (https://github.com/liljencrantz/crush, Crush is a command line shell that is also a powerful modern programming language. Supports SQL statements, among other things)
- xxh (https://github.com/xxh/xxh, Bring your favorite shell wherever you go through the ssh.)
- starship (https://starship.rs, customize your shell prompt, with Nerd Font support)
- q (https://github.com/natesales/q, A tiny & colorful command line DNS client with support for UDP, TCP, DoT, DoH, DoQ and ODoH.)
- gping (https://github.com/orf/gping, Ping, but with a graph)
- broot (https://github.com/Canop/broot, A new way to see and navigate directory trees : https://dystroy.org/broot)
- dust (https://github.com/bootandy/dust, intuitive du colored)
- dutree (https://github.com/nachoparker/dutree, a tool to analyze file system usage written in Rust)
- lsd (https://github.com/Peltoche/lsd, next-gen ls)
- mcfly (https://github.com/cantino/mcfly, Fly through your shell history using neural nets)
- procs (https://github.com/dalance/procs, A modern replacement for ps written in Rust, color, human readable, multi-column keyword search)
- bottom (https://github.com/ClementTsang/bottom, top replacement, cross-platform graphical process/system monitor, zoom support)
- btop++ (https://github.com/aristocratos/btop, resource monitor for CPU, RAM, IO, processes, looks fancy!, C++ continuation of bpytop https://github.com/aristocratos/bpytop)
- musikcube (https://github.com/clangen/musikcube, cross-platform, terminal-based music player, audio engine, metadata indexer, and server in C++ with an ncurses TUI, incl. Android app)
- viu (https://github.com/atanunq/viu, Terminal image viewer with native support for iTerm and Kitty, also animated GIFs)
- glow (https://github.com/charmbracelet/glow, Render markdown on the CLI)
- falsisign (https://gitlab.com/edouardklein/falsisign, For bureaucratic reasons, a colleague of mine had to print, sign, scan and send by email a high number of pages. To save trees, ink, time, and to stick it to the bureaucrats, I wrote this script.)
- ponysay (https://github.com/erkin/ponysay, like cowsay, but with colorful ponies)
- sniffnet (https://github.com/GyulyVGC/sniffnet, cross-platform application to monitor your network traffic with ease, Debian packages available from GitHub)
- netop (https://github.com/ZingerLittleBee/netop, monitor network traffic with bpf)
- corefreq (https://github.com/cyring/CoreFreq, CPU monitoring software for 64-bits Processors.)
- ctop (https://github.com/bcicen/ctop, Top-like interface for container metrics)
- dua (https://github.com/Byron/dua-cli, View disk space usage and delete unwanted data, fast.)
- dust (https://github.com/bootandy/dust, A more intuitive version of du in rust)
- helix editor
- lnav (https://github.com/tstack/lnav Log navigator)
- bottom (github.com/ClementTsang/bottom, another cross-platform graphical process/system monitor)
- broot (https://github.com/Canop/broot, a different than ranger/lf approach to navigating folders)
- mdr (https://github.com/michaelmure/mdr, a markdown viewer)
- eza (https://github.com/eza-community/eza, modern ls, with cool features like file icons)
- ouch (https://github.com/ouch-org/ouch, a CLI tool for compressing and decompressing various formats, such as .tar .zip .7z .gz .xz .lzma .bz .bz2 .lz4 .sz .zst .rar)
- spotify-tui (https://github.com/Rigellute/spotify-tui, Spotify CLI frontend (Spotify via terminal))
- toilet (http://caca.zoy.org/wiki/toilet, turn text into ASCII art)
DNS tools:
- viewdns.info
- dnslytics.com
- dnsspy.io
- leafdns.com
- dnsdumpster.com
- intodns.com
- www.zonecut.net/dns
- xip.io
- nip.io
- ptrarchive.com
- www.whatsmydns.net
- ceipam.eu/en/dnslookup.php
- spyse.com/tools/dns-lookup
- www.buddyns.com/delegation-lab
Good stuff for pentesters and security researchers:
- contained.af
- cryptohack.org
- 0x00sec.org
- hack.me
- chall.stypr.com
- crackmes.one
- hackxor.net
- tryhackme.com
- ctftime.org
- ctflearn.com
- picoctf.org
### .bashrc
### CUSTOM FUNCTIONS
# https://www.linuxjournal.com/content/boost-productivity-bash-tips-and-tricks
ftext () {
    grep -iIHrn --color=always "$1" . | less -R -r
}
duplicatefind () {
    find -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | \
        xargs -I{} -n1 find -type f -size {}c -print0 | \
        xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate
}
generateqr () {
    # printf "$@" | curl -F-=\<- qrenco.de
    printf "$@" | qrencode -t UTF8 -o -
}
My brain has too little RAM to process the list of packages 😂 good to know the rest!
Neither does mine, but, I keep it to test a new tool from time to time.
Am I missing something - doesn't bash have tab completion out of the box?
It does. It’s not quite as fancy as the completion in fish/zsh which employ a TUI, but it’s reliable in most situations
hardly
That’s the way I do it:
#!/usr/bin/env nix
#! nix shell nixpkgs#nushell <optionally more dependencies> --command nu
<script content>
But those scripts are only used by me
This is the way
I use bash as my interactive shell. When ~20 years ago or so I encountered “smart” tab completion for the first time, I immediately disabled that and went back to dumb completion, because it caused multi-second freezes when it needed to load stuff from disk. I also saw it refuse to complete filenames because they had the wrong suffix. Maybe I should try to enable that again, see if it works any better now. It probably does go faster now with the SSDs.
I tried OpenBSD at some point, and it came with some version of ksh. Seems about equivalent to bash, but I had to modify some of my .bashrc so it would work on ksh. I would just stick to the default shell, whatever it is, it’s fine.
I try to stick to POSIX shell for scripts. I find that I don't need bashisms very often, and I've used systems without bash on them. Most bash-only syntax has an equivalent that will work on POSIX sh. I do use bash if I really need some bash feature (I recently wanted to set -o pipefail, which dash cannot do apparently, and the workaround is really annoying).
Do not use #!/bin/sh if you're not writing bash-only scripts. This will break on Debian, Ubuntu, BSD, busybox etc. because /bin/sh is not bash on those systems.
Do not use #!/bin/sh if you’re not writing bash-only scripts
Actually, #!/bin/sh is for bourne shell compatible scripts. Bash is a superset of the bourne shell, so anything that works in bourne should work in bash as well as in other bourne compatible shells, but not vice versa. Bash specific syntax is often referred to as a "bashism", because it's not compatible with other shells. So you should not use bashisms in scripts that start with #!/bin/sh.
The trouble is that it is very common for distros to link /bin/sh to /bin/bash, and it used to be that bash being called as /bin/sh would change its behavior so that bashisms would not work, but this doesn't appear to be the case anymore. The result is that people often write what they think are bourne shell scripts but they unintentionally sneak in bashisms… and then when those supposed "bourne shell" scripts get run on a non-bash bourne compatible shell, they fail.
Oh, I wanted to say, "Do not use #!/bin/sh if you're ~~not~~ writing bash-only scripts". I think I reformulated that sentence and forgot to remove the not. Sorry about the confusion. You're exactly right of course. I have run into scripts that don't work on Debian, because the author used bashisms but still specified /bin/sh as the interpreter.
Oh I wanted to say, "Do not use #!/bin/sh if you're ~~not~~ writing bash-only scripts"
Hah, I was wondering if that was what you actually meant. The double negation made my head spin a bit.
I have run into scripts that don’t work on Debian, because the author used bashisms but still specified /bin/sh as the interpreter.
The weird thing is that man bash still says:
When invoked as sh, bash enters posix mode after the startup files are read.
...
--posix
    Change the behavior of bash where the default operation differs from the POSIX standard to match the standard (posix mode). See SEE ALSO below for a reference to a document that details how posix mode affects bash's behavior.
But if you create a file with a few well known bashisms and a #!/bin/sh shebang, it runs the bashisms just fine.
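This is easy to check from the command line: [[ ]] is a well-known bashism, yet bash in posix mode accepts it, because posix mode tweaks behavior (startup files, builtins) rather than syntax:

```shell
# bash's posix mode does not remove bash-only syntax:
bash --posix -c '[[ hello == h* ]] && echo "bashism ran"'   # prints: bashism ran
# a real POSIX shell like dash would instead fail with something like '[[: not found'
```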
A good idea I have been spreading around to relevant people lately is to use ShellCheck as you code in Bash: integrate it into your workflow, editor or IDE as relevant to you (there's a command-line tool as well as editor integrations in various forms), and pass your scripts through it, trying to get the warnings to go away. That should fix many obvious errors and clean up your code a bit.
- Do you use bash? If not, which one do you use? zsh, fish? Why do you do it?
- Do you write #!/bin/bash or #!/bin/sh? Do you write fish exclusive scripts?
I use bash, and I use #!/bin/bash for my scripts. Some are POSIX compliant, some have bashisms. But I really don't care about bashisms, since I explicitly set bash as the interpreter. So no, no fish exclusive scripts, but some "bash exclusive" scripts. Since fish is aimed towards being used as an interactive shell, I don't see a real reason to use it as an interpreter for scripts anyways.
- Do you have two folders, one for proven commands and one for experimental?
- Do you publish/ share those commands?
- Do you sync the folder between your server and your workstation?
I have my scripts in $HOME/.scripts and softlink them from a directory in $PATH. Some of the scripts are versioned using Git, but the repository is private and I do not plan on sharing them, because the repo and the scripts contain some not-to-share information and mostly are simply not useful outside my carefully crafted and specific environment. If I want to share a script, I do it individually or make a proper public Git repository for it.
Since my server(s) and my workstations have different use cases, I do not share any configuration between them. I share some configuration between different workstations, though. My dotfiles repository is mainly there for me to keep track of changes in my dotfiles.
is it bad practice to create a handful of commands
It becomes bad practice if it is against your personal or corporate guidelines regarding best practices. While it is not particularly bad or insecure, etc. to create bash scripts containing a single command, maybe use an alias instead. Arguments typed after the alias in the shell are simply appended to the command, so an explicit $1 isn't needed:
alias podup="podman compose up -d"
alias poddown="podman compose down"
alias podlog="podman logs -f --tail 20"
Not quite sure about the podman syntax; if podman exec /bin/sh -it "$1" also works, you can use alias podenter="podman exec /bin/sh -it". Otherwise a simple function would do the trick.
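Since an alias can only append what you type to the end of the command, anything that needs the argument in the middle of the command line fits better as a small shell function; a sketch for ~/.bashrc, assuming the podman invocation from the original question:

```shell
# Aliases append trailing arguments, so 'podlog mycontainer' works as-is.
# For an argument in the middle of the command, use a function instead:
podenter() {
    podman exec -it "$1" /bin/sh
}
# usage: podenter mycontainer
```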
, Otherwise a simple function would do the trick.Bash script for simple things (although Fish is my regular shell) and Node or Python scripts for complex things. Using
works just like it would for Bash so you know.
I recommend writing everything in Bourne shell (
/bin/sh
) for a few reasons:- Bash is more capable, which is nice, but if you’re fiddling with complex data structures, you probably should be using a more maintainable language like Python.
- Bash is in most places, but crucially not everywhere. Docker-based deployments for example often use Ash which is very similar to Bash, but lacks support for arrays and a few other things.
- Bourne’s limitations force you to rethink your choices regularly. If you find yourself hacking around a lack of associative arrays for example, it’s probably time to switch to a proper language.
Also, two bits of advice.
- Use shellcheck. There’s a website that’ll check your script for you as well as a bunch of editor extensions that’ll do it in real time. You will absolutely write better, safer code with it.
- If your script exceeds 300 lines, stop and rewrite it in a proper language. Your future self will thank you.
Do you use bash?
Personally I use Bash for scripting. It strikes the balance of being available on almost any system, while also being a bit more featureful than POSIX. For interactive use I bounce between bash and zsh depending on which machine I’m on.
Do you write #!/bin/bash or #!/bin/sh?
I start my shell scripts with #!/usr/bin/env bash. This is the best way of ensuring that the same bash interpreter is called that the user expects (even if more than one is present or if it is in an unusual location).
Do you have two folders, one for proven commands and one for experimental?
By commands, do you mean bash scripts? If so, I put the ones I have made relatively bulletproof in ~/bin/, as bash usually puts them on the path automatically with this particular folder name. If I'm working on a script and I don't think it's ready for that, or if it goes with a specific project/workflow, I will move it there.
Do you sync the folder between your server and your workstation?
No. I work on lots of servers, so for me it’s far more important to know the vanilla commands and tools rather than expect my home-made stuff to follow me everywhere.
good practice? general advice?
Pick a bash style guide and follow it. If a line is longer than 80 characters, find a better way of writing that logic. If your script file is longer than 200 lines, switch to a proper programming language like Python. Unless a variable is meant to interact with something outside of your script, don’t name it an all caps name.
is it bad practice to create a handful of commands like podup and poddown that replace podman compose up -d and podman compose down, or podlog as podman logs -f --tail 20 $1, or podenter for podman exec -it "$1" /bin/sh?
This is a job for bash aliases.
Good advice. I’ll add that any time you have to parse command line arguments with any real complexity you should probably be using Python or something. I’ve seen bash scripts where 200+ lines are dedicated to just reading parameters. It’s too much effort and too error prone.
It depends. Parsing commands can be done in a very lightweight way if you follow the bash philosophy of positional/readline programming rather than object oriented programming. Basically, think of each line of input (including the command line) as a list data structure of space-separated values, since that’s the underlying philosophy of all POSIX shells.
Bash is basically a text-oriented language rather than an object-oriented language. All data structures are actually strings. This is aligned with the UNIX philosophy of using textual byte streams as the standard interface between programs. You can do a surprising amount in pure bash once you appreciate and internalize this.
My preferred approach for CLI flag parsing is to use a case-esac switch block inside a while loop where each flag is a case, and then within the block for each case, you use the shift builtin to consume the args like a queue. Again, it works well enough if you want a little bit of CLI in your script, but if it grows too large you should probably migrate to a general purpose language.
Hoho, now do that in POSIX shell.
I had a rude awakening the day I tried it, but my scripts are bulletproof now (I think) so I don’t mind at this point
Imma be real, I never remember which parts of bash aren’t POSIX. Luckily it doesn’t matter in my line of work, but it’s good to be aware of if you have a job that often has you in machines running other types of UNIX.
Arguments don't work the same way, and POSIX doesn't have the concept of arrays outside of "$@"
Here’s a simple example of what I mean:
#! /usr/bin/env bash
while [[ -n $1 ]]; do
    case $1 in
        -a) echo "flag A is set" ;;
        -b|--bee) echo "flag B is set" ;;
    esac
    shift
done
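For what it's worth, the same loop needs only small changes to run under POSIX sh: test the argument count with `$#` instead of `[[ -n $1 ]]`, and use single brackets:

```shell
#!/bin/sh
# POSIX-compatible variant of the flag-parsing loop above
while [ "$#" -gt 0 ]; do
    case "$1" in
        -a) echo "flag A is set" ;;
        -b|--bee) echo "flag B is set" ;;
    esac
    shift
done
```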
I use bash for scripts almost exclusively even though i use zsh interactively (startup scripts for zsh are an obvious exception).
The vast majority of my scripts start with
set -e -u
which makes the script exit if a command (that is not in a few special places like an if) exits with an error status code and also complains about unbound variables when you use them.
Use bash -n and shellcheck to test your script for errors and problems before you run it.
Always use curly braces for variables to avoid issues with strings after the variable name being interpreted as part of the variable name.
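For example, without braces bash reads the longest possible variable name:

```shell
name="backup"
echo "$name_old"     # prints an empty line: bash expands the (unset) variable 'name_old'
echo "${name}_old"   # prints: backup_old
```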
Always use 10# before numbers in $(()) expressions to avoid leading zeroes turning your decimal number variables into octal ones.
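Without the base prefix, a zero-padded value like a month or day blows up halfway through the year:

```shell
month="09"
# echo $(( month + 1 ))     # error: '09' is read as octal, and 9 is not a valid octal digit
echo $(( 10#$month + 1 ))   # prints: 10
```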
Always use
while read -r foo; do
    ...
done < <(command ...)
instead of
command ... | while read -r foo; do
    ...
done
to avoid creating a subshell where some changes you make will not affect your script outside the loop.
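The difference is easy to demonstrate with a counter (a toy example using only bash builtins and seq):

```shell
#!/usr/bin/env bash
count=0
seq 3 | while read -r _; do
    count=$((count + 1))    # runs in a subshell created by the pipe
done
echo "$count"               # prints: 0

count=0
while read -r _; do
    count=$((count + 1))    # runs in the main shell
done < <(seq 3)
echo "$count"               # prints: 3
```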
In
while read -r foo; do
    ...
done < ...
loops, always make sure commands inside the loop read their stdin from /dev/null or otherwise close it with suitable parameters, or the content of your loop will eat some of the lines you meant for the read. Alternatively, fill a bash array in the loop and then use a for loop to call your commands and do more complex logic.
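Here `cat` stands in for any stdin-reading command inside the loop (ssh, ffmpeg, and the like) in a toy demonstration:

```shell
#!/usr/bin/env bash
# The inner 'cat' swallows the rest of the input: only 'got a' is printed.
printf 'a\nb\nc\n' | while read -r line; do
    echo "got $line"
    cat >/dev/null
done

# With its stdin redirected from /dev/null, all three lines are printed.
printf 'a\nb\nc\n' | while read -r line; do
    echo "got $line"
    cat >/dev/null </dev/null
done
```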
When using temporary directories or similar resources use
cleanup() {
    ...
}
trap cleanup EXIT
handlers to clean up after the script in case it dies or is killed (by SIGTERM or SIGINT,…; obviously not SIGKILL).
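A minimal skeleton of that pattern with a temporary directory (the directory comes from mktemp; everything else is placeholder):

```shell
#!/usr/bin/env bash
set -euo pipefail

tmpdir=$(mktemp -d)

cleanup() {
    rm -rf "$tmpdir"
}
# EXIT fires on normal exit, on 'set -e' failures, and after SIGINT/SIGTERM
trap cleanup EXIT

echo "working in $tmpdir"
# ... do the actual work here ...
```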
When writing scripts for cronjobs, take into account that the environment (PATH in particular) might be more limited. Also take into account that stderr output and a non-zero exit status can lead to an email about the cronjob.
Use pushd and popd instead of cd (especially cd ..), and redirect their output to /dev/null. This will prevent your scripts from accidentally running later parts of the script in the wrong directory.
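A short sketch of the pattern (/tmp stands in for any working directory):

```shell
#!/usr/bin/env bash
pushd /tmp >/dev/null   # remember where we came from
# ... work that assumes it is running in /tmp ...
popd >/dev/null         # reliably land back in the original directory
```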
There are probably many other things to consider but that is just standard stuff off the top of my head.
If you do need any sort of data structure and in particular arrays of data structures use a proper programming language. I would recommend Rust since a compiled language is much easier to run on a variety of systems than the Python so many others here recommend, especially if you need to support the oldest supported version of an OS and the newest one at the same time.
Great list! I would add “always surround variables with quotes in case the value contains spaces”.
Good point, forgot one of the basics.
Also, to make your scripts more readable and less error prone use something like
if [[ $# -gt 0 ]] && [[ "$1" == "--dry-run" ]]; then
    dry_run=1
    shift
else
    dry_run=0
fi
if [[ $# != 3 ]]; then
    echo "Usage: $0 [ --dry-run ] <description of foo> <description of bar> <description of baz>" >&2
    exit 1
fi
foo="$1"
shift
bar="$1"
shift
baz="$1"
shift
at the start of your script to name your parameters and provide usage information if the parameters did not match what you expected. The shift and use of $1 at the bottom allows for easy addition and removal of parameters anywhere without renumbering the variables.
Obviously this is only for the 90% of scripts that do not have overly complex parameter needs. For those you probably want to use something like getopt or another language with libraries like the excellent clap crate in Rust.
Thank you very much!