I was studying and awk came up.
Spent about an hour on it and I see some useful commands that go beyond what “cut” can do. But really, once you’re dealing with printf() format statements, is anyone using awk scripts for this?
Or is everyone just using their familiar scripting language? I’d reach for Python for the problems being presented as good fits for awk.
I used awk to migrate users from one system to another. I created template scripts for setting up a user in the new system, dumped the data from the old system, then used awk to process the dump and create a script for each user in the new system. That was a fun project.
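Roughly the shape of it, though the dump layout here is purely hypothetical (colon-delimited, username in field 1, full name in field 5):

awk -F: '{ printf "useradd -m -c \"%s\" %s\n", $5, $1 }' old_system_dump.txt > create_users.sh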
I use awk all the time, nothing too fancy, but when you need to pull out elements of text it’s usually way easier than using cut.
awk '{ print $3 }' will pull the third element based on your FS variable (field separator, default is whitespace)
awk '{ print $NF }' gets you the last element, and awk '{ print $(NF-1) }' gets you the next-to-last, and so on.
Basic usage but so fast and easy for so many everyday command line things.
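For example, just echoing a throwaway line to show what comes out:

echo "alpha beta gamma" | awk '{ print $NF }'       # prints: gamma
echo "alpha beta gamma" | awk '{ print $(NF-1) }'   # prints: beta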
You can also add to the output. I use it frequently to pull a list of files, etc., from another file, and then do something like generate another script from that output. This is a weak example, but one I can think of off the top of my head. Not firing up my work laptop to search for better examples until after the holidays. LOL.
awk '{ print "ls -l " $1 }'
And then send that to a file that I can then execute.
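So the whole thing ends up looking something like this (filelist.txt is just a placeholder):

awk '{ print "ls -l " $1 }' filelist.txt > run_ls.sh
sh run_ls.sh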
All the time. Not always by choice!
A lot of my work involves writing scripts for systems I do not control, using as light a touch as is realistically possible. I know for a fact Python is NOT installed on many of my targets, and it doesn’t make sense to push out a whole Python environment of my own for something as trivial as string manipulation.
awk is super powerful, but IMHO not powerful enough to justify its complexity, relative to other languages. If you have the freedom to use Python, then I suggest using that for anything advanced. Python skills will serve you better in a wider variety of use cases.
No, I don’t need it because of jc and nushell
The best use-case for awk is that you can avoid using grep just to pick the Nth word of a specific line. I tend to ask GPT4 to write the one-liner for me. Works great.
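e.g. the 3rd word on line 5 (numbers and file name picked arbitrarily):

awk 'NR == 5 { print $3 }' somefile.txt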
Awk has the advantage over Perl/Python/etc. that it’s standardized by POSIX. Therefore you can rely on it on all operating systems. It’s pretty much the only advanced scripting language specified by POSIX – the alternative would be some heavy shell scripting or almost-unreadable sed.
Therefore you can rely on it on all operating systems.
… all except that one OS which we don’t like to talk about but annoyingly remains the most popular consumer OS. :P
Android?
awk can often be found in my scripts.
I don’t tend to use awk in scripts, since I write those in Python, but I do use awk on an almost daily basis in one-liners.
Probably the most common thing for me is reading a config file without the annoying comments and blank lines.
grep -v "^#" krb5.conf | awk NF
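awk can also handle both parts on its own if you want to skip the grep (same effect: drop comment lines and blank lines):

awk 'NF && !/^#/' krb5.conf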
I use awk constantly at work. Very useful in my opinion, and really powerful if you dig into it.
Perl kinda killed awk and sed.
Then python kinda killed perl.
Yes, but for a very specific case. I used to write highly portable scripts that could be executed in different environments (various Linux distros, including minimal containers, FreeBSD and even Solaris 10). I couldn’t use bash, perl, python or even gawk. Only POSIX shell (I always tested my scripts with dash and ksh93, and for Solaris 10 compatibility with its jsh), portable awk (tested with original-awk, gawk and mawk) and portable sed (better forget it if you need to support Solaris).
Before that I didn’t understand why I should need awk if I know perl. And awk really sucks. Once I had to replace a perl one-liner with an awk script of ~30 lines for portability.
P.S. I never use awk just for print $1, as many do. It’s overkill. cut is better for that use-case IMO. Awk is good for when cut won’t cut it.
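For example, both of these print the login names from /etc/passwd, and cut is the simpler tool there; but once fields are separated by runs of whitespace, cut falls over and awk doesn’t:

cut -d: -f1 /etc/passwd
awk -F: '{ print $1 }' /etc/passwd
ps aux | awk '{ print $2 }'    # cut -d' ' chokes on the repeated spaces here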
Nearly every day. There was a time when I’d reach for Ruby, but in the end, the stability, ubiquity, and portability of the traditional Unix tools - among which awk is counted - turned out to be more useful. I mainly underuse its power, though; it serves as a column aggregator or re-arranger, for the most part.
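Typical examples of that kind of use (column numbers and file name made up):

awk '{ sum += $3 } END { print sum }' data.txt    # total up column 3
awk '{ print $2, $1 }' data.txt                   # re-arrange columns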
No, but I heavily use perl still… I feel like you can’t really call yourself a Linux person without knowing perl and python both. Knowing awk can’t hurt though.
Really? I disliked Perl for 3 decades on unix and Linux and I’ve never felt like I have been held back by not knowing or using it. I don’t remember the last time I saw a Perl script, let alone needed to understand one.
I use sed a lot
Yes, for things too complex to do in sed but not complex enough to need a “normal” programming language like python.
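A typical example of that middle ground for me is counting occurrences per key, which sed can’t really do and which hardly justifies Python (file and field are placeholders):

awk '{ count[$1]++ } END { for (k in count) print k, count[k] }' access.log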