Deobfuscate your bash

Who hasn’t written some read-only magical Bash voodoo, only to find you need to decrypt your own creation later on? Luckily, Explain Shell can help with that. Here’s an example from my .bash_history file:


for fn in *; do echo cat $fn | sed "s|' '$URL'||g" | sed "s|curl -X POST -d '||g" ; done

And Explain Shell’s explanation: it’s no substitute for knowing Bash, but it sure helps.

Bonus: while reading my bash history file, I realized I accidentally copy&paste a lot of code to my terminals. There are way too many “template <FOO>” entries in there…

Bonus II: It’s a good thing they wrote “shell” in a different color. I was wondering why I had “explains hell” in my bookmarks.


Quickly sharing files in Linux via HTTP

Isn’t it awful when you have to share a file too big for email and don’t know how? You’d think by 2016 we’d have that figured out. Actually we do, many times over. Just pick a standard that works for you!

If you don’t want to read many pages on file transfer standards (Samba? What’s that?) you can try this little snippet:

python -m SimpleHTTPServer $PORT

This will start an HTTP server sharing the current directory. HTTP, luckily, is one of those things that tend to work everywhere, always.
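If you’re on Python 3 (likely these days), the module was renamed; a one-line equivalent:

```shell
# Python 3 renamed SimpleHTTPServer to http.server; the port defaults
# to 8000 if you leave it off.
python3 -m http.server 8000
```

Same deal: current directory, served over HTTP, Ctrl+C to stop.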

Bonus: some other ways of doing the same thing at https://gist.github.com/willurd/5720255


Public service announcement: searching your terminal’s output

Short tip today, but a life-changing one: you don’t need to copy&paste your terminal’s scrollback to search it, you can do it in place. At least in Terminator that’s possible (and I hear it’s also doable in Gnome’s default terminal application). Just press Ctrl+Shift+F. No more copying and pasting to vim!


Bash tip: idiom to get the first error code from a pipe

When writing a bash script, often you’ll end up with something like this:

real_command | filter_stuff | prettify | do_something_else

The problem arises when you try to figure out if your command succeeded or not. If you `echo $?` you’ll get the exit code of the last command in the pipe. You don’t really care about the exit code of do_something_else, do you?

I haven’t found a solution I really like to this problem, but this idiom is handy:

out=`real_command` && echo "$out" | filter_stuff | prettify | do_something_else

echo $?

Now $? will hold the value of real_command, and you can actually use it to diagnose the real problem.
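If you’re on bash specifically, there are also two builtin answers worth knowing: the PIPESTATUS array keeps the exit code of every stage of the last pipeline, and `set -o pipefail` makes the pipeline’s own exit code reflect a failure in any stage. A small sketch:

```shell
# PIPESTATUS holds the exit code of each stage of the last pipeline,
# so you can check the first command directly:
false | true
echo "first stage exited with ${PIPESTATUS[0]}"

# pipefail makes $? non-zero if any stage of the pipe failed:
set -o pipefail
false | true
echo "pipeline exit code: $?"
```

Note that PIPESTATUS is overwritten by the next command, so grab the value you need right away.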


Globbing in bash

There is a pretty common and unnecessary pattern in bash scripts: whenever you need to loop through a list of file names in a path, you might be tempted to write something like this.


for fname in $(ls | grep foo); do echo $fname; done

You can save some typing by using bash-globbing:


for fname in *foo*; do echo "$fname"; done

Not only is the script cleaner and faster: bash will take care of properly expanding the file names, so you won’t have to worry about things like file names with spaces. This should be portable to other shells too.

Want to know more about bash globbing? Check out http://www.linuxjournal.com/content/bash-extended-globbing
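For the curious, here’s a small sketch of extended globbing, which has to be switched on with shopt first (file pattern made up for the example):

```shell
# Extended globbing is off by default; enable it first:
shopt -s extglob

# !(pattern) matches everything EXCEPT the pattern, e.g. every file
# that does not end in .bak:
for fname in !(*.bak); do echo "$fname"; done
```

Other extglob operators include ?(…), *(…), +(…) and @(…) for zero-or-one, zero-or-more, one-or-more and exactly-one matches.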


Bash traps: almost like RAII for bash

Everywhere, but especially in bash, cleaning up is annoying and error-prone. Resource leaks can be common if your bash script is interrupted half-way. Do you need to execute something always, even if your script fails or gets killed? Try using traps:

#!/bin/bash

foobar() {
    echo "See ya!"
}

trap "foobar" EXIT

It doesn’t matter how you end this script, “foobar” will always be executed. Want to read more about bash traps? Check http://linuxcommand.org/wss0160.php
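A classic use of this pattern is releasing a temp file, RAII-style; a minimal sketch:

```shell
#!/bin/bash

# Acquire the resource, then immediately register its cleanup; the
# single quotes delay expansion of $tmp until the trap actually fires.
tmp=$(mktemp)
trap 'rm -f "$tmp"' EXIT

echo "working in $tmp"
# ...even if a command below fails, or the script is interrupted,
# the EXIT trap removes $tmp on the way out.
```

Registering the trap right after creating the resource, before any code that might fail, is what makes it feel like RAII.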


Force a program to output to stdout

Silly but handy CLI trick on Linux: some programs don’t have an option to output to stdout. Gcc comes to mind. In that case the symlink /dev/stdout comes in handy: for each process it points to that process’s own stdout.

With this trick you could, for example, run “gcc -S foo.cpp -o /dev/stdout”, to get the assembly listing for foo.cpp.

You probably shouldn’t use this trick on anything other than CLI scripting stuff (keep in mind /dev/stdout might be closed or not accessible for some processes).
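If you don’t have gcc handy, you can see the trick with any file-writing tool; cp, for instance, has no “print to stdout” flag (the file path here is just an example):

```shell
# cp can only write to a file, but /dev/stdout is a perfectly good target:
echo "hello" > /tmp/example.txt
cp /tmp/example.txt /dev/stdout
```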


A random slideshow in Ubuntu

The other day I wanted to use my TV for a slideshow of my travel pictures. Something simple, just select a folder and have a program like Shotwell create a slideshow with a random order on my TV. Of course, Ubuntu and double screens equals fail. For some reason all the programs I tried either were incapable of using the TV as the slideshow screen (even after cloning screens… now that’s a wtf) or were not able to recursively use all the pictures in a folder.

feh to the rescue. It’s not pretty, but feh seems to be exactly what I was looking for. It’s a CLI application for Linux and after some RTFM I came up with this script:

feh ~/Pictures \
     --scale-down \
     --geometry 1920x760 \
     --slideshow-delay 9 \
     --recursive \
     --randomize \
     --auto-zoom \
     --draw-filename \
     --image-bg black 

You can probably figure out by yourself what each option means. If not, just man feh.


Counting lines per second with bash

The other day I wanted to quickly monitor the status of a production nginx after applying some iptables rules and changing some VPN stuff. It’s easy to know if you completely screwed up the server: the number of requests per second will drop to zero, all requests will have an HTTP status other than 200, or some other dramatic and easy-to-measure side effect will show up.

What happens if you broke something in a slightly more subtle way? Say, you screwed up something in ipsec (now, I wonder how that can happen…) and now networking is slow. Or iptables now enforces some kind of throttling in a way you didn’t expect. To detect this type of error I wrote a quick bash script to output how many lines per second are added to a file. This way I was able to monitor that the throughput of my nginx install didn’t decrease after my config changes, without installing a full-fledged solution like Zabbix.

I didn’t find anything like this readily available, so I’m posting it here in case someone else finds it useful.

#!/bin/bash

# Time between checks
T=5

# argv[1] will be the file to check
LOG_FILE=$1

while true; do
    tmp=`mktemp`
    # tail a file into a temp. -n0 means don't output anything at the start so
    # we can sleep $T seconds and we don't need to worry about previous entries
    tail -n0 -f "$LOG_FILE" > "$tmp" 2>/dev/null & sleep $T;
    kill $! > /dev/null 2>&1;
    echo "Requests in $LOG_FILE in the last $T seconds: `wc -l < "$tmp"`";
    rm "$tmp";
done

Bash scripting and getopts

Did you ever write a bash script and thought it looked too clean? Yeah, me neither. Anyway, now you can make it look even worse by using getopts. As an upside, you’ll be able to read command line options from a user without having to resort to nasty hacks, like hardcoding the switch position into the argv.

getopts is a bash builtin, so there’s nothing to install; its external cousin, getopt, should also be available by default in most Linux distros as a standalone command line program. The builtin is quite easy to use in a bash script. For example, something like:

while getopts "bar" opt; do
    case "$opt" in
        b) echo "Option b is set"
           ;;
        a) echo "Option a is set"
           ;;
        r) echo "Option r is set"
           ;;
    esac
done

It won’t look pretty but it does get the job done. Note that the getopts builtin only handles short options; according to “man getopt”, the external getopt supports long options too. If you need something more complex than that, you should probably be using a proper language instead of a bash script.
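One thing the snippet above doesn’t show: a colon after a letter in the optstring marks an option that takes an argument, which getopts leaves in $OPTARG. A small sketch (flag names made up for the example):

```shell
#!/bin/bash

# "f:" means -f takes an argument; plain "v" is a boolean flag.
while getopts "f:v" opt; do
    case "$opt" in
        f) echo "file: $OPTARG" ;;
        v) echo "verbose on" ;;
    esac
done

# Drop the parsed options so $@ holds only the positional arguments.
shift $((OPTIND - 1))
echo "remaining args: $*"
```

Running it as `./script.sh -v -f notes.txt rest` prints the flags in the order given, then “remaining args: rest”.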