I say "in general" because I ran into a specific situation. For a plot-generating process, we 1) pointed to a version of the plot-maker that didn't use libcairo, so it couldn't run without X11, and 2) accidentally removed lots of entries from the completed file, where we list the plots we've already made so we don't redo them, which led to 3) remaking lots of plots that already existed with a plot-maker that couldn't make plots, meaning files that existed got replaced with zero-sized files.

Once we figured out what the problem was, I wrote this one-liner, which ran for over an hour and 40 minutes:
for i in {1..100} ; do ( ls -lS */nanodrop/*png | perl -lane ' print $F[8] if 0 == $F[4] ' | wc -l ); sleep 60 ; done

Basically, it loops over and over again to keep constant watch (the for loop), each time taking an ls -l (the -S, ordering by size, isn't strictly necessary, but it was left over from my initial checking), breaking each line apart with Perl, and printing the file name ($F[8]) if the file size ($F[4]) is 0. I pipe the output into wc -l, which counts the number of lines, then sleep for a minute and do it again. (Since the number of zero-length files was over 7,000 at first run, the ls -l took quite some time, so the reports were actually much more than a minute apart.)
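Unrolled across lines with comments, the same loop reads like this. (The perl flags do the heavy lifting: -l handles line endings, -a autosplits each line into @F, -n loops over input, -e supplies the code.)

for i in {1..100} ; do
    # long listing of the plot PNGs, sorted by size (-S not strictly needed)
    ls -lS */nanodrop/*png |
        # print the file name if the size column is zero
        perl -lane 'print $F[8] if 0 == $F[4]' |
        # count how many zero-length files remain
        wc -l
    # wait a minute before the next report
    sleep 60
done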
Granted, I could've done it all in Perl, but why re-implement tools that exist in the shell when you don't have to? I could've done the Perl stuff in Awk, if only I knew Awk better, but with tools I already knew I could get to the point where I knew the damage and the rate of repair much faster. Really, the only new thing I had to learn was that perl -lane would break the lines apart for me.
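For the record, the awk version would have been about as short. Awk numbers fields from 1 rather than 0, so Perl's $F[4] and $F[8] become awk's $5 and $9 (an untested sketch, assuming the same ls -l column layout):

ls -lS */nanodrop/*png | awk '$5 == 0 { print $9 }' | wc -l

And since parsing ls output is always a little fragile, something like find */nanodrop -name '*png' -size 0 | wc -l would presumably have done the counting without Perl or awk at all, though for a quick one-off the ls pipeline served fine.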