
Stick up your best shell scripts


Comments

  • Closed Accounts Posts: 580 ✭✭✭karlr42


    I present a little app I came up with while messing around with conky. As you may know, that involves editing a file, running conky, killing it from the gnome-system-monitor, then editing again... It gets tedious. So, I wrote this Perl script that acts like the "kill" command, but based on process name, not PID.
    #!/usr/bin/perl
    #Kill the process named as the first argument.
    my $doomed = $ARGV[0]; #get the name of the process
    open (FILE, ">processes.txt") or die($!);
    print (FILE `ps -e`); #get list of processes
    close FILE;
    open (FILE, "<processes.txt") or die($!);
    foreach(<FILE>){
        if (/$doomed/){ #find line with the selected process
            /(.*?)\s/; #get the PID
            print"Killing $doomed with a PID ".
                "of $1..\n";
            $pid = $1; #the PID 
            @args= ("kill","$pid");
            $status = system(@args);#try to kill and capture
                        #exit status
            if ($status eq "0"){
                print "Killed.\n";
                exit;#end the program
            }else{
                die("Unable to kill $doomed.\n");
            }
        }  
    }
    #only way to get here is to not have died- meaning the if statement
    #was never entered, hence:
    print "No process named $doomed found currently running.\n";
    
    And some sample output:
    karlreid@karlreid-laptop:~$ killthis conky
    Killing conky with a PID of 24584..
    Killed.
    karlreid@karlreid-laptop:~$ 
    
    It can kill root processes too (obviously, be careful :)):
    karlreid@karlreid-laptop:~$ sudo conky
    [sudo] password for karlreid: 
    Conky: /home/karlreid/.conkyrc: 7: config file error
    Conky: use_spacer should have an argument of left, right, or none.  'yes' seems to be some form of 'true', so defaulting to right.
    Conky: diskio device '19' does not exist
    Conky: forked to background, pid is 24749
    karlreid@karlreid-laptop:~$ 
    Conky: desktop window (10000b4) is subwindow of root window (13b)
    Conky: window type - normal
    Conky: drawing to created window (3200001)
    Conky: drawing to double buffer
    
    karlreid@karlreid-laptop:~$ killthis conky
    Killing conky with a PID of 24749..
    Unable to kill conky.
    karlreid@karlreid-laptop:~$ sudo ~/bin/killthis conky
    Killing conky with a PID of 24749..
    Conky: received SIGINT or SIGTERM to terminate. bye!
    Killed.
    karlreid@karlreid-laptop:~$ 
    
    
    karlreid@karlreid-laptop:~$ killthis nonexistentprocess
    No process named nonexistentprocess found currently running.
    karlreid@karlreid-laptop:~$ 
    


  • Registered Users, Registered Users 2 Posts: 1,038 ✭✭✭rob1891


    Good exercise in script writing, but I think you should know about a command named "pkill", sometimes (on BSD) "killall".... :(
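    For reference, a quick sketch of pkill in action (pkill/pgrep ship with procps on Linux; -f matches against the full command line):

```shell
# Kill a process by name - sends SIGTERM by default, like plain "kill"
sleep 9731 &                 # throwaway process to demonstrate on
pkill -f 'sleep 9731'        # -f matches against the full command line
```

    pgrep with the same pattern shows whether anything matching is still running.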


  • Closed Accounts Posts: 580 ✭✭✭karlr42


    :hmm, killall conky ... :O:
    :Hangs head in abject embarrassment and shame::o
    Ah well, it was fun!


  • Closed Accounts Posts: 21 tenox


    Generates a random password / passphrase:
    #!/bin/bash
    M="0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"
    L="8"
    
    while [ $L -gt 0 ]
    do
            PASS="$PASS${M:$(($RANDOM%${#M})):1}"
            let L-=1
    done
    
    echo "$PASS"
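    Worth noting: $RANDOM only spans 0-32767, so the modulo above slightly favours the earlier characters of $M. A sketch of an alternative that draws straight from /dev/urandom instead (same 62-character set, length 8):

```shell
# Draw 8 alphanumeric characters directly from /dev/urandom
PASS=$(tr -dc '0-9A-Za-z' < /dev/urandom | head -c 8)
echo "$PASS"
```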
    


  • Registered Users, Registered Users 2 Posts: 1,129 ✭✭✭pljudge321


    Two scripts I cut in half and then joined to improve on the original:

    It'll rip the audio from a YouTube URL and leave it on your Desktop as an MP3.

    Depends on youtube-dl and ffmpeg
    #!/bin/bash
    cd ~/
    read -p "YouTube url? " ur
    read -p "Name? One word only!! " nv
    echo;echo;
    youtube-dl -o "${nv}" "${ur}"
    ffmpeg -i "${nv}" -acodec libmp3lame -ab 160k -ac 2 -ar 44100 "${nv}.mp3"
    mv "${nv}.mp3" ~/Desktop/"${nv}.mp3"
    rm "${nv}"
    echo;echo;
    echo "Your new MP3 file is saved on your Desktop."
    read


  • Registered Users, Registered Users 2 Posts: 545 ✭✭✭ravydavygravy


    This goes out to a bunch of websites and downloads the webcomics.
    Run it once a day at 10am to get them delivered to your inbox....
    #!/bin/sh
    
    MAILTO="your.email@address.here"
    
    ########################################################################################
    ## Setup Proxy - if you need one, otherwise comment this line out
    ########################################################################################
    export http_proxy='http://1.2.3.4:8080'
    
    ########################################################################################
    ## Penny Arcade - Nerd/Gamer Humour
    ########################################################################################
    URLBASE="http://www.penny-arcade.com"
    wget -O page.html ${URLBASE}/comic/
    IMGURL=`grep simplebody page.html | awk -F'"' '{print $4}'`
    wget -O pennyarcade.jpg ${URLBASE}/${IMGURL}
    
    ########################################################################################
    ## Dilbert - IT Work Humour
    ########################################################################################
    URLBASE="http://www.dilbert.com"
    wget -O page.html ${URLBASE}
    IMGURL=`grep -A1 STR_Content page.html | tail -1 | awk -F'"' '{print $4}'`
    wget -O dilbert.gif ${URLBASE}/${IMGURL}
    
    ########################################################################################
    ## XKCD - Geek Humour
    ########################################################################################
    URLBASE="http://xkcd.com"
    wget -O page.html ${URLBASE}
    IMGURL=`grep -A7 rnd_btn_t page.html | tail -1 | awk -F'"' '{print $2}'`
    wget -O xkcd.png ${IMGURL}
    
    ########################################################################################
    ## Dinosaur Comics - Bizarre Humour
    ########################################################################################
    URLBASE="http://www.qwantz.com"
    wget -O page.html ${URLBASE}
    IMGURL=`grep 'qwantz.com/comics/' page.html | grep -v facebook | awk -F'"' '{print $2}'`
    wget -O dc.png ${IMGURL}
    
    ########################################################################################
    ## Redmeat - Bizarre Humour
    ########################################################################################
    URLBASE="http://www.redmeat.com/redmeat/current"
    wget -O page.html ${URLBASE}/index.html
    IMGURL=`grep -A1 'weeklyStrip' page.html | tail -1 | awk -F'"' '{print $2}'`
    wget -O redmeat.gif ${URLBASE}/${IMGURL}
    
    ########################################################################################
    ##  Least I could do....
    ########################################################################################
    URLBASE="http://www.leasticoulddo.com"
    wget -O page.html ${URLBASE}/
    IMGURL=`grep -A1 'comic' page.html | grep 'archive.leasticoulddo.com' | tail -1 | awk -F'"' '{print $2}'`
    wget -O licd.gif ${IMGURL}
    
    ########################################################################################
    ##  Garfield Minus Garfield
    ########################################################################################
    URLBASE="http://garfieldminusgarfield.net/"
    wget -O page.html ${URLBASE}/
    IMGURL=`grep "media.tumblr" page.html | awk -F'"' '{print $4}' | head -1`
    wget -O garfieldminusgarfield.png ${IMGURL}
    
    ########################################################################################
    ##  Bigger than Cheeses
    ########################################################################################
    URLBASE="http://www.biggercheese.com/"
    wget -O page.html ${URLBASE}/
    IMGURL=`cat page.html | grep -A1 OnlineComics.net | grep img | awk -F'"' '{print $4}'`
    wget -O btc.png ${URLBASE}/${IMGURL}
    
    ########################################################################################
    ##  Cyanide and Happiness
    ########################################################################################
    URLBASE="http://www.explosm.net/comics/"
    wget -O page.html ${URLBASE}/
    IMGURL=`cat page.html | grep Archive | awk -F'"' '{print $32}'`
    wget -O cah.png ${IMGURL}
    
    ########################################################################################
    ##  PvP
    ########################################################################################
    URLBASE="http://www.pvponline.com/"
    wget -O page.html ${URLBASE}/
    IMGURL=`grep 'pvponline.com/comics' page.html | awk -F'"' '{print $2}'`
    wget -O pvp.gif ${IMGURL}
    
    ########################################################################################
    ## Email the comics
    ########################################################################################
    
    echo "Hullo,"                                                               > body.txt
    echo                                                                       >> body.txt
    echo "Please find attached your daily dose of comic humour"                >> body.txt
    echo                                                                       >> body.txt
    echo "Todays batch includes:"                                              >> body.txt
    echo                                                                       >> body.txt
    echo " * Penny Arcade"                                                     >> body.txt
    echo " * Dilbert"                                                          >> body.txt
    echo " * Dinosaur Comics"                                                  >> body.txt
    echo " * XKCD"                                                             >> body.txt
    echo " * Redmeat"                                                          >> body.txt
    echo " * Least I could Do"                                                 >> body.txt
    echo " * Garfield minus Garfield"                                          >> body.txt
    echo " * Bigger than Cheeses"                                              >> body.txt
    echo " * Cyanide and Happiness"                                            >> body.txt
    echo " * PvP (Player vs Player)"                                           >> body.txt
    echo                                                                       >> body.txt
    echo "If you have a favourite comic you'd like added, let Dave know..."    >> body.txt
    echo                                                                       >> body.txt
    echo "Regards"                                                             >> body.txt
    echo "Daves Comic Trawling Web Monkey"                                     >> body.txt
    
    cat body.txt | mutt -s "Comics for today - `date -I`" -a pennyarcade.jpg -a dilbert.gif -a xkcd.png -a dc.png -a redmeat.gif -a licd.gif -a garfieldminusgarfield.png -a btc.png -a cah.png -a pvp.gif $MAILTO
    
    ########################################################################################
    ## Clean up
    ########################################################################################
    
    rm -f body.txt page.html *.jpg *.png *.gif
    

    Should work on any Linux box with mutt and wget installed :-)


  • Registered Users, Registered Users 2 Posts: 1,129 ✭✭✭pljudge321


    Here's one in a similar vein to the one I posted above, but this one I wrote myself.

    It works by exploiting the way Firefox stores Flash files in the /tmp/ folder.

    To use it, open the YouTube, Dailymotion, etc. page you want to acquire the song from (assuming it's openly licensed, of course), wait until the video loads fully, and then simply run the script. This will work with multiple tabs as well.
    #!/bin/sh

    find /tmp -maxdepth 1 -name "Flash*" -type f -prune -exec ffmpeg -i '{}' -acodec libmp3lame -ab 160k -ac 2 -ar 44100 '{}.mp3' \;
    find /tmp -maxdepth 1 -name "*.mp3" -prune -type f -exec mv '{}' ~/Desktop/ \;


  • Registered Users, Registered Users 2 Posts: 740 ✭✭✭z0oT


    Here's one that'll replace the spaces in filenames with underscores (it also lowercases the names). It's handy if you want to loop a program through a directory when it doesn't like spaces in filenames.
    if [ -n "$1" ]
    then
      if [ -d "$1" ] 
      then
        cd "$1"
      else
        echo invalid directory
        exit
      fi
    fi
    
    for i in *
    do
      OLDNAME="$i"
      NEWNAME=`echo "$i" | tr ' ' '_' | tr A-Z a-z | sed s/_-_/-/g`
      if [ "$NEWNAME" != "$OLDNAME" ]
      then
        TMPNAME="$i"_TMP 
        echo ""
        mv -v -- "$OLDNAME" "$TMPNAME"
        mv -v -- "$TMPNAME" "$NEWNAME"
      fi
      if [ -d "$NEWNAME" ] 
      then
        echo Recursing lowercase for directory "$NEWNAME"
        $0 "$NEWNAME"
      fi
    done
    


  • Registered Users, Registered Users 2 Posts: 545 ✭✭✭ravydavygravy


    z0oT wrote: »
    Here's one that'll replace the spaces in filenames with an underscore. It's handy if you want to loop one program through a directory that doesn't like spaces in filenames.

    <snip>

    Decent script, but slightly redundant:

    Usage: rename <thing-to-match> <thing-to-replace-it-with> <filelist>

    So, for example:
    shell$ rename ' ' '_' *
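    One caveat: rename comes in two flavours - the util-linux version shown above, and the Perl version shipped on Debian/Ubuntu, where the equivalent is rename 's/ /_/g' *. A pure-shell fallback that works everywhere (a sketch):

```shell
# Replace spaces with underscores in every filename in the current directory
for f in *' '*; do
    [ -e "$f" ] || continue                       # skip if nothing matched the glob
    mv -- "$f" "$(printf '%s' "$f" | tr ' ' '_')"
done
```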


  • Registered Users, Registered Users 2 Posts: 3,745 ✭✭✭Eliot Rosewater


    This is really the only proper shell script I ever scripted. It worked in Ubuntu 9.04 for me. Apologies if it's crap, and if you've any criticism hit me with it :)

    This script takes a C++ source file, compiles it, and then runs it. It works under a strictish naming convention. The source file is $1.cpp and then the output is $1.

    Any further parameters will be passed onto the program. Example:
    $ ./cr.sh helloWorld "param 1" "param 2"
    This will compile ./helloWorld.cpp to ./helloWorld using g++, then run
    $ ./helloWorld "param 1" "param 2"

    If the source code is older than the compiled code, that is to say the source hasn't been updated since the last compile, then it skips compiling and just runs the compiled file. Handy for testing through the terminal (which is how I'm learning C++).
    #!/bin/sh
    
    #Script to compile and run C++ source file
    # ./cr.sh [source filename without .cpp extension] [arguments to be passed to compiled program]
    
    #Check if file exists
    if [ ! -e "$1.cpp" ]
    then
    	echo "Error: $1.cpp not a file!"
    	exit 1
    fi
    
    #Does program file exist? If not compile.
    if [ ! -e "$1" ]
    then
    	g++ ./$1.cpp -o $1
    else
    	#Has source been modified since last compile? If so compile
    	if test $(stat --format=%Y $1) -lt $(stat --format=%Y $1.cpp)
    	then
    		g++ ./$1.cpp -o $1
    	fi
    fi
    
    #Grab arguments to pass to C++ program
    TEMP=`echo $* | awk '{print substr( $0, index($0," ")+1, length($0)-index($0," ")+1 ) }'`
    
    
    if [ -n "$2" ]
    then
    	./$1 $TEMP
    else
    	./$1
    fi
    

    Any comments such as better error handling will be really appreciated :)
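    One small simplification for the timestamp check: test has a -nt (newer-than) operator, so the two stat calls can go. A sketch, using hypothetical hello.cpp/hello names:

```shell
# Recompile only when no binary exists, or the source is newer than the binary
src="hello.cpp"              # hypothetical source file name
bin="hello"
if [ ! -e "$bin" ] || [ "$src" -nt "$bin" ]; then
    echo "compiling $src"    # stand-in for: g++ "$src" -o "$bin"
fi
```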


  • Registered Users, Registered Users 2 Posts: 545 ✭✭✭ravydavygravy


    Cheating slightly - it's Perl, not bash.... but maybe someone will find it useful. In the sysadmin line of work, you spend a lot of time watching logs - so this script colourises the output to make it easy to notice errors, exceptions, whatever...

    You could use it like so: tail -f /var/log/mylogfile | color.pl
    #!/usr/bin/perl
    
    use Term::ANSIColor qw(:constants);
    use Switch;
    
    while( defined ($line = <STDIN>) )
    {
        switch ($line)
        {
            ## Obviously, add your own regular expressions and colors here.... 
            case /Exception/           { print YELLOW, ON_RED,    "$line", RESET; }
            case /[Ee]rror/            { print YELLOW, ON_BLACK,  "$line", RESET; }
            case /Success/             { print GREEN, ON_BLACK,   "$line", RESET; }
        else                       { print WHITE, ON_BLACK,   "$line", RESET; }
        }
    }
    

    If you need something more configurable, try http://kassiopeia.juls.savba.sk/~garabik/software/grc.html


  • Registered Users, Registered Users 2 Posts: 1,038 ✭✭✭rob1891


    betafrog wrote: »
    Don't forget the ubiquitous
    :(){ :|:&};:
    

    Copy and paste into a terminal and watch the ensuing fun :p
    It is like that big red button that says do not press ;-)
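    For anyone wondering what that line actually does: it defines a function named ":" whose body pipes a call to itself into another call to itself and backgrounds the result, then invokes it - so the process count doubles until the machine chokes. The same thing with a readable name (defining it is harmless; calling it is not, hence the commented-out invocation):

```shell
# The classic fork bomb, with ':' renamed to 'bomb' for readability
bomb() { bomb | bomb & }
# bomb    # <- do NOT uncomment unless you want to test your ulimits
```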

    Doesn't end badly though. For the umpteenth time I was left with a load of Vim .swp files to go through, so I finally got around to writing a little script to check them automatically ...
    #!/bin/bash
    
    TMPDIR=/tmp
    
    for i in .*.swp
    do
        FILE=`echo $i | sed -r 's/^\.(.+)\.swp/\1/'` #strip to get original name
    
        vi -e -s -r $FILE -c "wq! $TMPDIR/$FILE.recovered" #write the recovered file
    
    
        diff -u $FILE $TMPDIR/$FILE.recovered
    
        if [ $? -eq 0 ]; then
            rm $TMPDIR/$FILE.recovered
            rm $i
        else
            echo "See $TMPDIR/$FILE.recovered and delete $i manually"
        fi
    done
    


  • Registered Users, Registered Users 2 Posts: 545 ✭✭✭ravydavygravy


    Wife about to go into labour? Unsure how to help? Why not write a bash script to time her contractions... (seemed like a good idea at the time...)
    #!/bin/sh
    
    STIME=-1
    COUNTER=0
    
    ## Set these threshold parameters to whatever you want...
    
    C_THRESH=2     ## More than 2 contractions...
    P_THRESH=600   ## less than 10 mins apart...
    D_THRESH=30    ## lasting at least 30 seconds each...
    
    echo
    echo "###################################################################################"
    echo "##              L a b o u r   C o n t r a c t i o n   T i m e r                  ##"
    echo "###################################################################################"
    echo
    
    while [ 1 == 1 ]
    do
        echo -n "####: Press any key to start next contraction..."
        read
    
        ### OK, lets go...
    
        if [ $STIME -eq -1 ]
        then
            OTIME=`date +%s`
        else
            OTIME=$STIME
        fi
    
        PTIME=`date +%H:%M:%S`
        STIME=`date +%s`
    
        echo -n "####: Started - press any key to end..."
        read
    
        ## Work out details
    
        ETIME=`date +%s`
    
        DURATION=`expr $ETIME - $STIME`
        PERIOD=`expr $STIME - $OTIME`
    
        echo "INFO: Contraction Over - started at $PTIME ($PERIOD secs since last time) - lasted for $DURATION seconds"
    
        if [ $PERIOD -lt $P_THRESH -a $DURATION -gt $D_THRESH ]
        then
            COUNTER=`expr $COUNTER + 1`
    
            if [ $COUNTER -gt $C_THRESH ]
            then 
                echo "CRIT: Consistent frequency and duration *strongly* suggest hospital visit...."
            else
                echo "WARN: Frequency and duration suggest possible hospital visit...."
            fi
        else
            COUNTER=0
        fi
    done
    


  • Registered Users, Registered Users 2 Posts: 2,763 ✭✭✭Sheeps


    ravydavygravy wrote: »
    Wife about to go into labour? Unsure how to help? Why not write a bash script to time her contractions... (seemed like a good idea at the time...)

    <snip>

    I will have lost a little bit of faith in humanity and I'll worry for your children if you actually did this.


  • Registered Users, Registered Users 2 Posts: 545 ✭✭✭ravydavygravy


    Sheeps wrote: »
    I will have lost a little bit of faith in humanity and I'll worry for your children if you actually did this.

    I worry for my children too, but hey, what you gonna do.... You can't pick your dad.

    (Baby finally arrived today, after ~28 hours of irregular pre-hospital contractions, so the script worked out being very useful - given that wife was not very diligent at writing times down, but would happily press enter at the start and end of the contractions, until we left for hospital...)


  • Registered Users, Registered Users 2 Posts: 10,407 ✭✭✭✭justsomebloke


    ravydavygravy wrote: »
    Wife about to go into labour? Unsure how to help? Why not write a bash script to time her contractions... (seemed like a good idea at the time...)

    I think it's time to remove you as a friend on facebook. oh and congrats:D


  • Registered Users, Registered Users 2 Posts: 545 ✭✭✭ravydavygravy


    To get us back OT - a short and sweet bash webserver....
    while [ true ]; do echo "SERV: The time is `date`" | nc -l 8080; done
    

    Connect to http://127.0.0.1:8080/ to check the output...


  • Moderators, Science, Health & Environment Moderators, Society & Culture Moderators Posts: 3,372 Mod ✭✭✭✭andrew


    Woah, I got linked here from Reddit!


  • Registered Users, Registered Users 2 Posts: 86 ✭✭RobbieM


    Hi, if this is the wrong forum pls move me!
    Normally I can figure stuff out, but with this I'm stuck. Maybe someone here has seen this before:
    set x ="basename path/file.txt .txt" ; echo $x
    and I want the answer to be "file", not "basename path/file.txt .txt".
    I am new to tcsh. Can't find the answer anywhere...


  • Registered Users, Registered Users 2 Posts: 1,038 ✭✭✭rob1891


    you need to use backticks and not quotation marks:
    set x =`basename path/file.txt .txt` ; echo $x
    

    Backticks do what you are intending, the line inside the ticks is executed and the output sent to your variable x.

    Rob
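    The same distinction holds in POSIX sh, for comparison (a sketch):

```shell
# Backticks (or the equivalent $(...) form) run the command and capture its
# output; double quotes just store the literal string
x=`basename path/file.txt .txt`
y="basename path/file.txt .txt"
echo "$x"    # prints: file
echo "$y"    # prints the literal text of the command
```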


  • Registered Users, Registered Users 2 Posts: 86 ✭✭RobbieM


    rob1891 wrote: »
    you need to use backticks and not quotation marks:
    set x =`basename path/file.txt .txt` ; echo $x
    

    Backticks do what you are intending, the line inside the ticks is executed and the output sent to your variable x.

    Rob

    Jeeperss!! Thanks, I love u!! Everybody, give him some thanks!! backticks! brilliant! I knew there must be a way! Brilliant!


  • Registered Users, Registered Users 2 Posts: 590 ✭✭✭bman


    RobbieM wrote: »
    Jeeperss!! Thanks, I love u!! Everybody, give him some thanks!! backticks! brilliant! I knew there must be a way! Brilliant!

    Well you can't say he's not appreciative. Makes up for all the times you supply an answer and receive no thanks :D .


  • Registered Users, Registered Users 2 Posts: 740 ✭✭✭z0oT


    Small snippet I added to root's bashrc to account for Debian not automounting NTFS hard drives properly.
    function ntfsmount {
    
    # Names Given
    name1=portable
    name2=1tb
    
    # UUIDs
    UUID1=DE44D1ED44D1C883
    UUID2=3736C195022556C6
    
    if [ "$1" = "$name1" ]; then
        umount -l /dev/disk/by-uuid/$UUID1
    
        if [ "$2" = "force" ]; then
            mount -t ntfs-3g -o force,defaults,locale=en_IE.UTF-8 /dev/disk/by-uuid/$UUID1 /media/Portable
        else
            mount -t ntfs-3g -o defaults,locale=en_IE.UTF-8 /dev/disk/by-uuid/$UUID1 /media/Portable
        fi
    
    elif [ "$1" = "$name2" ]; then
        umount -l /dev/disk/by-uuid/$UUID2
    
        if [ "$2" = "force" ]; then
            mount -t ntfs-3g -o force,defaults,locale=en_IE.UTF-8 /dev/disk/by-uuid/$UUID2 /media/Ext_HD
        else
            mount -t ntfs-3g -o defaults,locale=en_IE.UTF-8 /dev/disk/by-uuid/$UUID2 /media/Ext_HD
        fi
    else
        echo "Hard Drive UUID Not Listed"
    fi
    }
    


  • Registered Users, Registered Users 2 Posts: 740 ✭✭✭z0oT


    A very simple one I put together solely to download the videos from the Speed Demos Archive easily ( http://speeddemosarchive.com/ ) - one command per set of videos per game:
    ./<script name> <url of 1st video> <number of videos> <file name prefix>
    #!/bin/bash
    
    length=${#1}
    url=$( echo $1 | cut -c1-$(expr $length - 6))
    extension=$( echo $1 | cut -c$(expr $length - 3)-$length )
    
    for i in $(seq 1 1 9)
    do
    wget -nc -O $3'_'$i$extension $url'0'$i$extension
    done
    
    for i in $(seq 10 1 $2)
    do
    wget -nc -O $3'_'$i$extension $url$i$extension
    done
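    The cut/expr arithmetic above can also be done with bash parameter expansion; a sketch, with a made-up URL:

```shell
# Split a URL like .../run01.mp4 into base and extension without cut or expr
first="http://example.com/run01.mp4"   # hypothetical first-video URL
extension=".${first##*.}"              # everything from the last dot: .mp4
url="${first%??????}"                  # strip the trailing 2 digits + 4-char extension
echo "$url"
echo "$extension"
```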
    


  • Registered Users, Registered Users 2 Posts: 740 ✭✭✭z0oT


    Also, here's one to sort out broken downloads from Rapidshare that can happen when using Flashget (under Wine). In many cases if a download fails, for instance if I was to run out of trafficshare etc., a ~30K file is left behind which is no good to anyone. What this does is check for files of the "broken" size and regenerate the file containing the links to the files that need to be downloaded again.
    Mind you, I could use wget to download using a cookie instead of Flashget in the first place, but it lacks some handy features Flashget has. :pac:
    #!/bin/bash
    
    original_file=links.lst
    broken_file_size=34K
    broken_file_list=BrokenFiles.txt
    new_links_file=NewLinksFile.lst
    
    for f in *.rar
    do
    filesize=$(ls -lah "$f" | awk '{print $5}')
    if [ "$filesize" = "$broken_file_size" ]; then
    echo $f >> $broken_file_list
    fi
    done
    
    while [ 1 ]
    do
    read filename || break
    link=$(grep $filename $original_file)
    echo $link >> $new_links_file
    done < $broken_file_list
    
    unix2dos -u $new_links_file
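    The size scan can also be done in a single find call, though find's -size rounds to whole blocks, so it isn't byte-for-byte identical to comparing ls -lah's human-readable column; a sketch:

```shell
# List .rar files in the current directory whose size rounds to exactly
# 34 one-kilobyte blocks (find's -size 34k)
find . -maxdepth 1 -name '*.rar' -size 34k
```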
    


  • Closed Accounts Posts: 1,504 ✭✭✭barura


    Thanks man! That's going to be really handy! :D

    /goes back to lurking


  • Registered Users, Registered Users 2 Posts: 1,889 ✭✭✭evercloserunion


    #!/bin/bash
    
    echo "Hello, world!"
    


  • Closed Accounts Posts: 13,404 ✭✭✭✭sKeith


    Below is a script I use for backing up the OS part of file server Linux boxes.
    My main data is stored under /mnt, which is excluded from this backup.
    #!/bin/bash
    
    # Script to backup OS to cd.
    
    # Ensure user is Super User
    
    if [ $(id -u) = 0 ];
    then echo Script to Backup image to cd
    else echo You need to be superuser to use this script
    exit
    fi
    echo ensuring you have the most up-to-date tools
    
    # I've commented out the following line because most people have good settings already
    #route add default gw 192.168.1.1 #(your routers ip address)
    
    apt-get update
    apt-get install mkisofs
    apt-get install cdrecord
    clear
    echo Collecting data...
    
    cd /
    tar cpzf /tmp/mybackup.tgz / --exclude=/export --exclude=/proc --exclude=/lost+found --exclude=/mnt --exclude=/sys  --exclude=/tmp --t
    ls /tmp/mybackup.tgz -sh
    eject /dev/cdrom
    read -n 1 -p "Place Blank CD onto CD tray then press any key."
    eject -t /dev/cdrom
    echo .
    echo making cd image
    mkisofs -o /tmp/mybackup.iso /tmp/mybackup.tgz
    echo burning cd image
    cdrecord dev=/dev/cdrom /tmp/mybackup.iso
    eject /dev/cdrom
    echo Finished
    


    here is a dos2unix script
    #!/bin/sh
    
    if [ -z "$2" ];
            then echo "Usage: dos2unix infile outfile"
    else
            awk '{ sub("\r$", ""); print }' $1 > $2
    fi
    
    and the reverse
    #!/bin/sh
    
    if [ -z "$2" ];
            then echo "Usage: unix2dos infile outfile"
    else
            awk 'sub("$", "\r")' $1 > $2
    fi
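    The same conversions are often written as one-liners with tr and sed; a sketch (GNU sed understands \r - on other seds you'd need a literal carriage return):

```shell
# Make a sample CRLF file, then convert it both ways
printf 'one\r\ntwo\r\n' > dos.txt
tr -d '\r' < dos.txt > unix.txt          # dos2unix: delete carriage returns
sed 's/$/\r/' < unix.txt > dos2.txt      # unix2dos: append a CR to each line (GNU sed)
```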
    


  • Registered Users, Registered Users 2 Posts: 1,183 ✭✭✭dioltas


    Here's a little script I wrote for picking 6 random lottery numbers between 1 and 49.

    #!/bin/bash
        i=0
    	while [ $i -lt 6 ];	do
    			num=$(( ((`od -An -N2 -t u2 /dev/random`)%(49)) +1   ))
             
            instr=`echo " $stra " | grep -c " $num "`
        
                if [ $instr -eq 0 ]
                    then
                    stra=`echo $stra $num`
                    let i=$i+1
                fi
    
    		done  
    
        echo $stra
    

    Don't think it's a very efficient way of checking for duplicates but it'll do for the purpose!
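    For what it's worth, GNU coreutils' shuf does the draw-without-duplicates in one go (GNU-specific, so a sketch rather than a drop-in):

```shell
# Six distinct numbers in 1..49, one per line; sort is just for readability
shuf -i 1-49 -n 6 | sort -n
```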


  • Registered Users, Registered Users 2 Posts: 425 ✭✭Mathiasb


    dioltas wrote: »
    Here's a little script I wrote for picking 6 random lottery numbers between 1 and 49.

    <snip>
    Don't think it's a very efficient way of checking for duplicates but it'll do for the purpose!

    Fixed:
    #!/bin/bash
    i=0
    while [ $i -lt 6 ]; do
        num=$(( ((`od -An -N2 -t u2 /dev/random`)%(49)) +1   ))
        instr=`echo " $stra " | grep -c " $num "`
        if [ $instr -eq 0 ]; then
            stra=`echo $stra $num`
            i=$(($i+1))
        fi
    done  
    echo $stra
    



  • Registered Users, Registered Users 2 Posts: 1,889 ✭✭✭evercloserunion


    Here is a simple script I wrote that backs up my home directory to my external hard drive, if it is connected and turned on, and keeps track of when the last backup was performed.
    #!/bin/bash
    
    EXT_HDD_PATH=/media/IOMEGA_HDD # path to your external storage device
    BACKUP_DIR=$EXT_HDD_PATH/home_backup/ # name of directory on storage device that you want to backup to
TIME_FILE=$HOME/bin/last_backup.txt # name/location of file to keep track of when last backup was performed.
    
    if [[ -e $EXT_HDD_PATH ]]
    then
    
        if [[ ! -e $BACKUP_DIR ]]
        then mkdir $BACKUP_DIR
        fi
    
        rsync -qa $HOME $BACKUP_DIR
        
        date +%s > $TIME_FILE
    
    fi
    

    Then you can put this line in your .conkyrc file:
    ${color grey}Days since backup:$color ${exec echo $((($(date +%s)-$(cat ~/bin/last_backup.txt))/86400))}
    
    Replacing the path with the path to your TIME_FILE (and the color settings with your own, of course), and conky will tell you how many days since you last backed up.

I wrote it with the intention of crontabbing it to run once per day. When I run it from the command line, it appears as though some attributes (like symlinks in certain hidden files) are not properly transferred due to permissions errors. Presumably running the operation as root would take care of that, but if you do, remember to change $HOME to the actual path of your home directory, otherwise it will back up /root/. Personally I'm not bothered running it as root in cron because it looks as though what is not properly transferred is insignificant.

    It's a simple script, but if anyone has any comments or suggestions I'd like to hear them as I'm a bit of a noob when it comes to scripting.
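For the daily cron run mentioned above, a root crontab entry might look like this (a sketch; the script path below is hypothetical):

```shell
# root crontab entry: run the backup script every day at 03:00
# (sketch; /home/user/bin/backup.sh is a hypothetical path)
0 3 * * * /home/user/bin/backup.sh
```

Add it with sudo crontab -e so the job runs as root and can read everything under $HOME.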


  • Registered Users, Registered Users 2 Posts: 85 ✭✭rfrederick


    Out in my neck of the woods (western Kansas) monitoring watches and warnings from the National Weather Service is rather important, especially during the spring, which is the height of tornado season. Thus I have a bit of perl called as a Nagios check that pulls the latest weather alert information from Weather Underground's XML-API for the specified ZIP (postal) code and returns an appropriate alert code for Nagios.
    #!/usr/bin/perl -w
    
    use strict;
    use URI::Escape;
    use XML::Simple;
    
    my $location = $ARGV[0];
    my $alert_output_short;
    my $alert_output_long;
    
    if (not defined $location or $location !~ /^[0-9]{5}$/) {
    	print "Usage: $0 [ZIP Code]\n";
    	exit(3);
    }
    
    $location = uri_escape($location);
    
    my $alert_raw = `wget --quiet -O - http://api.wunderground.com/auto/wui/geo/AlertsXML/index.xml?query=$location`;
    if (not defined $alert_raw) {
    	print "Error grabbing alert information.\n";
    	exit(3);
    }
    
    my $xml = new XML::Simple;
    my $alert_data = $xml->XMLin($alert_raw, forcearray => ['AlertItem']);
    
    if ($alert_data->{alert}->{count} eq 0) {
    	print "No current watches, warnings, or advisories.\n";
    	exit (0);
    }
    
    my $exit_code = 1;
    foreach my $alert (@{$alert_data->{alert}->{AlertItem}}) {
    	if ($alert->{significance} eq "W") {
    		$exit_code = 2;
    	}
    	if (defined $alert_output_short) {
    		$alert_output_short .= "; $alert->{description} In Effect Until $alert->{expires}->{content}";
    	}
    	else {
    		$alert_output_short = "$alert->{description} In Effect Until $alert->{expires}->{content}";
    	}
    	$alert_output_long .= $alert->{message};
    }
    
    print "$alert_output_short$alert_output_long";
    
    exit($exit_code);
    

    Notifications from this script are relayed to me by Nagios via email, IM, and SMS. The script proved itself useful last spring when I received word of a tornado warning around five minutes before the city's sirens were activated for two confirmed tornadoes approaching the city. :cool:


  • Registered Users, Registered Users 2 Posts: 1,889 ✭✭✭evercloserunion


    Send text messages from the command line. A simple bash script which relies on http://cabbagetexter.com, an excellent service maintained by boardsie Sam Vimes. It's a bit crude, with no input validation and little polish, but it gets the job done. I was going to write it in Python but Python cannot work through the TCD authenticated proxy, while wget can.
    #!/bin/bash
    
    conf_dir="$HOME/.msg"
    if [[ ! -d $conf_dir ]]; then
        mkdir $conf_dir
    fi
    num_file="$conf_dir/numbers"
    login_file="$conf_dir/login"
    login=$(cat $login_file)
    unum=$(echo $login | awk -F ',' '{print $1}')
    pin=$(echo $login | awk -F ',' '{print $2}')
    serv=$(echo $login | awk -F ',' '{print $3}')
    
    function add_number {
        name=$1
        num=$2
        echo "$1,$2" >> $num_file
        echo "Number added."
    }
    
    function send_msg {
        numtest='^[[:digit:]]+$'
        if [[ "$1" =~ $numtest ]]; then
            num=$1
        else
            num=$(cat $num_file | grep "$1" | awk -F ',' '{print $2}')
        fi
        shift
        msg=$(echo "$@" | sed 's/\ /%20/g' | sed "s/'/%27/g")
        url="http://cabbagetexter.com/send.php?u=$unum&p=$pin&s=$serv&d=$num&m=$msg"
        ans=$(wget -qO - $url)
        case $ans in
            -1  )   echo "Invalid login. Message not sent." ;;
            -2  )   echo "Could not log in to network website. Message not sent." ;;
            -5  )   echo "Problem with network website. Message not sent." ;;
            ""  )   echo "No message specified. Message not sent." ;;
            *   )   echo "Message sent. $ans messages left."
        esac
    }
    
    function set_login {
        echo "$1,$2,$3" > $login_file
        echo "Login set."
    }
    
    function empty {
        echo "" > $num_file
        echo "Contact book cleared."
    }
    
    function view_contacts {
        cat $num_file
    }
    
    function gui {
        if [[ $(which zenity 2>/dev/null) ]]; then
        send_msg $(zenity --entry --text='Enter the number or name of the recipient followed by your message.')
        fi
    }
    
    case "$1" in
        "add"       )   add_number "$2" "$3" ;;
        "login"     )   set_login "$2" "$3" "$4" ;;
        "clear"     )   empty ;;
        "contacts"  )   view_contacts ;;
        ""          )   gui ;;
        *           )   send_msg "$@"
    esac    
    

    Usage is as follows (presuming the script is saved as msg):

    On the first use, set your login details by typing:
    msg login <number> <pin> <m, v or o>
    
    Add a contact (a simple name-number association) with:
    msg add <name> <number>
    
    To send a message, type:
    msg <name or number> <message>
    
    View your saved contacts with:
    msg contacts
    
    Clear contacts with:
    msg clear
    

    Running the script with no arguments, if you have zenity installed, opens a simple GUI prompt. Enter <name or number> <message> to send a message.

    If you see something wrong with it, please let me know. Thanks.
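One thing on the sed-based escaping: it only handles spaces and apostrophes, so characters like & or ? in a message would break the URL. A fuller percent-encoder can be written in pure bash (a sketch; urlencode is a hypothetical helper name, and ASCII input is assumed):

```shell
# percent-encode everything except RFC 3986 unreserved characters
urlencode() {
    local s="$1" out="" c i
    for (( i = 0; i < ${#s}; i++ )); do
        c=${s:i:1}
        case $c in
            [a-zA-Z0-9.~_-]) out+=$c ;;              # unreserved: pass through
            *) out+=$(printf '%%%02X' "'$c") ;;      # everything else: %XX
        esac
    done
    printf '%s\n' "$out"
}
```

In send_msg, msg=$(urlencode "$*") could then replace the two sed calls.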


  • Registered Users, Registered Users 2 Posts: 425 ✭✭Mathiasb


    Works wonders :D


  • Closed Accounts Posts: 13,404 ✭✭✭✭sKeith


    OSI wrote: »
    Very simple script that will take an argument and then kill all processes matching the string..
    Is handy for example when you have multiple instances of the one process running, you can kill them all in one go.
    Look up the command killall (man killall).
    killall <string>
    Nice script all the same, thanks.


  • Registered Users, Registered Users 2 Posts: 220 ✭✭dueyfinster


    Script to output % of a file:
    #!/bin/sh
    # Public Domain, by Neil Grogan 2010
    # Script to output last 30% of file by lines 
    
    OLDFILE="oldlog.txt"
    NEWFILE="newlog.txt"
    PERCENT=0.3
    
    #Use wc to count lines
    LINES=$(wc -l $OLDFILE | awk '{ print $1}')
    
    #LINESPERCENT = LINES * PERCENT to 2 decimal places, rounded to the nearest whole number
    LINESPERCENT=$(echo "scale=2; $LINES*$PERCENT" | bc | xargs printf "%1.0f" )  
    
    # Use tail to get last  30% and output, can use tail -s with sleep time to have it run on sched.
    tail -n $LINESPERCENT $OLDFILE >> $NEWFILE
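The bc-plus-printf pipeline above can also be done in a single awk call, which handles the rounding itself (a sketch):

```shell
# round LINES * PERCENT to the nearest whole number in one awk invocation
LINES=1234
PERCENT=0.7
LINESPERCENT=$(awk -v l="$LINES" -v p="$PERCENT" 'BEGIN { printf "%d", l * p + 0.5 }')
echo "$LINESPERCENT"    # prints 864
```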
    


  • Registered Users, Registered Users 2 Posts: 1,889 ✭✭✭evercloserunion


    Sometimes my internet randomly goes down for five minutes, an hour or a few hours. I just wrote up this simple thing to continuously check for internet connectivity by attempting to ping Google, and alert you (and exit) when it manages to connect. It currently waits 10 seconds between ping attempts; just change the 10 to something else to vary the sleep time. Run it, fork it to the background and go about your offline work, and don't worry about checking whether you're back online every 5 minutes.
    #!/bin/sh
    
    on_success() {
            zenity --info --text="Internet is working."
            exit 0 
    }
    
    while true; do ping -c 1 google.com > /dev/null 2>&1 && on_success || sleep 10; done
    


  • Registered Users, Registered Users 2 Posts: 3,745 ✭✭✭laugh


    convert gz to bz:

    for i in `ls ../server*/log.*.gz`; do gunzip $i; new=`echo $i | awk 'sub("...$", "")'`; bzip2 $new; done
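The same conversion can be done without parsing ls output, using a glob and parameter expansion to strip the .gz suffix (a sketch; this also survives filenames containing spaces):

```shell
# recompress gzip logs as bzip2: gunzip each, then bzip2 the unpacked file
for f in ../server*/log.*.gz; do
    gunzip "$f"
    bzip2 "${f%.gz}"    # ${f%.gz} removes the trailing .gz
done
```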


  • Closed Accounts Posts: 5,082 ✭✭✭Pygmalion


    A nethack wrapper: if your nethack is built with the dumplog patch, it'll check whether the dumplog has been updated, then copy it to ~/dumps/ with the date and time as the filename and make it world-readable.
    #!/bin/bash
    
    NAME="bunbun"
    DUMP="/var/games/nethack/dumps/$NAME.lastgame.txt"
    
    #Check whether the dump file exists, get last modified time if so, 0 otherwise
    if [ ! -e $DUMP ]
    then
        LDATE=0
    else
        LDATE=$(date -r $DUMP +%s)
    fi
    
    #Execute nethack with the options given
    nethack $*
    
    #If dump exists and is newer than the one we saw earlier copy the dump
    if [ -e $DUMP ]
    then
        if [ $(date -r $DUMP +%s) -gt $LDATE ]
        then
            #Only eval `date ...` once, in case the second changes between these lines
            OUTFILE="$HOME/dumps/`date +%Y-%m-%d-%H:%M:%S`.txt"
            cp $DUMP $OUTFILE
            chmod 644 $OUTFILE
        fi
    fi
    

    Just change the username and the location of the dumplog file at the top, and ensure the dumps directory exists.
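The freshness check the wrapper relies on is just an integer comparison of epoch seconds from date -r. The same idea in isolation (a sketch, assuming GNU date):

```shell
# detect whether a file was modified after a recorded timestamp
f=$(mktemp)
before=$(date -r "$f" +%s)   # mtime as seconds since the epoch
sleep 1
touch "$f"                   # simulate a new game rewriting the dump
if [ "$(date -r "$f" +%s)" -gt "$before" ]; then
    echo "file changed"
fi
```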


  • Registered Users, Registered Users 2 Posts: 37,485 ✭✭✭✭Khannie


    Nothing super mega about this, but I use it all the time.

    Set up key exchange with a remote server:
    #!/bin/bash
    
    ssh "$1" "echo `cat ~/.ssh/id_?sa.pub` >> ~/.ssh/authorized_keys2"
    ssh "$1" "chmod 700 ~/.ssh/authorized_keys2"
    

    usage:
    ./sendKeyTo <user@remoteserver>


  • Registered Users, Registered Users 2 Posts: 218 ✭✭Tillotson


    Helps in my never-ending war against badly named MP3s. Converts filenames to Start Case, eg: "some file.mp3" -> "Some File.mp3".
    #!/bin/bash
    
    set -e
    
    function usage {
        echo -e "USAGE: $0 [OPTION] [FILE]"
        echo -e "Converts filename to Start Case\n"
        printf "%4s%20s\n" "-d" "dry run"
        printf "%4s%20s\n" "-s" "verbose"
        echo -e "Example: $0 *.mp3"
    
    }
    
    function err_exit {
        echo -e "$1\n"
        usage
        exit 1
    }
    
    declare dry verbose
    
    while getopts ":dv" opt; do
        case $opt in
            d) dry=y ;;
            v) verbose=y ;;
            \?) usage; exit 1 ;;
        esac
    done
    
    shift $((OPTIND-1))
    
    for f0; do
        [[ -f $f0 ]] || err_exit "$f0: is not a file"
    
        # Keep extension unchanged
        f1="$(echo $f0 | cut -d'.' -f 1)"
        ext="$(echo $f0 | cut -d'.' -f 2)"
        f1="$(echo $f1 | sed 's/\b\(.\)/\u\1/g').$ext"
    
        if [[ ! -z $dry  ]] || [[ ! -z $verbose ]]
        then
            printf "%s%40s\n" "$f0" "-> $f1"
        fi
        if [[ $f0 != $f1 ]] && [[ -z $dry ]]    # Test filenames not the same
        then
            mv "$f0" "$f1"
        fi
    done
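One caveat with the cut-based split: it takes the *first* dot, so a name like "my.favourite song.mp3" loses everything after its first dot. Parameter expansion splits on the *last* dot instead (a sketch):

```shell
# keep the extension intact by splitting on the last dot, not the first
f0="my.favourite song.mp3"
base="${f0%.*}"    # "my.favourite song"  (strip shortest .* suffix)
ext="${f0##*.}"    # "mp3"                (text after the last dot)
f1="$(echo "$base" | sed 's/\b\(.\)/\u\1/g').$ext"
echo "$f1"    # prints: My.Favourite Song.mp3
```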
    


  • Registered Users, Registered Users 2 Posts: 1,092 ✭✭✭KAGY


    Khannie wrote: »

    Set up key exchange with a remote server:

    <snip>
    usage:
    ./sendKeyTo <user@remoteserver>

    ssh-copy-id does the same thing; it's part of OpenSSH, I think:
    ssh-copy-id [-i [identity_file]] [user@]machine
    
    , though yours is easier to use as the RSA file location is hard-coded


  • Registered Users, Registered Users 2 Posts: 37,485 ✭✭✭✭Khannie


    I use this all the time for using up all my cores on apps that aren't multithreaded (mostly we're talking about oggenc and convert (part of imagemagick)).
    #!/bin/bash
    
    NUMCPU="$(grep ^processor /proc/cpuinfo | wc -l)"
         
    until [ -z "$1" ]
    do
    	while [ `jobs -p | wc -l` -ge $NUMCPU ] ; do
    		sleep 0.1
        done
      	oggenc -q8 "$1" &
    	shift
    done
    

    Usage would be:
    oggenc_cpus.sh *.flac

    edit: be somewhat careful with hyperthreaded CPUs (mine is not).
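The same per-core fan-out can be delegated to GNU xargs, which does the job throttling itself (a sketch; assumes GNU xargs and nproc from coreutils):

```shell
# encode all FLACs, running one oggenc per CPU core at a time
printf '%s\0' *.flac | xargs -0 -r -n1 -P"$(nproc)" oggenc -q8
```

The NUL-separated printf/xargs pair keeps filenames with spaces intact, and -P caps the number of concurrent jobs.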


  • Registered Users, Registered Users 2 Posts: 1,931 ✭✭✭PrzemoF


    A simple script to monitor modification date of a file and trigger an action.
    Example use:
    0. Save the script to trig.sh file and make executable
    chmod a+x trig.sh
    
    1. start script with a tex file name as parameter i.e.
    ./trig.sh my_tex_file.tex
    
    2. Open the tex file in an editor:
    vim my_tex_file.tex
    
    3. Make any modifications to the file and save it; trig.sh runs pdflatex, which produces my_tex_file.pdf
    4. Open evince to see the content of the pdf file
    evince my_tex_file.pdf
    

    From now on, every saved modification of my_tex_file.tex will be caught by trig.sh. trig.sh will run pdflatex and evince will show the updated version of the PDF file.

    trig.sh:
    #!/bin/bash
    # Delay between checks
    DELAY=1
    DATE_ORIG=$(stat -c %Y $1)
    DATE_MOD=$DATE_ORIG
    
    while [ 1 ];
    do
            if [ $DATE_ORIG -ne $DATE_MOD ]; then
                    #File changed - add your action here
                    pdflatex $1
                    DATE_ORIG=$DATE_MOD
            else
                    sleep $DELAY
                    DATE_MOD=$(stat -c %Y $1)
            fi
    done
    


  • Closed Accounts Posts: 4,763 ✭✭✭Fenster


    I blog a lot of photographs.

    1. Take in a folder name followed by a list of images as arguments.
    2. Create said folder, copy the images into it as JPGs, and rename them in sequential order.
    3. Create a smaller thumbnail size of the images to fit my blog.
    4. Move this folder to my Dropbox Public folder.
    5. Insert HTML code for the images into my clipboard.
    #!/bin/bash
    
    dropbox="$HOME/Dropbox/Public/content"
    uuid="4144919"
    thumbs="m"
    large_size=1500
    thumb_size=600
    temp="/tmp/prep"
    
    prune_spaces() {
        cp "$1" $(echo "$1" | sed -e 's/ /_/g')
    }
    
    add_line() {
        echo "<a title=\"CHANGE ME\" href=\"http://dl.dropboxusercontent.com/u/$1/content/$2/$3\"><img src=\"http://dl.dropboxusercontent.com/u/$1/content/$2/m/$3\" alt=\"CHANGE ME\" /></a>" >> $4
    }
    
    resize_image() {
        mogrify -quality 100 -format jpg -resize "$1"x\> $2
    }
    
    if [ ! -d $1 ]; then
        mkdir -p $1/$thumbs
    fi
    
    if [ -d $dropbox/$1 ]; then
        rm -r $dropbox/$1
    fi
    
    src=$1
    shift
    
    count=1
    
    for n in "$@"; do
        if [ -e "$n" ]; then
            cp "$n" $src/$count.jpg
            cp "$n" $src/$thumbs/$count.jpg
            let count++
        fi
    done
    
    cd $src
    
    if [ $(ls -1 *.jpg 2> /dev/null | wc -l) == 0 ]; then
        echo "No images selected!"
        rm -r $(pwd)
        exit 1
    fi
    
    for n in *.jpg; do
        resize_image $large_size $n
        resize_image $thumb_size $thumbs/$n
        add_line $uuid $src $n $temp
    done
    
    cat $temp | xclip -sel clip
    rm $temp
    mv $(pwd) $dropbox
    
    exit 0
    


  • Registered Users, Registered Users 2 Posts: 883 ✭✭✭Keplar240B


    My first Arch Linux script, a backup utility.
    #!/bin/bash
    
    #GWB 22-06-16
    #Backup script backs up files to remote directory
    #1: Make sure destination  is mounted
    #2: Run as root
    
    
    
    echo -e  "\e[42m******************************\e[0m"
    echo -e  "\e[42m*Written By Gwb     *\e[0m"
    echo -e  "\e[42m*Version 3 21-06-16          *\e[0m"
    echo -e  "\e[42m*Backup script Backup_3.sh   *\e[0m"
    echo -e  "\e[42m******************************\e[0m"
    echo " "
    
    #Check that user ran as sudo
    if (( $EUID != 0 )); then
        echo -e "\e[41mPlease run as root\e[0m"
        exit
    fi
    
    #get user input for which path for backup
    Dest1="/run/media/gwb/Linux_backup"
    Dest2="/run/media/gwb/iomeaga_320"
    echo -e  "\e[44mPick destination directory for backup :--  \e[0m"
    echo "(1)   "  "$Dest1"
    echo "(2)   "  "$Dest2"
    echo "(3)    Specify a path" 
    echo "(4)    exit"
    echo ""
    echo "Press option followed by [ENTER]"
    echo ""
    read -r choice
    #check that paths exist and change path to dest
    case "$choice" in
    
    1)  echo $Dest1
        if  [ -d $Dest1 ] 
        then
          cd $Dest1
        else
    	  echo  -e "\e[91mPath not found to destination directory\e[0m"
    	  echo  -e "\e[91mNOTE : The Hard drives  internal and external must be mounted\e[0m"
              echo ""
              exit
        fi
        ;;
    
    2)  echo  $Dest2
        if  [ -d $Dest2 ] 
        then
          cd $Dest2
        else
    	  echo  -e "\e[91mPath not found to destination directory\e[0m"
    	  echo  -e "\e[91mNOTE : The Hard drives  internal and external must be mounted\e[0m"
              echo ""
              exit
        fi
        
        ;;
    3)  echo -e "\e[44mType a custom destination path\e[0m"
        read -r Path1
        if  [ -d $Path1 ] 
        then
          echo "$Path1"
          cd $Path1
        else
    	  echo  -e "\e[91mPath not found to destination directory\e[0m"
              echo ""
              exit
        fi
        ;;
    
    *) echo "exit"
        exit
       ;;
    esac
    
    #make the backup directory
    TODAYSBACKUPDATE=`date +%d-%b-%Y`
    mkdir $TODAYSBACKUPDATE
    cd $TODAYSBACKUPDATE
    echo "Backup directory $TODAYSBACKUPDATE made at the chosen backup path"
    echo ""
    
    #begin the backup
    echo -e "\e[30;48;5;10m1 Make copy of first 512 bytes MBR with dd\e[0m"
    dd if=/dev/sdb1 of=hda-mbr.bin bs=512 count=1
    echo -e "\e[1;10mDone.\e[0m"
    echo ""
    
    echo -e "\e[30;48;5;10m2 Make a copy of etc dir\e[0m"
    cp -a  -u /etc .
    echo -e "\e[1;10mDone.\e[0m"
    echo ""
    
    echo -e "\e[30;48;5;10m3 Make a copy of home dir\e[0m"
    cp -a  -u /home .
    echo -e "\e[1;10mDone.\e[0m"
    echo ""
    sync
    
    
    echo -e "\e[30;48;5;10m4 Make tarball of all except tmp dev proc sys run\e[0m"
    tar --one-file-system --exclude=/tmp/* --exclude=/dev/* --exclude=/proc/* --exclude=/sys/* --exclude=/run/* -pzcf RootFS_backup.tar.gz /
    echo -e "\e[1;10mDone.\e[0m"
    echo ""
    sync
    
    echo -e "\e[30;48;5;10m5 Make copy of package lists\e[0m"
    pacman -Qqen > pkglist.txt
    pacman -Qm > pkglistAUR.txt
    echo -e "\e[1;10mDone.\e[0m"
    echo ""
    
    
    
    


  • Registered Users, Registered Users 2 Posts: 883 ✭✭✭Keplar240B


    My second Arch Linux script, a maintenance and update utility. It uses cower as an AUR helper and ClamAV for antivirus.
    #!/bin/bash
    
    #Gwb 23-06-16
    #Arch Linux distro update script
    #update system official and AUR repo using pacman and cower
    #check for broken symlinks, log errors and failed systemd services
    #and run anti virus check clam and update the database.
    echo -e "\e[40;38;5;82m*********************************\e[0m"
    echo -e "\e[40;38;5;82m*gwb 23-06-16 updatman.sh *\e[0m"
    echo -e "\e[40;38;5;82m*Arch Linux distro update script*\e[0m"
    echo -e "\e[40;38;5;82m*update system official and      *\e[0m"
    echo -e "\e[40;38;5;82m*AUR repo using cower and pacman*\e[0m"
    echo -e "\e[40;38;5;82m*check for broken symlinks,     *\e[0m"
    echo -e "\e[40;38;5;82m*log errors and failed systemd  *\e[0m"
    echo -e "\e[40;38;5;82m*services and anti virus check  *\e[0m"
    echo -e "\e[40;38;5;82m*********************************\e[0m"
    echo ""
    #cower AUR
    cd /home/gwb/Downloads
    TODAYSBACKUPDATE=`date +%d-%b-%Y`
    mkdir $TODAYSBACKUPDATE
    cd $TODAYSBACKUPDATE
    echo -e "\e[30;48;5;10m***** UPDATE COWER PACKAGES *****\e[0m"
    echo "Directory $TODAYSBACKUPDATE made at /home/gwb/Downloads for cower updates"
    echo ""
    cower -vdu
    echo ""
    # look for empty dir (if no updates) 
    if [ "$(ls -A .)" ] 
    then
         	echo -e  "\e[44mCower updates available for package build\e[0m"
         	echo -e  "\e[44mDo you wish to build and install  them now?\e[0m"
          	echo ""
    	echo "Pick Option"
    	echo "1) Yes"
    	echo "2) No"
    	echo ""
    	echo "Press 1 or 2 followed by [ENTER]"
    	echo ""
    	read choice
            if [ "$choice" = "1" ]
    	then
    		echo "Building and installing cower package updates"	
    		echo ""	
    		#build and install packages
                   find . -name PKGBUILD -execdir makepkg -si \;
    			
    	fi	
    
    else
        	echo ""
    	echo " No updates of AUR packages by Cower..."
    fi	
    echo ""
    echo -e "\e[1;10mDone.\e[0m"
    echo ""
    
    # -systemd --failed:
    echo -e "\e[30;48;5;10m***** ALL FAILED SYSTEMD SERVICES *****\e[0m"
    systemctl --failed --all
    echo ""
    echo -e "\e[30;48;5;10m***** FAILED ACTIVE SYSTEMD SERVICES *****\e[0m"
    systemctl --failed
    echo ""
    # -Logfiles:
    echo -e "\e[30;48;5;10m***** LOGFILES *****\e[0m"
    cd /home/gwb/Documents/Tech/Linux/MyLinux
    journalctl -p 3 -xb >errorfile
    echo -e "\e[1;10mDone.\e[0m"
    echo ""
    # Checking for broken symlinks:
    echo -e "\e[30;48;5;10m***** CHECKING FOR BROKEN SYMLINKS *****\e[0m"
    cd ~
    find . -type l ! -exec test -e {} \; -print > /home/gwb/Documents/Tech/Linux/MyLinux/log.txt
    echo -e "\e[1;10mDone.\e[0m"
    echo ""
    #pacman
    cd ~
    echo -e "\e[30;48;5;10m***** UPDATE PACMAN *****\e[0m"
    echo ""
    sudo pacman -Syu
    echo ""
    echo "Delete orphans"
    sudo pacman -Rns $(pacman -Qtdq)
    echo ""
    echo "Prune older packages from cache"
    sudo paccache -r
    echo "Writing installed package lists to file "
    sudo pacman -Qqen > /home/gwb/Documents/Tech/Linux/MyLinux/packagesQe.txt
    sudo pacman -Qm > /home/gwb/Documents/Tech/Linux/MyLinux/package.txt
    echo -e "\e[1;10mDone.\e[0m"    
    echo ""
    #Anti-virus clam Av
    # update clamscan virus definitions:
    echo -e "\e[30;48;5;10m***** UPDATING CLAMAVSCAN DATABASE *****\e[0m"
    sudo freshclam
    echo ""
    echo -e "\e[30;48;5;10m***** SCANNING / RECURSIVELY WITH CLAMAV*****\e[0m"
    echo "Do you wish to run anti-virus check with clamAv at this point?"
    echo ""
    echo "Pick Option"
    echo "1) Yes"
    echo "2) No"
    echo ""
    echo "Press 1 or 2 followed by [ENTER]"
    echo ""
    read choice1
    echo "You picked option " $choice1 
    echo "" 
    if [ "$choice1" = "1" ]
    	then
    		# scan entire system
    		
    		sudo clamscan --recursive=yes --infected --exclude-dir='^/sys|^/proc|^/dev|^/lib|^/bin|^/sbin' /
    		echo "" 
    		echo -e "\e[1;10mDone.\e[0m"
    			
    	else
    		echo -e "\e[1;10mDone.\e[0m"
    		exit
    fi
    
    


  • Closed Accounts Posts: 13,404 ✭✭✭✭sKeith


    Here is one I call rsync.no.quit, as it will keep going until the job is finished. I needed it some time ago when I was migrating a huge amount of data off a server where rsync kept failing with a pipe issue (most likely a flaky network or something similar).
    Set this script up with source and destination and fire and forget.

    *Warning* With the rsync options used, the destination will mirror the source: any files or folders already in the destination will be destroyed. *Warning*

    #!/bin/bash
    
    source="192.168.1.1:/home/main/"
    destination="/mnt/backup/"
    
    if [ -z $source ]; then
            echo "Source unset"
            exit 1
    fi
    if [ -z $destination ]; then
            echo "Destination unset"
            exit 1
    fi
    
    echo "Warning: Everything in $destination will be destroyed."
    read -n1 -t5 -r -p "CTRL+C to cancel" key
    echo
    while [ 1 ];    do
    rsync --progress --timeout=30 -a --delete-during $source $destination
    if [ "$?" = "0" ] ; then
            echo "rsync completed normally"
            exit
    else
            echo "Rsync failure. Backing off and retrying..."
            sleep 10
    fi
    done
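A variant of the same retry loop with a cap on attempts, so it can't spin forever if the remote host is gone for good (a sketch; MAX_TRIES is an added knob, and source/destination are as in the script above):

```shell
# retry rsync up to MAX_TRIES times, backing off 10s between failures
source="192.168.1.1:/home/main/"   # as in the script above
destination="/mnt/backup/"
MAX_TRIES=100
tries=0
until rsync --progress --timeout=30 -a --delete-during "$source" "$destination"; do
    tries=$((tries + 1))
    if [ "$tries" -ge "$MAX_TRIES" ]; then
        echo "Giving up after $MAX_TRIES attempts."
        exit 1
    fi
    echo "Rsync failure. Backing off and retrying..."
    sleep 10
done
echo "rsync completed normally"
```

The until loop replaces the while [ 1 ] / $? check: it runs the body only while the command keeps failing.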
    
    

