
Compress Old Log Files on Linux

Submitted on January 12, 2012 at 6:06 pm

Most log files located in /var/log are part of the system log rotation and will be compressed automatically. However, in many cases various user applications maintain log files outside of /var/log. These logs are not managed by the system and can consume a lot of disk space if not cleaned up on a regular basis. The simple script below will find all old log files in /opt that are larger than 10 MB and named in the form "something.log.something". This naming scheme is most commonly used for old log files created by applications during the log rotation process. For example: /opt/application/logs/application.log.0

find /opt -type f -mtime -200 -size +10240k -name "*.log.*" ! -name "*.gz" -exec ls -als {} \; | sort -rn | awk '{print $NF}' | while read -r line
do
echo "Compressing $line"
gzip "${line}"
done
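Before letting the script loose on a production /opt, it can help to preview what would be compressed. This dry run (same path, age, and size thresholds as above, with the name filtering done by find itself) lists the candidate files and their sizes without touching anything:

```shell
# List files under /opt modified in the last 200 days, larger than 10 MB,
# matching *.log.* and not already gzipped; print disk usage in KB, largest first.
find /opt -type f -mtime -200 -size +10240k -name "*.log.*" ! -name "*.gz" -exec du -k {} + | sort -rn
```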

6 Comments »

  • Sergeant Pickle says:

    Write a script that will be given the name of a log file, as a command line arg, and rotate all of the files that start with that name. For instance, if you have a series of log files named foo.1, foo.2, foo.3, then it rotates these to become foo.2, foo.3 and foo.4 respectively and creates a blank file called foo.1.
    Assume that the files are consecutively numbered without skipping any (so that if there is a foo.2, there must be a foo.1 as well). You will first have to find out how many files already exist, which you can do with ls name* | wc -l. If the number of files is 0, do not rotate anything; instead, output an error message.
    Remember that the name will be passed in as a parameter. This script will require, after determining the number of files of the given name, the use of a while loop that counts downward. For instance, if there are 4 files, then the highest number is .4, so we will start at 4 and move .4 into .5, then move .3 into .4, then move .2 into .3 and then move .1 into .2, and then finish off by creating an empty .1 file.

    this is what i have so far:

    number=0
    echo "enter name for log file"
    read $1

    count= `ls $1* | wc -l`
    if [ $count -gt 0 ]
    echo "error"
    done
    else

    while count is count
    if [ $count -gt 0 ]
    while [ $count -ne 0 ]
    mv $1.$count $1.$(($count+1))
    mv $1.(($count-1))$1.$count
    mv $1.(($count-2))$1.$count
    mv $1.(($count-3))$1.$count
    done
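    For reference, a working version of the rotation the exercise describes might look like the sketch below (the file name, error message, and variable names are mine; it assumes the logs really are numbered consecutively, as the exercise states):

```shell
#!/bin/bash
# Usage: ./rotate.sh <basename>
# Rotates name.1..name.N up by one (counting downward so nothing is
# overwritten: name.N -> name.N+1, ..., name.1 -> name.2), then creates
# an empty name.1.
name=$1
count=$(ls "$name".* 2>/dev/null | wc -l)
if [ "$count" -eq 0 ]
then
    echo "error: no files named $name.N found" >&2
    exit 1
fi
while [ "$count" -gt 0 ]
do
    mv "$name.$count" "$name.$((count+1))"
    count=$((count-1))
done
> "$name.1"
```

    Note that the loop must count downward; moving .1 to .2 first would overwrite the old .2 before it has been moved out of the way.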

  • Thomas A says:

    The path to the script is /home/joe/Server/minecraft_autobackup.sh
    When executed, it is supposed to back up files inside the "Server" folder.
    Instead, it looks for the files in /home/joe, fails to find the resources it needs from "Server", and won't run.
    $PWD=/home/joe

    The script simply will not change its working directory to anything but my home folder (/home/joe). How might I change the working directory to the "Server" folder for the duration of this script so it will function properly? (By the way, I have been googling this for almost a week and I've tried adding dozens of different fixes in the form of new lines or arguments, so it's safe to say that "google it" is an option I have completely exhausted.) I've tried to "cd" to "Server" at the beginning of the script, but it has no effect on the process. Needless to say, I am out of options, so here I am. I will post the script beneath the terminal output. Also, I cut a few irrelevant variables out to meet the character limit for this post.

    Terminal Output
    joe@joe:~$ /home/joe/Server/minecraft_autobackup.sh
    [LOG] Starting Justins AutoBackup Script..
    [LOG] Working in directory: /home/joe.
    [LOG] Fetching Level Name..
    /home/joe/Server/minecraft_autobackup.… line 112: server.properties: No such file or directory

    The File

    #!/bin/bash
    #Autobackup By Justin Smith

    #Variables

    STAMP=`date +%d-%m-%Y_%H%M%S`

    # The screen session name, this is so the script knows where to send the save-all command (for autosave)
    SCREENNAME="minecraft"

    # Backups DIR name (NOT FILE PATH)
    BACKUPDIR="backups"

    # MineCraft server properties file name
    PROPFILE="server.properties"

    # Enable/Disable (0/1) Automatic CronJob Manager
    CRONJOB=1

    # Update every 'n' Minutes
    UPDATEMINS=60

    # Delete backups older than 'n' Days
    OLDBACKUPS=3

    # Enable/Disable Logging (This will just echo each stage the script reaches, for debugging purposes)
    LOGIT=1

    # *-------------------------* SCRIPT *-------------------------*
    # Set today's backup dir

    if [ $LOGIT -eq 1 ]
    then
    echo "[LOG] Starting Justins AutoBackup Script.."
    echo "[LOG] Working in directory: $PWD."
    fi

    BACKUPDATE=`date +%d-%m-%Y`
    FINALDIR="$BACKUPDIR/$BACKUPDATE"

    if [ $LOGIT -eq 1 ]
    then
    echo "[LOG] Checking if backup folders exist, if not then create them."
    fi
    fi

    if [ -d $BACKUPDIR ]
    then
    echo -n < /dev/null
    else
    mkdir "$BACKUPDIR"

    if [ $LOGIT -eq 1 ]
    then
    echo "[LOG] Created Folder: $BACKUPDIR"
    fi

    fi

    if [ -d "$FINALDIR" ]
    then
    echo -n < /dev/null
    else
    mkdir "$FINALDIR"

    if [ $LOGIT -eq 1 ]
    then
    echo "[LOG] Created Folder: $FINALDIR"
    fi

    fi

    if [ $OLDBACKUPS -lt 0 ]
    then
    OLDBACKUPS=3
    fi

    # Deletes backups that are 'n' days old
    if [ $LOGIT -eq 1 ]
    then
    echo "[LOG] Removing backups older than 3 days."
    fi
    OLDBACKUP=`find $PWD/$BACKUPDIR -type d -mtime +$OLDBACKUPS | grep -v -x "$PWD/$BACKUPDIR" | xargs rm -rf`

    # –Check for dependencies–

    #Is this system Linux?
    #LOL just kidding, at least it better be…

    #Get level-name
    if [ $LOGIT -eq 1 ]
    then
    echo "[LOG] Fetching Level Name.."
    fi

    while read line
    do
    VARI=`echo $line | cut -d= -f1`
    if [ "$VARI" == "level-name" ]
    then
    WORLD=`echo $line | cut -d= -f2`
    fi
    done < "$PROPFILE"

    if [ $LOGIT -eq 1 ]
    then
    echo "[LOG] Level-Name is $WORLD"
    echo ""
    fi

    BFILE="$WORLD.$STAMP.tar.gz"
    CMD="tar -czf $FINALDIR/$BFILE $WORLD"

    if [ $LOGIT -eq 1 ]
    then
    echo "[LOG] Packing and compressing folder: $WORLD to tar file: $FINALDIR/$BFILE"
    fi

    if [ $NOTIFY -eq 1 ]
    then
    screen -x $SCREENNAME -X stuff "`printf "say Backing up world: '$WORLD'\r"`"
    fi

    #Create timedated backup and create the backup directory if need.
    if [ $AUTOSAVE -eq 1 ]
    then
    if [ $NOTIFY -eq 1 ]
    then
    screen -x $SCREENNAME -X stuff "`printf "say Forcing Save..\r"`"
    fi
    #Send save-all to the console
    screen -x $SCREENNAME -X stuff "`printf "save-all\r"`"
    sleep 2
    fi

    if [ $NOTIFY -eq 1 ]
    then
    screen -x $SCREENNAME -X stuff "`printf "say Packing and compressing world...\r"`"
    fi

    # Run backup command
    $CMD

    if [ $NOTIFY -eq 1 ]
    then
    # Tell server the backup was completed.
    screen -x $SCREENNAME -X stuff "`printf "say Backup Completed.\r"`"
    fi
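    A common fix for the problem described above is to have the script cd into its own directory before doing anything else, so that every relative path it uses (server.properties, backups/) resolves no matter where the script is launched from. A minimal sketch (the echo line is just for verification):

```shell
#!/bin/bash
# Resolve the directory this script lives in and make it the working directory.
# "$0" is the path the script was invoked with; dirname strips the file name,
# leaving the containing directory (e.g. /home/joe/Server).
cd "$(dirname "$0")" || exit 1
echo "Now working in: $PWD"
```

    Placed at the very top of minecraft_autobackup.sh, this cd must come before any use of $PWD or of relative file names; a cd run in the interactive shell before invoking the script, or a cd to a relative path that does not exist from the caller's directory, has no effect on the script's own working directory.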

  • slipknot0129 says:

    I don't understand what I have to do. It tells me the following:
    If you have been using cPanel with your previous host then you need to create a full backup of your account on the old server, then transfer the backup files via FTP from the old server to your new account with JustHost. Please then inform us when this is done and we will restore your account from the backup provided by you.

    If you were using another control panel then you need to download your website from the old server onto your local computer. Then you need to upload your files from your local computer into your Just Host account which can be done via FTP.

  • mr flibble says:

    How do I enable sending router log files to my computer? The router I'm using is the Billion 7300NX and I can't find the settings to direct the logs.

  • Disrae says:

    I was turning my computer off one day and when it would not turn off, I held the power switch and it turned on and off real quick. Now when I turn it back on, it shuts off when it hits the Windows Vista loading screen. Right when it gets to it, it does not restart, it just shuts off, and it is annoying. People are saying it's the PSU, which I doubt, because when I hit F2 or F10 to go to those menus when I first start my computer, the computer stays running perfectly until I try to boot it in safe mode or normal mode.

    This happened to me before when a friend messed my computer up and I found my old installation disk for vista and put it in and it worked perfectly, but now when I try to put the disk in, it shuts off when it tries to load up.

    When all that happens, my computer will not even turn on when I hit the power switch. It only turns on after I unplug the computer, open the case, close the case, then plug it back in. But when I turn it on, it shuts off at the loading screen and then won't turn on again until I unplug it all again. This sucks.

    I tried re-seating the CMOS battery or whatever it's called; someone said it might work, but it did not. My computer does the same thing over and over.

    Please email me at kryptichellsing@hotmail.com
    or give a reasonable answer here or something. Thanks.
    (Asking from my mom's netbook right now)

  • Shay H says:

    In Windows 7 and Windows XP, is there a log file that records every internet connection that has been established over a particular period: a day, week, or month?

    If there is such a thing, where can I find it?
