Another Trip to Europe

It’s December 26th and I’ve spent another Christmas in Spain – oh poor me… To be honest, I really can’t complain. We went to Paris (http://photo.thomaswimprine.com/paris) and then to Granada. I’ve said it before and I’ll say it again: I’m very fortunate to be able to have the hobbies I do and travel like I do. My issue is that I like “normal,” and after spending a couple of weeks away from home I start to crave waking up in my own bed and having my “normal” routine. That said, I’m not opposed to a new “normal” if the opportunity presents itself.

It was a pleasure to see everyone and to have some interesting conversations about the economy with people who either see it daily or are affected by it. While nobody in the family has been affected to the point of losing a job, they have seen it affect close friends who were in construction or other hard-hit businesses. There are far more for-rent and for-sale signs on houses than I have ever seen before, and there are also signs on small businesses stating they are closing or liquidating everything, which I’ve never seen before either.

Anya Claire and her cousin Julia


Curaçao Dive Trip

I’m in the airport heading home. Honestly, I had a wonderful time with a lot of great people. The majority of the group was from the dive shop http://www.harrysdiveshop.com and everyone was great, including some of the local wildlife ;)

We did a lot of dives; some days we even got four in before crashing for the night, just to do it all over again. Everyone took pictures, and so far I think we have over 1000 under review for publishing. I’m sure Cindy will be posting them to the dive shop website as soon as she gets back in town!

[![](https://lh5.googleusercontent.com/-dTvfu9tUHig/ULTsWYIbG5I/AAAAAAAAAI0/M6lqd5fwoXo/s640/blogger-image-2132011585.jpg)](https://lh5.googleusercontent.com/-dTvfu9tUHig/ULTsWYIbG5I/AAAAAAAAAI0/M6lqd5fwoXo/s640/blogger-image-2132011585.jpg)

Schedule Creation of MKSYSB to NIM Server

I have a customer with a fair number of systems they need to protect. Recently we had a corruption that required a re-install of the operating system. While this is an extremely rare event on modern systems, the problem was compounded by the fact that they didn’t have a good mksysb backup. We needed to find a similar system (Test/Dev), locate the tape drives, move them to the source, take the backup, move the tape to the target, and re-install. Not a fun evening, and all the trouble could easily have been averted by having a good backup.
If you have more than four AIX systems, I would recommend having a NIM server. It should be your point of administration for everything done in AIX, if possible. It also gives you an environment to script, upgrade, and deploy without working on your production systems.
This is a script I wrote to be executed by cron on the NIM server. It does a few things here:

  1. Queries NIM for a list of “standalone” systems
  2. Performs a mksysb backup of each and registers them on the NIM server as a resource
  3. Creates a backup of the volume group that all my NIM data is on. (I know you didn’t put it on rootvg!)
  4. Ejects the tape so it can be brought offsite
  5. Emails the report to the admin
    #!/usr/bin/sh
     
    DATE=$(date +%m%d%Y)
    LOGFILE=/tmp/mksysblog_${DATE}
    SENDTO="admin@domain.com"
    MSGCONTENT=/tmp/mksysb_msg_${DATE}   # scratch file used as the error mail body
     
    LOG()
    {
        echo "$*" >> $LOGFILE 2>&1
    }
     
    LOG "------------ MKSYSB LOG FOR ${DATE} ------------------"
    TIME=$(date)
    LOG "Process started at ${TIME}"
     
    #for mach in 0; do
    for mach in $(lsnim -t standalone | awk '{print $1}'); do
        LOG ""
        LOG ""
        LOG "**"
        LOG "Starting process for ${mach} "
        LOG "Removing the NIM resources for ${mach}"
     
        nim -o remove ${mach}_mksysb >> $LOGFILE 2>&1
     
        LOG "NIM resources removed for ${mach}"
        LOG ""
        LOG "Starting mksysb backup of ${mach}"
        nim -o define -t mksysb -a server=master -a location=/export/mksysb/${mach}_mksysb -a source=${mach} -a mk_image=yes -a mksysb_flags="-i -m -e -p" -F ${mach}_mksysb >> $LOGFILE 2>&1
        if [ $? != 0 ]
            then
            echo "ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR" > ${MSGCONTENT}
            mail -s "MKSYSB error on ${mach}" "${SENDTO}" < ${MSGCONTENT}
            LOG "  ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR "
            LOG " There was an error of ${mach} "
            echo "" > ${MSGCONTENT}
        fi
        LOG "Completed the mksysb of ${mach}"
    done
     
    LOG "Starting the SAVEVG of NIM_VG"
    /usr/bin/savevg -vmpXf /dev/rmt0 nim_vg >> $LOGFILE 2>&1
    if [ $? != 0 ]; then
        echo "ERROR on SAVEVG " > ${MSGCONTENT}
        mail -s "ERROR on SAVEVG" ${SENDTO} < ${MSGCONTENT}
        LOG " ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR "
        LOG "There was an error on the NIM server while saving to tape"
        echo "" > ${MSGCONTENT}
    fi
    LOG "Completed the SAVEVG"
    LOG "Rewinding and ejecting the tape"
    mt -f /dev/rmt0 rewoffl
    LOG "Backup process complete"
     
    TIME=$(date)
    LOG "Completed at ${TIME}"
     
    mail -s "MKSYSB Process Log" "${SENDTO}" < ${LOGFILE}

As always, use as needed, but please leave comments if you have them.

Cheers

Collect Software Report for AIX

A customer needed a report of what software is installed and when it was installed. This included the AIX software sources, efixes, and anything installed via RPM. I was able to throw this together in a few minutes, and it fit the bill.

#!/usr/bin/sh
 
HOSTNAME=$(hostname)
DATE=$(date +%m%d%Y)
LOGDIR=$HOME/log
REPORTNAME="SOFTWARE_AUDIT_$DATE"
 
MAILTO="someone@domain.com"
 
FULLLOG=$LOGDIR/$REPORTNAME
 
LOG()
{
        echo "$*" >> $FULLLOG 2>&1
}
 
 
if [[ ! -d $LOGDIR ]]; then
        mkdir -p $LOGDIR
fi
 
 
LOG "#######################################"
LOG "# List AIX LPP Software "
LOG "#######################################"
lslpp -L >> $FULLLOG 2>&1
LOG ""
LOG "# LPP install/update history"
lslpp -h >> $FULLLOG 2>&1
 
 
LOG "\n\n\n "
LOG "######################################"
LOG "# List of EFIX"
LOG "######################################"
emgr -l >> $FULLLOG 2>&1
 
LOG "\n\n\n"
LOG "#####################################"
LOG "# List of RPM Software"
LOG "#####################################"
 
# Reorder the date fields (year first) so the sorted output is roughly chronological
rpm -qa --qf '%{installtime:date} Installed: %{name} %{version} \n' | awk '{print $5, $2, $3, $1, $4, $6, $7, $8}' | sort >> $FULLLOG 2>&1
 
mail -s "Software Audit Report for $DATE" $MAILTO < $FULLLOG

Delete files based on date – ksh bash

Referring to an earlier blog post on how to delete log files or DR plans using PowerShell: I have another TSM server that is running on AIX. It’s new, so I don’t have a bunch of DRM plans in the directory currently. Also, this is a server I use for my internal testing and development, so it’s not too heavily used by others.

Here is the directory listing before I wrote and ran the script:

[Screenshot: directory listing before running the script]

After I wrote and ran the script:

[Screenshot: directory listing after running the script]

I set it to keep only 7 days’ worth of files, and now all I need to do is schedule it in my crontab every day (see the sample entry after the script)…

Here’s the script – Cheers!

#!/usr/bin/ksh

DRM_Directory=/opt/tivoli/storage/drm/drplan
DaysToKeep=7

find $DRM_Directory -type f -mtime +$DaysToKeep -exec rm -f {} \;
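
For the cron part, an entry along these lines runs it every morning at 6 (the script path here is an assumption):

0 6 * * * /usr/local/bin/clean_drplans.ksh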

Automated creation of mksysb and copy to NFS Server

I have a bunch of systems that we are working on; however, we don’t have the tape library connected yet. It makes me very nervous not to have any backups, so I put this together. It creates a mksysb on the local system and copies it to an NFS export. In my case I created this on my NIM server, so if I need to create a SPOT and reinstall, or cut it to tape, it’s already available there.
It’s pretty much ready to use, so you only need to do a few things:

  1. Update the variables
  2. Copy it to each system
  3. Schedule with cron
    #!/usr/bin/ksh
    HOSTNAME=$(uname -a | awk '{print $2}')
    DATE=$(date +%m%d%Y)
    FILENAME=$HOSTNAME.$DATE
    RETAIN_LOCAL_BACKUPS=1
    RETAIN_NFS_BACKUPS=7
    BACKUPDIR="/mksysb/nfs_mksysb"
    NFSSERVER="NFSServer"
    # Check to make sure the directory we need to mount is created
    if [ ! -e "/mksysb/nfs_mksysb" ]; then
        mkdir -p /mksysb/nfs_mksysb
    fi
    # Determine if the NFS share is mounted, unless this is the server serving the NFS mount
    if [ "$NFSSERVER" != "$HOSTNAME" ]; then
        mount | grep nfs_mksysb || mount "$NFSSERVER:/mksysb/nfs_mksysb" "$BACKUPDIR"
        mount | grep nfs_mksysb && MOUNTED=1
    fi
    # Ensure the directory for the system is created
    if [ ! -e "$BACKUPDIR/$HOSTNAME" ]; then
        mkdir -p "$BACKUPDIR/$HOSTNAME"
    fi
    # Everything is mounted and ready to go - let's create our backup
    /usr/bin/mksysb -e -i "/mksysb/$FILENAME"
    # The mksysb is complete, so copy it over to the NFS share
    cp "/mksysb/$FILENAME" "$BACKUPDIR/$HOSTNAME/"
    # Clean up local directories
    find /mksysb \( ! -name mksysb -prune \) -name "$HOSTNAME.*" -mtime +$RETAIN_LOCAL_BACKUPS -exec rm {} \;
    # Clean up NFS share
    find "${BACKUPDIR}/${HOSTNAME}" -name "${HOSTNAME}.*" -mtime +${RETAIN_NFS_BACKUPS} -exec rm {} ;
    # We are finished, so let's unmount the share
    [ ! -z "$MOUNTED" ] && umount /mksysb/nfs_mksysb
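
Since the images land on the NIM server anyway, registering one as a NIM resource later is a one-liner. A sketch, using a hypothetical host name and the file name format the script produces:

    nim -o define -t mksysb -a server=master -a location=/mksysb/nfs_mksysb/lpar1/lpar1.12262012 lpar1_mksysb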

Email Files with PowerShell

I have a need, when dealing with customers and their disaster recovery plans provided by [Tivoli Storage Manager (TSM)](http://www-01.ibm.com/software/tivoli/products/storage-mgr/), to get these files offsite on a regular basis, normally every day at about the same time. It’s a great idea to email them to yourself; however, it’s not such a great idea if the email lives on the server you may need to recover. I recommend in most cases that people get an external email account ([Gmail](http://www.gmail.com), [Live](http://www.live.com), [Yahoo](http://www.yahoo.com), etc.) and have the disaster recovery plans sent to them there. That way they are more likely to be able to retrieve them than if they were on the [Exchange](http://www.microsoft.com/exchange) or [Lotus Notes](http://www-01.ibm.com/software/lotus/products/domino/) (yes, people still use Notes for email) server that was in the datacenter that just imploded.
You need to update a few things to make this work:
  • $SMTPServer – Make it your SMTP server
  • $DistributionList – I did it this way so you (or someone else) don’t need to edit the script when the recipients change
  • $SendingAddress – Who is this email going to be coming from?
  • $DataDirectory – What directory are the files kept in that need to be sent?
  • $RequiredFiles – The file names that need to be sent
    In this instance the DR plan itself is the last file to be created and has a different name every day, so I’m using the creation time to find the newest file and add it to the list of files that are needed.
    # Author: Thomas Wimprine
    # Creation Date: Dec 14, 2011
    # Filename: SendDRPlan.ps1
    # Description: Collect files needed for TSM DR recovery and email them to a distribution list
     
    Function SendEmail {
        param (
            $FilesArray
        )
        $SMTPServer = "mail.domain.com"
        $DistributionList = "DRPlanRecipients@domain.com"
        $SendingAddress = "TSM@domain.com"
        
        # Create our mail message objects
        $ToAddress = New-Object System.Net.Mail.MailAddress $DistributionList
        $FromAddress = New-object System.Net.Mail.MailAddress $SendingAddress
        $Message = New-Object System.Net.Mail.MailMessage $FromAddress, $ToAddress
        
        $Date = Get-Date
        $Date = $Date.ToShortDateString()
        $Message.Subject = "TSM DR Plan for $Date"
        $Message.Body = @("This is the daily DR plan as created by TSM with the required files to recover. Retain this message with attachments until it is no longer needed")
        
        # Add the attachments we need to the message
        foreach ($File in $FilesArray) {
            $Attachment = New-Object System.Net.Mail.Attachment($File,'text/plain')
            $Message.Attachments.Add($Attachment)
        }
        
        $Client = New-Object System.Net.Mail.SMTPClient $SMTPServer
        
        $Client.Send($Message)
        
    }
     
    Function GetLatestDRPlan {
        param ($Directory)
        foreach ($File in Get-ChildItem $Directory) {
            if ($NewestFile.CreationTime -lt $File.CreationTime) {
                $NewestFile = $File
            }
        }
        $NewestFile.Name    # return just the file name; the caller prepends the directory path
    }
     
    $DataDirectory = "D:\DRPlanDir"
    $RequiredFiles = "devconfig","volhist"
    $RequiredFiles += GetLatestDRPlan($DataDirectory)
     
    $AttachFiles = @()
    foreach ($File in $RequiredFiles) {
        $AttachFiles += "$DataDirectory\$File"
    }
     
    SendEmail($AttachFiles)
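
To get the plan out the door automatically each day, one option is a scheduled task. A sketch; the script path and send time are assumptions:

    schtasks /create /tn "SendDRPlan" /tr "powershell.exe -File C:\Scripts\SendDRPlan.ps1" /sc daily /st 18:30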

Remove Old Computer Accounts from the Domain

We have a manual process for retiring systems from the domain, and like any manual process without checks and balances, there are times it does not get completed properly. One of these “checklist items” is to remove the computer from the domain. If this fails to happen we will have a computer account somewhere in Active Directory forever, or at least until someone notices. My solution to keep Active Directory “reasonably” clean is, of course – PowerShell!
We have computers that can sit turned off in a closet or somewhere else for more than the 30-day password reset period; in our environment we’ve found this can exceed 90 days often enough to notice. I don’t want these computers deleted from the domain; however, I would like their users to call the helpdesk to make sure everything is in order and give the machine the once-over for anti-virus, patches, etc.
My script is a simple three lines:
  1. Get the date 90 days ago
  2. Disable the computer accounts that haven’t had their password reset in at least 90 days
  3. Delete the computer accounts that haven’t been modified in at least 90 days
    I do it in this order, and with the same date, since disabling the computer account is a change that gets registered on the object. This way I will only be deleting computer accounts that have not touched the domain in at least 150 days, and possibly up to 180 days.
    $date = [DateTime]::Today.AddDays(-90)
    Get-ADComputer -Filter 'PasswordLastSet -lt $date' -Properties PasswordLastSet | sort Name | Set-ADComputer -Enabled $false 
    Get-ADComputer -Filter 'Modified -lt $date' | Remove-ADObject -confirm:$false -Recursive

Note: I have the sort on line two simply for troubleshooting; if I ever need to look at the output, I just replace the last pipe with Write-Host or Export-Csv (see the example below).
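
For example, the review variant of line two might look like this (the CSV path is an assumption):

    $date = [DateTime]::Today.AddDays(-90)
    Get-ADComputer -Filter 'PasswordLastSet -lt $date' -Properties PasswordLastSet | sort Name | Export-Csv C:\Temp\StaleComputers.csv -NoTypeInformation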

Update Computer Description from Active Directory

We try to keep everything in our domain synchronized; however, it’s not always easy when the data is kept in three or more independent locations. I created this script a few years ago with the Quest cmdlets, but since I can now do it natively with the AD cmdlets I figured I would update it and repost it.
 
# Description is not returned by default, so request it explicitly
$Servers = Get-ADComputer -Filter {OperatingSystem -like "*Server*"} -Properties Description | Select-Object Name, Description | Sort-Object Name
foreach ($Server in $Servers) {
    $ServerWMI=get-wmiobject -class win32_operatingsystem  -computername $Server.Name
    $ServerWMI.Description = $Server.Description
    $ServerWMI.Put()
}
This script simply queries AD for all the servers and gets each computer’s name and description. It then connects to each server and updates the local description. Pretty straightforward and simple…
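
If a server in the list is offline, the WMI call simply fails with an RPC error. A variant that traps the failure and logs a warning instead, using the same cmdlets, might look like this:

foreach ($Server in $Servers) {
    try {
        # Stop on failure so the catch block sees unreachable hosts
        $ServerWMI = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $Server.Name -ErrorAction Stop
        $ServerWMI.Description = $Server.Description
        $ServerWMI.Put() | Out-Null
    }
    catch {
        Write-Warning "Could not update $($Server.Name): $_"
    }
}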
Thomas

Protect Organizational Units from Deletion

While this is a really simple script, it could save you from a lot of problems later. Have you (or any of your users) ever accidentally right-clicked a folder, moved it somewhere, and just couldn’t find it afterwards? Imagine that happening to your AD because an administrator made a mistake moving the mouse. It’s even worse when you did it yourself and you notice half a second too late.
In Windows Server 2008 R2 new OUs default to being protected; however, if an OU was moved or the domain was upgraded, the flag may not be set.
This simply searches every OU in your Active Directory, no matter how deeply buried, and sets the ‘ProtectedFromAccidentalDeletion’ flag anywhere it is not already TRUE.
Import-Module ActiveDirectory
$OU = Get-ADOrganizationalUnit -Filter {Name -like "*"} -Properties ProtectedFromAccidentalDeletion | Where-Object {$_.ProtectedFromAccidentalDeletion -eq $False}
foreach ($UNIT in $OU) {
    Set-ADOrganizationalUnit $UNIT -ProtectedFromAccidentalDeletion $true
}
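
Afterwards, rerunning the same query is an easy way to verify the run; it should come back empty:

Get-ADOrganizationalUnit -Filter {Name -like "*"} -Properties ProtectedFromAccidentalDeletion | Where-Object {$_.ProtectedFromAccidentalDeletion -eq $False}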

Hope this saves at least one person a late evening of stress and heartache!