Get MAC Addresses through PowerShell

$Servers = Import-Csv c:\Temp\servers.csv

foreach ($Server in $Servers) {
    $NetAdapter = Get-WmiObject -Class Win32_NetworkAdapterConfiguration -ComputerName $Server.Name -Filter "IpEnabled = TRUE"
    foreach ($Adapter in $NetAdapter) {
        $Name = $Server.Name
        $MAC = $Adapter.MacAddress
        Write-Host "$Name - $MAC"
    }
}

We needed to get the MAC addresses for the network team and I didn’t have a script for it in my library. We already had the list of servers we needed, so I used it to query WMI and return the MAC addresses of the adapters connected to the network.

CertLog Consuming Large Amounts of Disk Space


Yesterday we had an issue where our certificate server stopped responding. The OS was responsive, however the CA stopped servicing requests and there were a fair number of ESENT errors in the Application log pointing at the CA database log files.
When we looked in the directory we found a pile of log files along with edb.log and edb.chk. Anyone familiar with Exchange will recognize that ESENT is a Jet database and that these files look just like Exchange transaction logs. The problem was that we had 7 GB of log files filling up the drive and Certificate Services couldn’t write its log files due to a lack of free space. A simple search returned a fair number of results explaining how to stop the service and delete the log files, however that didn’t seem like the correct course of action since this is a database. There is no way I would just delete the log files on my Exchange server, so why would I do it on my certificate server? I would back up my Exchange server, and that backup would truncate all my log files.



Another search on “certutil backup” sent me to TechNet and the article explaining how to back up a certificate authority. The command “certutil -p P@ssw0rd -backup D:\CertBackup” performs a full backup of the database and truncates the log files, thus returning all the used drive space. This creates the directory “CertBackup” on the D: drive if it doesn’t exist and populates it with a certificate file “ServerName.p12” and another directory called DataBase with the actual edb file and a dat file.
After the backup completes, all the log files will be truncated and the service, if stopped, can be restarted. We will be running this periodically to make sure we don’t have this problem again. One issue with the scripted approach is that certutil will not overwrite the previous backup, so you must delete or rename the previous one, or create a new path for each backup – which isn’t hard if you are a Scripting Guy.
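The real backup command is certutil on Windows; purely as an illustration of the per-run path idea, here is the rotation logic sketched in shell (BACKUP_ROOT is an assumed path): give each run a date-stamped destination so the previous backup never has to be deleted or renamed first.

```shell
# Sketch only - on the CA itself you would pass the date-stamped path to
# "certutil -p <password> -backup <dir>". BACKUP_ROOT is an assumption.
BACKUP_ROOT=/tmp/certbackups
DEST="$BACKUP_ROOT/CertBackup_$(date +%Y%m%d)"
mkdir -p "$DEST"
echo "$DEST"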

Collect WWN from AIX Systems

I have a need to collect all the WWNs from my running AIX systems. Unfortunately I’m inheriting this environment from someone who didn’t keep records, and I’m really not interested in the other method of getting them from the HMC.
I have a file called “hostfile” in the current directory that lists all the system names or IP addresses I need – you can parse this out of any file you have with just a bit more script-fu if you need to.

for x in `cat hostfile | sort`; do
        echo $x
        ssh root@$x "for i in \`lsdev -Cc adapter | grep fcs | awk '{print \$1}'\`; do \
                lscfg -vpl \$i | grep 'Network Address'; done"
done


Running this will produce output similar to the following (with the actual WWNs, of course):

        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
        Network Address.............C0xxxxxxxxxxxxxx
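As mentioned above, if your host list lives inside a larger file rather than a dedicated hostfile, a little awk can carve it out first. A hypothetical sketch, assuming a colon-delimited inventory file with the hostname in the first field (the file name and format are made up for illustration):

```shell
# Hypothetical inventory format: hostname:location:owner
cat > inventory.txt <<'EOF'
# managed systems
web01:dc1:bob
db02:dc2:alice
EOF

# Keep lines that don't start with '#', print the first colon-delimited
# field, and sort the result into the hostfile the loop above expects.
awk -F: '/^[^#]/ && NF { print $1 }' inventory.txt | sort > hostfile
cat hostfile
```

With the sample inventory above, hostfile ends up containing db02 and web01, one per line.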

Automated nmon collection

I’m working with a customer and I get this question:

“My system is running slow, do you know why?”
“How do you know it’s running slow?”, I ask.
“I don’t know it just seems like it’s taking longer to do <whatever>”, they say.
“Do you have any trending that we can look at?”
“No – it was running fine. Why would I need that if everything was fine?”, they ask.

I shake my head and die a little inside…
As a result of a few conversations like this I decided that rather than leaving it up to the customer to collect performance statistics on a regular basis, I need to do it for them. This gives me a few bits of information to use when they finally start coming to me with the inevitable performance problems and/or questions:
  • We have a baseline to work from!!!
  • We can determine what changed, if anything
  • It may only be in your head – we need to prove it’s not! (Some crazy people work on computers)
  • If there really is a problem, where do we even start looking?
I whipped up this simple script to make it a lot easier to collect that baseline. nmon is now part of the AIX operating system and there is really no reason why you shouldn’t be using it to collect data. It’s a very good “whole system health” monitoring tool that you can get down and dirty with if needed.
  1. Change the variables to valid values for you!
  • REPORT_RCPT – Who is going to receive this analysis?
  • COMPANY – If you are collecting this for multiple companies (I am), put the company name here so the reports make sense when you get them.
  • REPORTS_TO_KEEP – How many reports are we going to keep on the local system (not in your email!)
  2. Schedule the script in cron. A lot of people like to schedule this for midnight, however that’s also when a lot of people schedule maintenance or backups, which splits those periods of high activity into multiple reports. Consider scheduling it for a regular “slow” period instead – when people leave work (17:00 to 19:00), or about 06:00, before they come in to hit the system and after the nightly processes have finished.
  3. Collect the reports it sends to you – DON’T DELETE THEM! Copy them to your system if you need to get them out of email. Remember, this is data collection, not data collect and toss!
  4. Analyze the reports periodically so you have a quantitative idea of what your system is doing – how often depends on your environment.
  5. When you have a performance problem later, reference the earlier reports to determine what changed.
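For the scheduling step, a crontab entry for an evening “slow” window might look like this (the script path is an assumption – point it at wherever you save the script below):

```shell
# Run the nmon collection script every day at 18:00.
# /usr/local/bin/nmon_collect.sh is an assumed path - adjust for your system.
0 18 * * * /usr/local/bin/nmon_collect.sh >/dev/null 2>&1
```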


export DATE=`date +%m%d%Y`
export CURRENT_DIR=`pwd`

export REPORT_RCPT=""
export COMPANY="MyCompany"
export REPORTS_TO_KEEP=30

# if the directory doesn't exist - create it
if [[ ! -d "/tmp/nmon" ]]; then
 mkdir /tmp/nmon
fi
cd /tmp/nmon

# Now let's get the files from yesterday and email them to where they need to be
export NMON_FILES=`ls -ctl | awk '{print $9}' | grep nmon | grep -v gz`
for i in $NMON_FILES; do
 export NEW_FILENAME=`echo $COMPANY ${i} | awk '{print $1 "_" $2}'`
 sort $i > $NEW_FILENAME
 tar cvf - $NEW_FILENAME | gzip -c > $NEW_FILENAME.tar.gz

 # we have the file we need - now let's email it to whoever needs it
 uuencode $NEW_FILENAME.tar.gz $NEW_FILENAME.tar.gz | mail -s "$COMPANY nmon report for $DATE" $REPORT_RCPT

 # Cleanup just a bit
 rm -f $i
done

find /tmp/nmon -type f -mtime +$REPORTS_TO_KEEP -exec rm -f {} \;

# Start NMON for the next day!
nmon -f -s 60 -c 1440

# Just in case you run it interactively - return to where you started
cd $CURRENT_DIR

PowerShell–Remove old log files

Have you ever had an application running on a server, everything completely happy, until you realized your drive space was slowly getting chewed up? You do a little investigation and realize that the application has been writing log files since creation and never does any cleanup. This may not be as huge a problem as it once was, now that we have 3 TB hard drives, however if the system has been in production for a few years and forgotten about, it can become an issue.

This isn’t completely limited to log files; some applications actually create files for you to use and leave them on the drive, assuming (correctly, in some cases) that you will deal with them. One application I support that does this is IBM’s Tivoli Storage Manager (TSM). This is data protection/backup/archive/management software, and when configured properly it creates disaster recovery plans on a daily basis. These plans are created and placed in a predetermined location for you to do whatever you need to get them offsite, so you can recover your systems and data in the event of a disaster.

I recently went into a customer’s location where this system had been running for many years. The server was sitting in the rack just as happy as could be, however they had data on the drives that went back… a long time… Unfortunately this included history files and disaster recovery plans from forever ago, which they didn’t need to keep beyond their useful life – in this case really only a few days. This is one of the scripts I wrote for them, using PowerShell, to clean out all the log files and DRM plans that are no longer needed.

This gets put in a file “C:\Scripts\RemoveExpiredLogs.ps1” that is scheduled to run daily.

# Author: Thomas Wimprine
# Creation Date: Dec 14, 2011
# FileName: RemoveExpiredLogs.ps1
# Description: Delete logfiles from a specified directory based on age

$RetainDays = 10 # Number of days you want to keep logs
$LogDirectory = "D:\LogDir\" # In this instance I need the trailing slash

$DeleteDate = [DateTime]::Today.AddDays(-$RetainDays)

foreach ($file in Get-ChildItem $LogDirectory) {
    if ($file.CreationTime -lt $DeleteDate) {
        Remove-Item $LogDirectory$file
    }
}

Simple & Effective – I find myself re-writing this more often than I thought I would so I hope you enjoy…
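On the UNIX side of the house the same cleanup is essentially a one-liner with find. A sketch, assuming the same ten-day retention; the log directory path is made up for illustration:

```shell
# RETAIN_DAYS and LOG_DIR are assumptions - point LOG_DIR at the
# directory the application actually writes to.
RETAIN_DAYS=10
LOG_DIR=/tmp/app_logs
mkdir -p "$LOG_DIR"

# Delete regular files older than RETAIN_DAYS days under LOG_DIR.
find "$LOG_DIR" -type f -mtime +"$RETAIN_DAYS" -exec rm -f {} \;
```

Like the PowerShell version, this only touches files, so subdirectories under the log directory are left alone.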

You need Watson in your business!!!

Honestly – I just want someone to need this so I can work on it. This is one of those projects that would be amazing to be involved in and I want to be involved!

Quoting from the YouTube comments:

“Watson is designed to further the science of natural language processing through advances in question and answer technology. This ‘first of its kind’ workload optimized system has applicability in your day-to-day business analytics challenges as well.
Solving these challenges requires many of the same architectural elements as Watson. Power Systems with POWER7 processor technology is uniquely positioned to deliver these capabilities.”

Blog Name Change

I hope there are a few people that subscribe to my blog feed. Unfortunately, for some of those people that subscribed for my technical Windows or PowerShell content, I may need to disappoint you.

I have recently changed employers, and I was the only person in the shop with AIX experience while they had an overabundance of Windows administrators. This led me to a decision that I’m hoping will pay off and not haunt me later: I’ve moved over to handle the IBM pSeries side of the shop. This includes working on anything the pSeries will support as an installed operating system, interface with, or be managed by.

While I’m not completely new to the various flavors of UNIX, this will certainly be a learning experience for me, and I will be posting technical content, tips & tricks, and other items as I learn them.

Another bad Win7 Mobile decision–Developer Tools

I have an idea for an application for my new phone. Nothing complicated but I want to try and build it and see if I can get it to work or not (more likely). I have a Windows 7 Phone so I’ll just download the tools from Microsoft and see what I can do.

I download the file and run it, expecting that there will be an installation and I’ll have some nifty tools installed that I can use to try to give my dreams life. Um… Nope – epic fail here…


Both my laptop and my desktop are running Windows Server 2008 R2 – why? I need to run virtual machines for development and/or a lab environment. This is a great idea since I can’t run an x64 guest on Windows 7, and I got the idea from MS personnel while at TechEd. This isn’t a foreign idea to anyone… Why would you not allow me to install developer tools here when I’m sure there are people inside Microsoft doing the exact same thing?!

The fix is simple enough, however why does there need to be a fix for something so simple? Seriously?

Here is the fix -

Windows Mobile 7 doesn’t connect to hidden wireless networks

I just purchased my Win7 Mobile phone. Honestly, I’ve been waiting for this device to come out for over a year and I was pretty stoked that it was finally released.

One of the cool things about modern mobile devices is that they can use other available networks instead of just eating up your 3G data plan. This isn’t an issue for me at home, on campus, or in most coffee shops, since the wireless is either open or I have the shared key. It is not the case at my place of employment, since it is our policy to keep our wireless networks from broadcasting their SSIDs. There are a few reasons for this, and security really isn’t on that list. We are right across the street from a few retail stores and a hospital, and some of our locations are in shopping malls – we see enough SSID broadcasts.

Microsoft does have a blog post about why NOT to hide your SSID, however their reasoning is based on security and ignores any other reasons for hiding an SSID.

This is so far the ONLY mobile device I’ve heard of or seen that cannot connect to a hidden network. I’m sure there are enterprise customers besides us that have a standard of hiding the SSID for very valid reasons beyond a security façade. And this isn’t just an inability to connect automatically – you can’t even type in an SSID and key, so I can’t pre-stage or configure a network if I needed to, say, send a device to one of my executives at a conference or remote meeting.

Bad choice and pulling the security card is another bad call…