Category Archives: Automation

Adding SQL Server Agent Jobs using Puppet

I find Puppet Enterprise to be very useful for configuring our many SQL Servers.  It does a nice job of setting up the SQL instance and doing some base configuration.  There were a few things I wanted that it didn’t do out of the box, though, that I thought I’d share.  One need I had was a set of specific SQL Agent jobs that I deploy to our servers, which I wanted Puppet to lay down for me.  I was able to build a pseudo-resource using the PE sqlserver forge module and some T-SQL.  I call it a pseudo-resource because it’s not a real resource in Puppet (with all the backing Ruby classes), but it behaves very much like one.

To do this, I needed the puppetlabs/sqlserver module and I had to create two files in my Puppet code repository.

NOTE: You must have Puppet Enterprise to use the puppetlabs/sqlserver module!

The first file I had to create was a T-SQL template that generates the code needed to add the SQL Agent job.  This template is not 100% fully-featured (a lot more variables could be added to flesh out all of the job options), but it’s a very solid start.  I named this file sql_add_job.epp and dropped it in my “templates” folder.  It looks like this:

<%- | String $name, String $description, String $notifyOperator = "", Array[Hash] $steps, Any $schedules = undef | -%>
BEGIN TRANSACTION
BEGIN TRY
  DECLARE @ReturnCode INT
  SELECT @ReturnCode = 0

  IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
  BEGIN
    EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]';
    IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
  END;

This isn’t the complete file (see my link below for the entire thing), but it gives you the idea. The template gets called by the .pp class file, which is below.

The second file is the actual Puppet class (.pp extension).  This is the file that implements the template and makes the whole thing “resource-like”.  This file belongs in your “manifests” folder in your repository or module:

class sqlserver::sql_agent_job()
{
  define sql_agent_job
  (
    String $sqlInstanceName,
    String $description,
    String $notifyOperator,
    Array[Hash] $steps,
    Any $schedules = undef
  )
  {
    sqlserver_tsql { "${title}_${sqlInstanceName}_sql_agent_job" :
      instance    => $sqlInstanceName,
      command     => epp("sqlserver/sql_add_job.epp", {
                        name            => $name,
                        description     => $description,
                        notifyOperator  => $notifyOperator,
                        steps           => $steps,
                        schedules       => $schedules
                      }),
      onlyif      => "IF NOT EXISTS ( SELECT * FROM msdb.dbo.sysjobs WHERE name = '${name}' ) BEGIN
                        THROW 51000, '${name} job not present.', 10;
                    END;"
    }
  }
}

Note:  You have to make sure the call to epp(…) above points to where your template actually lives.  The first segment of the epp() path is the module name, so epp("sqlserver/sql_add_job.epp") resolves to the templates/sql_add_job.epp file in the sqlserver module.  Your module structure should look roughly like this:

sqlserver/
     manifests/sql_agent_job.pp
     templates/sql_add_job.epp

This is the resource you will actually drop in your profile classes to add jobs to servers.  The input parameters are as follows:

#  PARAMETERS:
    # name                          => (namevar) Specifies the name of the agent job.   - https://msdn.microsoft.com/en-us/library/ms182079.aspx
    # sqlInstanceName               => Specifies the SQL Server instance.
    # description                   => Specifies the description on the job.
    # notifyOperator                => Specifies the name of the job operator to notify.
    # steps                         => An array of hashes specifying the job steps:
    #   name                          => String - The name of the job step
    #   command                       => String - The T-SQL to execute
    #   database                      => String - The name of the database to execute against if the subsystem is TSQL.
    #   onSuccess                     => Integer - 3(next)|2(quitfail)|1(quitsuccess)|4(gotostep), default is 1
    #   onFail                        => Integer - 3(next)|2(quitfail)|1(quitsuccess)|4(gotostep), default is 2
    #   onSuccessStepId               => Integer - The stepid to go to on success
    #   onFailStepId                  => Integer - The stepid to go to on failure
    #   subsystem                     => String - Specify either "TSQL" or "CmdExec".  Default is TSQL.
    #   outputFileName                => String - Specify the path to the file to write the output to.
    # schedules                     => (optional) A hash specifying a job schedule.     - https://msdn.microsoft.com/en-us/library/ms366342.aspx
    #   frequencyType                 => Integer - 1(once)|4(daily)|8(weekly)|16(monthly), default 4
    #   frequencyInterval             => Integer - (once) - not used | (daily) - every frequencyInterval days | (weekly) - frequencyInterval determines day of week | (monthly) - determines day of the month
    #   frequencySubdayType           => Integer - 1(attime)|4(minutes)|8(hours), default 1
    #   frequencySubdayInterval       => Integer - number of minutes/hours
    #   frequencyRecurrenceFactor     => Integer - Number of weeks/months between executions.  Nonzero value required if frequencyType is 8|16|32 (not used otherwise).  Default is 0.
    #   activeStartTime               => Integer - HHMMSS, default 000000
    #   activeEndTime                 => Integer - HHMMSS, default 235959

You’ll probably notice the parameter names and values are pretty much identical to the input parameters for sp_add_job, sp_add_jobstep and sp_add_jobschedule stored procedures. A trick I use when I want to take a job and add it to Puppet is to add the job to SQL Server first, set it up the way I want, then script the job out. The parameters in the T-SQL script will pretty much translate to the sql_agent_job resource.
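For instance, scripting out an existing job produces T-SQL like the following (abbreviated here for illustration); the arguments map almost one-to-one onto the resource parameters documented above:

EXEC msdb.dbo.sp_add_job @job_name=N'My SQL Agent Job', @description=N'...', @notify_email_operator_name=N'SQLTeam';
EXEC msdb.dbo.sp_add_jobstep @step_name=N'Execute Test Script', @subsystem=N'TSQL', @database_name=N'master', @command=N'SELECT ...', @on_success_action=1, @on_fail_action=2;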

Here is an example of a profile with the sql_agent_job resource in use:

profile::sqlserver::component::sql_agent_job::sql_agent_job { "${name}_my_agent_job":
      name                  => "My SQL Agent Job",
      sqlInstanceName       => $name,
      description           => 'This is my SQL Agent Job being deployed with Puppet.',
      notifyOperator        => 'SQLTeam',
      steps                 => [{
                                name      => 'Execute Test Script',
                                database  => 'master',
                                subsystem => 'TSQL',
                                command   => "SELECT 'test data here'",
                                onSuccess => 1,
                                onFail    => 2
                              }],
      schedules             => {
                                frequencyType           => 4,
                                frequencyInterval       => 1,
                                frequencySubdayType     => 4,
                                frequencySubdayInterval => 30,
                                activeStartTime         => 000000,
                                activeEndTime           => 235959
                              },
    }
The full versions of these files can be found in my GitHub repository here:
Enjoy!

Creating a Directory Tree in Puppet

As you can probably tell from the flurry of blog posts I’ve made concerning Puppet, I’m going through the process of learning and setting up Puppet Enterprise.

One thing that irked me early on is the inability of the file resource to create a directory if the parent directory does not exist.  For example:

file { 'mydirectory' :
  ensure         => 'directory',
  path           => 'c:/parentdir/childdir'
}

If c:\parentdir does not exist, this fails.

Error: Cannot create C:/parentdir/childdir; parent directory C:/parentdir does not exist
Error: /Stage[main]/Profile::Myclass/File[mydirectory]/ensure: change from absent to directory failed: Cannot create C:/parentdir/childdir; parent directory C:/parentdir does not exist

You can alternatively specify it like this to get it to work:

file { ['c:/parentdir', 'c:/parentdir/childdir'] :
  ensure         => 'directory'
}

This works, and for the most part is OK.  In my case though, I have the user provide the directory name through a class parameter:

class myclass (String $mydirectory)
{
  file { 'mydirectory' :
    ensure         => 'directory',
    path           => $mydirectory
  }
}

If the user specifies c:/parentdir/childdir, and c:/parentdir does not exist, it explodes.  I could adjust the code and advise my users to pass in arrays of strings representing the directories, but that’s not very clear or clean.

Fortunately, Puppet supports PowerShell and PowerShell is awesome:

class myclass (String $directory)
{
  exec { 'mydirectory' :
    command => "c:\\windows\\system32\\windowspowershell\\v1.0\\powershell.exe -noprofile -noninteractive -command \"New-Item -ItemType Directory -Path '$directory'\"",
    onlyif  => "c:\\windows\\system32\\windowspowershell\\v1.0\\powershell.exe -noprofile -noninteractive -command \"if (Test-Path -Path '$directory' -PathType Container) { exit 99 }\""
  }
}

This code block creates the entire directory tree without issue, since PowerShell’s New-Item -ItemType Directory creates any missing parent directories along the way.  The onlyif parameter ensures that the exec block is not fired off if the directory already exists (the exec only runs when the onlyif command exits with 0).
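For example, declaring the class with a nested path (hypothetical values) creates both levels on the first run and does nothing on subsequent runs:

class { 'myclass' :
  directory => 'c:/parentdir/childdir'
}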

Enjoy!

Puppet Agent on Windows – Module not found

The first step I take when developing a new Puppet configuration is to install the Puppet Agent on a standalone test Windows server and build the configuration files locally there.  I then use the puppet apply utility to test it and make sure it works.  This saves a lot of time since it avoids having to do hundreds of pushes and merge requests to our source control system as I tweak and debug the config files to get them working the way I want.

I had some challenges getting this set up initially, though.  I attempted to follow advice given to me by my Puppet SE, and researched and tried to implement Roles and Profiles as a means of developing layered configurations.  It makes sense to do it this way, especially as your configuration base grows, but it requires a bit of know-how to get working properly.  One of the major stumbling blocks I hit was getting Puppet to recognize classes located in a non-standard directory.  The normal, standard directory structure looks like this:

C:/ProgramData/PuppetLabs/code
    /modules                  # This is the default $basemodulepath
    /environments/production
        /manifests            # This is where it expects site.pp and any other code you write
        /modules              # Your downloaded and custom modules can also go here

In my case, I wanted to create a “site” directory in which I stored my role and profile configurations per the design above.  My structure looked like this:

c:/programdata/puppetlabs/code/environments/production
    /site
        /profile/manifests
        /role/manifests

Since this was not in the default $basemodulepath directory or the environment module directory, I’d receive an error stating the class could not be found:

[screenshot: “could not find class” error]

This is easy enough to figure out.  Puppet is highly configurable, and as such you can add additional directories to the list of those it looks in for classes it can use.  In my case, I simply edited the environment.conf file found at C:\ProgramData\PuppetLabs\code\environments\production\environment.conf  and commented-in the modulepath variable.  I then added my site folder.  I changed this line:

# modulepath = ./modules:$basemodulepath

To look like this:

modulepath = modules:site:$basemodulepath

However, I found I would still receive the same error as before.  A clue for me was when I ran the puppet config print modulepath command:

PS C:\ProgramData\PuppetLabs\code\environments\production\manifests> (puppet config print modulepath) -split ";"

You can see it lists the following paths:

C:/ProgramData/PuppetLabs/code/environments/production/modules
C:/ProgramData/PuppetLabs/code/modules
C:/opt/puppetlabs/puppet/modules

None of these were my site directory.  It’s as if the change I made to environment.conf was simply ignored.

Essentially, I found it was.  Even though the initial example shown in the environment.conf file looks like this (note the colon delimiter):

# modulepath = ./modules:$basemodulepath

I found the Windows agent uses semicolons, not colons, as the delimiter for multiple paths.  This is kind of documented here:

Path Separator

Make sure to use a semi-colon (;) as the path separator on Windows, e.g., modulepath=path1;path2

Plain enough, but this document does not reference the environment.conf file specifically, or even the Puppet Agent (this seems to be just a general Windows thing).  Also, the Puppet Agent installer lays down the environment.conf file with the colons in place, so it’s very misleading.

In any case, I found that if I changed the file to look like this, everything worked:

modulepath = modules;site;$basemodulepath

Running puppet config print modulepath confirmed my site path now shows up:

C:/ProgramData/PuppetLabs/code/environments/production/modules
C:/ProgramData/PuppetLabs/code/environments/production/site
C:/ProgramData/PuppetLabs/code/modules
C:/opt/puppetlabs/puppet/modules

So, in summary: if you are using any non-standard paths for your modules or classes on a Windows machine, make sure to use semicolons, rather than the default colons, to delimit multiple paths in the modulepath setting.

Confusing, but fortunately easy to fix.

hiera-eyaml and Puppet Enterprise – Command not found?

I’m in the process of evaluating Puppet Enterprise as a configuration management solution for my company.  A glaring issue I hit early on was figuring out how to secure credentials that are fed to the various Puppet configurations.  By default, there is no way I’m aware of to obfuscate credentials in the configuration areas (including hiera files and class parameters in the GUI).  This is a problem, as I can’t expose certain credentials to everyone who can see that data.

Fortunately, hiera-eyaml was easy to find and does the trick.  There’s a lot of good documentation out there on how to set this up, and I won’t belabor that point, but to a Puppet noob the documentation makes a lot of assumptions.  The main assumption I want to clear up is how to get the eyaml utility up and running from the CLI on your Puppet master server.

The GitHub document appears easy to follow:

https://github.com/TomPoulton/hiera-eyaml

The first step makes perfect sense, and worked without issue:

puppetserver gem install hiera-eyaml

The problem came after this.  I could not call the eyaml executable.  If I typed “eyaml --help”, “eyaml encrypt”, or any valid variation of the command, I received an “eyaml: command not found” error.

Long story short, the issue is that the Puppet master server does not have the Ruby interpreter set up by default for command-line use.  The command above does make hiera-eyaml available for the Puppet software’s use, and you can go about configuring and using it as stated in the GitHub readme for Puppet, but eyaml calls will not work for you on the CLI.  The assumption the docs make is that you know to install the Ruby interpreter and the gem separately for CLI usage.  To do this, run the following from the Puppet master (or any Linux station):

apt-get install ruby
gem install hiera-eyaml

Now the Ruby interpreter is available to you on the CLI, and you can call the eyaml executable as noted in the GitHub article.
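For example (commands as documented in the hiera-eyaml readme; the encrypted output is elided here):

eyaml createkeys                            # generate the default keypair
eyaml encrypt -s 'mysupersecretpassword'    # encrypt a string value
eyaml decrypt -s 'ENC[PKCS7,...]'           # decrypt a previously encrypted value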

I’m sure this is obvious to a Ruby/Linux expert, but it took me about 3/4 of a day to figure this out, so hopefully this helps save someone some time down the road.

Powershell Module for Logging to a File

Despite our best efforts as coders, automated processes sometimes fail.  One of the principal ways to troubleshoot such processes is to log data to a file so you can follow what happened after-the-fact.  I have a TON of scripts that have to do this, so it made sense to cobble together some functions that make doing this easier.

To this end, I’ve written a script module called bsti.logging.  It features the following functions:

[screenshot: list of the bsti.logging functions]

Once you import the module, you call New-LogFile to set up a new file to write to.  You can specify options to append the weekday or a more specific timestamp (e.g. MyLog_mmddyyyyhhmmss.log) to the log file name.  For timestamped log files, you can also set up retention so old log files get automatically deleted after a period of time or after so many accumulate.

[screenshot: New-LogFile help]

The module handles log file naming in three basic ways, and you need to be clear on them to get good use out of the module:

1) Standard – The log file path you pass in is used unchanged by the function.  The purging parameters are ignored, and you must use -Append or the file will be overwritten if it already exists.

2) Circular – The log file will have _weekday appended to the file name before the extension.  If you pass in C:\temp\log\MyLogFile.log for example, you get:
MyLogFile_Monday.log
MyLogFile_Tuesday.log
etc…
When you call New-LogFile with circular naming and the *same* log file path again that same day, it will automatically be appended to.  When the next Monday rolls by, it *automatically* overwrites it.

This scheme is good if you call the script that writes to the log file frequently, don’t want to manage a large number of log files, and don’t need more than 7 days of log file history.

3) DateStamped – This appends a detailed datestamp to the log file name before the extension.  In the example above, you get:
MyLogFile_03292015200500.log (assuming 3/29/2015 8:05:00 PM)
This means every time you call New-LogFile (assuming you wait at least 1 second!), you get a unique log file.  Append is essentially ignored.
The PurgeAfter and KeepNumberOfFiles parameters, if specified, will cause the New-LogFile function to call Remove-LogFiles automatically and keep your log files trimmed as you specify.  If you specify both PurgeAfter AND KeepNumberOfFiles, both thresholds are observed (a file is only deleted if it is older than PurgeAfter AND at least KeepNumberOfFiles files remain afterward).

This scheme is good if you need a specified history of log files and want individual log files for each run of your process.  The automatic cleanup is a bonus.
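As a hypothetical example (the naming-scheme parameter name here is my guess; check Get-Help New-LogFile -Full for the exact syntax), a date-stamped log that keeps at most 14 days and 20 files of history might look like this:

Import-Module bsti.logging
New-LogFile -Path 'C:\temp\log\MyLogFile.log' -NamingScheme DateStamped -PurgeAfter '14d' -KeepNumberOfFiles 20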

Once you’ve set up your new log file, you call the following functions to write to it.  These functions also echo to the console, so you can replace any calls to Write-Host with these functions and get messages both to your console and to a log file:

Write-Message
Write-Object
Write-Banner
Write-BlankLines

[screenshot: the write functions in action]
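For instance (illustrative calls only; see each function’s help for the exact parameters):

Write-Banner 'Nightly Cleanup'
Write-Message 'Deleting temp files...'
Write-Object (Get-ChildItem C:\temp)
Write-BlankLines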

Makes things pretty simple.

As with all of my modules and module functions, I heavily document them using Powershell comment-based help.  Just try:

Get-Help New-LogFile -Full

This module depends on my bsti.conversion module, so you’ll need both.  I posted about that module here.

Here is a link to the new bsti.logging module.
Here is a link to the bsti.conversion module.

TimeSpan Conversion Function and Module

I have a ton of Powershell code I’ve written over the last 6 years or so that I’m in the process of cleaning up and looking to share.  I plan on re-presenting them as a set of scripts and script modules I’ll be featuring in a series of blog posts coming in the near future.

I’ll start simply.  I have a new script module called bsti.conversion that has a single function:  ConvertTo-TimeSpan

I always thought it was neat that you could type in 1kb and Powershell would convert that value to 1024.  You can also use mb (megabytes), tb (terabytes), and pb (petabytes).  I don’t see that eb (exabytes) works.  In any case, I always wished I could do the same with time units like this:

12m (minutes) or 24h (hours) or 7d (days)

The ConvertTo-TimeSpan function I’ve included in this module does just that.

I use this in functions and scripts that require the user to pass in an easy-to-read time unit value.

This functionality can also be achieved by casting to a TimeSpan, like so:

[Timespan]("1.00:00:00")  # 1 day
[Timespan]("1.6:00:00")   # 1 day, 6 hours
[Timespan]("1:22:00")     # 1 hour, 22 minutes

The conversion function is a little less precise, but a bit more human-readable, which is important to me since most of my users are not .NET programmers and don’t understand the format of a timespan object right offhand.

The function in this module supports both formats:

[screenshot: ConvertTo-TimeSpan examples]
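For instance (illustrative calls based on the unit formats described above):

Import-Module bsti.conversion
ConvertTo-TimeSpan "12m"        # 12 minutes
ConvertTo-TimeSpan "24h"        # 1 day
ConvertTo-TimeSpan "7d"         # 7 days
ConvertTo-TimeSpan "1.6:00:00"  # standard TimeSpan notation also works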

The module files can be downloaded here.
Once downloaded, extract the folder to the C:\windows\system32\windowspowershell\v1.0\modules directory.  The final structure should look like this:
C:\windows\system32\windowspowershell\v1.0\modules\bsti.conversion

Then just use the Import-Module bsti.conversion command as shown above.

Not bad for a start, hope you enjoy.

UPDATE:  I’m adding my stuff to GitHub.  Bear with me while I come up to speed on how to use it.  Find this module project here:

https://github.com/Roadkill4545/bsti.Powershell/tree/master/bsti.conversion

Automatically Transcripting all Powershell Sessions

If you love Windows Powershell as much as I do, you probably find yourself using it to complete day-to-day management tasks in addition to scripting and automation.

More and more hardware vendors, like NetApp and VMware, provide very robust Powershell toolsets.  Because I’m such a command line guy, and these Powershell libraries are so powerful, I perform nearly all of my management tasks from Powershell.  I find tasks like provisioning and destroying storage, creating clones, managing snapshots, and getting data from virtual machines much easier via the command line in many cases.

Because this is a day-to-day thing for me, it’s useful to have all of the code I type in and the output I get back automatically logged to a file for future perusal.  I liken it to the administrators who manage a lot via SSH, and set up Putty to log all sessions to a file.  This allows me to look at past actions, remember how I did stuff, or see what I did wrong if something got messed up.  It can also be helpful for auditing if you need to track who did what using Powershell.

Below is a procedure I use on all management stations I use Powershell from.  This procedure automatically logs all activity from the command line to a file.

By default, it creates a new transcript file in the C:\users\myusername\LogFiles\Powershell\computername directory.

You can override this if you want to go to a central location, by calling the Set-TranscriptFilePath function.

For instance, to transcript everything to a central file share:

Set-TranscriptFilePath -path "\\myserver\LogFiles\Powershell\server1"

It creates one new transcript file per session you launch, so you won’t have multiple sessions writing over the same file.

To set this up, create a new file called profile.ps1 in one of the following directories:

Apply to just the current user:
C:\users\myusername\documents\WindowsPowershell\profile.ps1

Or to apply to all users on the computer:
c:\windows\system32\WindowsPowershell\v1.0\profile.ps1

Copy the following script text to the profile script you created:

<# 
  .SYNOPSIS 
  Windows Powershell console profile script. This script is generic enough to be run from any machine. It sets up console logging to a network share for servers and locally for workstations. 
  NOTE: This will not transcript in Powershell ISE! Transcripting in ISE is supported in the current (early) version of Windows Management Framework 5.0 however. 
#>

$script:TranscriptFileKey = "HKLM:\SOFTWARE\PowershellManagement\Powershell"

#############################################################################################################################################
# FUNCTIONS
#############################################################################################################################################

function Get-TranscriptFilePath()
{
  <# 
    .SYNOPSIS 
    This function returns the location where Powershell session transcript log files go. 
  #>

  [CmdletBinding()]
  param
  (
  )

  $transcriptFilePath = ""

  # User can override log path in registry:
  if ( Test-Path -Path $script:TranscriptFileKey )
  {
    $regKey = Get-Item -Path $script:TranscriptFileKey
    if ( $regKey.Property -icontains "RootTranscriptPath" )
    {
      $transcriptFilePath = (Get-ItemProperty -Path $script:TranscriptFileKey -Name "RootTranscriptPath").RootTranscriptPath
    }
  }

  if ( !$transcriptFilePath )
  {
    # No override found in the registry; fall back to the default local path:
    $transcriptFilePath = Join-Path $Env:USERPROFILE -ChildPath "LogFiles\Powershell\$($Env:ComputerName)"
  }

  # Create the log file path if it does not exist:
  if ( !(Test-Path -Path $transcriptFilePath) )
  {
    New-Item -ItemType directory -Path $transcriptFilePath -Force | Out-Null
  }

  $transcriptFilePath
}

function New-TranscriptFilePath()
{
  Join-Path -Path (Get-TranscriptFilePath) -ChildPath ("Powershell {0:MM dd yyyy hh mm ss} {1:00000}.log" -f (Get-Date),$PID)
}

function Test-Administrator
{
  $user = [Security.Principal.WindowsIdentity]::GetCurrent()
  (New-Object Security.Principal.WindowsPrincipal $user).IsInRole([Security.Principal.WindowsBuiltinRole]::Administrator)
}

function Set-TranscriptFilePath()
{
  <# 
    .SYNOPSIS 
    This function sets the transcript file path from the default. This affects all future Powershell sessions. 
    
    .PARAMETER Path 
    Specifies the new path where future transcript files get saved to. This will remain the path until it is changed. Set this to "" to reset to the default path: C:\users\username\LogFiles\Powershell\computername 
  #>

  [CmdletBinding(SupportsShouldProcess=$true)]
  param
  (
    [string] $Path
  )

  if ( !(Test-Administrator) )
  {
    throw ("You must launch this console as an administrator to execute this function!")
  }

  if ( !(Test-Path -Path $script:TranscriptFileKey) )
  {
    # Create the registry key that stores the override if it does not already exist:
    New-Item -Path $script:TranscriptFileKey -Force -ErrorAction Stop | Out-Null
  }

  Set-ItemProperty -Path $script:TranscriptFileKey -Name RootTranscriptPath -Value $Path -ErrorAction Stop | Out-Null

  if ( !$Path )
  {
    $Path = Join-Path $Env:USERPROFILE -ChildPath "LogFiles\Powershell\$($Env:ComputerName)"
  }

  Write-Host ("Future transcript files will be saved to the following directory: $Path") -ForegroundColor Green
}

Once complete, every new Powershell console session you launch will be automatically transcripted!

Note: As of Powershell 4.0, you can’t transcript from the Powershell session that gets launched from Powershell ISE.  This is due to a difference in the console host.  However, I have noticed it works OK in the preview of Windows Management Framework 5.0, so I suspect support for this is coming soon.

I’ve uploaded the script here if you’d rather not cut-and-paste.