Tag: Powershell

Automate SQL Express Instance removal from SCOM with PowerShell – Ruben Zimmermann

Ruben is back again with another PowerShell banger! This time he presents a PowerShell script that automatically detects and removes SQL Express instances from SCOM monitoring, saving you the unnecessary overhead of removing them manually!

Introduction

SQL Express databases are widely used to store application settings or small amounts of data. Apart from backups, in most cases those databases do not need to be managed.

The MS SQL Server Management Pack for SCOM discovers any edition. Thus, we can spot Express databases from the SCOM Console.

Unfortunately, the Management Pack can’t monitor Express databases and lots of unfixable alerts are thrown.

Manual solution

As described by Tim McFadden here https://www.scom2k7.com/disabling-sql-express-instance-discoveries/

or by  Kevin Holman here

https://kevinholman.com/2010/02/13/stop-monitoring-sql-express-and-windows-internal-database/

it is possible to set filter strings that prevent the discovery of all Express instances by name.

This does not work, however, if the Express instance is named MSSQLSERVER, because MSSQLSERVER is also the default instance name for SQL Standard and other editions.

The only remaining choice then is to override object by object manually. Or is it?

PowerShell solution

With a bit of PowerShell it is possible to override the discovery rules for Express editions no matter which name they have. – Put this script into your regular maintenance scripts to keep your SCOM free from Express instances:

# Defining Override Management Pack. - It needs to be created before starting.
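# A hedged sketch (assumption: SDK types/constructor as used in common community scripts;
# the MP name below is just the example used in this post) to create the override MP via
# PowerShell in case it does not exist yet:
if (-not (Get-SCOMManagementPack -DisplayName 'Custom.SQL.Server.Express.Removals' -ErrorAction SilentlyContinue)) {
    $mg      = Get-SCOMManagementGroup
    $mpStore = New-Object -TypeName Microsoft.EnterpriseManagement.Configuration.IO.ManagementPackFileStore
    $newMp   = New-Object -TypeName Microsoft.EnterpriseManagement.Configuration.ManagementPack -ArgumentList 'Custom.SQL.Server.Express.Removals', 'Custom.SQL.Server.Express.Removals', ([Version]'1.0.0.0'), $mpStore
    $mg.ImportManagementPack($newMp)
}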

$overrideMP = Get-SCOMManagementPack -DisplayName 'Custom.SQL.Server.Express.Removals'

# Get all Windows Servers Instances, needed as lookup to disable the specific computer object

$winSrvClass     = Get-SCOMClass -Name Microsoft.Windows.Server.Computer

$winSrvInstances = Get-SCOMClassInstance -Class $winSrvClass

# Get Express instances For SQL 2005 to SQL 2012

$classSQL2to8     = Get-SCOMClass -Name Microsoft.SQLServer.DBEngine

$instancesSQL2to8 = Get-SCOMClassInstance -Class $classSQL2to8

$expressSQL2to8   = $instancesSQL2to8 | Where-Object {$_.'[Microsoft.SQLServer.DBEngine].Edition'.Value  -eq 'Express Edition' }

$computersSQL2to8 = $expressSQL2to8.path

# Finding the computer objects which host SQL 2005 to SQL2012, store them in ArrayList

$targetComputers = New-Object -TypeName System.Collections.ArrayList

$computersSQL2to8 | ForEach-Object {

    $tmp   = ''

    $check = $false

    $check = $winSrvInstances.DisplayName.Contains($_)   

    if ($check) {

        $number = $winSrvInstances.DisplayName.IndexOf($_)

        $tmp    = $winSrvInstances[$number]

        if ($tmp -ne '') {       

            $targetComputers.Add($tmp)

        }   

    }

} #end $computersSQL2to8 | ForEach-Object { }

# Disabling the Discovery Rules for those computers which host SQL 2005 to SQL 2012

$discoveryRuleList = @(

  'Discover SQL Server 2005 Database Engines (Windows Server)'

  'Discover SQL Server 2008 Database Engines (Windows Server)'

 'Discover SQL Server 2012 Database Engines (Windows Server)'

)

foreach ($discoveryRuleItem in $discoveryRuleList) {

$discoveryRule = Get-SCOMDiscovery -DisplayName $discoveryRuleItem

$targetComputers | ForEach-Object {

Disable-SCOMDiscovery -Instance $_ -Discovery $discoveryRule -ManagementPack $overrideMP

    }

}

#Removing the objects from SCOM. - This can take some Time!

Remove-SCOMDisabledClassInstance

# Get Express instances for SQL 2014

$classSQL2014     = Get-SCOMClass -Name 'Microsoft.SQLServer.2014.DBEngine'

$instancesSQL2014 = Get-SCOMClassInstance -Class $classSQL2014

$expressSQL2014   = $instancesSQL2014 | Where-Object {$_.'[Microsoft.SQLServer.2014.DBEngine].Edition'.Value  -eq 'Express Edition' }

$computersSQL2014 = $expressSQL2014.path

# Finding the computer objects which host SQL 2014, store them in ArrayList

$targetComputers   = New-Object -TypeName System.Collections.ArrayList

$computersSQL2014 | ForEach-Object {

    $tmp   = ''

    $check = $false

    $check = $winSrvInstances.DisplayName.Contains($_)

    if ($check) {

        $number = $winSrvInstances.DisplayName.IndexOf($_)

        $tmp    = $winSrvInstances[$number]

        if ($tmp -ne '') {       

            $targetComputers.Add($tmp)

        }   

    }

} #end $computersSQL2014 | ForEach-Object { }

# Disabling the Discovery Rule for those computers which host SQL 2014

$discoveryRule = Get-SCOMDiscovery -DisplayName 'MSSQL 2014: Discover SQL Server 2014 Database Engines'

$targetComputers | ForEach-Object {

Disable-SCOMDiscovery -Instance $_ -Discovery $discoveryRule -ManagementPack $overrideMP

}

#Removing the objects from SCOM. - This can take some Time!

Remove-SCOMDisabledClassInstance

# Get Express instances for SQL 2016

$classSQL2016     = Get-SCOMClass -Name 'Microsoft.SQLServer.2016.DBEngine'

$instancesSQL2016 = Get-SCOMClassInstance -Class $classSQL2016

$expressSQL2016   = $instancesSQL2016 | Where-Object {$_.'[Microsoft.SQLServer.2016.DBEngine].Edition'.Value  -eq 'Express Edition' }

$computersSQL2016 = $expressSQL2016.Path

# Finding the computer objects which host SQL 2016, store them in ArrayList

$targetComputers   = New-Object -TypeName System.Collections.ArrayList

$computersSQL2016 | ForEach-Object {

    $tmp = ''

    $check = $false    

    $check = $winSrvInstances.DisplayName.Contains($_)

    if ($check) {

        $number = $winSrvInstances.DisplayName.IndexOf($_)

        $tmp = $winSrvInstances[$number]

        if ($tmp -ne '') {       

            $targetComputers.Add($tmp)

        }

    }

} #end $computersSQL2016 | ForEach-Object { }

# Disabling the Discovery Rule for those computers which host SQL 2016

$discoveryRule = Get-SCOMDiscovery -DisplayName 'MSSQL 2016: Discover SQL Server 2016 Database Engines'

$targetComputers | ForEach-Object {

    Disable-SCOMDiscovery -Instance $_ -Discovery $discoveryRule -ManagementPack $overrideMP

}

#Removing the objects from SCOM. - This can take some Time!

Remove-SCOMDisabledClassInstance

Feedback is appreciated 😊

Warm regards

Ruben

Export All SCOM Subscriptions into a Text File

A few days ago I needed to export all my SCOM subscriptions and be able to analyze them thoroughly. I searched online for a script or solution but didn’t find anything that matched my exact requirement, so I decided to write one myself!

I wrote a quick version of the script to get the job done, and indeed it was an investment of time well made! The script really spells out everything and it is very easy to analyze it visually and understand the relationships between the subscriptions, the channels as well as the subscribers. Sample output:

Bob Cornelissen was gracious enough to publish the blog on his website as a guest post; read it here:

Export SCOM Subscriptions using Powershell
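If you only need a quick ad-hoc overview rather than the full report, a minimal sketch along the same lines (the property names on the SDK objects are assumptions, so verify them in your environment) could look like this:

Get-SCOMNotificationSubscription | ForEach-Object {
    [PSCustomObject]@{
        Subscription = $_.DisplayName
        Enabled      = $_.Enabled
        Subscribers  = ($_.ToRecipients | ForEach-Object { $_.Name }) -join '; '   # subscribers of this subscription
        Channels     = ($_.Actions      | ForEach-Object { $_.Name }) -join '; '   # channels (actions) used
    }
} | Format-List | Out-File -FilePath 'C:\Temp\SCOM_Subscriptions.txt'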

Cheers!

SCOM 2012 R2 to 1801 Side-by-Side Migration : The Powershell Way! – By Ruben Zimmermann

Ruben is back with another awesome blog post, and I have no doubt this one is going to help a lot of people!

We all know the hassles of migrating your SCOM from one version to another; it involves a lot of manual effort, especially when you choose the side-by-side upgrade path. Making sure you have all the Management Packs re-imported, all overrides intact, all the permissions you’ve assigned to various users over the years still in place, and so on, just to mention a few examples. We’ve all wished how great it would be if we could just run some kind of script and it would do it all for us. Well, this is exactly what Ruben has made available for us! Here, take a look:

Preface

Migrating from SCOM 2012 R2 to SCOM 1801 isn’t a stressful job as both environments can live in parallel.

This blog gives examples of how to migrate the configuration from A to B, leveraging PowerShell whenever possible / reasonable.

Please apply the PowerShell code only if you feel comfortable with it. If there are questions, please don’t hesitate to drop me a line.

Introduction

Although it is possible to let the agent do duplicate work (executing code in management packs and sending data to two management groups), this can cause excessive resource consumption on the monitored computer.

I suggest:

  • Migrate configuration to 1801 from 2012 R2 (blog topic)
  • Test with a very small number of machines of different types (Application Servers, Web Servers, MSSQL, Exchange, Domain Controllers)
  • Move the remaining majority to 1801
  • Change connectors to ticket systems, alerting devices and other peripheral systems
  • Remove the agent connections from 2012 R2 (see below for details)
  • Monitor the new environment for a while; if all is fine, decommission 2012 R2

Requirements

  • 1801 is set up fully functional and the first agents have been deployed successfully.
  • Windows PowerShell 5.1 is required on the 2012 R2 and on the 1801 server.
  • Service accounts used in 2012 R2 will be re-used. – If they are different in 1801, it is no big obstacle to change them should they be overwritten by one of the import steps.
    • Usually SCOM will alert if there is a misconfiguration, such as a lack of permissions somewhere.

Migration

Below are the steps, separated into the different parts. – No guarantee of completeness or correctness.

Management Packs

Migrating to a new environment is always a great chance to perform cleanup and apply the lessons you’ve learned in the old one.

Review

Export all Management Packs from the current SCOM environment and review.

Get-SCOMManagementPack | Select-Object -Property Name, DisplayName, TimeCreated, LastModified, Version | Export-Csv -Path C:\temp\CurrentMps.csv -NoTypeInformation

For your convenience use Excel to open the CSV file. Import only those Management Packs which bring measurable benefit.
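Once you have decided which Management Packs to keep, a minimal sketch for importing them on the new server (the folder path is just an example) could be:

$mpFiles = Get-ChildItem -Path 'C:\Temp\MPsToImport' -Include *.mp, *.mpb -Recurse
Import-SCOMManagementPack -Fullname $mpFiles.FullName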

Overrides

Standard Approach

If you have followed best practices you have created Override Management Packs for each Management Pack to store your customizations. Export those and import them into your new environment.
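A small, hedged sketch for the export (it assumes your override MPs are unsealed and contain ‘Override’ in their display name – adjust the filter to your own naming convention):

New-Item -Path 'C:\Temp\OverrideMPs' -ItemType Directory -Force | Out-Null
Get-SCOMManagementPack | Where-Object { $_.Sealed -eq $false -and $_.DisplayName -like '*Override*' } |
    Export-SCOMManagementPack -Path 'C:\Temp\OverrideMPs'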

Note: Verify that the overrides work as expected.

Green field approach

In case you have stored all overrides in only one Management Pack, or want to be more selective, follow these steps:

  1. Create a new Override Management Pack for that specific MP and name it properly.
    1. E.g. <YourCompanyName>.Windows.Server.Overrides

    2. Follow the steps mentioned in ‘Management Pack tuning’.
      https://anaops.com/2018/06/22/guest-blog-management-pack-tuning-ruben-zimmermann/

Notification Settings (Channels, Subscribers and Subscriptions)

Quoted from: http://realscom.blogspot.de/2017/05/migrate-notifications-to-another.html

  • Export the Microsoft.SystemCenter.Notifications.Internal mp in the old and new management groups – do not modify these, they are our backup copies
  • Make a copy of both files and rename them to something meaningful like Microsoft.SystemCenter.Notifications.Internal.ManagementGroupName_backup.xml
  • Make a note of the MP version number of the Microsoft.SystemCenter.Notifications.Internal MP from the new management group. In my case it was 7.2.11719.0
  • Open up the Microsoft.SystemCenter.Notifications.Internal.xml file for the old management group and change the version number to that of the new management group MP version + 1. In my case I changed it from 7.0.9538.0 to 7.2.11719.1. This is so the MP imports properly in the new management group
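The export step of the quoted procedure can also be done with PowerShell, for example (a sketch; run it in both management groups):

Get-SCOMManagementPack -Name 'Microsoft.SystemCenter.Notifications.Internal' |
    Export-SCOMManagementPack -Path 'C:\Temp'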

Exporting Configuration with PowerShell

Open a PowerShell console on the 2012 R2 Management Server and use the following cmdlets to store the information as JSON files (make sure the output folder, e.g. C:\Temp\SCOMExport\, exists first).

Alert Resolution State

Only user defined resolution states are exported.

Get-SCOMAlertResolutionState | Where-Object {$_.IsSystem -eq $false} | Select-Object Name, ResolutionState | ConvertTo-Json | Out-File C:\Temp\ResolutionStates.json
Alert Auto-Resolution Settings

Limited to the properties AlertAutoResolveDays and HealthyAlertAutoResolveDays

Get-SCOMAlertResolutionSetting | Select-Object AlertAutoResolveDays, HealthyAlertAutoResolveDays | ConvertTo-Json | Out-File C:\Temp\SCOMExport\AlertResolutionSetting.json
Database Grooming Settings (for the Data Warehouse see the dwdatarp.exe section below)

Exports data retention settings for information in the OperationsManager database. The DataWareHouse database is covered later.

Get-SCOMDatabaseGroomingSetting | Select-Object AlertDaysToKeep, AvailabilityHistoryDaysToKeep,EventDaysToKeep,JobStatusDaysToKeep,MaintenanceModeHistoryDaysToKeep,MonitoringJobDaysToKeep,PerformanceDataDaysToKeep,PerformanceSignatureDaysToKeep,StateChangeEventDaysToKeep | ConvertTo-Json | Out-file C:\Temp\SCOMExport\DatabaseGrooming.json
User Roles

Exporting User roles and System roles into dedicated files.

Get-SCOMUserRole | Where-object {$_.IsSystem -eq $false } | Select-Object -Property Name, ProfileDisplayName, Users | ConvertTo-Json | Out-File C:\Temp\SCOMExport\UserRoles.json
Get-SCOMUserRole | Where-object {$_.IsSystem -eq $true }  | Select-Object -Property Name, ProfileDisplayName, Users | ConvertTo-Json | Out-File C:\Temp\SCOMExport\SystemUserRoles.json
Run As Accounts

Exporting user defined RunAsAccounts. System defined ones are created by Management Packs.

Get-SCOMRunAsAccount | Where-Object {$_.isSystem -eq $false} | Select-Object -Property AccountType, UserName, SecureStorageId, Name, Description, SecureDataType | ConvertTo-Json | Out-File C:\Temp\SCOMExport\RunAsAccounts.json
<# The cmdlet 'Get-SCOMRunAsAccount' can't extract credential information. A free cmdlet, written by Stefan Roth is required for that. – Import it to your 2012 R2 environment before proceeding with the following lines.

https://www.powershellgallery.com/packages/RunAsAccount/1.0 #>
Get-SCOMRunAsAccount | Select-Object -ExpandProperty Name | ForEach-Object {

    try {  

        $theName  = $_

        $theTmp  = Get-RunAsCredential -Name $theName

        $thePass = ($theTmp.Password).ToString()

        $thePass




        $secPass  = ConvertTo-SecureString $thePass -AsPlainText -Force

        $theCreds = New-Object -TypeName System.Management.Automation.PSCredential($theName,$secPass)




        $theCreds | Export-Clixml -Path "C:\temp\SCOMExport\$theName.cred"

        $fileName =  "C:\temp\SCOMExport\" + $theName + ".txt"

        "$($thePass)" | Out-File -FilePath $fileName

    } catch {

        $info = 'Swallowing exception'

    }

}

Importing Configuration with PowerShell

Copy the JSON files to the 1801 Management Server (C:\Temp\SCOM-Scripts)  and import using the following code:

Alert Resolution State
$jsonFileContent = Get-Content -Path C:\Temp\SCOM-Scripts\ResolutionStates.json | ConvertFrom-Json



foreach ($aState in $jsonFileContent) { 

Add-SCOMAlertResolutionState -Name $aState.Name -ResolutionStateCode $aState.ResolutionState

}


Alert Resolution Setting
$jsonFileContent = Get-Content -Path C:\Temp\SCOM-Scripts\AlertResolutionSetting.json | ConvertFrom-Json

Set-SCOMAlertResolutionSetting -AlertAutoResolveDays $jsonFileContent.AlertAutoResolveDays -HealthyAlertAutoResolveDays $jsonFileContent.HealthyAlertAutoResolveDays

Database Grooming Settings
$jsonFileContent = Get-Content -Path C:\Temp\SCOM-Scripts\DatabaseGrooming.json | ConvertFrom-Json

Set-SCOMDatabaseGroomingSetting -AlertDaysToKeep $jsonFileContent.AlertDaysToKeep -AvailabilityHistoryDaysToKeep $jsonFileContent.AvailabilityHistoryDaysToKeep -EventDaysToKeep $jsonFileContent.EventDaysToKeep

Set-SCOMDatabaseGroomingSetting -JobStatusDaysToKeep $jsonFileContent.JobStatusDaysToKeep -MaintenanceModeHistoryDaysToKeep $jsonFileContent.MaintenanceModeHistoryDaysToKeep -MonitoringJobDaysToKeep $jsonFileContent.MonitoringJobDaysToKeep

Set-SCOMDatabaseGroomingSetting -PerformanceDataDaysToKeep $jsonFileContent.PerformanceDataDaysToKeep -PerformanceSignatureDaysToKeep $jsonFileContent.PerformanceSignatureDaysToKeep -StateChangeEventDaysToKeep $jsonFileContent.StateChangeEventDaysToKeep        

User Roles
# Importing User roles and System roles from dedicated files. – Review them first.

$jsonFileContent = Get-Content -Path C:\Temp\SCOM-Scripts\UserRoles.json | ConvertFrom-Json

foreach ($uRole in $jsonFileContent) {

    switch ($uRole.ProfileDisplayName) {       

        'Advanced Operator' {

            Add-SCOMUserRole -Name $uRole.Name -DisplayName $uRole.Name -Users $uRole.Users -AdvancedOperator           

        }       

        'Author' {

            Add-SCOMUserRole -Name $uRole.Name -DisplayName $uRole.Name -Users $uRole.Users -Author

        }

        'Operator' {

            Add-SCOMUserRole -Name $uRole.Name -DisplayName $uRole.Name -Users $uRole.Users -Operator

        }

        'Read-Only Operator' {

            Add-SCOMUserRole -Name $uRole.Name -DisplayName $uRole.Name -Users $uRole.Users -ReadOnlyOperator

        }       

    }       

}




$jsonFileContent = Get-Content -Path C:\Temp\SCOM-Scripts\SystemUserRoles.json | ConvertFrom-Json

foreach ($sRole in $jsonFileContent) {

    if ($sRole.Users) {   

        Get-SCOMUserRole | Where-Object {$_.Name -eq $sRole.Name} | Set-SCOMUserRole -User $sRole.Users

    } else {

        continue

    }    

}
Run As Accounts
# Importing Run As Accounts from the dedicated file. – Review it first.

$jsonFileContent = Get-Content -Path C:\Temp\SCOM-Scripts\RunAsAccounts.json | ConvertFrom-Json



foreach($rAccount in $jsonFileContent) {




    $theName  = $rAccount.Name   

    if ($theName -match 'Data Warehouse') {

        write-warning "Skipping default account $theName"

        continue

    }

    switch ($rAccount.AccountType) {

        'SCOMCommunityStringSecureData' {                       

            $credFile =  Get-Content -Path "C:\Temp\SCOM-Scripts\creds\$theName.txt"

            $secPass  = ConvertTo-SecureString $credFile -AsPlainText -Force

            Add-SCOMRunAsAccount -CommunityString -Name $theName -Description $rAccount.Description -String $secPass

        }

        'SCXMonitorRunAsAccount' {

            try{               

                $thePass = Get-Content -Path "C:\Temp\SCOM-Scripts\creds\$theName.txt"

                $secPass = ConvertTo-SecureString $thePass -AsPlainText -Force

                $theCreds = New-Object -TypeName System.Management.Automation.PSCredential($theName,$secPass)               

                Add-SCOMRunAsAccount -SCXMonitoring -Name $theName -Description $rAccount.Description -RunAsCredential $theCreds -Sudo              

            } catch {           

                Write-Warning $_                            

            }           

        }

        'SCXMaintenanceRunAsAccount' {

            try{               

                $thePass = Get-Content -Path "C:\Temp\SCOM-Scripts\creds\$theName.txt"

                $secPass = ConvertTo-SecureString $thePass -AsPlainText -Force

                $theCreds = New-Object -TypeName System.Management.Automation.PSCredential($theName,$secPass)               

                Add-SCOMRunAsAccount -SCXMaintenance -Name $theName -Description $rAccount.Description -RunAsCredential $theCreds -Sudo               

            } catch {

                Write-Warning $_                            

            }           

        }

        'SCOMBasicCredentialSecureData' {

            try{              

                $thePass = Get-Content -Path "C:\Temp\SCOM-Scripts\creds\$theName.txt"

                $secPass = ConvertTo-SecureString $thePass -AsPlainText -Force

                $theCreds = New-Object -TypeName System.Management.Automation.PSCredential($theName,$secPass)                

                Add-SCOMRunAsAccount -Basic -Name $theName -Description $rAccount.Description -RunAsCredential $theCreds                

            } catch {

                Write-Warning $_                            

            }           

        }

        'SCOMWindowsCredentialSecureData' {

            try{               

                $thePass = Get-Content -Path "C:\Temp\SCOM-Scripts\creds\$theName.txt"

                $secPass = ConvertTo-SecureString $thePass -AsPlainText -Force

                $theName  = $env:USERDOMAIN + '\' + $rAccount.Name

                $theCreds = New-Object -TypeName System.Management.Automation.PSCredential($theName,$secPass)                

                Add-SCOMRunAsAccount -Windows -Name $theName -Description $rAccount.Description -RunAsCredential $theCreds                

            } catch {

                Write-Warning $_                            

            }           

        }

        default {

            Write-Warning "Not coverd: $rAccount.AccountType please add manually."

        }

    }

}
Run As Profiles

Run As Profiles usually come with Management Packs and are therefore not handled here.

Note:

Run As Profiles are the binding component between a Run As Account and the objects (e.g. a computer / health service) on which it is used. Please check the configuration in the 2012 R2 environment and set up the mapping and distribution manually.
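To get a checklist of what to look at, a small sketch that lists the Run As Profiles in the 2012 R2 environment:

Get-SCOMRunAsProfile | Select-Object -Property DisplayName, Description | Sort-Object -Property DisplayName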

Datawarehouse Retention Settings

Data warehouse retention settings are configured with a command-line tool named dwdatarp.exe. It was released in 2008 and still works today.

Download link can be found here: https://blogs.technet.microsoft.com/momteam/2008/05/13/data-warehouse-data-retention-policy-dwdatarp-exe/

Checking the current settings

After downloading, copy the unpacked executable to your SCOM database server (e.g. to C:\Temp).

Open an elevated command prompt and use the following call to export the current configuration:

dwdatarp.exe -s OldSCOMDBSrvName -d OperationsManagerDW > c:\dwoutput.txt

Setting new values

The following values are just a suggestion to reduce the amount of required space.

dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "Alert data set" -a "Raw data" -m 180
dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "Client Monitoring data set" -a "Raw data" -m 30
dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "Client Monitoring data set" -a "Daily aggregations" -m 90
dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "Configuration dataset" -a "Raw data" -m 90
dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "Event data set" -a "Raw Data" -m 30
dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "Performance data set" -a "Raw data" -m 15
dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "Performance data set" -a "Hourly aggregations" -m 30
dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "Performance data set" -a "Daily aggregations" -m 90
dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "State data set" -a "Raw data" -m 15
dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "State data set" -a "Hourly aggregations" -m 30
dwdatarp.exe -s newscomdbsrv -d OperationsManagerDW -ds "State data set" -a "Daily aggregations" -m 90

Agent Migration – Remove the older Management Group

After the computer appears as a fully managed computer in the new SCOM console, remove the older management group from the agent.

Jimmy Harper has authored a Management Pack which helps to clean-up the agent migration after the new agent has been deployed. Please read through:

https://blogs.technet.microsoft.com/jimmyharper/2016/08/13/scom-task-to-add-or-remove-management-groups-on-agents/
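If you prefer to remove the old management group per agent with PowerShell instead of (or in addition to) the management pack tasks, a hedged sketch using the agent’s configuration COM object (run locally on the agent, elevated; the management group name is a placeholder) looks like this:

$cfg = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg'
$cfg.GetManagementGroups() | ForEach-Object { $_.managementGroupName }   # list the currently configured management groups
$cfg.RemoveManagementGroup('OLD_MANAGEMENT_GROUP_NAME')                  # placeholder: the 2012 R2 management group
$cfg.ReloadConfiguration()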

This is great! Thanks a lot Ruben, for taking the time and efforts to put this all together!

Get to know more about Ruben here!

Cheers!

Authoring PowerShell SCOM Console Tasks in XML – Ruben Zimmermann

Summary:

This post describes how to create PowerShell SCOM Console Tasks in XML along three examples.

Console-ListTopNProcesses

Introduction:

Console Tasks are executed on the SCOM Management Server. Three examples show how to create them using Visual Studio.

  • Task 1: Displaying a Management Server’s Last-Boot-Time [DisplayLastBootTime]
    • Executes a PowerShell script which displays the result in a MessageBox
  • Task 2: Show all new alerts related to selected Computer [ShowNewAlerts]
    • Passes the Computer principal name property (aka FQDN) to the script which then uses a GridView to display the alerts
  • Task 3: Listing the top N processes running on a Management Server [ListTopNProcesses]
    • An InputBox lets the user specify the number (N) of top processes on the Management Server, which are retrieved by the script and shown via GridView

Requirements:

This blog assumes that you have already created a management pack before. If not, or if you run into difficulties, please visit the section ‘Reading’ and go through the links.

The used software is Visual Studio 2017 (2013 and 2015 should work as well) plus the Visual Studio Authoring Extension. Additionally, the ‘PowerShell Tools for Visual Studio 2017’ are installed.

The Community Edition of Visual Studio technically works. – Please check the license terms in advance. – If you are working for a ‘normal company’ you will most likely require Visual Studio Professional.

Realization:

Initial steps in Visual Studio

→ Create a new project based on the Management Pack / Operations Manager 2012 R2 Management Pack template.

→ Name it SCOM.Custom.ConsoleTasks

→ Create a folder named Health Model and a sub folder of it named Tasks

→ Add an Empty Management Pack Fragment to the root and name it Project.mpx.

Project File Project.mpx Content:

<ManagementPackFragment SchemaVersion="2.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema">

  <LanguagePacks>

    <LanguagePack ID="ENU" IsDefault="true">

      <DisplayStrings>

        <DisplayString ElementID="SCOM.Custom.ConsoleTasks">

          <Name>SCOM Custom ConsoleTasks</Name>

        </DisplayString>

      </DisplayStrings>

    </LanguagePack>

  </LanguagePacks>

</ManagementPackFragment>

Your screen should look like this now:

VSAE-ProjectFile

VSAE-ProjectFile.gif

Creating DisplayLastBootTime task

Within the Tasks folder create a class file named ConsoleTasks.mpx and remove all XML inside the file.

ConsoleTasks.mpx firstly only contains the code for the task DisplayLastBootTime and will be successively extended with the other two tasks.

 

Content of ConsoleTasks.mpx :

<ManagementPackFragment SchemaVersion="2.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Categories>

    <Category ID ="SCOM.Custom.ConsoleTasks.DisplayLastBootTime.ConsoleTaskCategory"  Target="SCOM.Custom.ConsoleTasks.DisplayLastBootTime.ConsoleTask" Value ="System!System.Internal.ManagementPack.ConsoleTasks.MonitoringObject"/>

  </Categories>
  <Presentation>

    <ConsoleTasks>      

      <ConsoleTask ID="SCOM.Custom.ConsoleTasks.DisplayLastBootTime.ConsoleTask" Accessibility="Public" Enabled="true" Target="SC!Microsoft.SystemCenter.RootManagementServer" RequireOutput="false">        <Assembly>SCOM.Custom.ConsoleTasks.DisplayLastBootTime.Assembly</Assembly>

        <Handler>ShellHandler</Handler>

        <Parameters>

          <Argument Name="WorkingDirectory" />

          <Argument Name="Application">powershell.exe</Argument>

          <Argument><![CDATA[-noprofile -Command "& { $IncludeFileContent/Health Model/Tasks/DisplayManagementServerLastBootTime.ps1$ }"]]></Argument>          

        </Parameters>

      </ConsoleTask>    

    </ConsoleTasks>

  </Presentation>
  <LanguagePacks>

    <LanguagePack ID="ENU" IsDefault="true">      

      <DisplayStrings>        

        <DisplayString ElementID="SCOM.Custom.ConsoleTasks.DisplayLastBootTime.ConsoleTask">

          <Name>Custom Console-Tasks: Display LastBootTime</Name>

          <Description></Description>

        </DisplayString>        

      </DisplayStrings>

    </LanguagePack>

  </LanguagePacks>
  <Resources>

    <Assembly ID ="SCOM.Custom.ConsoleTasks.DisplayLastBootTime.Assembly" Accessibility="Public" FileName ="SCOM.Custom.ConsoleTasks.DisplayLastBootTime.Assembly.File" HasNullStream ="true" QualifiedName ="SCOM.Custom.ConsoleTasks.DisplayLastBootTime.Assembly" />

  </Resources>

</ManagementPackFragment>

 

Key points explanation for ConsoleTasks.mpx

Categories

  • Category Value specifies that this element is a console task

Presentation

ConsoleTask Target defines against what this task is launched. In this case it’s the Management Server. You can only see the task if you click on a View that is targeting the corresponding class. E.g.:

Console-ManagementServer-Tasks

Console-ManagementServer-Tasks.gif
  • Parameters, Argument Name “Application” sets PowerShell.exe to be called for execution
  • Parameters, Argument <![CDATA … defines the file within this Visual Studio project that contains the PowerShell code to be processed when the task is launched ( not handled yet ).

LanguagePack

  • DisplayString maps the Console Task ID to a text that is more user friendly. This will be shown in the SCOM console

Within the Tasks folder create a PowerShell file named DisplayManagementServerLastBootTime.ps1

Content of  DisplayManagementServerLastBootTime.ps1 :
$regPat         = '[0-9]{8}'

$bootInfo       = wmic os get lastbootuptime

$bootDateNumber = Select-String -InputObject $bootInfo -Pattern $regPat | Select-Object -ExpandProperty Matches | Select-Object -ExpandProperty Value

$bootDate       = ([DateTime]::ParseExact($bootDateNumber,'yyyyMMdd',[Globalization.CultureInfo]::InvariantCulture))

$lastBootTime   = $bootDate | Get-Date -Format 'yyyy-MM-dd'




$null = [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic')

$null = [Microsoft.VisualBasic.Interaction]::MsgBox($lastBootTime,0,"""Last Boot time of $($env:COMPUTERNAME)""")

 

Deploy this Management Pack to the SCOM Server and test it. The following screenshot shows the expected result:
Console-DisplayLastBootTime

Console-DisplayLastBootTime.gif

Creating ShowNewAlerts task

Keep the existing content of ConsoleTasks.mpx and add the new code into the proper places.

Additional Content for ConsoleTasks.mpx for ShowNewAlerts:

<Category ID ="SCOM.Custom.ConsoleTasks.ShowNewAlerts.ConsoleTaskCategory"  Target="SCOM.Custom.ConsoleTasks.ShowNewAlerts.ConsoleTask" Value ="System!System.Internal.ManagementPack.ConsoleTasks.MonitoringObject"/><ConsoleTask ID="SCOM.Custom.ConsoleTasks.ShowNewAlerts.ConsoleTask" Accessibility="Public" Enabled="true" Target="Windows!Microsoft.Windows.Computer" RequireOutput="false">

        <Assembly>SCOM.Custom.ConsoleTasks.ShowNewAlerts.Assembly</Assembly>

        <Handler>ShellHandler</Handler>

        <Parameters>

          <Argument Name="WorkingDirectory" />

          <Argument Name="Application">powershell.exe</Argument>

          <Argument><![CDATA[-noprofile -noexit -Command "& { $IncludeFileContent/Health Model/Tasks/ShowNewAlertsForThisComputer.ps1$ }"]]></Argument>

          <Argument>"$Target/Property[Type='Windows!Microsoft.Windows.Computer']/PrincipalName$"</Argument>

        </Parameters>

      </ConsoleTask>
<DisplayString ElementID="SCOM.Custom.ConsoleTasks.ShowNewAlerts.ConsoleTask">

          <Name>Custom Console-Tasks: Display ShowNewAlertsForThisComputer</Name>

          <Description></Description>

        </DisplayString>

<Assembly ID ="SCOM.Custom.ConsoleTasks.ShowNewAlerts.Assembly" Accessibility="Public" FileName ="SCOM.Custom.ConsoleTasks.ShowNewAlerts.Assembly.File" HasNullStream ="true" QualifiedName ="SCOM.Custom.ConsoleTasks.ShowNewAlerts.Assembly" />

Key points explanation for ConsoleTasks.mpx for ShowNewAlerts

Categories

  • Category Value specifies that this element is a console task

Presentation

  • ConsoleTask Target defines against what this task is launched. In this case it’s the Windows Computer. – This task is visible when in a View that shows Computer objects.
  • Parameters, Argument Name “Application” sets PowerShell.exe to be called for execution
  • Parameters, Argument <![CDATA … defines the file within this Visual Studio project that contains the PowerShell code to be processed when the task is launched ( not handled yet ).
  • Parameters, Argument “$Target/Property […]/PrincipalName$” gets the FQDN (principal name attribute) of the selected computer and makes it available for retrieving it in the script.

 

Content of ShowNewAlertsForThisComputer.ps1 :
param($ComputerName)

$allNewAlerts = Get-SCOMAlert | Select-Object -Property Name, Description, MonitoringObjectDisplayName, IsMonitorAlert, ResolutionState, Severity, PrincipalName, TimeRaised  | Where-Object {$_.PrincipalName -eq $ComputerName -and $_.ResolutionState -eq '0'}




if ($allNewAlerts) {

       $allNewAlerts | Out-GridView

} else {

       Write-Host "No new alerts available for the computer: $ComputerName"

}

 

Deploy this Management Pack to the SCOM Server and test it. The following screenshot shows the expected result:

Console-ShowNewAlerts


Creating ListTopNProcesses task

Keep the existing content of ConsoleTasks.mpx and add the new code into the proper places.

Additional Content for ConsoleTasks.mpx for ListTopNProcesses:

<Category ID ="SCOM.Custom.ConsoleTasks.ListTopNProcesses.ConsoleTaskCategory"  Target="SCOM.Custom.ConsoleTasks.ListTopNProcesses.ConsoleTask" Value ="System!System.Internal.ManagementPack.ConsoleTasks.MonitoringObject"/>  <ConsoleTask ID="SCOM.Custom.ConsoleTasks.ListTopNProcesses.ConsoleTask" Accessibility="Public" Enabled="true" Target="SC!Microsoft.SystemCenter.RootManagementServer" RequireOutput="false">

        <Assembly>SCOM.Custom.ConsoleTasks.ListTopNProcesses.Assembly</Assembly>

        <Handler>ShellHandler</Handler>

        <Parameters>

          <Argument Name="WorkingDirectory" />

          <Argument Name="Application">powershell.exe</Argument>

          <Argument><![CDATA[-noprofile -noexit -Command "& { $IncludeFileContent/Health Model/Tasks/ListTopNProcesses.ps1$ }"]]></Argument>

        </Parameters>

      </ConsoleTask>

<DisplayString ElementID="SCOM.Custom.ConsoleTasks.ListTopNProcesses.ConsoleTask">

          <Name>Custom Console-Tasks: Display ListTopNProcesses</Name>

          <Description></Description>

        </DisplayString>     

<Assembly ID ="SCOM.Custom.ConsoleTasks.ListTopNProcesses.Assembly" Accessibility="Public" FileName ="SCOM.Custom.ConsoleTasks.ListTopNProcesses.Assembly.File" HasNullStream ="true" QualifiedName ="SCOM.Custom.ConsoleTasks.ListTopNProcesses.Assembly" />

Key points explanation for ConsoleTasks.mpx for ListTopNProcesses

Categories

  • Category Value specifies that this element is a console task

Presentation

  • ConsoleTask Target defines against what this task is launched. In this case it’s the Management Server. – This task is visible in a View that shows Management Server objects.
  • Parameters, Argument Name “Application” sets PowerShell.exe to be called for execution
  • Parameters, Argument <![CDATA … defines the file within this Visual Studio project that contains the PowerShell code to be processed when the task is launched ( not handled yet ).

 

Content of ListTopNProcesses.ps1 :
$null          = [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic')

$receivedValue = [Microsoft.VisualBasic.Interaction]::InputBox("""Enter the number of top processes to be shown""", """List top processes:""", "10")

Get-Process | Sort-Object -Property CPU -Descending | Select-Object -First $receivedValue  | Out-GridView

 

Deploy this Management Pack to the SCOM Server and test it. The following screenshot shows the expected result:

Console-ListTopNProcesses

Console-ListTopNProcesses.gif

Reading:

If you are new to management pack authoring I suggest the free training material from Brian Wren. – It was made for SCOM 2012 R2 but is still valid today (SCOM 1807 as of 2018).

On Microsoft’s Virtual Academy: System Center 2012 R2 Operations Manager Management Pack

On Microsoft’s Wiki: System Center Management Pack Authoring Guide

Closing:

If you have questions, comments or feedback, feel free to share it in our SCOM Gitter community at: https://gitter.im/SCOM-Community/Lobby

The Visual Studio solution file is downloadable here:

SCOM Console Powershell Task

Thanks!

Ruben Zimmermann

 

Know more about Ruben and get in touch with him here:

Ruben Zimmermann (A fantastic person who has a lot of great ideas) [Interview]

SCOM User Session Duration – Powershell

A few days ago, I needed to find out how many users are connecting to SCOM daily/weekly and how long each user was connected. Out of the box, SCOM does not provide a way of doing this. So I started looking around for some hints and came across this article, which looked pretty convincing.

https://blogs.technet.microsoft.com/dirkbri/2014/10/15/an-approach-of-collecting-and-analyzing-scom-2012-sdk-connections/

This looked all good, except I wanted to do it with Powershell.

Whenever any “client” connects to or disconnects from the console, an event is written in the Operations Manager event log. Each event carries event data which gives you information on the following points (a small Get-WinEvent sketch follows the list):

  • EventID
  • ManagementServer Name
  • Username
  • SessionID
  • UUID – extracted from SessionID
  • ID – extracted from SessionID
  • TimeCreated
  • SessionDuration – calculated time between the logon and logoff events
  • SessionCount – calculated, cumulative counter of all current sessions
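Here is a minimal, hedged sketch of reading those events from the local Operations Manager log with Get-WinEvent. The event IDs below are placeholders / assumptions – check your own event log (or the article above) for the exact connect and disconnect IDs before relying on them:

$connectId    = 26328   # assumption / placeholder: SDK client connected
$disconnectId = 26329   # assumption / placeholder: SDK client disconnected
Get-WinEvent -FilterHashtable @{ LogName = 'Operations Manager'; Id = $connectId, $disconnectId } |
    Select-Object -Property TimeCreated, Id, Message |
    Sort-Object -Property TimeCreated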

So, I started working on a script that’d meet my requirements. To be honest, I am still a PowerShell student, so I got stuck halfway. So I decided to “get-help” (PowerShell pun, get it? ;)) from my friend and guide Stoyan Chalakov. I explained my idea to him and asked whether he would help me script it. Sure enough, I wasn’t disappointed. He liked the idea as well, and came up with a nice script that outputs the data the way we imagined. I’ve put the link to his script on TechNet at the end.

Some important remarks which apply to both methods (the one described in the blog and the script):

“Yes, there are design flaws in this approach:

  1. If you create daily files, you will have an unknown amount of already open sessions at the beginning of the day and a certain amount of not yet closed sessions at the end. So your SessionCount will be lower than the cumulated values from the “Client Connections” performance counters of all Management Servers.
    But when you analyse e.g. weekly data instead of daily, you should get very good results.
  2. The tricky part is , the exact same event is written when a Powershell SDK connection is made. You cannot distinguish between PowerShell SDK connections and Operations console connections. Unfortunately this is by design and I am not aware of a way to mitigate this problem.”

The script parses all the events which are logged in regards to the particular user and gathers information regarding:

  • Open sessions, where the session has not been closed yet (presence of the Login event, absence of the Logout event)
  • Closed sessions and their duration (presence of both the Login event and the Logout event)
  • Closed sessions without a Login event, indicating that the Login event has been overwritten and cannot be found.
  • The script should be run directly on a management server. If needed, you can enable PSRemoting and run it against multiple management servers (see the sketch after this list).
  • The script is in its first version, where you need to enter a domain and username. The second version (currently being worked on) will include a full report on all user sessions and their duration, without the need to specify a particular user name.
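For the remoting case, a rough sketch (the server names and script path are hypothetical):

Invoke-Command -ComputerName 'MS01', 'MS02' -FilePath 'C:\Scripts\Get-ScomUserSession.ps1'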

Sample output of the script:

user session

You can download the script here:

Stoyan has commented thoroughly throughout the script, so there should be no issues understanding the way it works.

SCOM User Session Script

Thanks a ton Stoyan!

Cheers!


Guest Blog – Authoring a PowerShell Agent Task in XML – By Ruben Zimmermann

It is my absolute pleasure to announce this guest blog today. My good friend Ruben has come up with a fantastic community MP that is going to make all of our lives a lot easier 🙂

I will let him talk about it himself. All yours, Ruben! –


Authoring a PowerShell Agent Task in XML

Summary:

This post describes the code behind a PowerShell agent task. Using the real-life example ‘Create Log Deletion Job’, the XML and PowerShell lines are commented.

Introduction:

Create Log Deletion Job is a SCOM agent task which creates a scheduled task that deletes log files older than N days on the monitored computer. It works on SCOM 2012 R2 and later.

Requirements:

This blog assumes that you have already created management packs before. If not, or if you run into difficulties, please visit the section ‘Reading’ and go through the links.

The used software is Visual Studio 2017 (2013 and 2015 work as well) plus the Visual Studio Authoring Extension. Additionally the ‘PowerShell Tools for Visual Studio 2017’ are installed.

The Community Edition of Visual Studio technically works. – Please check the license terms in advance. – If you are working for a ‘normal company’ you will most likely require Visual Studio Professional.

Realization:

The agent task mainly consists of two components, plus the tooling to build them.

  1. A PowerShell Script which creates a Scheduled Task and writes another script on the target machine with the parameters specified in the SCOM console.
  2. A custom module that contains a custom write action based on PowerShell Write Action and a task targeting Windows Computer leveraging the write action.
  3. Visual Studio Authoring Extension (VSAE), a free Plugin for Visual Studio is used to bind the XML and PowerShell together and produce a Management Pack.
    The Management Pack itself can be downloaded as Visual Studio solution or as compiled version directly from GitHub.

https://github.com/Juanito99/Windows.Computer.AgentTasks.CreateLogDeletionJob

Steps to do in Visual Studio:

  • Create a new project based on the Management Pack / Operations Manager 2012 R2 Management Pack template. Name it ‘Windows.Computer.AgentTasks.CreateLogDeletionjob’
  • Create a folder named ‘Health Model’ and a sub folder of it named ‘Tasks’.
  • Add an Empty Management Pack Fragment to the root and name it Project.mpx.

Project File ‘project.mpx’ Content:

<ManagementPackFragment SchemaVersion="2.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema">

  <LanguagePacks>




    <LanguagePack ID="ENU" IsDefault="true">

      <DisplayStrings>

        <DisplayString ElementID="Windows.Computer.AgentTasks.CreateLogDeletionJob">

          <Name>Windows Computer AgentTasks CreateLogDeletionJob</Name>

          <Description>Creates a scheduled task on the managed computer which automatically deletes old logs.</Description>

        </DisplayString>

      </DisplayStrings>

    </LanguagePack>

  </LanguagePacks>




</ManagementPackFragment>

The Powershell Script:

Create a file named CreateLogDeletionJob.ps1 in the ‘Tasks’ sub folder and copy the following content inside it.

param($LogFileDirectory,$LogFileType,$DaysToKeepLogs,$ScheduledTasksFolder)

$api = New-Object -ComObject 'MOM.ScriptAPI'

$api.LogScriptEvent('CreateLogDeletionJob.ps1',4000,4,"Script runs. Parameters: LogFileDirectory $($LogFileDirectory), LogFileType: $($LogFileType) DaysToKeepLogs $($DaysToKeepLogs) and scheduled task folder $($scheduledTasksFolder)")           

Write-Verbose -Message "CreateLogDeletionJob.ps1 with these parameters: LogFileDirectory $($LogFileDirectory), LogFileType: $($LogFileType) DaysToKeepLogs $($DaysToKeepLogs) and scheduled task folder $($scheduledTasksFolder)"

$ComputerName          = $env:COMPUTERNAME

$LogFileDirectoryClean = $LogFileDirectory      -Replace('\\','-')

$LogFileDirectoryClean = $LogFileDirectoryClean -Replace(':','')

$scheduledTasksFolder  = $scheduledTasksFolder -replace([char]34,'')

$scheduledTasksFolder  = $scheduledTasksFolder -replace("`"",'')

$taskName              = "Auto-Log-Dir-Cleaner_for_$($LogFileDirectoryClean)_on_$($ComputerName)"

$taskName              = $taskName -replace '\s',''

$scriptFileName        = $taskName + '.ps1'

$scriptPath            = Join-Path -Path $scheduledTasksFolder -ChildPath $scriptFileName

if ($DaysToKeepLogs -notMatch '\d' -or $DaysToKeepLogs -lt 1) {

                $daysToKeepLogs = 7

                $msg = 'Script warning. DayToKeepLogs not defined or not matching a number. Defaulting to 7 Days.'

                $api.LogScriptEvent('CreateLogDeletionJob.ps1',4000,2,$msg)

}

if ($scheduledTasksFolder -eq $null) {

                $msg = 'Script warning. ScheduledTasksFolder not defined. Defaulting to C:\ScheduledTasks'

                $api.LogScriptEvent('CreateLogDeletionJob.ps1',4000,2,$msg)

                Write-Warning -Message $msg

                $scheduledTasksFolder = 'C:\ScheduledTasks'

}

if ($LogFileDirectory -match 'TheLogFileDirectory') {

                $msg =  'CreateLogDeletionJobs.ps1 - Script Error. LogFileDirectory not defined. Script ends.'

                $api.LogScriptEvent('CreateLogDeletionJob.ps1',4000,1,$msg)

                Write-Warning -Message $msg

                Exit

}

if ($LogFileType -match '\?\?\?') {    

                $msg = 'Script Error. LogFileType not defined. Script ends.'

                $api.LogScriptEvent('CreateLogDeletionJob.ps1',4000,1,$msg)

                Write-Warning -Message $msg

                Exit

}

Function Write-LogDirCleanScript {




                param(

                                [string]$scheduledTasksFolder,

                                [string]$LogFileDirectory,                 

                                [int]$DaysToKeepLogs,                      

                                [string]$LogFileType,

                                [string]$scriptPath

                )

               

                if (Test-Path -Path $scheduledTasksFolder) {

                                $foo = 'folder exists, no action required'

                } else {

                                & mkdir $scheduledTasksFolder

                }

               

                if (Test-Path -Path $LogFileDirectory) {

                                $foo = 'folder exists, no action required'

                } else {

                                $msg = "Script function (Write-LogDirCleanScript, scriptPath: $($scriptPath)) failed. LogFileDirectory not found $($LogFileDirectory)"

                                Write-Warning -Message $msg

                                $api.LogScriptEvent('CreateLogDeletionJob.ps1',4001,1,$msg)                

                                Exit

                }




                if ($LogFileType -notMatch '\*\.[a-zA-Z0-9]{3,}') {

                                $LogFileType = '*.' + $LogFileType

                                if ($LogFileType -notMatch '\*\.[a-zA-Z0-9]{3,}') {

                                                $msg = "Script function (Write-LogDirCleanScript, scriptPath: $($scriptPath)) failed. LogFileType: $($LogFileType) seems to be not correct."

                                                Write-Warning -Message $msg

                                                $api.LogScriptEvent('CreateLogDeletionJob.ps1',4001,1,$msg)                

                                                Exit

                                }

                }




$fileContent = @"

Get-ChildItem -Path `"${LogFileDirectory}`" -Include ${LogFileType} -ErrorAction SilentlyContinue | Where-Object { ((Get-Date) - `$_.LastWriteTime).days -gt ${DaysToKeepLogs} } | Remove-Item -Force

"@         

               

                $fileContent | Set-Content -Path $scriptPath -Force

               

                if ($error) {

                                $msg = "Script function (Write-LogDirCleanScript, scriptPath: $($scriptPath)) failed. $($error)"                       

                                $api.LogScriptEvent('CreateLogDeletionJob.ps1',4001,1,$msg)

                                Write-Warning -Message $msg

                } else {

                                $msg = "Script: $($scriptPath) successfully created"   

                                Write-Verbose -Message $msg

                }




} #End Function Write-LogDirCleanScript







Function Invoke-ScheduledTaskCreation {




                param(

                                [string]$ComputerName,                   

                                [string]$taskName,

                                [string]$scriptPath

                )                

                               

                $taskRunFile         = "C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe -NoLogo -NonInteractive -File $($scriptPath)"    

                $taskStartTimeOffset = Get-Random -Minimum 1 -Maximum 10

                $taskStartTime       = (Get-Date).AddMinutes($taskStartTimeOffset) | Get-date -Format 'HH:mm'                                                                                                                                                                                     

                $taskSchedule        = 'DAILY'             

                & SCHTASKS /Create /SC $($taskSchedule) /RU `"NT AUTHORITY\SYSTEM`" /TN $($taskName) /TR $($taskRunFile) /ST $($taskStartTime)                

                               

                if ($error) {

                                $msg = "Sript function (Invoke-ScheduledTaskCreation) Failure during task creation! $($error)"

                                $api.LogScriptEvent('CreateLogDeletionJob.ps1',4002,1,$msg)                

                                Write-Warning -Message $msg

                } else {

                                $msg = "Scheduled Tasks: $($taskName) successfully created"

                                Write-Verbose -Message $msg

                }              




} #End Function Invoke-ScheduledTaskCreation







$logDirCleanScriptParams   = @{

                'scheduledTasksFolder' = $ScheduledTasksFolder

                'LogFileDirectory'     = $LogFileDirectory       

                'daysToKeepLogs'       = $DaysToKeepLogs     

                'LogFileType'          = $LogFileType

                'scriptPath'           = $scriptPath

}




Write-LogDirCleanScript @logDirCleanScriptParams







$taskCreationParams = @{

                'ComputerName'  = $ComputerName             

                'taskName'      = $taskName

                'scriptPath'    = $scriptPath

}




Invoke-ScheduledTaskCreation @taskCreationParams

The Custom Module:

Add an Empty Management Pack Fragment in the ‘Tasks’ sub folder and name it ‘AgentTasks.mpx’. Copy the content below into it.

<ManagementPackFragment SchemaVersion="2.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema">




  <TypeDefinitions>   

    <ModuleTypes>

      <!-- Defining the Write Action Module Type. The name is user-defined and will be used in the task definition below. -->

      <WriteActionModuleType ID="Windows.Computer.AgentTasks.CreateLogDeletionJob.WriteAction" Accessibility="Internal" Batching="false">

        <!-- The items in the Configuration sections are exposed through the SCOM console -->

        <Configuration>

          <xsd:element minOccurs="1" name="LogFileDirectory" type="xsd:string" xmlns:xsd="http://www.w3.org/2001/XMLSchema" />

          <xsd:element minOccurs="1" name="LogFileType" type="xsd:string" xmlns:xsd="http://www.w3.org/2001/XMLSchema" />

          <xsd:element minOccurs="1" name="ScheduledTasksFolder" type="xsd:string" xmlns:xsd="http://www.w3.org/2001/XMLSchema" />

          <xsd:element minOccurs="1" name="DaysToKeepLogs" type="xsd:integer" xmlns:xsd="http://www.w3.org/2001/XMLSchema" />

          <xsd:element minOccurs="1" name="TimeoutSeconds" type="xsd:integer" xmlns:xsd="http://www.w3.org/2001/XMLSchema" />

        </Configuration>

        <!-- To make exposed items editable it is required to specify them in the OverrideableParameters section -->

        <OverrideableParameters>

          <OverrideableParameter ID="LogFileDirectory" Selector="$Config/LogFileDirectory$" ParameterType="string" />

          <OverrideableParameter ID="LogFileType" Selector="$Config/LogFileType$" ParameterType="string" />

          <OverrideableParameter ID="ScheduledTasksFolder" Selector="$Config/ScheduledTasksFolder$" ParameterType="string" />

          <OverrideableParameter ID="DaysToKeepLogs" Selector="$Config/DaysToKeepLogs$" ParameterType="int" />         

        </OverrideableParameters>

        <ModuleImplementation Isolation="Any">

          <Composite>

            <MemberModules>

              <!-- The ID name is user-defined, suggested to keep it similar as the type. The type signals SCOM to initiate the PowerShell engine  -->

              <WriteAction ID="PowerShellWriteAction" TypeID="Windows!Microsoft.Windows.PowerShellWriteAction">

                <!-- To keep the module readable and the script better editable it is referenced below -->

                <ScriptName>CreateLogDeletionJob.ps1</ScriptName>

                <ScriptBody>$IncludeFileContent/Health Model/Tasks/CreateLogDeletionJob.ps1$</ScriptBody>

                <!-- The parameters are the same than in the Configuration section as we want to pass them into the script from the SCOM console -->

                <Parameters>

                  <Parameter>

                    <Name>LogFileDirectory</Name>

                    <Value>$Config/LogFileDirectory$</Value>

                  </Parameter>

                  <Parameter>

                    <Name>LogFileType</Name>

                    <Value>$Config/LogFileType$</Value>

                  </Parameter>

                  <Parameter>

                    <Name>ScheduledTasksFolder</Name>

                    <Value>$Config/ScheduledTasksFolder$</Value>

                  </Parameter>

                  <Parameter>

                    <Name>DaysToKeepLogs</Name>

                    <Value>$Config/DaysToKeepLogs$</Value>

                  </Parameter>

                </Parameters>

                <TimeoutSeconds>$Config/TimeoutSeconds$</TimeoutSeconds>

              </WriteAction>

            </MemberModules>

            <Composition>

              <Node ID="PowerShellWriteAction" />

            </Composition>

          </Composite>

        </ModuleImplementation>

        <OutputType>System!System.BaseData</OutputType>

        <InputType>System!System.BaseData</InputType>

      </WriteActionModuleType>     

    </ModuleTypes>

  </TypeDefinitions>




  <Monitoring>   

    <Tasks>     

      <!-- The name for the task ID is user-defined. Target is set to 'Windows.Computer' that make this task visible when objects of type Windows.Computer are shown in the console -->

      <Task ID="Windows.Computer.AgentTasks.CreateLogDeletionJob.Task" Accessibility="Public" Enabled="true" Target="Windows!Microsoft.Windows.Computer" Timeout="120" Remotable="true">

        <Category>Custom</Category>

        <!-- The name for the write action is user-defined. The TypeID though must match to the above created one. -->

        <WriteAction ID="PowerShellWriteAction" TypeID="Windows.Computer.AgentTasks.CreateLogDeletionJob.WriteAction">

          <!-- Below the parameters are pre-filled to instruct the users how the values in the overrides are expected-->

          <LogFileDirectory>C:\TheLogFileDirectory</LogFileDirectory>

          <LogFileType>*.???</LogFileType>

          <ScheduledTasksFolder>C:\ScheduledTasks</ScheduledTasksFolder>

          <DaysToKeepLogs>7</DaysToKeepLogs>         

          <TimeoutSeconds>60</TimeoutSeconds>

        </WriteAction>

      </Task>     

    </Tasks>

  </Monitoring>




  <LanguagePacks>

    <LanguagePack ID="ENU" IsDefault="true">

      <DisplayStrings>       

        <DisplayString ElementID="Windows.Computer.AgentTasks.CreateLogDeletionJob.Task">

          <Name>Create Log Deletion Job</Name>

        </DisplayString>     

      </DisplayStrings>     

      <KnowledgeArticles></KnowledgeArticles>

    </LanguagePack>

  </LanguagePacks>




</ManagementPackFragment>

Reading:

If you are new to management pack authoring I suggest the free training material from Brian Wren.

On Microsoft’s Virtual Academy: https://mva.microsoft.com/en-US/training-courses/system-center-2012-r2-operations-manager-management-pack-8829?l=lSKqLpx2_7404984382

On Microsoft’s Wiki: https://social.technet.microsoft.com/wiki/contents/articles/15251.system-center-management-pack-authoring-guide.aspx

Closing:

If you have questions, comments or feedback, feel free to share it in our SCOM Gitter community at: https://gitter.im/SCOM-Community/Lobby


Awesome! Thanks a lot for all your contributions and for being considerate of others by publishing it for everyone, Ruben 🙂 Wishing to have many more cool guest blogs from you!

You can get to know Ruben better here:

Ruben Zimmermann (A fantastic person who has a lot of great ideas) [Interview]

Cheers!

SCOM Event Based Monitoring – Part 2 – Rules

In the last post we discussed the event-based monitoring options SCOM provides with monitors. You can find it here:

SCOM Event Based Monitoring – Part 1 – Monitors

In this post we are going to discuss the event-based monitoring options using SCOM rules, basically the highlighted options in the image below:

[Screenshot: event-based rule options highlighted in the wizard]

As we can see, we have 2 kinds of rules for monitoring events: "Alert Generating Rules" and "Collection Rules". Let's walk through them one by one.

Alert Generating Rules:

As the name suggests, this type of rule raises an alert when it detects the configured event ID.

As you're going through the rule creation wizard you will notice that there are several options under "Rule category", as shown in the pic below. In my experience, it makes no difference whatever you choose. These are more of a "logical grouping" for the rules, something you can make use of when working with them through PowerShell (see the sketch after the screenshot). Since we're here to create an "alert generating" rule, the most obvious option here would be "Alert".

[Screenshot: Rule Category options in the rule creation wizard]
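As an aside, here is a minimal sketch of how that category can later be used as a filter from the Operations Manager Shell; it is purely a grouping attribute:

# List rules whose category is 'Alert' (the category picked in the wizard)
Get-SCOMRule | Where-Object { $_.Category -eq 'Alert' } | Select-Object DisplayName, Category, Enabled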

Now, this step is pretty important, and if you're new to creating rules, you're very likely to miss it. As you reach the final window, at the bottom of "Alert Description" (which you can modify, btw) is the box for "Alert Suppression". This is used to tell SCOM which events should be considered "duplicates", so it doesn't raise a new alert if there's already one open in Active Alerts. Auto-suppression happens automatically for monitors – they only increase the repeat count for every new detection – but for rules you'll have to configure it manually. If you don't, you're gonna get a new alert every time the event is detected. In this demo, I'm telling SCOM to consider the event a duplicate if it has the same Event ID AND the same Event Source.

I HIGHLY recommend using this since I learned it the hard way some time back. I missed configuring alert suppression for a rule and a couple of nights later woke up to a call from our Command Center guys. They said, "Dude! We've received THOUSANDS of emails in our mailbox in the last 15 minutes…and they're all the same. Help!"

I rushed and turned it off. Further investigation revealed that something had gone wrong on the server and it was writing the same event over and over again, thousands of times, in the event log. Since I had not configured the suppression criteria, SCOM created a new alert every time and sent a mail for each, as set up in the subscription. Now I check the suppression criteria at least 3 times 🙂

[Screenshot: Alert Suppression configuration]

With that set up, I created the event in the log with a simple PoSh one-liner:

# Note: the 'custom' event source must already be registered in the Application log (e.g. New-EventLog -LogName Application -Source custom)
Write-EventLog -LogName Application -Source custom -EventId 1000 -EntryType Information -Message "This is a test message"

And as you can see, the alert was raised.

[Screenshot: the alert raised in Active Alerts]
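If you want to confirm the suppression is working, a quick check from the Operations Manager Shell is to watch the repeat count on the open alert instead of counting new alerts. A hedged sketch, with an example alert name:

# Suppressed duplicates bump RepeatCount on the existing open alert instead of creating new alerts
Get-SCOMAlert -Name 'Custom Event Alert Rule' -ResolutionState 0 |
    Select-Object Name, MonitoringObjectDisplayName, RepeatCount, LastModified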

Collection Rules:

Collection rules are used only to collect the events into the SCOM database. They do not create any alert. In fact, you'll hardly even notice their presence at all. Why create these rules then? Most commonly for reporting/auditing purposes. You can also create a dashboard showing the detections of these events.

[Screenshots: event collection rule creation wizard]

Notice on the left side of the wizard that there's no "Configure Alerts" tab like we had in the "Alert Generating" rule.

So all this rule is going to do is detect the occurrence of event 500 with source "custom" and, when it does, simply save it in the database.

I wrote this event to the log a bunch of times; now let's see whether SCOM is really collecting them. We'll create a simple "Event View" dashboard for this.

[Screenshot: creating the Event View dashboard]

And yup, we can indeed see that the event is actually being collected by SCOM.

[Screenshot: collected events shown in the Event View]
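If you prefer the shell over a dashboard, a hedged alternative is to query the collected events directly; the rule display name below is just an example:

# Events written to the database by the collection rule
$rule = Get-SCOMRule -DisplayName 'Collect Custom Event 500'
Get-SCOMEvent -Rule $rule | Select-Object TimeGenerated, Number, PublisherName, LoggingComputer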

Here's another thing that might be a bit tricky. After you create these 2 types of rules, if you open their properties, you'll see they're almost identical. If you've named them properly, kudos to you, but if you (or someone else) hasn't, how will you find out whether a rule is generating alerts or just collecting events? Take a close look at the content in the yellow boxes below. See the difference?

Alert Generating Rule Properties:

[Screenshot: alert-generating rule properties]

Event Collection Rule Properties:

[Screenshot: event collection rule properties]

You rightly noticed that the response for the alert generating rule is "Alert", which means this rule is going to respond to the event detection by generating an alert. If you click the "Edit" option just beside that, you'll see the wizard for changing the alert name, format, suppression criteria, etc.

On the other hand, the yellow box in the collection rule shows "Event Data Collection Write Action" and "Event data publisher", which indicates that it is only writing the event to the database. You can also verify this with PowerShell:

[Screenshot: verifying the rule's write actions in PowerShell]
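For reference, a rough sketch of that PowerShell check (rule display names are examples); the giveaway is the type of write action attached to each rule:

# Compare the write actions of the two rules
$alertRule   = Get-SCOMRule -DisplayName 'Custom Event Alert Rule'
$collectRule = Get-SCOMRule -DisplayName 'Collect Custom Event 500'

# The alert-generating rule carries a GenerateAlert write action,
# while the collection rule only carries event collection/publishing write actions.
$alertRule.WriteActionCollection   | Select-Object Name, TypeId
$collectRule.WriteActionCollection | Select-Object Name, TypeId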

You can also fetch a report for the event collection rule for auditing purposes, but unfortunately I do not have the Reporting feature installed in my lab, so I can't show a demo of that. 🙂

That concludes the event monitoring options SCOM offers with monitors as well as rules. Now that you know what SCOM is capable of, it's up to you to decide which option you want to go with 🙂

Cheers!

 

Run Powershell scripts in Console Tasks

I am working on a project, and as a part of it I needed to create some console tasks that would run a PowerShell script to do the stuff I want. That's no problem for a script a line or two long, but any more than that and it is a real pain to pass it as the parameter in the console task. The other way I was aware of is to point to the path of your script in the parameters, if you have it saved locally on your management servers (each and every one of them, at the exact same path). This didn't really serve my purpose as I wanted to embed the script in a re-usable XML, so I decided to do something on my own.

After a bit of researching on the Internet and a lot of trial and error, I finally got it working. The key points to remember here are:

1. Use CDATA so the script block is passed through the XML untouched.

2. Use -command "cmd1;cmd2" to pass the block as a single input to powershell.exe.

3. Use ";" to separate the cmdlets.

4. Use the escape character "\" before every double quote (") inside the script, otherwise powershell.exe mistakes it for the end of the -command string and throws errors (see the sketch below).
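To illustrate point 4, here is a rough sketch of what the console task effectively hands to powershell.exe; the message text is just an example, and the XML sample below sidesteps the problem by using single quotes inside the script:

# Embedded script uses single quotes, so no escaping is needed:
powershell.exe -command "Write-EventLog -LogName 'Operations Manager' -Source 'Task Source Name' -EventId 1010 -EntryType Warning -Message 'test'"

# If the embedded script needs double quotes, escape them as \" so they don't terminate the -command string:
powershell.exe -command "Write-EventLog -LogName 'Operations Manager' -Source 'Task Source Name' -EventId 1010 -EntryType Warning -Message \"test message with spaces\""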

Here's an example of the XML, with a simple PowerShell script that creates an event with a custom event source:

 

<ManagementPackFragment SchemaVersion="2.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
	<Categories>
		<Category ID ="Cat.CustomTasks.CreateEvent"  Target ="CustomTasks.CreateEvent" Value ="System!System.Internal.ManagementPack.ConsoleTasks.MonitoringObject"/>
	</Categories>
	<Presentation>

		<ConsoleTasks>

			<ConsoleTask ID="CustomTasks.CreateEvent" Accessibility="Internal" Enabled="true" Target="Windows!Microsoft.Windows.Computer" RequireOutput="true">
				<Assembly>CustomTasks.CTCreateEventAssembly</Assembly>
				<Handler>ShellHandler</Handler>

				<Parameters>
					<Argument Name ="WorkingDirectory"/>
					<Argument Name ="Application">%windir%\system32\windowsPowershell\v1.0\powershell.exe</Argument>
					<Argument>
						<![CDATA[ -command 
						"
#Create a custom source;
New-EventLog -Source 'Task Source Name' -LogName 'Operations Manager';
#Write the event;
Write-EventLog -LogName 'Operations Manager' -Source 'Task Source Name' -EntryType Warning -EventId 1010 -Message 'This is a test event created by task'
"
						]]>
						
					</Argument>
				</Parameters>

			</ConsoleTask>
		</ConsoleTasks>
	</Presentation>

	<LanguagePacks>
		<LanguagePack  ID ="ENU" IsDefault ="true">
			<DisplayStrings>
				<DisplayString  ElementID ="CustomTasks.CreateEvent">
					<Name>CT - Create Test Event</Name>
					<Description>Creates a Warning test event</Description>
				</DisplayString>
			</DisplayStrings>
		</LanguagePack>
	</LanguagePacks>
	<Resources>
		<Assembly ID ="CustomTasks.CTCreateEventAssembly" Accessibility ="Public" FileName ="Res.CustomTasks.CTCreateEventAssembly" HasNullStream ="true" QualifiedName ="Res.CustomTasks.CTCreateEventAssembly" />
	</Resources>  
</ManagementPackFragment>

Here’s the output:

[Screenshot: the task output]

Hope this helps someone out there with a similar need.

For further reading, you can go through this thread:

Powershell script in a console task?

Keep SCOMing 🙂

Cheers