Channel: Stefan Roth

SCSM – SMLets “The criteria could not be parsed. See inner exception for details.”


If you are using the SMLets from CodePlex in combination with PowerShell remoting to retrieve information from Service Manager, you may well hit this error:

The criteria could not be parsed. See inner exception for details.

    + CategoryInfo          : OperationStopped: (:) [], InvalidCriteriaException

    + FullyQualifiedErrorId : The criteria could not be parsed. See inner exception for details.

    + PSComputerName        : Computer1

In my case the setup consisted of two environments, TEST and INTEGRATION, which were supposed to be identical. In both environments we had SMA and Service Manager installed. I wanted to run a PowerShell workflow in SMA to retrieve the service request ID from a newly generated service request. The fun part was that executing the code in the INTEGRATION environment did not cause any problem, but if I executed the code in the TEST environment I faced the error above. The code that generated the error was:

 

$SRID = "SR445"

$SRClass = Get-SCSMClass -Name "System.WorkItem.ServiceRequest$"

$SRObject = Get-SCSMObject -Class $SRClass -Filter "ID -eq $SRID"

If I used Where-Object like…

 

$SRID = "SR445"

$SRClass = Get-SCSMClass -Name "System.WorkItem.ServiceRequest$"

$SRObject = Get-SCSMObject -Class $SRClass | Where-Object {$_.ID -eq $SRID}

 

or if I used the Name property to filter the service requests like…

 

$SRID = "SR445"

$SRClass = Get-SCSMClass -Name "System.WorkItem.ServiceRequest$"

$SRObject = Get-SCSMObject -Class $SRClass -Filter "Name -eq $SRID"

 

I did NOT hit any error! So the part that caused the failure was the -Filter "ID -eq $SRID" statement. Again, I only received the error in the TEST environment; in the INTEGRATION environment all the samples above worked as expected! So what was wrong? After a long time of troubleshooting and comparing the environments, my colleague and automation guru Fulvio Ferrarini finally found the cause of the problem (all credit to him).

If you check the inner exception to get more details you find this error:

 

Exception of type ‘System.OutOfMemoryException’ was thrown.

    + CategoryInfo          : OperationStopped: (:) [], OutOfMemoryException

    + FullyQualifiedErrorId : Exception of type ‘System.OutOfMemoryException’ was thrown.

    + PSComputerName        : Computer1
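In case you wonder how to get at the inner exception yourself: it hangs off the error record's exception. A small sketch — the nested exception here is constructed artificially just to demonstrate; in the real case, $Error[0] after the failed Get-SCSMObject call holds the SMLets error record:

```powershell
# Sketch: dig the inner exception out of an error record.
# The nested exception below is fabricated only for the demo.
try {
    try { throw [System.OutOfMemoryException]::new() }
    catch {
        throw [System.InvalidOperationException]::new(
            "The criteria could not be parsed.", $_.Exception)
    }
}
catch {
    # $_ is the ErrorRecord; the same works with $Error[0] after a failure
    $Inner = $_.Exception.InnerException
}
$Inner.GetType().Name
```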

 

What the hell?

To execute the PowerShell workflow on the remote Service Manager server from SMA, I was using InlineScript{} and PowerShell remoting. So we compared the WinRM (Windows Remote Management) settings on the Service Manager servers and found a major difference:

INTEGRATION

image

TEST

image

MaxMemoryPerShellMB was reduced to 300 MB by a GPO! After adjusting the GPO everything worked as expected…

image
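For reference, the per-shell memory quota can be inspected and changed through the WSMan: drive on the Service Manager server. A sketch, assuming administrative rights; the 2048 MB value is just an example, and note that a GPO will re-apply its own value at the next policy refresh, so fixing the GPO itself (as we did) is the real solution:

```powershell
# Inspect the current per-shell memory quota on this server
Get-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB

# Raise it (example value - pick what fits your workload); a GPO
# may overwrite this again, so adjust the GPO itself if one applies
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 2048

# Restart WinRM so the new quota takes effect
Restart-Service WinRM
```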

I hope this gives you the proper hint if you run into a similar issue.


Filed under: Service Manager, SMA, Troubleshooting

Microsoft Azure Operational Insights – Mobile App Available


image

I have blogged previously about Microsoft Azure Operational Insights, which is still in a preview stage, and you can test it free of charge. Microsoft has already done a great job visualizing all the data collected by the Intelligence Packs. Now you are also able to have all this information on your Windows Phone. The release of the Microsoft Azure Operational Insights mobile app lets you look at all the available information on your mobile device.

You are able to get access to all the Intelligence Packs and their predefined queries…

image image image

…or access your defined custom dashboard…

image

Or you can even fire off a query for a quick analysis of your data using the custom query language in the Search Data Explorer…

image

From what I have seen, the app has some limitations:

  • No Intelligence Packs can be added
  • My Dashboard cannot be customized
  • No access to the traffic usage information
  • No Direct Agent configuration
  • Configuration Assessment information is partly available (I think because one part is still based on Silverlight)
  • Besides the Workspace you cannot change any other user settings information

I am sure you can expect functional changes within Microsoft Azure Operational Insights, whether you use your browser or the mobile app, in the upcoming months. So this is just a point-in-time review.

Tip: Did you know that you can display your Windows Phone screen on your Windows computer and take screenshots? Use the very cool Project My Screen App for Windows Phone; that’s how I got the screenshots above :).


Filed under: Azure Operational Insights, Configuration, Dashboard, Software, Tool

SCSM – Get Service Request Object From Any Nested Child Runbook Activity


When you start to automate stuff in your cloud environment, you probably use SMA for automation and Service Manager to trigger the necessary runbooks in SMA. As great as this is, you will start facing many new challenges because you have to script in PowerShell to achieve your goals. Depending on your architecture, your strategy, and how you plan your runbooks, you will end up with more problems or, the far better way, fewer problems.

One strategy I follow is to have just one parameter, the runbook activity ID, which is passed from the runbook activity template in Service Manager to the SMA runbook. So imagine you pass the runbook activity ID to your SMA runbook and you want to get some information from the parent work item, such as the service request. If the runbook activity is placed directly on the service request itself, this is not a problem, as in this example…

image

You can write a few lines of code like this….

image

…and the result will look like this…

image

As you can see you will get the ID from the parent work item. In this case the service request SR2067.

But how do you proceed if your runbook activity, e.g. RB2135, is deeply nested in an activity tree, because you are building the SCSM activity tree dynamically or you have a very complex template? If you used the code above, you would receive the ID of the immediate parent work item, in this case the sequential activity SA2134…

image

The (theoretical) solution approach is to jump from the runbook activity RB2135 to the parent sequential activity SA2134, use this activity as the starting point to find the next parent work item PA2133, and from there step up again, and so on, until you are at the service request level, SR2126. “Nice words bro, but how does this look in PowerShell?” – Good question, buddy.

Here is some background information. If an activity is placed on the service request, you can use the relationship System.WorkItemContainsActivity$ to query the activity. As you can see, the relationship has source WorkItem and target Activity. That’s the same relationship we used in the first code sample above…

image

Since you know how to get the parent work item and you know how the relationship works, you need a loop that uses the parent work item as the input and starting point for iterating one level up the hierarchy at a time.

Here I have written just a few lines of code to solve this challenge…

image

$RunbookActivityID = "RB2135"

#Get the SMA runbook activity class (Cireson SMA connector)
$ActivityClass = Get-SCSMClass -Name System.WorkItem.Activity.SMARunbookActivity$

#Get the SMA runbook activity object
$ActivityObject = Get-SCSMObject -Class $ActivityClass -Filter "Name -eq $RunbookActivityID"

#Get the relationship class System.WorkItemContainsActivity$
$RelationWorkItemActivity = Get-SCSMRelationshipClass System.WorkItemContainsActivity$

#Get the relationship objects that target the activity, match the relationship ID and select the source object
$Object = (Get-SCSMRelationshipObject -ByTarget $ActivityObject | Where-Object {$_.RelationshipId -eq $RelationWorkItemActivity.Id}).SourceObject

While ($Object.Name -notlike "SR*")
{
    Write-Host -ForegroundColor Green $Object.Name
    $Object = (Get-SCSMRelationshipObject -ByTarget $Object | Where-Object {$_.RelationshipId -eq $RelationWorkItemActivity.Id}).SourceObject
}

Write-Host -ForegroundColor Yellow $Object.Name

…and if you run the code it looks like this…

image

Having this code in place we are able to access the service request properties, e.g. $Object.Name, and also the ID, which we can use in further runbooks. Of course you could modify the While ($Object.Name -notlike "SR*") statement to stop at another point in the activity hierarchy, like the next sequential activity (SA*) or the next parallel activity (PA*).
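If you need this in several runbooks, the loop can be wrapped in a small helper. This is only a sketch, assuming the SMLets module is loaded and a connection to the management server exists; the function name and its -StopPattern parameter are my own invention, not part of SMLets:

```powershell
# Hypothetical helper: walks up the System.WorkItemContainsActivity$
# chain from any activity object until the parent's name matches the
# stop pattern (the service request by default).
function Get-ParentWorkItem
{
    param(
        [Parameter(Mandatory=$true)]$ActivityObject,
        [string]$StopPattern = "SR*"   # e.g. "SA*" or "PA*" to stop earlier
    )
    $Relation = Get-SCSMRelationshipClass System.WorkItemContainsActivity$
    $Object = (Get-SCSMRelationshipObject -ByTarget $ActivityObject |
        Where-Object { $_.RelationshipId -eq $Relation.Id }).SourceObject
    while ($Object -and $Object.Name -notlike $StopPattern)
    {
        # One more step up the hierarchy
        $Object = (Get-SCSMRelationshipObject -ByTarget $Object |
            Where-Object { $_.RelationshipId -eq $Relation.Id }).SourceObject
    }
    return $Object
}
```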

In this example, and in all our other automation projects, we use the Cireson SMA Connector and the SMLets from CodePlex.

Here I provided the code logic; you just need to fit it into a workflow to use it in SMA. Happy SCSM-SMA-ing.


Filed under: Script, Service Manager, SMA

NiCE Datacenter Tag 2015 – Session “Operations Manager Dashboards – Neue Widgets und Möglichkeiten”


image

This blog title probably looks a bit strange to you because it is in German. The company NiCE is organizing a “Datacenter Day” in Munich, and all kinds of topics will be presented by Microsoft, partners and people from the community. I will be having a session about the new widgets and their capabilities in SCOM called “Operations Manager Dashboards – Neue Widgets und Möglichkeiten” (“Operations Manager Dashboards – New Widgets and Possibilities”).

According to Microsoft, 60% of System Center customers are actively using System Center Operations Manager (SCOM) today. While there are many different System Center events available, only a handful of them focus predominantly on SCOM and new products like Azure Operational Insights (OpInsights) for big data analytics. As a leading System Center ISV in Germany, NiCE will be hosting an event in conjunction with Microsoft to address the above mentioned gap. The objectives of the event include:

  • Present and discuss the latest developments and news on SCOM and OpInsights
  • Hands-on tips and tricks for using SCOM in large environments
  • Illustration and recount of the common System Center integration scenarios
  • Present partner extensions for System Center
  • Provide top content in German for the DACH region

The detailed agenda looks like this (subject to change):

image

If you are interested in attending, the event will be held on:

February 17th, 2015 at Microsoft München, Konrad-Zuse-Straße 1, 85716 Unterschleißheim.


Filed under: Azure Operational Insights, Dashboard, Recommended

System Center – RSS Feeds Collection


It doesn’t matter if you are new to the System Center stack or if you have been working in the System Center field for many years; we all share the same problem: how do we keep up with the massive amount of information dropping in every single day? There are so many excellent sources out there, but you have to find them and get informed when there is a new article or blog post.

I think staying up-to-date is essential for most of us, especially if you are working as a consultant. So, how do we stay ahead? Well, I like Twitter very much when it comes to interacting with the community and getting information immediately. In my opinion Twitter is the fastest source, delivering information almost in real-time.

Another source that everyone knows about, but somehow not a lot of people use, is RSS feeds. The technology started about 15 years ago and has been developed further ever since. RSS feeds are based on XML and publish the blog or website content in a structured way. Almost any blog or website has such functionality, and if it is enabled it lets you collect all the published content in a very comfortable way. You can use many applications for displaying RSS feeds, like Internet Explorer, Outlook, or dedicated applications like Feedreader. You can use Feedreader online or install it on your Windows box.

image

Feedreader (Online)

image

Feedreader (Client)

image

I would like to share my (personal) feeds, which I collected and tried to categorize by topic and/or technology (download the OPML file at the end of the post). I tried to collect as many good and reliable sources as possible for staying up-to-date in the System Center field. If I missed any valuable source it was not my intention; let me know and I will add it.

For IT News:

image

For Microsoft Azure:

image

For Microsoft Support:

(Latest updates from Microsoft on product issues and fixes)

image

Microsoft Product Team Blogs:

(All feeds from http://blogs.technet.com/b/server-management/)

image

PowerShell Blogs:

image

(Mostly but not only) SCCDM MVPs Blogs:

(If I missed a valuable blog let me know!)

image

System Center Community Blogs:

(Here I categorized several community blogs from Microsoft employees and community members)

image

If you are not interested in certain parts you can easily delete or update these feeds yourself. My goal was to provide a solid System Center / server management / community RSS feed collection.

I exported the above feeds in the OPML format, which you can import into almost any feed reader. You can find the file here: TechNet Gallery.
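For reference, OPML itself is just a small XML format for nested feed lists, which is why almost every reader can import it. A minimal sketch of what such a file looks like — the feed URL below is a placeholder, not one of the feeds in my collection:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<opml version="1.0">
  <body>
    <outline text="System Center">
      <!-- xmlUrl values are placeholders, not the actual feeds in the download -->
      <outline text="Example Blog" type="rss"
               xmlUrl="https://example.com/feed" htmlUrl="https://example.com"/>
    </outline>
  </body>
</opml>
```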


Filed under: Configuration, Orchestrator, Recommended, RSS, Service Manager, Software, System Center

SMA – SMA Runbook Toolkit (SMART) Watch The Encoding


Recently, while building some serious SMA runbooks, I bumped into an issue. We usually develop the SMA runbooks in different editors like PowerShell ISE / SMA GUI / Notepad++ and keep them saved in TFS online, which we manage through Visual Studio. After we finished developing a runbook which would update the runbook activity in Service Manager, I discovered some strange characters in the description field.

The description of the review activity looked like this…

image

Our development process works like this. First we write a runbook in our favorite editor and if we have a working release we copy the script to the SMA development environment (DEV) and see how it goes. If everything works as expected we use the SMART toolkit to export and import the runbooks into the integration environment (INT). This process worked great so far and we have been doing this for dozens of runbooks without an issue. At one point there was a need to write some German text in a description field of a review activity in Service Manager. In the screenshot above you can see that Service Manager could not display certain characters properly.

By checking the raw runbook in TFS everything seemed to be ok, it appeared like this…

image

The description text contains some summary data from the service request, written in German, plus a French character. The description text is wrapped in a here-string and the problematic character is an “à”. I was wondering how the script looked in SMA, so I checked the editor there. The runbook itself in SMA looked like this; notice the “rectangle” characters…

image

The only difference between the TFS version and the SMA version was that the SMA runbook had been imported using the SMART tool. This confirmed my suspicion that the import process could not deal with this special “à” character. I checked other places where I had other “non-English” characters like “ä, ö, ü etc.” and they also did not appear correctly. After some more investigation, I found another runbook which also contained such problematic characters BUT did not create any problems in SMA or in Service Manager. All characters appeared properly, as they should.

Hmm…

Let’s summarize so far: I have two runbooks, both containing special characters, and both were imported through the SMART toolkit, BUT one runbook did NOT display the special characters properly in SMA/SCSM while the other displayed the description string correctly.

Analysis:

Luckily we manage the runbooks in TFS so I made a copy of both runbooks and looked at the files in a HEX editor. Why? Well, if you think about it, runbooks are just PowerShell files and PowerShell files are just plain text files with a “PS1” suffix. To see if there is a difference we need to analyze the text file in a HEX editor “to see behind the curtain”. A possible choice since we use Notepad++ a lot, is to install the HEX Editor plugin for Notepad++…

image

image

I restarted Notepad++, then opened both files and switched to HEX view. On the left side is the “OK” runbook and on the right side the “Not OK” runbook that caused the problems. We clearly see there is a difference in the code…

image

Just from a first brief look we can tell that the files are not in the same “format”. The “format”, or better, the character encoding of the files seems to be different. So let’s check both encoding types in Notepad++…

image

In the example above we also see that the “OK” script uses a UCS-2 Little Endian (UTF-16) encoding and the problematic “Not OK” file an ANSI encoding. By just looking at the files themselves we cannot recognize any difference. Well, I am not an encoding expert, but as far as I know the most popular encoding on the World Wide Web is UTF-8, and if I look at the metadata of the SMA web service https://smaserver:9090/00000000-0000-0000-0000-000000000000/$metadata, the encoding type used in the header is also UTF-8.

image

Conclusion:

If we look at the SMART toolkit, it uses the Import-SmaRunbook cmdlet, which in turn uses the SMA web service to import/update the runbooks. This leads us to the conclusion that it is best to save all the runbooks you write in UTF-8 encoding. Usually you will not encounter any character problems as long as you just write plain PowerShell, because there you don’t have any special characters. BUT(!) as soon as you start adding free text to your PowerShell workflows, in a here-string or a regular string written in your native language, which might be German, Dutch or any other language, you are very likely to run into this issue when you import your runbooks using the SMART toolkit or the Import-SmaRunbook cmdlet itself.

Solution:

How can you avoid running into this issue? Well, as I mentioned, you need to save your scripts UTF-8 encoded. If you use PowerShell ISE version 3 or higher, ALL your scripts are natively saved as UTF-8. In this screenshot I just checked my PowerShell ISE on a Windows 8.1 machine…

image

As soon as you edit/save your scripts in different editors, or you copy your files around from one environment to another, it can happen that the encoding of the file changes. Is there an easy solution to fix this problem?

Well, if you just have one or two files you can check the encoding in Notepad++ or any other capable editor.

image

If you have multiple files you could use this PowerShell script to check the encoding. Finally, if you really just want to convert everything in bulk to UTF-8 in one shot, I provide here a simple script that takes a source directory and a target directory and will copy and convert all files recursively from the source directory to the target directory in UTF-8 format. I converted the problematic files in my environment and everything appeared as expected.
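The core of such a bulk conversion can be sketched in a few lines. This is only a minimal sketch, not the script from the download: the directory names and the seeded demo file are invented, and it relies on Get-Content detecting the BOM of the source file (like the UCS-2 Little Endian runbook above) before Out-File re-writes the text as UTF-8:

```powershell
# Sketch: copy all .ps1 files from a source directory to a target
# directory, re-encoded as UTF-8. The directories are throwaway
# examples in TEMP, seeded with one demo file.
$Source = Join-Path ([IO.Path]::GetTempPath()) "RunbooksSource"
$Target = Join-Path ([IO.Path]::GetTempPath()) "RunbooksUtf8"
New-Item -ItemType Directory -Path $Source, $Target -Force | Out-Null

# Seed one UCS-2 Little Endian file, like the "OK" runbook in this post
[IO.File]::WriteAllText((Join-Path $Source "demo.ps1"),
    'Write-Output "à ä ö ü"', [Text.Encoding]::Unicode)

# The actual conversion: Get-Content detects the BOM and decodes the
# text; Out-File re-writes it UTF-8 encoded
Get-ChildItem -Path $Source -Filter *.ps1 -Recurse | ForEach-Object {
    Get-Content -Path $_.FullName |
        Out-File -FilePath (Join-Path $Target $_.Name) -Encoding UTF8
}
```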

You can find the script here, I hope this helps.


Filed under: Configuration, Script, SMA, Troubleshooting

SMA – Runbooks Not Showing Up (e.g. Cireson SMA Connector)


My colleague and I have been developing runbooks for the past couple of months to automate business services. During this time we have written close to two hundred runbooks. The key application for triggering the runbooks is System Center Service Manager (SCSM) in conjunction with Cireson’s SMA Connector. This worked great up until last month, when we discovered that not all runbooks were showing up when we created SMA runbook templates in SCSM.

The problem looked like this…

Create a template in the SCSM console under Library/Templates

image

Next, in the dialog select the Runbook tab and select the SMA connector in this example SMACon01….

image

Another dialog pops up and you need to select a runbook. The problem was that the runbook was not visible in this list.

image

A possible cause could be that the runbook is not published in SMA, which was not the case. One workaround was to delete the runbook in SMA and re-create it. After a while the runbook would sometimes appear and sometimes not.

We checked the SMA database and restarted the SMA services and the SMA IIS services, but nothing really helped. After a while we figured out it must be an SMA web service issue. My colleague and automation guru Fulvio Ferrarini finally found the solution (all credit to him).

It seemed that when the SMA connector requested the runbooks, the SMA web service limited the number of entries returned by the request. Exactly this behavior is described in an MSDN article https://msdn.microsoft.com/en-us/library/dn688368.aspx.

Paging in Service Management Automation Web Service

System Center

For performance reasons, the Service Management Automation web service limits the number of entries that are returned by a single request. If you want to retrieve a number of members of a particular collection that exceeds the maximum number for that collection, then you must retrieve multiple pages using multiple requests. For most kinds of objects this limit is 50, but it can be changed through the configuration of the Service Management Automation application in Internet Information Services.

Changing the Number of Entries Returned

The number of entries returned from the Service Management Automation web service in a single request can be changed using the following procedure.

To change the number of entries returned in a single request
  1. On the computer running the Service Management Automation web service, select Start, then Administrative Tools, and then Internet Information Services (IIS) Manager.

  2. Expand the computer, then Sites.

  3. Select SMA.

  4. In the /SMA Home pane, double-click Application Settings.

  5. Double-click the type of collection that you want to change.

  6. In the Value field, type the number of entries to return and click OK.

In IIS it looks like this; the default value is 100 and we increased it to 300…

image

 

After this change all “missing” runbooks appeared. We have not fully tested this setting and still need some more time to verify it. So make these changes at your own risk and test them in your lab first!
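If you cannot or do not want to raise the server-side limit, the MSDN article's alternative is to retrieve multiple pages yourself. The paging loop can be sketched generically; here the actual web service call is abstracted into a script block (in practice that would be, for example, Invoke-RestMethod against the SMA OData endpoint with $top and $skip query options), so the paging logic itself is visible:

```powershell
# Sketch: fetch all entries from a paged collection. $FetchPage stands
# in for the real web service call; the function keeps requesting pages
# until it receives a page shorter than the page size.
function Get-AllEntries
{
    param(
        [scriptblock]$FetchPage,   # given $Skip and $Top, returns one page
        [int]$PageSize = 50        # the documented default limit
    )
    $All  = @()
    $Skip = 0
    do {
        $Page = @(& $FetchPage $Skip $PageSize)
        $All += $Page
        $Skip += $PageSize
    } while ($Page.Count -eq $PageSize)   # stop on the first short page
    return $All
}
```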

I hope this saves you some time!


Filed under: Configuration, SMA, Troubleshooting

SCOM 2012 / SCSM 2012 – CMP Tool V 1.0 (Compare Management Pack Tool)


This blog has been quiet for some time because I was quite busy with all kinds of stuff. Anyway, this time I would like to share a tool I created to compare management packs between SCOM/SCOM, SCOM/SCSM or SCSM/SCSM management groups. In real life it is hard to compare the management packs currently installed in management group A (Source) versus management group B (Target). This is relevant, for example, if you want a comparison between your SCOM integration environment and your production environment. You probably want to know which management packs are equal in both management groups, which are missing in management group A, and which are missing in management group B.

The tool looks like this..

image

How does it work? First you need to set your credentials and server FQDN for each of the management groups by selecting the “Connection Settings…” within the menu…

image

and set the credential, click save…

image

This will create a “config.xml” file in your application directory. The password is encrypted…

image

Next, click “Get Source MP to XML” and “Get Target MP to XML”. This will connect to the source and target management groups, get all management packs imported in each management group, and create an “MPSource.xml” and an “MPTarget.xml” file in your application directory. It will look like this…

image

If you just click “Compare MPs”, the tool displays all management packs that are EQUAL in green; the management pack name (MP Name) is the criteria…

image

In the “MP Comparison Criteria Selection” you are able to select what the comparison criteria is, in this case the management pack name (MP Name)…

image

If you select “Show difference by…” and click “Compare MPs” the tool will show which management packs are MISSING/DIFFERENT in each management group according to the selected criteria…

image

If you click “Export Excel” and you have Excel installed on the computer you are running the tool on, it will dump the result into an Excel sheet…

image

image

Because you are able to connect to each management group separately and create the “MPSource.xml” and/or “MPTarget.xml”, you can compare management groups that are not connected through the network or reachable in any way. Just copy the tool to each management server, connect and create the XML files. Then copy both files, “MPSource.xml” and “MPTarget.xml”, into the same directory and run the tool offline. Cool huh?
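The comparison the tool performs by MP Name boils down to a set comparison between two name lists, which PowerShell covers with Compare-Object. A sketch with invented management pack names (not the real content of the XML files):

```powershell
# Sketch: compare two management pack name lists by MP Name.
# The names below are invented examples.
$SourceMPs = "Microsoft.Windows.Server.2012", "Microsoft.SQLServer.2012", "Custom.MP.A"
$TargetMPs = "Microsoft.Windows.Server.2012", "Custom.MP.B"

# EQUAL in both management groups (what the tool shows in green)
$Equal = Compare-Object $SourceMPs $TargetMPs -IncludeEqual -ExcludeDifferent |
    ForEach-Object { $_.InputObject }

# MISSING on each side: '<=' means only in the source list,
# '=>' means only in the target list
$Diff = Compare-Object $SourceMPs $TargetMPs
$MissingInTarget = ($Diff | Where-Object SideIndicator -eq '<=').InputObject
$MissingInSource = ($Diff | Where-Object SideIndicator -eq '=>').InputObject
```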

The cool thing is, you are able to connect to SCOM 2012 and SCSM 2012 management groups.

Requirements:

  • SCOM 2012 / SCSM 2012
  • .NET Framework 4.5
  • SCOM 2012 SDK Binaries (included in the download, not necessary if you have the SCOM console installed)
  • Excel 2013 (optional)

Download from TechNet Gallery

I have tested the tool, but let me know if you run into a problem….

Update:

13.03.2015 Version 1.0.1 – Fix when reloading XML, Form Resize


Filed under: Configuration, Management Pack, Service Manager, System Center, Tool, Troubleshooting

SCSM – Passing Attachments via Web Service e.g. SMA Web Service


If you do automation in a “Microsoft world” you probably start these days with Service Manager and Service Management Automation (SMA). At some point you will need to build a connector to another system for exchanging data. One example could be that you need to submit incident data from Service Manager to another ticket system, e.g. ServiceNow. If you are lucky the target system offers a web service and you just need to pass the data in XML. That sounds pretty easy, but how are you going to pass the attachments to the foreign system via web service? One approach is to get the attachments from the source, save them as files on a file share, and then have the files pushed to or pulled from the target system. Yes, this is one way, but why should you take this extra step of saving the attachment files on a file share? Isn’t it possible to just pass the data in one shot? Well, that is the purpose of this blog post.

I don’t have access to ServiceNow or another ticketing system that offers a web service interface. For that reason I will use the SMA web service to show how to convert the attachments and pass them as string and XML data.

For a better understanding I created a graphic…

image

The first runbook is called Get-Attachment and has an input parameter WorkItemID. The runbook connects to Service Manager, iterates through the attachments of the specified work item, in this example IR1234, and converts each of them to a Base64 string. The last line of the Get-Attachment runbook uses the Start-SmaRunbook cmdlet to call the Add-Attachment runbook. The Start-SmaRunbook cmdlet uses the SMA web service to trigger the runbook and submit the attachment data. This is the step that proves you can submit the attachment data through a web service call.

The Add-Attachment runbook takes the “XML” output string from the Get-Attachment runbook and converts it to XML. The actual attachment content is converted back from the Base64 string and finally added as an attachment to the work item.

Well, yes, it adds the attachments back to the same work item it got them from (IR1234); the purpose of this post is to show the round trip of the attachments. It does not make sense in a real-world scenario, I know; again, it is just to show how to solve this problem technically.

There are a few posts on the internet that show how to export attachments from Service Manager. One example is provided by the guys from Litware; you can find it here. I used some of this code to retrieve the attachments from the work item, but instead of saving the attachments to the file system, the content is converted to a Base64 string.
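The core idea, bytes to Base64 and back, is a plain round trip independent of Service Manager. A minimal sketch with invented sample bytes standing in for the real attachment stream:

```powershell
# Sketch: the Base64 round trip used for the attachment content.
# Sample bytes stand in for the real attachment memory stream.
$OriginalBytes = [Text.Encoding]::UTF8.GetBytes("attachment payload")

# Encode for transport inside the XML string...
$Base64 = [Convert]::ToBase64String($OriginalBytes)

# ...and decode again on the receiving side
$DecodedBytes = [Convert]::FromBase64String($Base64)
$RoundTrip = [Text.Encoding]::UTF8.GetString($DecodedBytes)
```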

Workflow Get-Attachment:

workflow Get-Attachment
{

[OutputType([string[]])]
Param
(
        #Get the work item ID
        [Parameter(Mandatory=$true)][string]$WorkItemID
)
            #Define the web service endpoint, target SCSM & credentials in SMA (Assets)
            $WebServiceEndpoint = Get-AutomationVariable -Name "VARG-SMAWebServiceEndPoint"
            $SCSMServer = Get-AutomationVariable -Name "VARG-SCSMServer"
            $Creds = Get-AutomationPSCredential -Name "VARG-SCSMServerCredential"
            #Run the next code as InlineScript on Service Manager
            [string]$XMLString = InlineScript 
            {
                #Define classes and relationships
                $ClassWorkItem = Get-SCSMClass -Name "System.WorkItem$"
                $ClassWIhasAttachment = Get-SCSMRelationshipClass -Name "System.WorkItemHasFileAttachment"
                $WorkItemObj = Get-SCSMObject -Class $ClassWorkItem -Filter "Id -eq $Using:WorkItemID"
                $Attachments = Get-SCSMRelatedObject -SMObject $WorkItemObj -Relationship $ClassWIhasAttachment

                $XMLString = ""
                $Content =""
                $Content= @()
                #If the incident contains attachments
                If($Attachments -ne $Null)
                {
                        #Iterate through each attachment and "convert" it into a memory stream
                        ForEach ($Attachment in $Attachments)
                        {      
                                $MemoryStream = New-Object IO.MemoryStream
                                $Buffer = New-Object byte[] 8192
                                [int]$BytesRead = 0
                                while (($BytesRead = $Attachment.Content.Read($Buffer, 0, $Buffer.Length)) -gt 0)
                                {
                                    $MemoryStream.Write($Buffer, 0, $BytesRead)
                                } 

                                $Memory = $MemoryStream.ToArray() 

                        #Convert the bytes (attachments) in memory to a Base64String                          
                        $Content = [convert]::ToBase64String($Memory)

#This here-string is used to concatenate the data in a string format for passing it to the next runbook via web service                     
$XMLString +=@"
<Attachment>
<DisplayName>$($Attachment.DisplayName)</DisplayName>
<Description>$($Attachment.Description)</Description>
<Extension>$($Attachment.Extension)</Extension>
<Content>$Content</Content>
</Attachment>
"@
                            #Close the memory stream and release the buffer
                            $MemoryStream.Close()
                            $Memory = $Null
                        }
                 #Return the previously built here-string with all the attachment data     
                 Return $XMLString
             }         
         }    -PSComputerName $SCSMServer -PSCredential $Creds   
    #Start the SMA runbook via Start-SMARunbook cmdlet, which uses the SMA webservice to pass the data
    Start-SmaRunbook -WebServiceEndpoint  $WebServiceEndpoint -Name "Add-Attachment" -Parameters @{"XMLString"=$XMLString;"WorkItemID"=$WorkItemID}
}

The next step is adding the attachments to the work item. There are also a few sources on the internet that discuss this topic, but not many really show how to do it. The best “hint” I found was this post on the TechNet forum. To complete our solution I needed to modify the code accordingly. Basically, I create an XML object from the $XMLString input containing all the attachment information. The attachment content itself is converted back from the Base64 string and finally used to create and relate the attachment object.

Workflow Add-Attachment:

workflow Add-Attachment
{
Param
(
        #Pass attachment data and work item ID
        [Parameter(Mandatory=$true)][string]$XMLString,
        [Parameter(Mandatory=$true)][string]$WorkItemID
)
        #Get the SMA automation variable (Asset)
        $SCSMServer = Get-AutomationVariable -Name "VARG-SCSMServer"
        $Creds = Get-AutomationPSCredential -Name "VARG-SCSMServerCredential"
        #Run the next code as InlineScript remotely on the SCSM server
        InlineScript
        {
            #Define the classes and relationships
            $FileAttachmentRel = Get-SCSMRelationshipClass "System.WorkItemHasFileAttachment"
            $FileAttachmentClass = Get-SCSMClass -Name "System.FileAttachment"
            $WorkItemProjection = Get-SCSMObjectProjection System.WorkItem.Projection -Filter "id -eq $Using:WorkItemID"
            $ManagementGroup = New-Object Microsoft.EnterpriseManagement.EnterpriseManagementGroup $Using:SCSMServer
            #Embed the $XMLString into <Attachments></Attachments> nodes and convert it to XML
            [xml]$XML = "<Attachments>$($Using:XMLString)</Attachments>"
                #Iterate through each attachment node and get the properties
                ForEach ($Attachment in $XML.Attachments.Attachment)
                {
                    #Convert the Base64String back to bytes
                    $AttachmentContent = [convert]::FromBase64String($Attachment.Content)
                    #Create a new MemoryStream object out of the attachment data
                    $MemoryStream = New-Object System.IO.MemoryStream($AttachmentContent,0,$AttachmentContent.Length)
                    #Create the attachment object itself and add the attachment properties from the received XML
                    $NewFile = New-Object Microsoft.EnterpriseManagement.Common.CreatableEnterpriseManagementObject($ManagementGroup, $FileAttachmentClass)
                    $NewFile.Item($FileAttachmentClass, "Id").Value = [Guid]::NewGuid().ToString()
                    $NewFile.Item($FileAttachmentClass, "DisplayName").Value = $Attachment.DisplayName
                    $NewFile.Item($FileAttachmentClass, "Description").Value = $Attachment.Description
                    $NewFile.Item($FileAttachmentClass, "Extension").Value = $Attachment.Extension
                    $NewFile.Item($FileAttachmentClass, "Size").Value = $MemoryStream.Length
                    $NewFile.Item($FileAttachmentClass, "AddedDate").Value = [DateTime]::Now.ToUniversalTime()
                    $NewFile.Item($FileAttachmentClass, "Content").Value = $MemoryStream
                    #Add the attachment to the work item and commit the changes
                    $WorkItemProjection.__base.Add($NewFile, $FileAttachmentRel.Target)
                    $WorkItemProjection.__base.Commit()
                }
        } -PSComputerName $SCSMServer -PSCredential $Creds   
} 

It was a bit tricky to figure everything out, but I hope providing this example helps you automate your environment. Of course, if you need to build e.g. a ServiceNow SOAP request, you would have to construct the XML according to the ServiceNow SOAP specification.

I uploaded the PowerShell scripts to TechNet Gallery here.


Filed under: Configuration, Script, Service Manager, SMA, System Center

SCSM – Set Work Item State / Activity State via PowerShell


I am currently doing a lot of automation, and today I needed to change the state of a service request. The service request itself was “Closed” and, as you might know, you cannot change anything once a work item is set to “Closed”. Because I needed to trigger the runbook activities behind the service request again, I had to “reactivate” it. How do we do this? Of course, using PowerShell and SMLets.

The service request looked like this and nothing can be changed anymore…

image

So I created this script to change the state of work items and activities.

Save this script as Change-WorkItemState.ps1

Param
    (    
        [Parameter(Mandatory=$true)][String]$ObjectID,
        [Parameter(Mandatory=$true)]`
        [ValidateSet("Active","Cancelled","Completed","Failed",`
        "OnHold","Ready","Rerun","Skipped","Closed","Active.Pending",`
        "Resolved","New","InProgress","Submitted","Editing")][String]$Status
    )

try
{
    # Settings and modules
    #####################################################################
    $ErrorActionPreference = "Stop"
    $Error.Clear()    
    If (!(Get-Module SMLets)) {Import-Module SMLets -Force}
    #####################################################################
    # Classes and enumerations
    #####################################################################
    #Get the first two characters of the $ObjectID e.g. IR1234 = IR and then get the class and enumeration       
    Switch (($ObjectID.Substring(0,2)))
    {    
        "MA" {$Class = Get-SCSMClass -Name ^System.WorkItem.Activity$;$EnumStatus = Get-SCSMEnumeration -Name ^ActivityStatusEnum.$Status$}
        "RB" {$Class = Get-SCSMClass -Name ^System.WorkItem.Activity$;$EnumStatus = Get-SCSMEnumeration -Name ^ActivityStatusEnum.$Status$}
        "SA" {$Class = Get-SCSMClass -Name ^System.WorkItem.Activity$;$EnumStatus = Get-SCSMEnumeration -Name ^ActivityStatusEnum.$Status$}
        "PA" {$Class = Get-SCSMClass -Name ^System.WorkItem.Activity$;$EnumStatus = Get-SCSMEnumeration -Name ^ActivityStatusEnum.$Status$}
        "SR" {$Class = Get-SCSMClass -Name ^System.WorkItem.ServiceRequest$;$EnumStatus = Get-SCSMEnumeration -Name ^ServiceRequestStatusEnum.$Status$}
        "PR" {$Class = Get-SCSMClass -Name ^System.WorkItem.Problem$;$EnumStatus = Get-SCSMEnumeration -Name ^ProblemStatusEnum.$Status$}
        "RR" {$Class = Get-SCSMClass -Name ^System.WorkItem.ReleaseRecord$;$EnumStatus = Get-SCSMEnumeration -Name ^ReleaseStatusEnum.$Status$}
        "IR" {$Class = Get-SCSMClass -Name ^System.WorkItem.Incident$;$EnumStatus = Get-SCSMEnumeration -Name ^IncidentStatusEnum.$Status$}
        "CR" {$Class = Get-SCSMClass -Name ^System.WorkItem.ChangeRequest$;$EnumStatus = Get-SCSMEnumeration -Name ^ChangeStatusEnum.$Status$}    
    }
    ###################################################################
    #Find the object           
    $Object = Get-SCSMObject -Class $Class -Filter "Name -eq $ObjectID"
    #Set object state and write output information
    If($Object)
    {
        Write-Host "Changing Status of $Object to $EnumStatus" -ForegroundColor Cyan
        Set-SCSMObject -SMObject $Object -Property Status -Value $EnumStatus  
        Start-Sleep -Seconds 2
        Write-Host "Status of $Object changed to $EnumStatus" -ForegroundColor Green
    }
    Else
    {
        Write-Host "$ObjectID not found" -ForegroundColor Red    
    }    
}
catch 
{
    Throw $Error[0].Exception
}

finally
{
    If (Get-Module SMLets) {Remove-Module SMLets -Force}
}

You need to provide the ID of the object, like IR1234, SR4978, RB3214 etc., for which you want to change the state. For the different work items and activities you can set the following states (I hope I got them all)…

Service Request (SR)

  • New
  • Closed
  • Completed
  • Failed
  • Cancelled
  • On Hold
  • In Progress
  • Submitted

Problem Record (PR)

  • Active
  • Resolved
  • Closed

Release Record (RR)

  • New
  • Cancelled
  • On Hold
  • Failed
  • In Progress
  • Completed
  • Closed

Incident Record (IR)

  • Active
  • Closed
  • Active.Pending (Pending)
  • Resolved

Change Request (CR)

  • New
  • Failed
  • In Progress
  • On Hold
  • Cancelled
  • Submitted
  • Completed

Manual Activity (MA), Runbook Activity (RB), Parallel Activity (PA), Sequential Activity (SA)

  • Completed
  • Cancelled
  • Skipped
  • In Progress
  • On Hold
  • Failed
  • Rerun
  • Pending
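
Calling the script is straightforward. A typical invocation looks like this (the IDs are just sample values; SMLets must be installed on the machine running the script):

```powershell
# Reactivate a resolved incident by setting its status to Pending
.\Change-WorkItemState.ps1 -ObjectID "IR5754" -Status "Active.Pending"

# Re-run a runbook activity
.\Change-WorkItemState.ps1 -ObjectID "RB3214" -Status "Rerun"
```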

Now let’s run an example. Assume you have an incident IR5754 whose state is “Resolved” and you want to change it to “Pending”…

image

Run the script…

image

et voilà….

image

The same procedure works for all activities (Manual Activity (MA), Runbook Activity (RB), Parallel Activity (PA), Sequential Activity (SA)) and for all the work items listed above. Make sure you have SMLets installed on the computer you run this script on.

Conclusion: You can easily change the state of any work item or activity using PowerShell. It is even possible to “re-animate” closed work items such as a service request.

Have fun!


Filed under: Configuration, Script, Service Manager, Troubleshooting

SCOM 2012 – NiCE Log File Library Creating Alert From Mail


image

Some time ago I blogged about the first release of the NiCE Log File Library; you can find the post here. I was very excited that NiCE released a free MP which provides lots of rules and monitors that can be configured in countless ways. It provides many capabilities to monitor any kind of log file on a Windows system. The good news is that NiCE has released a new, improved version 1.30 of their Log File Library.

So what’s new?

  • Improved SCOM Authoring Console Wizards
  • Improved Cluster Support
    • Fixing some minor cluster issues
  • Self-Monitoring enhanced
    • Several rules to monitor the MP itself and discover configuration issues.
  • Performance and stability improvements

Features & Functionalities of the NiCE Log File MP

  • Equipped with a set of custom authoring wizards so you can create
    • Alert Rules
    • Performance Counters
    • Unit Monitors
  • Allows you to filter log entries using regular expression language
  • Includes a powerful program execution interface to run scripts and programs
  • Correlate your log lines
  • Allows you to define log file names as absolute paths
  • Customize behavior if log file does not exist
  • Set the log file directory using a regex pattern with no restriction

The in-place upgrade from version 1.27 is supported, and a fresh installation runs without any issue. NiCE provides a solid MP which rocks like hell (FOR FREE), but what could be a NiCE use case?

I have seen some questions in the TechNet forums about how to create a SCOM alert from a mail. I think this is a perfect use case for this MP. OK, let’s assume we did not have this MP: how would we create a SCOM alert from a mail? E.g. you could use Orchestrator or SMA to connect to an Exchange or other mailbox and query / pull the mail using an Orchestrator integration pack or PowerShell. This is a valid solution, but I think there are too many systems involved, and maybe not everyone has the necessary skills or systems to do it. I would suggest trying another approach: using the Windows built-in SMTP service to receive the mails and the NiCE MP to parse them and create an appropriate alert.

Install SMTP Server

For ease of use I installed the SMTP service on my SCOM server OM01. Go to the “Add Roles and Features Wizard” and select “SMTP Server”…

image

“Add Features”…

image

…and hit “Install”….

image

Configure SMTP Server

Open the IIS 6.0 Manager and select “Properties”…

image

Assign the server’s IP address to the SMTP virtual server…

image

Depending on your needs you might need to enable all authentication methods on the “Access” tab like here…

image

To allow the server to relay to itself, you need to add the server’s (OM01) loopback and IP addresses on the “Relay Restrictions” tab. Depending on your situation you might need to add other relay access rules…

image

The remaining SMTP server settings do not need to be changed for this example. Of course, in production you would need to make sure that all the other settings fit your company policies.

What happens when you try to send a mail to this SMTP server? The server accepts your mail and puts it into the default C:\inetpub\mailroot\Queue folder, because it cannot be delivered to another mail server…

image

Later on I’ll show you how to test it.

Installing the NiCE Log File Library

Download the package from NiCE.de and run the installer NiCE_LogFileMP_0130.msi to extract the MPB file. There is just one management pack bundle file to import: if you are installing the MP for the first time, just import it; otherwise you can upgrade from the previous version 1.27. The prerequisites are minimal; just make sure .NET 2.0 SP1 is already installed on all monitored systems.

Create Alert Rule

Go to Authoring in SCOM and select Rules and start the wizard. Select “Expression Filtered / Alert Rule (Advanced)”…

image

Click “Next” and give the rule a name, choose the target and disable the rule…

image

Click “Next”. On this page you could run a command which would, e.g. in our scenario, delete old mails from the directory before the rule actually runs…

image

Click “Next” and select the log file path and a name pattern for the mails. The mail files start with “NTFS_” and end with “.EML”.

image

Click “Next” and enter the optional expression “Subject:” to filter the log file entries. In order to split the log line into different parts, use this expression:

Subject:\s+(?<Severity>\w+)\s+(?<Device>\w+)\s+(?<Location>\w+)

Using this expression splits the line into “named” groups which can be referenced later, e.g. in the alert description. I have three different parts: “Severity” (e.g. Critical), “Device” (name of the device) and “Location” (location of the device, RZ1, Datacenter1 etc.). Click “Regex testing tool”…

image

Using the built-in “Regex testing tool” helps you check your expressions. Under “Logfile Line”, enter the subject line you want to send by mail. In my example I just use spaces (\s+) to separate the information. The “Sample Output (XML)” shows you what the actual output is and whether your expression works…

image
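
If you want to verify the expression outside SCOM first, plain PowerShell uses the same .NET regex engine. A quick sketch (the sample subject line uses the made-up values from this post):

```powershell
# Sample subject line as it would appear in the mail file
$Line = "Subject: Critical Router01 RZ003"

# The same named-group expression as configured in the rule wizard
if ($Line -match 'Subject:\s+(?<Severity>\w+)\s+(?<Device>\w+)\s+(?<Location>\w+)')
{
    $Matches['Severity']   # Critical
    $Matches['Device']     # Router01
    $Matches['Location']   # RZ003
}
```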

On the other tab, “XPath”, you see the available variables you can use in the event filtering or the alert description…

image

Click “Ok” and “Next”. On the next page, set up the event filtering; here I will receive an alert if the severity is “Critical” and the device is “Router01” in the subject line…

image

Click “Next” and enter the alert name and description. In my case I use the variables from before and refer to them like this…

The Router $Data/RegexMatch/Device$ in $Data/RegexMatch/Location$ sent a critical alert received by mail.

The mail is $Data/LogFileName$.

image

Next, override the rule for just the object where the SMTP service is installed, in my case OM01. Choose “For a specific object of class Windows Server 2012 Operating System”, in my case OM01, where the rule will run…

image

Set “Enable” to “True”….

image

That’s it, everything is now configured to test the rule.

Create Mail

We have configured the SMTP service to receive mails on OM01. Because I don’t have a router or anything like that available to generate a mail, I use PowerShell. PowerShell has a cmdlet, Send-MailMessage, which perfectly fits our needs. To fire off a mail, type the following command…

Send-MailMessage -Body "Router Alert" -From "Router01@rz.com" -SmtpServer 172.16.40.145 -Priority High -Subject "Critical Router01 RZ003" -To om01@services.lab.itnetx.ch

image

If this command succeeds, you will find the mail in the Queue directory of the SMTP server…

image

…and will look like this…

image

In my lab it took about 5-6 minutes for the rule to pick up the mail, parse it and finally create the alert…

image

Of course you could configure any device to send a mail to the SMTP server and generate an alert. I have not tested how everything behaves in a production environment; however, I think it is a cool idea and a good starting point if you need such an approach. You could improve performance by frequently deleting the “old” mails from the Queue directory, e.g. by using the “Preprocessing Settings” of this rule.
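
A minimal sketch of such a cleanup (the queue path is the default mentioned above; the one-day retention period is an arbitrary assumption, pick whatever suits your alerting interval):

```powershell
# Remove queued mails older than one day so the rule does not
# have to re-scan old files (retention period is an assumption)
Get-ChildItem -Path "C:\inetpub\mailroot\Queue" -Filter "*.EML" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-1) } |
    Remove-Item -Force
```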

There are many more rules to explore of the NiCE Log File Library, so make sure to download your copy today.


Filed under: Configuration, Management Pack, Script

Quick Post – Book “Microsoft System Center Operations Manager Field Experience”


Microsoft recently released a 128-page e-book about System Center Operations Manager. The book covers the inner workings of SCOM, troubleshooting, optimization and best practices. I highly recommend reading it to all those who want to dive a bit deeper into SCOM.

image

Learn how to enhance your Operations Manager environment and better understand the inner workings of the product – even if you are a seasoned Operations Manager administrator. If you are responsible for designing, configuring, implementing, or managing a Microsoft System Center Operations Manager environment, this e-book is for you.

Thanks to the authors Danny Hermans, Uwe Stürtz, Mihai Sarbulescu, Mitch Tulloch for this excellent book. Find this book in different formats for FREE here.


Filed under: Book, Recommended

SCOM 2012 R2 Technical Preview 2 – What’s New?


image

A few days ago Microsoft released the technical preview of its Windows Server and System Center stack. One thing I am very interested in is SCOM and whether there is anything new.

If you download SCOM, it comes on a pre-configured VHD including the latest Windows Server (2016) TP2, SQL Server 2014 and SCOM 2012 R2 TP2. So no extra configuration is necessary and you can start testing right away.

So what’s new?

GUI / Interface / Web Console

The console and the web console do not show any changes in appearance or design. The version of SCOM 2012 R2 TP2 is 7.2.11097.0, as shown on the help screen.

image

Even in the dashboard or widget section I did not find anything new…

image

Management Packs

Out of the box, TP2 has 100 management packs installed. Most versions are higher than before; the only management packs whose versions are equal to the current SCOM 2012 R2 release are the following…

image

PowerShell Cmdlets

Comparing the cmdlets, there are no changes. SCOM 2012 R2 TP2 contains 173 cmdlets, and if I compare them against my current SCOM 2012 R2 server (OM01), there is no difference.

image
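
If you want to reproduce the comparison yourself, a minimal sketch would be to export the cmdlet list on each server and diff the two files (the file paths are just examples):

```powershell
# On each server: dump the cmdlet names of the OperationsManager module
Get-Command -Module OperationsManager |
    Select-Object -ExpandProperty Name |
    Sort-Object | Set-Content "C:\Temp\OM-Cmdlets-$env:COMPUTERNAME.txt"

# Then compare the two exported lists; no output means no difference
Compare-Object (Get-Content "C:\Temp\OM-Cmdlets-TP2.txt") (Get-Content "C:\Temp\OM-Cmdlets-OM01.txt")
```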

Services

The Windows services are still the same as in the current SCOM 2012 R2 version.

Connectors

The Azure Operational Insights connector is now named correctly and also integrated out of the box…

image

Functionality

Here comes the fun part: I found TWO ONE improvement / feature.

Download Management Packs

Update 07.05.2015: Thanks to @rem8 for telling me that this task also exists in the SCOM 2012 R2 version, so this is NOT NEW in SCOM 2012 R2 Technical Preview 2! My mistake :(

There is a task to download management packs locally to your server / client from Microsoft via the management pack download wizard…

image

If you click “Download Management Packs”, you will be able to select a directory where the MPs should be saved….

image

…and if you click “Add” you get a familiar dialog to select the MPs from Microsoft’s online store…

image

…after you click “OK” all the *.mp files will be downloaded…

image

Now you have the management packs available offline. A very handy wizard and functionality!

Maintenance Mode Scheduler

Up to now it was not possible to schedule maintenance mode in the future. Of course you could use scheduled tasks and PowerShell scripts or third-party tools to do this, but depending on the approach it was not resilient, reliable or flexible. Go to “Administration” and you will find a new option called “Maintenance Schedules”…

image

This kicks off a wizard to select the objects to put into maintenance mode…

image

…the default class is “Computer” (Microsoft.Windows.Computer), which is probably the most commonly used class for this purpose. You could also pick any other object of any available class. As you can see, I am able to pick my SCOM server to put into maintenance mode, which you should NEVER EVER do if you have just one management server! It would have been perfect if Microsoft had prevented selecting management servers in this wizard, or at least shown a warning.

On the next page you are able to select the schedule…

image

After you set the schedule, you need to give it a name and select a reason. In addition you are able to enable / disable this maintenance mode schedule…

image

Finally you see your entries in the console…

image

I was wondering where SCOM saves these entries, so I explored the OperationsManager database and found what are probably the most important tables. As you can see, dbo.MaintenanceModeSchedule contains the previously created entry…

image

…this is linked to the dbo.ScheduleEntity table….

image

…which is linked to dbo.BaseManagedEntityId…

image

…there is also a view dbo.MaintenanceModeView, which shows the objects in maintenance mode…

image

…and also some stored procedures to start and stop the maintenance mode or grooming out the maintenance mode history etc….

image

I am happy to see that Microsoft is adding new features to SCOM, especially that there is now a maintenance mode scheduler that works. It is something that was really, really missing. Hopefully there will be more improvements and enhancements in the upcoming releases. I hope this gives you a good overview of SCOM 2012 R2 Technical Preview 2.


Filed under: Configuration, Software, System Center

Merger of itnetx gmbh and Syliance IT Services GmbH


image

BERN/ZÜRICH. Two of the most important companies in the field of Microsoft Workplace, Cloud and Management Solutions, Syliance IT Services GmbH and itnetx gmbh, will merge as of 1st July 2015 to form the new company, itnetX AG.

A situation of market collaboration will prove to be fruitful: „This fusion makes sense and is important“, says Markus Erlacher, CEO of itnetx gmbh. „We will combine our energies, create synergy and send a clear signal of intent that we are committed to growth and business expansion“. The merger of the two companies will strengthen market leadership and enable the company to serve more customers more intensively from its offices in Bern and Glattbrugg, Zürich.

„The ability to offer integrated solutions and services in the rapidly-evolving IT markets is essential for our success”, says Dieter Gasser, CEO of Syliance IT Services GmbH. „With this merger we are consolidating and expanding our range of services; we can offer our customers proprietary software solutions and architecture and process consulting, in addition to systems integration and managed services”.

The newly-formed itnetX AG is aiming at changes in the IT market and the Microsoft Cloud First, Mobile First strategy lies at the heart of the business model. The new organization comprises 5 strategic business units: Modern Workplace Solutions, Management Solutions, Cloud and Datacenter Solutions, and Software Solutions, all of which are complemented by Managed Services.

With focus on Microsoft-based solutions, the future portfolio will include SaaS offerings and services such as Office 365, Microsoft Azure, Microsoft Intune and System Center, Windows Client and Windows Server. The whole package is rounded off with a range of software products targeted at Private and Public Cloud environments. This allows itnetX AG to advise customers much more comprehensively and bring added value to their business.

The work of itnetX AG also includes a continued active participation in the community with various blogs, book publications, 5 highly qualified MVPs (Most Valuable Professionals) and its own events such as the annual System Center Universe Europe (www.systemcenteruniverse.eu).

More information available upon request via Email: info@itnetx.ch or visit www.itnetx.ch.


Filed under: Recommended

SCOM 2012 – Monitor SMA Runbook Instance (VSAE Sample MP)


When you start to automate processes in your company, you will soon find out that not all runbooks run for just a short time. There are also cases where you need a runbook which uses a loop to monitor certain things or trigger other runbooks. In other words, there are cases where you need runbooks which run constantly. It is NOT recommended to have too many such long-running runbooks, but if you need to trigger another runbook every 15 minutes, the built-in SMA schedules currently cannot help: the lowest recurring interval is 1 day. And although the SMA runbook infrastructure is highly available, it can happen that your long-running runbooks stop for many reasons (reboots, shutdowns, bad handling etc.).

A simple version of such a trigger runbook could be this…

workflow Demo-Monitor
{
    param (
        [parameter(Mandatory=$true)]
        [int]$Interval,
        [parameter(Mandatory=$true)]
        [string]$RunbookXY
    )

    $Endpoint = Get-AutomationVariable -Name 'SMA-WebServiceEndpoint'
    while ($true)
    {
        Start-Sleep -s $Interval
        #Start the other runbook, e.g. Start-SmaRunbook -Name $RunbookXY -WebServiceEndpoint $Endpoint
    }
}

Most of the time these runbooks are important, and if you don’t get some sort of notification, you will not find out that they are no longer running. One approach which I think makes sense is to monitor these runbook instances by running a SQL query against the SMA database and checking whether there are one or more running instances of the runbook. If there is no instance, we should receive an alert in SCOM.

For that reason I created a sample management pack, containing an alert rule which executes a PowerShell script. The SQL query looks like this…

SELECT  COUNT(*) as RunningJobCount
FROM    Core.Jobs INNER JOIN
        Core.vwJobs ON Core.vwJobs.JobId = Core.Jobs.JobId INNER JOIN
        Core.vwRunbooks ON Core.vwJobs.RunbookVersionId = Core.vwRunbooks.PublishedRunbookVersionId
        OR Core.vwJobs.RunbookVersionId = Core.vwRunbooks.DraftRunbookVersionID
WHERE JobStatus = 'Running' AND RunbookName = '$Runbook'

…this delivers the current SMA job instance count of a runbook.

The rest of the script creates a SQL connection to the SMA database / server and, depending on the result, returns a property bag containing “Error” or “OK”. The script looks like this…

param([string]$SQLInstance,[string]$Database,[string]$Runbook)
#Initialize Property Bag
$API = New-Object -ComObject "MOM.ScriptAPI"
$Bag = $API.CreatePropertyBag()
#SQL Query to check runbook instance count and state
$SQLQuery="
SELECT  COUNT(*) as RunningJobCount
FROM    Core.Jobs INNER JOIN
        Core.vwJobs ON Core.vwJobs.JobId = Core.Jobs.JobId INNER JOIN
        Core.vwRunbooks ON Core.vwJobs.RunbookVersionId = Core.vwRunbooks.PublishedRunbookVersionId 
        OR Core.vwJobs.RunbookVersionId = Core.vwRunbooks.DraftRunbookVersionID
WHERE JobStatus = 'Running' AND RunbookName = '$Runbook'
" 
#Script start logging
$API.LogScriptEvent("CheckRunbook.ps1", 10,0,"Preparing query against $SQLInstance on Database $Database with query: $SQLQuery .")
#Setup ADO connection & recordset
$ADOCon = New-Object -ComObject "ADODB.Connection"
$Results = New-Object -ComObject "ADODB.Recordset"
$OpenStatic = 3
$LockOptimistic = 3
#Setup provider & timeout
$ADOCon.Provider = "SQLOLEDB"
$ADOCon.ConnectionTimeout = 60
$ConnectionString = "Server=$SQLInstance;Database=$Database;Integrated Security=SSPI"
try     
    { 
    $ADOCon.Open($ConnectionString)
    }
catch 
    { 
    #Log error if the connection cannot be established
    $API.LogScriptEvent("CheckRunbook.ps1", 11,1,"Error connecting. ConnectionString: $ConnectionString Error: $($Error[0])")
    }
if ($ADOCon.State -ne 0)
{
        try {     
            #Open SQL connection
            $Results.Open($SQLQuery, $ADOCon, $OpenStatic, $LockOptimistic)
            #Log event if connection is successful
            $API.LogScriptEvent("CheckRunbook.ps1", 20,0,"Successfully executed query against $SQLInstance on Database $Database. Value: $($Results.Fields.Item("RunningJobCount").Value)")
            #Check instance count, if no instance is running, return Error else OK
            If ($Results.Fields.Item("RunningJobCount").Value -eq 0)
            {
                $Bag.AddValue('State', "Error")
                $Bag.AddValue('Runbook', $Runbook)
                $Bag.AddValue('Database', $Database)
                $Bag.AddValue('SQLInstance', $SQLInstance)
            } 
            else 
            {
                $Bag.AddValue('State', "OK")
            }
            $Bag
        } 
        catch 
        { 
            #Log error if the query cannot be executed
            $API.LogScriptEvent("CheckRunbook.ps1", 21,1,"Error executing query against $SQLInstance on Database $Database with query $SQLQuery $($Error[0])")
        }
    #Close all connection
    $Results.Close()
    $ADOCon.Close()
}

The script will log events in the Operations Manager event log if there is a connection or query issue.
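
To check those events on the executing agent, you can query the event log directly. A quick sketch (MOM.ScriptAPI writes its events with the source “Health Service Script”):

```powershell
# Show the most recent events written by CheckRunbook.ps1 (event IDs 10/11/20/21)
Get-EventLog -LogName "Operations Manager" -Source "Health Service Script" -Newest 50 |
    Where-Object { $_.Message -like "*CheckRunbook.ps1*" } |
    Select-Object TimeGenerated, EventID, Message
```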

In this example the rule is targeted at the SCOM RMSE server, and this agent will also execute the script. For that reason, make sure you give your SCOM management server action account proper permissions on the SMA database.

You can override the rule with your runbook name / database / server as you can see in this screenshot…

image

If you configured everything the way you want the alert looks like this…

image

..and some information in “Alert Context”…

image

…and “Alert Description”…

image

This management pack is not meant to be a full-featured solution; instead, it should give you a good starting point to customize the rule for your needs. The target and interval are just for demo purposes!

The VSAE project is available on TechNet Gallery.

I hope this helps!


Filed under: Authoring, Configuration, Management Pack, Script, SMA

SCOM 2012 – What Workflows, MP’s And Objects Run On A SCOM Agent?


I am a huge fan of PowerShell and SCOM, and when you combine both forces you can unleash their full power and do almost anything :). In this case I was (also) asked how you can figure out which management packs are actually used by a SCOM agent and display that in a usable format. Well, I was starting to write a script when I stumbled across a very cool approach from Dirk Brinkmann (MSFT).

As you know, SCOM has a built-in task to query the agent for rules and monitors, BUT the output is not really usable and does not deliver the information we need; see “Show Running Rules and Monitors for this Health Service”.

Dirk has written a script which calls this task via PowerShell and transforms the output into CSV and XML files. You can then analyze these files in e.g. Excel or PowerShell itself.

The script gathers the information just from one agent, but you could easily modify the script to collect the information from multiple agents.

image

I am a real fan of his approach and urge you to check it out here. Happy analyzing!


Filed under: Configuration, Management Pack, Script, Troubleshooting

Microsoft MVP Award 2015


Microsoft MVP Banner

Today is a very exciting and special day, as I received a mail from Microsoft letting me know that I received the MVP award. I think it is a very proud moment in every IT pro’s life when he gets this award. I am now an extremely proud member of a worldwide expert community in the Cloud and Datacenter Management space. This expert group consists of approx. 90 experts around the world. Receiving this award shows that Microsoft recognizes my work and contribution to the community over the past few years. I am really looking forward to helping, contributing and exchanging know-how with other MVPs and meeting them in person at the MVP Global Summit in November 2015. Such a recognition pushes and motivates me to contribute even more and to help and provide my expertise whenever possible.

At this point I would like to thank my company itnetx for providing me with such a fun job, the opportunity to develop my skills and daily challenges. A special thank you to my boss, MVP Marcel Zehner, who always likes to push me and lets me make the impossible possible. I would also like to thank Microsoft Switzerland for all the interesting projects and people to talk to; a special thank you here to René Hanselmann for his unbelievable support. Last but not least, there are some fun guys who have always supported me, like MVP Thomas Maurer and MVP Michael Ruefli, and of course all my colleagues at itnetx who really make every day a fun day.

For sure, there are all the other inspiring guys around the community from whom I learned a lot. Especially all the “old SCOM dinosaur” MVPs like Marnix Wolf, Cameron Fuller, Kevin Greene, Tao Yang, of course Kevin Holman, and many others.

Thank you all!


Filed under: MVP

System Center Universe Europe 2015 – Speaker


image

I am very happy to speak at the annual System Center Universe Europe (SCU Europe) conference. I will have 2 sessions: the first together with Stefan Johner and the second with my MVP buddy Stefan Koell.


Business Process Automation – A Real Real World Scenario, No Fakes Just Facts

Level:
300

Track:
Track “Systems & Cloud Management”

Time:
25.08.2015 – 14:30

In this session we would like to present a real-world case where we used System Center Service Manager and Service Management Automation to automate different business processes. This is NOT one of these simple “one offering, one runbook” example sessions. This session offers just meat and no bones. Stefan Johner and MVP Stefan Roth will show you how we architected this solution, the problems we faced and the conclusions we came to.


Speed Dating SCOM – Make it sexy

Level:
300

Track:
Track “Systems & Cloud Management”

Time:
26.08.2015 – 10:45

This is NOT a regular “What’s New” nor a “Step-by-Step” session. We will present 30+ tools, utilities, tips & tricks, scripts, community projects and 3rd party add-ons for System Center Operations Manager (SCOM) within 50 minutes. All of the presented solutions will help you get your work done more efficiently. We try to cover all aspects of our SCOM candidate, like appearance, personal qualities and reliability, so that your SCOM relationship will last forever.


System Center Universe Europe is an annual conference hosted by itnetX. This event can be summarized like this…

System Center Universe (SCU) is a community conference with a strong focus on systems management and virtualization topics such as cloud, datacenter and modern workplace management. We present top content with top presenters around Microsoft System Center, Microsoft Azure and Microsoft Hyper-V, and we want to build the number one conference for these kinds of topics across Europe.

SCU is a technical conference for administrators, engineers, architects, technical project managers and other technically oriented people who are focused on Microsoft products and technologies. Less technically oriented people can still benefit from attending by getting a broad overview of problem-solving solutions and, of course, by connecting with exhibitors and community specialists.

There are not many other events worldwide which deliver so many high-quality sessions held by real-world experts. The location, infrastructure and food are also first class, so you will find yourself very comfortable throughout your entire stay. Some key facts about what you can expect…

  • 3 conference days
  • 60+ breakout sessions
  • 1 Keynote session
  • 1 Closing session
  • 4 parallel tracks
  • Lots of Microsoft MVPs on site
  • Ask-the-experts area
  • Exhibition area (partners)
  • 1-to-few side meetings
  • Top WiFi infrastructure
  • Power available everywhere
  • Food & beverages
  • Networking Party
  • Closing Party
  • Well-connected city
  • Hotels near the venue

I am really proud and happy to be a speaker at this event again! If you are interested in more details, please visit the conference website here.


Filed under: Recommended

SCOM – Database Performance & Configuration Sources


Most customers who are using SCOM sooner or later need tips and resources to check whether their SCOM databases are in proper condition and whether there are performance issues. The Operations Manager team has just published a very useful post on troubleshooting database performance issues.

image

In addition, there is a post I wrote some time ago that also discusses solutions to avoid the problems mentioned in the Operations Manager team’s post.

image

Another excellent resource for setting up and configuring SQL Server for System Center products, including SCOM, is the SQL Server guide from MVP Paul Keely; it is an awesome companion and I highly recommend reading it! You can download the guide from the TechNet Gallery.

image

The last very good source is the recently published e-book “Operations Manager Field Experience”, which also covers the SQL database performance and configuration topic for SCOM.

image
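While working through these resources, a quick first self-check is to look at the largest tables in the OperationsManager database, since oversized event or performance tables are a common cause of the problems these guides discuss. This is just a generic sketch; the server/instance name is an assumption, and on older systems you may need the SQLPS module instead of SqlServer.

```powershell
# Sketch: list the 10 largest tables in the OperationsManager database.
Import-Module SqlServer   # or SQLPS on older installations
Invoke-Sqlcmd -ServerInstance "SQL01" -Database "OperationsManager" -Query @"
SELECT TOP 10
    t.name                        AS TableName,
    SUM(p.rows)                   AS [RowCount],
    SUM(a.total_pages) * 8 / 1024 AS TotalSpaceMB
FROM sys.tables t
JOIN sys.partitions p
    ON t.object_id = p.object_id
   AND p.index_id IN (0, 1)  -- heap/clustered only; index space ignored in this sketch
JOIN sys.allocation_units a
    ON p.partition_id = a.container_id
GROUP BY t.name
ORDER BY TotalSpaceMB DESC
"@ | Format-Table -AutoSize
```

If the event or performance data tables dominate this list, the grooming and data retention topics covered in the resources above are the place to start.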

Go and check them out; I am sure you will be able to speed up or fix your SCOM DB performance issues!


Filed under: Configuration, Performance, System Center, Troubleshooting

SCOM – New SQL Server Management Pack 6.6.0.0 Dashboard Capabilities


3 days ago Microsoft published a new version of the SQL Server management pack. Every new MP version contains fixes for some bugs, new monitors / rules and support for new SQL Server versions. This MP is a bit different, and I would like to spread the word about a very welcome summer gift. The new management pack not only contains new default SQL Server dashboards (which, by the way, look awesome!), we actually get the opportunity to use the same kind of dashboards for any group or object in SCOM.

Here are a few impressions of the new default SQL Server summary dashboard…

image

…and if you double click on a tile from the dashboard above, it drills down to a detailed instance dashboard…

image

There are more default dashboards to explore, but you can also build these kinds of summary / detail dashboards yourself! I would like to show a simple example for domain controllers.

Create a “Datacenter Dashboard” dashboard…

image

After creating the dashboard, you get an empty “workspace”…

image

Next you need to click on the upper right corner and then you are able to select any kind of SCOM group; in this example I chose domain controllers…

image

An expandable state and alert tile will be placed onto the dashboard; you can expand it by clicking the triangle in the upper left corner, and you can also add additional monitor and performance widgets…

image

So, after adding some more monitor / performance tiles, it could look like this for Active Directory (I randomly picked some rules / monitors)…

image

And if you double click a tile you get an instance view, where you are also able to add additional monitor and performance tiles which act in the context of the selected instance…

image

If you double click a single tile, either a performance view or the Health Explorer will be opened in context. If you click the icon in the upper right corner, the detail information will be copied to the clipboard :). Pretty useful!

image

This dashboard is just a simple example, but it shows you some of the power of using any group / object to create very appealing dashboards. If you go to the dashboard settings, you are even able to change the color, the refresh interval and also the time interval (which influences the performance tiles on the instance dashboard)…

image

This SQL Server MP contains pretty good, separate documentation of these new dashboard capabilities; I highly recommend studying this guide!

image

The SQL Server management pack and documentation can be downloaded here. Have fun!
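As a side note, if you prefer importing the downloaded MP files with PowerShell instead of the console wizard, a minimal sketch could look like this; the management server name and the folder path are assumptions:

```powershell
# Sketch: import all downloaded SQL Server MP files from a local folder.
Import-Module OperationsManager
New-SCOMManagementGroupConnection -ComputerName "scom01"   # assumed management server
Get-ChildItem -Path "C:\Temp\SQLServerMP" -Filter *.mp |
    ForEach-Object { Import-SCOMManagementPack -Fullname $_.FullName }
```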


Filed under: Dashboard, Management Pack