Channel: Stefan Roth

SCOM – Class Hierarchy & Management Pack Details


If you have been working with SCOM for some time, you want to know more about what is going on under the hood. You might want to know how classes relate to each other, or what kind of elements a management pack contains: monitors, rules, tasks, views and so on. In other words, you want to know what the service model and the health model look like. Of course you could use many tools, including the SCOM console, to figure this out, but there is a much cooler way. During my daily work I use two websites which I have found to be very useful.

Management Pack Document (http://mpdb.azurewebsites.net/)

The first page shows you the class hierarchy in SCOM. It shows all classes in a nice flat overview. I can’t remember any other place where you could see such a nice diagram…

image

In addition, the website contains information about common relationships as well as detailed information about the basic management packs…

image

Very cool stuff and I really like the class hierarchy overview.

System Center Core (http://systemcentercore.com/)

The second site I would like to show is System Center Core. It is similar to Management Pack Document, but it offers a bit more graphical and technical detail…

image

For example if you pick a management pack…

image

…you are able to see the details. It is even possible to click through the elements until you reach the script behind a rule or monitor, or details about the tasks in the MP, like here…

image

I have not checked whether the information provided is 100% accurate and reliable, but my impression is that these sites are well maintained. They are a good source for discovering the internals of SCOM and also help with troubleshooting and developing management packs.

I hope this is a good tip in your daily SCOM life!


Filed under: Configuration, Development, Management Pack, Recommended, Troubleshooting

VMM 2012 R2 – Remove Corrupted SCOM Connector


Yet another interesting SCOM problem. Today I was at a customer to fix some SCOM issues, or rather to reinstall SCOM. The problem was that a SCOM installation, including the VMM integration, had been in place, and because of many issues SCOM was removed without properly removing the VMM integration. So when you tried to remove the SCOM connector in VMM 2012 R2, an error appeared…

error

After some research I found a promising cmdlet called Remove-SCOpsMgrConnection https://technet.microsoft.com/en-us/library/hh801688.aspx . If you study that article, you will find that there is a –Force switch. My general rule is: if there is a –Force switch, I use the –Force switch :). So we executed the command on the VMM server and the result looked like this…

image

After this task, we could run the SCOM connector wizard in VMM 2012 R2 without any further issues. I like!
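
For reference, the cleanup boils down to two cmdlets from the VMM PowerShell module; this is a minimal sketch of what we ran on the VMM server, and the exact parameters may vary with your setup:

```powershell
# Show the existing (broken) Operations Manager connection in VMM
Get-SCOpsMgrConnection

# Remove the connection even though the SCOM side no longer exists;
# -Force skips the cleanup calls against the missing SCOM installation
Remove-SCOpsMgrConnection -Force
```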


Filed under: Configuration, Management Pack, System Center, Troubleshooting

SMA – About Jobs & Sandboxes


SMA is a great tool for automating many kinds of tasks, whether IT-related chores or an interface between two systems or business processes you would like to run automatically. Building workflows is not that easy, but once you get the trick it is not that hard. After you have built your workflows, at some point you will need to troubleshoot them, especially if you are dealing with highly available and highly automated environments which run many runbooks / workflows at the same time. Soon you will bump into the terms “job” and “sandbox”, but what are they and how do they relate to each other? Because there is a lot of confusion around these terms, I will try to explain how this stuff fits together.

What is an SMA job?

Every time you start a runbook (= workflow) in SMA, it generates a job, which is an instance of your workflow. If you open the Windows Azure Pack (WAP) portal, you get a near real-time overview of the jobs that are currently running in your SMA infrastructure.

image

It does not matter whether you start the runbook by pressing the Start button in WAP, by using the Start-SmaRunbook cmdlet, or by scheduling the workflow. In any case an instance of the workflow, i.e. a job, is created.
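
As a sketch, starting a runbook from PowerShell and inspecting the job it creates could look like this (the web service endpoint and runbook name are placeholders for your environment):

```powershell
# Queue a new job for the runbook via the SMA web service
$jobId = Start-SmaRunbook -WebServiceEndpoint 'https://sma01' -Name 'Test-Workflow'

# Look at the job instance that was created for this run
Get-SmaJob -WebServiceEndpoint 'https://sma01' -Id $jobId
```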

What is a sandbox?

If you open the event log on a runbook worker you will find an event about the Orchestrator.Sandbox process.

image

So what is this event all about, and what is this sandbox thing? When you start your first job in SMA, a new sandbox (Orchestrator.Sandbox) process is created.

image

The sandbox is basically a closed space where your SMA jobs live. This means you have at least one sandbox on each runbook worker. Can you have more than one sandbox on a runbook worker? Yes. There are two cases in which a new sandbox is created. The first case is when a new user module has been imported: running jobs are allowed to keep using the old module version, while runbooks using the new module version run in a separate sandbox. The second case is when the TotalAllowedJobs count is hit; if you reach this limit, a new sandbox is created, and once its existing jobs finish, the old sandbox is disposed. So what does TotalAllowedJobs mean?

TotalAllowedJobs : The total number of jobs that a Sandbox can process during its lifetime. When this limit is hit, the Sandbox is no longer assigned new jobs and the existing jobs are allowed to complete. After that, the Sandbox is disposed. [Source: Microsoft]

Where can I find this setting? It is located in the SMA configuration file C:\Program Files\Microsoft System Center 2012 R2\Service Management Automation\Orchestrator.Settings.config.

image
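
If you just want to check the current value without browsing the file, a quick sketch (assuming the default installation path) looks like this:

```powershell
# Print the line(s) in the settings file that mention the TotalAllowedJobs limit
$cfg = 'C:\Program Files\Microsoft System Center 2012 R2\Service Management Automation\Orchestrator.Settings.config'
Select-String -Path $cfg -Pattern 'TotalAllowedJobs'
```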

So what is this sandbox for? A sandbox is an environment in which SMA runs jobs safely. Beth Cooper, PM on the Microsoft automation team, explained it like this:

A sandbox process is equivalent to a PowerShell runtime and ensures reliability.  By putting SMA jobs in sandboxes, if there is an unexpected error in a sandbox, we can kill that runtime process and resume the jobs in a new runtime with a clean state.  Sandboxes also allow for different PowerShell module versions since one PS session can have one version and the other can have an updated version.                                     

I think this explains it very well.

What is the relation between jobs and sandboxes?

So does this mean that there is always an Orchestrator.Sandbox process running on a runbook worker? No. If a runbook is currently running, then yes; once the runbook / job is finished and no more jobs are running, the Orchestrator.Sandbox process is disposed. Therefore: no jobs, no Orchestrator.Sandbox process.
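
You can verify this behaviour yourself on a runbook worker, for example with a quick check like this:

```powershell
# Returns the sandbox process while jobs are running;
# returns nothing once all jobs are finished and the sandbox is disposed
Get-Process -Name 'Orchestrator.Sandbox' -ErrorAction SilentlyContinue
```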

What if nested runbooks come into play?

What is a nested runbook? A nested runbook is when you create a Workflow A which calls Workflow B and Workflow C inside it.

There are two different methods to call a nested / child workflow. Either you call Workflow B and Workflow C directly from Workflow A, or you start these runbooks with the Start-SmaRunbook cmdlet. So what is the difference? If you call the runbooks directly from Workflow A, like in this picture, ONE SMA job and one sandbox are created.

image

If you call the child workflows using the Start-SmaRunbook cmdlet, like in this picture, THREE jobs and one sandbox are created.

image
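
The two calling styles can be sketched like this (the workflow names and endpoint are made up for illustration, and Workflow-B is assumed to be defined on the same SMA instance):

```powershell
workflow Workflow-A {
    # Inline call: Workflow-B executes inside Workflow-A's own job,
    # so only ONE job and one sandbox exist in total
    Workflow-B

    # Start-SmaRunbook call: Workflow-C is queued as a separate SMA job
    Start-SmaRunbook -WebServiceEndpoint 'https://sma01' -Name 'Workflow-C'
}
```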

I hope this gives you a better understanding of SMA, and I would like to thank Beth Cooper from Microsoft for answering my questions! In addition to this post, the automation team has published an excellent blog post about nested runbooks in SMA. After reading both posts, you should have a full understanding of how things work in SMA workflows.


Filed under: Configuration, SMA, System Center

E2EVC – Session “Business Process Automation – How it is done in a real world scenario”


Everyone talks these days about automation, services and self-service. Rarely will you find a session showing a real-world case of how it is actually done. I am happy to speak at the “Experts 2 Experts Virtualization Conference” in Lisbon together with my wingman Stefan Johner http://jhnr.ch/. The event takes place November 13th-15th at Novotel Lisboa, Avenida José Malhoa 1 1A, 1099-051 Lisbon, Portugal.

We will show you what you can do with a self-service portal, System Center Service Manager (SCSM) and Service Management Automation (SMA). In addition we will give you insights into how we started the project, the things we faced during development and implementation, how the solution works, and our “lessons learned”.

If you would like to attend the session, we will be speaking on Friday, November 13th from 17:10-18:00.

image

E2EVC Virtualization Conference is a non-commercial virtualisation community event.
Our main goal is to bring the best virtualisation experts together to exchange knowledge and to establish new connections. E2EVC is a weekend crammed with presentations, master classes and discussions delivered by both virtualisation vendor product teams and independent experts.
Started in 2003 with just 4 people, after 26 very successful events it has grown into a well-recognized event with over 150 attendees. In the last 12 years and 25 events, our conference has taken place in cities such as Munich, London, Copenhagen, Amsterdam, Barcelona, Berlin, Brussels, Frankfurt, Dublin, Orlando, Paris, Los Angeles, Nice, Lisbon, Rome, Hamburg, Hong Kong and Vienna. On average we have 30 sessions at each event. Topics are server, application, desktop and storage virtualisation with products from vendors like Microsoft, VMware, Citrix and many more. Over 25 of the best virtualisation community experts present their topics. Many current virtualisation community leaders participated or still participate in our events. It’s the people that attend, the presenters that present and our sponsors who make this event possible.

If you are interested in this event, go and check out the website here.


Filed under: Recommended, Service Manager, SMA, System Center

SCOM 2012 R2 UR7 – Error Running Data Warehouse SQL Script “Invalid object…”


I have been doing SCOM updates for the past few years and have rarely bumped into any issues. If you stick to the recommended procedures, you will succeed. But recently I hit a problem while applying Update Rollup 7 (UR7) to SCOM 2012 R2. As you know, there are installer files which you must run on each SCOM component; in addition you need to run SQL scripts against your backend databases and finally import management packs. The entire procedure is well described in this support article.

In my case I had to update 2 management servers and both databases, OperationsManager and OperationsManagerDW. The installation was brand new, and just after adding the last management server my intention was to install UR7. When I tried to run the SQL script UR_Datawarehouse.sql, the following error appeared…

image

Because I have never faced this problem before, I was quite surprised and started investigating.

I tried to run the script multiple times and the error just changed slightly. If you read the message carefully, you will notice that there are “objects” missing, in my case vManagedEntityMonitor and Microsoft_SystemCenter_Visualization_Library_AggregatedPerfValuesForMultipleSeriesGet. The first object is a view and the second a stored procedure which the script expects. While checking for errors, I noticed a lot of error events in the OperationsManager event log on each management server. One event showed that the management server action account (MSAA) could not log on locally to the SCOM servers. This is a very bad issue, and after checking the management servers in the SCOM console I also saw that they did not appear healthy; instead they just had the “green circle”. After some more digging, I finally figured out that there had been a change in the domain GPOs which restricted user accounts from logging on locally to the servers. I temporarily fixed the issue by adding the SCOM service accounts to the local Administrators group of each management server. I know only the SDK account needs local administrator permissions on the management servers, but it was just a quick and dirty fix. Soon the error events disappeared and the event log no longer contained any errors. A few minutes later the management server objects also appeared in a healthy state.

After the management servers turned healthy, the expected views and stored procedures appeared in the OperationsManagerDW database.

vManagedEntityMonitor:

image
Microsoft_SystemCenter_Visualization_Library_AggregatedPerfValuesForMultipleSeriesGet:

image

The first time I ran the script, even more views and stored procedures were missing; it seems those objects are only fully created once the management servers are in a healthy state. The reason I bumped into this issue was the restricted permissions of the domain user accounts, which prevented the management servers from turning healthy and, finally, from creating all the necessary database objects.
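
To check whether the objects the update script expects already exist, a query like the following against the data warehouse can help (a sketch; the server name is a placeholder and Invoke-Sqlcmd comes with the SQL Server PowerShell tools):

```powershell
# Look for the view and stored procedure the UR7 script complained about
Invoke-Sqlcmd -ServerInstance 'SQL01' -Database 'OperationsManagerDW' -Query @"
SELECT name, type_desc FROM sys.objects
WHERE name IN ('vManagedEntityMonitor',
'Microsoft_SystemCenter_Visualization_Library_AggregatedPerfValuesForMultipleSeriesGet')
"@
```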

I hope this helps!


Filed under: Configuration, System Center, Troubleshooting

SCOM 2012 R2 TP3 – Monitoring Apache Web Server


Apache-http-server

A few months ago, Microsoft released management packs for monitoring open source software like Apache HTTP Server and MySQL databases. In this post I would like to give an overview of monitoring the Apache web server. So far there have not been many free options for monitoring this web server, although it is a very common candidate out in the field. This management pack clearly shows Microsoft’s commitment to supporting open-source software in the SCOM world.

Support

The current management pack supports Apache HTTP Server versions 2.2 and 2.4, if you install Apache from one of the SCOM-supported Linux distributions. You can find all supported *nix versions here https://technet.microsoft.com/en-us/library/hh212713.aspx . SUSE Linux Enterprise Server 11 includes Apache version 2.2 and SUSE Linux Enterprise Server 12 includes Apache version 2.4. In this example I installed SUSE Linux Enterprise Server 11 SP3.

The Apache management pack is part of the System Center 2016 Technical Preview 2 Management Packs for Open Source Software, found here http://www.microsoft.com/en-us/download/details.aspx?id=46924 . At least SCOM 2016 TP2 is required, but since SCOM 2016 TP2 expired a few months ago, I will use SCOM 2016 TP3, which also works perfectly fine and can be found here https://technet.microsoft.com/en-gb/evalcenter/dn781241 .

Pre-requisites

First, import all management packs for your Linux distribution from the SCOM media itself; depending on your distribution you need to add different MPs. In my case I will use the SLES (SUSE) MPs. Make sure you also import all dependent MPs…

image

After some time, you will find the corresponding shell script bundles for your Linux distributions in the DownloadedKits directory…

image

You might be used to seeing *.rpm or *.deb files in this directory, but in this version Microsoft delivers a *.sh package which we need to extract later on. Because these are not installer files, I could not use the SCOM agent wizard to deploy the packages. So we need to take some manual steps to install the agent.

Copy this package to your Linux server using e.g. WinSCP https://winscp.net/eng/index.php . WinSCP allows you to connect to your Linux server from your Windows box and upload files from your local system…

image

Before we proceed, make sure you have configured your Linux monitoring RunAs accounts properly. Please read this post for more details on how to configure the RunAs accounts https://stefanroth.net/2012/03/15/scom-2012-linux-monitoring-lab-part-1-setup-suse-11-1/ ; it shows you exactly how to monitor a Linux server.

At this stage you should have the proper Linux distribution MPs imported, the RunAs accounts configured and the agent source file scx-1.6.0-174.sles.11.x86.sh copied to the Linux server.

Note: We imported these management packs first because doing so places the agent files into C:\Program Files\Microsoft System Center 2012 R2\Operations Manager\Server\AgentManagement\UnixAgents\DownloadedKits on your SCOM server, which is needed to install the agent.

Install SCOM Linux agent

Log into your Linux box and open a terminal session. In my case I am logged in as root on suse01.services.lab.itnetx.ch and have changed to the /tmp directory where I copied the agent source scx-1.6.0-174.sles.11.x86.sh. There are multiple steps you need to execute:

  1. Make the shell file scx-1.6.0-174.sles.11.x86.sh executable: chmod +x scx-1.6.0-174.sles.11.x86.sh
  2. See all options the package offers: ./scx-1.6.0-174.sles.11.x86.sh --help
  3. Install the agent only: ./scx-1.6.0-174.sles.11.x86.sh --install

4

As you can see, the installation routine checks whether Apache is already installed. In my case I had not installed the web server yet, so we need to take additional steps later; I did this on purpose, to see how installing the Apache CIM provider works afterwards. Now you basically have the blank OMI agent installed, but there is no “connection” between SCOM and this agent. Therefore we need to run the agent discovery to discover the Linux box, sign the certificate and finally manage this server. Just run the Linux discovery wizard as you would for a regular Linux agent installation. Your wizard should look like this if you configured everything properly…

5

…hit Manage and finally your Linux box is monitored by SCOM…

6

Go to the Monitoring pane and click through the views; you should see some objects popping up after a while…

image

At this point you should have the Linux server monitored. If this is OK, we will take the next step and install the Apache HTTP Server.

Install Apache HTTP Server

In this step we will install a plain Apache HTTP Server using the SUSE package installer. Go to Install/Remove Software and search for Apache…

image

The search result shows all Apache modules and packages; I selected the following (some dependent packages are selected automatically)…

image

…some dependencies are resolved…

8

After the installation we need to switch to the terminal window and create a default index.html file, because no such file is installed by default. Of course this step is optional, but I want to check that my web server works :) and therefore I want to show a default website. Change into the /srv/www/htdocs directory and create an index.html file…

10

…add some content to the index.html file…

image

Save the file and start the Apache service…

9

If you open http://localhost you should see the default site like this…

image

At this stage we have Apache HTTP server installed and running.

Import Apache HTTP Server management packs

Next we need to import the management packs for monitoring the Apache HTTP Server itself. The management pack can be found here http://www.microsoft.com/en-us/download/details.aspx?id=46924 and also contains the MP for MySQL, which we don’t need. Import the following two files: Microsoft.ApacheHTTPServer.Library.mp and Microsoft.Oss.Library.mp

image

If you meet all dependencies, the import will succeed…

14

Configure CIM Provider

The last step is to configure the CIM provider. The Apache CIM provider package is automatically deployed during the installation of the Operations Manager Linux agent if Apache HTTP Server is detected at that time. This detection and automatic installation occurs when installing the Linux agent for the first time on a computer, and also when upgrading a previous agent version to the current version. If Apache HTTP Server is installed on the Linux computer after the Operations Manager agent, the CIM provider can be installed manually. Open a terminal window, change to /tmp where you uploaded the scx-1.6.0-174.sles.11.x86.sh package, and then execute these steps…

  1. Extract the scx-1.6.0-174.sles.11.x86.sh package: sudo sh ./scx-1.6.0-174.sles.11.x86.sh --extract
  2. Change into the directory: cd scxbundle.20141/
  3. Extract the CIM provider package: sudo ./apache-cimprov-1.0.0-545.universal.1.i686.sh --extract
  4. Change the directory: cd apache_22/
  5. Run the installation: rpm -i apache-cimprov-1.0.0-545.universal.1.i686.rpm
  6. Restart the agent: scxadmin -restart
  7. Restart the Apache service: service apache2 restart

15c

At this point the CIM provider is installed and running. In order to load the module into Apache HTTP Server, we need to modify the sudoers file. When you are monitoring *nix systems, the sudoers file is where you grant permissions to the specific RunAs accounts defined in SCOM. For monitoring Apache HTTP Server we need to add one additional line…

  1. Open the file: sudo visudo
  2. Change to insert mode: press i
  3. Scroll to the end of the file and insert the following line, assuming your monitoring account is called monuser. This allows your privileged monitoring account to execute the apache_config.sh script, which we will start from the SCOM console:
    monuser ALL=(root) NOPASSWD: /opt/microsoft/apache-cimprov/bin/apache_config.sh
  4. Press the Esc key to leave insert mode
  5. Type :wq! to write and quit

18

Open the SCOM console and change to the Apache view; you should see a not yet monitored object in the Apache HTTP Server view…

image

Click the Load Monitoring Module task, which will open the following dialog…

17

…click Run, which loads the module into the web server using the privileged RunAs account. After a short time the task should finish successfully…

19

After some time you will see data coming in…

Apache HTTP Servers…

image

Apache Virtual Hosts…

image

Performance counters…

image

Summary

There are multiple steps you need to take to monitor Apache HTTP Server. First monitor the basic Linux OS; if Apache was not installed before the agent, you need to install the CIM provider afterwards. For monitoring Apache HTTP Server itself, import the Apache MPs and load the apache-cimprov module into the web server by executing the Load Monitoring Module SCOM task. In the end you get a pretty well monitored Apache HTTP Server with the most common performance counters out of the box. Be aware that the discoveries run every 4 hours and the rules collect data every 5 minutes. If you need to know exactly what is monitored and which data is collected, read the MP guide provided here http://www.microsoft.com/en-us/download/details.aspx?id=46924 . The intention here was to play around with Apache monitoring, so this procedure might not exactly meet production requirements. I hope you get an impression of how things play together.


Filed under: Configuration, Management Pack, Testing, Xplat

System Center Europe 2015 – Session Recordings


scu_europe_2015_monday-5742_jpg

Back in August, the yearly System Center Universe Europe event was held in Basel, Switzerland. Around 420 attendees joined this event and it was a great success. Here are some more quick facts…

  • 420 people from 20 different countries
  • 19 sponsors (thank you!)
  • 62 breakout sessions
  • 8 early morning discussions
  • Multiple parties for fun and networking

If you missed this event, you are now able to watch the recordings. I myself had the great pleasure of presenting two sessions, as I already blogged here.

  • Business Process Automation – A Real Real World Scenario, No Fakes Just Facts
  • Speed Dating SCOM – Make it sexy

Sadly the recording god fooled with me/us, so only the “Speed Dating SCOM – Make it sexy” session recording is available, which you can find here.

image

The other session recordings from the entire System Center Universe Europe can be watched online on Vimeo here.

image

If you missed this edition of SCU Europe, make sure you don’t miss the upcoming SCU Europe event, which will be held at the Berlin Congress Center on August 24-26, 2016. Keep an eye on http://www.systemcenteruniverse.ch for more news.

scu_europe_2015_wednesday-7354


Filed under: Recommended, System Center

itnetX AG – Winner of the “Partner of the Year 2015 – Cloud Platform” Award


image

During their annual “Connection Days” event one week ago, Microsoft Switzerland announced their “Partners of the Year” for different competencies. itnetX has been awarded “Partner of the Year 2015 – Cloud Platform”! This award underlines our commitment to helping customers transform their business into cloud-first scenarios. It also highlights the huge changes to our company over the last months: transitioning to a modern cloud platform partner.

The latest award joins an impressive list:

  • Microsoft Partner of the Year 2011 – Datacenter (itnetX)
  • Microsoft Partner of the Year 2012 – Datacenter (itnetX)
  • Microsoft Partner of the Year 2013 – Datacenter (itnetX)
  • Microsoft Partner of the Year 2014 – Datacenter (Syliance IT Services, since merged with itnetX)
  • Microsoft Partner of the Year 2015 – Cloud Platform (itnetX)

I am very proud to be part of this highly skilled company, and of teams and people who are deeply focused and deliver first-class Microsoft solutions to our customers.


Filed under: Recommended

Basel PowerShell User Group –“End-To-End Automation using Service Manager and SMA”


image

I would like to spread the word about an upcoming user group session in Basel, Switzerland. My buddy, PowerShell MVP Stéphane van Gulick, will host this event and we will both present a session. I will show what you can do with System Center Service Manager and Service Management Automation in a session called “End-To-End Automation using Service Manager and SMA”, and Stéphane van Gulick will show the new class-building feature in PowerShell v5 along with some practical use cases.

The schedule looks like this…

image

If you are interested in attending this event, you are very welcome. Please use Eventbrite to register for this free event!


Filed under: Recommended

OMS – Agent for Linux Installation (Preview)


Operations Management Suite (OMS) is probably the final product of a long evolution process starting back in 2011: first known as System Center Advisor, later transforming into Microsoft Azure Operational Insights and finally into a growing management suite for Azure and on-premises services. Read my post here for more historical information on this awesome solution.

I think it is a historic move for Microsoft to support open source and Linux; the relationship even goes so far that Microsoft writes “love blog posts (not love letters :))” – read the Microsoft Loves Linux blog post series here. Because of that, I would like to show how OMS supports Linux data collection.

The OMS Agent for Linux enables rich and real-time analytics for operational data (Syslog, Performance, Alerts, Inventory) from Linux servers, Docker containers and monitoring tools like Nagios, Zabbix and System Center.

Currently this Linux agent is in a preview stage and at this very moment supports the following operating systems…

  • Amazon Linux 2012.09 through 2015.09 (x86/x64)
  • CentOS Linux 5, 6, and 7 (x86/x64)
  • Oracle Enterprise Linux 5, 6, and 7 (x86/x64)
  • Red Hat Enterprise Linux Server 5, 6, and 7 (x86/x64)
  • Debian GNU/Linux 6, 7, and 8 (x86/x64)
  • Ubuntu 12.04 LTS, 14.04 LTS, 15.04 (x86/x64)
  • SUSE Linux Enterprise Server 11 and 12 (x86/x64)

As things move fast, this list will change within the next weeks / months. Keep an eye on GitHub, where you will find up-to-date information.

The amazing thing is that the agent is built on open source components and uses FluentD to collect and aggregate the data. FluentD has hundreds of existing plugins, which will make it really easy for you to add new data sources. So it is perfect for collecting whatever data you want.

Pre-requisites:

In order to install the agent successfully, you need to make sure the following packages are installed…

image

In addition, if you want to collect syslog data, either rsyslog or syslog-ng is required. The default syslog daemon on version 5 of Red Hat Enterprise Linux, CentOS, and Oracle Linux (sysklog) is not supported for syslog event collection. To collect syslog data from this version of these distributions, the rsyslog daemon should be installed and configured to replace sysklog.

Installation:

Some of us are just Microsoft geeks and might not know too much about Linux. So here I first check the hostname of the Linux server; in my case it is a SUSE Linux Enterprise Server 12, and running uname -m shows x86_64, which tells me it is a 64-bit server.

image

Download the agent from GitHub https://github.com/MSFTOSSMgmt/OMS-Agent-for-Linux

image

or use wget:

wget https://github.com/MSFTOSSMgmt/OMS-Agent-for-Linux/releases/download/1.0.0-47/omsagent-1.0.0-47.universal.x64.sh

image

Check the MD5 checksum with md5sum ./omsagent-1.0.0-47.universal.x64.sh and install the agent with sudo sh ./omsagent-1.0.0-47.universal.x64.sh --install -w <YOUR OMS WORKSPACE ID> -s <YOUR OMS WORKSPACE PRIMARY KEY>. Get your workspace ID and primary key from the OMS portal by logging in at http://www.microsoft.com/oms…

image

…the installation should run like this…

image

…and immediately you should see that the agent has connected…

image

Checks:

To see what data is flowing in, there are multiple places to check.

First, check your installation by running the command service omsagent status…

image

Go to Log Search in OMS and type a query like Computer=suse007 Type=Perf; of course, change the computer name accordingly…

image

…and some graphical details should appear…

image

Conclusion:

This was the basic installation of a single OMS Linux agent. It goes without a hitch if you make sure you have the proper libraries installed and a supported OS version. Microsoft is doing the right thing in supporting and contributing to open source; it does it very well and in good quality, and this strategy is reflected in other products / tools as well. Great!


Filed under: Azure Operational Insights, Configuration, OMS, Xplat

OMS – Agent for Linux Troubleshooting Help


In my previous post I introduced the OMS Agent for Linux. This time I would like to give you some troubleshooting starting points. There are countless possibilities for errors to occur, so it is nice to have a consolidated list of where to find each log or configuration file. This should give you a pretty good overview of the most important places to look. For detailed configuration scenarios, read the documentation on GitHub.

Documents-icon

Log file paths:

In general  the logs for the OMS Agent for Linux can be found at:

/var/opt/microsoft/omsagent/log/

The logs for the omsconfig (agent configuration) program can be found at:

/var/opt/microsoft/omsconfig/log/

Logs for the OMI and SCX components (which provide performance metrics data) can be found at:

/var/opt/omi/log/ and /var/opt/microsoft/scx/log

Logs for the DSC setting can be found at:

/opt/microsoft/omsconfig/Scripts/


Document-icon

Specific log files:

The log files for omsagent (fluentd) can be found here:

/var/opt/microsoft/omsagent/log/omsagent.log

The log files for onboarding & certificates:

/var/opt/microsoft/omsagent/bin/omsadmin.log

The log files for the omsconfig (DSC) feature:

/var/opt/microsoft/omsconfig/omsconfig.log
/var/opt/omi/log/omiserver.log

The log files for performance counter issues:

/var/opt/microsoft/scx/log/scx.log
/var/opt/omi/log/omiserver.log


speed-test-icon

Specific OMS agent tests:

Operating system namespace probe on OMI agent:

/opt/microsoft/scx/bin/tools/omicli ei root/scx SCX_OperatingSystem

Agent namespace probe on OMI agent:

/opt/microsoft/scx/bin/tools/omicli ei root/scx SCX_Agent

If you want to display the desired configuration:

sudo su omsagent -c /opt/microsoft/omsconfig/Scripts/GetDscConfiguration.py

If you want to test desired configuration:

sudo su omsagent -c /opt/microsoft/omsconfig/Scripts/TestDscConfiguration.py



Configuration files:

If you want to configure Syslog collection, edit one of these files, depending on your distribution:

/etc/rsyslog.d/rsyslog-oms.conf

/etc/syslog.conf

/etc/rsyslog.conf

/etc/syslog-ng/syslog-ng.conf (SLES)

If you want to configure general agent settings:

/etc/opt/microsoft/omsagent/conf/omsadmin.conf

If you want to configure performance counters, or alert settings for Zabbix, Nagios and container data:

/etc/opt/microsoft/omsagent/conf/omsagent.conf

If you want to configure omiserver:

/etc/opt/omi/conf/omiserver.conf

If you want to configure omicli:

/etc/opt/omi/conf/omicli.conf



General problems & solutions:

image

(Source: Microsoft)

You can find some more OMI-specific troubleshooting steps here: http://social.technet.microsoft.com/wiki/contents/articles/19527.scom-2012-r2-manually-installing-and-troubleshooting-linuxunix-agents.aspx


Filed under: Azure Operational Insights, Configuration, OMS, Recommended, System Center, Troubleshooting, Xplat

OMS – Price & Size Calculator


image

You might have already heard of Operations Management Suite (OMS), or you are already using the free OMS tier, which is great apart from its limitations :). Now you have decided to actually buy licenses for your company, but you don't know how much they will cost. Luckily, Microsoft has created an online calculator to estimate the cost and show the actual services you get. Navigate to http://omscalculator.azurewebsites.net/ to get an overview of which license model is appropriate for you.

Data gathering page…

image

…and the actual comparison between the two license options…

image

Enjoy!


Filed under: OMS, System Center, Tool

PowerShell – Remote Desktop Cmdlets “A Remote Desktop Services deployment does not exist…”


PowerShellBanner

Recently, while automating some cool stuff, I needed to create a PowerShell workflow for deploying VDI clients using Windows Server 2012 R2 Remote Desktop Services. One of the first things I always do is check the existing PowerShell support, and I found there is a large number of cmdlets available for managing RDS services. So my first thought was that this was going to be an easy walk in the park. Well, not really…

One of the first things I wanted to know was which users are assigned to which client. The Get-RDPersonalVirtualDesktopAssignment cmdlet gives you this information when you provide the connection broker and collection name…

Get-RDPersonalVirtualDesktopAssignment [-CollectionName] <String> [-ConnectionBroker <String> ]

Because I will execute the script in a PowerShell workflow from a remote machine (SMA) using WinRM, I did some tests and used Invoke-Command for some PowerShell remoting just to get started. Usually we develop PowerShell workflows starting with their core functionality and then wrap everything else around it, like logging, error handling and the PowerShell workflow structure.

My test command looks like this…

$ConnectionBroker = "ConnectionBroker.domain.com"
$VDICollection = "MyVDICollection"
$UserName = "domain\user"

Invoke-Command -ComputerName $ConnectionBroker -Credential (Get-Credential -UserName $UserName -Message "Enter credentials") -ScriptBlock {
    Import-Module RemoteDesktop
    Get-RDPersonalVirtualDesktopAssignment -CollectionName $Using:VDICollection -ConnectionBroker $Using:ConnectionBroker
}

The specified user has administrator permissions on the connection broker and on the VDI deployment itself, so it should have worked just fine. Well, it did not, and I received an error…

A Remote Desktop Services deployment does not exist on ComputerName. This operation can be performed after creating a deployment. For information about creating a deployment, run "Get-Help New-RDVirtualDesktopDeployment" or "Get-Help New-RDSessionDeployment".
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Get-RDPersonalVirtualDesktopAssignment
+ PSComputerName : ComputerName

To make it short: it seems that Get-RDPersonalVirtualDesktopAssignment connects from the connection broker onward with another hop, so we run into a second-hop problem here. What is a "second hop problem"? Don Jones has published a nice post here explaining the second hop. In this paper, on page 39, Ravikanth Chaganti explains our problem in a bit more detail and how to handle it.

Finally, to solve the problem we need to use CredSSP to pass the authentication on to the second hop. In order to do that, we need to use the parameter "-Authentication CredSSP", which delegates our credentials to the "second" hop. Be aware that you also need to enable CredSSP, either via GPO or via PowerShell using the Enable-WSManCredSSP cmdlet. After that, it worked like a charm.
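As a sketch, enabling CredSSP via PowerShell could look like this (the broker name is the placeholder from the example; -Force skips the confirmation prompt):

```powershell
# On the machine initiating the connection, allow credential delegation to the broker
Enable-WSManCredSSP -Role Client -DelegateComputer "ConnectionBroker.domain.com" -Force

# On the connection broker, accept delegated credentials
Enable-WSManCredSSP -Role Server -Force
```

In production you would rather configure the "Allow delegating fresh credentials" policy via GPO instead of running this on every client.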

$ConnectionBroker = "ConnectionBroker.domain.com"
$VDICollection = "MyVDICollection"
$UserName = "domain\user"



Invoke-Command -ComputerName $ConnectionBroker -Credential (Get-Credential -UserName $UserName -Message "Enter credentials") -ScriptBlock {
    Import-Module RemoteDesktop
    Get-RDPersonalVirtualDesktopAssignment -CollectionName $Using:VDICollection -ConnectionBroker $Using:ConnectionBroker
} -Authentication CredSSP

I would like to thank my buddy Fulvio Ferrarini and Marc van Orsouw for helping troubleshooting this issue.

This is an old problem, but it does not always present itself with an "Access Denied" error or anything like that, as you can see in this example. I hope it saves you some time!


Filed under: Configuration, Script, SMA, Troubleshooting

SCOM – Authoring History and System Center Visual Studio Authoring Extensions 2015


mp

I usually don’t blog about new releases of management packs or similar things, but this time I feel I have to do so. If you have been working for some time with SCOM, you know there is a (long) history behind authoring MOM/SCOM management packs. Back in the days where MOM 2005 used to rule the monitoring world, you had these AKM management pack files which could not be changed or authored outside of MOM. In 2007 when SCOM 2007 was released, Microsoft changed that concept to the sealed (MP extension) / unsealed (XML extension) management pack concept which is still valid up to this point. In the same wave Microsoft released the widely loved Authoring Console which was a GUI driven approach and more or less intuitive to work with for an IT Pro.

ac

In 2009 the next version, SCOM 2007 R2, was released, which also included a newer version of the Authoring Console as part of the System Center Operations Manager 2007 R2 Authoring Resource Kit. The kit also included the MP Best Practice Analyzer, MP Spell Checker, MP Visio Generator, MP Diff etc. to make your MP authoring experience a bit more comfortable. Three years later, in 2012, Microsoft released SCOM 2012 and with it a new way of authoring management packs: the Visual Studio Authoring Extensions for System Center Operations Manager were born.

vsae

This extension was basically an add-on for Visual Studio 2012 and let you author MP fragments in XML in a semi-GUI-driven way. It has several advantages:

  • Work directly with the XML of the management pack allowing you to create any management pack element and monitoring scenario.
  • Provides XML templates and IntelliSense for different management pack elements so that you don’t have to have detailed knowledge of the schema.
  • Allows you to create XML fragments containing different management pack elements. The fragments can be copied within the management pack, to another management pack, and combined to build the final management pack.
  • Allows multiple authors to work on a single management pack project at the same time.

(Source:TechNet Wiki)

The downside of VSAE was, and is, that it is aimed at experienced IT pros and MP developers, not at the average SCOM administrator, because of the MP authoring knowledge required: "You need to know what you do".

Microsoft's answer to this problem was a huge flop called System Center 2012 Visio MP Designer (VMPD), an add-in for Visio 2010 Premium. The idea behind it was to author MPs graphically, using Visio as the interface, and push the MP to SCOM at the press of a button. This way of authoring was very limited, offering only some basic monitors, rules and a health model.

vmpd

Some time later, Microsoft discontinued investment in this tool and started a co-operation with Silect to build a free "successor" of the former Authoring Console, called Silect MP Author. This tool was, and is, meant for the IT pro, who gets wizard-driven support for authoring management packs. In its first version, MP Author was kind of buggy and lacked some basic functionality, like editing an authored MP and PowerShell script support, which was fixed in the service packs released later. The current version is MP Author SP 5. In the meantime, Microsoft released Visual Studio Authoring Extensions 2013 for System Center Operations Manager, which was basically only a compatibility release adding Visual Studio 2013 support.

Up to the year 2015 not much changed, and soon Visual Studio 2015 was released. The problem was that Visual Studio Authoring Extensions 2013 for System Center Operations Manager did not support it, and Microsoft did NOT even consider supporting Visual Studio 2015 or any newer version of Visual Studio! In summer 2015 Microsoft released a UserVoice questionnaire asking for feedback on any SCOM topic, and the community feedback was so strong and powerful that Microsoft decided to release a new version: Visual Studio Authoring Extensions 2015 for System Center Operations Manager, which supports Visual Studio 2012/2013/2015 (all editions). The release date was yesterday :).

The feature summary looks like this:

  • VS Projects for Monitoring MPs, System Center 2012 and later MPs including Operations Manager and Service Manager.
  • MP Item Templates for quick creation of MP Items.
    • XML MP Item Templates (generates MP XML for editing).
    • Template Group Item Templates (Abstract your intent from MP XML).
    • Snippet Templates (generates MP XML from CSV)
  • IntelliSense for MP XML for the following versions:
    • System Center Operations Manager 2007 R2
    • System Center Operations Manager 2012 and later
    • System Center Operations Manager 2016
    • System Center Service Manager 2012 and later
  • Integrates into Visual Studio Project System with *.mpproj.
    • Enables building within VS & MSBuild.
    • Supports custom build tasks (simply edit *.mpproj or *.sln)
    • Build multiple MPs (multiple *.mpproj) in a solution.
    • Integrates into any VS supported Source Control systems.
  • MP Navigation Features
    • Management Pack Browser for browsing MP Items.
    • Go to Definition
    • Find All References
  • ResKit Tools integrated
    • Workflow Simulator
    • Generate Visio Diagram
    • MP Best Practice Analyzer
    • MP Spell Checker
    • MP Cookdown Analyzer

I am very happy with this decision, and this short history lesson shows how Microsoft listens to you and how strong the community feedback can be. It can even steer the US Titanic a little bit in its direction.


Filed under: Authoring, Development, Management Pack, Tool

PowerShell – SCCM Cmdlet Library “Get-CMDeviceCollection : Specified cast is not valid.”


UPDATE: This issue has been resolved in the latest version here https://www.microsoft.com/en-us/download/details.aspx?id=46681

While doing some SCCM automation we bumped into an issue with the SCCM Cmdlet Library 5.0.8249.1128.

When you try to execute a workflow using PowerShell Remoting in SMA like this…

workflow test {

    InlineScript {
        $VerbosePreference = "Continue"

        $ModuleName = (Get-Item $env:SMS_ADMIN_UI_PATH).Parent.FullName + "\ConfigurationManager.psd1"
        Import-Module $ModuleName

        cd P01:

        $DeviceCollection = Get-CMDeviceCollection -CollectionId "P010000C"
        Return $DeviceCollection

    } -PSComputerName "SERVERFQDN"
}
Get-CMDeviceCollection : Specified cast is not valid.
At test:3 char:3
+
+ CategoryInfo : NotSpecified: (:) [Get-CMDeviceCollection], InvalidCastException
+ FullyQualifiedErrorId : System.InvalidCastException,Microsoft.ConfigurationManagement.Cmdlets.Collections.Commands.GetDeviceCollectionCommand
+ PSComputerName : [SERVERFQDN]

After some investigation we could not determine the cause, so the last option was to roll back to Cmdlet Library version 5.0.82.31.1004, and then everything worked fine. The problem occurs when we provide a named parameter like -CollectionId or -Name; it also exists in other cmdlets and in the latest version of SCCM 2016 (vNext). Microsoft has confirmed and fixed this issue, and the fix will be available in the next version. I have filed this bug on Connect.
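As a quick sanity check when you hit issues like this, you can verify which version of the Cmdlet Library is actually loaded (a sketch; this assumes ConfigurationManager.psd1 has already been imported as in the workflow above):

```powershell
# Show the version of the loaded ConfigurationManager module
Get-Module -Name ConfigurationManager | Select-Object -Property Name, Version
```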

I hope this helps!


Filed under: Configuration, SCCM, SMA, System Center, Troubleshooting

OMS – Free Microsoft Operations Management Suite (OMS) E-Book


image

Great achievements deserve great attention! The "Black Belts" of OMS, Tao Yang, Stanislav Zhelyazkov, Pete Zerger and Anders Bengtsson, have just released a free new e-book about the latest and greatest Microsoft Operations Management Suite (OMS). It has over 400 pages and covers the following topics…

Chapter 1: Introduction and Onboarding
Chapter 2: Searching and Presenting OMS Data
Chapter 3: Alert Management
Chapter 4: Configuration Assessment and Change Tracking
Chapter 5: Working with Performance Data
Chapter 6: Process Automation and Desired State Configuration
Chapter 7: Backup and Disaster Recovery
Chapter 8: Security Configuration and Event Analysis
Chapter 9: Analyzing Network Data
Chapter 10: Accessing OMS Data Programmatically
Chapter 11: Custom Management Pack Authoring
Chapter 12: Cross-Platform Management and Automation

If you are curious about OMS and need to get some guidelines how to get you started or even some deeper knowledge, I highly recommend reading this book. You can download it here!

Thank you guys for providing such great contribution to the community!


Filed under: Book, OMS, Recommended

Azure Automation – ISE Add-On Editing Runbooks


image

Well, it has been a while since the last post, because there is a lot going on in my private life as well as in my job. But now some "tasks" are completed and I will have more time for community work again. The Microsoft product machinery is running at high speed in all areas. One tool I really appreciate is the ISE add-on for Azure Automation. I have written quite a lot of runbooks for SMA in the past using regular ISE and Visual Studio, but a tool for writing runbooks that integrates into the SMA environment is missing. This add-on integrates seamlessly into your ISE environment, lets you write runbooks for Azure Automation in different flavors (regular PowerShell scripts and PowerShell workflows) and executes them using Azure Automation. As a target you are able to choose either Azure itself or a Hybrid Worker Group. Joe Levy (PM, Azure Automation) has already written a post about this add-on; I would like to dive a bit deeper.

How does it look like?

As you can see it seamlessly integrates into ISE…

image

How do I install it?

The installation is quite easy, depending on your needs. The ISE add-on module is available from the PowerShell Gallery. Just open ISE and run:

> Install-Module AzureAutomationAuthoringToolkit -Scope CurrentUser

Then, if you want the PowerShell ISE to always automatically load the add-on, run:

> Install-AzureAutomationIseAddOn

Otherwise, whenever you want to load the add-on, just run the following in the PowerShell ISE:

> Import-Module AzureAutomationAuthoringToolkit

The add-on will prompt you to update if a newer version becomes available.

How does it work?

I just started ISE; on the right side you can provide all the necessary configuration. You are able to connect to Azure with your account and subscription in the Configuration tab…

image

As soon as you are connected, you are able to manually download existing runbooks and assets, or upload locally created runbooks and assets…

image

If you leave the defaults, all your configuration, like runbooks and assets, gets downloaded into your user profile path…

image

…within that folder, after some folder hopping, you find the actual files…

C:\Users\StefanRoth\AutomationWorkspace\[Subscription]\[Ressource Group]\[Automation Account]

image

If you look at the encrypted (SecureLocalAssets.json) and unencrypted (LocalAssets.json) files, you will see this…

image

The strange thing is that the connection strings are saved within the encrypted file, although they are not actually encrypted.

What’s cool?

Well, you are able to run your PowerShell scripts or PowerShell workflows either on Azure or on your Hybrid Worker Group; the output is displayed in a separate window…

image

Right from ISE you are able to create the scripts or workflows…

image

…and of course all necessary Assets either encrypted or not…

image

Conclusion:

It is a very lightweight tool that works just right. I really like this approach and hope Microsoft will do the same for SMA. A few enhancements I would suggest:

  • Some sort of grouping in a folder structure for runbooks and assets
  • Managing the runbooks with some Tags for classification
  • Having some sort of version control, like integration into TFS Online
  • Dependency view (TreeView) to see which child runbook belongs to which parent runbook
  • SMA integration

I hope this gives you a good overview of this add-on! Download the source code here.


Filed under: Authoring, Azure Automation, Configuration, Development, Script, Software

Quick Post – Get Cmdlet Related DLL


image

In some situations you run a cmdlet but have no idea where it is stored; you don't know which "*.dll" it belongs to, or maybe you want to know some more details about the command.

There is a very easy way to figure this out, shown here for the Get-AzureRmResource cmdlet…

(Get-Command Get-AzureRmResource).DLL

image

…as you can see, the output is the path to "Microsoft.Azure.Commands.ResourceManager.Cmdlets.dll". Of course you can run this command with any other cmdlet.

If you want to see other interesting details just run…

Get-Command Get-AzureRmResource | Select *

image
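You can take this one step further and, for example, see which assemblies host the most cmdlets in your current session (a sketch, not part of the original tip):

```powershell
# Group all cmdlets available in the current session by the DLL they live in
Get-Command -CommandType Cmdlet |
    Where-Object { $_.DLL } |
    Group-Object -Property DLL |
    Sort-Object -Property Count -Descending |
    Select-Object -First 5 -Property Count, Name
```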

Finding the related DLL was quite useful for me in the past, so I thought it might help you as well.


Filed under: PowerShell, Script

Office 365 – Microsoft.Exchange.Data.Storage.UserHasNoMailboxException

$
0
0

While playing around with Office 365 I bumped into an issue which you might also face. I created an administrator role in Azure Active Directory and activated my Office 365 E3 license (thank you Microsoft for this free license!). After setting up my tenant properly, I assumed I could log into my Office 365 mailbox. But then I faced this error…

image

…hmm, and when I tried to edit the user in the Exchange admin console to add a mailbox, I saw this greyed-out "pencil" icon…

image

I did "bing" around but did not find any solution to this problem. After a short while of inspecting my admin user in the Office 365 Admin Center…

image

…I figured out that I had not assigned a license to my user…

image

After assigning the license I could create a mailbox for my user and also login into the mailbox.

Well, the error message is correct, but you will find misleading information when you try to find the answer online. Sometimes the solution is just not that complicated :). I hope this saves you some time…


Filed under: Office 365, Troubleshooting

PowerShell – PowerShellGet Module “Publish-PSArtifactUtility : Cannot process argument transformation on parameter ‘ElementValue’”


In PowerShell 5.0, Microsoft introduced the PowerShellGet module. This module contains cmdlets for different tasks; for example, it lets you easily install and upload PowerShell modules and scripts from and to an online gallery such as PowerShellGallery.com. It even lets you find scripts, modules and DSC resources in such repositories. This is a fantastic way to share your script goodies and make them available to others, who can use them on-premises or even in Azure Automation for their runbooks or DSC projects.

In every collaboration scenario there must be some rules. Publishing scripts also has rules to follow; otherwise everything ends in chaos and no one will ever find the appropriate script in its latest version. Therefore we need to provide structured data for version control, prerequisites and author information. This can be done using the PowerShellGet module.

Here is just an overview of the cmdlets provided by this module…

image

Here comes the first pain point: if you try to run a cmdlet, e.g. from your Windows 10 client, check the version of the module first. In the screenshot above I ran it on an Azure VM with Windows Server 2016 TP4 installed; on my actual Windows 10 client I see this…

image

As you can see, there is a difference in version and cmdlet count. If you now think you could just upgrade PowerShell to the latest release on your Windows 10 box: well, you need to wait until the end of February 2016. Microsoft pulled the latest RTM release back because of some major issues; find the post and status details on the PowerShell blog. Once you have the latest release of the PowerShellGet module and the full set of cmdlets available, you are ready to start.

So how does that work?

Let's assume we want to publish a PowerShell script to http://PowerShellGallery.com . Before you can start, you need to register with your Microsoft or organizational account, and then you will be asked to give the PowerShell Gallery access to your account.

image

After registration you will get a key which will be needed later for uploading your files.

The minimum information needed to publish a script or a module is the following metadata, provided in the header of the script:

  • Version number
  • Description
  • Author
  • A URI to the license terms of the script

[Source]

In order to get a template structure for the metadata, just run…

New-ScriptFileInfo -Path C:\Temp\myscript.ps1 -Version 1.0 -Description "My description"

image

This will create a new script file with a bunch of header data. As I have mentioned before, only VERSION, DESCRIPTION, AUTHOR and LICENSEURI are required if you want to publish your script. If you don't add this data, the Publish-Script or Publish-Module cmdlet will complain and you won't be able to upload the files to PowerShellGallery.com. After you have finished editing the data and everything is the way you want it, you are ready to publish your script. As an example, I have played with it a bit and this is how it could look…

image
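In text form, the header that New-ScriptFileInfo generates looks roughly like this (a sketch; the GUID, author and URI values here are placeholders, not real ones):

```powershell
<#PSScriptInfo

.VERSION 1.0

.GUID 00000000-0000-0000-0000-000000000000

.AUTHOR Stefan Roth

.LICENSEURI https://example.com/license

#>

<#
.DESCRIPTION
My description
#>
```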

If you already have a file written and just need to update the metadata, you could use Update-ScriptFileInfo -Path "C:\Temp\Script.ps1" -Version 2.0 -PassThru. I was not able to get this to work; the cmdlet always failed, requesting all parameters (a null value was not allowed).

If you are in doubt about your metadata, you can simply test it using the cmdlet Test-ScriptFileInfo -Path C:\temp\Get-ExpiredWebhook.ps1, which will read the information and display it accordingly…

image

…and all properties shown here…

image

But there is another problem, which I initially wanted to blog about and which took me a few minutes to figure out. If you have a line break within your description, it looks like this…

image

…it shows a comma although there is no comma in the description…

image

Trying to upload to the PowerShell Gallery using Publish-Script -Path C:\Users\returnone\Desktop\Get-ExpiredWebhook.ps1 -NuGetApiKey 12345678-1234-1234-1234-123456789123 fails with the following error, which is in my opinion not very clear…

Publish-PSArtifactUtility : Cannot process argument transformation on parameter 'ElementValue'. Cannot convert value
to type System.String.
At C:\Program Files\WindowsPowerShell\Modules\Powershellget\1.0.0.1\PSModule.psm1:2154 char:17
+ ... Publish-PSArtifactUtility -PSScriptInfo $PSScriptInfo `
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Publish-PSArtifactUtility], ParameterBindingArgumentTransformationException
+ FullyQualifiedErrorId : ParameterArgumentTransformationError,Publish-PSArtifactUtility

image

After removing the line break, the Publish-Script cmdlet worked perfectly. I could reproduce the error and saw this problem each time; the file was a plain-text script with UTF-8 encoding.

If you want to know more about publishing scripts to PowerShellGallery.com, go to that site and explore it. If you want to know more about the PowerShellGet module in general, which is available in PowerShell 5.0, go to TechNet here.

The idea behind these cmdlets is very cool, and they are easy to use, but there is still some work to do fixing some of these bugs.


Filed under: Configuration, PowerShell, Script, Troubleshooting