Archive | March, 2012

Hey 2012 SNMP probe, don’t give me a leading dot…

28 Mar

Quick post. I am currently rewriting all the old 2007 SNMP management packs to the new 2012 SNMP modules. After a lot of text replacing and solving build errors I had a 2012 version of my SNMP management pack. After importing it into my lab I noticed nothing happened. So I started WFAnalyzer (the special SCOM 2012 version; the 2007 version will not work) and saw that the discoveries were running, but the matches didn’t match any OID. Hmmm… that is strange, the MP worked perfectly in SCOM 2007. So I looked at the trace output and…

  image

… what did you notice in the picture? Yes, a leading dot! So it seems the new 2012 SNMP module (System.NetworkManagement.SnmpProbe) gives you an extra dot as a bonus ;-)

So also change any regex matches to account for this extra dot:  ^(.1\.3\.6\.1\.4\.1\.(.+))$

Just so you know.

 

Happy scomming.

Michel Kamp

Great resources on Visual Studio Authoring Extensions

28 Mar

Almost forgot to share this. I don’t know if you authors know about it already.

Remember, you can also use these for making 2007 R2 MPs!!

Visual Studio Authoring Extensions for System Center 2012 – Operations Manager
Visual Studio Authoring Extensions – Visual Studio Features
Visual Studio Authoring Extensions – Simple Templates
Visual Studio Authoring Extensions – Advanced Templates
Visual Studio Authoring Extensions – Example Management Pack

Don’t forget to look at my latest post on creating an MP with VSAE.

Happy Scomming

Michel Kamp

Dude, where is my Availability Report data from the SCOM DWH??

23 Mar

“Houston, we’ve got a problem!” When I run an availability report the data isn’t complete. I’m missing a huge number of days. The graph shows “Monitoring unavailable”, but I am really sure the server was up and monitored!!

image

 

Analyze:

Don’t panic, we are going to solve this (I hope…). First we are going to look up the days we are missing. Simply click on the white bar and the detail report will be rendered.

image

Okay, it looks like we are missing most of the data from 4-3-2012 onwards, and we see strange gaps in the data that is present.

Okay, that’s what the report says, but I am a core-stuff guy, so I check it this way:

Open a SQL session, connect to the DWH database and run the query below. The last aggregated data will be on the first row, so you know the date of the most recent data you have. Change the DateTime range to the same period we used in the report.

SELECT ManagedEntity.FullName, vStateHourly.*
FROM ManagedEntityMonitor
     INNER JOIN ManagedEntity
         ON ManagedEntityMonitor.ManagedEntityRowId = ManagedEntity.ManagedEntityRowId
     INNER JOIN State.vStateHourly
         ON ManagedEntityMonitor.ManagedEntityMonitorRowId = State.vStateHourly.ManagedEntityMonitorRowId
WHERE ManagedEntity.FullName LIKE 'Microsoft.Windows.Computer:opsrms01%'
  AND vStateHourly.DateTime BETWEEN '20120301' AND '20120401'
ORDER BY vStateHourly.DateTime DESC

The output will be:

image

So the last successful hourly aggregation was 02-03-2012 (dd-mm-yyyy). Hmmm, but when I look at the rendered report I see periods of data after this date??? I must confess I really have no idea why ;-)

Now we have to find the root cause and fix these missing aggregations. Luckily we can enable debug information for the aggregation process so we can see more of what is going wrong.

Open SQL Management Studio and run the query below to enable debugging for the State aggregation.

UPDATE [OperationsManagerDW].[dbo].[StandardDataset]
SET [DebugLevel] = 5
WHERE [SchemaName] = 'State'
GO
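To double-check which datasets have extra debugging switched on (and what the other datasets are set to), a quick sanity-check query:

SELECT SchemaName, DebugLevel
FROM [OperationsManagerDW].[dbo].[StandardDataset]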

Now we can see the debug data with this query:

SELECT TOP (100)
       DATEADD(hh, 1, DebugMessage.MessageDateTime) AS CET_datetime,
       StandardDataset.SchemaName,
       DebugMessage.MessageLevel,
       DebugMessage.MessageText
FROM DebugMessage WITH (NOLOCK)
     INNER JOIN StandardDataset
         ON DebugMessage.DatasetId = StandardDataset.DatasetId
WHERE StandardDataset.SchemaName = N'State'
ORDER BY MessageDateTime DESC

The output will be as below:

image

It looks like my aggregation process is way behind!

Since the DWH maintenance runs its steps in sequence, when a procedure earlier in the chain fails (let’s say the event staging) the later ones won’t be run. So I look in the debug table for other messages with ‘failed’ in the message text.

Notice that we are now going a little off track: our main problem was the incomplete State report, but now we are looking at the Events. Just follow me.

SELECT TOP (100)
       DATEADD(hh, 1, DebugMessage.MessageDateTime) AS CET_datetime,
       StandardDataset.SchemaName,
       DebugMessage.MessageLevel,
       DebugMessage.MessageText
FROM DebugMessage WITH (NOLOCK)
     INNER JOIN StandardDataset
         ON DebugMessage.DatasetId = StandardDataset.DatasetId
WHERE MessageText LIKE '%Failed%'
ORDER BY MessageDateTime DESC

Oh no, this is not good:

image

It looks like the event staging is broken. The error is:

Failed to process stagingarea for data set. Error 777971002, Procedure EventProcessStaging, Line 398, Message: Sql execution failed. Error 515, Level 16, State 2, Procedure DebugMessageInsert, Line 15, Message: Cannot insert the value NULL into column 'MessageText', table 'OperationsManagerDW.dbo.DebugMessage'; column does not allow nulls. INSERT fails.

Hmmm. When I look at the error I see that the procedure writing to the debug log has a problem composing a debug message. Strange error… so we have to find the real one. I opened the stored procedure “EventProcessStaging” and there I found a BUG .. brrrr. The variable @InsertTableName is not assigned a value before it is used as part of the debug message variable @MessageText. Because concatenating NULL into a string yields NULL, the insert into the non-nullable MessageText column fails and the exception above is raised. I fixed this by moving the SQL that assigns @InsertTableName above the first use of the variable (this applies to SCOM 2007 and 2012!). I have already raised a bug request with Microsoft through the TAP program. This only occurs when you set the DebugLevel > 3.
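A minimal sketch of the failure mode (not the actual EventProcessStaging code, just an illustration of why an unassigned variable breaks the debug insert):

DECLARE @InsertTableName sysname;        -- never assigned, so it is NULL
DECLARE @MessageText nvarchar(max);

-- With the default CONCAT_NULL_YIELDS_NULL setting, concatenating NULL gives NULL...
SET @MessageText = N'Inserting rows into table: ' + @InsertTableName;

SELECT @MessageText AS MessageText;      -- returns NULL

-- ...so an insert like the one DebugMessageInsert performs would then fail with
-- error 515, because DebugMessage.MessageText does not allow NULLs:
-- INSERT INTO dbo.DebugMessage (MessageText) VALUES (@MessageText);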

For you it simply means: don’t set the DebugLevel above 3, or fix this stored procedure at your own risk (as I have done ;-)). In our case the debug level had already been above 3 for the State dataset for the last month. And because the event processing was breaking the whole maintenance run (bad architecture, sorry), all my State staging was stopped, which caused my empty reports.
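Once you are done troubleshooting, don’t leave the debug level this high. Check what your other datasets use (the SELECT shown earlier) and set it back; a small sketch, assuming 0 is the quiet setting you want:

UPDATE [OperationsManagerDW].[dbo].[StandardDataset]
SET [DebugLevel] = 0
WHERE [SchemaName] = 'State'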

So now that we know this, we can go back to the real issue: fixing the missing state data.

Fix:

This is now very simple, and if you Bing it you can find plenty of info. One article I found very helpful in the beginning is this one from the Microsoft SCOM team ( http://blogs.technet.com/b/operationsmgr/archive/2011/09/06/standard-dataset-maintenance-troubleshooter-for-system-center-operations-manager-2007.aspx )

What I do most of the time is:

Set an Enabled = false override on the rule “Standard Data Warehouse Data Set maintenance rule” for all instances of “Standard Data Set”.

SNAGHTML15ff6a4a

Now I am really sure no maintenance process is running.

Then I run my own maintenance process every minute. Because I know catching up the State data aggregation will take some time and I don’t want to create problems for the other datasets (performance, events, …), I also run the important ones in the same script.

Open a query to the DWH and run:

USE [OperationsManagerDW]
DECLARE @DataSet uniqueidentifier

PRINT 'Starting loop of StandardDatasetMaintenance jobs'

WHILE (1 = 1)
BEGIN
    PRINT GETDATE()
    PRINT 'Start StandardDatasetMaintenance'

    SET @DataSet = (SELECT DatasetId FROM StandardDataset WHERE SchemaName = 'Perf')
    EXEC StandardDatasetMaintenance @DataSet

    SET @DataSet = (SELECT DatasetId FROM StandardDataset WHERE SchemaName = 'Exchange2010')
    EXEC StandardDatasetMaintenance @DataSet

    SET @DataSet = (SELECT DatasetId FROM StandardDataset WHERE SchemaName = 'State')
    EXEC StandardDatasetMaintenance @DataSet

    SET @DataSet = (SELECT DatasetId FROM StandardDataset WHERE SchemaName = 'Event')
    EXEC StandardDatasetMaintenance @DataSet
    --EXEC StandardDatasetProcessStaging @DataSet

    PRINT GETDATE()
    PRINT 'END StandardDatasetMaintenance'
    WAITFOR DELAY '00:01'    -- wait one minute between runs
END

Now check the debug log on a regular basis to see if the State aggregation has caught up.

You can also use the query below:

-----------------------------------------------------------------------
-- Check the first and last aggregation time of the still-to-be-processed data.
-- The first and last date must be equal.
-----------------------------------------------------------------------
DECLARE @DataSet AS uniqueidentifier
SET @DataSet = (SELECT DataSetId FROM StandardDataSet WHERE SchemaName = 'State')

SELECT AggregationTypeId,
       COUNT(*) AS 'Count',
       MIN(AggregationDateTime) AS 'First',
       MAX(AggregationDateTime) AS 'Last'
FROM StandardDataSetAggregationHistory
WHERE DataSetId = @DataSet
  AND LastAggregationDurationSeconds IS NULL
GROUP BY AggregationTypeId

So let’s check if the process is running okay. Simply rerun the report. The output will be:

image

Looks like it’s all going to be alright. Just be patient.

DO NOT FORGET:

After the State data has caught up, remove the overrides, otherwise you will for sure have the same problem (and more) again.
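Before removing the overrides you can verify that every dataset has caught up, not just State; a sketch using the same tables as the query above:

SELECT ds.SchemaName,
       COUNT(*) AS OutstandingAggregations,
       MIN(h.AggregationDateTime) AS Oldest,
       MAX(h.AggregationDateTime) AS Newest
FROM StandardDataSetAggregationHistory h
     INNER JOIN StandardDataSet ds
         ON ds.DataSetId = h.DataSetId
WHERE h.LastAggregationDurationSeconds IS NULL
GROUP BY ds.SchemaName
ORDER BY ds.SchemaName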

Not to be continued:

I really hope not. In my case the DWH is almost 1 TB in size, and at that size it can be very complex and tricky to solve this sort of problem. So if Mr. Murphy is reading this, please skip my place…

Happy SCOMMING,

Michel Kamp

SCOM meets Excel Power Pivot part 2

17 Mar

This post describes the extensions mentioned in my previous post about SCOM and PowerPivot:

Predictive trend line so we can see when the disk space will hit 0 in the future.

Monthly / Weekly / Yearly aggregations so we can drill in and out of the data. SCOM only provides daily and hourly aggregations, but if you have a lot of data you will want to look at it from a higher perspective.

Real data mining so you can detect strange behaviour in your data. [I ran out of time this week; maybe I will blog about this later.]

So here we start.

Predictive trend

This one is quite simple to realize. I really love PowerPivot, I’ll show you!

  1. In the graph, select the line you want to add a predictive forecast to. The data points will be highlighted.
  2. In the “Layout” tab, select “Trendline”.
  3. Select “Linear Forecast Trendline”.

image

Now you will see a forecast trendline. Leave the line selected, go to the same menu again and select “More Trendline Options”.

image

In the following dialog we set “Forward” to, let’s say, 10. Now the forecast will be calculated 10 data points forward on the X (date) axis.

SNAGHTML9694b7c1

Next we select “Glow and Soft Edges”, because we want to give the trendline forecast some highlighting. Select Presets and take Red.

SNAGHTML9697f7ae

The result will be:

image

Cool, isn’t it!!

I don’t have a solution yet to calculate the exact date on the X axis where the line will hit 0, but I guess you could do this with a macro. Or by RTM ;-) A T-SQL alternative is sketched below.
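If you prefer to compute that date outside Excel, a least-squares fit in T-SQL against the DWH daily data works too. This is only a sketch: it assumes the standard Perf.vPerfDaily / vPerformanceRuleInstance / vRule / vManagedEntity views, and the rule GUID and server name below are just the examples used earlier on this blog; adjust them to your own environment.

-- Fit y = a + b*x (x = days since 2012-01-01) per logical disk and report the
-- date where the fitted free-space line reaches 0.
;WITH samples AS
(
    SELECT pd.ManagedEntityRowId,
           me.DisplayName,
           CAST(DATEDIFF(day, '20120101', pd.[DateTime]) AS float) AS x,
           pd.AverageValue                                         AS y
    FROM Perf.vPerfDaily pd
         INNER JOIN vPerformanceRuleInstance pri
             ON pri.PerformanceRuleInstanceRowId = pd.PerformanceRuleInstanceRowId
         INNER JOIN vManagedEntity me
             ON me.ManagedEntityRowId = pd.ManagedEntityRowId
    WHERE pri.RuleRowId = (SELECT RuleRowId FROM vRule
                           WHERE RuleGuid = 'F01B368F-AFB9-7AA2-0C80-2F9C924790B8')  -- one of the two rule GUIDs found in part 1 (example)
      AND me.FullName LIKE '%opsrms01%'                                              -- example server
),
fit AS
(
    SELECT ManagedEntityRowId, DisplayName,
           (COUNT(*) * SUM(x * y) - SUM(x) * SUM(y))
             / NULLIF(COUNT(*) * SUM(x * x) - SUM(x) * SUM(x), 0) AS slope,
           AVG(y) AS avg_y,
           AVG(x) AS avg_x
    FROM samples
    GROUP BY ManagedEntityRowId, DisplayName
)
SELECT DisplayName,
       DATEADD(day, CAST((0 - (avg_y - slope * avg_x)) / slope AS int), '20120101') AS EstimatedZeroDate
FROM fit
WHERE slope < 0;   -- only disks whose free space is actually trending down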

 

Monthly / Weekly / Yearly Aggregations

Now comes the most interesting part. When a KPI outputs a logical disk measurement it is written to the operational DB and to the DWH. Since the DWH holds this data for a long period, you can imagine that the size of the DWH database will grow huge. To solve this, the data is aggregated into periods, and once the aggregation process completes successfully the raw data is deleted/groomed. This sounds good, but for the performance data the only default aggregations are hourly and daily periods. What if you want monthly or even yearly aggregations???? Again, PowerPivot to the rescue!
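(Just to show what that means in data terms, here is a rough T-SQL sketch that rolls the daily performance data up to calendar months, assuming the standard Perf.vPerfDaily view; the rest of this post does the same thing, but interactively in PowerPivot.)

-- Sketch: monthly averages straight from the DWH.
-- Strictly speaking you would weight by SampleCount; kept simple here.
SELECT YEAR(pd.[DateTime])  AS jaar,
       MONTH(pd.[DateTime]) AS mnd,
       pd.ManagedEntityRowId,
       AVG(pd.AverageValue) AS MonthlyAverage
FROM Perf.vPerfDaily pd
GROUP BY YEAR(pd.[DateTime]), MONTH(pd.[DateTime]), pd.ManagedEntityRowId
ORDER BY jaar, mnd, pd.ManagedEntityRowId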

Thanks to a friend of mine, an expert in DWH data mining and also Excel (Albert van Dok), I found the solution. In Excel we create a new worksheet and name it “Date Sheet”. In this sheet we make a Day (dag), Month (mnd) and Year (jaar) column and fill them with a date range of, let’s say, two years (2011, 2012).

image
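If you’d rather not type that date range by hand, you can generate the dag/mnd/jaar rows in T-SQL and paste the output into the “Date Sheet”; a small sketch using a recursive CTE (any calendar table works just as well):

;WITH dates AS
(
    SELECT CAST('20110101' AS datetime) AS dag
    UNION ALL
    SELECT DATEADD(day, 1, dag)
    FROM dates
    WHERE dag < '20121231'
)
SELECT dag,
       MONTH(dag) AS mnd,
       YEAR(dag)  AS jaar
FROM dates
OPTION (MAXRECURSION 0);   -- the range is longer than the default 100 recursions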

Now we link this table to the data table we got from the SCOM DWH. See the next steps. Select all the data rows in columns A, B and C, then select “Create Linked Table” in the “PowerPivot” tab.

image

The PowerPivot window will pop up and show the date range data you selected. The next step is to link the date columns to the date columns from the SCOM logical disk data.

Select in the right corner “Diagram View”

image

Now we select the DateTime column from the Query table and drag it onto the dag column in Table1. The result should be as below.

image

The next step is to create the hierarchy, so that we get the correct aggregations.

Select all the columns on Table1, right-click and choose “Create Hierarchy”.

image

And the result will be:

image

You can rename the default generated names if you like.

Now we go back to the Excel sheet. Select the trend graph you want to display the monthly aggregation in, minimize the Query tree in the Field List window and maximize the Table1 tree. Now drag the jaar and mnd columns to the Axis Fields and the jaar, mnd and dag columns to the Slicers Horizontal.

image

The end result on the sheet will be selection windows for the year, month and day. When you select one or more values in these windows the graph is recalculated and the aggregated monthly data is displayed. Look at the X axis: it displays the selected months, while the Y axis contains the averaged disk data.

image

The next step is to create a new graph and do the same for the yearly aggregated data. It is almost the same field selection as above, but now with only the jaar (year) column in the Axis Fields window.

The end result will be:

image

Now you have created two new aggregations for the SCOM data. Good work!! ;-)

 

Happy SCOMMING!!

Michel Kamp

Today SCOM authors’ lucky day?

16 Mar

Will it be a lucky day for all our SCOM authors??

Maybe or Maybe Not….

Stay focused !

image

SCOM Meets Excel PowerPivot

10 Mar

 

In this post I will share some ideas I have had for a long time but never had time to realize. It is just a showcase; in the future I will of course extend this into a real production version.

The idea:

I wanted to do data mining on the performance data that System Center Operations Manager (SCOM) has collected, so that a data analyst could look at the data, change the data view perspectives in real time and use forecasting to quickly find the bottlenecks to solve. Again, a great predictive approach for increasing the availability of your IT environment!

Solution(s):

Some solutions for this came up in my mind:

Writing my own SQL queries to do this.

Disadvantage: complex to maintain and not end-user (data analyst) friendly.

Writing my own SQL reports to do this.

So combine the SQL queries and use the output to render a report.

Disadvantage: in most cases a data analyst wants to change the perspectives at runtime. A report is static and needs to be re-rendered every time, which is time consuming.

Performance Point!

Yeah, that sounds good. A few years ago Microsoft itself posted a demo using Performance Point and the SCOM DWH to do this ( BI – Dashboard Integration ).

Disadvantage: of course I tried this. It works, but it is really hard to configure and extend, certainly if you aren’t an expert in maintaining SharePoint, SQL Analysis Services and OLAP.

Hmm … then I found the magic word: “PowerPivot”

PowerPivot is a really good tool for a data analyst. The pivot table views can be changed in real time and aggregations can be changed on the fly. Since PowerPivot integrates smoothly with SharePoint, the end user (data analyst) won’t have to install any client-side software; only Internet Explorer is needed. (I will blog about the SharePoint part in a future post.)

So the solution to try out was found. PowerPivot it will be….

PowerPivot:

What do you need to start:

1) An operational SCOM 2007/2012 environment with the DWH role enabled.

2) Office 2010 with Excel installed on the analyst’s end-user computer.

3) The PowerPivot for Excel 2010 add-in installed on the analyst’s end-user computer. ( http://www.microsoft.com/downloads/nl-nl/details.aspx?FamilyID=4b5d8276-d8c8-49b3-a653-f09f744ed9c5 )

4) A SQL Server account with read access to the SCOM DWH database and its tables (a minimal grant sketch follows below).
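A minimal sketch of such a read-only account, assuming a Windows login; the account and domain names are just placeholders:

-- Placeholder names; adjust the domain\account to your environment.
CREATE LOGIN [DOMAIN\scom_dw_reader] FROM WINDOWS;
GO
USE [OperationsManagerDW];
GO
CREATE USER [scom_dw_reader] FOR LOGIN [DOMAIN\scom_dw_reader];
EXEC sp_addrolemember N'db_datareader', N'scom_dw_reader';
GO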

If you have done all of this you are ready to start:

Filling the Pivot Table

Open Excel and create a new workbook. I am not going to show you a screenshot of that ;-)

Now notice the new PowerPivot tab in the ribbon. If it isn’t there you will have to Bing (or Google) for a solution. Hint: add-in security.

image

Click on the PowerPivot tab and select the “PowerPivot Window” icon.

image

 

Now the PowerPivot tool opens in a separate window.

In the Home tab select “Get External Data” > “From SQL Server”.

SNAGHTML6bc0d31c

Now we must specify the SQL Server connection to the SCOM DWH. Fill in the correct SQL Server instance/database name and credentials.

SNAGHTML6bc46fe8

For the next steps you will have to follow me blindly; for now I am not going too deep into the details.

We are going to add the SQL query we want to use for getting the data.

SNAGHTML6bc79f22

Before we know what to paste into this window, we must build the SQL query.

Open SQL Management Studio, connect to the SCOM DWH and run the query below:

SELECT DisplayName, ManagedEntityRowId
FROM vManagedEntity
WHERE DisplayName LIKE 'Windows Server 2008 Computer Group%'

The result is the instance id (ManagedEntityRowId) of the SCOM group “Windows Server 2008 Computer Group”, which contains all the Windows 2008 computer targets.

image

Next we want to get the logical disk performance data for:

Logical Disk Free Megabytes 2008

and

% Logical Disk Free Space 2008

We have to find the rule GUIDs for this. Run the query:

SELECT * FROM dbo.vRule
WHERE RuleSystemName LIKE 'Microsoft.Windows.Server.2008.LogicalDisk.%'

image

Select the two highlighted GUIDs. These are the performance rules that collect the disk data we want.

Now we combine this in the query below:

EXEC Microsoft_SystemCenter_DataWarehouse_Report_Library_PerformaceReportDataGet
  @StartDate = '2012-03-5 09:10:00:000'
, @EndDate = '2012-03-9 09:10:00:000'
, @OptionList = N'<Data><Values><Value>59667
<Rule>B49F6325-F53D-9309-C133-2C73653CA29F</Rule><Color>63,63,255</Color><Type>Line</Type>
<Scale>1</Scale></Value><Value>59667
<Rule>F01B368F-AFB9-7AA2-0C80-2F9C924790B8</Rule><Color>63,63,255</Color><Type>Line</Type>
<Scale>1</Scale></Value></Values></Data>'
, @DataAggregation = 0
,@DataAggregation=0

Change the @StartDate and @EndDate to the period you want the data for. Test the query to see if it returns output.

image
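If you prefer not to hard-code the dates every time, a small sketch of a rolling 30-day window (combine it with the EXEC above in one batch and pass the variables instead of the literal strings):

DECLARE @EndDate datetime, @StartDate datetime
SET @EndDate = GETUTCDATE()
SET @StartDate = DATEADD(day, -30, @EndDate)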

 

So now we have the query we can use in PowerPivot. Copy and paste it into this window.

SNAGHTML6be12df9

Press Save and the query will be executed and the results will be displayed.

image

Next we have to create some calculated fields, otherwise we can’t add the values to the Values pane of the graph.

Select the AverageValue column, then select in the “Home” tab “AutoSum” > “Average”.

image

Now you have created a calculated field. Do the same for all other values you want to display in a graph.

image

At this point we have a pivot table with value data!

Pivot to Graph

Next we are going to create the trend graphs for the analyst. In the PowerPivot window press the PivotTable icon > “Two Charts (Vertical)”.

image

Excel will now create three sheets: two sheets containing the data from the pivot table and one sheet containing the two graphs. I rename the sheets to better names:

image

Now we are going to configure the graphs. In Excel, select the graph sheet (diskspace Analyze) and click on the top graph. On the right the field selection window will be shown. Select the fields shown in the picture below and drag them to the correct pane at the bottom (click the screenshot to enlarge).

image

Some short explanation on what you just did:
The field in the Axis Fields is the X axis and shows the date times. The field dropped in the Values pane will display on the Y axis, this will be the disk space value. This are the 2 most important fields. But…. what if a computer has 2 logical disks then the values will be added to one line instead of 2 lines. So we need to get categories configured. This is done by the fields in the Legend pane. So now we have a graph but it shows all the data from the 2 performance rules. That not really interactive, we want to select the rules runtime. So we drag the RuleDisplayName into the Slicers vertical pane and whala we see our interactive selection list. We do the same for the interactive selection of the computers. Drag the Path field into the Slicers Horizontal pane.

Last we play with the Graph styles. Select the Graph and then change the style.

image

Select the type “Line with markers”

SNAGHTML72618c31

Next we change the looks of the graph. Select the graph and play with the Layout options.

image

The result will be:

image

Cool, isn’t it???? Now you can select the logical disks interactively and the graph will be rebuilt.

To be continued…

Because this post is becoming too long, I will post about some extras a data analyst has to have in my next blog:
– Predictive trend line so we can see when the disk space will hit 0 in the future.

– Monthly / Weekly / Yearly aggregations so we can drill in and out of the data. SCOM only provides daily and hourly aggregations, but if you have a lot of data you will want to look at it from a higher perspective.

– Real data mining so you can detect strange behaviour in your data.

Below is a teaser… ;-)

image

 

I hope you see how easy it is to use PowerPivot to analyze your SCOM data.

If you want the Excel source file you will have to leave a message below ;-) and maybe I will share it…

Happy Scomming,

Michel Kamp