Archive | Reporting

SCOM console could not load Scheduled Reports Subscriptions

8 Aug

 

 

A quick note since I could not find it on the web.

 

Today I had a call with a customer. He had an issue opening the Scheduled Reports view in the SCOM console.


 

He got the error below:

 


 

The error was as cryptic as we have come to expect from Microsoft:

 

System.NullReferenceException: Object reference not set to an instance of an object.

 

Note:  The following information was gathered when the operation was attempted.  The information may appear cryptic but provides context for the error.  The application will continue to run.

System.NullReferenceException: Object reference not set to an instance of an object.
   at Microsoft.EnterpriseManagement.Mom.Internal.UI.Reporting.ManagementGroupReporting.GetSubscriptions(String owner)
   at Microsoft.EnterpriseManagement.Mom.Internal.UI.Reporting.Views.ReportSubscriptionsView.ReportSubscriptionsView.LoadSubscriptionsJob(Object sender, ConsoleJobEventArgs args)

 

So I looked at the subscriptions table in the reporting database where the subscriptions are stored:

SELECT *
FROM [ReportServer].[dbo].[Subscriptions]

And all 40 subscriptions were there.

So what could be the issue?

After some digging it turned out that the EMAIL delivery option was missing.
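If you want to check this yourself, the same table can tell you. Below is just a sketch; I am assuming the standard ReportServer schema here, where the Subscriptions table has a DeliveryExtension column that should read 'Report Server Email' for e-mail subscriptions:

-- Sketch: show the delivery extension per subscription (assumed schema)
SELECT SubscriptionID,
       DeliveryExtension,
       LastStatus
FROM [ReportServer].[dbo].[Subscriptions]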


 

So after running the Reporting Services Configuration Manager and configuring the e-mail settings again, everything started to work.

 

 

So happy SCOMMING again

 

Michel

 

 


How To fix the 500 Error using Favorite Reports in the SCOM web console.

22 May

First, let me mention that this is a non-official solution.

The Problem

You have installed SCOM 2012 SP1 UR2 and have set up the SCOM web console and Reporting Services to run under HTTPS. You have created a favorite report using the native SCOM console, and now when you try to open this favorite report in the SCOM web console you get an error 500.

[screenshot: error 500 in the web console]

Analyzing

To see the real error we have to make some web.config changes. Open the web.config file at this location: C:\Program Files\System Center 2012\Operations Manager\WebConsole\MonitoringView

Now we enable the SCOM error logging:

[screenshot: web.config change enabling error logging]

And to get it displayed on the user page we do:

[screenshot: web.config change to display errors on the page]

Now when we run the favorite report again, we get the real error in the web console:

[screenshot: the real error shown in the web console]

Okay, it looks like the ReportViewer web component binary DLL can't be found. But wait, wasn't this a prerequisite at installation time? So I checked whether the 2010 ReportViewer components were installed: yes they were, and the DLLs were also present in the assembly cache. It looks like the web console has problems finding the correct version of Microsoft.ReportViewer.WebForms.dll in the assembly cache.

The Quick Non-Official Solution

Copying the missing DLLs to the right directory forces the web runtime to look for the DLLs in this directory first and only then go to the assembly cache. So that's what I did.

Copy the Microsoft.ReportViewer.WebForms.dll file from the assembly cache to this path: C:\Program Files\System Center 2012\Operations Manager\WebConsole\MonitoringView\bin

Come on, give me a script to do that! Okay, open PowerShell as admin and run:

Copy-Item c:\Windows\assembly\GAC_MSIL\Microsoft.ReportViewer.WebForms\10.0.0.0*\*.dll "C:\Program Files\System Center 2012\Operations Manager\WebConsole\MonitoringView\bin"

And now you try to run the favorite report again in the web console…

[screenshot: the favorite report now rendering in the web console]

… and yes, it's working!

The End.

To me this looks like a bug and I will raise it with Microsoft.

Happy Scomming!

Michel Kamp

Touchdown : ScomExcelWorkbook V2 is released

28 Jan

Hi Community,

After some delays I have finished the V2 version of the ScomExcelWorkbook.
See the old V1 post here: https://michelkamp.wordpress.com/2012/11/25/scom-and-excel-a-perfect-couple/

New features

1) More types to query

The types below can now be used in the Type column:
Events
Objects
Alerts
Performance
TaskResults
Discoveries
Rules
Overrides
Monitors
ManagementPacks

2) Combine data on same sheet

When you make a pivot table from a SCOM data sheet, you will notice that combining the data of 2 data sheets into one pivot table is a challenge to get working. So I have made a feature that lets you append data to one SCOM data sheet. The only thing you have to do is use the same sheet name and type in the query rows. See below for an example.

[screenshot: query rows using the same sheet name and type]

3) Extended Properties

Since I was too lazy to hard-type every property name in a column, I made it dynamic. It cost me some time to get the object type casting generic, but again I learned a lot more about C# reflection. And that's what I do it for: "learning on the job".

4) Optimization

As always, some speed-ups and code simplification were done.

 

Okay nice but how to get it…

For the V1 version I decided to give the download only to people who sent me a PM. I have had a lot of PMs and good responses. This was great but took a lot of time to process. So now I will publish it on my public SkyDrive; see the link below. But you are still free to leave me a comment; I would really appreciate that.

https://skydrive.live.com/redir?resid=2FFA0FC5B0B89EED!1363&authkey=!AOoo44wQmagQyOI

Download it and give it a try. Please please let me know if you like it or have suggestions/problems.

Oh yeah, before I forget:

The sheet contains sample queries; you can delete every line after row 4 and put your own in. Some help can be found here:

https://michelkamp.wordpress.com/2012/12/12/your-scom-sdk-query-cheat-sheet/

 

Happy Scomming,

Michel Kamp

No Mr. SCOM, I told you it is not an availability state report but a performance state report I want!

16 Jan

Sometimes you wonder why not all the reports are as they should be. For example, you are of course familiar with the availability report. Just pick a target and a period and you will get a nice report telling you when a target went unhealthy.

[screenshot: availability report]

The challenge.

Okay, nice… but I want a report based not on the availability data but on the performance, configuration or security data. But wait, this is built into the availability report, isn't it?

Looking at the report description:

Description:

"For every managed object within System Center Operations Manager, monitors configured in each of the disciplines below determine an object's time in state and then roll up to an object's overall health. The availability report by default shows an object's time in state as per the monitors that roll up within the availability discipline."

Entity health

Availability   <= this you get

Configuration <= this you want

Performance <= ..

Security <= ..

Oh no, it looks like it is not. So yes, it's a real challenge. That's the way we like it.

Solution

The availability report was intended to be used for this, but in the end it looks like the SCOM program team decided to lock it to 'availability' only. I know this because when you look into the report definition you will see:

[screenshot: the report definition, showing the hidden MonitorName parameter set to System.Health.AvailabilityState]

So the report uses only the availability rollup as state calculation data, AND this parameter is hidden for gurus like us. How dare they 😉

So we can solve it in several ways. The root of every solution is that we want to change the value 'System.Health.AvailabilityState' to 'System.Health.PerformanceState', 'System.Health.ConfigurationState' or 'System.Health.SecurityState' to get the report state type we want.

1) Export the report from the report server and edit the hidden value to false. Import the report back, open it in the SCOM console, and edit the MonitorName value to, for example, System.Health.PerformanceState. Run the report and you are done.

2) Make a normal report run using the unmodified availability report and save it to a management pack. Then export the MP, open it in Notepad, and edit the MP.

3) Make a normal report run using the unmodified availability report and save it as a favorite. Then open SQL Server Management Studio, look up the report in the dbo.FavoriteReport table, and change the ReportParameterValues with the changed parameters (a hedged SQL sketch follows below).
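For option 3, the idea would be something like the query below. This is only a hedged sketch: I am assuming the ReportParameterValues column can be cast to text and string-replaced, so try it on a test system first and adapt the WHERE clause so it hits only your favorite report:

-- Hedged sketch for option 3 (assumed FavoriteReport schema):
-- swap the state type directly in the saved favorite report.
UPDATE dbo.FavoriteReport
SET ReportParameterValues = REPLACE(
        CAST(ReportParameterValues AS nvarchar(max)),
        'System.Health.AvailabilityState',
        'System.Health.PerformanceState')
WHERE CAST(ReportParameterValues AS nvarchar(max))
      LIKE '%System.Health.AvailabilityState%'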

I know you are thinking right now… what would you do Michel…

I would go for option 1, because I would also change the report definition to have a correct name such as 'Performance availability' etc. and save it under a different name. You must be aware that if you only change the report parameter to hidden = false and don't change the report file name, then the next time you import a new service pack or MP version your report could be overwritten… So with that said, go for the safer one and choose option 2.

Let’s go!

1) Make the normal availability report in the SCOM console.

2) Save it to a MP

[screenshot: saving the report to a management pack]

3) Export the MP

4) Edit the MP with Notepad.

[screenshot: the MP XML opened in Notepad, with the MonitorName value changed]

5) Import it into SCOM (leave the MP version number unchanged).

6) Wait a few minutes and you will see the report in the console.

Below is the end result. Also notice that you can still click through to the sub-reports, and that these sub-reports are also of the state type you wanted!

[screenshot: the end result report]

Yes, I know you will have to do this for each of the 3 report types, because you can't change the monitor type at runtime. In the end the decision is yours to use option 1, 2 or 3.

The End

Every time I tell myself: make a short blog post! And every time I notice that I am failing… But who cares… (yes okay, my wife) 😉

Happy scomming!

Michel Kamp

Authoring SCOM Reports in VS 2010

14 Jan

Hi,

A short post on how to get your dev environment ready for authoring SCOM reports.

Challenge:

You have installed SCOM 2012 on SQL 2008. You want to author a custom report using Visual Studio 2010. When you open Visual Studio you will notice that NO BI project template is shown. Normally you would select this project template and then create a new report project to make your custom report. How to continue?

Solved:

Grab a SQL 2012 ISO (YES, 2012) and start the setup.

1) Select installation:


2) New sql or add features


3) Select SQL features Install


4) Now the important step: select the 3 options here. The most important one is "SQL Server Data Tools"; this feature contains the VS BI project template.


5) Step through the install windows.

Now open Visual Studio 2010 and create a new project. And what do we see?

Yes, the BI template 😉


 

Now you can create the new SCOM reports. Notice also the NEW chart types!!!

[screenshot: the new chart types]

 

Remember that if you use custom report code components, you must copy the correct .dll assembly to this directory:

C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies

The End.

Happy Scomming

Michel Kamp

Don’t let the data warehouse write action fool you!

26 Sep

Yes, I know, it has been a long time since I posted. Vacation and work pressure were, and still are, the reason. Nevertheless I will share a problem I ran into that looks small but can have a big impact.

The problem.

You have a workflow with a PowerShell/VBS script that outputs a property bag with performance data, and the performance data contains multiple counters. The performance data is written to both the OpsDB and the DWH DB. All works okay: you see the performance data counters in the native console. So you assume everything is fine, because the DWH write action is writing the same counters to the DWH… but when you look in the DWH you see that only one counter is stored. And you are sure the workflow outputted multiple counters…

Below are the performance counters in the native console. All 4 perf counters are there (yellow) in the ops console.

[screenshot: all four performance counters in the Operations console]

Below the DWH.

You see only one rule (yellow); this was the first counter in the property bag.

[screenshot: the DWH showing data for only one rule]
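By the way, you don't need a screenshot to see this; a query along these lines against the DWH shows how many samples each rule/counter combination collected. This is a sketch, assuming the standard data warehouse views (vPerfRaw, vPerformanceRuleInstance, vPerformanceRule, vRule):

-- Sketch: count the collected samples per rule/counter in the DWH
SELECT r.RuleDefaultName,
       pr.ObjectName,
       pr.CounterName,
       COUNT(*) AS SampleCount
FROM Perf.vPerfRaw praw
INNER JOIN vPerformanceRuleInstance pri
        ON pri.PerformanceRuleInstanceRowId = praw.PerformanceRuleInstanceRowId
INNER JOIN vPerformanceRule pr
        ON pr.RuleRowId = pri.RuleRowId
INNER JOIN vRule r
        ON r.RuleRowId = pr.RuleRowId
GROUP BY r.RuleDefaultName, pr.ObjectName, pr.CounterName
ORDER BY r.RuleDefaultName

For the rule above you would expect four counter rows, but only one shows up.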

What could be wrong ???

Analyze

The workflow looks like this:

<Rule ID="TransferFile.ReadSec" Enabled="true" Target="FileTransferClient" ConfirmDelivery="true" Remotable="true" Priority="Normal" DiscardLevel="100">
  <Category>Custom</Category>
  <DataSources>
    <DataSource ID="SMBFileTransfer" TypeID="FileTransfer">
      <Debug>false</Debug>
      <IntervalSeconds>300</IntervalSeconds>
    </DataSource>
  </DataSources>
  <WriteActions>
    <WriteAction ID="ToOps" TypeID="SystemCenter!Microsoft.SystemCenter.CollectPerformanceData" />
    <WriteAction ID="ToDWH" TypeID="SCDW!Microsoft.SystemCenter.DataWarehouse.PublishPerformanceData" />
  </WriteActions>
</Rule>

1. First you check what the property bag output from the data source SMBFileTransfer contains:

<Collection>
  <DataItem type="System.PropertyBagData" time="2012-09-20T19:55:28.0638791+02:00" sourceHealthServiceId="0F6B7345-4C8E-CFAF-BD7A-454E6C94B77F">
    <Property Name="Instance" VariantType="8">c:\destionation</Property>
    <Property Name="Counter" VariantType="8">Read Transfer Kbyte Sec</Property>
    <Property Name="Value" VariantType="5">14450.625</Property>
  </DataItem>
  <DataItem type="System.PropertyBagData" time="2012-09-20T19:55:28.1079971+02:00" sourceHealthServiceId="0F6B7345-4C8E-CFAF-BD7A-454E6C94B77F">
    <Property Name="Instance" VariantType="8">c:\destionation</Property>
    <Property Name="Counter" VariantType="8">Read Transfer Total Sec</Property>
    <Property Name="Value" VariantType="5">0.3</Property>
  </DataItem>
  <DataItem type="System.PropertyBagData" time="2012-09-20T19:55:28.1079971+02:00" sourceHealthServiceId="0F6B7345-4C8E-CFAF-BD7A-454E6C94B77F">
    <Property Name="Instance" VariantType="8">c:\destionation</Property>
    <Property Name="Counter" VariantType="8">Write Transfer Kbyte Sec</Property>
    <Property Name="Value" VariantType="5">14450.625</Property>
  </DataItem>
  <DataItem type="System.PropertyBagData" time="2012-09-20T19:55:28.1079971+02:00" sourceHealthServiceId="0F6B7345-4C8E-CFAF-BD7A-454E6C94B77F">
    <Property Name="Instance" VariantType="8">c:\destionation</Property>
    <Property Name="Counter" VariantType="8">Write Transfer Total Sec</Property>
    <Property Name="Value" VariantType="5">0.3</Property>
  </DataItem>
</Collection>

You see multiple counter values that have to be converted to performance data.

2. Now we check the converted performance data using the WFAnalyzer. See below; it looks okay.

Received DataItem <DataItem type="System.Performance.Data" time="2012-09-20T19:55:28.1109383+02:00" sourceHealthServiceId="0F6B7345-4C8E-CFAF-BD7A-454E6C94B77F"><TimeSampled>2012-09-20T19:55:28.0638791+02:00</TimeSampled><ObjectName>SMB File Transfer</ObjectName><CounterName>Read Transfer Kbyte Sec</CounterName><InstanceName>c:\destionation</InstanceName><IsNull Type="Boolean">false</IsNull><Value>14450.625</Value></DataItem>

Received DataItem <DataItem type="System.Performance.Data" time="2012-09-20T19:55:28.1109383+02:00" sourceHealthServiceId="0F6B7345-4C8E-CFAF-BD7A-454E6C94B77F"><TimeSampled>2012-09-20T19:55:28.1079971+02:00</TimeSampled><ObjectName>SMB File Transfer</ObjectName><CounterName>Read Transfer Total Sec</CounterName><InstanceName>c:\destionation</InstanceName><IsNull Type="Boolean">false</IsNull><Value>0.3</Value></DataItem>

Received DataItem <DataItem type="System.Performance.Data" time="2012-09-20T19:55:28.1109383+02:00" sourceHealthServiceId="0F6B7345-4C8E-CFAF-BD7A-454E6C94B77F"><TimeSampled>2012-09-20T19:55:28.1079971+02:00</TimeSampled><ObjectName>SMB File Transfer</ObjectName><CounterName>Write Transfer Kbyte Sec</CounterName><InstanceName>c:\destionation</InstanceName><IsNull Type="Boolean">false</IsNull><Value>14450.625</Value></DataItem>

Received DataItem <DataItem type="System.Performance.Data" time="2012-09-20T19:55:28.1109383+02:00" sourceHealthServiceId="0F6B7345-4C8E-CFAF-BD7A-454E6C94B77F"><TimeSampled>2012-09-20T19:55:28.1079971+02:00</TimeSampled><ObjectName>SMB File Transfer</ObjectName><CounterName>Write Transfer Total Sec</CounterName><InstanceName>c:\destionation</InstanceName><IsNull Type="Boolean">false</IsNull><Value>0.3</Value></DataItem>

3. The next step is to check the write actions. This also looks okay. The "ToDWH" write action should write the data to the DWH.

<WriteActions>
  <WriteAction ID="ToOps" TypeID="SystemCenter!Microsoft.SystemCenter.CollectPerformanceData" />
  <WriteAction ID="ToDWH" TypeID="SCDW!Microsoft.SystemCenter.DataWarehouse.PublishPerformanceData" />
</WriteActions>

All looks okay….

Solution

After some mailing with the OM development team the answer was found: writing multiple counters to the DWH from one property bag output is NOT supported! The DWH write module has a one-to-one reference map, which means one rule can contain only one counter. Be aware that no error is reported when this happens.

The only way to solve this is to make one rule for every performance counter you want to store in the DWH. Use a condition detection in the rule to filter out the correct performance counter. See below for an example.

<Rule ID="TransferFile.ReadSec" Enabled="true" Target="FileTransferClient" ConfirmDelivery="true" Remotable="true" Priority="Normal" DiscardLevel="100">
  <Category>Custom</Category>
  <DataSources>
    <DataSource ID="SMBFileTransfer" TypeID="OPS.SMB.Performance.FileTransfer">
      <Debug>false</Debug>
      <IntervalSeconds>300</IntervalSeconds>
    </DataSource>
  </DataSources>
  <ConditionDetection ID="Filter" TypeID="System!System.ExpressionFilter">
    <Expression>
      <SimpleExpression>
        <ValueExpression>
          <XPathQuery Type="String">CounterName</XPathQuery>
        </ValueExpression>
        <Operator>Equal</Operator>
        <ValueExpression>
          <Value Type="String">Read Transfer Total Sec</Value>
        </ValueExpression>
      </SimpleExpression>
    </Expression>
  </ConditionDetection>
  <WriteActions>
    <WriteAction ID="ToOps" TypeID="SystemCenter!Microsoft.SystemCenter.CollectPerformanceData" />
    <WriteAction ID="ToDWH" TypeID="SCDW!Microsoft.SystemCenter.DataWarehouse.PublishPerformanceData" />
  </WriteActions>
</Rule>

THE END

Maybe this will help you. Till next time.

Happy SCOMMING

Michel Kamp

SCOM DWH aggregation data loss tips and tricks

10 Apr

 

This 'short' post will be about the DWH aggregations again. It contains some tips on how not to lose any data.

!!! Everything I suggest and do here is at your own risk and totally unsupported, unless you are instructed by Microsoft support. !!!

The problem:

You run a performance report over one month. You notice that you are missing some days of aggregated hourly/daily data. You were not having any troubles as far as you knew… till now.

[screenshot: performance report with missing days]

Analyze:

First we are going to check whether we have any aggregations that are not completed yet.
Run the SQL query below on the DWH database:

-- checking the to-be-processed aggregations --------------
SELECT COUNT(*) AS Aggr_behind,
       Dataset.DatasetDefaultName
FROM StandardDatasetAggregationHistory
INNER JOIN Dataset
        ON StandardDatasetAggregationHistory.DatasetId = Dataset.DatasetId
WHERE StandardDatasetAggregationHistory.DirtyInd = 1
GROUP BY Dataset.DatasetDefaultName

The result could be as shown below. The Aggr_behind number shows you the aggregations that are not completed yet.

[screenshot: query result showing a high Aggr_behind count]

In this case, with such a high number, we have a serious problem. Then you just follow my previous blog post on how to solve this; it is about missing state data but can also be applied to performance data. Look at the FIX: part to kick off the aggregation processing.

(https://michelkamp.wordpress.com/2012/03/23/dude-where-my-availability-report-data-from-the-scom-dwh/)

But if you see a number around 2 for the performance dataset (see picture below), it means only 2 aggregations still have to be processed. This is what we want to see, so everything seems okay. But why are we missing the period 01-02-2012 till 20-01-2012?

[screenshot: performance dataset with an Aggr_behind of about 2]

We could have 2 scenarios here:

1. The data was simply not provided to the DWH.

2. The data was provided but, due to staging/aggregation problems, not processed.

For case 1 we would have to look at the agents to see what went wrong; that is out of scope for this post.

For case 2 we have some solutions; see below.

Case 2

First let me explain, at helicopter view, how the aggregation process works. I am sure I am missing some details (so feel free to add to or correct me on this!).

image

Looking at the picture above:

1. The SCOM management server DWH writer data source writes the performance data to a RAW staging table.

2. The DWH staging process processes this data by copying the RAW rows into a process table. Sometimes the table is simply renamed and recreated, if the new RAW row count is less than a configured number. If you have a large number of new RAW rows, the rows are copied in batches to minimize the transaction log impact. Finally the RAW data is copied into the RAW data partition tables.

3. The standard maintenance process generates the aggregation sets that have to be processed in step 4. During this process, aggregation rows are created in the aggregation history table with a dirty indication (DirtyInd) of 1. (A query to peek at this history follows the list.)

4. The RAW staged partition data is processed into aggregated hourly and daily data. When the aggregation is complete, the DirtyInd for that aggregation is set to 0.

5. The stored procedure reads the just aggregated data.

6. Data received from step 5 will be used to generate the report for the end user.
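If you want to peek at this history yourself, a query like the one below does it. Again a sketch, assuming the standard DWH schema:

-- Sketch: the most recent aggregation history rows;
-- DirtyInd = 1 means the aggregation still has to be processed
SELECT TOP 20
       d.DatasetDefaultName,
       h.AggregationDateTime,
       h.AggregationTypeId,
       h.DirtyInd
FROM StandardDatasetAggregationHistory h
INNER JOIN Dataset d ON d.DatasetId = h.DatasetId
ORDER BY h.AggregationDateTime DESC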

 

So now that we know the data flow, what could be wrong?

The answer has to be looked for in the grooming process. Yes, the grooming process: the data in the RAW partition tables from step 2 has a grooming/retention period, which is 10 days by default. So if your aggregation is broken for more than 10 days (and you didn't detect it), you will LOSE your RAW data, and as a result the aggregation process has nothing left to aggregate. So no performance data, resulting in our root problem: the date gap in the report.

Solution:

Pfff… all this theory is nice, but how do I fix it?

Simply by: 😉

1. Manually inserting the missing RAW data and kicking off the aggregation process. I will blog on how to do this later (after the MMS).

2. Preventing this from happening again.

To prevent this you can increase the retention/grooming period from 10 days to, let's say, 30 days. Check first whether you have enough DB space. Execute the query below:

UPDATE StandardDatasetAggregation
SET MaxDataAgeDays = 30
WHERE GroomStoredProcedureName = 'PerformanceGroom'
  AND AggregationTypeID = 0
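If you first want to see what is configured right now, a query like this against the same table helps (a sketch; AggregationTypeID 0 is the raw data, 20 hourly, 30 daily, at least that is my understanding):

-- Sketch: inspect the current retention settings before changing them
SELECT d.DatasetDefaultName,
       a.AggregationTypeId,
       a.MaxDataAgeDays,
       a.GroomStoredProcedureName
FROM StandardDatasetAggregation a
INNER JOIN Dataset d ON d.DatasetId = a.DatasetId
ORDER BY d.DatasetDefaultName, a.AggregationTypeId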

Now you will have 30 days to solve your aggregation problems. Of course this is a workaround to give you more air to breathe while fixing them.

The best way is to monitor it proactively. Since we can monitor everything, we create a monitor that checks the outstanding aggregations every 60 minutes and alerts when a threshold is hit. You can use the query from the Analyze part of this post for this. I would set the threshold at 10, so you are notified when your aggregation process is 10 datasets (about 10 hours) behind. If I have time before I go to the MMS I will blog about this extra monitor, because you can't build this one with the normal DB watcher. And of course I will use the VS Authoring Extensions for it. A sketch of the threshold query follows below.
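The query such a monitor could run could look like this; it is the same check as in the Analyze part, reduced to one number you can compare against the threshold:

-- Sketch: outstanding aggregations for the performance dataset;
-- alert when this count goes above your threshold (e.g. 10)
SELECT COUNT(*) AS Aggr_behind
FROM StandardDatasetAggregationHistory h
INNER JOIN Dataset d ON d.DatasetId = h.DatasetId
WHERE h.DirtyInd = 1
  AND d.DatasetDefaultName LIKE 'Performance%'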

Happy scomming.

Michel