
ARM: A parameter cannot be found that matches parameter name ‘_artifactsLocationSasToken’

5 Dec

 

Short note to myself.

Problem

If you are trying to deploy an ARM template using Visual Studio and you get the error below…

AzureRmResourceGroupDeployment : A parameter cannot be found that matches parameter name '_artifactsLocationSasToken'

… do not spend an hour trying to figure out why the parameter isn't found. Just continue reading the solution below.

Solution

 

Check whether your ARM template(s) are formatted correctly, even when the deployment doesn't report any validation errors (because the deployment hasn't reached that step yet)!

In my case I had one } too many, so the next part was interpreted as a parameter of the main resource. The editor and syntax highlighting didn't complain, but the deployment gave the '_artifactsLocationSasToken' error.
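One cheap safeguard is to run each template through a strict JSON parser before deploying; it catches many stray-brace mistakes that an editor lets slip. A minimal sketch in Python (the template snippets are made-up examples, not the actual template from this post):

```python
import json

def check_template(text):
    """Return None when the template parses as JSON, else a short parse error."""
    try:
        json.loads(text)
        return None
    except json.JSONDecodeError as e:
        return f"line {e.lineno}, column {e.colno}: {e.msg}"

good = '{"parameters": {"adminUser": {"type": "string"}}}'
bad = '{"parameters": {"adminUser": {"type": "string"}}}}'  # one } too many

print(check_template(good))  # None
print(check_template(bad))   # the extra brace is reported as "Extra data"
```

This only proves the file is valid JSON; a misplaced brace that still yields valid JSON (shifting a block into the wrong scope) needs a template validation step or a careful diff to spot.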

So just be warned 😉

 

Happy Azuring

Michel Kamp

Michelkamp.wordpress.com

Azure Marketplace Solution UI Test

29 Nov

Hi,

A quick note for all the DevOps people among us:

Currently I am investigating how to publish a solution to the Azure Marketplace. You have two ways to do this: a Virtual Machine offer or a Solution template offer.

A Virtual Machine offer is a sysprep'd VM that contains all your pre-installed software and is simply deployed as a new virtual machine. So you have to prepare a VM, sysprep/generalize it, and upload it to Azure.

A Solution template offer is somewhat more advanced. Here you don't have to configure a sysprep'd VM image; instead you use an ARM template to roll out a brand new VM and then use a script extension resource to deploy your artifacts (aka the software to install). The cool part is that you can also change the UI shown in the Azure portal when configuring the deployment. This is done with the createUiDefinition.json file, which has to be part of the solution zip file you upload to Azure.

An example of this can be found here: https://github.com/Azure/azure-quickstart-templates/tree/master/marketplace-samples

So when you have created all your ARM templates and put them into a Solution zip file (i.e. just zip all the ARM templates into one file), you upload it using the https://publish.windowsazure.com portal and do a staged publish (i.e. a test rollout/publish).

Now you have to wait a couple of hours before it is ready to test. And this is the annoying part: there is no way to test the custom UI without doing a staged publish and waiting a couple of hours again…

Or is there …. ??

Yes there is! Thanks to this link I was able to test my custom deployment UI.

All you will have to do is:

  1. Using Azure Storage Explorer, create a new public container (i.e. set the public access level), for example named "test".
  2. Copy the file “createUiDefinition.json” to this container.
  3. Check that you can open the file by pasting its "Copy URL" into a new browser tab.
  4. Now you have to URL-encode this URL; you can use http://meyerweb.com/eric/tools/dencoder/ for this.
  5. The URL looks like this now: https%3A%2F%2Fegnlmkotkizce.blob.core.windows.net%2Ftest%2FcreateUiDefinition.json
  6. Now replace the URL in the text below (note: if you see curly quotes ”, replace them with normal double quotes; this is a WordPress issue):

    https://portal.azure.com/#blade/Microsoft_Azure_Compute/CreateMultiVmWizardBlade/internal_bladeCallId/anything/internal_bladeCallerParams/{"initialData":{},"providerConfig":{"createUiDefinition":"URL from step 5"}}

     

  7. The end result could look like this:

    https://portal.azure.com/#blade/Microsoft_Azure_Compute/CreateMultiVmWizardBlade/internal_bladeCallId/anything/internal_bladeCallerParams/{"initialData":{},"providerConfig":{"createUiDefinition":"https%3A%2F%2Fegnlmkotkizce.blob.core.windows.net%2Ftest%2FcreateUiDefinition.json"}}

     

  8. Open a new browser and paste in the URL from step 7. And the result will be ….
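Steps 4 to 7 above can also be scripted; a minimal sketch in Python using only the standard library (the blob URL is the example one from step 5):

```python
from urllib.parse import quote

blob_url = "https://egnlmkotkizce.blob.core.windows.net/test/createUiDefinition.json"

# Percent-encode everything, including ':' and '/' (safe="" disables the default safe chars)
encoded = quote(blob_url, safe="")

test_url = (
    "https://portal.azure.com/#blade/Microsoft_Azure_Compute/"
    "CreateMultiVmWizardBlade/internal_bladeCallId/anything/"
    "internal_bladeCallerParams/"
    '{"initialData":{},"providerConfig":{"createUiDefinition":"' + encoded + '"}}'
)

print(encoded)
print(test_url)
```

Paste the printed URL into a browser, just as in step 8.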

 

Super handy and cool!

 

Happy azuring

Michel Kamp

https://michelkamp.wordpress.com

 

 

 

OMS: Querying OMS the Message Analyzer way

22 Sep

 

Hi,

Short post to share something cool I tried out today. About a year ago Microsoft dropped the Network Monitor tool and replaced it with the Microsoft Message Analyzer tool.

With this tool you can trace not only network traffic, as you could with Network Monitor, but also many other trace data sources. One of them is OMS. Yes, you heard that right: you can now analyse your OMS queries using the Message Analyzer tool!

Here’s a short howto:

Download the Message Analyzer tool from:

http://www.microsoft.com/en-us/download/details.aspx?id=44226

 

Install and start it.

 

Now press the “New Session” button.

Now select the OMS datasource

Log on to your Azure account.

 

You will need an active Azure subscription!

Select the correct Azure subscription and Workspace.

 

Now in the query box you can specify the search query like you would in the OMS Log Search.

 

For this demo I use “*” to get all records.

 

Press Apply

After a couple of seconds the OMS records will be displayed. Now you can select a record and see all its property fields and values.

 

At this time the results are limited to 10 records. Maybe this will be changed later on.
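The same search could also be issued over REST instead of through the Message Analyzer UI. A rough sketch of how such a request might be assembled against the legacy OMS Search API (the api-version and the subscription, resource group and workspace identifiers below are all assumptions for illustration, not taken from this post; check the current documentation before relying on them):

```python
import json

# Hypothetical identifiers for illustration only
SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "my-oms-rg"
WORKSPACE = "my-workspace"

def build_search_request(query, top=10):
    """Build the URL and body for the (assumed) legacy OMS Search REST API."""
    url = (
        "https://management.azure.com"
        f"/subscriptions/{SUBSCRIPTION}"
        f"/resourceGroups/{RESOURCE_GROUP}"
        "/providers/Microsoft.OperationalInsights"
        f"/workspaces/{WORKSPACE}/search"
        "?api-version=2015-03-20"
    )
    body = json.dumps({"query": query, "top": top})
    return url, body

url, body = build_search_request("*")  # same "*" query as in the demo
print(url)
print(body)
# An authenticated POST of `body` to `url` would return the matching records.
```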

 

Happy OMS’ing!

Michel Kamp

 

 

 

[OMS][TIP] Graph Grouping

14 Sep

 

Something I noticed.

In OMS, when you are writing search queries you can use the BY command to group. When you specify multiple group columns and use INTERVAL to generate a graph, you also get a nice feature exposed.

In the legend you can now select the lines you want to see, per group. This can be very handy.
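As an illustration, a legacy OMS search query with multiple group columns and an interval could look like this (the type and counter names are illustrative, not taken from this post):

```
Type=Perf ObjectName=Processor CounterName="% Processor Time"
  | measure avg(CounterValue) by Computer, InstanceName interval 1HOUR
```

Each Computer/InstanceName combination then becomes its own selectable line in the graph legend.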

See picture below:

 

Drawback

 

One drawback when using multiple groups: if you use such a query in a custom view, you will lose the legend. But that legend is useless anyway, since the view space is too small to make it readable.

 

 

Happy SCOM’ing

Michel Kamp

[Workaround] OMS View Designer Pitfall alias Bug ?

5 Jul

 

In my last post I warned you about a design-time issue when using the "Data-flow verification" on the Tile. (https://michelkamp.wordpress.com/2016/07/05/oms-view-designer-pitfall-alias-bug/ )

Because I was of course hit by this myself, and afraid of losing my freshly designed dashboard, I looked for a way to open it through the back door.

And I found a workaround.

Steps to take:

Open the OMS home Page

 

Now press F12 (using IE)

And (1) select the DOM Explorer. Now (2) and (3) select your custom designed View / dashboard and copy the GUID (4)

 

 

Now copy the URL from the current OMS page (1)

 

Edit the URL as below:

Replace the GUID after ?solutionId= with the GUID you got from step (4) above

https://e1a1111-1d01-101a-1111-11ef1111c1cf.portal.mms.microsoft.com/?returnUrl=%2f#Workspace/overview/solutions/details/index?solutionId=11111f1e-7e1d-1c1f-1afc-1b1e11ebc11b&_timeInterval.intervalDuration=604800

Open a new IE tab

And paste in this new URL. And yes, you are in! Now the first thing to do is disable the Data-flow verification feature and save the dashboard.
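The GUID swap above can also be done in code; a minimal sketch in Python (the URL below is the dummy one from this post):

```python
import re

def set_solution_id(url, view_guid):
    """Replace the GUID after solutionId= with the one copied from the DOM Explorer."""
    return re.sub(r"(solutionId=)[0-9a-fA-F-]+", r"\g<1>" + view_guid, url)

url = ("https://example.portal.mms.microsoft.com/?returnUrl=%2f#Workspace/overview/"
       "solutions/details/index?solutionId=11111f1e-7e1d-1c1f-1afc-1b1e11ebc11b"
       "&_timeInterval.intervalDuration=604800")

print(set_solution_id(url, "22222f2e-8e2d-2c2f-2afc-2b2e22ebc22b"))
```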

Happy OMS’ing

Michel Kamp

Touching SCOM

https://michelkamp.wordpress.com

 

 

 

OMS View Designer Pitfall alias Bug ?

5 Jul

Hi OMS’ers,

Just a short post to warn you about a nasty situation when designing your fantastic OMS dashboards using the brand new View Designer (public preview).

When you add a Tile you have a feature called "Data-flow verification". This feature enables you to put a message on the Tile when no data records are found in the OMS system.

This is a handy feature, because you don't want to show an empty dashboard… But it can also cause an issue at design time.

Because… what happens when you have set up the "Data-flow verification" to check the past x days for any data, but have made a typo, or the data isn't flowing in any more? Yes, of course the dashboard will show the message you specified, but you get more (for free)…

You CANNOT open your custom view (dashboard) any more to edit it! So you are somewhat stuck here… ;-(

Be warned!

 

So here the steps to see what happens:

Open the View Designer

 

Add the Tile, and enable the Data-flow verification

 

Now look at the tile when you add the query: it will give you an error when it hasn't got any data back. This indicates you are going to run into this issue…

 

Now save the dashboard.

And try to open it from the Home page……

 

Happy OMS’ing

Michel Kamp

Touching SCOM

https://michelkamp.wordpress.com

Oh no, I forgot my SCOM account passwords!!

25 May

 

Problem:

Oh no, I forgot my SCOM account passwords!! I don't know the passwords of the Data Access, Data Reader and Data Writer accounts any more. Resetting them in AD will force me to do a lot of tweaking to correct the accounts in SCOM.

Don't worry, we will find them for you.

Analyse:

 

SCOM stores the account passwords in the "Run As Configuration -> Accounts" section. This account information is linked to a "Run As profile". A Run As profile can be assigned to a SCOM workflow (rule/monitor/task…) so that the workflow runs under that account's security context.

 

Nice, but we still can't see the passwords on the accounts.

 

Solution:

 

But we can also do other things with the Run As profile. We can simply pass it as a parameter to, for example, a script. In the script we can read out the account information and find our lost password.

In SCOM we can use the secure script provider (VBScript), aka "Microsoft.Windows.ScriptWriteAction". The secure script provider streams the Run As information as an input stream to the VBScript. So if you read this input stream at the top of your script, you get the account information. This can be tricky sometimes.

See an example below:

<WriteActions>
  <WriteAction ID="sc" TypeID="Windows!Microsoft.Windows.ScriptWriteAction">
    <ScriptName>ScriptName.vbs</ScriptName>
    <Arguments />
    <ScriptBody><![CDATA[
Set oAPI = CreateObject("MOM.ScriptAPI")
Set oArgs = WScript.Arguments

' The secure input stream holds "username password" on a single line
password = WScript.StdIn.ReadLine()
Call oAPI.LogScriptEvent("ScriptName.vbs", 101, 2, "Debug password = " & password)
]]></ScriptBody>
    <SecureInput>$RunAs[Name="RUNAS_PROFILE_1"]/UserName$ $RunAs[Name="RUNAS_PROFILE_1"]/Password$</SecureInput>
    <TimeoutSeconds>300</TimeoutSeconds>
  </WriteAction>
</WriteActions>

 

Using the SecureInput parameter we can provide the Run As account information. For the username we use:

$RunAs[Name="RUNAS_PROFILE_1"]/UserName$

And for the password we use:

$RunAs[Name="RUNAS_PROFILE_1"]/Password$

RUNAS_PROFILE_1 is the internal name of the Run As profile in SCOM. You can use the PowerShell cmdlet Get-SCOMRunAsProfile to get the internal names.

I hear you thinking: this is way too old, this is VBScript, we WANT PowerShell! And I agree completely.

So for PowerShell we can use the normal PowerShell script provider, aka "Microsoft.Windows.PowerShellProbe". We don't have to use a SecureInput parameter; we simply supply the Run As reference as a normal parameter. And that does the trick.

<ProbeAction ID="Probe" TypeID="Windows!Microsoft.Windows.PowerShellProbe">
  <ScriptName>DisplayCredentials.ps1</ScriptName>
  <ScriptBody><![CDATA[
Param(
  $USERNAME,
  $PASSWORD
)

# Output the input parameters
Write-Output "UserName: $USERNAME"
Write-Output "Password: $PASSWORD"

# End script
]]></ScriptBody>
  <SnapIns />
  <Parameters>
    <Parameter>
      <Name>USERNAME</Name>
      <Value>$RunAs[Name="MSDL!Microsoft.SystemCenter.DataWarehouse.ActionAccount"]/UserName$</Value>
    </Parameter>
    <Parameter>
      <Name>PASSWORD</Name>
      <Value>$RunAs[Name="SC!Microsoft.SystemCenter.DatabaseWriteActionAccount"]/Password$</Value>
    </Parameter>
  </Parameters>
  <TimeoutSeconds>300</TimeoutSeconds>
  <StrictErrorHandling>true</StrictErrorHandling>
</ProbeAction>

 

Now we create a simple workflow, for example a task, and use this probe action.

Conclusion

 

You see, it's very simple to get the account information stored in the Run As accounts/profiles. Whether that is a good thing is up to you.

To make it even easier I created an MP that displays the most important account information (the usernames and passwords).

You simply import the MP, select the Management Server target, and run the special task "GetRunAsCredentials".

The account information will be displayed in the task output.

 

Download link for the Management Pack:

https://onedrive.live.com/redir?resid=A6ECD6E173E79D82!137890&authkey=!AEuYWi5Z6etHxno&ithint=file%2cxml

NOTICE: Please remember that the task output is stored in the SCOM databases, so it can be traced back later; not very secure, I think. So use this only in emergencies, or change the PowerShell script to write the output to a file!!

 

Happy SCOMMING!

Michel Kamp

Touching SCOM

https://michelkamp.wordpress.com