Code from my DevConnections Sessions

Thanks for a great week! Here’s a link to the code I used at IT DevConnections 2017.


Big thanks to my colleague Kaido Järvemets for his contributions, even though he wasn’t able to attend!


How To: Publish ConfigMgr Data to OMS

This post shows an example of querying data from ConfigMgr, converting it to JSON, and posting it to Microsoft Operations Management Suite (OMS) as a custom log. If you're not yet familiar with OMS, take a look at the OMS home page and Channel 9 for starters.

In this example, we’re going to use PowerShell to query collection evaluation data from ConfigMgr and post it to OMS. Here are the major steps:

  • Query ConfigMgr for the information we need, translate it (via SQL) to JSON-friendly formatting
  • Convert the query results to JSON
  • Use OMS API to post the information to OMS

Download the entire script from GitHub.

Query ConfigMgr

Here’s the query that will run against your primary site (note, only primary sites perform collection evaluation, so running this against a CAS returns zero results):

SELECT
'PS1' as [SiteCode],
[t0].[CollectionName] as [CollectionName],
[t0].[SiteID] as CollectionID,
[t1].[EvaluationLength] AS [RunTimeMS],
concat(convert(varchar(19), [t1].[LastRefreshTime], 127),'Z') as [LastEvaluationCompletionTime],
concat(convert(varchar(19), [t2].[NextRefreshTime], 127),'Z') as [NextEvaluationTime],
[t1].[MemberChanges] AS [MemberChanges],
concat(convert(varchar(19), [t1].[LastMemberChangeTime], 127),'Z')  AS [LastMemberChangeTime]
FROM [dbo].[Collections_G] AS [t0]
INNER JOIN [dbo].[Collections_L] AS [t1] ON [t0].[CollectionID] = [t1].[CollectionID]
INNER JOIN [dbo].[Collection_EvaluationAndCRCData] AS [t2] ON [t0].[CollectionID] = [t2].[CollectionID]

Notice I made some tweaks to the data returned. I added a SiteCode property because my environment has more than one primary site (luckily, most of you don't have to deal with this). I also modified the date-time values to append a Z for Zulu time. I recall Zulu time from my Marine Corps days, and as I was trying to figure out how to get proper date-times into OMS (via JSON), all roads led to Zulu time (more info here, and here). Luckily, the date-times used in the SQL query above are already in UTC, so we're only changing the formatting to make it JSON-friendly (note: I'm no JSON expert, so if you've identified better methods, please leave a comment). If you reuse this code for other queries and your times are in local time, convert them to UTC first.
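To make the "convert to UTC first" point concrete, here's a minimal sketch (in Python, just for illustration; the post's actual conversion happens in the SQL above) of producing the same `yyyy-MM-ddTHH:mm:ssZ` shape from a local time:

```python
from datetime import datetime, timezone, timedelta

def to_json_utc(dt: datetime) -> str:
    """Return an ISO 8601 string with a trailing 'Z', converting to UTC first.

    Naive datetimes are assumed to already be UTC (as in the SQL query above).
    """
    if dt.tzinfo is not None:
        dt = dt.astimezone(timezone.utc)
    return dt.replace(tzinfo=None).isoformat(timespec="seconds") + "Z"

# A local time (UTC-5) and the equivalent UTC time format to the same instant:
local = datetime(2017, 10, 26, 9, 30, 0, tzinfo=timezone(timedelta(hours=-5)))
print(to_json_utc(local))  # 2017-10-26T14:30:00Z
```

The key point: the `Z` suffix is only honest if the value really is UTC, so the conversion has to happen before the formatting.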

The query columns are largely self-describing; I'll call out RunTimeMS, which is the time in milliseconds it took to evaluate the collection. As you can see, I have some ugly ones that take 5 to 10 seconds to complete. The full script is posted below. To run this query against your environment, modify line 15 (the SQL connection string) as needed; I'm using CM_PS1 as my database name and MyDataBaseServer as the server name.

Convert the query results to JSON

Since we performed the conversions in our SQL query, converting to JSON is easy. You can see in lines 44-47 we use the ConvertTo-Json cmdlet and specify the $table variable (which contains the results of the query from ConfigMgr). Also notice the -Compress argument, which compresses the JSON by removing unnecessary whitespace.
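If you want to see what that compression does, here's a rough Python equivalent of ConvertTo-Json -Compress (illustrative only; the sample row values are made up): json.dumps with tight separators produces the same kind of whitespace-free output the OMS API receives.

```python
import json

# A made-up row shaped like the query results above
rows = [{
    "SiteCode": "PS1",
    "CollectionName": "All Systems",
    "CollectionID": "SMS00001",
    "RunTimeMS": 5320,
    "LastEvaluationCompletionTime": "2017-10-26T14:30:00Z",
}]

# separators=(",", ":") strips the spaces json.dumps would otherwise emit,
# much like ConvertTo-Json -Compress
compact = json.dumps(rows, separators=(",", ":"))
print(compact)
```

Compression matters less for readability than for payload size; the API is happy either way.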

Use OMS API to post the information

Now for the fun part – sending the data to OMS. You can easily set up a free OMS account – for this exercise, you don't need to connect any servers to send data to OMS – we'll simply use the API. Navigate to your new OMS workspace, select Settings, and then Connected Sources. From here, copy the Workspace ID and Primary Key for lines four and seven in the main script. These items authenticate your script when calling the API.
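Authentication works by signing each request: the script's Build-Signature function computes an HMAC-SHA256 over a string-to-sign, keyed with the Base64-decoded Primary Key. Here's the same idea sketched in Python (my reading of the Data Collector API doc, not the post's script; the workspace ID and key below are dummies):

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id: str, shared_key: str,
                    rfc1123_date: str, content_length: int) -> str:
    # The string-to-sign: method, body length, content type,
    # the x-ms-date header, and the resource path, newline-separated
    string_to_sign = (f"POST\n{content_length}\napplication/json\n"
                      f"x-ms-date:{rfc1123_date}\n/api/logs")
    # Key is the Base64-decoded Primary Key from the OMS workspace
    key = base64.b64decode(shared_key)
    digest = hmac.new(key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

dummy_key = base64.b64encode(b"not-a-real-primary-key").decode()
sig = build_signature("my-workspace-id", dummy_key,
                      "Thu, 26 Oct 2017 14:30:00 GMT", 128)
print(sig)
```

The resulting `SharedKey <WorkspaceId>:<signature>` value goes in the Authorization header, and the same x-ms-date value must be sent with the request, or the signature won't verify.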


Once you have updated the script (and you know you have proper rights to ConfigMgr, as well as the ability to access the internet from where you run the script), run it and cross your fingers for the success code of 200, as shown below. (You can review other return codes here).


Once you see the success code of 200, you'll be more than ready to view the OMS portal, but do yourself a favor and get a coffee first – in my experience, it may take an hour or so to see the information the first time data is posted (I'm guessing there's some processing that has to happen on the back end for the custom log type, indices, etc.). In fact, when you do start seeing the data, you may only see one field from the custom log type – if that's the case, wait longer, or open a private browser window to see the rest (there's some kind of caching going on).

Once you have waited that hour or two, navigate to Log Search in the OMS console and select "Count of all data collected group by type". You should then see CM_CollevalInfo_CL.


Click on CM_CollevalInfo_CL in the middle pane to display the details.


Now you see the magic. Notice the different property names, and the suffix _s (string), _d (double), _t (date/time) – view all suffixes here. The key is to get these items formatted properly, so that when you send the first run to OMS, the data types are created properly. You can see that I have a problem with NextEvaluationTime_s – it should have been a date/time, but was interpreted as a string. (Please send me your feedback if you find my issue). Once I figure this out, I’ll update this post.
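To make the suffix behavior concrete, here's a small Python illustration of how I understand the type inference to work on first ingestion (this is my assumption from the suffix list in the docs, not an official algorithm): strings land as _s, numbers as _d, booleans as _b, and strings that parse as ISO 8601 date-times as _t.

```python
import re

# Strict ISO 8601 UTC shape, matching what the SQL query above emits
ISO8601_Z = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z$")

def oms_suffix(value) -> str:
    """Guess the OMS custom-log field suffix a value would receive."""
    if isinstance(value, bool):          # bool before int: bool is an int subclass
        return "_b"
    if isinstance(value, (int, float)):
        return "_d"
    if isinstance(value, str) and ISO8601_Z.match(value):
        return "_t"
    return "_s"

print(oms_suffix(5320))                    # _d
print(oms_suffix("2017-10-26T14:30:00Z"))  # _t
print(oms_suffix("All Systems"))           # _s
```

This is also why my NextEvaluationTime_s problem matters: if the first batch contains a value that doesn't parse as a date-time, the field gets locked in as a string.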

Now that you have this information in OMS, you could use it for analysis, trigger an alert, or kick off an Azure Automation runbook.

The source code is posted below, but depending on your browser, it may not show syntax highlighting properly or make copy/paste easy. Download the entire script from GitHub.

For reference, spend some time reading the Log Analytics HTTP Data Collector API doc from Microsoft as well as Stefan Stranger’s Using the HTTP OMS Data Collector API for real-world scenarios Part 1 and Part 2.

Happy Scripting!


# Publish-CMData

# Replace with your Workspace ID
$CustomerId = "e4ff0038-49647-4c0c-49647-99a3c49647d5"  

# Replace with your Primary Key
$SharedKey = "CPfHRvey56FI3fzYi6C/mG1FxMt8RcsHkMgCNPWT/tPGGuEY8Tq63WvUaVokla3WVzicGLe00pcf+4t7b1aAMg=="

# Specify the name of the record type that you'll be creating
$LogType = "CM_CollevalInfo"

# Specify a field with the created time for the records
$TimeStampField = "LastEvaluationCompletionTime"

$cnstring = 'Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=CM_PS1;Data Source=MyDataBaseServer;User ID=;Password='

$cmdtext = @"
SELECT
    'PS1' AS [SiteCode],
    [t0].[CollectionName] AS [CollectionName],
    [t0].[SiteID] AS [CollectionID],
    [t1].[EvaluationLength] AS [RunTimeMS],
    concat(convert(varchar(19), [t1].[LastRefreshTime], 127),'Z') AS [LastEvaluationCompletionTime],
    concat(convert(varchar(19), [t2].[NextRefreshTime], 127),'Z') AS [NextEvaluationTime],
    [t1].[MemberChanges] AS [MemberChanges],
    concat(convert(varchar(19), [t1].[LastMemberChangeTime], 127),'Z') AS [LastMemberChangeTime]
FROM [dbo].[Collections_G] AS [t0]
INNER JOIN [dbo].[Collections_L] AS [t1] ON [t0].[CollectionID] = [t1].[CollectionID]
INNER JOIN [dbo].[Collection_EvaluationAndCRCData] AS [t2] ON [t0].[CollectionID] = [t2].[CollectionID]
"@

# Open the connection and load the query results into a DataTable
$cn = New-Object System.Data.OleDb.OleDbConnection $cnstring
$cn.Open()

$cmd = New-Object System.Data.OleDb.OleDbCommand $cmdtext, $cn
$cmd.CommandTimeout = 0
$reader = $cmd.ExecuteReader()

$table = New-Object System.Data.DataTable
$table.Load($reader)
$cn.Close()

# Convert the query results to compact JSON
$json = ConvertTo-Json -InputObject (
    $table | Select-Object SiteCode, CollectionName, CollectionID,
        RunTimeMS, LastEvaluationCompletionTime, NextEvaluationTime,
        MemberChanges, LastMemberChangeTime) -Compress

# Create the function to create the authorization signature
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource

    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)

    # HMAC-SHA256 over the string-to-sign, keyed with the decoded Primary Key
    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $customerId,$encodedHash
    return $authorization
}

# Create the function to create and post the request
Function Post-OMSData($customerId, $sharedKey, $body, $logType)
{
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = Build-Signature `
        -customerId $customerId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"

    $headers = @{
        "Authorization" = $signature;
        "Log-Type" = $logType;
        "x-ms-date" = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }

    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}


# Submit the data to the API endpoint
Post-OMSData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $logType  
