Category Archives: Microsoft

Windows Firewall Auditing with Operations Management Suite Part 2

While I was writing the previous blog post on this subject I remembered another small tip for Windows Firewall auditing that I had forgotten to mention. You can easily gather log data about Windows Firewall rule changes by adding the following log:

  • Microsoft-Windows-Windows Firewall With Advanced Security/Firewall
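Once events from that log start flowing in, a legacy OMS search along these lines will summarize the rule-change events (treat the exact query as a sketch; the field names are the standard ones for Windows event records in OMS):

```
Type=Event EventLog="Microsoft-Windows-Windows Firewall With Advanced Security/Firewall" | measure count() by EventID
```

Event IDs such as 2004 (rule added), 2005 (rule modified) and 2006 (rule deleted) are the ones this log typically raises.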


That way, when someone adds, removes or modifies Windows Firewall rules, the events will show up in OMS and you can audit them:


Have fun analyzing logs.

Windows Firewall Auditing with Operations Management Suite

I was browsing through Operations Management Suite and in the Security and Audit solution I noticed something new: a tile with the text “Distinct IP Addresses Accessed”.


When I first saw that tile my number was 0. Clicking on the tile led me to the following query:

Type=WindowsFirewall CommunicationDirection=SEND | measure count() by RemoteIP

This hinted that the information was not coming from the Security event log. Logging on to a server where I have the Microsoft Monitoring Agent installed, I was able to find the management pack that gathers that log:


This also showed me where those events are taken from. After a quick search on the Internet I found how to enable those logs with Group Policy. Create a new group policy or use an existing one, and edit it. Go to Computer Configuration –> Policies –> Windows Settings –> Security Settings –> Windows Firewall with Advanced Security –> Windows Firewall with Advanced Security. On that page you will see a link Windows Firewall Properties:


Clicking on it will allow you to configure logging for every Windows Firewall Profile – Domain, Private and Public.


When you click Customize you can configure the location of the logs, the maximum log size, and whether dropped packets and/or successful connections should be logged.


You can leave the location unconfigured so the default one is used, which is what we need. I lower the size limit because OMS picks up only the old, non-active logs, so smaller logs roll over sooner. I also enable logging of dropped packets and successful connections.

You can enable the same settings on a specific profile or on all Windows Firewall profiles.
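If you prefer scripting over Group Policy, the same logging settings can be applied with the built-in NetSecurity module (Windows 8 / Server 2012 or later); the size value here is just illustrative:

```powershell
# Enable firewall logging on all three profiles.
# Leave -LogFileName untouched so the default location is used (that is what OMS reads).
Set-NetFirewallProfile -Profile Domain,Private,Public `
    -LogAllowed True `
    -LogBlocked True `
    -LogMaxSizeKilobytes 4096   # smaller log rolls over sooner, so OMS sees closed logs faster
```

This is a sketch of the equivalent local configuration; for a fleet of servers the Group Policy route above remains the practical choice.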

After enabling this policy on the servers of your choice, you will start to see that tile populated, and of course clicking the tile executes a query and shows the results:


Hope this will be helpful to you in getting this data into OMS.

Auditing PowerShell with Operations Management Suite

It has been a long time since I last blogged about my new love, Operations Management Suite. This blog post will show you how easily you can audit all PowerShell commands executed in your environment with Log Analytics in Operations Management Suite.

I love PowerShell and I think the GUI should be only for discoverability; we (IT Pros) should work with PowerShell even if it is hard in the beginning. One advantage of using PowerShell is that it gives you universal auditing, no matter whether you use Microsoft products or third-party ones. We can easily create a group policy that logs every PowerShell cmdlet that is executed. I just need to open the Group Policy Management Console in my AD, create a new policy, and under Computer Configuration –> Policies –> Administrative Templates –> Windows Components –> Windows PowerShell configure Turn on Module Logging like this:


You can enable logging per module. In my case I am including all PowerShell modules.

When you configure such a policy for all your servers, including domain controllers, all PowerShell commands will be logged.
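Group Policy ultimately just writes registry values, so for a quick test on a single machine you can set the same Module Logging keys directly (a sketch of the policy registry layout; run elevated):

```powershell
# Policy keys behind "Turn on Module Logging"
$base = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ModuleLogging'
New-Item -Path "$base\ModuleNames" -Force | Out-Null

# Enable module logging...
New-ItemProperty -Path $base -Name EnableModuleLogging -Value 1 -PropertyType DWord -Force | Out-Null

# ...for all modules (value name and data are both "*")
New-ItemProperty -Path "$base\ModuleNames" -Name '*' -Value '*' -PropertyType String -Force | Out-Null
```

A domain policy will overwrite these values at the next refresh, so use this only for testing outside of Group Policy management.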

If you have servers that are not in a domain, you can easily use other technologies, such as DSC, to configure those logs.

After that you can open the Operations Management Suite portal. Go to the Settings tile –> Logs tab and add the Microsoft-Windows-PowerShell/Operational log:


After that, just wait until the logs start being sent to MSOMS.

Finding who used a command like Restart-Computer is as simple as executing the following query:
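With module logging enabled, the pipeline details land in the Microsoft-Windows-PowerShell/Operational log as event ID 4103, so a legacy OMS search along these lines does the trick (the exact field names are an assumption on my part):

```
Type=Event EventLog="Microsoft-Windows-PowerShell/Operational" EventID=4103 "Restart-Computer"
```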


For me this is a very powerful scenario that you can use in your environment. Imagine even more scenarios once we have the option of adding custom fields, which was announced as coming at Ignite.

My Webinar on Log Analytics in Operations Management Suite

Yesterday I did a webinar for my company Lumagate on Azure Operational Insights, which is now called Operations Management Suite.

If you are interested in viewing this webinar, you can find it in Lumagate’s content library.

Azure Operational Insights Becomes Microsoft Operations Management Suite and Gains Some Features

Yes, I know, another change in the name. Let’s hope this will be the last one. Microsoft Operations Management Suite actually includes four services:

  • Operational Insights
  • Backup (Azure Backup)
  • Site Recovery (Azure Site Recovery)
  • Automation (Azure Automation)

Note that the portal for this new service is the Operational Insights portal, and that Azure is no longer in the name of the service.

Now, before showing what is new in Microsoft Operations Management Suite (Operational Insights), I would also like to note that you will manage the services of this stack in different portals:

I personally find this confusing but wanted to share it with you.

So let’s see what is new.

First, the Microsoft Operations Management Suite portal UI is slightly changed:


Intelligence Packs are now called Solutions. The probable reason is that they will no longer represent only Intelligence Packs but more.


There are three new solutions representing the other services besides Operational Insights: Automation, Backup and Azure Site Recovery. All of these solutions are currently very basic and provide a small integration surface. Let’s look at each of them.

Automation Solution:


This solution allows you to connect your Azure Automation account and shows some basic stats from it. There are a lot of links that take you to the Azure portal to manage Azure Automation there. This solution also serves as the deployment mechanism for the Azure Automation Hybrid Worker, which I’ve shown in a previous post. And that seems to be the whole integration story for now.



Backup Solution:

Like Automation, the Backup solution offers similar integration. You can connect an Azure Backup vault and see some basic stats. There are links to go and manage Azure Backup in the Azure portal.

Azure Site Recovery Solution:


This solution seems better than the two above, presents its data in a richer way, and again links to managing Azure Site Recovery in the Azure portal.

As I see it currently, I do not think any of those three solutions actually stores data in Operational Insights; they basically get the data from the other services directly.

Of course, some integrations between these services already exist: you can do some automation for Site Recovery or Backup, and Site Recovery can leverage runbooks in Azure Automation.

There are some other changes as well. Settings is slightly changed:


There are some things advertised as coming soon, such as gathering logs from Linux servers through a Linux agent and getting logs from AWS Storage.


Usage is also changed and now looks more useful:


And at last you can actually get syslog data from Linux machines located in Azure without an agent. You can do that by going to the Azure portal and adding a storage account as a data source. You will see that there is a new data type called Syslog (linux):


The bad news is that I couldn’t find any information on how to enable Diagnostics in Azure for Linux VMs so that this table gets populated. I hope documentation on that will come soon.

Azure Automation – Hybrid Worker Setup

It is May 4th, which marks the start of Microsoft Ignite 2015, and of course new Azure features are starting to appear. First, Azure Automation is now available in the preview portal:


This new experience also gives you the option to create graphical runbooks:


Although this looks great, I think I will still prefer writing text runbooks (PowerShell), but it will be useful for folks less experienced with PowerShell.

Now let’s move on to the more interesting feature, and that is the Hybrid Worker. The Hybrid Worker lets us execute runbooks on an on-prem machine instead of in Azure. If you click on that tile and choose configure, you will see how to enable it:


So, in short, Azure Automation uses Azure Operational Insights to deploy the Hybrid Worker bits to the on-prem machine. After that you log on locally to the machine and execute a cmdlet in order to connect the Hybrid Worker to your Azure Automation subscription. It is still unclear to me whether the Hybrid Worker will use the OpInsights proxy if you do not have a direct connection to the Internet.

Let’s see how we can do all these steps.

First I go to my Operational Insights workspace and add the Automation Intelligence Pack:


After that the bits needed for the Hybrid Worker will be deployed to all your directly attached agents:


What I’ve noticed is that those bits are not deployed to connected SCOM servers or to agents connected through SCOM, so you should use directly connected OpInsights agents.

After that you can log on to a machine with a direct agent and execute a cmdlet to connect that machine to Azure Automation, basically promoting it to a Hybrid Worker.

The cmdlet in question is Add-HybridRunbookWorker, and we first need to find which module it is in. After some digging I found that all Azure Automation Hybrid Worker bits are located in the following directory:

C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomationFiles


and in the HybridRegistration folder you will find a module that you can import:

Import-Module "C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomationFiles\HybridRegistration\HybridRegistration.psd1"


After that you can see what cmdlets are available:

Get-Command -Module HybridRegistration


Looking at the help of Add-HybridRunbookWorker we can see which parameters are available:


It seems Endpoint, Token and Name are mandatory, but where can I find those? Two of them are easy to find in the Azure Automation account information. Just go to your Automation account’s main page; at the top there is a key icon you can click, which will give you the Primary Access Key, Secondary Access Key and URL.


Your Primary Access Key is the Token parameter, the URL is the Endpoint parameter, and for the Name parameter you can choose whatever makes sense to you:
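Putting the pieces together, the registration on the server boils down to two commands. The URL and key below are placeholders – take the real values from the key icon on your Automation account page – and the parameter names follow the cmdlet help described above:

```powershell
# Load the registration module that ships with the Microsoft Monitoring Agent
Import-Module "C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomationFiles\HybridRegistration\HybridRegistration.psd1"

# Register this machine as a Hybrid Worker.
# Endpoint = the URL, Token = the Primary Access Key from the Automation account.
Add-HybridRunbookWorker -Name "DC01" `
    -Endpoint "<URL from your Automation account>" `
    -Token "<Primary Access Key from your Automation account>"
```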



After that you should immediately see your Hybrid Worker in the portal:


Now you can create a simple runbook like this for testing:

workflow TestHybrid
{
    Write-Output $env:COMPUTERNAME
}

If you run it and choose Azure:


Output will be “Client”:


But if you execute it on your Hybrid Worker:


The output will be “DC01”, which is the name of the server I added as the Hybrid Worker:


When you execute a runbook on a Hybrid Worker, you will see the process Orchestrator.Sandbox being started:


So, a few takeaways from me on this:

  • This feature currently seems to work only with the OpInsights direct agent and not with SCOM-connected servers.
  • It is not clear whether the Hybrid Worker uses the proxy of the OpInsights direct agent if one is configured.
  • Executing runbooks in Azure looks a little bit faster.
  • It seems you currently cannot schedule runbooks to be executed on a Hybrid Worker.
  • You cannot test runbooks on a Hybrid Worker; you have to publish the runbook and execute it.
  • It seems you are billed the same way whether you execute runbooks in Azure or on a Hybrid Worker.
  • The tile for Automation in OpInsights is currently empty and unclickable.
  • PowerShell modules uploaded to Azure Automation are not distributed to Hybrid Workers.
  • If you register Hybrid Workers on more than one server with the same Name, they will work in high-availability/load-balancing mode. I am not sure what happens if a node is down.

Microsoft Azure Operational Insights Preview Series – General Availability (Part 18)

Previously on Microsoft Azure Operational Insights Preview Series:

This will be the last blog post in this series, simply because I’ve received an e-mail that Azure Operational Insights will become generally available on May 4th. This date matches my prediction that it would happen around the Microsoft Ignite conference. I will still continue to write blog posts on the service, but not as part of a series. Here is the official Azure e-mail we all received about the service:

I hope this series was helpful to you.