Windows Azure Pack doesn’t show Automation dashboard after updating SMA

Recently, I updated Service Management Automation (SMA) in our production environment from version 2016 UR7 to version 2019.

After the update, I had an issue with Windows Azure Pack: it couldn't display the dashboard for the Automation provider.

After some research, I found out that the root cause dated back several months. We have a lot of concurrent jobs running, and sometimes the SMA workers were overloaded and all new jobs were hanging in the queue. The solution I found on some other websites was to mark these queued jobs as failed inside the database and then restart the worker service:
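I no longer have the exact statement at hand, so here is only a sketch of the idea. The table and column names (Core.Jobs, JobStatus) and the service name are assumptions from memory; verify them against your own SMA database and server before running anything like this:

```powershell
# Sketch only: table/column/service names are assumptions - check your SMA schema first.
Import-Module SqlServer

# Mark all jobs that are still stuck in the queue (status 'New') as failed
Invoke-Sqlcmd -ServerInstance 'SQLSERVER\INSTANCE' -Database 'SMA' -Query @"
UPDATE Core.Jobs
SET    JobStatus = 'Failed'
WHERE  JobStatus = 'New'
"@

# Then restart the runbook worker service (name may differ on your installation)
Restart-Service -Name 'rbsvc'
```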

After setting these jobs to the failed state, I had to configure the runbook worker to start only one runbook per sandbox process in the file C:\Program Files\Microsoft System Center\Service Management Automation\Orchestrator.Settings.config:
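The file is plain XML. A sketch of what such a change looks like; the key name below is an assumption, so look for the matching entry in your own copy of Orchestrator.Settings.config rather than adding a new one blindly:

```xml
<!-- Sketch: the key name is an assumption; use the entry that already
     exists in your Orchestrator.Settings.config -->
<appSettings>
  <add key="MaxRunningJobs" value="1" />
</appSettings>
```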

The runbook worker could now be restarted and was working properly again. Stability was much better than before.

Getting Security Scopes of an SCCM folder

Since version 1906, SCCM supports role-based access control (RBAC) for folders. This can be configured through the SCCM console or (recommended) PowerShell.

Unfortunately, the current version of the cmdlet Get-CMObjectSecurityScope doesn't support a folder as InputObject. So, if you want to know which security scopes are allowed to see a folder, you have to use this:
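One way around the limitation is to query the SMS provider directly over WMI. A sketch, where the site code, site server and folder name are placeholders for your own values:

```powershell
# Sketch: queries the SMS provider directly, because Get-CMObjectSecurityScope
# does not accept a folder as InputObject. Replace the placeholder values.
$siteCode   = 'P01'
$siteServer = 'cm01.contoso.com'

# Find the folder (container node) by its name
$folder = Get-WmiObject -ComputerName $siteServer -Namespace "root\sms\site_$siteCode" `
    -Class SMS_ObjectContainerNode -Filter "Name = 'MyFolder'"

# List the security scope memberships for the folder's container ID
Get-WmiObject -ComputerName $siteServer -Namespace "root\sms\site_$siteCode" `
    -Class SMS_SecuredCategoryMembership -Filter "ObjectKey = '$($folder.ContainerNodeID)'"
```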

To add or remove security scopes on a folder, you can use these built-in cmdlets:
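For example (folder and scope names are placeholders, and Get-CMFolder may not be available in older console versions):

```powershell
# Sketch: placeholders for your own folder and scope names
$folder = Get-CMFolder -Name 'MyFolder'
$scope  = Get-CMSecurityScope -Name 'MyScope'

Add-CMObjectSecurityScope    -InputObject $folder -Scope $scope
Remove-CMObjectSecurityScope -InputObject $folder -Scope $scope
```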

Pester tests for PowerShell modules

If you regularly write PowerShell modules, you have to test the functions in them.

I wrote some Pester tests which run these checks for all functions in a PowerShell module:

  • the function has SYNOPSIS, DESCRIPTION and EXAMPLES
  • the function has CmdletBinding
  • the function has an OutputType defined
  • the function name starts with an approved verb
  • the function name has a common prefix
  • all parameters have a help text
  • all parameters have a type declaration
  • all variables inside the function use consistent upper/lower case
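A minimal sketch (Pester v4-style syntax, not my full test file) of how a few of these checks can look, for a hypothetical module named MyModule:

```powershell
# Sketch for a hypothetical module 'MyModule' (must already be imported)
$functions = Get-Command -Module MyModule -CommandType Function

Describe 'Module function conventions' {
    foreach ($function in $functions) {
        Context $function.Name {
            It 'starts with an approved verb' {
                $verb = ($function.Name -split '-')[0]
                (Get-Verb).Verb | Should -Contain $verb
            }
            It 'has a SYNOPSIS in its help' {
                (Get-Help $function.Name).Synopsis | Should -Not -BeNullOrEmpty
            }
            It 'supports CmdletBinding' {
                $function.CmdletBinding | Should -BeTrue
            }
        }
    }
}
```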

Pass parameter values to Invoke-CMScript

The Invoke-CMScript cmdlet invokes a PowerShell script in Microsoft System Center Configuration Manager. Configuration Manager has an integrated ability to run PowerShell scripts. The scripts simplify building custom tools to administer software and let you accomplish mundane tasks quickly, allowing you to get large jobs done more easily and more consistently. For more information, see Create and run PowerShell scripts from the Configuration Manager console.

Unfortunately, Invoke-CMScript doesn't allow you to pass values to defined script parameters. If you want to do this, you have to invoke the script through the Configuration Manager console.
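For illustration (the script GUID and device name are placeholders): the cmdlet can target a device or collection, but has no parameter for passing values to the script's own parameters:

```powershell
# Sketch: GUID and device name are placeholders.
# Invoke-CMScript targets a device or collection, but offers no
# parameter to supply values for the script's own param() block.
$device = Get-CMDevice -Name 'PC001'
Invoke-CMScript -ScriptGuid 'DF8E7546-FD66-4A3D-A129-53AF5AA54F80' -Device $device
```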

PowerShell and PSHTML in Azure Function App

PowerShell is now available in Azure Function Apps (still in preview).

You can create a PowerShell Function App like this:
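A sketch using the Az modules (resource names and location are placeholders; New-AzFunctionApp shipped after this post was written, and creating the app through the portal works just as well):

```powershell
# Sketch: all names/locations are placeholders
New-AzResourceGroup -Name 'rg-functions' -Location 'westeurope'
New-AzStorageAccount -ResourceGroupName 'rg-functions' -Name 'stfuncdemo01' `
    -Location 'westeurope' -SkuName 'Standard_LRS'
New-AzFunctionApp -ResourceGroupName 'rg-functions' -Name 'func-pshtml-demo' `
    -StorageAccountName 'stfuncdemo01' -Location 'westeurope' `
    -Runtime PowerShell -OSType Windows -FunctionsVersion 4
```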

Using PowerShell modules

Currently, you can't install a PowerShell module inside an Azure Function App, but you can upload your module to the script directory and access it from there. To do this, use Kudu:

Browse to site / wwwroot:

Then you can drag and drop your module ZIP file onto the right pane of the explorer.

The file will be uploaded and unpacked automatically.

Now you can reference this module directly from your function script.
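For example, from inside the function script, where 'MyModule' is a placeholder for the uploaded module folder:

```powershell
# Sketch: 'MyModule' stands for the folder you uploaded next to the
# function script under wwwroot
Import-Module "$PSScriptRoot\MyModule\MyModule.psd1"
```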

Getting info about a user's password expiration

I work in a multi-domain environment. Each domain has different password expiration rules. Unfortunately, there is no notification system for password expiration, so I have to check manually how long my passwords are valid.

For this, I wrote this PowerShell function, which works without any additional module:
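A sketch of the approach (not necessarily the original function), using only ADSI from the base framework, so no AD module is required. The domain and user name default to the current session and can be overridden per domain:

```powershell
# Sketch: reads the computed password expiry time via an ADSI search,
# without the ActiveDirectory module. Defaults are the current user/domain.
function Get-PasswordExpirationDate {
    [CmdletBinding()]
    [OutputType([datetime])]
    param(
        [string]$DomainDns = $env:USERDNSDOMAIN,
        [string]$SamAccountName = $env:USERNAME
    )
    $searcher = [ADSISearcher]"(&(objectCategory=user)(sAMAccountName=$SamAccountName))"
    $searcher.SearchRoot = [ADSI]"LDAP://$DomainDns"
    [void]$searcher.PropertiesToLoad.Add('msDS-UserPasswordExpiryTimeComputed')
    $result = $searcher.FindOne()
    # The attribute is a FILETIME value; convert it to a local DateTime
    [datetime]::FromFileTime(
        [int64]$result.Properties['msds-userpasswordexpirytimecomputed'][0])
}

# Example: Get-PasswordExpirationDate -DomainDns 'corp.contoso.com' -SamAccountName 'jdoe'
```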

The result of this script looks like this:


Reclaim disk space on zero-detection storage when deleting VMware VMs

Some time ago, I had the problem that my 3PAR storage was getting full. At that time I removed a lot of test VMs from this storage, but nothing happened; the storage was still full. The reason was the mechanism VMware uses to delete files from a datastore, combined with the activated zero-detection feature on the storage. If you delete a virtual disk file on a VMware datastore, it is only marked as deleted, but the data is still there in the same format. To make the storage's zero detection work, we have to zero out the deleted part of the datastore manually.

You can do this with this script:
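The original script isn't reproduced here, but the core idea in ESXi shell commands is simple (the datastore path is a placeholder): write zeros into a temporary file on the datastore, then delete the file. Be careful in production: filling a datastore completely can pause running VMs, so leave headroom or limit the file size.

```shell
# Sketch: run in an ESXi shell (SSH) session.
# /vmfs/volumes/datastore1 is a placeholder for your datastore path.
DS=/vmfs/volumes/datastore1

# Write zeros into a temp file; limit the size (here 10 GiB) to avoid
# filling the datastore completely. dd exits non-zero if space runs out.
dd if=/dev/zero of="$DS/zeroes.tmp" bs=1M count=10240 || true

# Remove the zero file again - the array's zero detection has now seen
# the freed blocks as zeros and can reclaim them.
rm -f "$DS/zeroes.tmp"
```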


How to get the correct virtual disk for a VMware VM

Sometimes I have the problem that I need to resize or delete a VMware virtual disk, but I only know the guest's drive letter. In VMs with only one virtual disk, or where each virtual disk has a different size, this isn't a problem. But if you have a VM with multiple virtual disks of exactly the same size, you can't match them between the guest's Disk Manager and the virtual disk sizes. If your VM has more than one SCSI controller, the problem gets even worse.

    Windows Disk Manager VMware VM settings

I searched a long time for a solution to this problem, but I couldn't find an easy one, so I wrote this PowerShell script:

When you run the script, it will ask you for credentials and then show you the information about both your virtual disks and Windows drives:
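The matching idea can be sketched with PowerCLI (this is not the full script; vCenter and VM names are placeholders): correlate the SCSI bus and target numbers on the VMware side with what the guest reports for its physical disks.

```powershell
# Sketch with PowerCLI: list each virtual disk with its SCSI bus/target,
# so it can be matched against the guest's disks. Names are placeholders.
Connect-VIServer -Server 'vcenter.contoso.com'
$vm = Get-VM -Name 'MyVM'

foreach ($disk in Get-HardDisk -VM $vm) {
    $controller = Get-ScsiController -HardDisk $disk
    [pscustomobject]@{
        Name       = $disk.Name
        Filename   = $disk.Filename
        CapacityGB = $disk.CapacityGB
        SCSIBus    = $controller.ExtensionData.BusNumber
        SCSITarget = $disk.ExtensionData.UnitNumber
    }
}

# Compare SCSIBus/SCSITarget with the guest side, e.g. Get-Disk or
# Win32_DiskDrive (SCSIPort / SCSITargetId), to identify the right disk.
```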

    Azure Automation Hybrid Worker behind a Firewall / Proxy

One nice feature of Azure Automation is the Hybrid Worker. With the Hybrid Worker you can execute runbooks inside your on-premises infrastructure. According to the official documentation and John Hennen's post, you have to open your firewall for outbound traffic to * on ports 443, 9354 and 30000-30199.

    Azure Automation Hybrid Worker Traffic

When I told our security team about this requirement, they weren't very enthusiastic about the wildcard rule for *, so I had to look for another solution.

To configure the Microsoft Monitoring Agent to use your proxy, go to Control Panel → System and Security → Microsoft Monitoring Agent and open the Proxy Settings tab:


After configuring the proxy settings, you can configure the Workspace ID and key:
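The same workspace settings can also be applied from PowerShell through the agent's COM object (workspace ID and key are placeholders):

```powershell
# Sketch: attach the agent to a Log Analytics workspace via its COM object.
# Workspace ID and key are placeholders for your own values.
$mma = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg'
$mma.AddCloudWorkspace('<workspace id>', '<workspace key>')
$mma.ReloadConfiguration()
```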

Now you have to run one job on the new Hybrid Worker, which will fail because additional firewall exceptions are still needed.

On the Hybrid Worker server, in the directory %AllUsersProfile%\Microsoft\System Center\Orchestrator\7.2\SMA\Sandboxes\, you will find a subdirectory for each runbook job (for example 5hrotqyb.mz5). Inside this directory there is one file with the file extension *.SandboxID. Open this file and you will find the value "sandboxHubEndpointDetail", which contains the server URL. In my case net.tcp://
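Instead of opening the file by hand, a short snippet can pull the value out of the newest *.SandboxID file:

```powershell
# Sketch: find the sandboxHubEndpointDetail value in the most recent
# *.SandboxID file, which reveals the exact endpoint to allow.
$sandboxes = "$env:AllUsersProfile\Microsoft\System Center\Orchestrator\7.2\SMA\Sandboxes"
Get-ChildItem -Path $sandboxes -Recurse -Filter '*.SandboxID' |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1 |
    Select-String -Pattern 'sandboxHubEndpointDetail'
```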

Now you should create an outbound firewall rule like this:

Source: your internal server IP
Destination URL:
Protocol: TCP
Destination ports: 9354 and 30000-30199


Daily backup report for SC DPM

I like System Center Data Protection Manager, especially for its ability to back up remote Windows servers and do online backup to Azure. But in the past, I lost the overview of succeeded and failed backup jobs. The included reporting didn't help me enough, and the alert notifications felt like spam. I needed a daily report with one view of all jobs, disks, agents and other states. To reach this goal, I wrote my own PowerShell script, which I want to share here.

The report, sent by mail, will look like this:

    SC DPM daily report

The code for this is here:
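The full script isn't reproduced here; as a sketch of the overall shape (not the original code, and all server and mail values are placeholders), it collects states with the DPM PowerShell module and mails them as an HTML table:

```powershell
# Sketch, not the original script: gather protection group/datasource info
# with the DPM module and send it as a simple HTML report.
Import-Module DataProtectionManager
Connect-DPMServer -DPMServerName 'dpm01.contoso.com' | Out-Null

$rows = foreach ($pg in Get-DPMProtectionGroup -DPMServerName 'dpm01.contoso.com') {
    foreach ($ds in Get-DPMDatasource -ProtectionGroup $pg) {
        [pscustomobject]@{
            ProtectionGroup = $pg.FriendlyName
            Datasource      = $ds.Name
            Computer        = $ds.Computer
        }
    }
}

$body = $rows | ConvertTo-Html -Title 'DPM daily report' | Out-String
Send-MailMessage -To 'admins@contoso.com' -From 'dpm@contoso.com' `
    -Subject 'DPM daily report' -Body $body -BodyAsHtml -SmtpServer 'smtp.contoso.com'
```

Scheduled as a daily task on the DPM server, this gives one mail with everything in a single view.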