Category Archives: System Center Configuration Manager

Looking back at 2015…

So, the year 2015 is almost at its end. While I write this, I am already in the second week of my two-week time off. And boy, I really needed this break.

2015 was an extremely busy year for me, and I can roughly split it into two halves.

In the first half, I was still busy with a project where I designed and deployed System Center 2012 R2 Configuration Manager. I also built a stand-alone image building environment running MDT 2013. Unfortunately, the project took way longer than expected due to the customer being unable to take ownership and start administering the environment themselves. Eventually I decided to walk away after the contractual end date of my involvement, despite the fact that the project wasn't finished yet. The longer it took, the more frustrating the project became for me, so walking away was eventually the right decision.

This takes me to the second half, in which I saw a dramatic shift in my job: I did only one Configuration Manager design and deployment in the second half of 2015. Instead, I started to extend my skill set in Enterprise Client Management with Microsoft Intune and Microsoft's public cloud platform: Azure.

I also started to deliver more workshops, master classes and training sessions. This is something I really like to do, and I want to thank those who made it possible for me. It allowed me to renew my Microsoft Certified Trainer certification.

Fortunately, the frustrations of the first half provided some learning moments that pushed me to become a more complete consultant. My coworker arranged a two-day training session for me called "Professional Recommending" (this may be a poor translation of "Professioneel Adviseren" in Dutch), provided by Yearth. This is by far the most important training I have received in my career, and it started to pay off quickly: I received more positive feedback from customers. This training made me a more complete consultant.

I was also happy to do the presentation workshop with Monique Kerssens and Jinxiu Hu from Niqué Consultancy BV at ExpertsLive 2015, and happy to receive the feedback that my presentation skills have developed greatly. To quote them: "you're standing like a house" (a Dutch expression meaning you come across as solid and confident).

The icing on the cake came at the end of this year when I was asked to review the DataON CiB-9224 platform. You can read the review in my previous post.

So, I experienced some highs and lows this year. Fortunately, the highs came at the second half.

I look forward to 2016, but that’s for another post…



Investigating a ConfigMgr TP3 deployment workflow

Recently, Microsoft released System Center Technical Preview 3 for testing purposes. I found some time to investigate how to install it. The main focus of my investigation was to determine whether the deployment workflow differs from ConfigMgr 2012 SP2/R2 SP1.

It also allowed me to determine whether it would make sense to host a site server on Microsoft Azure. Except for PXE- and/or multicast-enabled distribution points, hosting the site server there would make sense.

To determine if the workflow is different or not, I used the following setup (I have limited Azure credits so I have to keep my resources low):

  • 1 Azure Cloud Service
  • 1 VNet
  • 1 A1 Azure VM configured as DC and local DNS
  • 1 A5 Azure VM configured as ConfigMgr TP3 site server; two additional virtual disks of 512 GB were added and combined into a single storage pool, on which a striped virtual disk was created to get some more IOPS
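The storage pool and striped disk setup above can be sketched with the built-in Storage cmdlets on Windows Server 2012 R2. This is a rough sketch only; the pool and disk friendly names are my own assumptions, not part of the original setup:

```powershell
# Pool all poolable data disks and create a striped (Simple) virtual disk
# for the site server data volume. Names are hypothetical.
$disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName "CMPool" `
    -StorageSubSystemFriendlyName "Storage Spaces*" `
    -PhysicalDisks $disks
New-VirtualDisk -StoragePoolFriendlyName "CMPool" -FriendlyName "CMData" `
    -ResiliencySettingName Simple -UseMaximumSize |
    Initialize-Disk -PassThru |
    New-Partition -AssignDriveLetter -UseMaximumSize |
    Format-Volume -FileSystem NTFS
```

A Simple (striped) layout trades redundancy for IOPS, which is acceptable here since Azure storage already provides durability underneath.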

Both Azure VM machines run Windows Server 2012 R2.

I use SQL Server 2014 SP1 Standard Edition for hosting the site database. I could have used a gallery image with SQL Server preinstalled, but I decided to install SQL Server 2014 SP1 manually instead.

NOTE: I made an attempt to use Windows Server 2016 Technical Preview machines, but the performance was quite poor, so I decided to go back to Windows Server 2012 R2 instead.

For ConfigMgr 2012 deployments, I use the following workflow:

  1. Install Roles & Features
  2. Install an SQL instance
  3. Install Windows 10 ADK
  4. Extend AD Schema and configure delegation
  5. Configure WSUS
  6. Install ConfigMgr
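Step 1 of the workflow above can be scripted. The sketch below is a hedged example of the Roles & Features part only; the exact feature list is an assumption and depends on the site system roles you plan to install:

```powershell
# Install common ConfigMgr site server prerequisites (illustrative list,
# not exhaustive): IIS with WMI/metabase compatibility, .NET, BITS, RDC
# and WSUS for the Software Update Point.
Install-WindowsFeature Web-Server, Web-WMI, Web-Metabase, `
    NET-Framework-Core, NET-Framework-45-Core, `
    BITS, RDC, UpdateServices-Services, UpdateServices-DB `
    -IncludeManagementTools
```

Running this once on a freshly provisioned VM keeps the manual part of the workflow limited to SQL Server, the ADK, the schema extension and ConfigMgr setup itself.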

I followed the workflow displayed above for deploying ConfigMgr TP3 and, not surprisingly, the result is the same. I also noticed that the site server is perfectly happy to run on the Azure platform. For distribution points, I'd suggest using an on-premises machine for distributing content…




Possible workaround for capturing a Windows 10 reference image with MDT 2013 Update 1

As most of us should know by now, Microsoft released Microsoft Deployment Toolkit 2013 Update 1; see the announcement at

The main improvements are support for Windows 10 and integration with System Center 2012 Configuration Manager SP2/R2 SP1. Unfortunately, this release has quite a lot of issues that make it either very difficult or impossible to properly capture a reference image. A list of known issues is available at

The issue that bothers me the most is the following, and I quote:

Do not upgrade from Preview to RTM

MDT 2013 Update 1 Preview should be uninstalled before installing the final MDT 2013 Update 1. Do not attempt to upgrade a preview installation or deployment share. Although the product documentation is not updated for MDT 2013 Update 1, the information on upgrading an installation still holds true.

Being a consultant requires me to be an early adopter, testing new stuff so I am ready when it is released, which means working with Preview versions of various software. Also, as an IT pro with an isolated environment available purely for image building purposes, I need to upgrade my deployment share frequently. While I can automate building new deployment shares, that takes time away from researching and testing these new technologies, so I don't have much choice other than upgrading my deployment share. I must admit that releasing this technology with so many known issues seems quite sloppy to me. I can only assume that various scenarios were not tested thoroughly due to time constraints, and that this version was released under considerable pressure.

Trying to build and capture a Windows 10 reference image fails: the capturing itself fails with an error message that a certain script cannot be loaded. The MDT 2013 U1 environment I currently have is for image building purposes only, so I don't have that many customizations configured.

So, knowing that the capturing itself fails, I can do the capturing part myself. Since image building is not something I expect you to do every day, the amount of administrative effort increases just a little bit, but it's quite easy to do.

First, we start a deployment using the Windows Deployment Wizard. After selecting my Build and Capture Windows 10 Task Sequence I get the option to select how I want to capture an image.


I choose not to capture an image by selecting the option Do not capture an image of this computer. This makes the deployment run normally and finish without doing anything afterwards. I do use the option FinishAction=REBOOT in my customsettings.ini to make sure the machine restarts after completion.
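For reference, the relevant customsettings.ini entry might look like this; a minimal sketch, assuming the property lives in the [Default] section as it does in my share:

```ini
[Default]
FinishAction=REBOOT
```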

The next step is logging on with the local Administrator account to Sysprep the machine by running the sysprep.exe /oobe /generalize /shutdown command.


Here we see Sysprep is in progress. After a short while the machine is turned off.

Now the machine is started again using the LiteTouch boot media (in my case via WDS); wait until the deployment wizard is started once more. The reason I do this is that my deployment share is then available and accessible via the automatically mapped Z: drive. Pressing F8 opens the command prompt.

All I need to do is start capturing an image using DISM, which may look like the screenshot below (hmmm, makes me wonder why I chose that filename).
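The DISM invocation might look like the sketch below. The image file name is hypothetical, and the captured volume may be mapped to a different drive letter inside WinPE than it has in the full OS, so check with diskpart first:

```
Dism.exe /Capture-Image /ImageFile:Z:\Captures\W10REF.wim /CaptureDir:C:\ /Name:"Windows 10 Reference"
```

Writing straight to the Captures folder on the mapped Z: drive means the result lands exactly where the Task Sequence would have put it.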


Now the capture can start.


After a while the capture completes and a captured Windows 10 image is available in the Captures folder of the deployment share in use. This image can be used for deployment by MDT 2013 U1, System Center 2012 Configuration Manager SP2/R2 SP1, or whatever tool is used for deploying .wim files.

Basically, the workaround consists of replacing the image capturing part with manual labour. I'm sure other workarounds are available, but this one works for me. The image capture should take less than 72 hours, since that is the maximum time a WinPE session is allowed to run; once the 72 hours are up, WinPE automatically restarts the computer. This should be more than enough time to have the image file created.

Feel free to use this workaround. As usual, testing is required before using it in a production environment.

Let's hope an updated release will have all these issues solved; the sooner the better…




Thoughts on enabling Microsoft Antimalware extension on Azure virtual machines…

Recently, I was investigating managing the Microsoft Antimalware extension on Azure virtual machines.

As we all know, the Microsoft Antimalware extension can be enabled when creating a new Azure virtual machine in the Azure portal. While the extension can be enabled there, only the default settings are applied. This might work in most scenarios, but company policy may require customization; this may extend to customizing the extension for specific server roles or even desktops.

It became clear that the only way to customize the configuration is using Azure PowerShell.

NOTE: More customization is also possible in the 'new' portal available at . At the time of writing, this portal is still in Preview, so it is not supported.

After checking out the cmdlet reference for Azure, I found the Set-AzureVMMicrosoftAntimalwareExtension cmdlet. More information on this cmdlet is available at

After reading the article, I noticed that .json files can be used to provision a configuration for the extension. This brings a new challenge: what configuration should be in the .json file for a specific server role?

If an existing System Center Configuration Manager 2012 or newer infrastructure is available and the Endpoint Protection Point is enabled and used, then either existing configurations or the Endpoint Protection templates can be used. The trick is to read a template and ‘translate’ it into a .json file.

I decided to use the Domain Controller template as a reference. After analyzing the template .xml file, the resulting .json may look like this:

{
  "AntimalwareEnabled": true,
  "RealtimeProtectionEnabled": true,
  "ScheduledScanSettings": {
    "isEnabled": true,
    "day": 7,
    "time": 120,
    "scanType": "Full"
  },
  "Exclusions": {
    "Extensions": ".pol;.chk;.edb;.sdb;.pat",
    "Paths": "%systemroot%\\NTDS\\Ntds.dit;%systemroot%\\NTDS\\EDB*.log;%systemroot%\\NTDS\\Edbres*.jrs;%systemroot%\\SYSVOL\\domain\\DO_NOT_REMOVE_NtFrs_PreInstall_Directory\\;%systemroot%\\SYSVOL\\staging;%systemroot%\\SYSVOL\\staging areas;%systemroot%\\SYSVOL\\sysvol",
    "Processes": ""
  }
}

Keep in mind though that using wildcards in the .json file is not recommended by Microsoft as stated in the cmdlet reference page for the Set-AzureVMMicrosoftAntimalwareExtension cmdlet.

This method allows administrators to create multiple .json files for specific server roles and specify them when enabling the extension.
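Applying such a role-specific .json file might then look like the sketch below (classic Azure PowerShell). The cloud service, VM and file names are my own assumptions for illustration:

```powershell
# Fetch the VM object, enable the Antimalware extension with a
# role-specific configuration file, and persist the change.
$vm = Get-AzureVM -ServiceName "MyCloudService" -Name "DC01"
Set-AzureVMMicrosoftAntimalwareExtension -VM $vm `
    -AntimalwareConfigFile "C:\Configs\DomainController.json" |
    Update-AzureVM
```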

Feel free to use this method yourself. As always, try this out in a test environment or separate subscription used for testing purposes.

Hope this helps…


Using PowerShell for bulk Configuration Manager 2012 SP1 (or newer) Client installation…

You might be in the situation that the Configuration Manager client needs to be installed on many machines, but automatic client deployment using Client Push is not allowed or not preferred for whatever technical/financial/political reason. Browsing the console and selecting a lot of discovered machines can be a very frustrating exercise. It is also extremely error prone, especially when you already know which machines need a Configuration Manager client (they are most likely not deployed using OSD).

Fortunately, we can use PowerShell to have the Configuration Manager Client installed. What makes it even better is that a single cmdlet is needed to get the job done: Install-CMClient

You can find more info for the Install-CMClient cmdlet at the following location:

If you already know which machines need the Configuration Manager Client to be installed, then you can put them in a .csv file. This allows you to create a PowerShell script that reads each object and run the cmdlet for each object. A script to get the job done might look like this:
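A minimal input file might look like this; the header must match the property name used in the script, and the host names below are hypothetical:

```
CMHostName
CLIENT01.contoso.local
CLIENT02.contoso.local
```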

# Script name: Install_Client_Bulk.ps1
# Purpose: Installs the ConfigMgr client on multiple machines, uses .csv as input
# Author: Marc Westerink
# Reference:

# Create the required 'global' variables
$ConfigMgrModulePath = "D:\Program Files\Microsoft Configuration Manager\AdminConsole\bin\ConfigurationManager.psd1"
$SiteCode = "PS1" # Replace with your own three-character site code

# Connect to the site
Import-Module $ConfigMgrModulePath
Set-Location "$($SiteCode):"

# Initiate the client installation for each machine listed in the .csv file
Import-Csv E:\Install\CMHostName.csv | ForEach-Object {
    Install-CMClient -SiteCode $SiteCode -AlwaysInstallClient $True -IncludeDomainController $True -DeviceName $_.CMHostName
}

Running this script has one downside: since a lot of machines will be instructed to install the client, you will see a lot of entries in the ccm.log file, so monitoring progress might be challenging. I therefore recommend testing the script with only a few entries in the .csv file (as few as two) to verify everything works as expected…


An alternative guide for applying CU4 for ConfigMgr 2012 R2

Recently, Cumulative Update 4 for System Center 2012 R2 Configuration Manager was released. A release like this generally spawns a lot of tweets, blogs, tweets about blogs and their respective retweets, making it hard to miss. It also allowed me to check out which guides were written. While these guides are generally a good way to start, I did notice they have the following properties in common:

  • Packages are created containing the updates
  • Device collections are created to deploy them to
  • Optionally, the PATCH=<blabla fix.msp> is used in a task sequence

I try to achieve a matter of simplicity as much as possible and I also try to facilitate a ‘fire and forget’ mechanism as well. If System Center Updates Publisher (SCUP) 2011 is used, then deploying this update becomes much easier. The following location provides a guide that can be used to install SCUP 2011:

When SCUP 2011 is used, applying CU4 goes a little differently. Here are the steps, assuming the hotfix is downloaded and accessible by the site server. I used as a reference.

Before starting, verify a valid backup is available.

First, the update must be applied to the site server.

Log on the Configuration Manager site server with a user account that has administrative privileges to execute the update.


Start the .exe file to start installing the cumulative update.


Select Next.


Accept the License Terms and select Next.


The prerequisite check verifies that all prerequisites are met. In this screenshot, a restart is pending from a previous software installation. In this case, the installation may continue by pressing Next.


The installation is run on the site server, which also runs the Configuration Manager console. Make sure the checkbox is checked and press Next.


Select Yes, update the site database and press Next.


The distribution and deployment will be done using Software Updates, which means the packages are not going to be used. To prevent them from being created, uncheck each checkbox and press Next.


Review the summary and select Install.


Installation in progress…


After a few moments the installation is finished successfully. Select Next to continue.


Press Finish to close this window.

System Center Updates Publisher 2011 (SCUP) allows publishing updates to WSUS in conjunction with Configuration Manager that are not published by Microsoft Update. Catalogs are used to import these updates to SCUP 2011 which are then published to WSUS. The cumulative update contains a catalog that can be imported into SCUP 2011.

Start SCUP 2011, making sure to start the application using the Run as administrator option.


Select the Import button.


Browse to the catalog file (.cab) and press Next.


Press Next.


A security warning pops up. Because Microsoft delivers this hotfix, it is safe to trust the publisher by pressing the Accept button.


The updates are imported successfully. Select Close to close this window.


The updates are now visible in the Console. Select the updates and choose Assign.


The publication type must be Full Content. Create a new publication as displayed and press OK.


In the Publications tab, select the newly created publication and press the Publish button.


Press Next.


Review the summary and press Next.


Publishing in progress…


The updates are published successfully. Press Close to close the window.

All tasks in SCUP are completed and the SCUP console can be closed.


The final step is distributing the updates for deployment.

The following Software Updates synchronization should pick up the newly published updates. Review the wsyncmgr.log file to verify the updates are synchronized by Software Updates.


Yes, there they are…


It is recommended to have a separate Automatic Deployment Rule targeted at the All Desktop and Server Clients device collection. In this scenario, it is not scheduled to run automatically since it rarely has to run. Select the Run Now option to run the Automatic Deployment Rule.


Press OK to confirm.


Verify the newly published updates are part of the Deployment Package. Once present and distributed, Configuration Manager clients are able to download and install the updates automatically. The client itself will determine which particular updates are needed.

That’s it. Feel free to try this method in a test environment before applying this in a production environment.


Why Google software can be murder for your Release Management Process…

Recently, in one of my projects, I was involved with deploying some Google applications, such as Google Chrome and Google Drive, on Windows 8.1 clients. Company policy states that all applications (including the Google stuff) must be deployed automatically without any user intervention. Since System Center 2012 R2 Configuration Manager was used, the Application Model needed to be used.

From a technical perspective, this means that each application is configured with the following properties:

  • All applications are configured to be silently installed
  • A required deployment is configured to a User Collection
  • Supersedence is used when an application is replaced by either a new version or a completely different application

While this approach works fine for most applications, it doesn’t work for the Google software such as Google Chrome or Google Drive.

Here’s why:

The first Google application, let's say Google Chrome, automatically installs Google Update. Google Update frequently calls home to check if a new version is available. If so, it automatically downloads and installs the updated version and uninstalls the old one. But then the Configuration Manager client runs its required cycles: it notices that the deployed version is no longer detected, so it starts installing it again to meet the policy defined by the required deployment (you can clearly see this in the AppEnforce.log file). So the older version is installed again, and I've seen this break applications completely. Extremely annoying for service desk crew and admins…

While you can manage Google Update using Group Policy, not all Google applications can be configured to suppress Google Update.

From a management perspective, Google Update completely takes away control to have a functioning Release Management Process. While Google Update is great in consumer scenarios, it certainly isn’t in the corporate world.

When System Center 2012 R2 Configuration Manager is used, I only have the following technical workarounds available:

  • Deploy it as an available deployment; this requires user intervention and kills some automation
  • Deploy it as part of a Task Sequence; this gives the application to everyone and kills the ability to determine who is allowed to use it (you can create requirements etc. but it kills simplicity)
  • Deploy it as a Package instead of an Application; this takes away all the nice features the Application Model can give and takes me back to System Center Configuration Manager 2007. Not to mention, Packages should be used for some ‘stateless’ action such as scripts or registry keys
  • Publish it as an update using SCUP 2011; let’s not go there, the problem remains…

Since Google updates its software quite frequently, I don't see organizations keeping up with adding and superseding these applications. Imagine how much time one admin would need to commit to managing Google applications.

In other words: Google needs to take out Google Update and provide something more manageable for the corporate world…

