Category Archives: Microsoft Deployment Toolkit

ConfigMgr: a second attempt to REALLY liberate yourself from driver management…

In a previous post, I made an attempt to use Microsoft Update for downloading and installing all drivers during an Operating System Deployment task sequence with System Center Configuration Manager or Microsoft Deployment Toolkit. This approach works pretty well as long as hardware vendors use components whose drivers are published on Microsoft Update. It requires some testing, and if something is missing, alternative methods are available.

But how about maintaining drivers during normal operation? After all, since drivers are not managed in this scenario, the process of receiving updated drivers needs to continue. As we all know, System Center Configuration Manager doesn't support deploying drivers using Software Updates, since the update classification 'Drivers' is not available (it is in WSUS, though), so that's not an option.

Fortunately, since Windows 10 1607 a feature called Dual Scan is available, and it can be used in conjunction with Software Updates in System Center Configuration Manager. This allows organizations to use both sources for managing updates, so Microsoft Update can be used to update drivers.

The easiest way to do this is to deploy Windows Update for Business policies with System Center Configuration Manager (assuming Intune is not used). All that needs to be done is follow Microsoft's instructions for configuring these policies.

Within a policy, you can include drivers by checking the option 'Include drivers with Windows Update'. Roughly speaking, you can kiss driver management in System Center Configuration Manager goodbye.
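Under the hood, this checkbox corresponds to a Windows Update policy registry value. A minimal sketch (the key path and value name are the standard Windows Update policy location; verify them for your Windows build) to check on a client whether drivers are excluded:

```powershell
# Windows Update policy key (standard policy location)
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate'

# ExcludeWUDriversInQualityUpdate = 1 means drivers are NOT offered;
# 0 or absent means drivers are included with Windows Update.
$value = Get-ItemProperty -Path $key -Name ExcludeWUDriversInQualityUpdate -ErrorAction SilentlyContinue

if ($null -eq $value -or $value.ExcludeWUDriversInQualityUpdate -eq 0) {
    'Drivers are included with Windows Update'
} else {
    'Drivers are excluded from Windows Update'
}
```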

Despite the availability of good tools provided by vendors such as HP and Dell, managing drivers in System Center Configuration Manager is still a dreadful task. So this approach may reduce administrative effort dramatically…







ConfigMgr: An attempt to liberate yourself from managing drivers

This attempt may not be suitable for the faint-hearted, and it may be interpreted as if I'm dropping a bomb, but here it goes.

In all those years working with Configuration Manager, managing drivers for devices has remained a daunting task. It is time consuming and requires a lot of administrative effort, as well as storage. It is also difficult to explain to customers how to deal with it properly; it's just not my kind of fun…

With the release of Windows 10 and Microsoft’s approach with the semi-annual update channels it may make sense to reevaluate the daunting task of driver management.

Wouldn't it be great if it could be thrown out of the window (no pun intended) so you don't have to bother with it anymore?

Well, the answer is yes, provided a few requirements are met.

Microsoft has also redesigned update deployment for Windows 10. The number of updates has been reduced significantly by merging all updates into a single monthly bundle, which also increases the build version of Windows 10. Since Windows 10 1607, a feature called 'Dual Scan' has been introduced as well, so you may even wonder if you can throw Update Management in Configuration Manager out of the window too. I understand this may be hard to let go, but provided the required processes and company policies are in place to have this automated, you can liberate yourself from that administrative effort as well…

To summarize: wouldn't it be great to have a fully patched machine, including all drivers, during deployment?

After investigating, I found an old but still valid approach by Chris Nackers, available on his blog.

I followed the steps, except that I deliberately did not set the WSUS server variable used by ZTIWindowsUpdate.wsf; leaving it unset makes the script go to Microsoft Update and retrieve all required updates from there. Additionally, I checked the 'Continue on error' checkbox to make sure the Task Sequence can continue in case update installation fails. During testing I noticed an old printer driver failed to update while the rest installed properly; enabling 'Continue on error' is easier than collecting all exit codes.
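As a sketch, the relevant part of CustomSettings.ini simply omits the WSUS server entry (shown commented out below for illustration; server name is an example), so ZTIWindowsUpdate.wsf falls back to Microsoft Update:

```ini
[Default]
; WSUSServer is deliberately NOT set, so ZTIWindowsUpdate.wsf
; retrieves updates and drivers from Microsoft Update instead:
; WSUSServer=http://wsus01.example.local:8530
```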

In my scenario, it looks like this.

Alternatively, you can place the step after installing all applications so they may be updated as well.

Of course this requires some testing; if some devices are not installed because the driver is not available on Microsoft Update, you can add those drivers yourself.

Since Microsoft likes GitHub so much, you can even download ZTIWindowsUpdate.wsf (and ZTIUtility.wsf) and edit them to your liking (e.g. reducing the number of retries); you can find them on GitHub.


The result is that the deployment may take some time, but you get a fully updated machine and don't need to bother with managing drivers afterwards.

Also, allowing Dual Scan will keep drivers updated during normal operation as well…


Looking back at 2015…

So, the year 2015 is almost at its end. While I write this, I am already in the second week of my two-week time off. And boy, I really needed this two-week break.

2015 was an extremely busy year for me, and I can actually cut the year in half.

In the first half, I was still busy on a project where I designed and deployed System Center 2012 R2 Configuration Manager. I also built a stand-alone Image Building environment running MDT 2013. Unfortunately, the project took way longer than expected due to the customer being unable to take ownership and start administering it themselves. Eventually I decided to walk away after the contractual end date of my involvement, despite the fact that the project wasn't finished yet. The longer it took, the more frustrating the project became for me, so the decision to walk away was eventually the right one.

This takes me to the second half, in which I saw a dramatic shift in my job: I did only one Configuration Manager design and deployment in the second half of 2015. I started to extend my skillset on Enterprise Client Management with Microsoft Intune and Microsoft's public cloud platform: Azure.

I also started to deliver more workshops, master classes and training sessions. This is something I really like to do, and I want to thank those who made it possible for me. It also allowed me to renew my Microsoft Certified Trainer certification.

Fortunately, the frustrations of the first half provided some learning moments, which pushed me to become a more complete consultant. So my coworker arranged a two-day training session for me called "Professional Recommending" (this may be a poor translation of "Professioneel Adviseren" in Dutch), provided by Yearth. This is by far the most important training I have received in my career, and it started to pay off quickly: I received more positive feedback from customers and became a more complete consultant.

I was also happy to do the presentation workshop with Monique Kerssens and Jinxiu Hu from Niqué Consultancy BV at ExpertsLive 2015, and glad to receive the feedback that my presentation skills have developed greatly. To quote them: "you're standing like a house".

The icing on the cake came at the end of this year when I was asked to review the DataON CiB-9224 platform. You can read the review in my previous post.

So, I experienced some highs and lows this year. Fortunately, the highs came at the second half.

I look forward to 2016, but that’s for another post…



Possible workaround for capturing a Windows 10 reference image with MDT 2013 Update 1

As most of us should know by now, Microsoft released Microsoft Deployment Toolkit 2013 Update 1; see the announcement on the official blog.

The main improvements are support for Windows 10 and integration with System Center 2012 Configuration Manager SP2/R2 SP1. Unfortunately, this release has quite a lot of issues that make it either very difficult or impossible to properly capture a reference image. A list of known issues is available online.

The issue that bothers me the most is the following, and I quote:

Do not upgrade from Preview to RTM

MDT 2013 Update 1 Preview should be uninstalled before installing the final MDT 2013 Update 1. Do not attempt to upgrade a preview installation or deployment share. Although the product documentation is not updated for MDT 2013 Update 1, the information on upgrading an installation still holds true.

Being a consultant requires me to be an early adopter, testing new stuff so that I'm ready when it's released, which means working with preview versions of various software. Also, as an IT pro with an isolated environment purely for Image Building purposes, I need to upgrade my deployment share frequently. While I can automate building new deployment shares, that takes time I don't have while researching and testing these new technologies, so I don't have much choice other than upgrading my deployment share. I must admit that releasing this technology with so many known issues seems quite sloppy to me. I can only assume that various scenarios were not tested thoroughly due to time constraints, and that this version was released under considerable pressure.

Trying to build and capture a Windows 10 reference image fails. The capturing itself fails with an error message that a certain script cannot be loaded. The MDT 2013 U1 environment I currently have is for image building purposes only, so I don't have many customizations configured.

Knowing that the capturing itself fails, I can do the capturing part myself. Since image building is not something I expect you to do every day, the amount of administrative effort increases just a little bit, but it's quite easy to do.

First, we start a deployment using the Windows Deployment Wizard. After selecting my Build and Capture Windows 10 Task Sequence I get the option to select how I want to capture an image.


I choose not to capture an image by selecting the option Do not capture an image of this computer. This will make the deployment run normally and finish without doing anything afterwards. I do use the option FinishAction=REBOOT in my CustomSettings.ini to make sure the machine restarts after completion.

The next step is logging on with the local Administrator account to sysprep the machine by running the sysprep.exe /oobe /generalize /shutdown command.


Here we see SYSPREP is in progress. After a small while the machine is turned off.

Now the machine is started again using the LiteTouch boot media (in my case via WDS); wait until the deployment wizard is started once more. The reason I do this is that my deployment share is then available and accessible through the Z: drive, which is automatically mapped. Pressing F8 opens the command prompt.

All I need to do is start capturing an image using DISM, which may look like the screenshot below (hmmm, makes me wonder why I chose that filename).
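For reference, the DISM capture command takes roughly this shape (image file name and image name are examples; Z: is the automatically mapped deployment share):

```
dism /Capture-Image /ImageFile:Z:\Captures\REFW10X64.wim /CaptureDir:C:\ /Name:"Windows 10 Reference" /Compress:max
```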


Now the capture can start.


After a while the capture completes, and a captured Windows 10 image is available in the Captures folder of the deployment share in use. This image can be used for deployment by MDT 2013 U1, System Center 2012 Configuration Manager SP2/R2 SP1, or whatever tool is used for deploying .wim files.

Basically, the workaround consists of replacing the image capturing part with manual labour. I'm sure other workarounds are available, but this one works for me. The image capturing should take less than 72 hours, since that is the maximum time a WinPE session is allowed to run; once the 72 hours are up, WinPE automatically restarts the computer. That should be more than enough time to create the image file.

Feel free to use this workaround. As usual, testing is required before using it in a production environment.

Let's hope an updated release will have all these issues solved; the sooner the better…




Workflow analysis to automate building up an MDT 2013 U1 environment…

Recently I was building a new MDT 2013 environment to facilitate Image Building next to a new System Center 2012 R2 Configuration Manager site. This all went fine. I noticed, though, that I've built many of these environments manually in the same way, which started to annoy me a little. So I decided to spend some time to see if these preparations could be automated as much as possible.

To start automating something, it is essential to know what to automate and in which sequence the defined steps need to run. In other words, we need to analyze and determine the workflow before anything can be automated.

MDT 2013 is the recommended Image Building environment for most scenarios. I'd even recommend creating an isolated environment purely for that purpose. This environment can be as small as three machines to build reference images, as explained in the article I also use as a reference.

After some analysis I discovered and defined the following workflow to build the MDT 2013 U1 environment, assuming the VM is created and joined to the domain:

  1. Install WDS and WSUS (using a Windows Internal Database);
  2. Configure WSUS;
  3. Install Windows ADK 10 from locally available setup;
  4. Install MDT 2013 U1 from locally available setup;
  5. Create Deployment Share;
  6. Import one or more Operating Systems and create Task Sequences for them;
  7. Import applications using a .csv file;
  8. Edit Bootstrap.ini & CustomSettings.ini to meet requirements;
  9. Update Deployment Share.

After this workflow, the Task Sequence can be edited. I considered trying to automate that as well, but that is for a future post. After that, you're basically good to go. Having WSUS synchronize and download updates is something that should be done separately; automating this has been documented elsewhere.

MDT 2013 can be managed with PowerShell, albeit a bit crudely in my opinion. A set of cmdlets is available to get most things done; the ones I need for my workflow are there, so I can write a complete script representing each step of the defined workflow. However, some planning is required. The following items need to be in place:

  • A 'source' location for the setup files of Windows ADK 10 and MDT 2013;
  • The sources of the applications which are going to be part of the deployment, including a .csv file which includes the required information;
  • The location for the Deployment Share.

These requirements can be put into the script and may be modified to be used again in a different environment.

So let’s get started and check out each step.


First step: Install WDS and WSUS (using a Windows Internal Database). This one is easy and can be done with just one line:

Add-WindowsFeature WDS,NET-Framework-Core,UpdateServices-Services,UpdateServices-WidDB,UpdateServices-UI -IncludeAllSubFeature


Second step: Configure WSUS

Fairly straightforward as well; I use the Start-Process cmdlet to initiate the WsusUtil command:

$FilePath="C:\Program Files\Update Services\Tools\WsusUtil.exe"

$ArgumentList="postinstall CONTENT_DIR=F:\WSUS"

Start-Process -FilePath $FilePath -ArgumentList $ArgumentList


In this step, automating the WSUS synchronization can be added as well.
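A minimal sketch of that automation, using the UpdateServices PowerShell module (assuming WSUS is installed locally and products/classifications are left at their defaults):

```powershell
# Connect to the local WSUS server (UpdateServices module, Server 2012+)
$wsus = Get-WsusServer

# Get the subscription object and kick off a synchronization
$subscription = $wsus.GetSubscription()
$subscription.StartSynchronization()

# Optionally check how the synchronization is doing
$subscription.GetSynchronizationStatus()
```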


Third step: Install Windows ADK 10

I need adksetup.exe to start the installation. It doesn't matter whether all the bits were downloaded previously or need to be downloaded while setup is running. Still pretty straightforward:



$FilePath="F:\Download\ADK10\adksetup.exe" # example download location of adksetup.exe

$ArgumentList1="/q /norestart "

$ArgumentList2="/installpath F:\ADK10 "

$ArgumentList3="/features OptionId.DeploymentTools OptionId.WindowsPreinstallationEnvironment OptionId.ImagingAndConfigurationDesigner OptionId.UserStateMigrationTool"


Start-Process -FilePath $FilePath -ArgumentList $ArgumentList1$ArgumentList2$ArgumentList3

Notice the spaces at the end of the variables $ArgumentList1 and $ArgumentList2; they keep the concatenated arguments separated.


Fourth step: Install MDT 2013 U1

The installation of MDT 2013 U1 consists of a single .msi file. Since I use the defaults it becomes pretty straightforward as well:

$FilePath="msiexec.exe"

$ArgumentList="/i F:\Download\MDT2013U1\MicrosoftDeploymentToolkit2013_x64.msi /qb"

Start-Process -FilePath $FilePath -ArgumentList $ArgumentList


NOTE: I use /qb intentionally to see something happening.


Before proceeding, the PowerShell module for MDT needs to be loaded:

$ModulePath="C:\Program Files\Microsoft Deployment Toolkit\Bin\MicrosoftDeploymentToolkit.psd1"

Import-Module $ModulePath


Fifth step: Create a Deployment Share and share it

This requires a little bit more work. The folder itself must be created before running the cmdlet that configures the Deployment Share:




# The variable values below are examples; adjust them to your environment.
$Name="DS001"
$PSProvider="MDTProvider"
$Root="F:\DeploymentShare"
$NetworkPath="\\MDT01\DeploymentShare$"
$Description="MDT Deployment Share"

New-Item $Root -Type Directory

New-PSDrive -Name $Name -PSProvider $PSProvider -Root $Root -Description $Description -NetworkPath $NetworkPath -Verbose | Add-MDTPersistentDrive -Verbose

$Type = 0

$objWMI = [wmiClass] 'Win32_Share'

# Create(Path, Name, Type); the share name must match $NetworkPath
$objWMI.Create($Root, "DeploymentShare$", $Type)


Sixth step: Import an Operating System and create a Task Sequence for it


A little bit more work. I limited the script to one Operating System; this part can be repeated when more Operating Systems need to be imported. I chose not to use a .csv file for that.


$Path="DS001:\Operating Systems"

$SourcePath="F:\Download\W81X64" # example: folder containing the Windows 8.1 installation files

$DestinationFolder="Windows 8.1 Enterprise x64"

Import-MDTOperatingSystem -Path $Path -SourcePath $SourcePath -DestinationFolder $DestinationFolder

$Path="DS001:\Task Sequences"

$Name="Build Windows 8.1 Enterprise x64"

$Template="Client.xml" # standard client Task Sequence template

$ID="W81X64" # example Task Sequence ID

$OperatingSystemPath="DS001:\Operating Systems\Windows 8.1 Enterprise in Windows 8.1 Enterprise x64 install.wim"

$FullName="Windows User"

$HomePage="about:blank" # example

Import-MDTTaskSequence -Path $Path -Name $Name -Template $Template -ID $ID -OperatingSystemPath $OperatingSystemPath -FullName $FullName -OrgName $FullName -HomePage $HomePage


Seventh step: Import applications using a .csv as source

$Path="DS001:\Applications"

Import-Csv F:\Download\Apps\MDTApps.csv | % {

Import-MDTApplication -Path $Path -Enable "True" -Name $_.ApplicationName -ShortName $_.ApplicationName -CommandLine $_.Commandline -WorkingDirectory ".\Applications\$($_.ApplicationName)" -ApplicationSourcePath $_.ApplicationSourcePath -DestinationFolder $_.ApplicationName

}

Here’s a sample .csv file used for the purpose of this script:


ApplicationName,Commandline,ApplicationSourcePath

Visual C++ 2005 redist,VS80sp1-KB926601-X86-ENU.exe /Q,F:\Download\Apps\VC2005

Visual C++ 2008 redist,VS90sp1-KB945140-ENU.exe /Q,F:\Download\Apps\VC2008

Visual C++ 2010 redist,VS10sp1-KB983509.exe /Q,F:\Download\Apps\VC2010

Visual C++ 2012 redist x64,vcredist_x64.exe /Q,F:\Download\Apps\VC2012X64

Visual C++ 2012 redist x86,vcredist_x86.exe /Q,F:\Download\Apps\VC2012X86

Visual C++ 2013 redist x64,vcredist_x64.exe /Q,F:\Download\Apps\VC2013X64

Visual C++ 2013 redist x86,vcredist_x86.exe /Q,F:\Download\Apps\VC2013X86


Hmmm, looks like I need to add the Visual C++ 2015 redistributables and Office 2013. The benefit of this approach is that you only need to modify the .csv to match the list of applications that are going to be part of the reference image.


Eighth step: Edit Bootstrap.ini and Customsettings.ini

This step requires a bit more work. The locations of both files depend on step 5. To keep things easy, I remove all the text first before appending the new content to each file. The result may look like this:


$File="F:\DeploymentShare\Control\Bootstrap.ini" # location depends on the Deployment Share created in step 5

Clear-Content -Path $File

"[Settings]" | Out-File $File -Append
"Priority=Default" | Out-File $File -Append
"" | Out-File $File -Append
"[Default]" | Out-File $File -Append
"DeployRoot=\\MDT01\DeploymentShare$" | Out-File $File -Append
"UserDomain=DOMAIN9" | Out-File $File -Append
"UserID=MDTAdmin" | Out-File $File -Append
"UserPassword=P@ssw0rd" | Out-File $File -Append
"SkipBDDWelcome=YES" | Out-File $File -Append



$File="F:\DeploymentShare\Control\CustomSettings.ini" # location depends on the Deployment Share created in step 5

Clear-Content -Path $File

"[Settings]" | Out-File $File -Append
"Priority=Default" | Out-File $File -Append
"" | Out-File $File -Append
"[Default]" | Out-File $File -Append
"_SMSTSORGNAME=DOMAIN9 Lab Deployment" | Out-File $File -Append
"OSInstall=Y" | Out-File $File -Append
"AdminPassword=P@ssw0rd" | Out-File $File -Append
"TimeZoneName=Pacific Standard Time" | Out-File $File -Append
"JoinWorkgroup=WORKGROUP" | Out-File $File -Append
"FinishAction=SHUTDOWN" | Out-File $File -Append
"DoNotCreateExtraPartition=YES" | Out-File $File -Append
"WSUSServer=http://mdt01.domain9.local:8530" | Out-File $File -Append
"ApplyGPOPack=NO" | Out-File $File -Append
"SkipAdminPassword=YES" | Out-File $File -Append
"SkipProductKey=YES" | Out-File $File -Append
"SkipComputerName=YES" | Out-File $File -Append
"SkipDomainMembership=YES" | Out-File $File -Append
"SkipUserData=YES" | Out-File $File -Append
"SkipLocaleSelection=YES" | Out-File $File -Append
"SkipTaskSequence=NO" | Out-File $File -Append
"SkipTimeZone=YES" | Out-File $File -Append
"SkipApplications=YES" | Out-File $File -Append
"SkipBitLocker=YES" | Out-File $File -Append
"SkipSummary=YES" | Out-File $File -Append
"SkipRoles=YES" | Out-File $File -Append
"SkipCapture=NO" | Out-File $File -Append
"SkipFinalSummary=YES" | Out-File $File -Append


Ninth step: Update the Deployment Share

The final step will create the boot images and process everything to make sure it's all there. This step may take a while.


$Path="DS001:"

Update-MDTDeploymentShare -Path $Path


That's it. After running the script you can see it's all there. The only things missing are adding the applications to the Task Sequence (and the WSUS sync and update download). Alternatively, you can set SkipApplications to NO and select the applications during the Lite Touch wizard.

To keep this post from becoming too verbose: all that needs to be done is copy/paste each part into a single .ps1 file and you're good to go. Feel free to use this in a completely isolated environment to prevent interference with a production environment.


Thoughts and ideas to manage Google Chrome for Work, and some Extensions too…

Recently I was asked to investigate how to manage Google Chrome for Work in a corporate environment. The customer involved is using Google Cloud Services intensively. This means that in order to receive an optimal user experience, the Google Chrome browser is needed. I decided to investigate and come up with some thoughts and ideas on how to do it.

Managing applications consists of two activities:

  • Deployment
  • Configuration

In an ideal scenario, those activities are completely separated. The result will be that the installation of the application can be done using the default settings while the configuration will be taken care of elsewhere. Fortunately, Google provides a few methods of configuring Google Chrome. The most well-known are:

  • Editing the master_preferences file
  • Using Group Policy

For the sake of this blog, the corporate environment consists of Windows 8.1 devices which are domain joined. This means we can consider using Group Policy, which will meet the initial goal of keeping deployment and configuration separate. I used Google's documentation on managing Chrome through Group Policy as a reference.

As we can see, the policies are machine driven. This presents another challenge: how should Google Chrome for Work be installed? Google provides a deployment guide to get started.

In this guide, I found the following passage critical for my investigation:

Chrome installations from an MSI package are installed at the system level and are available to all users. As a result, any user-level installation of Chrome, (i.e. a user’s own Chrome installation), will be overridden.

It looks like Google has revised the approach to installing Google Chrome quite dramatically, since it used to be possible to install Chrome in the user context. That allowed end users to install Google Chrome without administrator privileges, which can be considered a security officer's true nightmare, not to mention a challenge for the configuration management and release management processes in a corporate environment.

In order to investigate how to do it, I’ve set up a lab environment with the following machines:

  • 1 Windows Server 2012 R2 domain controller
  • 1 System Center 2012 R2 Configuration Manager site server
  • 1 Windows 8.1 x64 client

Knowing that Google Chrome will always be installed in the system context means we can make it part of an Operating System Deployment Task Sequence. But first an application needs to be made.

To create the application just import googlechromestandaloneenterprise.msi and create a standard Deployment Type, nothing fancy.
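For reference, a silent system-context installation of the MSI boils down to a single command (file name as distributed by Google at the time of writing):

```
msiexec /i googlechromestandaloneenterprise.msi /qn /norestart
```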


Really nothing fancy…

Adding this to a Task Sequence is nothing fancy either…

There you go. I created an MDT 2013 Task Sequence with an unattended Windows 8.1 x64 deployment, since it's not relevant here to build a Windows 8.1 x64 image first.

Configuring Google Chrome is a bit more challenging. It really depends on requirements and processes to determine how Google Chrome for Work should be configured. This is what I did first:

  • Create a separate GPO for Google Chrome and link it to an OU where my client machine resides
  • Import the chrome.adm template

I’ve decided to configure just a few settings the GPO provides:

  • Setting Google Chrome as the default browser
  • Configure the start page

Configuring the Start Page requires two GPO settings.
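Applied through the chrome.adm template, these two settings end up as Chrome policy registry values on the client. A sketch of the resulting values (the URL is an example; verify the value names against the template):

```powershell
# Chrome policy hive written by the GPO (chrome.adm)
$chrome = 'HKLM:\SOFTWARE\Policies\Google\Chrome'

# RestoreOnStartup = 4 -> 'Open a list of URLs' on startup
New-ItemProperty -Path $chrome -Name RestoreOnStartup -Value 4 -PropertyType DWord -Force

# The list of start pages lives in a subkey, one numbered value per URL
New-Item -Path "$chrome\RestoreOnStartupURLs" -Force
New-ItemProperty -Path "$chrome\RestoreOnStartupURLs" -Name '1' -Value 'https://www.google.com' -PropertyType String -Force
```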




What about Extensions?

Admittedly, I find it great to manage extensions by GPO. It does require some detective work before they can be automatically added to Google Chrome; fortunately, nothing has to be downloaded in advance once you know the extension ID. In my example I added a list of so-called force-installed extensions.
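Each entry in the force-install list follows the pattern `<extension ID>;<update URL>`. A sketch of the resulting policy values (the extension ID below is only an example):

```powershell
# ExtensionInstallForcelist: numbered values under the Chrome policy hive
$key = 'HKLM:\SOFTWARE\Policies\Google\Chrome\ExtensionInstallForcelist'
New-Item -Path $key -Force

# Format: "<32-character extension ID>;<update URL>" (Chrome Web Store URL shown)
New-ItemProperty -Path $key -Name '1' -PropertyType String -Force `
    -Value 'aapocclcgogkmnckokdopfmhonfmgoek;https://clients2.google.com/service/update2/crx'
```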

And here’s the list itself


Once the configuration is in place and the machine is deployed, the result will look like this:

Starting Google Chrome with my Start Pages

And the Extensions specified are also in place.


Finally, Google frequently updates Chrome. You can manage the update behavior by configuring Google Update through Group Policy; Google's documentation can be used as a reference.

This requires importing the GoogleUpdate.adm template to configure the update behavior.

In this investigation, I configured the following update settings:

  • Automatically check for updates every 60 minutes
  • Enable auto updating of Google Chrome
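These two settings correspond to values in the Google Update policy hive. A sketch of the client-side result (the application GUID identifying Chrome is an assumption; verify it against your GoogleUpdate.adm template):

```powershell
# Google Update policy hive written by GoogleUpdate.adm
$update = 'HKLM:\SOFTWARE\Policies\Google\Update'

# Check for updates every 60 minutes
New-ItemProperty -Path $update -Name AutoUpdateCheckPeriodMinutes -Value 60 -PropertyType DWord -Force

# Per-application update policy: 1 = automatic updates enabled
# (the GUID below should identify Chrome; confirm it in the template)
New-ItemProperty -Path $update -Name 'Update{8A69D345-D564-463C-AFF1-A69D9E530F96}' -Value 1 -PropertyType DWord -Force
```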


Here are the GPOs needed to configure it.


and finally


Updating Google Chrome for Work works great, if you're an administrator. Unfortunately, when a user accesses the 'About' tab in Chrome, it triggers an update check. For a standard user this immediately triggers User Account Control, asking for credentials for elevation in Windows Vista or newer. Because Google Chrome is installed in the system context (in the %PROGRAMFILES% directory), administrator rights are required to make the changes initiated by the user. This behavior is by design.

Especially in Windows 8.1, it is absolutely not recommended to turn off UAC, since that breaks a lot of Windows functionality. Giving users administrator privileges is not recommended either. To maintain a strict configuration management and release management process, it's recommended to disable updates, to prevent users who do have administrator privileges from installing updates by themselves.

This means a different method needs to be used to deploy an updated version of Google Chrome for Work. To me, the application model of Configuration Manager is the most suitable mechanism, since it allows applications to be superseded. This also means the Task Sequence may need to be edited frequently. Alternatively, a third-party update tool such as Secunia can be used if the organization has a license for it.

I can conclude that Google has done the right thing to have Google Chrome for Work installed in the system context and manage it by using Group Policy.





Minimum required permissions to join a computer to the domain during OSD

Recently I was required to deliver the minimum required permissions to allow a computer to join the domain during Operating System Deployment. The challenge in this scenario was that each computer object is pre-staged in Active Directory prior to deployment. Here are a few (but not all) reasons why each computer object is pre-staged (let's assume this is achieved by a PowerShell script):

  • A list of all computers exists; however, the list contains only names (no MAC addresses or GUIDs)
  • Operating System Deployment needs to be simplified by not pointing to any Organizational Unit in the answer file
  • For each user a computer will be configured as a Primary Device; this is most likely used for file-server-related configurations
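Such a pre-staging script could be as simple as the sketch below (the OU path and the CSV layout with a single `Name` column are assumptions for illustration):

```powershell
# Requires the ActiveDirectory module (RSAT)
Import-Module ActiveDirectory

$targetOU = 'OU=Workstations,DC=domain9,DC=local'  # example OU

# Pre-create each computer object so OSD only has to join it
Import-Csv F:\Computers.csv | ForEach-Object {
    New-ADComputer -Name $_.Name -Path $targetOU -Enabled $true
}
```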

I guess I was lucky that in most projects in the past, the administrators told me which account to use, since they had already configured the required delegation for domain join. However, this was the first time that computer objects were pre-staged prior to deployment and I had to deliver the required permissions myself. Initially, I received the following error message:

"The join operation was not successful. This could be because an existing computer account having the name xxxx was previously created using a different set of credentials. Use a different computer name or contact your administrator to remove any stale conflicting account. The error was: Access is denied."

After some digging I found a blog post by Morgan that helped me get back on track.

After reading this post, I determined that I needed the following scenario:

2. Allowing a security principal to join and re-join a computer to a domain

The second scenario, allowing a principal to also re-join a computer to a domain, requires some additional permissions. This is useful if you want a service account that can manage all computer accounts, including existing ones. System Center Configuration Manager, for example, requires these permissions.

According to the article the following permissions are required:

  • Reset Password
  • Read and write Account Restrictions
  • Validated write to DNS host name
  • Validated write to service principal name
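If you prefer scripting the delegation over using the Delegation of Control wizard, it can be sketched with dsacls (the OU, the account name, and the exact permission strings are assumptions; verify them against the referenced post before use):

```
dsacls "OU=Workstations,DC=domain9,DC=local" /I:S /G "DOMAIN9\MDTJoin:CA;Reset Password;computer"
dsacls "OU=Workstations,DC=domain9,DC=local" /I:S /G "DOMAIN9\MDTJoin:RPWP;Account Restrictions;computer"
dsacls "OU=Workstations,DC=domain9,DC=local" /I:S /G "DOMAIN9\MDTJoin:WS;Validated write to DNS host name;computer"
dsacls "OU=Workstations,DC=domain9,DC=local" /I:S /G "DOMAIN9\MDTJoin:WS;Validated write to service principal name;computer"
```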

In order to reproduce the behavior, I set up a small lab environment with a Domain Controller, a System Center 2012 R2 Configuration Manager primary site server and two client machines. I did the following before deploying the machines:

  • Pre-staged the two computer accounts
  • Delegated the permissions to the account used to join computers to the domain

The first client joined correctly; after that I did the same with the second client, and that one joined correctly as well. This means I was able to reproduce the expected behavior. I consider that a result.

I'd like to thank Morgan for providing a detailed blog post which helped me achieve this goal. It comes down to having the right permissions in place.

Feel free to try it out yourself, don’t forget to use a test environment before putting this in production…

UPDATE: the required permissions are also documented in a Microsoft KB article, which applies to Windows Server 2012 (R2) as well.

