
Category Archives: PowerShell

Backing up Azure Storage Accounts…

New year, new challenges. And I was confronted with quite a nice one.

One of my customers uses Azure Storage quite intensively. While Azure Storage Accounts provide some protection by means of replication, there is no real protection against corruption or deletion inside the Storage Account itself: data that has been deleted is deleted on the replicas as well. Unfortunately, there is no mechanism comparable to DFS Replication for replicating data between Storage Accounts. Governance constraints may also prevent using geo-redundant storage, and even geo-redundant storage cannot guarantee that the secondary location still holds the data as it was before it became corrupted or deleted.

So a mechanism must be developed to protect the data from a potential disaster. Only Blob, File and Table Storage are valid candidates to be protected. Microsoft has released a tool that allows content to be copied to a different Storage Account (including from and to local disk): AzCopy

The relevant information regarding AzCopy is available at https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/

AzCopy is a command-line tool that allows content to be copied, but it is very static by nature. This may be fine for a few individual Blob containers or Table URIs, but a more versatile approach is required when hundreds of Blob containers and Table URIs need to be copied, especially when they change frequently.

Fortunately, I was able to use PowerShell to ‘feed’ AzCopy with the required parameters to get the job done. The resulting script focuses on Blob and Table Storage only, but it can easily be extended to cover File Storage. For the sake of this blog post, Access Keys are used (the secondary ones); switching to SAS tokens requires only minor modifications.

Before proceeding, the workflow needs to be defined:

  • Provide the source and destination Storage Accounts, the destination Storage Account will store everything as Blobs meaning that Tables will be exported to a .MANIFEST and .JSON file (to allow the table to be recreated);
  • Retrieve the list of Blob containers and ‘convert’ them to a URL;
  • Copy each Blob container to the destination Storage Account, all containers will be stored as virtual directories in a single container to maintain structure;
  • Retrieve the list of Table Uris;
  • Export each Uri to a .MANIFEST and .JSON file and store them as blob at the destination Storage Account;
  • Log all copy actions.

The benefit of AzCopy is that it instructs the Azure platform to perform the copy, so no extensive processing is required by AzCopy itself. This allows the script to run on an Azure VM. A Standard_A1 VM is sufficient; additional disks are highly recommended when large Tables are used.

Unfortunately, parallel processing with AzCopy.exe is not possible; everything must be done sequentially. Alternatively, multiple VMs can be spun up, each backing up its own set of Storage Accounts. This is highly recommended because backing up Storage Accounts can be very time consuming, especially with large sets of small files or very large tables. Additionally, some small text files are used to store the items retrieved; they also allow the use of ForEach loops.

To repeat the actions for multiple Storage Accounts, a .csv file is used as input. The .csv file may look like this:

SourceStorageName,SourceStorageKey,DestinationStorageName,DestinationStorageKey
source1,sourcekey1,destination1,destinationkey1
source2,sourcekey2,destination2,destinationkey2

The actual script uses a lot of variables. AzCopy is called using the Start-Process cmdlet, and the parameters for AzCopy.exe are assembled into a single long string that is passed to the -ArgumentList parameter.

So here’s the script I used to achieve the goal of backing up Blob containers and Table Uris:

#
# Name: Copy_Storage_Account_AzCopy.ps1
#
# Author: Marc Westerink
#
# Version: 1.0
#
# Purpose: This script copies Blob and Table Storage from a source Storage Account to a Destination Storage Account.
# File F:\Input\Storage_Accounts.csv contains all Source and Destination Storage.
# All Blob Containers and Tables will be retrieved and processed sequentially.
# All content will be copied as blobs to a container named after the Source Storage Account. A virtual directory will be created for each blob container.
#
# Requirements:
# – Storage Accounts with Secondary Access Keys
# – AzCopy needs to be installed
# – Azure PowerShell needs to be installed
# – An additional disk to store Temporary Files (i.e. F:\ drive)
# – A Temp Folder (i.e. F:\Temp) with two Text Files ‘temp.txt’ and ‘output.txt’. The Temp Folder will be used by AzCopy.
# – A Folder to store all log files (i.e. F:\Logs)
#

# First, let’s create the required global variables

# Get the date the script is being run
$Date = Get-Date -Format "dd-MM-yyyy"

# AzCopy Path
$FilePath='C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe'

#Temp Files, let’s make sure they’re cleared before starting

$File1='F:\Temp\temp.txt'
Clear-Content -Path $File1

$File2='F:\Temp\output.txt'
Clear-Content -Path $File2

#Recursive Parameter: DO NOT use for Table Storage
$Recursive='/S'

#Suppress prompt popups
$Prompt='/Y'

#Temporary Directory for AzCopy
$TempDir='F:\Temp'

#SplitSize parameter for Tables, this will split a large table into separate .JSON files of 1024 MB
$SplitSize='/SplitSize:1024'

#DestType parameter, required for copying tables as blobs to the Destination
$DestType='/DestType:Blob'

#Temporary Directory for AzCopy Journal files
$Journal='/Z:F:\Temp'

#Blob path
$Blob='.blob.core.windows.net/'

#https header
$HTTPS='https://'

#Let’s import the CSV and process all Storage Accounts
Import-Csv F:\Input\Storage_Accounts.csv | % {

#Creating the Full Path of the Source Storage Account Blob
$SourceStoragePath=$HTTPS+$_.SourceStorageName+$Blob

#Creating the Full Path of the Destination Storage Container, if it doesn’t exist it will be created
$DestStorageContainer=$HTTPS+$_.DestinationStorageName+$Blob+$_.SourceStorageName+$Date

#Gather the Source Access Key
$SourceStorageKey=$_.SourceStorageKey

#Gather the Destination Access Key
$DestinationStorageKey=$_.DestinationStorageKey

#Defining the log file for verbose logging with the Source Storage Account Name and the date
$Verbose='/V:F:\Logs\'+$_.SourceStorageName+$Date+'.log'

#Create the Azure Storage Context to gather all Blobs and Tables
$Context = New-AzureStorageContext -StorageAccountName $_.SourceStorageName -StorageAccountKey $_.SourceStorageKey

#Copy blob containers first

#Get all containers
Get-AzureStorageContainer -context $context | Select Name | % {

Add-Content -Path $File1 -Value $_.Name
}

#Convert all Container Names to full paths and write them to the Output File
Get-Content $File1 | % {

Add-Content -Path $File2 -Value $SourceStoragePath$_
}

#Process all Containers using the Output File as input
Get-Content $File2 | % {

#Gather virtual directory name using the container name
$VirtualDirectory= $_ -replace $SourceStoragePath,''

$ArgumentList='/Source:'+$_+' '+'/Dest:'+$DestStorageContainer+'/'+$VirtualDirectory+' '+'/SourceKey:'+$SourceStorageKey+' '+'/DestKey:'+$DestinationStorageKey+' '+$Recursive+' '+$Verbose+' '+$Prompt+' '+$Journal
Start-Process -FilePath $FilePath -ArgumentList $ArgumentList -Wait
}

#Before proceeding, let’s clean up all files used
Clear-Content -Path $File1
Clear-Content -Path $File2
#Get All Tables
Get-AzureStorageTable -context $context | Select Uri | % {

Add-Content -Path $File2 -Value $_.Uri
}

#Process all Tables using the Output File as input
Get-Content $File2 | % {

$ArgumentList='/Source:'+$_+' '+'/Dest:'+$DestStorageContainer+' '+'/SourceKey:'+$SourceStorageKey+' '+'/DestKey:'+$DestinationStorageKey+' '+$SplitSize+' '+$Verbose+' '+$Prompt+' '+$DestType+' '+$Journal
Start-Process -FilePath $FilePath -ArgumentList $ArgumentList -Wait

}

#Cleanup Output File
Clear-Content -Path $File2
}

To run this script on a schedule, a simple Scheduled Task can be created. The schedule itself depends on the environment's needs and the time needed to copy everything. It's not uncommon for a large Storage Account with countless small blobs and huge tables to take a week or even longer to copy…
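As a rough illustration only (the script location, task name and run-as account below are assumptions, not part of the original setup), registering such a task with the ScheduledTasks module could look like this:

# Hypothetical example: run the backup script every night at 01:00
# The script path, task name and service account are placeholders
$Action = New-ScheduledTaskAction -Execute 'PowerShell.exe' -Argument '-ExecutionPolicy Bypass -File F:\Scripts\Copy_Storage_Account_AzCopy.ps1'
$Trigger = New-ScheduledTaskTrigger -Daily -At 1am
Register-ScheduledTask -TaskName 'Backup Storage Accounts' -Action $Action -Trigger $Trigger -User 'DOMAIN\BackupService' -Password 'P@ssw0rd'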

 

 

 

Posted on 02/02/2016 in Azure, PowerShell, Public Cloud

 

Looking forward to 2016…

So, after leaving 2015 behind us and getting started with 2016, it's time to have a look at what 2016 is going to bring us.

2015 was the year that really got the adoption of cloud technology going, and I expect more and more organizations to follow or to start adopting more of the features cloud technology offers. It is also nice to see organizations starting to understand how convenient it is when the ‘gate’ for end users shifts from Active Directory to Azure Active Directory.

Three big releases will most likely take place this year:

  • AzureStack;
  • Windows Server 2016;
  • System Center 2016.

I strongly believe the release of Windows Server 2016 will dramatically change the way we're used to working, and the following two features in particular will enable that change:

  • Nano Server;
  • Containers.

Since the release of Windows Server 2016 Technical Preview 3, and even more so with Technical Preview 4, we're able to research and experiment with these two features. Fortunately, I don't expect Windows Server 2016 RTM to be released in the first half of 2016, which allows me to play around with it and understand how it works so that I am prepared when it becomes available.

So Windows Server 2016 is just the tip of the iceberg. With everything else coming as well, I expect 2016 to be a very busy year, but I expect to have a lot of fun with it too…

So let’s see what’s going to happen this year, I look forward to it.

 

Looking back at 2015…

So, the year 2015 is almost at its end. While I write this, I am already in the second week of my two-week time off. And boy, I really needed this break.

2015 was an extremely busy year for me, and I can actually cut the year in half.

In the first half, I was still busy participating in a project where I designed and deployed System Center 2012 R2 Configuration Manager. I also built a stand-alone Image Building environment running MDT 2013. Unfortunately, the project took way longer than expected due to the customer being unable to take ownership and start administering it themselves. Eventually I decided to walk away after the contractual end date of my involvement, despite the fact that the project wasn't finished yet. The longer it took, the more frustrating the project became for me, so the decision to walk away was eventually the right one.

That takes me to the second half, in which I saw a dramatic shift in my job: I did only one Configuration Manager design and deployment in the second half of 2015. Instead, I started to extend my skillset on Enterprise Client Management with Microsoft Intune and Microsoft's public cloud platform: Azure.

I also started to deliver more workshops, master classes and training sessions. This is something I really like to do, and I want to thank those who made it possible for me. It also allowed me to renew my Microsoft Certified Trainer certification.

Fortunately, the frustrations of the first half provided some learning moments that pushed me to become a more complete consultant. My coworker arranged a two-day training session for me called “Professional Recommending” (this may be a poor translation of the Dutch “Professioneel Adviseren”), provided by Yearth. This is by far the most important training I have received in my career, and it started to pay off quickly through more positive feedback from customers.

I was also happy to do the presentation workshop with Monique Kerssens and Jinxiu Hu from Niqué Consultancy BV at ExpertsLive 2015, and to receive the feedback that my presentation skills have developed greatly. To quote them: “you're standing like a house”.

The icing on the cake came at the end of this year when I was asked to review the DataON CiB-9224 platform. You can read the review in my previous post.

So, I experienced some highs and lows this year. Fortunately, the highs came at the second half.

I look forward to 2016, but that’s for another post…

 

 

Leveraging PowerShell to add multiple empty disks to Azure virtual machines

This post’s a quicky…

Adding disks is quite a straightforward action which can be achieved with a single PowerShell one-liner: Get-AzureVM "myservice" -Name "MyVM" | Add-AzureDataDisk -CreateNew -DiskSizeInGB 128 -DiskLabel "main" -LUN 0 | Update-AzureVM

More information for this cmdlet is available at https://msdn.microsoft.com/nl-nl/library/azure/dn495298.aspx

Empty disks, with a maximum size of 1023 GB, have some limitations regarding maximum IOPS. On top of that, each Azure VM size allows a maximum number of data disks to be attached. An overview of these Azure VM sizes is available at https://azure.microsoft.com/en-us/documentation/articles/virtual-machines-size-specs

These disks can be added to a storage pool to create one or more volumes with bigger sizes.
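Purely as an illustration (not from the original post; the pool and volume names are made up), combining the empty data disks into one large volume with Storage Spaces inside the VM might look like this:

# Hypothetical sketch: pool all poolable data disks and carve one large NTFS volume out of them
$PoolDisks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName 'DataPool' -StorageSubSystemFriendlyName (Get-StorageSubSystem).FriendlyName -PhysicalDisks $PoolDisks
New-VirtualDisk -StoragePoolFriendlyName 'DataPool' -FriendlyName 'DataVDisk' -UseMaximumSize -ResiliencySettingName Simple
Get-VirtualDisk -FriendlyName 'DataVDisk' | Get-Disk | Initialize-Disk -PartitionStyle GPT -PassThru | New-Partition -AssignDriveLetter -UseMaximumSize | Format-Volume -FileSystem NTFS -NewFileSystemLabel 'Data'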

I investigated how I could add 14 disks to an A4 sized Azure VM quickly. It’s quite easy to achieve with a .csv file and a loop. The script may look like this:

#
# Script name: add_disks_azurevm.ps1
#
# Purpose: Adds multiple empty datadisks to at least one Azure VM, uses .csv as input
#
# Author: Marc Westerink
#
# Reference: https://msdn.microsoft.com/nl-nl/library/azure/dn495298.aspx
#

Import-Module "C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\Azure.psd1"
Add-AzureAccount

Import-Csv C:\temp\disks.csv | % {

Set-AzureSubscription -SubscriptionName $_.SubscriptionName -CurrentStorageAccountName $_.StorageAccountName
Get-AzureVM -ServiceName $_.ServiceName -Name $_.VMName | Add-AzureDataDisk -CreateNew -DiskSizeInGB $_.DiskSizeInGB -DiskLabel $_.DiskLabel -LUN $_.LUN | Update-AzureVM
}

The .csv file that delivers the required information can easily be edited in Excel. The script even allows switching to a different subscription and/or storage account, which adds some flexibility. It doesn't matter if the same subscription information is set multiple times.
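For illustration, a disks.csv laid out to match the property names the script reads could look like this (the subscription, storage account, service and VM names are placeholders):

SubscriptionName,StorageAccountName,ServiceName,VMName,DiskSizeInGB,DiskLabel,LUN
MySubscription,mystorageaccount,myservice,MyVM,1023,Data01,0
MySubscription,mystorageaccount,myservice,MyVM,1023,Data02,1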

After running this script I can see the disks are available in my Azure VM and can use them as needed.


Feel free to try this yourself…

 

 

Posted on 06/10/2015 in Azure, PowerShell

 

Workflow analysis to automate building up an MDT 2013 U1 environment…

Recently I was building a new MDT 2013 environment to facilitate Image Building next to a new System Center 2012 R2 Configuration Manager site. This all went fine, but I noticed that I have built several of these environments manually in exactly the same way, which started to annoy me a little. So I decided to spend some time to see if these preparations could be automated as much as possible.

To start automating something, it is essential to know what to automate and in which sequence the defined steps need to be executed. In other words, the workflow needs to be analyzed and determined before anything can be automated.

MDT 2013 is the recommended Image Building environment for most scenarios. I'd even recommend creating an isolated environment purely for that purpose. This environment can be as small as three machines to build reference images, as explained at https://technet.microsoft.com/en-us/library/dn744290.aspx. I use this article as a reference as well.

After some analysis I discovered and defined the following workflow to build the MDT 2013 U1 environment, assuming the VM is created and joined to the domain:

  1. Install WDS and WSUS (using a Windows Internal Database);
  2. Configure WSUS;
  3. Install Windows ADK 10 from locally available setup;
  4. Install MDT 2013 U1 from locally available setup;
  5. Create Deployment Share;
  6. Import one or more Operating Systems and create Task Sequences for them;
  7. Import applications using a .csv file;
  8. Edit Bootstrap.ini & CustomSettings.ini to meet requirements;
  9. Update Deployment Share.

After this workflow, the Task Sequence can be edited. I considered trying to automate that as well, but that is for a future post. After that, you're basically good to go. Having WSUS synchronize and download updates is something that should be done separately; an approach to automating this is available at http://blogs.technet.com/b/heyscriptingguy/archive/2013/04/15/installing-wsus-on-windows-server-2012.aspx.

MDT 2013 can be managed with PowerShell, albeit a bit crudely in my opinion. A set of cmdlets is available to get most things done, and the ones I need for my workflow are there, so I can write a complete script that represents each step of the defined workflow. However, some planning is required. The following items need to be in place:

  • A ‘source’ location for the setup files of Windows ADK 10 and MDT 2013;
  • The sources of the applications which are going to be part of the deployment, including a .csv file which includes the required information;
  • The location for the Deployment Share.

These requirements can be put into the script and may be modified to be used again in a different environment.

So let’s get started and check out each step.

 

First step: Install WDS and WSUS (using a Windows Internal Database). This one is easy and can be done with just one line:

Add-WindowsFeature WDS,NET-Framework-Core,UpdateServices-Services,UpdateServices-WidDB,UpdateServices-UI -IncludeAllSubFeature

 

Second step: Configure WSUS

Fairly straightforward as well: I use the Start-Process cmdlet to run the WsusUtil command:

$FilePath="C:\Program Files\Update Services\Tools\WsusUtil.exe"
$ArgumentList="postinstall CONTENT_DIR=F:\WSUS"
Start-Process -FilePath $FilePath -ArgumentList $ArgumentList

 

Automating the WSUS synchronization could be added at this point as well.
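As a rough sketch (assuming the UpdateServices PowerShell module that comes with the WSUS role on Windows Server 2012 and later), configuring the sync source and kicking off a first synchronization could look like this:

# Sketch: sync from Microsoft Update and start the first synchronization
Set-WsusServerSynchronization -SyncFromMU
(Get-WsusServer).GetSubscription().StartSynchronization()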

 

Third step: Install Windows ADK 10

I need adksetup.exe to start installing it. It doesn't matter whether all the bits were downloaded previously or need to be downloaded while setup is running. Still pretty straightforward:

 

$FilePath="F:\Download\Win10ADK\adksetup.exe"
$ArgumentList1="/q /norestart "
$ArgumentList2="/installpath F:\ADK10 "
$ArgumentList3="/features OptionId.DeploymentTools OptionId.WindowsPreinstallationEnvironment OptionId.ImagingAndConfigurationDesigner OptionId.UserStateMigrationTool"

Start-Process -FilePath $FilePath -ArgumentList $ArgumentList1$ArgumentList2$ArgumentList3

Notice the spaces at the end of the variables $ArgumentList1 and $ArgumentList2.

 

Fourth step: Install MDT 2013 U1

The installation of MDT 2013 U1 consists of a single .msi file. Since I use the defaults it becomes pretty straightforward as well:

$FilePath="msiexec.exe"
$ArgumentList="/i F:\Download\MDT2013U1\MicrosoftDeploymentToolkit2013_x64.msi /qb"
Start-Process -FilePath $FilePath -ArgumentList $ArgumentList

 

NOTE: I use /qb intentionally to see something happening.

 

Before proceeding, the PowerShell module for MDT needs to be loaded:

$ModulePath="C:\Program Files\Microsoft Deployment Toolkit\Bin\MicrosoftDeploymentToolkit.psd1"
Import-Module $ModulePath

 

Fifth step: Create a Deployment Share and share it

This requires a little bit more work. The folder itself must be created before running the cmdlet that configures the Deployment Share:

$Name="DS001"
$PSProvider="MDTProvider"
$Root="F:\DeploymentShare"
$Description="MDT Deployment Share"
$NetworkPath="\\MDT01\DeploymentShare$"

New-Item $Root -type directory

New-PSDrive -Name $Name -PSProvider $PSProvider -Root $Root -Description $Description -NetworkPath $NetworkPath -Verbose | Add-MDTPersistentDrive -Verbose

$Type = 0
$objWMI = [wmiClass] 'Win32_Share'
$objWMI.create($Root, $Description, $Type)

 

Sixth step: Import an Operating System and create a Task Sequence for it

 

A little bit more work. I limited the script to a single Operating System; this can be repeated when more Operating Systems need to be imported. I chose not to use a .csv file for that.

 

$Path="DS001:\Operating Systems"
$SourcePath="F:\Download\OS\win8.1_ent_x64"
$DestinationFolder="Windows 8.1 Enterprise x64"

Import-MDTOperatingSystem -Path $Path -SourcePath $SourcePath -DestinationFolder $DestinationFolder

$Path="DS001:\Task Sequences"
$Name="Build Windows 8.1 Enterprise x64"
$Template="Client.xml"
$ID="1"
$OperatingSystemPath="DS001:\Operating Systems\Windows 8.1 Enterprise in Windows 8.1 Enterprise x64 install.wim"
$FullName="Windows User"
$Homepage="about:blank"

Import-MDTTaskSequence -Path $Path -Name $Name -Template $Template -ID $ID -OperatingSystemPath $OperatingSystemPath -FullName $FullName -OrgName $FullName -HomePage $Homepage

 

Seventh step: Import applications using a .csv as source

Import-Csv F:\Download\Apps\MDTApps.csv | % {

$Path="DS001:\Applications"

Import-MDTApplication -Path $Path -Enable "True" -Name $_.ApplicationName -ShortName $_.ApplicationName -Commandline $_.Commandline -WorkingDirectory ".\Applications\$($_.ApplicationName)" -ApplicationSourcePath $_.ApplicationSourcePath -DestinationFolder $_.ApplicationName

}

Here’s a sample .csv file used for the purpose of this script:

ApplicationName,CommandLine,ApplicationSourcePath
Visual C++ 2005 redist,VS80sp1-KB926601-X86-ENU.exe /Q,F:\Download\Apps\VC2005
Visual C++ 2008 redist,VS90sp1-KB945140-ENU.exe /Q,F:\Download\Apps\VC2008
Visual C++ 2010 redist,VS10sp1-KB983509.exe /Q,F:\Download\Apps\VC2010
Visual C++ 2012 redist x64,vcredist_x64.exe /Q,F:\Download\Apps\VC2012X64
Visual C++ 2012 redist x86,vcredist_x86.exe /Q,F:\Download\Apps\VC2012X86
Visual C++ 2013 redist x64,vcredist_x64.exe /Q,F:\Download\Apps\VC2013X64
Visual C++ 2013 redist x86,vcredist_x86.exe /Q,F:\Download\Apps\VC2013X86

 

Hmmm, looks like I need to add the Visual C++ 2015 redist and Office 2013. The benefit of this approach is that the .csv can be modified to match the list of applications that are going to be part of the reference image.

 

Eighth step: Edit Bootstrap.ini and Customsettings.ini

This step requires a bit more work. The locations of both files depend on step 5. To keep things easy, I clear all the text first before appending the settings to the file again. The result may look like this:

$File="F:\DeploymentShare\Control\Bootstrap.ini"
Clear-Content -Path $File

$Line="[Settings]" | Out-File $File -Append
$Line="Priority=Default" | Out-File $File -Append
$Line="" | Out-File $File -Append
$Line="[Default]" | Out-File $File -Append
$Line="DeployRoot=\\MDT01\DeploymentShare$" | Out-File $File -Append
$Line="UserDomain=DOMAIN9" | Out-File $File -Append
$Line="UserID=MDTAdmin" | Out-File $File -Append
$Line="UserPassword=P@ssw0rd" | Out-File $File -Append
$Line="SkipBDDWelcome=YES" | Out-File $File -Append

 

$File="F:\DeploymentShare\Control\CustomSettings.ini"
Clear-Content -Path $File

$Line="[Settings]" | Out-File $File -Append
$Line="Priority=Default" | Out-File $File -Append
$Line="" | Out-File $File -Append
$Line="[Default]" | Out-File $File -Append
$Line="_SMSTSORGNAME=DOMAIN9 Lab Deployment" | Out-File $File -Append
$Line="OSInstall=Y" | Out-File $File -Append
$Line="AdminPassword=P@ssw0rd" | Out-File $File -Append
$Line="TimeZoneName=Pacific Standard Time" | Out-File $File -Append
$Line="JoinWorkgroup=WORKGROUP" | Out-File $File -Append
$Line="FinishAction=SHUTDOWN" | Out-File $File -Append
$Line="DoNotCreateExtraPartition=YES" | Out-File $File -Append
$Line="WSUSServer=http://mdt01.domain9.local:8530" | Out-File $File -Append
$Line="ApplyGPOPack=NO" | Out-File $File -Append
$Line="SkipAdminPassword=YES" | Out-File $File -Append
$Line="SkipProductKey=YES" | Out-File $File -Append
$Line="SkipComputerName=YES" | Out-File $File -Append
$Line="SkipDomainMembership=YES" | Out-File $File -Append
$Line="SkipUserData=YES" | Out-File $File -Append
$Line="SkipLocaleSelection=YES" | Out-File $File -Append
$Line="SkipTaskSequence=NO" | Out-File $File -Append
$Line="SkipTimeZone=YES" | Out-File $File -Append
$Line="SkipApplications=YES" | Out-File $File -Append
$Line="SkipBitLocker=YES" | Out-File $File -Append
$Line="SkipSummary=YES" | Out-File $File -Append
$Line="SkipRoles=YES" | Out-File $File -Append
$Line="SkipCapture=NO" | Out-File $File -Append
$Line="SkipFinalSummary=YES" | Out-File $File -Append

 

Ninth step: Update the Deployment Share

The final step creates the boot images and processes everything to make sure it's all there; this step may take a while.

$Path="DS001:"
Update-MDTDeploymentShare -Path $Path

 

That's it. After running the script you can see it's all there. The only things missing are adding the applications to the Task Sequence (and the WSUS sync and update download). Alternatively, you can set the SkipApplications setting to NO and select the applications during the Lite Touch wizard.

To prevent this post from becoming too verbose: all that needs to be done is to copy/paste each part into a single .ps1 file and you're good to go. Feel free to use this in a completely isolated environment to prevent interfering with a production environment.

 

Thoughts on enabling Microsoft Antimalware extension on Azure virtual machines…

Recently, I was investigating in managing the Microsoft Antimalware extension on Azure virtual machines.

As we all know, the Microsoft Antimalware extension can be enabled when creating a new Azure virtual machine in the Azure portal. While the extension can be enabled there, only the default settings will be applied. This might work in most scenarios, but company policy may require customization, and that customization may even differ per server role or for desktops.

It became clear that the only way to customize the configuration is using Azure PowerShell.

NOTE: More customization is also possible in the ‘new’ portal available at http://portal.azure.com. At the time of writing, this portal is still in Preview, so it is not supported.

After checking out the cmdlet reference for Azure, I found the Set-AzureVMMicrosoftAntimalwareExtension cmdlet. More information on this cmdlet is available at https://msdn.microsoft.com/en-us/library/dn771716.aspx

After reading the article I noticed that .json files can be used to provision a configuration for the extension. This brings a new challenge: which configuration should go into the .json file for a specific server role?

If an existing System Center Configuration Manager 2012 or newer infrastructure is available and the Endpoint Protection Point is enabled and used, then either existing configurations or the Endpoint Protection templates can be used. The trick is to read a template and ‘translate’ it into a .json file.

I decided to use the Domain Controller template as a reference. After analyzing the template .xml file, the resulting .json may look like this:

{
  "AntimalwareEnabled": true,
  "RealtimeProtectionEnabled": true,
  "ScheduledScanSettings":
  {
    "isEnabled": true,
    "day": 7,
    "time": 120,
    "scanType": "Full"
  },
  "Exclusions":
  {
    "Extensions": ".pol;.chk;.edb;.sdb;.pat",
    "Paths": "%systemroot%\\NTDS\\Ntds.dit;%systemroot%\\NTDS\\EDB*.log;%systemroot%\\NTDS\\Edbres*.jrs;%systemroot%\\SYSVOL\\domain\\DO_NOT_REMOVE_NtFrs_PreInstall_Directory\\;%systemroot%\\SYSVOL\\staging;%systemroot%\\SYSVOL\\staging areas;%systemroot%\\SYSVOL\\sysvol",
    "Processes": ""
  }
}

Keep in mind though that using wildcards in the .json file is not recommended by Microsoft as stated in the cmdlet reference page for the Set-AzureVMMicrosoftAntimalwareExtension cmdlet.

This method allows administrators to create multiple .json files for specific server roles and specify them when enabling the extension.
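As a sketch of what that could look like (the service name, VM name and file path are placeholders, and the -AntimalwareConfigFile parameter name should be verified against the cmdlet reference linked above):

# Hypothetical example: apply a role-specific .json configuration to an existing VM
# The -AntimalwareConfigFile parameter name is assumed from the cmdlet reference; verify before use
$VM = Get-AzureVM -ServiceName 'MyCloudService' -Name 'DC01'
Set-AzureVMMicrosoftAntimalwareExtension -VM $VM -AntimalwareConfigFile 'C:\Configs\DomainController.json' | Update-AzureVM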

Feel free to use this method yourself. As always, try this out in a test environment or separate subscription used for testing purposes.

Hope this helps…

 

Using PowerShell for bulk Configuration Manager 2012 SP1 (or newer) Client installation…

You might be in a situation where the Configuration Manager Client needs to be installed on many machines, while automatic client deployment using Client Push is not allowed or not preferred for whatever technical/financial/political reason. Browsing the Console and selecting a lot of discovered machines can be a very frustrating exercise. It may also be extremely error prone, especially when you already know which machines need to have a Configuration Manager Client (they were most likely not deployed using OSD).

Fortunately, we can use PowerShell to have the Configuration Manager Client installed. What makes it even better is that a single cmdlet is needed to get the job done: Install-CMClient

You can find more info for the Install-CMClient cmdlet at the following location:

https://technet.microsoft.com/library/jj821865(v=sc.20).aspx

If you already know which machines need the Configuration Manager Client to be installed, you can put them in a .csv file. This allows you to create a PowerShell script that reads each object and runs the cmdlet for it. A script to get the job done might look like this:

#
# Script name: Install_Client_Bulk.ps1
#
# Purpose: Installs the ConfigMgr client on multiple machines, uses .csv as input
#
# Author: Marc Westerink
#
# Reference: http://technet.microsoft.com/library/jj821865(v=sc.20).aspx
#

#Create the required ‘global’ variables
$ConfigMgrModulePath="D:\Program Files\Microsoft Configuration Manager\AdminConsole\bin\ConfigurationManager.psd1"
$ConfigMgrSiteCode="P01:"
#Connecting to site
Import-Module $ConfigMgrModulePath
Set-Location $ConfigMgrSiteCode

#Initiate Client Installation(s)

Import-CSV E:\Install\CMHostName.csv | %{

$SiteCode="P01"

Install-CMClient -SiteCode $SiteCode -AlwaysInstallClient $True -IncludeDomainController $True -DeviceName $_.CMHostName

}
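For completeness, the input file only needs a single column whose header matches the property name the script reads; a hypothetical CMHostName.csv could look like this:

CMHostName
CLIENT01
CLIENT02
CLIENT03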

 

Running this script has one downside: since a lot of machines will be instructed to have the client installed, you will see a lot of entries in the ccm.log file, so monitoring progress might be challenging. I therefore recommend testing the script with only a few entries in the .csv file (as few as two) to verify everything is working as expected…

 
 