
Category Archives: Public Cloud

Enrolling lots of Windows 10 devices to Microsoft Intune, why bother?

Recently I’ve been involved in a few Microsoft Intune deployments.

These are standalone environments, so no hybrid scenario with System Center Configuration Manager. As we all know, Microsoft Intune can be purchased separately, but that’s something I wouldn’t recommend. The pricing models of Enterprise Mobility + Security (EM+S) or Microsoft 365 Enterprise (a.k.a. Secure Productive Enterprise) give you far more benefits for your money. Organizations that fail to see this put themselves at a disadvantage, because their competition does embrace this strategy. These subscriptions replace a lot of on-premises management tools, freeing administrators from their daily task of extinguishing fires…

Microsoft Intune is included in EM+S E3 and Microsoft 365 Enterprise E3 (and in both E5 subscriptions as well). Both subscriptions also include Azure Active Directory Premium P1, which is a requirement for the goal this post is about: making Windows 10 device enrollment really simple.

Following the guidelines at https://docs.microsoft.com/en-us/intune/windows-enroll allows organizations to deliver automatic enrollment for Windows 10 devices when Azure Active Directory Premium is enabled for a user who is assigned an EM+S or Microsoft 365 Enterprise license. All features are enabled by default, so it’s there as long as we don’t fiddle around with them…

So what does this actually mean?

Well, it means that each user who receives a Windows 10 device, preferably Enterprise edition, does the device enrollment for you during the OOBE phase of Windows 10. It doesn’t matter whether your organization has 5, 50, 500, 5,000 or more devices. How cool is that?

As long as all required licenses are in place, admins don’t need to bother with this at all…
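
To verify on a device that automatic enrollment actually took place, the built-in dsregcmd tool can be used from PowerShell. A minimal sketch; the exact output fields (such as AzureAdJoined and MdmUrl) differ per Windows 10 build, so treat the field names as an assumption:

# Show the Azure AD join state and the MDM (Intune) enrollment URLs for this device.
# Field names like 'AzureAdJoined' and 'MdmUrl' depend on the Windows 10 build.
dsregcmd /status | Select-String -Pattern 'AzureAdJoined|MdmUrl'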

 

 

Manage your Azure Bill part 2: the operating phase…

Customers who already use Microsoft Azure through one or more subscriptions may find it challenging to get insight into their Azure spending. Quite often customers ask me how to get that insight and how to see, in a presentable way, where the money goes. Fortunately, it’s pretty easy to answer this question, but the answer depends on the contract they have. Contracts fall into two categories:

  1. Enterprise Agreement (EA) contracts
  2. All other ones (Pay-as-you-go, CSP etc.)

Customers with EA contracts can use Power BI (Pro) to generate their reporting quite easily. Power BI Pro is available to all users with an Office 365 E5 license. The Azure Enterprise content pack is available from the Power BI portal (the picture is in Dutch, but you can do the math).

azure-ea-appsource

All other contract types can build their own environment using the Azure Usage and Billing Portal. Instructions on how to build it can be found at https://azure.microsoft.com/en-us/blog/announcing-the-release-of-the-azure-usage-and-billing-portal/. There are some catches, but it’s pretty easy to build; I got it running in my MSDN subscription without much trouble. Once the environment is up and running and the billing data is in the database, it can be queried and processed in any way the customer chooses.
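
For ad-hoc insight without building that environment, usage data can also be pulled directly with Azure PowerShell and processed further in Excel or Power BI. A minimal sketch, assuming the AzureRM modules are installed and you are logged in; the date range and grouping are purely illustrative:

# Pull daily usage records for November 2016 (illustrative range; large ranges may need paging via -ContinuationToken).
Login-AzureRmAccount
$Usage = Get-UsageAggregates -ReportedStartTime '2016-11-01' -ReportedEndTime '2016-12-01' -AggregationGranularity Daily -ShowDetails $true

# Summarize the consumed quantity per meter; actual costs follow from your rate card or contract.
$Usage.UsageAggregations | Select-Object -ExpandProperty Properties |
    Group-Object MeterName |
    Select-Object Name, @{Name='Quantity'; Expression={($_.Group | Measure-Object Quantity -Sum).Sum}}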

Alternatively, 3rd party vendors offer services to present the Azure spending but that’s for another day…

 

Posted on 31/12/2016 in Azure, Cloud, Public Cloud, Revenue

 

Building a Storage Spaces Direct (S2D) cluster and succeed…

As described in my previous post, I failed miserably building an S2D cluster. Fortunately, it was just a small matter of reading this whitepaper properly, which states that only local storage can be used. We all know iSCSI storage is not locally attached, so it makes perfect sense that it doesn’t work. But at least I tested and verified it…

OK, so knowing that S2D works with DAS storage only, it’s time to test and verify whether it’s difficult to build an S2D cluster.

To build the cluster, I’m following this guide. I use two FS1 Azure VMs and attach one P10 disk to each node.

So I follow the steps to build the cluster.

The first step is to enable S2D, which works fine.

s2d-with-das

NOTE: as mentioned in my previous post, the CacheMode parameter no longer exists. Since it is still shown in the guide, it may be a bit confusing to read.
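
For reference, the enable step in PowerShell boils down to something like this. A minimal sketch, assuming a cluster built from Azure VMs without cache devices (the -CacheState value reflects that assumption):

# Enable Storage Spaces Direct on the existing cluster.
# Azure VMs have no dedicated cache devices, so the cache is disabled here.
Enable-ClusterStorageSpacesDirect -CacheState Disabled -Confirm:$false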

The next step is creating a Storage Pool for S2D.

s2d-storage-pool-2-disk-fail

Hmm, that’s odd. Apparently two disks are insufficient, so let’s add two more, one per node, which results in four disks in total.

s2d-storage-4-disk-success

OK, so I can continue and build an S2D cluster disk of 250 GB; the PowerShell equivalent is sketched below the screenshot.

s2d-virtualdisk
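
The storage pool and virtual disk steps from the guide come down to roughly the following. A minimal sketch with illustrative friendly names; the 250 GB size matches the disk created above:

# Create the S2D storage pool from all disks that are eligible for pooling (four in this case).
New-StoragePool -StorageSubSystemFriendlyName '*Cluster*' -FriendlyName S2DPool -PhysicalDisks (Get-PhysicalDisk -CanPool $true)

# Carve a 250 GB CSV volume out of the pool.
New-Volume -StoragePoolFriendlyName S2DPool -FriendlyName VDisk01 -FileSystem CSVFS_ReFS -Size 250GB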

The final step is creating a share according to the guide.

smb-share-fail

Hmmm, this fails too…

Well, I was able to create the share using the Failover Cluster Manager console by configuring the role as a Scale-Out File Server (SOFS) and providing a ‘Quick’ file share.
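
The PowerShell equivalent of what I did in the console is roughly the following. A minimal sketch with illustrative names, paths and permissions:

# Add the Scale-Out File Server role to the cluster and publish a share on the CSV volume.
Add-ClusterScaleOutFileServerRole -Name SOFS01
New-Item -Path C:\ClusterStorage\Volume1\Data -ItemType Directory
New-SmbShare -Name Share01 -Path C:\ClusterStorage\Volume1\Data -FullAccess 'Everyone'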

So yeah, it’s relatively easy to build an S2D cluster, but some steps in the overview need to be reviewed: it still contains mistakes…

 

Microsoft Azure: ONE feature I REALLY need…

It’s been a while since my previous blog post. The main reason is that I’ve been very busy, both personally and professionally. I’ve been helping customers adopt the Public Cloud more frequently and I must admit it’s a lot of fun.

During these conversations I focus a lot on architecting Azure solutions (especially IaaS). Many times I get a feedback question that goes something like this:

“Yeah, it’s very nice, this Microsoft Azure and all that other stuff, but how much does it cost?”

Quickly followed by:

“Why is Microsoft Azure so expensive?”

Providing an answer to the first question is quite challenging, because Microsoft only provides the Azure Pricing Calculator (available at https://azure.microsoft.com/en-us/pricing/calculator/). It allows me to provide an estimate of how much it will cost an organisation to use Azure services. But it remains an estimate, and that’s problematic because I cannot really use it for a TCO calculation. TCO is what a CFO looks at, and he or she wants the TCO to be as low as possible. All I could find was an old post at https://azure.microsoft.com/en-us/blog/windows-azure-platform-tco-and-roi-calculator-now-available-online-and-offline/, but the tools referenced there are no longer available.

I need a total overview in order to provide an honest and accurate calculation, since most organisations want to mirror their on-premises costs against Microsoft Azure’s. Here’s a, most certainly incomplete, list of costs IT organizations incur:

  • Hardware purchasing
  • Licensing
  • Labour
  • Housing
  • Energy
  • 3rd party Support Plans

The funny part is that many organizations have no idea which of these costs they have, and if they do, they rarely take them into the equation. This behaviour automatically leads to the second question being asked. I’d like to see Microsoft deliver a tool that allows me to fill in these variables. Microsoft’s biggest competitor, AWS, has such a tool.

Sounds like quite a rant to Microsoft, right?

Well, what really works in their defence is that the Azure Pricing Calculator does help organizations arrive at an estimate. Still, some common sense is required when using Azure services. Things that need to be taken into consideration are:

  • Uptime: if a service is not needed at given times, turn it off and pay only for what is actively used
  • Automation: when those times are predictable, e.g. office hours, schedule the start and stop actions using Azure Automation (see the sketch after this list)
  • Workload: if your workload demand fluctuates strongly, you don’t want to buy the hardware required to facilitate a few peaks
  • Evolution: do you really need to build a VM with IIS when the web application can run on an Azure Web App? It makes sense to evolve on-premises or IaaS services to PaaS services and no longer be bothered with managing the fabric layer, the operating system or even the application
  • Evolution part 2: consider replacing (legacy) applications with SaaS services so you don’t manage those either
  • Initial investments: no initial investments are required when using Azure cloud services. You don’t need to have a budget ready to buy hardware. Think about the shorter ‘time to market’
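
As an illustration of the Automation bullet, a schedule-driven runbook can switch VMs on and off. A minimal sketch, assuming an Azure Automation account with the AzureRM modules and a Run As connection named 'AzureRunAsConnection'; all names are illustrative:

# Runbook parameters: which VM to act on and whether to start or stop it.
param(
    [Parameter(Mandatory=$true)][string]$ResourceGroupName,
    [Parameter(Mandatory=$true)][string]$VMName,
    [Parameter(Mandatory=$true)][ValidateSet('Start','Stop')][string]$Action
)

# Authenticate using the Automation Run As account.
$Connection = Get-AutomationConnection -Name 'AzureRunAsConnection'
Add-AzureRmAccount -ServicePrincipal -TenantId $Connection.TenantId -ApplicationId $Connection.ApplicationId -CertificateThumbprint $Connection.CertificateThumbprint

# Start or stop the VM depending on the schedule that triggered this runbook.
if ($Action -eq 'Start') {
    Start-AzureRmVM -ResourceGroupName $ResourceGroupName -Name $VMName
}
else {
    Stop-AzureRmVM -ResourceGroupName $ResourceGroupName -Name $VMName -Force
}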

If you look at it like this, adopting cloud services may not be so expensive after all.

Additionally, looking at costs alone can create tunnel vision. Many times a small increase in costs greatly increases the benefits of adopting Azure cloud services, and I’d certainly recommend it in most cases. The only case in which I wouldn’t recommend it is a workload with almost no fluctuation.

Nevertheless, it would be nice if Microsoft provided such a tool, or if someone could tell me where to find it if it already exists 🙂

Posted on 26/10/2016 in Azure, Cloud, Public Cloud, Rant

 

Backing up Azure Storage Accounts…

New year, new challenges. And I was confronted with quite a nice one.

One of my customers uses Azure Storage quite intensively. While Azure Storage Accounts provide some protection by means of replication, there’s no real protection against corruption or deletion inside the Storage Account itself: data that is deleted is deleted on the replicas as well. Unfortunately, there’s no replication mechanism comparable to DFS Replication for replicating data between Storage Accounts. Governance constraints may also prevent the use of geo-redundant storage, and geo-redundant storage cannot guarantee that the secondary location still holds the data as it was before it became corrupted or deleted.

So a mechanism must be developed to protect the data from a potential disaster. Only Blob, File and Table Storage are valid candidates for protection. Microsoft has released a tool that allows content to be copied to a different Storage Account (including from and to local disk): AzCopy.

The relevant information regarding AzCopy is available at https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/

AzCopy is a command-line tool that copies content, but it is very static by nature. That’s fine for a few single Blob containers or Table URIs, but a more versatile approach is required when hundreds of Blob containers and Table URIs need to be copied, especially when they change frequently.

Fortunately, I was able to use PowerShell to ‘feed’ AzCopy the required parameters and get the job done. The resulting script focuses on Blob and Table Storage only, but it can easily be extended to File Storage. For the sake of this blog post, Access Keys are used (the secondary ones); switching to SAS tokens requires only minor modifications.

Before proceeding, the workflow needs to be defined:

  • Provide the source and destination Storage Accounts; the destination Storage Account will store everything as blobs, meaning that Tables will be exported to a .MANIFEST and .JSON file (to allow the table to be recreated);
  • Retrieve the list of Blob containers and ‘convert’ them to URLs;
  • Copy each Blob container to the destination Storage Account; all containers will be stored as virtual directories in a single container to maintain the structure;
  • Retrieve the list of Table URIs;
  • Export each URI to a .MANIFEST and .JSON file and store them as blobs in the destination Storage Account;
  • Log all copy actions.

The benefit of AzCopy is that it essentially tells the Azure platform to perform the copy; no heavy processing is done by AzCopy itself. This allows the script to run on a modest Azure VM: a Standard_A1 is sufficient, although additional disks are highly recommended when large Tables are involved.

Unfortunately, parallel processing with AzCopy.exe is not possible, so everything must be done sequentially. Alternatively, multiple VMs can be spun up, each backing up its own set of Storage Accounts. This is highly recommended because backing up Storage Accounts can be very time-consuming, especially with large sets of small files or very large tables. Additionally, some small text files are used to store the items retrieved; they also make it easy to use ForEach loops.

To repeat the actions for multiple Storage Accounts, a .csv file is used as input. The .csv file may look like this:

SourceStorageName,SourceStorageKey,DestinationStorageName,DestinationStorageKey
source1,sourcekey1,destination1,destinationkey1
source2,sourcekey2,destination2,destinationkey2

The actual script uses a lot of variables. AzCopy is called using the Start-Process cmdlet, while the parameters for AzCopy.exe are assembled in a long string that is passed to the -ArgumentList parameter.

So here’s the script I used to achieve the goal of backing up Blob containers and Table URIs:

#
# Name: Copy_Storage_Account_AzCopy.ps1
#
# Author: Marc Westerink
#
# Version: 1.0
#
# Purpose: This script copies Blob and Table Storage from a source Storage Account to a destination Storage Account.
# File F:\Input\Storage_Accounts.csv contains all source and destination Storage Accounts.
# All Blob containers and Tables will be retrieved and processed sequentially.
# All content will be copied as blobs to a container named after the source Storage Account. A virtual directory will be created for each blob container.
#
# Requirements:
# - Storage Accounts with secondary Access Keys
# - AzCopy needs to be installed
# - Azure PowerShell needs to be installed
# - An additional disk to store temporary files (i.e. the F:\ drive)
# - A temp folder (i.e. F:\Temp) with two text files 'temp.txt' and 'output.txt'. The temp folder will be used by AzCopy.
# - A folder to store all log files (i.e. F:\Logs)
#

# First, let's create the required global variables

# Get the date the script is being run
$Date = Get-Date -Format 'dd-MM-yyyy'

# AzCopy path
$FilePath = 'C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe'

# Temp files, let's make sure they're cleared before starting
$File1 = 'F:\Temp\temp.txt'
Clear-Content -Path $File1

$File2 = 'F:\Temp\output.txt'
Clear-Content -Path $File2

# Recursive parameter: DO NOT use for Table Storage
$Recursive = '/S'

# Suppress prompt popups
$Prompt = '/Y'

# Temporary directory for AzCopy
$TempDir = 'F:\Temp'

# SplitSize parameter for Tables, this will split a large table into separate .JSON files of 1024 MB
$SplitSize = '/SplitSize:1024'

# DestType parameter, required for copying tables as blobs to the destination
$DestType = '/DestType:Blob'

# Temporary directory for AzCopy journal files
$Journal = '/Z:F:\Temp'

# Blob endpoint suffix
$Blob = '.blob.core.windows.net/'

# https prefix
$HTTPS = 'https://'

# Let's import the CSV and process all Storage Accounts
Import-Csv F:\Input\Storage_Accounts.csv | % {

    # Create the full path of the source Storage Account blob endpoint
    $SourceStoragePath = $HTTPS + $_.SourceStorageName + $Blob

    # Create the full path of the destination Storage container; if it doesn't exist it will be created
    $DestStorageContainer = $HTTPS + $_.DestinationStorageName + $Blob + $_.SourceStorageName + $Date

    # Gather the source Access Key
    $SourceStorageKey = $_.SourceStorageKey

    # Gather the destination Access Key
    $DestinationStorageKey = $_.DestinationStorageKey

    # Define the log file for verbose logging, named after the source Storage Account and the date
    $Verbose = '/V:F:\Logs\' + $_.SourceStorageName + $Date + '.log'

    # Create the Azure Storage context to gather all Blobs and Tables
    $Context = New-AzureStorageContext -StorageAccountName $_.SourceStorageName -StorageAccountKey $_.SourceStorageKey

    # Copy blob containers first

    # Get all containers
    Get-AzureStorageContainer -Context $Context | Select Name | % {
        Add-Content -Path $File1 -Value $_.Name
    }

    # Convert all container names to full paths and write them to the output file
    Get-Content $File1 | % {
        Add-Content -Path $File2 -Value $SourceStoragePath$_
    }

    # Process all containers using the output file as input
    Get-Content $File2 | % {

        # Derive the virtual directory name from the container name
        $VirtualDirectory = $_ -replace $SourceStoragePath, ''

        $ArgumentList = '/Source:' + $_ + ' ' + '/Dest:' + $DestStorageContainer + '/' + $VirtualDirectory + ' ' + '/SourceKey:' + $SourceStorageKey + ' ' + '/DestKey:' + $DestinationStorageKey + ' ' + $Recursive + ' ' + $Verbose + ' ' + $Prompt + ' ' + $Journal
        Start-Process -FilePath $FilePath -ArgumentList $ArgumentList -Wait
    }

    # Before proceeding, let's clean up all files used
    Clear-Content -Path $File1
    Clear-Content -Path $File2

    # Get all Tables
    Get-AzureStorageTable -Context $Context | Select Uri | % {
        Add-Content -Path $File2 -Value $_.Uri
    }

    # Process all Tables using the output file as input
    Get-Content $File2 | % {
        $ArgumentList = '/Source:' + $_ + ' ' + '/Dest:' + $DestStorageContainer + ' ' + '/SourceKey:' + $SourceStorageKey + ' ' + '/DestKey:' + $DestinationStorageKey + ' ' + $SplitSize + ' ' + $Verbose + ' ' + $Prompt + ' ' + $DestType + ' ' + $Journal
        Start-Process -FilePath $FilePath -ArgumentList $ArgumentList -Wait
    }

    # Clean up the output file
    Clear-Content -Path $File2
}

To run this script on a schedule, a simple Scheduled Task can be created. The schedule itself depends on the environment’s needs and the time it takes to copy everything. It’s not uncommon for a large Storage Account with countless small blobs and huge tables to take a week or even longer to copy…
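
For completeness, registering such a task could look like this. A minimal sketch with illustrative paths, schedule and account; adjust them to whatever has access to the script and the F:\ drive:

# Run the backup script every Saturday at 01:00 under the SYSTEM account (illustrative choices).
$Action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File F:\Scripts\Copy_Storage_Account_AzCopy.ps1'
$Trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Saturday -At '01:00'
Register-ScheduledTask -TaskName 'Backup-StorageAccounts' -Action $Action -Trigger $Trigger -User 'SYSTEM' -RunLevel Highest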

 

 

 

Posted on 02/02/2016 in Azure, PowerShell, Public Cloud

 

Looking back at 2015…

So, the year 2015 is almost at its end. While I write this, I am already in the second week of my two-week time off. And boy, I really needed this two-week break.

2015 was an extremely busy year for me, and it can really be split into two halves.

In the first half, I was still busy on a project in which I designed and deployed System Center 2012 R2 Configuration Manager. I also built a stand-alone image-building environment running MDT 2013. Unfortunately, the project took far longer than expected because the customer was unable to take ownership and start administering the environment themselves. Eventually I decided to walk away after the contractual end date of my involvement, even though the project wasn’t finished yet. The longer it took, the more frustrating the project became for me, so walking away was ultimately the right decision.

This takes me to the second half, in which I saw a dramatic shift in my job: I did only one Configuration Manager design and deployment in the second half of 2015. Instead, I started to extend my Enterprise Client Management skillset with Microsoft Intune and Microsoft’s public cloud platform: Azure.

I also started to deliver more workshops, master classes and training sessions. This is something I really like to do, and I want to thank those who made it possible for me. It also allowed me to renew my Microsoft Certified Trainer certification.

Fortunately, the frustrations of the first half provided some learning moments and pushed me to become a more complete consultant. My coworker arranged a two-day training session for me called “Professional Recommending” (perhaps a poor translation of Professioneel Adviseren in Dutch), provided by Yearth. This is by far the most important training I have received in my career, and it started to pay off quickly through more positive feedback from customers.

I was also happy to do the presentation workshop with Monique Kerssens and Jinxiu Hu from Niqué Consultancy BV at ExpertsLive 2015, and to receive the feedback that my presentation skills have developed greatly. To quote them: “you’re standing like a house” (a Dutch expression for being rock-solid).

The icing on the cake came at the end of this year when I was asked to review the DataON CiB-9224 platform. You can read the review in my previous post.

So, I experienced some highs and lows this year. Fortunately, the highs came in the second half.

I look forward to 2016, but that’s for another post…

 

 

Backup to Azure Vault revisited…

Recently, Microsoft announced the general availability of Backup of Azure IaaS VMs. The announcement is available here: http://azure.microsoft.com/en-us/blog/general-availability-of-backup-for-azure-iaas-vms/

This general availability means the backup mechanism for Azure virtual machines has changed. Previously, Azure VMs were treated the same way as on-premises machines, using an agent that facilitates backup. While that method works fine, the ‘new’ mechanism makes protecting Azure VMs easier.

To protect Azure VMs using an Azure Backup Vault, the following workflow can be used:

  1. Create a Backup Vault (if not used already)
  2. Register Azure VMs by discovering and selecting them for protection
  3. Setup a Backup Policy
  4. Start protecting

Keep in mind that each Backup Vault can store 54.4 TB of backup content.

This post describes the steps to configure Backup of Azure VMs.

  1. Create a Backup Vault
    create_vault
    Pretty self-explanatory if you ask me.
  2. Register Azure VMs and Setup a backup policy
    Once the Backup Vault has been created, Azure VMs can be registered for protection.
    register_items
    Select Register.
    select_items
    Select the machines that need to be registered for protection and press the OK button. Then press the Protect button in the portal to proceed.
    select_items_protect

    Press the Next button.
    configure_protection
    Here a protection policy can be configured. In this case a daily backup is scheduled at 8 pm.
    retention_range
    The final step is defining the retention range. Notice that these settings are the same as in the Azure Recovery Services Agent. In this scenario only a daily backup is chosen. The configuration is completed after pressing the OK button.

  3. Start protecting
    An initial backup can be started by using the Backup Now button.
    backup_now
    Its progress can be monitored on the Jobs tab. This tab can also be used to see how long it takes for a machine to be backed up.

To restore a virtual machine, a new virtual machine can be created from any backup set available…

This greatly simplifies protecting Azure VMs…

 

Posted on 18/09/2015 in Azure, Cloud, Public Cloud

 
 