Custom PowerShell Reboot GUI

When deploying software via SCCM, I thought it would be nice to have greater flexibility around system reboot prompts for the end user. Sure, you can enable a maintenance window and push your software out during that time, but we have at times been caught out when a software push was needed during business hours.

So I came up with this PowerShell script, which you can run as part of a task sequence when deploying emergency/unscheduled software installs. The script generates a GUI which presents the end user with three options:

  1. Restart the computer
  2. Schedule a restart (note in here I have hard-coded this for 6pm)
  3. Cancel the restart

The script also starts a countdown timer which automatically restarts the computer after 3 minutes if no user interaction occurs.


Example Script Use – SCCM TS

In the below example we are going to create a Package in SCCM which contains the script file. You will also need to include two exe files from MDT which allow you to run the script in interactive mode.

Locate ServiceUI.exe and TSProgressUI.exe (picking the x86 or x64 version where applicable) and add these into your package source. You should have something that looks like this:


Now add a Run Command Line entry into your TS and use the following command line:

ServiceUI.exe -process:TSProgressUI.exe %SYSTEMROOT%\System32\WindowsPowerShell\v1.0\powershell.exe -NoProfile -WindowStyle Hidden -ExecutionPolicy Bypass -File CustomRestart.ps1


When the Task Sequence is run, you should now have the restart prompt appear;


Script Source

<#
	.NOTES
		Code generated by: SAPIEN Technologies, Inc., PowerShell Studio 2016 v5.2.128
		Generated on: 04/10/2016 10:13
		Generated by: Maurice.Daly
	.DESCRIPTION
		Provides a reboot prompt which counts down from 3 minutes and allows the
		end user to schedule or cancel the reboot.
#>

#region Import Assemblies
[void][Reflection.Assembly]::LoadWithPartialName('System.Windows.Forms')
[void][Reflection.Assembly]::LoadWithPartialName('System.Data')
[void][Reflection.Assembly]::LoadWithPartialName('System.Drawing')
#endregion Import Assemblies

#Define a Param block to use custom parameters in the project
#Param ($CustomParameter)

#region Source: Startup.pss
function Main {
<#
	.SYNOPSIS
		The Main function starts the project application.

	.PARAMETER Commandline
		$Commandline contains the complete argument string passed to the script packager executable.

	.NOTES
		Use this function to initialize your script and to call GUI forms.

		To get the console output in the Packager (Forms Engine) use:
		$ConsoleOutput (Type: System.Collections.ArrayList)
#>
	Param ([String]$Commandline)

	#TODO: Add initialization script here (Load modules and check requirements)

	if ((Call-MainForm_psf) -eq 'OK')
	{
	}

	$global:ExitCode = 0 #Set the exit code for the Packager
}
#endregion Source: Startup.pss

#region Source: MainForm.psf
function Call-MainForm_psf
{
	#region Import the Assemblies
	[void][Reflection.Assembly]::LoadWithPartialName('System.Windows.Forms')
	[void][Reflection.Assembly]::LoadWithPartialName('System.Data')
	[void][Reflection.Assembly]::LoadWithPartialName('System.Drawing')
	#endregion Import Assemblies

 #region Generated Form Objects
 $MainForm = New-Object 'System.Windows.Forms.Form'
 $panel2 = New-Object 'System.Windows.Forms.Panel'
 $ButtonCancel = New-Object 'System.Windows.Forms.Button'
 $ButtonSchedule = New-Object 'System.Windows.Forms.Button'
 $ButtonRestartNow = New-Object 'System.Windows.Forms.Button'
 $panel1 = New-Object 'System.Windows.Forms.Panel'
 $labelITSystemsMaintenance = New-Object 'System.Windows.Forms.Label'
 $labelSecondsLeftToRestart = New-Object 'System.Windows.Forms.Label'
 $labelTime = New-Object 'System.Windows.Forms.Label'
 $labelInOrderToApplySecuri = New-Object 'System.Windows.Forms.Label'
 $timerUpdate = New-Object 'System.Windows.Forms.Timer'
 $InitialFormWindowState = New-Object 'System.Windows.Forms.FormWindowState'
 #endregion Generated Form Objects

	# User Generated Script
	$TotalTime = 180 #in seconds

	$FormEvent_Load = {
		#Initialize the countdown display
		$labelTime.Text = "{0:D2}" -f $TotalTime
		#Add TotalTime to current time
		$script:StartTime = (Get-Date).AddSeconds($TotalTime)
		#Start the timer
		$timerUpdate.Start()
	}

	$timerUpdate_Tick = {
		# Define countdown timer
		[TimeSpan]$span = $script:StartTime - (Get-Date)
		#Update the display
		$labelTime.Text = "{0:N0}" -f $span.TotalSeconds
		if ($span.TotalSeconds -le 0)
		{
			$timerUpdate.Stop()
			Restart-Computer -Force
		}
	}

	$ButtonRestartNow_Click = {
		# Restart the computer immediately
		Restart-Computer -Force
	}

	$ButtonSchedule_Click = {
		# Schedule restart for 6pm
		schtasks /create /sc once /tn "Post Maintenance Restart" /tr "shutdown /r /f" /st 18:00 /f
		$MainForm.Close()
	}

	$ButtonCancel_Click = {
		# Cancel the restart and close the prompt
		$timerUpdate.Stop()
		$MainForm.Close()
	}
 # --End User Generated Script--
	#region Generated Events
	$Form_StateCorrection_Load =
	{
		#Correct the initial state of the form to prevent the .Net maximized form issue
		$MainForm.WindowState = $InitialFormWindowState
	}

	$Form_Cleanup_FormClosed =
	{
		#Remove all event handlers from the controls
		try
		{
			$MainForm.remove_Load($Form_StateCorrection_Load)
			$MainForm.remove_FormClosed($Form_Cleanup_FormClosed)
		}
		catch [Exception]
		{ }
	}
	#endregion Generated Events

 #region Generated Form Code
 # MainForm
 $MainForm.AutoScaleDimensions = '6, 13'
 $MainForm.AutoScaleMode = 'Font'
 $MainForm.BackColor = 'White'
 $MainForm.ClientSize = '373, 279'
 $MainForm.MaximizeBox = $False
 $MainForm.MinimizeBox = $False
 $MainForm.Name = 'MainForm'
 $MainForm.ShowIcon = $False
 $MainForm.ShowInTaskbar = $False
 $MainForm.StartPosition = 'CenterScreen'
 $MainForm.Text = 'Systems Maintenance'
 $MainForm.TopMost = $True
 # panel2
 $panel2.BackColor = 'ScrollBar'
 $panel2.Location = '0, 205'
 $panel2.Name = 'panel2'
 $panel2.Size = '378, 80'
 $panel2.TabIndex = 9
 # ButtonCancel
 $ButtonCancel.Location = '250, 17'
 $ButtonCancel.Name = 'ButtonCancel'
 $ButtonCancel.Size = '77, 45'
 $ButtonCancel.TabIndex = 7
 $ButtonCancel.Text = 'Cancel'
 $ButtonCancel.UseVisualStyleBackColor = $True
 # ButtonSchedule
 $ButtonSchedule.Font = 'Microsoft Sans Serif, 8.25pt, style=Bold'
 $ButtonSchedule.Location = '139, 17'
 $ButtonSchedule.Name = 'ButtonSchedule'
 $ButtonSchedule.Size = '105, 45'
 $ButtonSchedule.TabIndex = 6
 $ButtonSchedule.Text = 'Schedule - 6pm'
 $ButtonSchedule.UseVisualStyleBackColor = $True
 # ButtonRestartNow
 $ButtonRestartNow.Font = 'Microsoft Sans Serif, 8.25pt, style=Bold'
 $ButtonRestartNow.ForeColor = 'DarkRed'
 $ButtonRestartNow.Location = '42, 17'
 $ButtonRestartNow.Name = 'ButtonRestartNow'
 $ButtonRestartNow.Size = '91, 45'
 $ButtonRestartNow.TabIndex = 0
 $ButtonRestartNow.Text = 'Restart Now'
 $ButtonRestartNow.UseVisualStyleBackColor = $True
 # panel1
 $panel1.BackColor = '0, 114, 198'
 $panel1.Location = '0, 0'
 $panel1.Name = 'panel1'
 $panel1.Size = '375, 67'
 $panel1.TabIndex = 8
 # labelITSystemsMaintenance
 $labelITSystemsMaintenance.Font = 'Microsoft Sans Serif, 14.25pt'
 $labelITSystemsMaintenance.ForeColor = 'White'
 $labelITSystemsMaintenance.Location = '11, 18'
 $labelITSystemsMaintenance.Name = 'labelITSystemsMaintenance'
 $labelITSystemsMaintenance.Size = '269, 23'
 $labelITSystemsMaintenance.TabIndex = 1
 $labelITSystemsMaintenance.Text = 'IT Systems Maintenance'
 $labelITSystemsMaintenance.TextAlign = 'MiddleLeft'
 # labelSecondsLeftToRestart
 $labelSecondsLeftToRestart.AutoSize = $True
 $labelSecondsLeftToRestart.Font = 'Microsoft Sans Serif, 9pt, style=Bold'
 $labelSecondsLeftToRestart.Location = '87, 176'
 $labelSecondsLeftToRestart.Name = 'labelSecondsLeftToRestart'
 $labelSecondsLeftToRestart.Size = '155, 15'
 $labelSecondsLeftToRestart.TabIndex = 5
 $labelSecondsLeftToRestart.Text = 'Seconds left to restart :'
 # labelTime
 $labelTime.AutoSize = $True
 $labelTime.Font = 'Microsoft Sans Serif, 9pt, style=Bold'
 $labelTime.ForeColor = '192, 0, 0'
 $labelTime.Location = '237, 176'
 $labelTime.Name = 'labelTime'
 $labelTime.Size = '43, 15'
 $labelTime.TabIndex = 3
 $labelTime.Text = '00:60'
 $labelTime.TextAlign = 'MiddleCenter'
 # labelInOrderToApplySecuri
 $labelInOrderToApplySecuri.Font = 'Microsoft Sans Serif, 9pt'
 $labelInOrderToApplySecuri.Location = '12, 84'
 $labelInOrderToApplySecuri.Name = 'labelInOrderToApplySecuri'
 $labelInOrderToApplySecuri.Size = '350, 83'
 $labelInOrderToApplySecuri.TabIndex = 2
	$labelInOrderToApplySecuri.Text = 'In order to apply security patches and updates for your system, your machine must be restarted. 

If you do not wish to restart your computer at this time please click on the cancel button below.'
	# timerUpdate
	$timerUpdate.Interval = 1000
	#endregion Generated Form Code


	#Add the controls to the form and wire up the events
	$panel2.Controls.Add($ButtonCancel)
	$panel2.Controls.Add($ButtonSchedule)
	$panel2.Controls.Add($ButtonRestartNow)
	$panel1.Controls.Add($labelITSystemsMaintenance)
	$MainForm.Controls.Add($panel2)
	$MainForm.Controls.Add($panel1)
	$MainForm.Controls.Add($labelSecondsLeftToRestart)
	$MainForm.Controls.Add($labelTime)
	$MainForm.Controls.Add($labelInOrderToApplySecuri)
	$ButtonRestartNow.add_Click($ButtonRestartNow_Click)
	$ButtonSchedule.add_Click($ButtonSchedule_Click)
	$ButtonCancel.add_Click($ButtonCancel_Click)
	$timerUpdate.add_Tick($timerUpdate_Tick)

	#Save the initial state of the form
	$InitialFormWindowState = $MainForm.WindowState
	#Init the OnLoad event to correct the initial state of the form
	$MainForm.add_Load($FormEvent_Load)
	$MainForm.add_Load($Form_StateCorrection_Load)
	#Clean up the control events
	$MainForm.add_FormClosed($Form_Cleanup_FormClosed)
	#Show the Form
	return $MainForm.ShowDialog()
}

#endregion Source: MainForm.psf

#Start the application
Main ($CommandLine)

Download Link
The script is available to download from:

Powershell – Check Client Machine Uptime

Want to check the last time all of your client machines in a particular OU booted? Well, here is a nice little two-liner to do so.

$Computers = (Get-ADComputer -SearchBase "OU=OUTARGET,DC=YOURDOMAIN,DC=YOURDOMAIN" -Filter *).Name
Get-CimInstance -ComputerName $Computers -ClassName Win32_OperatingSystem -ErrorAction SilentlyContinue | Select-Object PSComputerName, LastBootUpTime, Description | Sort-Object -Property LastBootUpTime -Descending | Out-GridView
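If you would rather see uptime as a duration than a raw boot timestamp, the same CIM data can be reshaped with a calculated property. A minimal sketch (the OU path is a placeholder, as above):

```powershell
# List machines by uptime in days, longest-running first
$Computers = (Get-ADComputer -SearchBase "OU=OUTARGET,DC=YOURDOMAIN,DC=YOURDOMAIN" -Filter *).Name
Get-CimInstance -ComputerName $Computers -ClassName Win32_OperatingSystem -ErrorAction SilentlyContinue |
    Select-Object PSComputerName,
                  @{ Name = 'UptimeDays'; Expression = { [math]::Round(((Get-Date) - $_.LastBootUpTime).TotalDays, 1) } } |
    Sort-Object -Property UptimeDays -Descending |
    Out-GridView
```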

Office365 – Listing all members of both static and dynamic distribution groups

Here is a nice little script that connects to your Office 365 environment, reads the contents of all distribution groups, both static and dynamic, and exports the contents into a CSV file, thus allowing you to apply filters etc. in Excel.

<#
	Created with:	SAPIEN Technologies, Inc., PowerShell Studio 2015 v4.2.85
	Created on:		12/06/2015 9:47 a.m.
	Created by:		Maurice Daly
	Filename:		GetDistributionGroupMembers.ps1
	.DESCRIPTION
		List all members of all static and dynamic distribution groups from your
		Office 365 portal and export the contents into a CSV.
#>

$UserCredential = Get-Credential
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session

$DistributionGroups = Get-DistributionGroup
$DynDistributionGroups = Get-DynamicDistributionGroup

$FilePath = "C:\DistributionGroupMembers.csv"

# Read Static Distribution Groups
foreach ($DistributionGroup in $DistributionGroups) {
    Get-DistributionGroupMember $DistributionGroup.PrimarySMTPAddress | Sort-Object Name | Select-Object @{ Label = "Distribution Group"; Expression = { $DistributionGroup.Name } }, Name | Export-Csv -Path $FilePath -Delimiter ";" -NoTypeInformation -Append -Force
}

# Read Dynamic Distribution Groups
foreach ($DynDistributionGroup in $DynDistributionGroups) {
    Get-Recipient -RecipientPreviewFilter $DynDistributionGroup.RecipientFilter | Sort-Object Name | Select-Object @{ Label = "Distribution Group"; Expression = { $DynDistributionGroup.Name } }, Name | Export-Csv -Path $FilePath -Delimiter ";" -NoTypeInformation -Append -Force
}

# Close Remote PS Session
Get-PSSession | Remove-PSSession
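As the export uses a semicolon delimiter, remember to pass the same delimiter back to Import-Csv if you would rather filter the results in PowerShell than Excel. A quick sketch ('All Staff' is just an example group name):

```powershell
# List the members of one particular group from the exported CSV
Import-Csv -Path "C:\DistributionGroupMembers.csv" -Delimiter ";" |
    Where-Object { $_.'Distribution Group' -eq 'All Staff' } |
    Select-Object Name
```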

Citrix XenApp & Office 2016 – AutoCorrect Entries Disappearing

I recently came across an issue with Office 2016 and Citrix XenApp whereby a user's Word autocorrect entries would be wiped intermittently during a live session. After a quick search I found that Microsoft's product support have identified this as a known issue and have provided a workaround.

The issue here is that the workaround requires users to access their AppData folder and modify files based on date, not something you would want Citrix user sessions to have to do.

So I came up with a workaround for my environment using PowerShell.

Step 1. Backup AutoCorrect Entries
For this process I use the following PS script running as part of a logoff script process in a GPO. The backup process will only replace a previous backup where the live file size exceeds that of the previous backup (as users tend to keep adding autocorrect entries, and thus the file size increases):

<#
	Created with:	SAPIEN Technologies, Inc., PowerShell Studio 2015 v4.2.99
	Created on:		01/03/2016 12:32
	Created by:		Maurice.Daly
	.DESCRIPTION
		Word Autotext Backup Script
#>

$AutoTextLocation = $env:APPDATA + "\Microsoft\Templates"
$BackupLocation = [environment]::GetFolderPath("MyDocuments") + "\AutoTextBackup"

$AutoCorrectFiles = Get-ChildItem -Path $AutoTextLocation | Where-Object { $_.Name -like "Normal*.*" }

if ((Test-Path -Path $BackupLocation) -eq $false)
{
	# First run - create the backup folder and take a full copy
	New-Item $BackupLocation -Type dir
	Copy-Item -Path $AutoCorrectFiles.FullName -Destination $BackupLocation
}
else
{
	# Only refresh a backup when the live template file has grown
	foreach ($File in $AutoCorrectFiles)
	{
		if ((Get-Item -Path $File.FullName).Length -gt (Get-ChildItem -Path ($BackupLocation + "\" + $File.Name)).Length)
		{
			Copy-Item -Path $File.FullName -Destination $BackupLocation -Verbose
		}
	}
}

Step 2. Restoring AutoCorrect Entries
The following PowerShell script can be run automatically via a logon PS script in a GPO, or, as in my instance, published via XenApp so that users can restore data when an issue arises:

<#
	Created with:	SAPIEN Technologies, Inc., PowerShell Studio 2015 v4.2.99
	Created on:		01/03/2016 12:32
	Created by:		Maurice.Daly
	.DESCRIPTION
		Word Autotext Restore Script
#>

Add-Type -AssemblyName System.Windows.Forms

$AutoTextLocation = $env:APPDATA + "\Microsoft\Templates"
$BackupLocation = [environment]::GetFolderPath("MyDocuments") + "\AutoTextBackup"

$ConfirmRestore = [System.Windows.Forms.MessageBox]::Show("Citrix instances of Word and Outlook will now close to restore your AutoCorrect files.", "Restore Office AutoCorrect", 4)
if ($ConfirmRestore -eq "Yes")
{
	# Close Word and Outlook for this user so the template files are not locked
	$OfficeApps = "Winword.exe", "Outlook.exe"
	foreach ($App in $OfficeApps)
	{
		Taskkill.exe /FI "Username eq $env:Username" /IM $App
	}
	Start-Sleep -Seconds 5
	Write-Host -ForegroundColor Green "Removing current autocorrect template files"
	Get-ChildItem -Path $AutoTextLocation | Where-Object { $_.Name -like "normal*.*" } | Remove-Item -Force -ErrorAction Continue
	Write-Host -ForegroundColor Green "Restoring autocorrect templates from backup location"
	Get-ChildItem -Path $BackupLocation | Copy-Item -Destination $AutoTextLocation -Force
	if ((Get-ChildItem -Path $BackupLocation).Count -eq (Get-ChildItem -Path $AutoTextLocation | Where-Object { $_.Name -like "normal*.*" }).Count)
	{
		[System.Windows.Forms.MessageBox]::Show("AutoCorrect files successfully restored.", "Restore Office AutoCorrect")
	}
	else
	{
		[System.Windows.Forms.MessageBox]::Show("AutoCorrect file restore unsuccessful. Please contact IT.", "Restore Office AutoCorrect")
	}
}
else { Exit }

The PowerShell scripts assume that you are using re-mapped My Documents as the backup location and that you want some form of interactivity during the restore process, i.e. to advise users that Word and Outlook will be terminated and whether or not the restore process was successful. Obviously you can chop and change this as required, but it does the trick.

Hopefully this helps some of you with this issue.


Azure AD SSO in a non-ADFS environment – Windows 10

In a world without ADFS.. Single Sign On is possible..

Having made the migration from on-site Exchange and SharePoint to Office 365 in early 2015, one of the key issues for users was the lack of single sign-on facilities for applications based in the cloud. You might ask why we didn't use ADFS for our implementation. Well, like many SME systems admins, I had made a conscious decision to run with DirSync as this lowered the overall footprint of our hybrid setup; I also liked the idea that even in the event of a complete loss of internal resources, the Office 365 cloud would still allow employees to authenticate.

In my case the decision was cemented only after speaking with Microsoft Office 365 speakers and product development engineers; their view was that unless you have a specific reason for using ADFS, the DirSync product road map, with the migration to Azure AD Connect, would be more than sufficient for my needs. I convinced my manager and the migration went ahead.

So to fast forward to today: with the improvements in the ability to sync details to and from Azure Active Directory without an ADFS environment, I am going to run through one of the newer authentication features of Windows 10, this being Azure AD SSO on domain-joined devices.

Single Sign On without ADFS.. What Is This Black Magic???

The ability to open cloud-based resources which integrate with Azure Active Directory without having to sign on again has been the domain of ADFS up until this point. With the latest release of Azure AD Connect and Windows 10 1511 onwards, however, we can now achieve a similar experience.

The system works by issuing authentication tokens when registering the physical device of the user. Further in-depth technical info is available on TechNet.

So How Do I Implement It?

The implementation process is very straightforward;

  1. Enable users to join devices to Azure AD

    Log onto the Azure Admin Portal, open your tenant and click on the Configure tab. Scroll down to the Devices section and apply either a selected or all device configuration depending on your security requirements. Example screenshots below;

  2. Download and install the latest version of Azure AD Connect. If you are still using DirSync or Azure AD Sync, you should migrate to Azure AD Connect before the 13th of April 2017, as support will be deprecated at that point. The upgrade process from these legacy tools is very straightforward during the setup wizard.
  3. Ensure your Windows 10 clients are compatible

    Windows 10 build 1511 (November 2015) onwards supports Azure AD SSO device join via group policy. If you are running the RTM build 10240, you will need to upgrade first. To check this, either open a command prompt and read the Windows version on the second line, or open PowerShell and type;

    (Get-WmiObject win32_operatingsystem).version


    For those of you using SCCM, I suggest you create a collection based on the version of Windows 10 for management purposes. The following query will add clients running build 1607 to your collection;

    select SMS_R_SYSTEM.ResourceID,SMS_R_SYSTEM.ResourceType,SMS_R_SYSTEM.Name,SMS_R_SYSTEM.SMSUniqueIdentifier,SMS_R_SYSTEM.ResourceDomainORWorkgroup,SMS_R_SYSTEM.Client from SMS_R_System inner join SMS_G_System_OPERATING_SYSTEM on SMS_G_System_OPERATING_SYSTEM.ResourceID = SMS_R_System.ResourceId where SMS_G_System_OPERATING_SYSTEM.BuildNumber = "14393"

  4. Update Your Central Store – Group Policy Administrative Templates

     In order to enable this feature you will need to ensure that your group policy administrative templates are up to date. The latest Windows 10 templates can be downloaded from the Microsoft Download Center.
  5. Enable Device Sync – Azure AD Connect

    Now we have to enable device synchronization with Azure AD. There are two ways to achieve this, you can either run the Azure AD Connect wizard or edit the connectors yourself. In the below example I am going to go with the later;

    • On your Azure AD Connect server, open the Synchronization Service Manager. Click on the Connectors button, then right-click on the Active Directory Domain Services connector and go to Properties.
    • Configure Directory Partitions – Click on Configure Directory Partitions and then click on the Containers button.
      When prompted, enter an account with sufficient rights to Active Directory.
    • Select Computer OUs to Sync – Browse through your Active Directory and select the OUs which contain the computer accounts you wish to synchronize to Azure AD for the purpose of using Azure SSO.


    • Force Full Synchronization

      Right click on your Active Directory Domain Services connector and click on Run.


      On this screen click on the Full Import button. Once this job has completed do the same for the Windows Azure Active Directory connector.

  6. Configure GPO Settings

    Open Group Policy Management Editor and either create a new GPO or modify an existing one which applies settings to computers within the OUs you have set to sync.

    Open the Computer Configuration\Policies\Administrative Templates\Windows Components\Device Registration folder, then click on the Register Domain Joined Computers As Devices setting and click Enable.


  7. Have Patience… There will obviously be a delay between devices refreshing their GPO policies, Azure AD sync times and devices registering.
  8. Verifying Device Sync Status

    Download the Microsoft Azure Active Directory Module for Windows PowerShell, then open an Azure AD Module PS window and connect to your Azure tenant environment by typing;

    Connect-MsolService

    Once connected type the following to display a full list of registered devices and their current state;

    Get-MsolDevice -All | FT DisplayName, DeviceOS*, DeviceTrust*


    You can also run the following PS command to get a count of machines registered and compare this to your SCCM collection;

    (Get-MsolDevice -All).Count

It's Done..

Once your devices are successfully registered, your users should be automatically signed into their Azure AD cloud services. Now you have another little win for making their lives that bit easier without going to the extent of implementing a full ADFS environment.
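A couple of extra verification commands I find useful; a sketch assuming the DeviceTrustType values returned by the MSOnline module, with dsregcmd being the built-in Windows 10 join status utility:

```powershell
# Count only the domain joined devices that have registered in Azure AD
(Get-MsolDevice -All | Where-Object { $_.DeviceTrustType -eq 'Domain Joined' }).Count

# On the client itself, check the local join state - look for AzureAdJoined : YES
dsregcmd /status
```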
Update – 29/9/2016

With the latest 16.0.7341.2035 build of Office 2016, SSO is now fully functional when opening Outlook. The below screen is now presented to the user rather than the usual Add Account wizard, which prompts you for your password at the end. Now all the user has to do is click Connect, and the settings download and sign-in happen in the background on the initial launch;




Automating Management of Local Administrator Passwords – Microsoft LAPS

Managing Local Administrator Passwords

So you have a complex password policy on your domain: users change their password every 60-90 days, passwords are complex and cannot be re-used, and your users are not local admins. But one thing still poses a security risk: the local administrator password.

The chances are you have deployed your standard Windows image using a password specified within the image/MDT or SCCM task sequence, and in some cases you do not have the IT resources to ensure that this password is changed on a regular basis. Even if you do, are you ensuring that the passwords are being documented and stored securely?

Security Risk – Local Administrator Rights

If you do not have a well maintained local administrator password strategy it opens your network up to security vulnerabilities including elevation of privilege. It isn’t going to go down well when your standard local workstation admin password is shared and users add themselves to the local admin group, potentially in a worst case scenario adding malware, key loggers etc.

The Answer – Microsoft Local Administrator Password Solution (LAPS)

Microsoft LAPS is a free tool released back on May 1st 2015 and allows you to automate the process of updating local administrator passwords on your workstations and servers across your Active Directory domain/forest. LAPS uses a local agent in conjunction with GPO deployed settings to update the local administrator password at set intervals, based on complexity settings that you specify and most importantly it automatically stores backups of this info within Active Directory.

Deploying Microsoft LAPS

First of all you will need to download the installer from the Microsoft Download Center. In the below section we will run through the entire installation and configuration process;

  1. Management Server Installation

    A single installer is used for both the server and client installs, the only real difference being that the management tools need to be installed on the management server. Run the LAPS.x86 or LAPS.x64 installer as per your system architecture, then run through the following;

    1. Launch the installer


    2. At the custom setup screen select the management tools and select run from my computer and then click Next


    3. Click on the Finish button to finalise the install


  2. Active Directory Schema Modification

    In order for computers to write back their local administrator passwords and expiry date/time, a schema update is required. The update adds the following two values:

    ms-Mcs-AdmPwd – Stores the password in clear text
    ms-Mcs-AdmPwdExpirationTime – Stores the time to reset the password

    To add these values, launch a PowerShell session on your management server and perform the following actions;

    • Type – Import-Module AdmPwd.PS to import the required LAPS module


    • Type – Update-AdmPwdADSchema


      Note: If you have an RODC installed in the environment and you need to replicate the value of the ms-Mcs-AdmPwd attribute to the RODC, you will need to change the 10th bit of the searchFlags attribute value for the ms-Mcs-AdmPwd schema object to 0 (subtract 512 from the current value of the searchFlags attribute). For more information on adding attributes to or removing attributes from the RODC Filtered Attribute Set, please refer to the Microsoft documentation.

  3. Active Directory Rights

    Computer Rights

    In order for computer accounts to write values to the ms-Mcs-AdmPwdExpirationTime and ms-Mcs-AdmPwd attributes in Active Directory, the following PowerShell command needs to be run (note: if you closed the previous PowerShell window you will need to run Import-Module again);

    Set-AdmPwdComputerSelfPermission -OrgUnit <name of the OU to delegate permissions>

    User Rights
    By default members of the Domain Admins group will have rights to view the local administrator passwords stored in Active Directory, however what happens if you want your desktop support team to view them? To facilitate this you will need to delegate rights.

    To do so use the following PS command:

    Set-AdmPwdReadPasswordPermission -OrgUnit <name of the OU to delegate permissions> -AllowedPrincipals <users or groups>

    Going another step further you can also delegate rights to allow groups or individuals to force a password change. To do so use the following PS command:

    Set-AdmPwdResetPasswordPermission -OrgUnit <name of the OU to delegate permissions> -AllowedPrincipals <users or groups>

  4. Group Policy Configuration

    First of all you will need to copy the LAPS ADMX and ADML files to your central store. The two files are located in the %WINDIR%\PolicyDefinitions folder on the management server.

    Now follow the below;

    1. Open Group Policy Manager and either create or modify a GPO that you wish to apply the LAPS settings.

    2. Expand Computer Configuration\Policies\Administrative Templates\LAPS


    3. Configure your Password Settings, Name of the Local Admin Account and Enable Password Management, as per the below examples:


  5. Deploying the LAPS client
    Deploying the client is a simple process. Using the same MSI installation files you can deploy the client to your x86 and x64 clients via GPO, SCCM or other third party application deployment systems. Simply use the /quiet switch for client deployments.
  6. Check it's working.. Active Directory Users & Computers

    Opening the Active Directory Users & Computers console and viewing the Attribute Editor of a machine located within the OU that you earlier deployed your LAPS-enabled GPO to should result in values being available, as per the below screenshot;


    LAPS Admin GUI

    On the management server that you earlier installed LAPS on, you will have the LAPS GUI. This will allow you to both look up details from computers and also set a new expiration date for the existing local admin password;
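The same lookups and expiry resets can also be scripted using the AdmPwd.PS module installed with the management tools; a minimal sketch, with the computer name being a placeholder:

```powershell
Import-Module AdmPwd.PS

# Read the current local administrator password and its expiry for a machine
Get-AdmPwdPassword -ComputerName "WORKSTATION01" | Select-Object ComputerName, Password, ExpirationTimestamp

# Flag the password for reset at the next GPO refresh on the client
Reset-AdmPwdPassword -ComputerName "WORKSTATION01"
```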


For more info on LAPS, visit Microsoft TechNet.

WMUG 10th Anniversary Follow Up – Azure MFA 




WMUG Attendees – Thank You

First of all I would like to thank everyone who made the effort to turn up to the WMUG 10th Anniversary event in Microsoft, Paddington last month. Community is without a doubt the best place to learn and share real world experiences.

Unfortunately, near the end of the session that Terence Beggs (@TerenceBeggs) and I were running on Azure MultiFactor Authentication, the demo gods struck and our datacenter ISP suffered a minor outage, resulting in a flaky connection to our live demo infrastructure (lesson learned: bring the VMs on my local machine next time).

So as promised, here is both a follow-up and a recap on the subject of Azure MultiFactor Authentication. The installation documentation on this product is excellent, so I won't re-hash the same details, but I will cover the key areas.

A full run through of creating the MFA provider, installing the back-end server and demonstration of using MFA with an RD Gateway is included in the PowerPoint presentation from the link below;

Azure MultiFactor Auth Presentation

To recap..

What is Azure MultiFactor Authentication (MFA)?

Microsoft Azure MultiFactor Authentication is a product used to provide secondary authentication for systems both on-premise and in the cloud. Originally developed by a company called PhoneFactor, the company was purchased in 2012 and re-branded/integrated into the Azure product stack in 2013.

MFA can be used purely as a cloud service with systems such as Office 365, SaaS apps and Azure App Proxy. For on-premise applications, including IIS, RDS and VPN systems, there is an on-premise server application, and this is what I am going to focus on in this blog post.

For more information on the differences between MFA cloud vs on-premise, see the Microsoft documentation.

Where Do I Get Started?

MFA is available as both a pay-as-you-use and a subscription (OPEN/MPSA etc.) service; the pay-as-you-use model is particularly useful if you want to run the system as a proof of concept.

To sign up you will need a Microsoft Azure enabled account; you can create one and even take advantage of a free trial account with €170 of credit. Pricing details are available on the Azure website.

I’ve Signed Up for an Azure Account.. Now Where Do I Go?

Adding the MultiFactor service to your Azure account could not be easier. Simply log onto the Azure management portal and click on the following;

  1. New App Service
  2. Active Directory
  3. Multi-Factor Authentication Provider
  4. Quick Create

At the last step you will be prompted for the licensing model you wish to use; this cannot be changed after this point, so be careful when selecting. The choices are;

  1. Per Enabled User
  2. Per Authentication

If you change your mind post set up you must create a new MFA provider so although it is best to get this right the first time, it can be easily re-created if needed.

Downloading MFA On-Premise

Downloading the MFA on-premise server application is a straightforward process. Once your Azure MFA provider has been set up, you simply log onto the portal and click on the Downloads link. On the Downloads page you will also have the ability to generate activation credentials; you will need these details to install the service (note that when you generate activation details they are only valid for 10 minutes for security purposes).


Planning Your MFA Deployment (Securely Publishing The Site)

Planning your MFA installation should take into account security factors for publishing resources over the public internet. Some examples of deployment scenarios are included within the PowerPoint slides, however you should always discuss these with your security team prior to implementation.

For the majority of installations, the front-end web portal should be published on a VM within your DMZ, with access to the back-end MFA server provided via access control lists on your corporate firewall on ports 443 and 4898. An exception to this is where you are using a reverse proxy such as Pulse Secure, F5, Citrix NetScaler, Barracuda, etc.

Note that you will also need to provide direct HTTPS (Port 443) access to the following IP addresses for communications with the Azure MFA service;

  • –
  • –
  • –
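As an illustration of the ACL requirement above, the following Windows Firewall rules (run on the back-end MFA server; the DMZ web server address used here is purely a placeholder) would allow the DMZ portal through on the two ports:

```powershell
# Sketch only - replace 192.168.100.10 with the real address of your DMZ web server
netsh advfirewall firewall add rule name="MFA Web Service SDK (HTTPS)" dir=in action=allow protocol=TCP localport=443 remoteip=192.168.100.10
netsh advfirewall firewall add rule name="MFA PhoneFactor (4898)" dir=in action=allow protocol=TCP localport=4898 remoteip=192.168.100.10
```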

To publish the MFA portal you will also need a certificate signed by a trusted CA. Personally I am using a previously purchased wildcard cert, but certificates are cheap these days, so visit your trusted CA site (GoDaddy, for example) and obtain your internet-facing SSL cert.

Installing Your MFA Back-End Application Server

The MFA application server has the following pre-requisites;

  • IIS
  • .Net Framework 4.5
  • KB2919355 on Server 2012 R2
    The updates must be installed in the following order: clearcompressionflag.exe, KB2919355, KB2932046, KB2959977, KB2937592, KB2938439, and KB2934018
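As a rough sketch, the IIS and .NET pre-requisites can be installed and the KB checked with a few lines of PowerShell (feature names below assume Server 2012 R2):

```powershell
# Sketch only - feature names assume Server 2012 R2
Import-Module ServerManager

# Install IIS and .NET Framework 4.5 if not already present
Install-WindowsFeature -Name Web-Server, NET-Framework-45-Core -IncludeManagementTools

# Confirm whether KB2919355 is already installed
if (Get-HotFix -Id KB2919355 -ErrorAction SilentlyContinue) {
    Write-Output "KB2919355 is installed"
}
else {
    Write-Output "KB2919355 missing - install the update chain in the order listed above"
}
```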

Once you have ensured all pre-requisites are installed, installing the app server component itself is very straightforward.

Installing the User Portal, Web SDK and Mobile App Portal

During the post-installation configuration stage you must install the user portal components to allow users to self-enroll and manage their MFA accounts.

The User and Web Service SDK portals are installed directly from within the MFA console; however, if you wish to use the Azure Authenticator app for verifying your user sessions you will need to install the Mobile App Web Service portal manually from the following location;
C:\Program Files\Multi-Factor Authentication Server\MultiFactorAuthenticationMobileAppWebServiceSetup64.msi
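For an unattended install, the MSI can be run silently with standard msiexec switches (the log file path here is just an example):

```powershell
# Silent install of the Mobile App Web Service; log path is an example
msiexec /i "C:\Program Files\Multi-Factor Authentication Server\MultiFactorAuthenticationMobileAppWebServiceSetup64.msi" /qn /l*v "C:\Windows\Temp\MFAMobileAppPortal.log"
```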


Modifying Portal Configuration XML 

Depending on the final design of your deployment you will have to modify a number of Web.Config IIS configuration files so that the portals are externally accessible, and so that a privileged service account (a member of the PhoneFactor Admins security group) is used for authenticating against the MFA back-end server.

  • MultiFactorAuthMobileAppWebService 

    Here you will need to edit the following sections;
    <add key="WEB_SERVICE_SDK_AUTHENTICATION_USERNAME" value="DOMAIN\Service-Account-Name" />
    <add key="WEB_SERVICE_SDK_AUTHENTICATION_PASSWORD" value="Password" />
    <setting name="pfpaws_pfwssdk_PfWsSdk" serializeAs="String">

  • WebServiceSDK 

    Edit the following;
    <identity impersonate="true" password="Password" userName="DOMAIN\Service-Account-Name" />
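If you prefer to script the appSettings changes for the mobile app web service, something along these lines works (the Web.Config path and account details are placeholders for your own values):

```powershell
# Sketch - back up and update the Web.Config appSettings; values are placeholders
$configPath = "C:\inetpub\wwwroot\MultiFactorAuthMobileAppWebService\Web.Config"
Copy-Item $configPath "$configPath.bak"

[xml]$config = Get-Content $configPath
($config.configuration.appSettings.add | Where-Object { $_.key -eq "WEB_SERVICE_SDK_AUTHENTICATION_USERNAME" }).value = "DOMAIN\Service-Account-Name"
($config.configuration.appSettings.add | Where-Object { $_.key -eq "WEB_SERVICE_SDK_AUTHENTICATION_PASSWORD" }).value = "Password"
$config.Save($configPath)
```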

Verifying Mobile App Configuration

Once you have configured your XML files and opened the required firewall ports (443, 4898), you should test communications between your front-end web server and back-end MFA server.

To do so, log onto your front-end web server and open the following URL: https://localhost/MultiFactorAuthMobileAppWebService/


Click on TestPfWsSDKConnection and then click the Invoke button to check communications and security between the two servers.


If successful you should see the following XML string returned;

<?xml version="1.0" encoding="UTF-8"?>
<string xmlns="…">success</string>
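This manual check can also be scripted; a simple web request to the service URL from the front-end server should return HTTP 200 if the site is up (note this only confirms the portal is responding, not the back-end SDK connection):

```powershell
# Quick availability check for the mobile app web service (run on the front end)
$response = Invoke-WebRequest -Uri "https://localhost/MultiFactorAuthMobileAppWebService/" -UseBasicParsing
Write-Output "Status code: $($response.StatusCode)"
```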

Branding Your MFA Portal

If you want to change the look of the MFA portal, this can be achieved by copying the entire default theme located in C:\inetpub\wwwroot\MultiFactorAuth\App_Themes into a new folder in the same directory. Then edit the Web.Config file located in the C:\inetpub\wwwroot\MultiFactorAuth directory, changing the following section;

<pages theme="Default" controlRenderingCompatibilityVersion="3.5" clientIDMode="AutoID"/>

to;

<pages theme="Your_New_Theme" controlRenderingCompatibilityVersion="3.5" clientIDMode="AutoID"/>
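The copy-and-retheme steps above can be sketched in PowerShell (the theme name is a placeholder for whatever you call your copy):

```powershell
# Copy the default theme to a new folder; theme name is a placeholder
$themes = "C:\inetpub\wwwroot\MultiFactorAuth\App_Themes"
Copy-Item "$themes\Default" "$themes\Your_New_Theme" -Recurse

# Point the portal at the new theme in Web.Config
$configPath = "C:\inetpub\wwwroot\MultiFactorAuth\Web.Config"
(Get-Content $configPath) -replace 'theme="Default"', 'theme="Your_New_Theme"' | Set-Content $configPath
```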

A Final Note – Changes On The Way For The Azure Authenticator App 

On August 15th Microsoft is releasing an updated version of the mobile app, re-branded as "Microsoft Authenticator"; the update will apply to existing installations of the Azure Authenticator app.


The new app brings a completely refreshed design along with these key features;

  • Support for wearable devices
    Both the Apple Watch and Samsung Gear will be supported
  • Fingerprint Approval
    Use your fingerprint instead of your passcode
  • Certificate Based Authentication
    Enterprises can utilise their PKI to deploy certificates and remove the need for passcodes

More information is available on Technet at


If you have any questions regarding the setup, implementation or any aspect of Azure MFA please feel free to reach out and I will do my best to help you out.

WMUG 10th Anniversary Event – 13th July



The WMUG team are hosting their 10th anniversary event at Microsoft's Paddington office on the 13th of July 2016. The event is shaping up to be one of the biggest so far, with guest speakers including;

  • Aaron Czechowski – @AaronCzechowski
    Senior Product Manager at Microsoft for both ConfigMgr and MDT
  • Nickolaj Andersen – @NickolajA
    Senior consultant with Lumagate in Sweden, specialising in ConfigMgr
  • Marcus Robinson – @techdiction
    Technical Evangelist with Microsoft UK

From the WMUG team, Peter Egerton will be running a session pitting the audience “Geeks” against the panel of experts “Guests” and I am honoured to be co-hosting a session on Azure Multi Factor Authentication with Terence Beggs.

1E are sponsoring the event and will be providing free lunch and refreshments throughout the day.

The full agenda is as follows;

| Speaker | Session title | Abstract | Time |
| --- | --- | --- | --- |
|  | Registration & Coffee |  | 09:00 |
| WMUG | Welcome | A quick welcome and introduction from the WMUG team | 09:30 |
| 1E | 1E Products | A technical dive into the 1E product suite | 09:45 |
| Nickolaj Andersen | PowerShell and Configuration Manager | An overview of PowerShell coolness with Configuration Manager | 10:45 |
| Marcus Robinson | Azure Automation DSC | Azure Automation DSC for server based configuration management | 11:30 |
| Maurice Daly and Terence Beggs | MFA Goodness | Microsoft Azure Multi-Factor Authentication | 13:00 |
| Robert Marshall | TBC | TBC | 13:45 |
| Peter Egerton | Geeks vs Guests | We put the audience head to head against our panel of experts to see who knows more about being an IT Pro | 14:45 |
| Q&A | Open questions | A chance to ask questions, get answers and openly discuss any thoughts you may have around Windows Management | 15:30 |
| Giveaways | Prize giveaways | We have a System Center Universe Europe ticket to give away | 16:00 |
| Aaron Czechowski | What's new in Configuration Manager | Live and direct from Redmond, Aaron will tell us what we can look forward to in Configuration Manager | 16:10 |
|  | Close & Thanks |  | 16:55 |

Registration is now open on the WMUG site –

We will also be giving away a FREE ticket to System Center Universe Europe in Berlin to one lucky attendee.



Veeam B&R V9 – Timed Out Waiting For Guest Interaction Proxy

Veeam Backup & Replication v9 was released on the 12th of January 2016 and brings a host of new features, including the Guest Interaction Proxy (see Veeam's documentation for more details on this feature).

This new feature allows you to offload the guest interaction process to your proxies / hosts and reduce the load on the Veeam Backup Server. Having gone through the upgrade I started running my backup & replication jobs and noted that some jobs were failing due to the following error:

“Failed to prepare guests for volume snapshot”
This error was preceded in all cases by the line “Failed to inventory guest system: Timed out waiting for guest interaction proxy”.

Screenshot of error

After checking each of my jobs I found that the upgrade had set the Guest Interaction Proxy on every application-aware job to be my Veeam Backup Server. In each case, spreading the load to another proxy server resolved the issue, so I needed to update all of my jobs to set the Guest Interaction Proxy to the automatic setting and thus distribute the load dynamically.

To do so, simply run these few lines of PowerShell from within either the local Veeam Backup Server PowerShell instance or remotely using the Veeam Backup and Replication PowerShell Toolkit (after connecting to your Veeam server instance);

# If running remotely, connect to the Veeam server first, e.g. Connect-VBRServer -Server "VeeamServer"
$Jobs = Get-VBRJob

# Start loop for each job
foreach ($Job in $Jobs)
{
    # Obtain the current VSS options assigned to the job
    $VSSOptions = (Get-VBRJob -Name $Job.Name).GetVssOptions()
    # Modify the GuestProxyAutoDetect value, setting it to True
    $VSSOptions.GuestProxyAutoDetect = $true
    # Apply the updated VSS options
    Set-VBRJobVSSOptions -Job $Job -Options $VSSOptions
}

Screenshot post proxy change

A nice quick fix applied and my application aware jobs are now processing normally without failure.

Office 365 Online Archive Error – Failed to enable the new cloud archive

A couple of days ago I came across a strange issue whilst updating our Active Directory contact details: a couple of user updates were not reflected in the global address book in our Office 365 hybrid environment. I logged onto our domain controller and the details were correct, so I then logged into Azure Active Directory (as we are using Directory Sync) and found that the details pulled back from Azure were also correct. So what was the issue?

When I opened the user properties in the Office 365 console I received the following error message:

"Exchange: Failed to enable the new cloud archive xxxxxxxx-xxxx-xxxxx-xxxx-xxxxxxxx of mailbox xxxxxxxx-xxxx-xxxxx-xxxx-xxxxxxxx because a different archive xxxxxxxx-xxxx-xxxxx-xxxx-xxxxxxxx exists. To enable the new archive, first disable the archive on-premises. After the next Dirsync sync cycle, enable the archive on-premises again.
Exchange: An unknown error has occurred. Refer to correlation ID: c0e37858-6d41-423a-b0bd-7d9bec686457"
The issue was obviously related to a mailbox move during which the online archive link was broken. So I used the Azure Active Directory PowerShell module, connected to my Azure environment, and checked the health properties of my users.
To do this, run the following PS commands.
Get-MSOLUser | Sort-Object DisplayName | ft DisplayName, ValidationStatus
This will return the full list of Office 365 enabled user accounts and their status. Narrowing this down with the following command will display a list of users with validation errors;
Get-MSOLUser | Where-Object {$_.ValidationStatus -eq "Error"} | Sort-Object DisplayName | ft DisplayName, ValidationStatus
Once you have the list of users with errors, it is time to get their online archive information to compare against the details from Exchange. First, let's retrieve the Archive GUID by running the following command in an Office 365 PowerShell session;
Get-Mailbox "user identity with error status" | ft Name, ArchiveGuid
Now switching over to our Exchange PS console I ran the following script to check what archive details matched;
Get-RemoteMailbox -Identity "user identity with error status" | fl DisplayName, ArchiveGuid
Comparing the two ArchiveGuid values presented the cause of the issue. During a routine mailbox move the ArchiveGuid value had not been updated to the current version.
I resolved this by running the following process;

1. Update the remote mailbox properties with the ArchiveGuid value obtained from Azure AD;
   Set-RemoteMailbox -Identity "user identity with error status" -ArchiveGuid "Guid value from Azure AD"
2. Run a full directory sync in DirSync
3. Run the Start-OnlineCoexistenceSync PS command on your DirSync server;
   Import-Module DirSync
   Start-OnlineCoexistenceSync

Now wait a few minutes, go back into your Office 365 portal and the issue should be resolved 🙂