WMUG 10th Anniversary Follow Up – Azure MFA 




WMUG Attendees – Thank You

First of all I would like to thank everyone who made the effort to turn up to the WMUG 10th Anniversary event in Microsoft, Paddington last month. Community is without a doubt the best place to learn and share real world experiences.

Unfortunately, near the end of the session that Terence Beggs (@TerenceBeggs) and I were running on Azure MultiFactor Authentication, the demo gods struck: our datacenter ISP suffered a minor outage, resulting in a flaky connection to our live demo infrastructure (lesson learned: bring the VMs on my local machine next time).

So, as promised, here is both a follow up and a recap on the subject of Azure MultiFactor Authentication. The installation documentation for this product is excellent (https://azure.microsoft.com/en-us/documentation/articles/multi-factor-authentication-get-started-server/#install-and-configure-the-azure-multi-factor-authentication-server), so I won’t re-hash the same details, but I will cover the key areas.

A full run through of creating the MFA provider, installing the back-end server and a demonstration of using MFA with an RD Gateway is included in the PowerPoint presentation linked below;

Azure MultiFactor Auth Presentation

To recap..

What is Azure MultiFactor Authentication (MFA)?

Microsoft Azure MultiFactor Authentication is a product used to provide secondary authentication for systems both on-premise and in the cloud. Originally developed by a company called PhoneFactor, the company was purchased by Microsoft in 2012 and re-branded / integrated into the Azure product stack in 2013.

MFA can be used purely as a cloud service with systems such as Office 365, SaaS apps and Azure App Proxy. For on-premise applications, including IIS, RDS and VPN systems, there is an on-premise server application, and this is what I am going to focus on in this blog post.

For more information on the differences between MFA cloud vs on-premise visit https://azure.microsoft.com/en-us/documentation/articles/multi-factor-authentication-get-started/ 

Where Do I Get Started?

MFA is available as both a pay-as-you-use and a subscription (OPEN/MPSA etc.) service. The pay-as-you-use model is particularly useful if you want to run the system as a proof of concept.

To sign up you will need a Microsoft Azure enabled account. You can create one, and even take advantage of a free trial with €170 of credit, at https://azure.microsoft.com/en-us/free/; pricing details are available from https://azure.microsoft.com/en-us/pricing/details/multi-factor-authentication/.

I’ve Signed Up for an Azure Account.. Now Where Do I Go?

Adding the MultiFactor service to your Azure account could not be easier. Simply log onto the Azure management portal and click on the following;

  1. New App Service
  2. Active Directory
  3. Multi-Factor Authentication Provider
  4. Quick Create

At the last step you will be prompted for the licensing model you wish to use. This cannot be changed after this point, so choose carefully. The choices are;

  1. Per Enabled User
  2. Per Authentication

If you change your mind post setup you must create a new MFA provider, so although it is best to get this right the first time, the provider can easily be re-created if needed.

Downloading MFA On-Premise

Downloading the MFA on-premise server application is a straightforward process. Once your Azure MFA provider has been set up, simply log onto the portal and click on the Downloads link. On the Downloads page you will also have the ability to generate activation credentials; you will need these details to install the service (note that generated activation details are only valid for 10 minutes, for security purposes).


Planning Your MFA Deployment (Securely Publishing The Site)

Planning your MFA installation should take into account security factors for publishing resources over the public internet. Some examples of deployment scenarios are included within the PowerPoint slides, however you should always discuss these with your security team prior to implementation.

For the majority of installations the front-end web portal should be published on a VM within your DMZ, with access to the back-end MFA server provided via access control lists on your corporate firewall on ports 443 and 4898. An exception to this is where you are using a reverse proxy such as Pulse Secure SSL, F5, Citrix NetScaler, Barracuda, etc.

Note that you will also need to provide direct HTTPS (Port 443) access to the following IP addresses for communications with the Azure MFA service;

  • –
  • –
  • –
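The firewall rules themselves will be appliance-specific, but on the back-end MFA server’s own Windows Firewall the equivalent rule could be sketched in PowerShell as follows (the DMZ address below is a placeholder assumption; adjust both it and the rule name to your own standards);

```powershell
# Ports used between the DMZ portal server and the back-end MFA server,
# as described above (443 for HTTPS, 4898 for portal-to-server communication)
$MfaPorts = 443, 4898

# Allow inbound traffic from the DMZ web server only
# (10.0.10.20 is an example address - substitute your DMZ VM's IP)
New-NetFirewallRule -DisplayName "Azure MFA - Portal to Back-End" `
    -Direction Inbound -Protocol TCP -LocalPort $MfaPorts `
    -RemoteAddress "10.0.10.20" -Action Allow
```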

To publish the MFA portal you will also need a trusted CA signed certificate. Personally I am using a previously purchased wildcard cert, but certificates are cheap these days, so visit your trusted CA site (GoDaddy for example) and obtain your internet-facing SSL cert.

Installing Your MFA Back-End Application Server

Before installing the application server component, ensure the following pre-requisites are in place;

  • IIS
  • .Net Framework 4.5
  • KB 2919355 on Server 2012 R2 (https://support.microsoft.com/kb/2919355)
    The updates must be installed in the following order: clearcompressionflag.exe, KB2919355, KB2932046, KB2959977, KB2937592, KB2938439, and KB2934018
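On Server 2012 R2 the IIS and .NET pre-requisites above can be put in place with a couple of lines of PowerShell (a sketch; the exact feature list may vary depending on which portal components you install);

```powershell
# IIS with ASP.NET 4.5 support, as required by the MFA server and portals
$Features = 'Web-Server', 'Web-Asp-Net45', 'NET-Framework-45-Core'
Install-WindowsFeature -Name $Features -IncludeManagementTools

# Note: KB2919355 and its dependencies must still be installed separately,
# in the order listed above
```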

Once you have ensured all pre-requisites are installed, installing the app server component is very straight forward.

Installing the User Portal, Web SDK and Mobile App Portal

During the post-installation configuration stage you must install the user portal components to allow users to self-enroll and manage their MFA account.

The User and Web Service SDK portals are installed directly from within the MFA console; however, if you wish to use the Azure Authenticator app for verifying your user sessions you will need to install the Mobile App Web Service portal manually from the following location;
C:\Program Files\Multi-Factor Authentication Server\MultiFactorAuthenticationMobileAppWebServiceSetup64.msi
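If you are scripting the build, the MSI above can also be installed silently; a minimal sketch (the /qn switch performs an unattended install, and any IIS settings still need to be checked afterwards);

```powershell
# Silent install of the Mobile App Web Service portal MSI
$Msi = 'C:\Program Files\Multi-Factor Authentication Server\MultiFactorAuthenticationMobileAppWebServiceSetup64.msi'
Start-Process -FilePath msiexec.exe -ArgumentList "/i `"$Msi`" /qn" -Wait
```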


Modifying Portal Configuration XML 

Depending on the final design of your deployment, you will have to modify a number of the Web.Config IIS configuration files so that the portals are externally accessible, and so that a privileged service account (a member of the PhoneFactor Admins security group) is used for authenticating with the MFA back-end server.

  • MultiFactorAuthMobileAppWebService 

    Here you will need to edit the following sections;
    <add key="WEB_SERVICE_SDK_AUTHENTICATION_USERNAME" value="DOMAIN\Service-Account-Name" />
    <add key="WEB_SERVICE_SDK_AUTHENTICATION_PASSWORD" value="Password" />
    <setting name="pfpaws_pfwssdk_PfWsSdk" serializeAs="String">

  • WebServiceSDK 

    Edit the following;
    <identity impersonate="true" password="Password" userName="DOMAIN\Service-Account-Name" />

Verifying Mobile App Configuration

Once you have configured your XML files and opened the required firewall ports (443, 4898), you should test communications between your front end web server and back-end MFA server.

To do so log onto your front end web server and open the following URL : https://localhost/MultiFactorAuthMobileAppWebService/


Click on TestPfWsSDKConnection and click on the Invoke Button to check communications / security between the two servers.


If successful you should see the following XML string returned;

<?xml version="1.0" encoding="UTF-8"?>
<string xmlns="http://www.phonefactor.com/PfPaWs">success</string>
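The same check can also be scripted from the front-end server; a quick sketch (this assumes the default virtual directory name, with PfPaWs.asmx being the web service page behind the URL above);

```powershell
# Request the mobile app web service page and confirm it responds
# (assumes the default virtual directory name)
$Uri = 'https://localhost/MultiFactorAuthMobileAppWebService/PfPaWs.asmx'
$Response = Invoke-WebRequest -Uri $Uri -UseBasicParsing
$Response.StatusCode   # 200 indicates the service is reachable
```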

Branding Your MFA Portal

If you want to change the look of the MFA portal, this can be achieved by copying the entire default theme located in C:\inetpub\wwwroot\MultiFactorAuth\App_Themes to a new folder in the same directory. Then edit the Web.Config file located in the C:\inetpub\wwwroot\MultiFactorAuth directory, changing the following section;

<pages theme="Default" controlRenderingCompatibilityVersion="3.5" clientIDMode="AutoID"/>

becomes;

<pages theme="Your_New_Theme" controlRenderingCompatibilityVersion="3.5" clientIDMode="AutoID"/>
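The copy step above can be sketched in PowerShell (the theme name Your_New_Theme is just an example);

```powershell
# Duplicate the default theme as a starting point for your own branding
$Themes = 'C:\inetpub\wwwroot\MultiFactorAuth\App_Themes'
Copy-Item -Path "$Themes\Default" -Destination "$Themes\Your_New_Theme" -Recurse

# Then point the <pages theme="..."> attribute in Web.Config at Your_New_Theme
```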

A Final Note – Changes On The Way For The Azure Authenticator App 

On August 15th Microsoft is releasing an updated version of the mobile app, re-branding it as "Microsoft Authenticator"; the update will apply to existing installations of the Azure Authenticator app.


The new app brings a completely refreshed design along with these key features;

  • Support for wearable devices
    Both the Apple Watch and Samsung Gear will be supported
  • Finger Print Approval
    Use your finger print instead of your passcode
  • Certificate Based Authentication
    Enterprises can utilise their PKI for authentication, removing the need for pass-codes

More information is available on Technet at https://blogs.technet.microsoft.com/enterprisemobility/2016/07/25/microsoft-authenticator-coming-august-15th/


If you have any questions regarding the setup, implementation or any aspect of Azure MFA please feel free to reach out and I will do my best to help you out.

WMUG 10th Anniversary Event – 13th July



The WMUG team are hosting their 10th anniversary event in Microsoft, Paddington on the 13th of July 2016. The event is shaping up to be one of the biggest so far with guest speakers including;

  • Aaron Czechowski – @AaronCzechowski
    Senior Product Manager at Microsoft for both ConfigMgr and MDT
  • Nickolaj Andersen  – @NickolajA
    Senior consultant with Lumagate in Sweden, specialising in ConfigMgr (blog: http://www.scconfigmgr.com/)
  • Marcus Robinson – @techdiction
    Technical Evangelist with Microsoft UK

From the WMUG team, Peter Egerton will be running a session pitting the audience “Geeks” against the panel of experts “Guests” and I am honoured to be co-hosting a session on Azure Multi Factor Authentication with Terence Beggs.

1E are sponsoring the event and will be providing free lunch and refreshments throughout the day.

The full agenda is as follows;

  09:00 – Registration & Coffee
  09:30 – WMUG – Welcome – A quick welcome and introduction from the WMUG team
  09:45 – 1E – 1E Products – A technical dive into the 1E product suite
  10:45 – Nickolaj Andersen – PowerShell and Configuration Manager – An overview of PowerShell coolness with Configuration Manager
  11:30 – Marcus Robinson – Azure Automation DSC – Azure Automation DSC for server based configuration management
  13:00 – Maurice Daly and Terence Beggs – MFA Goodness – Microsoft Azure Multi-Factor Authentication
  13:45 – Robert Marshall – TBC – TBC
  14:45 – Peter Egerton – Geeks vs Guests – We put the audience head to head against our panel of experts to see who knows more about being an IT Pro
  15:30 – Q&A – Open questions – A chance to ask questions, get answers and openly discuss any thoughts you may have around Windows Management
  16:00 – Giveaways – Prize giveaways – We have a System Center Universe Europe ticket to give away
  16:10 – Aaron Czechowski – What’s new in Configuration Manager – Live and direct from Redmond, Aaron will tell us what we can look forward to in Configuration Manager
  16:55 – Close & Thanks

Registration is now open on the WMUG site – http://wmug.co.uk/c/e/10

We will also be giving away a FREE ticket to System Center Universe Europe in Berlin to one lucky attendee.



Veeam B&R V9 – Timed Out Waiting For Guest Interaction Proxy

Veeam Backup & Replication V9 was released on the 12th of January 2016 and brings a host of new features, including the new Guest Interaction Proxy (for more details on this visit https://www.veeam.com/blog/v9-robo-and-tape-backup-enterprise-enhancements.html).

This new feature allows you to offload the guest interaction process to your proxies / hosts, reducing the load on the Veeam backup server. Having completed the upgrade I started running my backup & replication jobs and noticed that some jobs were failing with the following error:

“Failed to prepare guests for volume snapshot”
This error was preceded in all cases by the line “Failed to inventory guest system: Timed out waiting for guest interaction proxy”.

Screenshot of error

After checking each of my jobs I found that the upgrade had set the Guest Interaction Proxy on every application-aware job to be my Veeam backup server. In each case, spreading the load to another proxy server resolved the issue, so I needed to update all of my jobs to set the Guest Interaction Proxy to the automatic setting and thus distribute the work based on current load.

To do so, simply run these few lines of PowerShell from either the local Veeam backup server PS instance or remotely using the Veeam Backup and Replication PowerShell Toolkit (after connecting to your Veeam server instance);

$Jobs = Get-VBRJob

# Loop through each job
foreach ($Job in $Jobs) {
	# Obtain the current VSS options assigned to the job
	$VSSOptions = (Get-VBRJob -Name $Job.Name).VSSOptions
	# Modify the GuestProxyAutoDetect value, setting it to True
	$VSSOptions.GuestProxyAutoDetect = $true
	# Apply the updated VSS options
	Set-VBRJobVSSOptions -Job $Job -Options $VSSOptions
}
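Afterwards you can confirm the change across all jobs with a quick check (a sketch using the same toolkit cmdlets);

```powershell
# Calculated property listing each job's guest proxy auto-detect setting
$Check = @{ Name = 'GuestProxyAutoDetect'; Expression = { $_.VSSOptions.GuestProxyAutoDetect } }
Get-VBRJob | Select-Object Name, $Check
```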

Screenshot post proxy change

A nice quick fix applied and my application aware jobs are now processing normally without failure.

MDT 2013 Update 1 – Static IP Address Issue

After upgrading our MDT 2013 environment with the update 1 release I was eager to build my Windows 10 image, until I came up against this issue while attempting to set a static IP for the VM –


I checked the bootstrap.ini, updated the deployment share and boot images, but the issue persisted. As a workaround I used F8 to bring up the command prompt and ran the following commands to manually set the IP address details:

netsh interface ip set address name="Ethernet" static (IP Address) (Subnet Mask) (Gateway)
netsh interface ip set dnsservers name="Ethernet" source=static address=(IP Address of DNS Server)

Doing this allowed me to get past the issue and build / capture the image; it since turns out that this is a known issue listed on a recent TechNet blog (http://blogs.technet.com/b/msdeployment/archive/2015/08/25/mdt-2013-update-1-release-notes-and-known-issues.aspx).

Fingers crossed that a patch is on the way!

Looking Back @ Ignite 2015


This year I was fortunate enough to have the opportunity to attend the Microsoft Ignite event in Chicago. Having attended TechEd NA in Houston last year I was looking forward to more of the same quality technical content and product details around Azure, Server vNext and Systems Center suite.

I arrived on Saturday and spent much of the first few hours of my trip waiting to get on a Go Express shuttle bus, much to my annoyance. Once I arrived at the excellent Thompson hotel, I dropped off my gear and headed straight to the Microsoft store on the Magnificent Mile (North Michigan Avenue) to collect my badge followed by a quick visit to the Disney store to make sure I didn’t disappoint my two girls when I got home.

I must say it was great to see that Microsoft had catered so well for badge pickup with a wide list of hotels and other locations; it is just a pity they didn’t have a collection point in Terminal 5 of O’Hare airport. On Sunday morning I attended the Microsoft 101 session presented by Joey Snow and Rick Claus (very funny guys), giving me an early opportunity to get familiar with the layout of McCormick Place and to run through the transportation routes open to me, as my hotel was between two routes. During the afternoon I caught up with friends, went for a tour of Chicago, and ultimately finished off the night at the Krewe Meet N Greet party.


From the keynote, Satya Nadella made it very clear in which direction Microsoft is pushing everyone: the era of cloud computing is well and truly here, and it is changing both the role of IT and the people that it empowers. Unless you have been under a rock for the past few years you will know that cloud is the number one driver for Microsoft, especially given the success of the Office 365 platform, and Microsoft is building on this with a constant stream of services arriving in Azure. It was good to see, however, that Microsoft are embracing the hybrid datacentre model, taking a partnership approach which offers vast resources to you via simple cloud integration where you require it.

Windows 10 dominated the majority of the keynote; in fact it was difficult to keep up with the build numbers being demonstrated across a wide range of devices. The advantage of running the same code across all platforms is clear, and given this I can see Microsoft gaining much needed market share in the mobile arena, as the ability to move your app from your desktop, to your Surface, to your phone enables people to work freely on any device.

Following the lengthy (3 hour) keynote we were all directed to the main canteen area across from the expo hall, which was opening at the same time. This is where the cracks in the new Ignite conference started to show.


My focus at Ignite was Azure, Hyper-V, Systems Center and Windows 10, so I had a lot of sessions to choose from. I constantly had to decide which of the 4 or 5 clashing sessions to attend; however, this is nothing different to TechEd, and I must congratulate Microsoft on providing access to the recorded sessions online each evening via the MyIgnite site. Content was hit and miss for some sessions; I was only attending level 300 & 400, and I found at times that level 300 could have been dropped to 100/200, as the content was more about marketing than the technical substance behind the product.

As always, some sessions turned up absolute gems; in fact the best session I attended was presented by fellow Irishman Aidan Finn: “The Hidden Treasures of Windows Server 2012 R2 Hyper-V” (http://channel9.msdn.com/Events/Ignite/2015/BRK3506). Aidan didn’t fail to impress the entire room with his PowerShell demonstrations and take-away technical content.

From a Systems Center Config Manager / MDT / Windows 10 deployment side of things, sessions with Michael Niehaus, Mikael Nystrom, Johan Arwidmark and Kent Agerlund never disappoint, delivering clear technical content for future deployments of Windows 10. Now all I need to do is wait for the related SCCM / MDT updates.

Channel 9 Studio


The food this year was, in a word, terrible. The packaged lunch on day one could have been excused, but it simply didn’t get any better, and McDonald’s became the staple diet for the majority of attendees. This was so disappointing compared to the previous year in Houston; I am a fussy eater at the best of times, but I always found something I could eat at TechEd 2014.

Snacks between sessions were also very limited; you basically had to know which small area would be set up with a canteen trolley, which would be cleared within a minute or two.


The wireless was equal only to the food in terms of sheer disappointment. I know that providing stable wireless connectivity for 23,000 attendees, each with probably 2-3 devices, is no easy task, but ping responses in the thousands and drop-offs are not something I would have expected from a conference of this scale, given the amount of revenue generated. It didn’t improve throughout the week, and I believe speakers also had their own issues, although they had access to a separate network.


It was great to meet up with some of my friends from Twitter and also make new friends along the way. The connections formed between like-minded individuals at these conferences are a priceless part of the experience; as is often the case, the challenges you face in your role often have a solution which can be shared by interacting with the right people. Ignite provides the perfect forum for these connections to be made.


I can’t say enough good things about this city; there were so many things to do that I don’t think I got to half the places I had wanted to visit during my stay. The food was incredible everywhere I tried, from steak to Indian and everything in between, and I am also now a fan of Goose Island beer :). The Signature Bar at the John Hancock tower provided probably the best views from a bar in the city, and thanks to the Disney Store I received a very warm welcome home from my two girls.

Ignite 2016

I hope that with all the feedback Microsoft have taken from their attendees, Ignite 2016 will be a more mature event and the issues with food and wireless access will be overcome. Overall I had a great experience at Ignite 2015, bringing back a changed view on a number of topics which I am already putting into practice in my job. Would I go back? Of course I would!


Finally I must say a big thank you to Trevor Sullivan (@pcgeek86) and Nickolaj Andersen (@NickolajA) for introducing me to some great people while I was there.


PowerShell – Regin Malware Detection

The IT press has been full of stories about Symantec’s discovery of the Regin malware threat. Symantec have released a security response about the threat (http://www.symantec.com/content/en/us/enterprise/media/security_response/whitepapers/regin-analysis.pdf) which includes MD5 file hashes, registry locations and known file locations associated with the malware. I needed a means of detecting its presence on our network, so after seeing a post on Twitter from @jsnover I thought I would expand on the MD5 file check to include the file locations and registry items, and compile the results into a text file stored on a central repository for review.

Note that for best results you should run the PS script under the local system account. You will also need these three files (rename them from .doc to .txt); the files should be placed in the location specified as $ReginSourceFiles.

Note that the script and source files are provided without support and should be used at your own risk. Details of the registry entries, MD5 hashes and file locations are taken from Symantec’s Regin analysis report.

<#
	Created by: Maurice Daly
	Filename:   ReginDetect.ps1
	PowerShell script to scan for known registry entries, file names and MD5
	hashes associated with the Regin malware threat. Results are uploaded to
	a central file share.
#>

Import-Module Storage
$ErrorActionPreference = "SilentlyContinue"
$ReginSourceFiles = "\\YOURSERVER\YOURSHARE\ReginSourceFiles"
$ReginResults = "\\YOURSERVER\YOURSHARE\ReginResults"
$ReginTemp = 'C:\Temp\ReginScan'
If (!(Test-Path -Path $ReginTemp)) {
	New-Item -ItemType Directory -Path $ReginTemp
}
Get-ChildItem -Path $ReginSourceFiles | Copy-Item -Destination $ReginTemp -Include *.txt
$MD5Values = Get-Content $ReginTemp\MD5Signatures.txt
$RegValues = Get-Content $ReginTemp\Registry.txt
$FileValues = Get-Content $ReginTemp\Files.txt

# Checking for registry entries
Write-Host -ForegroundColor 'White' "Regin Scanning Tool"
Write-Host -ForegroundColor 'Cyan' "Checking Registry for Regin entries"
$RegistryDetection = foreach ($RegEntry in $RegValues) {
	Write-Progress -Id 1 -Activity "Scanning Registry Hive" -Status "Checking for $RegEntry"
	If (Test-Path $RegEntry) {
		Write-Output "Regin registry entry found - $RegEntry"
	}
}
Start-Sleep 2

# Checking for known files
Write-Host -ForegroundColor 'White' "Commencing known file name scan."
$KnownFileDetection = foreach ($KnownFile in $FileValues) {
	Write-Progress -Id 1 -Activity "Scanning Known Files" -Status "Scanning $KnownFile"
	If (Test-Path $KnownFile) {
		Write-Output "Known Regin file found at $KnownFile"
	}
}
Start-Sleep 2

# Check the entire drive for matching MD5 hash values
Write-Host -ForegroundColor 'White' "Commencing MD5 file hash scan, this might take several hours."
$FilesToScan = Get-ChildItem -Path C:\ -Recurse -File
$FileDetection = foreach ($File in $FilesToScan) {
	Write-Progress -Id 1 -Activity "Scanning Files & Folders" -Status "Scanning $($File.FullName)"
	$FileTest = Get-FileHash -Path $File.FullName -Algorithm MD5 | Where-Object { $_.Hash -in $MD5Values }
	If ($FileTest) {
		Write-Output "Regin MD5 file hash found - $($File.FullName)"
	}
}

Write-Host -ForegroundColor 'Green' "Scanning Complete"
Write-Host ""
If ($RegistryDetection -or $KnownFileDetection -or $FileDetection) {
	Write-Host -BackgroundColor 'White' -ForegroundColor 'Red' "Regin elements have been found on workstation $env:COMPUTERNAME"
	If ($RegistryDetection) {
		Write-Host ""
		Write-Host -ForegroundColor 'White' -BackgroundColor 'Red' "Registry entries detected at the following locations:"
		$RegistryDetection
	}
	If ($KnownFileDetection) {
		Write-Host ""
		Write-Host -ForegroundColor 'White' -BackgroundColor 'Red' "Known files detected at the following locations:"
		$KnownFileDetection
	}
	If ($FileDetection) {
		Write-Host ""
		Write-Host -ForegroundColor 'White' -BackgroundColor 'Red' "Matching MD5 file hashes detected at the following locations:"
		$FileDetection
	}
	# Compile the results and upload them to the central share for review
	$Result = @($RegistryDetection) + @($KnownFileDetection) + @($FileDetection)
	$Result | Out-File -FilePath ("$ReginResults\" + $env:COMPUTERNAME + ".txt") -Force
	Write-Host ""
	Write-Host "Results uploaded to $ReginResults"
}

PowerShell – Apply Multiple Hotfixes Automatically

If you have ever had the need to apply multiple hotfixes to a server, this handy little script will apply all hotfixes located within a network share of your choice (simply download the MSU files to this location). The progress of each installation is displayed and you are reminded to reboot post patching.


# Network share containing the downloaded .msu files (set this to your own share)
$HotfixDir = '\\YOURSERVER\YOURSHARE\Hotfixes'
$HotfixTemp = 'C:\Temp\Hotfixes'
If (!(Test-Path -Path $HotfixTemp)) {
	New-Item -ItemType Directory -Path $HotfixTemp
}
Set-Location -Path $HotfixTemp
Get-ChildItem -Path $HotfixDir | Copy-Item -Destination $HotfixTemp -Include *.msu
$Hotfixes = Get-ChildItem $HotfixTemp | Where-Object { $_.Name -like '*.msu' } | Select-Object Name | Sort-Object Name

foreach ($Hotfix in $Hotfixes.Name) {
	$NextHotfix = ("$HotfixTemp" + "\$Hotfix" + " /quiet" + " /norestart")
	Start-Process C:\Windows\System32\wusa.exe -ArgumentList $NextHotfix
	# Wait for the current wusa.exe instance to finish before moving on
	while ((Get-Process wusa -ErrorAction 'SilentlyContinue') -ne $null) {
		Write-Progress -Id 1 -Activity "Processing $Hotfix" -Status "Installing..."
		Start-Sleep 1
	}
}
Write-Host -ForegroundColor 'Green' "Patching completed. Please reboot for updates to take effect"

PowerShell – Restart Client Citrix ICA session & Print Spooler – Self Service

Our IT Helpdesk often receives calls from users who, for a multitude of reasons, have issues with their Citrix connection; more often than not because they haven’t turned on their printer before connecting and can’t see their local printer, etc.

To resolve this we deployed the following self-service script, which undertakes the following processes to ensure that the user’s Citrix session is re-started in a clean state. In the script you will see I have listed a “YourApplication.lnk”, as we have a common application that all Citrix users use, so the script launches this by default.

The process is as follows:

1. Notify the user that their Citrix session will be closed.
2. Log out of Citrix session gracefully.
3. Terminate the Citrix ICA client.
4. Restart the Print Spooler service (to resolve any printing related issues).
5. Start the Citrix ICA client.
6. Check if the Citrix client is running and open the default business application.

If the process is unsuccessful or the user does not have the Citrix client installed an email is generated to the IT Helpdesk using their local Outlook client.

The script has been tested with both the legacy Citrix Online Plugin and Citrix Receiver 3.4 Enterprise.


$Prompt = New-Object -ComObject wscript.shell

# Variables for Citrix
$CitrixPNAPath = Get-Process -Name PNAMain -ErrorAction 'SilentlyContinue' | ForEach-Object { Write-Output $($_.Path) }
$CitrixAgentPath = ($CitrixPNAPath | Split-Path) + "\pnagent.exe"
$CitrixInstalled = Test-Path -Path $CitrixPNAPath
$CitrixShortCut = "$ENV:APPDATA" + "\Microsoft\Windows\Start Menu\Programs\Citrix Applications\Claims Applications\YourApplication.lnk"

# Message body for auto failure message
$MailBody = @"
I am having issues with my Citrix connection at present. I tried running the self service script but it was unable to resolve the problem. Can you please take a look.
- This is an automatically generated email on behalf of the user.
"@

# Send email function
Function Notify-Helpdesk {
	Start-Process Outlook
	$Outlook = New-Object -ComObject Outlook.Application
	$Mail = $Outlook.CreateItem(0)
	$Mail.Importance = 2
	$Mail.Subject = "Citrix Problems"
	$Mail.Body = $MailBody
	$Mail.To = 'it.helpdesk@yourdomain.domain'
	# Display the draft so the user can review and send it
	$Mail.Display()
}

# Detect & repair Citrix
If ($CitrixInstalled -eq $true) {
	$PromptWindow = $Prompt.popup('You will now be logged out of Citrix', 0, 'IT Self-Service')
	Start-Process -FilePath $CitrixAgentPath -ArgumentList " /LogOff"
	Start-Sleep 5
	Get-Process | Where-Object { $_.ProcessName -like "*WFICA*" } | Stop-Process -Force
	Start-Sleep 10
	# Restart the Print Spooler service to resolve any printing related issues
	Restart-Service -Name Spooler -Force
	Start-Process -FilePath $CitrixPNAPath
	Start-Sleep 2
	If ((Get-Process -Name PNAMain).Responding -eq $true) {
		$PromptWindow = $Prompt.popup('Citrix ICA client has been restarted successfully. YourApplication will now launch automatically.', 0, 'IT Self-Service')
		Invoke-Item $CitrixShortCut
	}
	Else {
		$PromptWindow = $Prompt.popup('Citrix ICA client failed to restart successfully. Outlook will now open and attempt to notify IT of your issue via email.', 0, 'IT Self-Service')
		Notify-Helpdesk
	}
}
Else {
	$PromptWindow = $Prompt.popup('Citrix ICA client not found. Outlook will now open and attempt to notify IT of your issue via email.', 0, 'IT Self-Service')
	Notify-Helpdesk
}

PowerShell Script to Email Notification of Password Expiration

Things that cross my mind...

We all have that set of users that either mainly use a mobile device for email access or possibly a client running a non Microsoft Windows OS as their main workstation.

Those users don’t get that friendly reminder to change their password that comes with logging onto a Windows OS near to their domain password expiration date, and this usually ends up with passwords expiring and phone calls to the IT Helpdesk to change them.

Wouldn’t it be much simpler if that group of users were emailed near to the time of password expiration, allowing them to log on to OWA and change their password in their own time, negating the need for calls to the IT Helpdesk? In an attempt to reduce some of those calls to our own IT Helpdesk I wrote a PowerShell script to email members of a security group every day when their domain password…
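The approach described above can be sketched as follows (a minimal sketch only, assuming the RSAT ActiveDirectory module; the group name, sender address and SMTP server below are placeholder assumptions);

```powershell
# Sketch: mail members of a security group approaching password expiry
Import-Module ActiveDirectory

$Threshold = 14   # days before expiry at which to start emailing
$Users = Get-ADGroupMember -Identity 'Mobile-Users' |
	Get-ADUser -Properties mail, 'msDS-UserPasswordExpiryTimeComputed'

foreach ($User in $Users) {
	# Convert the computed expiry time from a file time to a datetime
	$Expiry = [datetime]::FromFileTime($User.'msDS-UserPasswordExpiryTimeComputed')
	$DaysLeft = ($Expiry - (Get-Date)).Days
	if ($DaysLeft -le $Threshold -and $DaysLeft -ge 0) {
		Send-MailMessage -To $User.mail -From 'it.helpdesk@yourdomain.domain' `
			-Subject "Your password expires in $DaysLeft day(s)" `
			-Body "Please log on to OWA and change your domain password before it expires on $Expiry." `
			-SmtpServer 'smtp.yourdomain.domain'
	}
}
```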
