Client Certificate Renewal for the Azure AD Passthrough Authentication Agent

We recently came across an issue where the client certificate for the Azure AD Passthrough Authentication Agent wasn’t being renewed automatically. My understanding is that it should be renewed around 30 days before expiry; however, this hadn’t happened and we were fast approaching the expiry date.

A simple way to renew the certificate manually is to jump onto the server where the passthrough agent is installed and run the following PowerShell commands to re-register the agent with Azure AD. The agent status in Azure AD will momentarily change to Inactive, but will then become Active again. You’ll need to run the .ps1 script from the directory where the Passthrough Agent is installed; typically this is under C:\Program Files\.

You could probably achieve the same result by uninstalling and reinstalling the agent, but I haven’t tried that method.

PS C:\Program Files\Microsoft Azure AD Connect Authentication Agent>

# Build a credential object for a Global Administrator account
$User = '<Insert Global Administrator Username Here>'
$PlainPassword = '<Insert Global Administrator Password Here>'
$SecurePassword = $PlainPassword | ConvertTo-SecureString -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $SecurePassword

# Re-register the agent with Azure AD
.\RegisterConnector.ps1 -modulePath "C:\Program Files\Microsoft Azure AD Connect Authentication Agent\Modules\" -moduleName "PassthroughAuthPSModule" -Authenticationmode Credentials -Usercredentials $cred -Feature PassthroughAuthentication

Azure Conditional Access Named Locations IPv6 Support

Named Locations in Azure Conditional Access only support locations based on IPv4 address ranges. For sign-ins coming from IPv6 addresses where you are looking to enforce a geo-policy, you’ll need to manually add the IPv6 address ranges for the countries you wish to include or exclude as appropriate.

Whilst looking into this, a colleague of mine discovered the following resource that details the IPv6 address ranges for multiple countries – I’m not sure how often it is updated, but it’s a good starting point to reduce sign-in issues for those users on IPv6 addresses:

Regional Internet Registries Statistics – RIR Delegations – New Zealand (NZ) – IPv6 address delegations

It’s quite easy to copy the table from there into a CSV file and then import the list into Azure to create a new location based on IPv6 addresses.
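As a rough illustration of that copy-and-convert step, the sketch below parses records in the standard RIR extended delegation format (registry|cc|type|start|value|date|status, where for ipv6 records the value field is the prefix length) and writes the country-specific IPv6 CIDR ranges to CSV. The sample records and the ipv6_cidrs helper are illustrative assumptions, not from the original post – check the actual file you download against this format before relying on it.

```python
# Sketch: turn RIR extended delegation records into IPv6 CIDR strings
# suitable for a named-location CSV. Assumes the common RIR format:
# registry|cc|type|start|value|date|status
import csv
import io

def ipv6_cidrs(delegation_lines, country="NZ"):
    """Yield CIDR strings (e.g. '2001:df0::/32') for one country code."""
    for line in delegation_lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip comments and blank lines
        fields = line.split("|")
        if len(fields) < 5:
            continue  # skip version/summary records
        registry, cc, rectype, start, value = fields[:5]
        if rectype == "ipv6" and cc == country:
            yield f"{start}/{value}"

# Two made-up sample records (illustrative values only):
sample = [
    "# sample data",
    "apnic|NZ|ipv6|2001:df0::|32|20080331|allocated",
    "apnic|AU|ipv6|2001:dc0::|32|20040602|allocated",
]
ranges = list(ipv6_cidrs(sample, "NZ"))

# Write the ranges out as a one-column CSV ready for import.
buf = io.StringIO()
writer = csv.writer(buf)
for cidr in ranges:
    writer.writerow([cidr])
```

The same idea works whether you paste the table into a file first or download the raw delegation file directly; only the parsing of each row changes.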

More than one MediaPool reported for same Media

I’ve been battling with System Center 2012 R2 Data Protection Manager for the last couple of days while trying to add an HP StoreEver MSL 4048 Tape Library connected via direct-attached Fibre Channel.

I could perform a rescan and discover the Tape Library and the Tape Drive; however, whenever I tried to perform an Inventory, the DPM console would refresh and the Tape Library and Tape Drive would disappear. Under Monitoring, an error was displayed relating to zoning and making sure that the medium changer was not presented to the DPM server.

Eventually, I found an error in the Event Logs reporting “More than one MediaPool reported for same Media xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx”. As a result of this I found this blog post and, sure enough, found duplicate entries when running the following SQL query against the DPM database:

prc_Global_MM_ArchiveMedia_GetMediaPool 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'

After backing up the database and running the

SetSharedDpmDatabase -RemoveDatabaseSharing

command, I reran the SQL query and the duplicates disappeared.

I could then rescan and perform an Inventory against the Tape Library.

Local Administrator Password Solution

I’ve worked for a number of organisations, and with customers, who carry out penetration testing on their computer networks. One of the common issues raised around penetration testing of Windows clients is that the local administrator password is usually the same on all clients, which increases the risk of a Pass-the-Hash compromise.

Microsoft have now addressed this issue by releasing the Local Administrator Password Solution (LAPS) which in my opinion is long overdue.

Basically, LAPS still uses a common local administrator account name, but generates a random password for each client and stores that password in a confidential attribute (ms-Mcs-AdmPwd) on the machine account in Active Directory.

The password can then be read from Active Directory by those users who are authorised to do so.

You can read more about this tool and download it here.

ConfigMgr 2012 R2 Operating System Deployment – Failed to get client identity 80004005 – Surface Pro 3

I was on site with a customer today building some Surface Pro 3 devices using Operating System Deployment in System Center Configuration Manager 2012 R2.

The devices were booting from USB Boot Media, however when searching for task sequences the following error message appeared: “An error occurred while retrieving policy for this computer (0x80004005). For more information, please contact your system administrator or helpdesk operator.”

Checking the SMSTS.log file, I discovered that the time and date stamps in the log seemed to be a few days old, and I had come across a previous issue where, if the time and date on the device were wrong, the task sequence would not run. I therefore decided to have a look in the UEFI firmware and check the time and date – the only problem is that on the Surface Pro 3 you cannot set the date and time in the UEFI firmware.

To get round this issue, I opened the command prompt by pressing F8 in Windows PE (ensure your boot image has command support enabled) and used the date and time commands to reset the date and time to the current values. After a reboot, the task sequence ran successfully.

Increasing the size of the SCCM Client Cache during Operating System Deployment

Sometimes, when you have large applications to deploy during an Operating System Deployment task sequence, the deployment will fail because the SCCM Client Cache is not big enough to hold the application installation files. By default, the SCCM Client Cache is set to 5120 MB unless you specify a different value when installing the client using the SMSCACHESIZE property.

You can change the size of the SCCM client cache during a deployment by using a script. I usually implement this as a “Run Command Line” task sequence step which I run immediately before the application installation task sequence steps.

Here is the script, together with details of how to configure it:

' Desired cache size in megabytes
strCacheSize = 10240
' Connect to the Configuration Manager client's UI resource manager
Set oUIResource = CreateObject("UIResource.UIResourceMgr")
Set cacheinfo = oUIResource.GetCacheInfo
' Apply the new cache size
cacheinfo.TotalSize = strCacheSize

1. Save the above script as smscachesize.vbs and amend the strCacheSize value as appropriate.
2. Place the script in a directory that can be used as a package source folder.
3. Create a package without a program and use the directory created above as the source folder.
4. Distribute the package to the appropriate distribution points.
5. In the task sequence, after the “Setup Windows and Configuration Manager” step, but before the steps that will install the applications, add a “Run Command Line” step.
6. Configure the “Run Command Line” step to execute the script using the following command line: “cscript.exe smscachesize.vbs”, and reference the package you created above.
7. Perform a test deployment before making the changes to your live deployment task sequence.

If you wish, you can of course add another step after the applications have been installed to set the cache back to the default value of 5120MB.

I’ve only ever used this script with System Center 2012 Configuration Manager, but it should also work with Configuration Manager 2007.