Category Archives: Automation

Devices that lack a BitLocker recovery key in AzureAD

With Intune's new BitLocker Encryption Report, administrators have an effective way of seeing which of their devices have been encrypted.

But if we want to know whether we can actually recover the BitLocker key of a device, we need to know if it was ever uploaded to AzureAD.

Network or local device issues can sometimes prevent the recovery key from reaching AzureAD, resulting in lost data if the device’s disk needs to be recovered for any reason. To hunt down devices that have not escrowed their recovery key to AzureAD, you can use my report function (in PowerShell as always):

GitLab source download link
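The GitLab function above is the authoritative version; purely as an illustration of the idea, a minimal sketch using the Microsoft Graph PowerShell SDK (the cmdlets and permission scopes below are my assumptions, not taken from the original script) could look like this:

#minimal sketch, NOT the original report function: cross-reference AzureAD devices
#against escrowed BitLocker recovery keys via Microsoft Graph
#assumes the Microsoft.Graph module and an account with sufficient rights
Connect-MgGraph -Scopes "BitLockerKey.ReadBasic.All","Device.Read.All"

#device IDs that have at least one recovery key escrowed to AzureAD
$escrowedIds = (Get-MgInformationProtectionBitlockerRecoveryKey -All).DeviceId | Sort-Object -Unique

#report every Windows device without an escrowed key
Get-MgDevice -All | Where-Object {
    $_.OperatingSystem -eq "Windows" -and $escrowedIds -notcontains $_.DeviceId
} | Select-Object DisplayName, DeviceId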

Programmatically triggering a group license refresh for AzureAD

Azure AD allows us to assign licenses to groups, a nifty feature that has made a host of automation scripts dealing with bulk license assignment obsolete.

A problem I've encountered is that license assignments are not always processed right away when you add users to a group, especially if you didn't have enough licenses at the time you added the user to the group (and only added licenses to the tenant later).

Azure AD has a button to trigger an update manually:

But of course, this can also be automated with PowerShell!

function Invoke-AzHAPIReprocessGroupLicenses{
    <#
        .SYNOPSIS
        reprocesses group license assignment

        .NOTES
        Author: Jos Lieben

        .PARAMETER AzureRMToken
        Use Get-azureRMToken to get a token for this parameter

        .PARAMETER groupGUID
        GUID of the group to reprocess licenses of
    
        Requires:
        - Global Administrator Credentials (non-CSP!)
        - AzureRM Module
        - supply result of get-azureRMToken function
    #>
    param(
        [Parameter(Mandatory = $true)]$AzureRMToken,
        [Parameter(Mandatory = $true)]$groupGUID
    )
    $header = @{
        'Authorization' = 'Bearer ' + $AzureRMToken
        'X-Requested-With'= 'XMLHttpRequest'
        'x-ms-client-request-id'= [guid]::NewGuid()
        'x-ms-correlation-id' = [guid]::NewGuid()
    }   

    $url = "https://main.iam.ad.ext.azure.com/api/AccountSkus/Group/$groupGUID/Reprocess"
    Invoke-RestMethod -Uri $url -Headers $header -Method POST -Body $Null -UseBasicParsing -ErrorAction Stop -ContentType "application/json"
}

Source on GIT: https://gitlab.com/Lieben/assortedFunctions/blob/master/invoke-AzHAPIReprocessGroupLicenses.ps1

Disclaimer: the ‘hidden azure api’ is not officially supported.

Requires output from the Get-AzureRMToken function
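A hypothetical usage example (Get-azureRMToken is the author's own function, available on GitLab; how you call it and the group GUID below are placeholders):

#hypothetical usage: Get-azureRMToken is assumed to return a bearer token string
$token = Get-azureRMToken   #supply whatever credentials/parameters that function requires
Invoke-AzHAPIReprocessGroupLicenses -AzureRMToken $token -groupGUID "00000000-0000-0000-0000-000000000000"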

Removing special characters from UTF8 input for use in email addresses or login names

When working with non-US customers, users often have characters in their names like ë, ó, ç and so on. Most of the time, a ‘human process’ converts these to their simple equivalent of e, o and c for use in computerized systems.

When searching for such a mapping of special characters to ‘safe’ characters, I had a hard time finding a good list or PowerShell method to automatically convert them to standard A-Z characters, so I wrote one:

function get-sanitizedUTF8Input{
    Param(
        [String]$inputString
    )
    #map common diacritics and ligatures to their plain a-z equivalents
    $replaceTable = @{"ß"="ss";"à"="a";"á"="a";"â"="a";"ã"="a";"ä"="a";"å"="a";"æ"="ae";"ç"="c";"è"="e";"é"="e";"ê"="e";"ë"="e";"ì"="i";"í"="i";"î"="i";"ï"="i";"ð"="d";"ñ"="n";"ò"="o";"ó"="o";"ô"="o";"õ"="o";"ö"="o";"ø"="o";"ù"="u";"ú"="u";"û"="u";"ü"="u";"ý"="y";"þ"="p";"ÿ"="y"}

    foreach($key in $replaceTable.Keys){
        $inputString = $inputString -Replace($key,$replaceTable.$key)
    }
    #strip anything that is still not a plain letter or digit
    $inputString = $inputString -replace '[^a-zA-Z0-9]', ''
    return $inputString
}

#example usage:
get-sanitizedUTF8Input -inputString "Jösè"
#result:
Jose

Edit: my colleague Gerbrand alerted me to a post by Grégory Schiro that solves this issue much more elegantly using native .NET functions. Here is my slightly modified version, which makes sure nothing outside a-zA-Z0-9 gets past the function:

function Remove-DiacriticsAndSpaces
{
    Param(
        [String]$inputString
    )
    #decompose each character into its base character plus combining diacritical marks
    $objD = $inputString.Normalize([Text.NormalizationForm]::FormD)
    $sb = New-Object Text.StringBuilder

    for ($i = 0; $i -lt $objD.Length; $i++) {
        $c = [Globalization.CharUnicodeInfo]::GetUnicodeCategory($objD[$i])
        if($c -ne [Globalization.UnicodeCategory]::NonSpacingMark) {
            [void]$sb.Append($objD[$i])
        }
    }

    #recompose, then strip anything that is still not a plain letter or digit
    $normalized = $sb.ToString().Normalize([Text.NormalizationForm]::FormC)
    return($normalized -replace '[^a-zA-Z0-9]', '')
}
#example usage:
Remove-DiacriticsAndSpaces -inputString "Jösè"
#result:
Jose

And an even easier one-liner by John Seerden, which I converted to a function:

function Remove-DiacriticsAndSpaces
{
    Param(
        [String]$inputString
    )
    #replace diacritics
    $sb = [Text.Encoding]::ASCII.GetString([Text.Encoding]::GetEncoding("Cyrillic").GetBytes($inputString))

    #remove spaces and anything the above function may have missed
    return($sb -replace '[^a-zA-Z0-9]', '')
}

And the most advanced function I've found so far is by Daniele Catanesi (PsCustomObject): https://github.com/PsCustomObject/New-StringConversion/blob/master/New-StringConversion.ps1, which supports and parameterizes all the features of the functions above.

Reporting on global tenant storage usage and per site storage usage

As my employer is a Microsoft Cloud Service Provider, we want to monitor the total storage available and the total storage used by all of the tenants we manage under CSP, including storage used by SharePoint and Teams. This called for a script!

per customer total storage usage overview

I slimmed the resulting script down to a single-tenant version that you can use to generate an XLSX report of which of your sites / teams are nearing their assigned storage quota. You can either build your own alerting around this to raise site quotas before your users upload too much data, or use it to buy additional storage from Microsoft before your tenant reaches its maximum quota 🙂

per site storage overview in excel

As usual, find it on Gitlab!
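The GitLab script produces the full XLSX report; as a rough sketch of the core idea only (the module, cmdlets and CSV output below are my assumptions, not the script itself), per-site storage can be pulled from the SharePoint Online management shell like this:

#minimal sketch, NOT the full GitLab script: per-site storage usage via the
#Microsoft.Online.SharePoint.PowerShell module (requires SharePoint admin rights)
Connect-SPOService -Url "https://yourtenant-admin.sharepoint.com"

Get-SPOSite -Limit All -Detailed | Select-Object Url,
    @{n="UsedGB";e={[math]::Round($_.StorageUsageCurrent/1024,2)}},
    @{n="QuotaGB";e={[math]::Round($_.StorageQuota/1024,2)}},
    @{n="PercentUsed";e={if($_.StorageQuota -gt 0){[math]::Round(100*$_.StorageUsageCurrent/$_.StorageQuota,1)}else{0}}} |
    Sort-Object PercentUsed -Descending |
    Export-Csv -Path ".\siteStorageReport.csv" -NoTypeInformation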

Finding files in SharePoint Online or Teams that exceed the 218 character path limit

A well-known issue when migrating to Office 365 (SharePoint, Teams and OneDrive) is path length.

Recently, Microsoft increased the maximum path length in SharePoint Online from 256 to 400 characters (total length of the URL). This causes issues when you use Office, because Office 2013, 2016 and 2019 do not support paths over 218 characters in length.*

To help you proactively identify files that exceed this limit I wrote a PowerShell script you can run:

  • it can filter based on file type
  • it automatically finds and processes all SharePoint sites in your tenant
  • it automatically finds and processes all Teams sites in your tenant
  • it can handle multi-factor authentication

Find get-filesWithLongPathsInOffice365.ps1 on GitLab
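The GitLab script handles authentication, site discovery and reporting for the whole tenant; purely to illustrate the path-length check itself, a minimal sketch for a single document library (assuming the PnP.PowerShell module; the URLs and library name are placeholders) could look like this:

#minimal sketch, NOT the full get-filesWithLongPathsInOffice365.ps1 script:
#list files in one document library whose full URL exceeds 218 characters
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/yoursite" -Interactive

$baseUrl = "https://yourtenant.sharepoint.com"
Get-PnPListItem -List "Documents" -PageSize 2000 -Fields "FileRef" |
    Where-Object { ($baseUrl.Length + $_["FileRef"].Length) -gt 218 } |
    Select-Object @{n="Path";e={$baseUrl + $_["FileRef"]}},
                  @{n="Length";e={$baseUrl.Length + $_["FileRef"].Length}}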

It leans heavily on the great work done by the community around OfficePnP; all credits to the community for providing so much quality code for free!

*Longer paths may still work; this is not a hard limit.

SAP SuccessFactors to Active Directory Sync (updated users)

For a customer that is using SuccessFactors to manage their employees / contractors, I wrote a script that will update the AD accounts of any person that is updated in SuccessFactors.

I expect you'll have working knowledge of how to configure SF PerformanceManager to export the users you wish to update to a CSV file on the SFTP server SF provides for you.

With that, you should be able to configure the script. It’ll basically map any field you export to any field in Active Directory you wish. In some cases, such as the Manager field, special logic has been added to the script to look up the user’s manager. For other special fields you may have to write your own logic.

If you wish, the script will provide you with a full report in your email, for example:

Get it @ Gitlab directly: https://gitlab.com/Lieben/assortedFunctions/blob/master/update-AdUsersFromSAPSuccessFactorsReport.ps1
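To illustrate the general pattern (this is not the actual GitLab script), a stripped-down sketch of mapping CSV columns to AD attributes looks roughly like this; the column names, the employeeID match and the file path are assumptions you would adapt to your own export:

#illustrative sketch only, NOT the GitLab script: map SuccessFactors CSV columns
#to AD attributes and update matching accounts (matched on employeeID)
Import-Module ActiveDirectory

$fieldMap = @{            #CSV column  ->  AD attribute
    "FirstName"  = "givenName"
    "LastName"   = "sn"
    "JobTitle"   = "title"
    "Department" = "department"
}

foreach ($row in Import-Csv ".\sfExport.csv") {
    $adUser = Get-ADUser -Filter "employeeID -eq '$($row.EmployeeID)'" -ErrorAction SilentlyContinue
    if (-not $adUser) { continue }

    $updates = @{}
    foreach ($column in $fieldMap.Keys) {
        if ($row.$column) { $updates[$fieldMap[$column]] = $row.$column }
    }
    if ($updates.Count -gt 0) {
        Set-ADUser -Identity $adUser -Replace $updates
    }
}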

SAP SuccessFactors to Active Directory Sync (disabled users)

For a customer that is using SuccessFactors to manage their employees / contractors, I wrote a script that will disable the AD accounts of any person that is disabled in SuccessFactors.

I expect you'll have working knowledge of how to configure SF PerformanceManager to export the users you wish to disable to a CSV file on the SFTP server SF provides for you.

With that, you should be able to configure the script. If you wish, the script will provide you with a full report in your email, for example:

Get it @ Gitlab directly: https://gitlab.com/Lieben/assortedFunctions/blob/master/disable-AdUsersFromSAPSuccessFactorsReport.ps1
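The core of the disable logic can be sketched the same way (again an illustration, not the GitLab script; the column name and file path are assumptions):

#illustrative sketch only: disable the AD account of every user in the export
Import-Module ActiveDirectory

foreach ($row in Import-Csv ".\sfDisabledExport.csv") {
    $adUser = Get-ADUser -Filter "employeeID -eq '$($row.EmployeeID)'"
    if ($adUser -and $adUser.Enabled) {
        Disable-ADAccount -Identity $adUser
        Write-Output "Disabled $($adUser.SamAccountName)"
    }
}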

Full AzureAD Applications Permission overview

So you’d like to know which applications are living in your AzureAD?

And you’d like to know which of those were added by your admins, and what permissions those applications have?

And you’d also like to know which applications your users are consenting to, and what rights those applications have on your users?

Look no further, I wrote a script to export all of that to Excel for you!

Application overview

Apps an admin has consented to and the type of rights it needs

Apps a user has consented to and the type of rights it needs

Apps to user mapping, for an easy overview of which user has consented to which app

Get it at:

Credits to Doug Finke for the Excel module I’m using!
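The full script also covers application (app-only) permissions, the per-user consent mapping and the Excel export; just to show the kind of data involved, a minimal sketch listing delegated permission grants (assuming the AzureAD module and an existing Connect-AzureAD session) could look like this:

#minimal sketch, NOT the full report: delegated permission grants per application,
#split by admin consent (AllPrincipals) and individual user consent
$servicePrincipals = Get-AzureADServicePrincipal -All $true
$spById = @{}
foreach ($sp in $servicePrincipals) { $spById[$sp.ObjectId] = $sp.DisplayName }

Get-AzureADOAuth2PermissionGrant -All $true | ForEach-Object {
    $grantedTo = "All users"
    if ($_.PrincipalId) { $grantedTo = (Get-AzureADUser -ObjectId $_.PrincipalId).UserPrincipalName }
    [PSCustomObject]@{
        Application = $spById[$_.ClientId]
        ConsentType = $_.ConsentType      #AllPrincipals = admin consent, Principal = a single user
        GrantedTo   = $grantedTo
        Scopes      = $_.Scope
    }
} | Sort-Object Application | Format-Table -AutoSize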

 

Getting RemoteApps through a VM custom extension on Azure session brokers

So I wanted to retrieve the RemoteApps present on VMs in a uniform way, without logging in to either the VMs or the database.

Using a custom extension, I tried to execute the Get-RDRemoteApp command and got the following:

Get-RDRemoteApp : A Remote Desktop Services deployment does not exist on server. This operation can be performed after creating a deployment. For information about creating a deployment

Apparently, all the PowerShell commands for RDS require that you DON'T run them under SYSTEM, and of course VM extensions run under SYSTEM. So, to get all RemoteApps in an RDS deployment, execute the following PowerShell script as a VM extension on a connection broker VM:

 

#enumerate every RDS farm registered on this connection broker and list its published RemoteApps
$farms = get-childitem "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Terminal Server\CentralPublishedResources\PublishedFarms"
foreach($farm in $farms){
    (get-childItem "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Terminal Server\CentralPublishedResources\PublishedFarms\$($farm.PSChildName)\Applications").PSChildName
}

To register this PowerShell script as a VM extension and retrieve the results (see the consolidated sketch after this list):

  1. Save the above PS code to a file
  2. Upload the file somewhere (e.g. public blob storage)
  3. Get the URL of the file
  4. Use Login-AzureRMAccount
  5. Execute Set-AzureRmVMCustomScriptExtension -FileUri <url to script> -Run <filename of script> -VMName <vm name> -Name "RetrieveRemoteApps" -ResourceGroupName <resource group name> -Location "westeurope" -ForceRerun $(New-Guid).Guid
  6. To retrieve the list (after execution): [regex]::Replace(((Get-AzureRmVMDiagnosticsExtension -ResourceGroupName <resource group name> -VMName <vm name> -Name "RetrieveRemoteApps" -Status).SubStatuses[0].Message), "\\n", "`n")
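The same steps 4 through 6 as one runnable sketch; the script URL, VM name and resource group below are placeholders, and it uses the (now legacy) AzureRM module from the original commands:

#sketch of steps 4-6 with placeholder values; assumes the AzureRM module
Login-AzureRmAccount

$scriptUrl = "https://yourstorageaccount.blob.core.windows.net/scripts/getRemoteApps.ps1"

#push the script to the connection broker as a custom script extension
Set-AzureRmVMCustomScriptExtension -FileUri $scriptUrl -Run "getRemoteApps.ps1" `
    -VMName "yourBrokerVM" -Name "RetrieveRemoteApps" `
    -ResourceGroupName "yourResourceGroup" -Location "westeurope" `
    -ForceRerun $(New-Guid).Guid

#read the script output back from the extension's substatus message
$ext = Get-AzureRmVMDiagnosticsExtension -ResourceGroupName "yourResourceGroup" `
    -VMName "yourBrokerVM" -Name "RetrieveRemoteApps" -Status
[regex]::Replace($ext.SubStatuses[0].Message, "\\n", "`n")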

Running an Azure runbook on a System hybrid worker

Azure Runbooks are usually run in the cloud (on an automatically assigned ‘Microsoft’ host) or on a Hybrid Worker Group.

Hybrid Worker Groups consist of one or more machines, but there are also ‘System hybrid workers’, which are machines monitored by OMS. If you want to execute a PowerShell script directly on a specific System hybrid worker, or on a specific member of a worker group, you can use PowerShell and specify the host instead of the group:

Start-AzureRmAutomationRunbook -Name "RunbookName" -RunOn hybridWorkerName -AutomationAccountName "automationaccount" -ResourceGroupName "resourcegroup"

If you try this on a System Hybrid Worker, you’ll get an error on the device itself and in the runbook results:

“Invalid Runbook xxx Authenticode signature status – NotSigned”.

This can be ‘fixed’ by setting the following registry key to ‘False’:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\HybridRunbookWorker\GuidOfYourWorker\EnableSignatureValidation
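Scripted, that change looks like this (the worker GUID is a placeholder; find the real one under HKLM:\SOFTWARE\Microsoft\HybridRunbookWorker):

#sketch: disable signature validation for one hybrid worker (see the warning below)
$workerGuid = "00000000-0000-0000-0000-000000000000"
Set-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\HybridRunbookWorker\$workerGuid" `
    -Name "EnableSignatureValidation" -Value "False"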

Et voilà, the runbook runs nicely. I do not recommend disabling this key in production; this article is purely to share knowledge, and if someone knows how to do this without disabling the key, I'd love to hear it!