I Love the PowerShell Gallery, but...
Sometimes you have to go it alone.
In my last post I went through retrieving secrets from Azure Key Vault: creating Key Vaults, securing them with certificates, and fetching the secrets - all from PowerShell.
For the most part I kept it simple by making use of PowerShell modules - in particular MSAL.PS and Az (or subsets thereof). And then, in the final script, I switched to using the Azure REST API for the very last step.
Why?
Two reasons.
First is that modules - and the models underlying them - come and go. You write your PowerShell code around the model du jour, and then the future arrives, the model is declared obsolete, and your code no longer works. A case in point is most definitely Azure. In the beginning we had Azure Classic, then we had Azure RM, and now we have Az, with PowerShell modules to match. I have tens of obsolete scripts (and two obsolete books) on this alone to prove my point.
Second relates to my new job. I'm writing PowerShell for execution in an environment I don't control. I may want to make use of MSAL.PS to simplify my code, but my code runs in a client's environment, and they get to say what's permissible. So learning to write code without the crutch of unlimited PSGallery wealth is part of the deal.
So that's what I'm doing now, mandated by my work. The downside is that I have to spend time researching how all this stuff really works, when I could just PSG-shortcut to delivering cool stuff.
The upside is that I get to learn and understand how stuff really works.
So I'm choosing to take the same approach for my personal stuff, particularly where there's a good chance there'll be an overlap with work. Which happens quite often.
So for this exercise I'm going back to first principles, where the key problem is to create a token without the use of MSAL.PS while staying with the secure-by-certificate model. Getting a token starting from a client secret is well-documented, but that doesn't help, because then you need a way of keeping the client secret safe. And no, for obvious reasons saving the client secret in Key Vault is not the answer.
In that previous blog, the following pieces of information were supplied to MSAL.PS, so they should be sufficient for my own substitute code:
the TenantId (identifies the Azure instance we're working with)
the ApplicationId (the application we created with permissions to read from Key Vault)
the Certificate (the application has the certificate's public key, my machine has the full certificate, including the private key)
the Scope (identifies the resources we're requesting the token to authorise)
That's one constraint. Another is that - for my purposes - I'll be putting the code into a module, for my own personal use. For version 1, it's sufficient to hard-code everything but the certificate, so we can dispense with the Connect-AzAccount and Get-AzAdApplication function calls. (Version 2 may provide a function call to override the defaults.)
The final constraint is that I'm only permitting myself to use Microsoft documentation and RFCs from the web. I'm not permitting myself to reverse engineer MSAL.PS, nor to find ready-made answers in blogs written by Olga from Omsk.
So I'm going to quote sources as I go. This is because I did actually find a ready-made answer on Adam the Automator's blog here, quite late in the development. Adam commented that the process 'wasn't obvious in Microsoft's documentation', so I wanted to make sure that I had references.
The first reference is Microsoft identity platform application authentication certificate credentials which links to Microsoft identity platform and the OAuth 2.0 client credentials flow . The matching RFC is RFC 7519. None of these references, however, explained the precise algorithm of Base64url encoding, for which we need to see RFC 7515 (appendix C).
The Base64url encoding function appears in my draft module, BPAzureAuth...
#
# module to handle token creation and related functions
#
#
# see https://communary.net/2015/04/05/encodedecode-base64url/
function Invoke-Base64UrlEncode {
    <#
    .SYNOPSIS
        Encodes a byte array as a Base64url string.
    .DESCRIPTION
        Applies the Base64url encoding described in RFC 7515 (appendix C):
        standard Base64 with the padding removed and the characters '+'
        and '/' replaced by '-' and '_' respectively.
    .NOTES
        http://blog.securevideo.com/2013/06/04/implementing-json-web-tokens-in-net-with-a-base-64-url-encoded-key/
        Author: Øyvind Kallstad
        Date: 23.03.2015
        Version: 1.0
    #>
    [CmdletBinding()]
    param (
        [Parameter(Position = 0, Mandatory)]
        [byte[]] $Argument
    )
    $output = [System.Convert]::ToBase64String($Argument)
    $output = $output.Replace('=', '')  # strip padding
    $output = $output.Replace('+', '-')
    $output = $output.Replace('/', '_')
    Write-Output $output
}
Export-ModuleMember Invoke-Base64UrlEncode
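As a sanity check, here's the worked example from RFC 7515 appendix C, applying the same three replacements inline:

```powershell
# Worked example from RFC 7515 appendix C: standard Base64 of these
# octets is "A+z/4ME="; Base64url strips the padding and swaps '+'
# and '/' for '-' and '_'.
$bytes = [byte[]](3, 236, 255, 224, 193)
$b64 = [System.Convert]::ToBase64String($bytes)   # A+z/4ME=
$b64url = $b64.Replace('=', '').Replace('+', '-').Replace('/', '_')
$b64url
# A-z_4ME
```

That matches the expected output in the RFC, which is a useful confidence check before wiring the function into a JWT.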
The above lines were the source of a particularly awkward bug. The error manifested when I presented the JWT to the Key Vault API, and it got bounced with something like 'Invoke-RestMethod : Specified value has invalid Control characters'. That turned out to be a smart hyphen (rather than a plain ASCII hyphen) inserted when pasting from the web.
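If you suspect the same problem, a small helper can flag any text containing characters outside the ASCII range (the file name in the commented usage line is illustrative only):

```powershell
# Flag strings that contain characters outside the ASCII range - smart
# hyphens and smart quotes pasted from the web show up here.
function Test-NonAscii {
    param ([string] $Text)
    $Text -match '[^\x00-\x7F]'
}
# Scan a script file line by line, e.g.:
# Get-Content .\BPAzureAuth.psm1 | Where-Object { Test-NonAscii $_ }
```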
Taken together, the Microsoft and RFC references allow us to build the JSON Web Token (JWT):
1. Build the Header and Claims structures. A PowerShell hashtable is a suitable data structure for each.
2. Convert the Header and Claims structures to JSON - PowerShell's ConvertTo-Json -Compress cmdlet is appropriate here.
3. Convert the JSON strings from the previous step to Base64url encoding.
4. Join the two Base64url-encoded strings with a period '.' character. This is the 'data' part of the complete assertion.
5. Create a signature for the resulting string using the certificate's private key, and encode it using the same Base64url algorithm.
6. To the result of step 4, append a period '.' character and the Base64url-encoded signature. This is the complete assertion (the JWT).
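Steps 1 and 2 can be sketched like this (the field values are placeholders, not the full header the module builds):

```powershell
# Step 1: build the Header structure as a hashtable. The [ordered]
# accelerator keeps key order stable when converting to JSON.
$Header = [ordered]@{
    alg = 'RS256'
    typ = 'JWT'
}
# Step 2: flatten it to compact JSON
$HeaderJson = $Header | ConvertTo-Json -Compress
$HeaderJson
# {"alg":"RS256","typ":"JWT"}
```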
Steps 3-6 are encapsulated into the function below.
function New-JSONWebToken {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)]
        [string[]] $StringList,
        [Parameter(Mandatory)]
        $Certificate
    )
    # Convert each supplied string (header, claims) to Base64url
    $EncodedStringList = @()
    $StringList | ForEach-Object {
        $ClearTextBytes = [System.Text.Encoding]::UTF8.GetBytes($_)
        $EncodedStringList += Invoke-Base64UrlEncode $ClearTextBytes
    }
    # Join header and payload with "." to create a valid (unsigned) JWT
    $StringToSign = $EncodedStringList -join '.'
    # Get the private key object of the certificate
    $PrivateKey = $Certificate.PrivateKey
    # Define RSA signature padding and hashing algorithm
    $RSAPadding = [Security.Cryptography.RSASignaturePadding]::Pkcs1
    $HashAlgorithm = [Security.Cryptography.HashAlgorithmName]::SHA256
    # Create a signature over the unsigned JWT, Base64url-encoded
    $Signature = Invoke-Base64UrlEncode ($PrivateKey.SignData([System.Text.Encoding]::UTF8.GetBytes($StringToSign), $HashAlgorithm, $RSAPadding))
    # Join the signature to the JWT with "."
    $JWT = $StringToSign + "." + $Signature
    $JWT
}
Export-ModuleMember New-JSONWebToken
We now need to build this into a REST API call to obtain a token. This is described in Microsoft identity platform and the OAuth 2.0 client credentials flow, in the section 'Get a token' (second case: access token request with a certificate).
Note that we are only concerned with the Request token flow, as the consent phase is implied by the association of the Azure Application with the Certificate.
function Get-AzRESTToken {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)]
        $Certificate,
        [Parameter(Mandatory)]
        $ApplicationId,
        [Parameter()]
        $TenantId = 'd394964e-de09-4ed8-84c8-e6457963f5fc',
        [Parameter()]
        $Scope = "https://vault.azure.net/.default"
    )
    # JWT time claims are expressed as seconds since the Unix epoch
    $epoch = [datetime]::Parse("1970-01-01T00:00:00Z")
    $now = Get-Date
    $nowSeconds = [math]::Floor(($now - $epoch).TotalSeconds)
    $after10minsSeconds = $nowSeconds + (10 * 60)
    $CertificateBase64Hash = [System.Convert]::ToBase64String($Certificate.GetCertHash())
    # The v2.0 token endpoint is both the audience ('aud') of the
    # assertion and the URL we POST the request to
    $EndPoint = "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token"
    $randomGuid = ([guid]::NewGuid()).Guid
    #
    # build the payload
    $JWTHeader = @"
{
    "alg": "RS256",
    "typ": "JWT",
    "x5t": "$($CertificateBase64Hash -replace '\+','-' -replace '/','_' -replace '=')"
}
"@
    $JWTHeader = $JWTHeader | ConvertFrom-Json | ConvertTo-Json -Compress
    $JWTClaims = @"
{
    "aud": "$EndPoint",
    "exp": $after10minsSeconds,
    "iss": "$ApplicationId",
    "jti": "$randomGuid",
    "nbf": $nowSeconds,
    "sub": "$ApplicationId"
}
"@
    $JWTClaims = $JWTClaims | ConvertFrom-Json | ConvertTo-Json -Compress
    $JWT = New-JSONWebToken -StringList @($JWTHeader, $JWTClaims) -Certificate $Certificate
    # Create a hash with body parameters
    $Body = @{
        client_id             = $ApplicationId
        client_assertion      = $JWT
        client_assertion_type = "urn:ietf:params:oauth:client-assertion-type:jwt-bearer"
        scope                 = $Scope
        grant_type            = "client_credentials"
    }
    # Use the self-generated JWT as Authorization
    $Header = @{
        Authorization = "Bearer $JWT"
    }
    # Splat the parameters for Invoke-RestMethod for cleaner code
    $PostSplat = @{
        ContentType = 'application/x-www-form-urlencoded'
        Method      = 'POST'
        Body        = $Body
        Uri         = $EndPoint
        Headers     = $Header
    }
    $Request = Invoke-RestMethod @PostSplat
    $Request
}
Export-ModuleMember Get-AzRESTToken
As an aside, that construction:
$JWTClaims = $JWTClaims | ConvertFrom-Json | ConvertTo-Json -Compress
I expect I'll be using that in a later blog post. For readability's sake I work with JSON laid out across multiple lines, but the technique above flattens the JSON into a single line, which is what Key Vault requires for its secrets.
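A minimal illustration of the round trip:

```powershell
# Multi-line JSON, laid out for human readability
$pretty = @"
{
    "name": "example",
    "value": 42
}
"@
# Round-tripping through ConvertFrom-Json collapses the layout
$pretty | ConvertFrom-Json | ConvertTo-Json -Compress
# {"name":"example","value":42}
```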
Note that the format of the returned token is different to the token object returned by MSAL.PS so care is needed when accessing the token structure - see how it's handled below.
That completes the module file, which is named BPAzureAuth.psm1. At some future date I'll complete that process (and maybe look at setting up a module server) - but for testing purposes it's fine to import the psm1 file directly, as you'll see in the invoking script below.
So here's the code to retrieve the secret, modified from my last post, removing the dependencies on MSAL.PS, and removing queries against Azure to determine things that I'll probably embed as constants in the final version of the module:
#
# script to test the KV module can get a secret
#
param (
    $TenantId = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee',
    $ApplicationId = '1acfcfbe-049a-4a4f-a14d-2ec2cb26bbc3', # 'BillPSecretReader'
    $KVScope = 'https://vault.azure.net/.default',
    $KeyVaultName = 'MyVaultName',
    $KeyVaultUri = 'https://myvaultname.vault.azure.net/'
)
Remove-Module BPAzureAuth -ErrorAction SilentlyContinue
Import-Module ..\BPAzureAuth\BPAzureAuth.psm1 -Verbose
$CertFriendlyName = "KVReader_$($env:USERNAME)_$($env:COMPUTERNAME)"
$ClientCertificate = Get-ChildItem Cert:\CurrentUser\My |
    Where-Object { $_.FriendlyName -eq $CertFriendlyName }
$Token = Get-AzRESTToken -Certificate $ClientCertificate `
                         -ApplicationId $ApplicationId `
                         -Scope $KVScope
function Get-KVSecretSet {
    [CmdletBinding()]
    param (
        [Parameter()] [string] $SecretName = 'PSAutomation',
        [Parameter(Mandatory)] $KeyVaultUri,
        [Parameter(Mandatory)] $Token
    )
    if ($Token.GetType().Name -eq 'AuthenticationResult') {
        #
        # MSAL token object
        $AccessToken = $Token.AccessToken
    }
    else {
        #
        # certificate-derived token from the Azure AD REST API
        $AccessToken = $Token.access_token
    }
    $headers = @{}
    $headers["Content-Type"]  = 'application/json'
    $headers["Authorization"] = "Bearer $($AccessToken)"
    $headers["Accept"]        = 'application/json; version=4'
    $Url = $KeyVaultUri + "secrets/$SecretName" + "?api-version=7.3"
    # Splat the parameters for Invoke-RestMethod for cleaner code
    $GetSplat = @{
        ContentType = 'application/json'
        Method      = 'GET'
        Uri         = $Url
        Headers     = $headers
    }
    # Request the secret!
    $Response = Invoke-RestMethod @GetSplat
    $Response
}
$Secret = Get-KVSecretSet -SecretName TestSecret -Token $Token -KeyVaultUri $KeyVaultUri
Write-Host "Secret is '$($Secret.value)'"
That function - Get-KVSecretSet - is (of course) going to find its way into a future module that abstracts whatever Key Vault operations I need. That will be the interface I program to when I'm working with Key Vault, and if Microsoft does declare its REST APIs obsolete at some future date, I'll preserve that interface and create a new set of modules to implement it, rather than recoding every script that uses Key Vault.
Next post to follow soon...