Use node.js LTS for Azure Functions

I did a proof-of-concept for an Azure Function in node.js that uses Redis Cache for session storage, as the function runtime is 100% managed and using a memory store doesn’t make sense. Once I had this working I wanted to play with the local development support, but that didn’t work as I was using node.js v. 13 and the Azure tooling only works with an LTS version of node.js (meaning 12.x at the time of this writing). To fix it I had to uninstall my current version of node.js and switch to v. 12.x as follows (I’m using Homebrew to manage dependencies):

brew uninstall node
brew install node@12

Then I had to update my PATH as the node binary is not in /usr/local/bin anymore but rather in /usr/local/opt/node@12/bin. Once that was done the Azure tooling for local application development worked like a charm.
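
On my machine that meant putting the keg-only node@12 install first on the PATH, e.g. in ~/.zshrc or ~/.bash_profile (the exact path may differ depending on your Homebrew setup):

export PATH="/usr/local/opt/node@12/bin:$PATH"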

The proof-of-concept is at https://github.com/lekkimworld/poc-azure-functions-with-session.

Utility to generate JWTs for use with Salesforce

Whenever you work with JSON Web Tokens (JWTs) generating them for testing is always a hassle, as they are usually required to expire quite quickly (on the order of minutes). To make that easier I wrote a small utility in node.js to generate JWTs compatible with the Salesforce OAuth 2.0 JWT Bearer Flow.

Code is on GitHub at https://github.com/lekkimworld/salesforce-jwt-generator
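
To give a feel for what such a utility does, here is a minimal sketch using the jsonwebtoken npm package; the key file name and claim values are purely illustrative and not necessarily what the repo uses:

// sign a short-lived JWT for the Salesforce OAuth 2.0 JWT Bearer Flow
const { readFileSync } = require('fs');
const jwt = require('jsonwebtoken');

const privateKey = readFileSync('./private_key.pem', 'utf8');
const token = jwt.sign({}, privateKey, {
    algorithm: 'RS256',
    issuer: '<consumer key of the Connected App>',  // iss
    subject: 'user@example.com',                    // sub - the username to act as
    audience: 'https://login.salesforce.com',       // aud
    expiresIn: 300                                  // exp - 5 minutes from now
});
console.log(token);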

YMMV!

Using the inbound OAuth 2.0 JWT Bearer Flow in Salesforce

If working with JWTs for use with the inbound OAuth 2.0 JWT Bearer Token Flow you need to import the certificate (containing the public key) used to validate the signature of the JWT when calling into Salesforce. This is done on the Connected App and the import supports the binary DER format and the plain-text PEM format.

Once you’ve created your Connected App, check "Enable OAuth Settings" and under "Use digital signatures" import the certificate (PEM or DER format) that will be used to validate the signature of the JWT.
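
If you just need something to test with, a self-signed certificate and private key generated with OpenSSL will do (file names below are examples only); upload the certificate to the Connected App and keep the private key for signing the JWTs:

openssl req -x509 -newkey rsa:2048 -keyout private_key.pem -out certificate.pem -days 365 -nodes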

Using the actual flow requires you to set the Consumer Key as the issuer ("iss"), the username of the user to act as, as the subject ("sub"), and the login URL as the audience ("aud"). The login URL will be https://login.salesforce.com, https://test.salesforce.com or a community URL.
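
Put together, the payload of the JWT could look something like the below (values are made up for illustration):

{
  "iss": "<consumer key of the Connected App>",
  "sub": "user@example.com",
  "aud": "https://login.salesforce.com",
  "exp": 1573815823
}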

To make it easier to work with and test I’ve created a node.js console app that allows you to generate JWTs that are compatible with Salesforce. The code is on GitHub at https://github.com/lekkimworld/salesforce-jwt-generator.

Exchanging the JWT for an access_token is done as below, setting the grant_type to urn:ietf:params:oauth:grant-type:jwt-bearer and specifying the signed JWT using the assertion parameter:

POST /services/oauth2/token HTTP/1.1
Host: login.salesforce.com
Content-Type: application/x-www-form-urlencoded

grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer
&assertion=eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJpYX...VtbyJ9.AjsbapI5XTeLpPLZJk2_a2PnpAV0iUOT6xxgUWZsjYBeH9FcWHjiS6DMw1xuNyOcHNxY6hTAp1_D6HPDY4i0hgOFzb0YUaaWf9MoplpNknsGhYZ0SOHX2OSIfFVZ7KdPx1_BudRSi3VDNt33EZhf3cm07rMSJu-DOzHP1BSJE4HXALusEV3WgdSyijUce4daF3PVANI8w-yGAhFkdO8RCrCAufaZVxtTI1ZmnXeDRxbULQZ9hnn0vtgYHaMcgTK41ZGay3UN7XVa-FERG4WcdnvylPAhnalgSFlCDX3UHvUdn-wxYX0pSPw41R2rjPUDCWBiEV8ULzEiWQrBpyqkww
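
From node.js the exchange could look something like this, a rough sketch using the built-in https and querystring modules (the signed JWT is read from an environment variable for the sake of the example):

// exchange a signed JWT for an access_token using the JWT Bearer Flow
const https = require('https');
const { stringify } = require('querystring');

const signedJwt = process.env.JWT;  // the JWT generated and signed earlier
const body = stringify({
    grant_type: 'urn:ietf:params:oauth:grant-type:jwt-bearer',
    assertion: signedJwt
});
const req = https.request({
    hostname: 'login.salesforce.com',
    path: '/services/oauth2/token',
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' }
}, (res) => {
    let data = '';
    res.on('data', (chunk) => data += chunk);
    res.on('end', () => console.log(JSON.parse(data).access_token));
});
req.write(body);
req.end();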

Generating JWTs for Azure in Apex

Lately I’ve been playing around with Azure and integrating Salesforce and Azure. One of the integration patterns calls for using JSON Web Tokens (JWT) that you can then exchange for an access token in Azure. There is a catch however…

Since Azure requires that the thumbprint of the certificate be added to the header of the JWT (using the key "x5t") we cannot use the built-in support for JWT in Named Credentials as there are no provisions for custom header key/values. The JWT/JWS classes in Apex cannot be used either as they do not allow the header to be customized. Building upon https://github.com/salesforceidentity/jwt I’ve created https://github.com/lekkimworld/azurejwt-apex to bridge the gap.

This allows you to build and sign a JWT that you may exchange for an access token using your tenant’s v2.0 OAuth token endpoint in Azure. Example Apex code looks like this:

// declarations (because I'm old school)
final String azureClientId = '88d888a5-0cf4-473a-b9a0-7c88e6fc888e';
final String azureTenantId = 'b34feb2b-132f-4322-af1d-c888f5d888d0';
final String azureCertThumbprint = '4rElsDFTysrbKhB0zTsrRNSxT6s=';
final String azureScopes = '5384888d-868f-442b-b1b3-8688807de914/.default';

// create JWT with certificate from keys mgmt and set the x5t in the header to the 
// thumbprint of the cert as expected by Azure
AzureJWT jwt = new AzureJWT();
jwt.cert = 'JWT_Callout_Certificate';
jwt.iss = azureClientId;
jwt.sub = azureClientId;
jwt.aud = 'https://login.microsoftonline.com/' + azureTenantId + '/oauth2/v2.0/token';
jwt.x5t = azureCertThumbprint;

// invoke the flow and obtain an access_token
final String access_token = AzureJWTBearerFlow.getAccessToken(azureClientId, azureTenantId, azureScopes, jwt);

// use the access token against a Function App in Azure
HttpRequest req = new HttpRequest();
req.setEndpoint('https://foo-functions-demo.azurewebsites.net/api/MyFunction?name=Salesforce');
req.setMethod('GET');
req.setHeader('Authorization', 'Bearer ' + access_token);
Http http = new Http();
HTTPResponse res = http.send(req);
System.debug(res.getBody());

In the https://github.com/lekkimworld/azurejwt-apex GitHub repo you will find the two Apex classes used above together with the example code.

The certificate thumbprint (azureCertThumbprint in the example above) isn’t the regular hex-encoded SHA-1 thumbprint but a base64 encoding of the raw fingerprint bytes. To make it even more interesting the thumbprint displayed in the Azure Portal is not the thumbprint we need. The thumbprint/hash may be computed like this (gleaned from https://stackoverflow.com/a/52625165):

echo $(openssl x509 -in yourcert.pem -fingerprint -noout) | sed 's/SHA1 Fingerprint=//g' | sed 's/://g' | xxd -r -ps | base64

Note-to-Self – PowerPoint Presentation as PowerPoint Template on Mac

Power-user tip if you’re on a Mac – to turn a PowerPoint presentation into a presentation template in PowerPoint, do as follows:

  1. Save the PowerPoint presentation to your Mac, open it in PowerPoint, press Cmd-Shift-S (Save As) and save it as a PowerPoint Template (.potx)
  2. Go to Finder and press Cmd-Shift-G and paste in the following path without the quotes “~/Library/Group Containers/UBF8T346G9.Office/User Content/Templates”
  3. Move the potx-file to this folder
  4. (Re-)Launch PowerPoint (or select “New from Template” from the File menu)
  5. Search for a word from the template title or switch to Personal templates to see the template

YMMV!

Using an Auth. Provider and Named Credentials in Salesforce with Azure OAuth

Please note: When I refer to “Azure” below I’m referring to Microsoft Azure, the cloud product from Microsoft.

Please note: If you know all about why Auth. Providers and Named Credentials are great and simply wanna know about how to use them with Microsoft Azure feel free to skip down to “How does it apply to Azure”.

All this started some time back when I was at a customer showing how to integrate with Salesforce and call other APIs from Salesforce. In Salesforce we have a great concept called Authentication Providers ("Auth. Provider") that handles the underlying authentication protocol such as OAuth 2.0. Auth. Providers may be used to provide Single-Sign-On in Communities (our portals) or with Named Credentials. The latter is a way to externalize authentication from a piece of code or functionality. So if I need to call an external API from Apex (our Java-like programming language) I can simply do something like the code below:

HttpRequest req = new HttpRequest();
req.setEndpoint('callout:My_NamedCredential/accounts/list');
req.setMethod('GET');
Http http = new Http();
HTTPResponse res = http.send(req);
System.debug(res.getBody());

As you can see there is nothing here about authentication. Nor is there anything about the actual URL being called. That information is externalized into a Named Credential, in this case called My_NamedCredential. The only part I specify is the specific path of the API I’m calling (here "/accounts/list"). This is great for development as it makes it easier for me as the developer, but it’s also easier for admins moving changes between environments as the endpoint and credential management is externalized into setup for the org. It means it’s easy to change credentials and endpoints between development, test, QA and production. Awesome!

Setting this up is pretty easy as well and is done in 3 steps:

  1. Start by setting up the Auth. Provider by specifying your client_id (we call it the “Consumer Key”), the client_secret (we call it the “Consumer Secret”), the Authorization endpoint and the token endpoint. The last two you get from your provider – in this case Azure. For now with version 2 of their identity platform they will be https://login.microsoftonline.com/<tenant>/oauth2/v2.0/authorize and https://login.microsoftonline.com/<tenant>/oauth2/v2.0/token respectively (replace <tenant> with your tenant id).
  2. Now create a Named Credential specifying the root URL you would like to call against in the “URL” field. For “Identity Type” select “Named Principal” to use the same credentials across the org or “Per User” to use user-specific credentials, and set “Authentication Protocol” to “OAuth 2.0”. In “Authentication Provider” select the provider we created above and set the scope to use.
  3. Now use the Named Credential as discussed above.

Now let’s discuss what’s special about Azure.

How does it apply to Azure?

Above I was intentionally pretty loose when discussing the scope to set in the Named Credentials. The reason is that this is quite specific when dealing with Azure.

In Azure an access token is actually a JSON Web Token (JWT, https://jwt.io) which is a standardized token format containing signed claims that may be verified by the recipient. The payload in a JWT access token from Azure could look like this:

{
  "aud": "2dd1b05d-8b45-4aba-9181-c24a6f568955",
  "iss": "https://sts.windows.net/e84c365f-2f35-8c7d-01b4-c3412514f42b/",
  "iat": 1573811923,
  "nbf": 1573811923,
  "exp": 1573815823,
  "aio": "42VgYJj0Zt3RXQIblViYGkuiHs9dAQA=",
  "appid": "32c0ba71-04f4-4b3a-a317-1f1febd5fc22",
  "appidacr": "1",
  "idp": "https://sts.windows.net/e84c365f-2f35-8c7d-01b4-c3412514f42b/",
  "oid": "394e0c1a-0992-42e7-9875-3b04786147ca",
  "sub": "394e0c1a-0992-42e7-9875-3b04786147ca",
  "tid": "e84c365f-2f35-8c7d-01b4-c3412514f42b",
  "uti": "ESAOZDPFeEydSYxohgsRAA",
  "ver": "1.0"
}

Here the important piece is the “aud” claim as it contains the ID of the application or API on Azure the token is valid for. In this case it’s for an App Registration in Azure.

When we deal with OAuth providers we might be used to dealing with standard OpenID Connect scopes like openid, email, profile and offline_access. For Azure, which is much more of a generic platform, the scope we specify is used to indicate what application we are requesting access to. In Azure the issued access token is specific to the application we request access to and we can only request access for a single application at a time, which has some implications:

  1. If you request an access token for an application (App Registration in Azure) the access token is valid for that application only
  2. The access token is not valid for other APIs like the Microsoft Graph
  3. If you do not request an access token for a specific application Azure issues you an access token for the Microsoft Graph API

This may be obvious but it caused me quite some troubleshooting.
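
When troubleshooting it can help to decode the access token and check the "aud" claim yourself. The payload is just base64url-encoded JSON, so a quick node.js one-off like the sketch below (no signature verification, for troubleshooting only) will show it:

// decode the payload of a JWT access token to inspect the "aud" claim
const accessToken = process.env.ACCESS_TOKEN;
const payload = JSON.parse(Buffer.from(accessToken.split('.')[1], 'base64').toString('utf8'));
console.log(payload.aud);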

OAuth scopes in Azure

When you request an access token from Azure you must specify what API you intend to use it for. There are two versions of the OAuth endpoints (v1 and v2). In version 1 you use a resource parameter to indicate the target application to Azure. In version 2 this has been standardized and now uses the standard scope parameter. It also means that scope is now not simply the OpenID Connect standard scopes (such as openid, offline_access) or application-specific scopes, e.g. from the Microsoft Graph, but is also used to indicate the API you are requesting access to, e.g. an App Registration.
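
To illustrate, the body parameter indicating the target application would look roughly like this in the two versions (application ID reused from the token example above, more on the .default suffix below):

v1: resource=2dd1b05d-8b45-4aba-9181-c24a6f568955
v2: scope=2dd1b05d-8b45-4aba-9181-c24a6f568955/.default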

Now to the fun, the stuff you just need to know and the stuff which is easy enough to find if you know what to Google for…

Specifying the URI of an App Registration as the scope is not enough. You have to add a scope to the Application URI and unless you’ve defined a custom scope for your app you should use ".default". So if your App Registration has a URI of "api://2dd1b05d-8b45-4aba-9181-c24a6f568955" use "2dd1b05d-8b45-4aba-9181-c24a6f568955/.default" as the scope. If you’ve registered custom scopes for an application those may be used in place of .default, but you cannot combine .default with more specific scopes. Also, using specific scopes is restricted in certain OAuth flows due to the way delegated permissions work in Azure. But that’s for another time.

Using the client_credentials grant type (works somewhat like the Salesforce username/password OAuth flow) this becomes a POST like the one below. Please note that the Azure tenant_id is both in the URL and a parameter in the POST body:

POST /f84c375f-3f35-4d9d-93b3-c4216511f87a/oauth2/v2.0/token HTTP/1.1
Host: login.microsoftonline.com
Content-Type: application/x-www-form-urlencoded
Content-Length: XYZ
Connection: close

client_id=32c0ba71-04a4-4b3a-a317-1b1febd5fc22
&client_secret=shhhhh...
&grant_type=client_credentials
&tenant=f84c375f-3f35-4d9d-93b3-c4216511f87a
&scope=2dd1b95d-8b45-4aba-9181-c24f6f5e8955/.default

The response is a JSON document with an access_token (a JWT which may then be used as a Bearer token):

{
   "token_type":"Bearer",
   "expires_in":3599,
   "ext_expires_in":3599,
   "access_token":"eyJ0eXAiOiJKV1QiLCJhbGciO..."
}

When using Azure with Salesforce I would recommend using version 2 of the OAuth endpoints as Salesforce Auth. Providers and Named Credentials do not have a way to send custom parameters without resorting to writing a custom Auth. Provider implementation. This means there is no standard way to send the “resource” parameter to the version 1 OAuth endpoint.

What scopes should I specify in Salesforce?

When you create your Auth. Provider or Named Credentials specify the scopes you need from Azure. My personal preference is to specify the scopes on the Auth. Provider when using it for Single-Sign-On and on the Named Credentials when using it for API access. One thing to note again is that an access token is for one API only and that an access token for a custom application will not also work for the Microsoft Graph.

Lesson learned: You should not specify the UserInfo endpoint on the Auth. Provider in Salesforce unless it’s used for Single-Sign-On AND you are not specifying an App Registration in the Scopes-field.

If you specify an "App Registration scope" in the Scopes-field and also specify the UserInfo endpoint, Salesforce will attempt to read from the UserInfo endpoint with the obtained access token following successful authentication. This will fail because the access token is only valid for the intended API and not for the Microsoft Graph.

Feel free to add other standard OpenID Connect scopes for Auth. Providers for Single-Sign-On. For most uses you would want to specify the offline_access scope as it ensures your Auth. Provider or Named Credential receives a refresh token.

Calling API’s protected by Azure API Management (APIM)

So far so good. Now you can obtain access tokens and use them with Azure Function Apps or read from the Microsoft Graph. But what if you need to access APIs hosted in Azure API Management (APIM)? Well read on…

In Azure API Management APIs are governed by subscriptions and you need to specify a subscription ID when calling into the API. The subscriptions map to users which are different from the ones in Azure AD. I’m not really sure why this is the case but I’m sure there are reasons.

The subscription ID you should use will be supplied by the people managing the API and is a GUID-like string. When you call the API be sure to supply it using the "Ocp-Apim-Subscription-Key" header. Failing to supply the subscription ID will result in an error like the one below:

{"statusCode": 401,   
"message": "Access denied due to missing subscription key. Make sure to include subscription key when making requests to an API."}

Putting all of the above together, POSTing to an API behind Azure API Management using Apex would be something like the below using a Named Credential called "My_NamedCredential":

HttpRequest req = new HttpRequest();
req.setEndpoint('callout:My_NamedCredential/echo');
req.setMethod('POST');
req.setHeader('Ocp-Apim-Subscription-Key', '7f9ed...1d6e8');
req.setBody('Hello, Salesforce, World!');
Http http = new Http();
HTTPResponse res = http.send(req);
System.debug(res.getBody());

As always… YMMV!

Running MuleSoft Anypoint Studio on macOS with Homebrew

I’m using Homebrew to manage many of the add-ons I use on my Mac. I had OpenJDK 12.0 installed previously but needed OpenJDK 1.8 to run MuleSoft Anypoint Studio 7.4. To make that work I had to install OpenJDK 1.8 from AdoptOpenJDK (https://adoptopenjdk.net/) using Homebrew and then tweak the ini file controlling Anypoint Studio, which is Eclipse-based.

I started by downloading and installing MuleSoft Anypoint Studio from MuleSoft. Then I installed OpenJDK 1.8 using Homebrew.

brew tap AdoptOpenJDK/openjdk
brew cask install adoptopenjdk8

OpenJDK 1.8 was installed to /Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk and the Java Runtime Environment could be found in /Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home/jre. Making sure it works is always a good idea.

/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home/jre/bin/java -version

Now edit AnypointStudio.ini and specify the Java runtime to use using the -vm switch (shown below).

-startup
../Eclipse/plugins/org.eclipse.equinox.launcher_1.4.0.v20161219-1356.jar
--launcher.library
../Eclipse/plugins/org.eclipse.equinox.launcher.cocoa.macosx.x86_64_1.1.551.v20171108-1834
-vm 
/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home/jre
-vmargs
--add-modules=ALL-SYSTEM
-Xms512m
-Xmx1024m
-XX:MaxPermSize=512m
-Dosgi.instance.area.default=@user.home/AnypointStudio/studio-workspace
-Dhttps.protocols=TLSv1.2,TLSv1.1,TLSv1
-Dsun.zip.disableMemoryMapping=true
-Dequinox.resolver.revision.batch.size=1
-Dmule.testingMode=true
-Dorg.mule.tooling.runtime.args=-XX:-UseBiasedLocking,-Dfile.encoding=UTF-8
-Dorg.mule.tooling.runtime.proxyVmArgs=-Dcom.ning.http.client.AsyncHttpClientConfig.useProxyProperties=true
-Djdk.http.auth.tunneling.disabledSchemes=
-XX:ErrorFile=./studio_crash_report.log
-Dorg.mule.tooling.client.usecache=true
-Dtooling.concurrent.local.repository.enabled=false
-Dtooling.client.configuration.filter.parameters.reserved.names=false
-Dfile.encoding=UTF-8
-Djava.awt.headless=true
-XstartOnFirstThread
-Dorg.eclipse.swt.internal.carbon.smallFonts