Utility to generate JWTs for use with Salesforce

Whenever you work with JSON Web Tokens (JWTs), generating them for testing is always a hassle as they are usually required to expire quite quickly (on the order of minutes). To make that easier I wrote a small utility in node.js to generate JWTs compatible with the Salesforce OAuth 2.0 JWT Bearer Flow.

Code is on Github at https://github.com/lekkimworld/salesforce-jwt-generator

YMMV!

Using the inbound OAuth 2.0 JWT Bearer Flow in Salesforce

If working with JWTs for use with the inbound OAuth 2.0 JWT Bearer Token Flow, you need to import a certificate holding the public key used to validate the signature of the JWT when calling into Salesforce. This is done on the Connected App, and the import supports the binary DER format and the plain-text PEM format.

Once you've created your Connected App, check "Enable OAuth Settings" and under "Use digital signatures" import the certificate (PEM or DER format) used to validate the signature of the JWT.

Using the actual flow requires that you set the Consumer Key as the issuer ("iss"), the username of the user to act as as the subject ("sub") and the login URL as the audience ("aud"). The login URL will be https://login.salesforce.com, https://test.salesforce.com or a community URL.
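
To make this concrete, here is a minimal sketch of generating such a JWT in node.js using the jsonwebtoken package; the consumer key, username, private key path and expiry are placeholders you would replace with your own values.

// Minimal sketch: generate a Salesforce-compatible JWT with the jsonwebtoken package.
// The private key must match the certificate uploaded to the Connected App.
const fs = require("fs");
const jwt = require("jsonwebtoken");

const privateKey = fs.readFileSync("./private_key.pem");
const token = jwt.sign({}, privateKey, {
    algorithm: "RS256",
    issuer: "<consumer key of the Connected App>",   // "iss"
    subject: "user@example.com",                     // "sub" - the user to act as
    audience: "https://login.salesforce.com",        // "aud" - the login URL
    expiresIn: 3 * 60                                // keep it short-lived, e.g. 3 minutes
});
console.log(token);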

To make it easier to work with and test, I've created a node.js console app that allows you to generate JWTs that are compatible with Salesforce. The code is on Github at https://github.com/lekkimworld/salesforce-jwt-generator.

Exchanging the JWT for an access_token is done as below, setting the grant_type to urn:ietf:params:oauth:grant-type:jwt-bearer and specifying the signed JWT using the assertion parameter:

POST /services/oauth2/token HTTP/1.1
Host: login.salesforce.com
Content-Type: application/x-www-form-urlencoded

grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer
&assertion=eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJpYX...VtbyJ9.AjsbapI5XTeLpPLZJk2_a2PnpAV0iUOT6xxgUWZsjYBeH9FcWHjiS6DMw1xuNyOcHNxY6hTAp1_D6HPDY4i0hgOFzb0YUaaWf9MoplpNknsGhYZ0SOHX2OSIfFVZ7KdPx1_BudRSi3VDNt33EZhf3cm07rMSJu-DOzHP1BSJE4HXALusEV3WgdSyijUce4daF3PVANI8w-yGAhFkdO8RCrCAufaZVxtTI1ZmnXeDRxbULQZ9hnn0vtgYHaMcgTK41ZGay3UN7XVa-FERG4WcdnvylPAhnalgSFlCDX3UHvUdn-wxYX0pSPw41R2rjPUDCWBiEV8ULzEiWQrBpyqkww
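
For completeness, a sketch of performing that exchange from node.js (node-fetch v2 style require; Node 18+ also has a global fetch), assuming the signed JWT from above:

// Sketch: exchange the signed JWT for an access_token
const fetch = require("node-fetch");

async function exchangeJwt(signedJwt, loginUrl = "https://login.salesforce.com") {
    const params = new URLSearchParams();
    params.append("grant_type", "urn:ietf:params:oauth:grant-type:jwt-bearer");
    params.append("assertion", signedJwt);
    const res = await fetch(`${loginUrl}/services/oauth2/token`, {
        method: "POST",
        body: params // sent as application/x-www-form-urlencoded
    });
    return res.json(); // contains access_token and instance_url on success
}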

Using an Auth. Provider and Named Credentials in Salesforce with Azure OAuth

Please note: When I refer to “Azure” below I’m referring to Microsoft Azure, the cloud product from Microsoft.

Please note: If you know all about why Auth. Providers and Named Credentials are great and simply wanna know about how to use them with Microsoft Azure feel free to skip down to “How does it apply to Azure”.

All this started some time back when I was at a customer showing how to integrate with Salesforce and call other APIs from Salesforce. In Salesforce we have a great concept called Authentication Providers ("Auth. Provider") that handles the underlying authentication protocol such as OAuth 2.0. Auth. Providers may be used to provide Single Sign-On in Communities (our portals) or with Named Credentials. The latter is a way to externalize authentication from a piece of code or functionality. So if I need to call an external API from Apex (our Java-like programming language) I can simply do something like the code below:

HttpRequest req = new HttpRequest();
req.setEndpoint('callout:My_NamedCredential/accounts/list');
req.setMethod('GET');
Http http = new Http();
HTTPResponse res = http.send(req);
System.debug(res.getBody());

As you can see there is nothing here about authentication. Nor is there anything about the actual URL being called. That information is externalized into a Named Credential, in this case called My_NamedCredential. The only part I specify is the specific path of the API I'm referring to (here "/accounts/list"). This is great for development as it makes it easier for me, the developer, but it also makes it easier for admins moving changes between environments, as endpoint and credential management is externalized as setup for the org. It means it's easy to change credentials and endpoints between development, test, QA and production. Awesome!

Setting this up is pretty easy as well and is done in 3 steps:

  1. Start by setting up the Auth. Provider by specifying your client_id (we call it the “Consumer Key”), the client_secret (we call it the “Consumer Secret”), the Authorization endpoint and the token endpoint. The last two you get from your provider – in this case Azure. For now with version 2 of their identity platform they will be https://login.microsoftonline.com/<tenant>/oauth2/v2.0/authorize and https://login.microsoftonline.com/<tenant>/oauth2/v2.0/token respectively (replace <tenant> with your tenant id).
  2. Now create a Named Credential specifying the root URL you would like to call against in the "URL" field. For "Identity Type" select "Named Principal" to use the same credentials across the org or "Per User" to use user-specific credentials, and set "Authentication Protocol" to "OAuth 2.0". In "Authentication Provider" select the provider we created above and set the scope to use.
  3. Now use the Named Credential as discussed above.

Now let’s discuss what’s special about Azure.

How does it apply to Azure?

Above I was intentionally pretty loose when discussing the scope to set in the Named Credentials. The reason is that this is quite specific when dealing with Azure.

In Azure an access token is actually a JSON Web Token (JWT, https://jwt.io), which is a standardized token format containing signed claims that may be verified by the recipient. The payload in a JWT access token from Azure could look like this:

{
  "aud": "2dd1b05d-8b45-4aba-9181-c24a6f568955",
  "iss": "https://sts.windows.net/e84c365f-2f35-8c7d-01b4-c3412514f42b/",
  "iat": 1573811923,
  "nbf": 1573811923,
  "exp": 1573815823,
  "aio": "42VgYJj0Zt3RXQIblViYGkuiHs9dAQA=",
  "appid": "32c0ba71-04f4-4b3a-a317-1f1febd5fc22",
  "appidacr": "1",
  "idp": "https://sts.windows.net/e84c365f-2f35-8c7d-01b4-c3412514f42b/",
  "oid": "394e0c1a-0992-42e7-9875-3b04786147ca",
  "sub": "394e0c1a-0992-42e7-9875-3b04786147ca",
  "tid": "e84c365f-2f35-8c7d-01b4-c3412514f42b",
  "uti": "ESAOZDPFeEydSYxohgsRAA",
  "ver": "1.0"
}

Here the important piece is the “aud” claim as it contains the ID of the application or API on Azure the token is valid for. In this case it’s for an App Registration in Azure.

When we deal with OAuth providers we might be used to dealing with standard OpenID Connect scopes like openid, email, profile and offline_access. For Azure, which is much more of a generic platform, the scope we specify is used to indicate what application we are requesting access to. In Azure the issued access token is specific to the application we request access to, and we can only request access for a single application at a time, which has some implications:

  1. If you request an access token for an application (App Registration in Azure) the access token is valid for that application only
  2. The access token is not valid for other APIs like the Microsoft Graph
  3. If you do not request an access token for a specific application Azure issues you an access token for the Microsoft Graph API

This may be obvious but it caused me quite some troubleshooting.
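
A quick way to check which API an access token is actually for is to decode the JWT payload and look at the "aud" claim. A small node.js sketch for troubleshooting (no signature verification):

// Decode the JWT payload and return the "aud" claim (troubleshooting only)
function tokenAudience(accessToken) {
    const [, payloadPart] = accessToken.split(".");
    // Node's base64 decoder also accepts the URL-safe alphabet used by JWTs
    const payload = JSON.parse(Buffer.from(payloadPart, "base64").toString("utf8"));
    return payload.aud;
}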

OAuth scopes in Azure

When you request an access token from Azure you must specify what API you intend to use it for. There are two versions of the OAuth endpoints (v1 and v2). In version 1 you use a resource parameter to indicate the target application to Azure. In version 2 this has been standardized and now uses the standard scope parameter. It also means that scope is no longer simply the OpenID Connect standard scopes (such as openid, offline_access) or application-specific scopes, e.g. from Microsoft Graph, but is also used to indicate the API you are requesting access to, e.g. an App Registration.

Now to the fun, the stuff you just need to know and the stuff which is easy enough to find if you know what to Google for…

Specifying the URI of an App Registration as the scope is not enough. You have to add a scope to the Application URI, and unless you've defined a custom scope for your app you should use ".default". So if your App Registration has a URI of "api://2dd1b05d-8b45-4aba-9181-c24a6f568955" use "2dd1b05d-8b45-4aba-9181-c24a6f568955/.default" as the scope. If you've registered custom scopes for an application those may be used in place of .default, but you cannot combine .default with more specific scopes. Also, the use of specific scopes is restricted for certain OAuth flows due to the way delegated permissions work in Azure. But that's for another time.

Using the client_credentials grant type (which works somewhat like the Salesforce username/password OAuth flow), this becomes a POST like the one below. Please note that the Azure tenant_id is both in the URL and a parameter in the POST body:

POST /f84c375f-3f35-4d9d-93b3-c4216511f87a/oauth2/v2.0/token HTTP/1.1
Host: login.microsoftonline.com
Content-Type: application/x-www-form-urlencoded
Content-Length: XYZ
Connection: close

client_id=32c0ba71-04a4-4b3a-a317-1b1febd5fc22
&client_secret=shhhhh...
&grant_type=client_credentials
&tenant=f84c375f-3f35-4d9d-93b3-c4216511f87a
&scope=2dd1b95d-8b45-4aba-9181-c24f6f5e8955/.default

The response is a JSON document with an access_token (a JWT which may then be used as a Bearer token):

{
   "token_type":"Bearer",
   "expires_in":3599,
   "ext_expires_in":3599,
   "access_token":"eyJ0eXAiOiJKV1QiLCJhbGciO..."
}
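
The same request from node.js could look roughly like the sketch below; the tenant id, client id/secret and App Registration id are placeholders, and node-fetch is assumed:

// Sketch: client_credentials grant against the Azure v2.0 token endpoint
const fetch = require("node-fetch");

async function getAzureToken(tenantId, clientId, clientSecret, appRegistrationId) {
    const params = new URLSearchParams();
    params.append("client_id", clientId);
    params.append("client_secret", clientSecret);
    params.append("grant_type", "client_credentials");
    params.append("tenant", tenantId); // also sent in the body as in the example above
    params.append("scope", `${appRegistrationId}/.default`);
    const res = await fetch(`https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`, {
        method: "POST",
        body: params
    });
    return res.json(); // { token_type, expires_in, access_token, ... }
}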

When using Azure with Salesforce I would recommend using version 2 of the OAuth endpoints as Salesforce Auth. Providers and Named Credentials do not have a way to send custom parameters without resorting to writing a custom Auth. Provider implementation. This means there is no standard way to send the “resource” parameter to the version 1 OAuth endpoint.

What scopes should I specify in Salesforce?

When you create your Auth. Provider or Named Credentials, specify the scopes you need from Azure. My personal preference is to specify the scopes on the Auth. Provider when using it for Single Sign-On and on the Named Credential when using it for API access. One thing to note again is that an access token is for one API only, and an access token for a custom application will not also work for the Microsoft Graph.

Lesson learned: You should not specify the UserInfo endpoint on the Auth. Provider in Salesforce unless it’s used for Single-Sign-On AND you are not specifying an App Registration in the Scopes-field.

If you specify an "App Registration scope" in the Scopes field and specify the UserInfo endpoint, Salesforce will attempt to read from the UserInfo endpoint following successful authentication using the obtained access token. This will fail because the access token is only valid for the intended API and not for the Microsoft Graph.

Feel free to add other standard OpenID Connect scopes for Auth. Providers for Single-Sign-On. For most uses you would want to specify the offline_access scope as it ensures your Auth. Provider or Named Credential receives a refresh token.

Calling APIs protected by Azure API Management (APIM)

So far so good. Now you can obtain access tokens and use them with Azure Function Apps or read from the Microsoft Graph. But what if you need to access APIs hosted in Azure API Management (APIM)? Well, read on…

In Azure API Management, APIs are governed by subscriptions and you need to specify a subscription ID when calling into the API. The subscriptions map to users which are different from the ones in Azure AD. I'm not really sure why this is the case but I'm sure there are reasons.

The subscription ID you should use would be supplied by the people managing the API and is a GUID-like string. When you call the API, be sure to supply it using the "Ocp-Apim-Subscription-Key" header. Failing to supply the subscription ID will result in an error like the one below:

{"statusCode": 401,   
"message": "Access denied due to missing subscription key. Make sure to include subscription key when making requests to an API."}

Putting all the above together to POST to an API behind Azure API Management using Apex would be something like the below, using a Named Credential called "My_NamedCredential":

HttpRequest req = new HttpRequest();
req.setEndpoint('callout:My_NamedCredential/echo');
req.setMethod('POST');
req.setHeader('Ocp-Apim-Subscription-Key', '7f9ed...1d6e8');
req.setBody('Hello, Salesforce, World!');
Http http = new Http();
HTTPResponse res = http.send(req);
System.debug(res.getBody());

As always… YMMV!

Populating the user object with passport.js and Salesforce OAuth

Using passport.js is a great option for doing authentication in node.js applications, with great strategies for authenticating through just about anything on the planet including Salesforce. Using passport.js with Salesforce involves using the OAuth2Strategy, but the user object in the session is not really usable as I want actual information about the user to be there. The solution I came up with was overriding the userProfile-method and adding a call to the Salesforce userinfo endpoint as shown below.

// configure authentication using oauth2 ("app" is an Express application created elsewhere)
const passport = require("passport");
const OAuth2Strategy = require("passport-oauth2");
const InternalOAuthError = require("passport-oauth2").InternalOAuthError;
app.use(passport.initialize());
passport.serializeUser(function(user, done) {
    done(null, user.username);
});
passport.deserializeUser(function(login, done) {
    done(undefined, {
        "username": login
    });
});

OAuth2Strategy.prototype.userProfile = function(accessToken, done) {
    this._oauth2.get(`https://${process.env.SF_LOGIN_URL || "login.salesforce.com"}/services/oauth2/userinfo`, accessToken, function (err, body, res) {
        if (err) { return done(new InternalOAuthError('Failed to fetch user profile', err)); }
        try {
            let json = JSON.parse(body);
            let profile = {
                "provider": "Salesforce.com",
                "username": json.preferred_username,
                "name": json.name,
                "email": json.email,
                "firstname": json.given_name,
                "lastname": json.family_name,
                "payload": json
            };
            
            done(null, profile);
        } catch(e) {
            done(e);
        }
    });
}

passport.use(new OAuth2Strategy({
        authorizationURL: `https://${process.env.SF_LOGIN_URL || "login.salesforce.com"}/services/oauth2/authorize`,
        tokenURL: `https://${process.env.SF_LOGIN_URL || "login.salesforce.com"}/services/oauth2/token`,
        clientID: process.env.SF_CLIENT_ID,
        clientSecret: process.env.SF_CLIENT_SECRET,
        callbackURL: process.env.SF_CALLBACK_URL
    },
    function(accessToken, refreshToken, profile, cb) {
        cb(undefined, profile);
    }
));

The interesting piece is really the overridden userProfile-method where I inject a call to /services/oauth2/userinfo to get information about the user and then add that as the user object.
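
For completeness, a minimal sketch of how routes could be wired up against the strategy registered above; the route paths are made up, and persisting req.user across requests additionally requires express-session and passport.session():

// Hypothetical routes using the "oauth2" strategy registered above
app.get("/oauth/login", passport.authenticate("oauth2"));
app.get("/oauth/callback",
    passport.authenticate("oauth2", { failureRedirect: "/error" }),
    (req, res) => res.redirect("/"));
app.get("/whoami", (req, res) => {
    // req.user is the profile built in the overridden userProfile-method
    res.json(req.user || {});
});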

Of course after having done all this I found passport-salesforce which is a strategy that does exactly the same thing – duh!!! Anyways it was fun to code it up.

Salesforce Canvas Apps

A Salesforce Canvas app is an often overlooked, easy way to integrate existing apps into Salesforce. A Canvas app is inlined into the Salesforce user interface and it requires only a very small change to your app to have it play nice with Salesforce. In theory you could get away without any change, but usually you'd like to know who the calling user is. What's really great about a Canvas app is that this information is POST'ed to the application at invocation together with an OAuth access_token to allow authenticated callbacks to Salesforce. To implement this you need to:

  1. Support POST at a URL you specify
  2. Render the application from here or redirect the user after the POST has been received
  3. Receive and handle the signed request

The signed request is a base64-encoded blob in two parts separated by a period. It looks very much like a JSON Web Token (JWT). To verify it you compute a keyed hash (HMAC) using the SHA-256 algorithm with the client secret of the Connected App from Salesforce as the secret. Doing this in node.js looks like so:

const ourSignature = Buffer.from(crypto.createHmac(algorithm, clientSecret).update(objPart).digest()).toString('base64')

The algorithm is "sha256", the client secret is a string and objPart is the object part of the signed request.
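
A slightly fuller sketch of the verification, assuming (as per the Salesforce signed request format) that the signature is the first part and the base64-encoded JSON envelope the second:

// Sketch: verify a Canvas signed request and return the decoded envelope
const crypto = require("crypto");

function verifySignedRequest(signedRequest, clientSecret) {
    const [signaturePart, objPart] = signedRequest.split(".");
    const ourSignature = crypto.createHmac("sha256", clientSecret)
        .update(objPart)
        .digest("base64");
    if (ourSignature !== signaturePart) {
        throw new Error("Signed request signature mismatch");
    }
    // the envelope holds the Canvas context and the OAuth access_token
    return JSON.parse(Buffer.from(objPart, "base64").toString("utf8"));
}
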
To make it even easier I’ve created a repo showing how it’s done in node.js in an Express app. The source including an example app is available at https://github.com/lekkimworld/salesforce-oauth-express-middleware. The repo also contains a test app (canvas-test-app) that is easily deployable to Heroku.

Salesforce username/password OAuth flow against a sandbox

We had issues today because our OAuth password flow wouldn’t work against one of our sandboxes although the code worked against production. Instead we got this error:

{"error":"invalid_grant","error_description":"authentication failure"}

After Googling and finding this thread it turned out that when using the username/password flow against a sandbox you have to either relax IP restrictions for login or authenticate against test.salesforce.com instead of login.salesforce.com (which of course makes sense).

Below are curl commands for using the username/password flow against a sandbox:

$ curl -d "grant_type=password 
   &client_id=3MVG9X0_oZyBSzHrnzENlR...JSDz0_MiwxyieREuBhtgZJrF7Lzx8542TFpU_ 
   &client_secret=6235860963257688256 
   &username=mikkel.heisterberg%40example.com.sandboxname 
   &password=Passw0rd.SecurityToken" https://test.salesforce.com/services/oauth2/token
{"access_token":"00D6E000000Cpmu!AQ0AQIj4...cGCRqmNnYc6dmgLT09VNoIFXJtHvsPGLqrBs0VlK",
   "instance_url":"https://someaddress.my.salesforce.com",
   "id":"https://test.salesforce.com/id/00D6E000000CpmuUAC/0056E000000OCcCQAW",
   "token_type":"Bearer","issued_at":"1485850119972","signature":"malODIaSULh1siHzdw...pHKjBpWoQcm66UQ="}

$ curl -H 'Authorization: Bearer 00D6E000000Cpmu!AQ0AQIj4...cGCRqmNnYc6dmgLT09VNoIFXJtHvsPGLqrBs0VlK' 
   https://test.salesforce.com/id/00D6E000000CpmuUAC/0056E000000OCcCQAW
{"id":"https://test.salesforce.com/id/00D6E000000CpmuUAC/0056E000000OCcCQAW",
   "asserted_user":true,"user_id":"0056E000000OCcCQAW","organization_id":"00D6E000000CpmuUAC",
   "username":"mikkel.heisterberg@example.com.sandboxname","nick_name":"mheis",
   "display_name":"Mikkel Heisterberg","email":"mheisterberg@foo.com","email_verified":true,"first_name":"Mikkel",
   "last_name":"Heisterberg","timezone":"Europe/Paris","photos":{"picture":"https://someaddress.content.force.com/profilephoto/005/F",
   "thumbnail":"https://someaddress.content.force.com/profilephoto/005/T"},"addr_street":null,"addr_city":null,"addr_state":null,
   "addr_country":null,"addr_zip":null,"mobile_phone":"+45 12345678","mobile_phone_verified":true,"status":{"created_date":null,
   "body":null},"urls":{"enterprise":"https://someaddress.my.salesforce.com/services/Soap/c/{version}/00D6E000000Cpmu",
   "metadata":"https://someaddress.my.salesforce.com/services/Soap/m/{version}/00D6E000000Cpmu",
   "partner":"https://someaddress.my.salesforce.com/services/Soap/u/{version}/00D6E000000Cpmu",
   "rest":"https://someaddress.my.salesforce.com/services/data/v{version}/",
   "sobjects":"https://someaddress.my.salesforce.com/services/data/v{version}/sobjects/",
   "search":"https://someaddress.my.salesforce.com/services/data/v{version}/search/",
   "query":"https://someaddress.my.salesforce.com/services/data/v{version}/query/",
   "recent":"https://someaddress.my.salesforce.com/services/data/v{version}/recent/",
   "profile":"https://someaddress.my.salesforce.com/0056E000000OCcCQAW",
   "feeds":"https://someaddress.my.salesforce.com/services/data/v{version}/chatter/feeds",
   "groups":"https://someaddress.my.salesforce.com/services/data/v{version}/chatter/groups",
   "users":"https://someaddress.my.salesforce.com/services/data/v{version}/chatter/users",
   "feed_items":"https://someaddress.my.salesforce.com/services/data/v{version}/chatter/feed-items",
   "feed_elements":"https://someaddress.my.salesforce.com/services/data/v{version}/chatter/feed-elements",
   "custom_domain":"https://someaddress.my.salesforce.com"},"active":true,"user_type":"STANDARD","language":"en_US","locale":"en_US",
   "utcOffset":3600000,"last_modified_date":"2017-01-26T13:49:33.000+0000","is_app_installed":true}

IBM Connections application development state of the union – part 6

Part 5 was about extensions/apps on-premises and this – probably the final post – will be about extensions for IBM Connections Cloud. There are different ways to extend IBM Connections Cloud – one is to add links to the app menu and another is to add actual UI extensions to the applications within IBM Connections. This post is about the latter (although the observations about the administration UI apply to both). To get it out of the way from the beginning I might as well say it flat out: IBM has really missed the mark here. The extensibility mechanism for IBM Connections in Cloud is close to unusable from my point of view. Let me explain…

Basically the extension mechanism for cloud is an iframe, and you may only extend Communities, which is so wrong to begin with. As mentioned previously, IBM Connections is a piece of social software that focuses on people, and not being able to extend Profiles is baffling to me. Using a clumsy UI in the administration portal you can upload a JSON file describing the extension, which in turn will make the extension show up in the main UI. The smallest file I could make work is 34 lines of JSON, but basically I could make do with 3 lines. Almost all of the JSON I upload is simply cruft that seems to carry over from the on-premises widget container, and as I really cannot change it why should I specify it? In essence I can only change the following 3 parameters:

  • defId – seems to be an ID of the widget
  • url – the URL to set into the iframe
  • height – the height of the iframe

Part of the JSON I upload is the widget ID. I have to specify the ID of the widget (defId) but there is no check whether it's already used. Using an already used ID is allowed but only one of the widgets with the same ID shows up, which is an issue as this is an obvious copy/paste error on the user's part. Also the widget is added to the community page using the defId as the title but shown in the administration UI using a "name" parameter from the JSON, which is pretty confusing. Part of the JSON I upload is also the actual iWidget that creates and builds the iframe. I can specify my own iWidget description and the only thing that makes it not work is the ajaxProxy rejecting it, making the UI fail when users load the community page. There is no upload-time check. Oftentimes an invalid JSON file simply makes the UI do nothing – there is no response as to what might be wrong.

Sigh…

Once the JSON is uploaded I get an iframe of a static height with a URL set into it. The height is one thing that makes this extension mechanism hard to work with for production apps. Oftentimes the height of the content cannot be decided at deployment time but is only known at runtime, and unfortunately there is no way to change the iframe height at runtime. At least nothing which is obvious and/or documented. But now we have an iframe with the URL specified in the application JSON set into it. The iframe is sandboxed with the following policy: "allow-same-origin allow-scripts allow-popups allow-forms". This restricts the extension and basically it may only do the following:

  • Run JavaScript
  • Make xhr requests to the server it was loaded from (same origin)
  • Open a new window/tab in the browser

There is no way for the widget to even talk to IBM Connections itself – not even the IBM Connections API. The widget may basically show static / server side generated HTML and run JavaScript. The JavaScript may make xhr requests to the server it was loaded from. That’s it.

When the widget loads it may ask the surrounding page for a widget context by registering a message listener and posting a message to the parent page (parent.postMessage). The context looks like this:

{
   "source": {
      "resourceId": "ff7dd8b4-95d6-4fb4-f094-edb52e5d8eee",
      "resourceName": "Some Community Title",
      "resourceType": "community",
      "orgId": "12345678"
   },
   "user": {
      "userId": "87654321",
      "orgId": "12345678",
      "displayName": "John Doe",
      "email": "jdoe@example.com"
   },
   "extraContent": {
      "canContribute": "true",
      "canPersonalize": "true"
   }
}
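
A browser-side sketch of the pattern described above: register a message listener and then ask the parent page for the context. The exact request payload IBM Connections Cloud expects isn't shown in this post, so "getContext" below is just a placeholder:

// Browser-side sketch: listen for the context and ask the parent page for it
window.addEventListener("message", (event) => {
    // in production, check event.origin against the IBM Connections Cloud host
    let context;
    try {
        context = typeof event.data === "string" ? JSON.parse(event.data) : event.data;
    } catch (e) {
        return; // not the context message
    }
    if (context && context.source && context.user) {
        console.log("Running in community " + context.source.resourceName +
            " as " + context.user.displayName + " (" + context.user.email + ")");
    }
});

// "getContext" is a placeholder payload - the real message format is defined by IBM Connections
parent.postMessage("getContext", "*");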

From the context the widget can figure out who the user is and what community the user is in. The problem however is that the user information is unusable, as there is no way my application server can trust it. As the context is not verifiable in any way, there is no way for my server to trust the information it receives from the extension. The only way to convey user identity to my server is by using SAML and assuming that a SAML assertion dance is performed when the iframe content is loaded, so the user has a session cookie relationship with my server. But this is doable – I now know the user identity based on the SAML dance.

The next thing is to make sure the user is actually a member of the community he/she claims to be sending from – but oh – there is no way to determine this. My server-side code cannot make requests on behalf of the user back to IBM Connections without the user having already performed an OAuth dance and authorized my application to IBM Connections. I could tell the user that we might not have tokens for him/her but it yields a crappy user experience. Plus any authorization granted expires from time to time (at least every 90 days). Also there are no organization-wide OAuth authorization capabilities in IBM Connections Cloud like there are for Google or Microsoft, plus there is no super-user for IBM Connections, so we're pretty stuck here.

Now this is pretty bad, and combining these things basically makes it impossible to create any kind of customer or ISV solution with a decent user experience. At least if the context is important and the content is not static.

So what do we do about it? Well, IMHO the solution is pretty easy and simple, which makes it even worse that IBM decided to ship this capability. Let me suggest the following points:

  • Administration UI
    Fix the administration UI including the widget JSON I have to upload. Only ask for the stuff that actually matters and infer the rest if not specified. If the uploaded file doesn't validate, tell me – maybe even provide a clue as to what's missing…
  • Make the context verifiable
    When I register a widget, add an option to indicate that my server needs to verify the information in the context (the JSON blob above). If I check the box, generate a set of asymmetric keys and provide me one of the keys. Now the JSON context could be signed with the IBM Connections part of the key pair, making my server capable of verifying that the information indeed came from IBM Connections. And since it's asymmetric there is no way for my server to impersonate IBM Connections. Oh, and this would make the information in the context trustworthy even if the customer is not using SAML.
  • Making calls back to IBM Connections possible
    When I register a widget, add an option for me to indicate that my server needs to make calls back to IBM Connections on behalf of the user. For additional credits, allow me to specify which parts of the IBM Connections API my server may use. In combination with the asymmetric key pair above, this option would include an encrypted opaque token in the JSON context blob. This token could be used to authenticate my server and the request back to IBM Connections. It could be a set of automatically generated OAuth tokens but doesn't need to be. This is a secure solution as we already have a key pair in place, so the token could be encrypted using the IBM Connections part of the key pair so that the widget code in the browser cannot use it. Only the server with the matching key may decrypt the token and use it for the IBM Connections API.

Now I'm no security expert but this should be secure and pretty easy to implement. With a single sweep it would make widgets in IBM Connections Cloud way more powerful than widgets on-premises and would make them much easier to develop. The only thing left then is making it possible to adjust the height at runtime, but I'll let that slip for now as a basic oversight in the design of the extensibility mechanism and assume this capability will be available soon anyway.

</rant>

I have a small IBM Connections Cloud community app on Github if you would like to see a minimal example: IBM Connections Cloud Community App Example