Generating JWTs for Azure in Apex

Lately I’ve been playing around with Azure and integrating Salesforce and Azure. One of the integration patterns calls for using JSON Web Tokens (JWTs) that you can then exchange for an access token in Azure. There is a catch however…

Since Azure requires that the thumbprint of the certificate be added to the header of the JWT (using the key “x5t”) we cannot use the built-in support for JWT in Named Credentials as there are no provisions for custom header key/values. The JWT/JWS classes in Apex cannot be used either as we cannot customize the header there. Building upon https://github.com/salesforceidentity/jwt I’ve created https://github.com/lekkimworld/azurejwt-apex that bridges the gap.

This allows you to build and sign a JWT that you may exchange for an access token using your tenant’s v2.0 OAuth token endpoint in Azure. Example Apex code looks like this:

// declarations (because I'm old school)
final String azureClientId = '88d888a5-0cf4-473a-b9a0-7c88e6fc888e';
final String azureTenantId = 'b34feb2b-132f-4322-af1d-c888f5d888d0';
final String azureCertThumbprint = '4rElsDFTysrbKhB0zTsrRNSxT6s=';
final String azureScopes = '5384888d-868f-442b-b1b3-8688807de914/.default';

// create JWT with certificate from keys mgmt and set the x5t in the header to the 
// thumbprint of the cert as expected by Azure
AzureJWT jwt = new AzureJWT();
jwt.cert = 'JWT_Callout_Certificate';
jwt.iss = azureClientId;
jwt.sub = azureClientId;
jwt.aud = 'https://login.microsoftonline.com/' + azureTenantId + '/oauth2/v2.0/token';
jwt.x5t = azureCertThumbprint;

// invoke the flow and obtain an access_token
final String access_token = AzureJWTBearerFlow.getAccessToken(azureClientId, azureTenantId, azureScopes, jwt);

// use the access token against a Function App in Azure
HttpRequest req = new HttpRequest();
req.setEndpoint('https://foo-functions-demo.azurewebsites.net/api/MyFunction?name=Salesforce');
req.setMethod('GET');
req.setHeader('Authorization', 'Bearer ' + access_token);
Http http = new Http();
HTTPResponse res = http.send(req);
System.debug(res.getBody());
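Under the hood the assertion that gets signed is just two base64url-encoded JSON segments joined by a dot. Below is a rough Python sketch of the unsigned part only, to illustrate the structure (claim names follow the certificate-credential flow; the RS256 signature over this string, produced with the certificate’s private key, is omitted here):

```python
import base64
import json
import time
import uuid

def b64url(data: bytes) -> str:
    # base64url without padding, as JWS requires
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def build_signing_input(client_id: str, tenant_id: str, x5t: str) -> str:
    # the x5t header is the part Azure insists on and Named Credentials cannot set
    header = {"alg": "RS256", "typ": "JWT", "x5t": x5t}
    claims = {
        "iss": client_id,
        "sub": client_id,
        "aud": "https://login.microsoftonline.com/%s/oauth2/v2.0/token" % tenant_id,
        "exp": int(time.time()) + 300,  # short-lived assertion
        "jti": str(uuid.uuid4()),
    }
    return b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(claims).encode())
```

Signing this string with the private key behind the Salesforce certificate yields the third segment of the finished JWT.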

In the https://github.com/lekkimworld/azurejwt-apex GitHub repo you will find the two Apex classes from the above example together with the example code.

The certificate thumbprint (azureCertThumbprint in the example above) isn’t the regular hex-encoded SHA-1 thumbprint but a base64 encoding of the raw SHA-1 hash. To make it even more interesting, the thumbprint displayed in the Azure Portal is not the thumbprint we need. The thumbprint/hash may be computed like this (gleaned from https://stackoverflow.com/a/52625165):

echo $(openssl x509 -in yourcert.pem -fingerprint -noout) | sed 's/SHA1 Fingerprint=//g' | sed 's/://g' | xxd -r -ps | base64
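The same computation can be sketched in Python, mirroring the openssl pipeline above: decode the PEM body to the DER bytes, take the raw SHA-1 digest, and base64-encode it (this is an illustrative helper, not part of the repo):

```python
import base64
import hashlib

def azure_x5t_from_pem(pem_text: str) -> str:
    # collect the base64 body between the BEGIN/END CERTIFICATE markers
    body = "".join(
        line.strip() for line in pem_text.splitlines()
        if line.strip() and "CERTIFICATE" not in line
    )
    der = base64.b64decode(body)
    # Azure wants base64 of the raw SHA-1 digest of the DER bytes,
    # not the usual colon-separated hex fingerprint
    return base64.b64encode(hashlib.sha1(der).digest()).decode()
```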

Using an Auth. Provider and Named Credentials in Salesforce with Azure OAuth

Please note: When I refer to “Azure” below I’m referring to Microsoft Azure, the cloud product from Microsoft.

Please note: If you know all about why Auth. Providers and Named Credentials are great and simply wanna know about how to use them with Microsoft Azure feel free to skip down to “How does it apply to Azure”.

All this started some time back when I was at a customer showing how to integrate with Salesforce and call other APIs from Salesforce. In Salesforce we have a great concept called Authentication Providers (“Auth. Provider”) that handles the underlying authentication protocol such as OAuth 2.0. Auth. Providers may be used to provide Single-Sign-On in Communities (our portals) or with Named Credentials. The latter is a way to externalize authentication from a piece of code or functionality. So if I need to call an external API from Apex (our Java-like programming language) I can simply do something like the code below:

HttpRequest req = new HttpRequest();
req.setEndpoint('callout:My_NamedCredential/accounts/list');
req.setMethod('GET');
Http http = new Http();
HTTPResponse res = http.send(req);
System.debug(res.getBody());

As you can see there is nothing here about authentication. Nor is there anything around the actual URL being called. That information is externalized into a Named Credential called My_NamedCredential in this case. The only part I specify is the specific path of the API I’m referring to (here “/accounts/list”). This is great for development as it makes it easier for me the developer, but it’s also easier for admins moving changes between environments as the endpoint and credential management is externalized as setup in the org. It means it’s easy to change credentials and endpoints between development, test, QA and production. Awesome!

Setting this up is pretty easy as well and is done in 3 steps:

  1. Start by setting up the Auth. Provider by specifying your client_id (we call it the “Consumer Key”), the client_secret (we call it the “Consumer Secret”), the Authorization endpoint and the token endpoint. The last two you get from your provider – in this case Azure. For now with version 2 of their identity platform they will be https://login.microsoftonline.com/<tenant>/oauth2/v2.0/authorize and https://login.microsoftonline.com/<tenant>/oauth2/v2.0/token respectively (replace <tenant> with your tenant id).
  2. Now create a Named Credential specifying the root URL you would like to call against in the “URL” field. For “Identity Type” select “Named Principal” to use the same credentials across the org or “Per User” to use user specific credentials, and set “Authentication Protocol” to “OAuth 2.0”. In “Authentication Provider” select the provider we created above and set the scope to use.
  3. Now use the Named Credential as discussed above.

Now let’s discuss what’s special about Azure.

How does it apply to Azure?

Above I was intentionally pretty loose when discussing the scope to set in the Named Credentials. The reason is that this is quite specific when dealing with Azure.

In Azure an access token is actually a JSON Web Token (JWT, https://jwt.io) which is a standardized token format containing signed claims that may be verified by the recipient. The payload in a JWT access token from Azure could look like this:

{
  "aud": "2dd1b05d-8b45-4aba-9181-c24a6f568955",
  "iss": "https://sts.windows.net/e84c365f-2f35-8c7d-01b4-c3412514f42b/",
  "iat": 1573811923,
  "nbf": 1573811923,
  "exp": 1573815823,
  "aio": "42VgYJj0Zt3RXQIblViYGkuiHs9dAQA=",
  "appid": "32c0ba71-04f4-4b3a-a317-1f1febd5fc22",
  "appidacr": "1",
  "idp": "https://sts.windows.net/e84c365f-2f35-8c7d-01b4-c3412514f42b/",
  "oid": "394e0c1a-0992-42e7-9875-3b04786147ca",
  "sub": "394e0c1a-0992-42e7-9875-3b04786147ca",
  "tid": "e84c365f-2f35-8c7d-01b4-c3412514f42b",
  "uti": "ESAOZDPFeEydSYxohgsRAA",
  "ver": "1.0"
}
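You can inspect the claims of a token like this yourself. A small Python sketch for debugging only (it does not verify the signature, so never trust the output for authorization decisions):

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    # a JWT is header.payload.signature; grab the middle segment
    payload_b64 = token.split(".")[1]
    # restore the base64url padding that JWTs strip
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```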

Here the important piece is the “aud” claim as it contains the ID of the application or API on Azure the token is valid for. In this case it’s for an App Registration in Azure.

When we deal with OAuth providers we might be used to dealing with standard OpenID Connect scopes like openid, email, profile and offline_access. For Azure, which is much more of a generic platform, the scope we specify is used to indicate what application we are requesting access to. In Azure the issued access token is specific to the application we request access to, and we can only request access for a single application at a time, which has some implications:

  1. If you request an access token for an application (App Registration in Azure) the access token is valid for that application only
  2. The access token is not valid for other APIs like the Microsoft Graph
  3. If you do not request an access token for a specific application Azure issues you an access token for the Microsoft Graph API

This may be obvious but it caused me quite some troubleshooting.

OAuth scopes in Azure

When you request an access token from Azure you must specify what API you intend to use it for. There are two versions of the OAuth endpoints (v1 and v2) – in version 1 you use a resource parameter to indicate the target application to Azure. In version 2 this has been standardized and now uses the standard scope parameter. It also means that scope is no longer simply the OpenID Connect standard scopes (such as openid, offline_access) or application specific scopes e.g. from the Microsoft Graph, but is also used to indicate the API you are requesting access to, e.g. an App Registration.

Now to the fun, the stuff you just need to know and the stuff which is easy enough to find if you know what to Google for…

Specifying the URI of an App Registration as the scope is not enough. You have to add a scope to the Application URI, and unless you’ve defined a custom scope for your app you should use “.default”. So if your App Registration has an URI of “api://2dd1b05d-8b45-4aba-9181-c24a6f568955” use “2dd1b05d-8b45-4aba-9181-c24a6f568955/.default” as the scope. If you’ve registered custom scopes for an application those may be used in place of .default, but you cannot combine .default with more specific scopes. Specific scopes are also restricted in use for certain OAuth flows due to the way delegated permissions work in Azure. But that’s for another time.

Using the client_credentials grant type (which works somewhat like the Salesforce username/password OAuth flow) the request becomes a POST like the one below. Please note that the Azure tenant_id appears both in the URL and as a parameter in the POST body:

POST /f84c375f-3f35-4d9d-93b3-c4216511f87a/oauth2/v2.0/token HTTP/1.1
Host: login.microsoftonline.com
Content-Type: application/x-www-form-urlencoded
Content-Length: XYZ
Connection: close

client_id=32c0ba71-04a4-4b3a-a317-1b1febd5fc22
&client_secret=shhhhh...
&grant_type=client_credentials
&tenant=f84c375f-3f35-4d9d-93b3-c4216511f87a
&scope=2dd1b95d-8b45-4aba-9181-c24f6f5e8955/.default

The response is a JSON document with an access_token (a JWT which may then be used as a Bearer token):

{
   "token_type":"Bearer",
   "expires_in":3599,
   "ext_expires_in":3599,
   "access_token":"eyJ0eXAiOiJKV1QiLCJhbGciO..."
}
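The same exchange can be sketched in Python. This is a hedged illustration of the request shape only – it builds the URL and form body shown above; sending it is left to whatever HTTP client you prefer:

```python
from urllib.parse import urlencode

def build_token_request(tenant_id: str, client_id: str, client_secret: str, scope: str):
    # the tenant id appears both in the URL and in the form body
    url = "https://login.microsoftonline.com/%s/oauth2/v2.0/token" % tenant_id
    body = urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "grant_type": "client_credentials",
        "tenant": tenant_id,
        "scope": scope,  # e.g. "<app-registration-id>/.default"
    })
    return url, body
```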

When using Azure with Salesforce I would recommend using version 2 of the OAuth endpoints as Salesforce Auth. Providers and Named Credentials do not have a way to send custom parameters without resorting to writing a custom Auth. Provider implementation. This means there is no standard way to send the “resource” parameter to the version 1 OAuth endpoint.

What scopes should I specify in Salesforce?

When you create your Auth. Provider or Named Credential, specify the scopes you need from Azure. My personal preference is to specify the scopes on the Auth. Provider when using it for Single-Sign-On and on the Named Credential when using it for API access. One thing to note again is that an access token is for one API only and that an access token for a custom application will not also work for the Microsoft Graph.

Lesson learned: You should not specify the UserInfo endpoint on the Auth. Provider in Salesforce unless it’s used for Single-Sign-On AND you are not specifying an App Registration in the Scopes-field.

If you specify an “App Registration scope” in the Scopes-field and also specify the UserInfo endpoint, Salesforce will attempt to read from the UserInfo endpoint following successful authentication using the obtained access token. This will fail because the access token is only valid for the intended API and not for the Microsoft Graph.

Feel free to add other standard OpenID Connect scopes for Auth. Providers for Single-Sign-On. For most uses you would want to specify the offline_access scope as it ensures your Auth. Provider or Named Credential receives a refresh token.

Calling API’s protected by Azure API Management (APIM)

So far so good. Now you can obtain access tokens and use them with Azure Function Apps or read from the Microsoft Graph. But what if you need to access APIs hosted in Azure API Management (APIM)? Well read on…

In Azure API Management APIs are governed by subscriptions and you need to specify a subscription ID when calling into the API. The subscriptions map to users which are different from the ones in Azure AD. I’m not really sure why this is the case but I’m sure there are reasons.

The subscription ID you should use would be supplied by the people managing the API and is a GUID-like string. When you call the API be sure to supply it using the “Ocp-Apim-Subscription-Key”-header. Failing to supply the subscription ID will result in an error like the one below:

{
  "statusCode": 401,
  "message": "Access denied due to missing subscription key. Make sure to include subscription key when making requests to an API."
}

Putting all the above together to POST to an API behind Azure API Management using Apex would be something like the below using a Named Credential called “My_NamedCredential”:

HttpRequest req = new HttpRequest();
req.setEndpoint('callout:My_NamedCredential/echo');
req.setMethod('POST');
req.setHeader('Ocp-Apim-Subscription-Key', '7f9ed...1d6e8');
req.setBody('Hello, Salesforce, World!');
Http http = new Http();
HTTPResponse res = http.send(req);
System.debug(res.getBody());

As always… YMMV!

Calling a Swagger service from Apex using openapi-generator

For a demo I needed to make calls to the Petstore API using a Swagger / OpenAPI definition from Apex. Unfortunately the Apex support that External Services in Salesforce provides is only callable from Flows, so I needed another approach. Using openapi-generator, installed via Homebrew, it went relatively smoothly anyway.

It’s always nice to stand on the shoulders of giants, so the examples provided by Rene Winkelmeyer (Connecting to Swagger-backed APIs with Clicks or Code) came in handy. Unfortunately they didn’t work as-is since the prefix for the generated classes was different, plus the petId was 1 and not 100. Again, easy to fix.

Below are my steps to get it working using Homebrew and Salesforce DX.

$ brew install openapi-generator
$ openapi-generator generate \
-i https://raw.githubusercontent.com/openapitools/openapi-generator/master/modules/openapi-generator/src/test/resources/2_0/petstore.yaml \
-g apex \
-o /tmp/petstore_swagger/
$ cd /tmp/petstore_swagger
$ sfdx force:org:create \
-v my_devhub \
-f config/project-scratch-def.json -a petstore_swagger
$ sfdx force:source:push -u petstore_swagger
$ sfdx force:apex:execute -u petstore_swagger
>> Start typing Apex code. Press the Enter key after each line,
>> then press CTRL+D when finished.
OASClient client = new OASClient();
OASPetApi api = new OASPetApi(client);
Map<String,Object> params = new Map<String, Object>();
Long petId = Long.valueOf('1');
params.put('petId', petId);
OASPet pet = api.getPetById(params);
System.debug(pet);
<Ctrl-D>

Trying to explain Person Accounts in Salesforce

Person Accounts in Salesforce keep confusing developers not at home on the platform due to their special behaviour. The purpose of the repo (https://github.com/lekkimworld/salesforce-personaccount-field-reference) is to hold some examples of how to work with Person Accounts in an org using Apex and the Bulk API to try and illustrate a few points.

What ARE PersonAccounts

First off, knowing WHAT a Person Account is is important. In Salesforce we normally talk about Accounts and Contacts with the Account being the company entity (e.g. Salesforce.com Inc.) and the Contacts being the people that we track for that company (e.g. Marc Benioff, Parker Harris etc.). It means that we have to have both an Account and a Contact to track a person in Salesforce. But what if that doesn’t make any sense, like when tracking individuals for B2C commerce or similar? Meet the PersonAccount.

Please Note: There is no such object as PersonAccount in Salesforce. There are only Account and Contact but in the following I’ll use PersonAccount to reference this special case for Account.

PersonAccount is a special kind of Account that is both an Account AND a Contact, giving you the possibility to treat an individual as an Account. The secret to understanding PersonAccount is knowing that when you specify a special record type as you create the Account, Salesforce automatically creates both an Account AND a Contact record, links them, and thus creates the PersonAccount. Salesforce automatically makes the fields that are normally available on the Contact (including custom fields) available on the Account. The only thing you need to do is follow a few simple rules that are listed below.

Please Note: When using PersonAccounts you should always access the Account and never the associated Contact.

Referencing Fields

Because there is both an Account and a Contact for a PersonAccount there are some special rules to follow when referencing fields. This goes for any access, whether that be using Apex, the REST API or the Bulk API. The rules are pretty easy and are as follows:

  1. Always reference the Account object
  2. When creating a PersonAccount create an Account specifying the record type ID of the PersonAccount record type configured in Salesforce. Doing this makes the Account a PersonAccount.
  3. Fields from Account are available on Account (as probably expected):
    1. Standard fields from Account: referenced using their API name as usual (e.g. Site, Website, NumberOfEmployees)
    2. Custom fields from Account: referenced using their API name as usual (e.g. Revenue__c, MyIntegrationId__c)
  4. Fields from Contact are available directly on Account:
    1. Standard fields from Contact: the API name of the field is prefixed with “Person” (e.g. Contact.Department becomes Account.PersonDepartment, Contact.MobilePhone becomes Account.PersonMobilePhone) UNLESS we are talking FirstName and LastName as they keep their names (i.e. Contact.FirstName becomes Account.FirstName, Contact.LastName becomes Account.LastName)
    2. Custom fields from Contact: the field API name suffix is changed from __c to __pc (e.g. Contact.Shoesize__c becomes Account.Shoesize__pc)
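The naming rules can be expressed as a tiny helper. A hedged Python sketch of the rules above (treat it as an illustration of the convention, not an exhaustive mapping – individual standard fields may have further exceptions):

```python
def person_account_field(contact_field: str) -> str:
    """Map a Contact field API name to the name exposed on a PersonAccount."""
    if contact_field in ("FirstName", "LastName"):
        return contact_field  # these keep their names
    if contact_field.endswith("__c"):
        # custom fields swap the __c suffix for __pc
        return contact_field[:-3] + "__pc"
    # standard Contact fields get the Person prefix
    return "Person" + contact_field
```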

Bash one-liner for Apex test coverage percentage using SalesforceDX

Update 3 May 2018: There are issues with the percentages reported by SalesforceDX, plus it doesn’t report coverage on classes with 0% coverage, which will skew the results. The approach outlined below can be used as an indication but cannot as of today be used as a measure for code coverage when it comes to production deployments. As an example I’ve had the snippet report a coverage of 88% whereas a production deploy reported 63% coverage. We – Salesforce – are aware of the issue and are working to resolve it. Stay tuned!

Note to self – quick note on how to run all tests in a connected org (as identified by the -u argument) and use jq and awk to grab the overall test coverage percentage.

$ sfdx force:apex:test:run -u mheisterberg@example.com.appdev -c -w 2 -r json | jq -r ".result.coverage.coverage[].coveredPercent" | awk '{s+=$1;c++} END {print s/c}'
> 88.1108
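The jq/awk pipeline simply averages the coveredPercent values. The same logic in a few lines of Python, assuming the JSON shape implied by the jq path above (.result.coverage.coverage[].coveredPercent):

```python
import json

def average_coverage(report_json: str) -> float:
    # follow the same path as the jq filter and average the per-class percentages
    data = json.loads(report_json)
    pcts = [c["coveredPercent"] for c in data["result"]["coverage"]["coverage"]]
    return sum(pcts) / len(pcts)
```

Note that this is a plain average across classes, which is why (per the update above) it can disagree with the line-weighted number a production deploy reports.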

YMMV!

 

Using SalesforceDX to automate getting Apex class test coverage percentages

So SalesforceDX is good for many things, but this particular blog post is going to be about how it provides easy access to something which is otherwise hard or cumbersome to get at, like Apex class test coverage. It’s available through other means such as the UI and the Tooling API, but there it takes manual work (clicking) or requires additional plumbing to set up and extract. With SalesforceDX it’s surprisingly easy.

Contrary to popular belief, SalesforceDX may be used with any org and not just the scratch orgs that SalesforceDX affords for development. Connecting to any org is as simple as using the Force to do the OAuth dance:

$ sfdx force:auth:web:login

For additional points you can give the org connection an alias for easy reference (using --setalias) and specify the login URL if required (using --instanceurl), e.g. if you’re adding a sandbox.

$ sfdx force:auth:web:login --setalias MyOrg --instanceurl https://test.salesforce.com

Once you have the org connection you can use force:apex:test:run to run tests and force:apex:test:report to – surprise – return the test report.

$ sfdx force:apex:test:run -u mheisterberg@example.com.appdev
Run "sfdx force:apex:test:report -i 7076E00000Uo5sc -u mheisterberg@example.com.appdev" to retrieve test results.

$ sfdx force:apex:test:report -i 7076E00000Uo5sc -u mheisterberg@example.com.appdev
=== Test Results
TEST NAME OUTCOME MESSAGE RUNTIME (MS)
────────────────────────────────────────────────────────────────── ─────── ─────── ────────────
ChangePasswordControllerTest.testChangePasswordController Pass 11
AccountTriggerHandlerTest.testSetAccountOwner Pass 3623
AccountTriggerHandlerTest.testSetContactId Pass 114
AccountTriggerHandlerTest.testSetLowecaseEmail Pass 215
AccountTriggerHandlerTest.testSetPCAK Pass 83
AccountTriggerHandlerTest.testValidateEmailUniquenessNegative Pass 42
AccountTriggerHandlerTest.testValidateEmailUniquenessPositive Pass 80
AddressesListRestTest.testGetAddressesList Pass 11097
AddressRestTest.testDeleteAddress Pass 1388
AddressRestTest.testGetAddress Pass 753
AddressRestTest.testPostAddress Pass 734
AddressRestTest.testPutAddress Pass 731
ConsentRestTest.testGetConsent Pass 959
ConsentRestTest.testPostConsent Pass 768
ConsentRestTest.testPutConsent Pass 975
ConsentsListRestTest.testGetConsensList Pass 3761
ConsumerRestTest.testGetConsumer Pass 1004
ConsumerRestTest.testPostConsumer Pass 988
MarketRelationTriggerHandlerTest.testBehavior Pass 8
ProfileRestTest.testDeleteProfile Pass 1071
ProfileRestTest.testGetProfile Pass 710
ProfileRestTest.testPostProfile Pass 739
ProfileRestTest.testPutProfile Pass 679
ProfilesListRestTest.testGetProfilesList Pass 921
ForgotPasswordControllerTest.testForgotPasswordController Pass 29
MyProfilePageControllerTest.testSave Pass 258
SiteLoginControllerTest.testSiteLoginController Pass 17
SiteRegisterControllerTest.testRegistration Pass 16
=== Test Summary
NAME VALUE
─────────────────── ─────────────────────────────
Outcome Passed
Tests Ran 28
Passing 28
Failing 0
Skipped 0
Pass Rate 100%
Fail Rate 0%
Test Start Time Apr 13, 2018 10:18 AM
Test Execution Time 31774 ms
Test Total Time 31774 ms
Command Time 50941 ms
Hostname https://cs85.salesforce.com
Org Id 00D6E0000008eojUAA
Username mheisterberg@example.com.appdev
Test Run Id 7076E00000Uo5sc
User Id 0051r0000087iv9AAA

It’s pretty nifty huh!?

Again for added points, add --json to the test report command to get the data back in JSON. And if you already have something that accepts test results from say JUnit, you can just add “--resultformat junit” and boom! You’ll get the test report in JUnit XML format. But everything started with me wanting to retrieve code coverage data and that hasn’t been part of the output so far. Again SalesforceDX to the rescue… Just add --codecoverage and you’ll receive code coverage percentages as part of the report.

$ sfdx force:apex:test:report -i 7076E00000Uo5sc -u mheisterberg@example.com.appdev -c
== Apex Code Coverage
ID NAME % COVERED UNCOVERED LINES
────────────────── ───────────────────────────────── ────────────────── ────────────────────────────────────────────────────────────────────
01p6E000000aSHvQAM SiteLoginController 100%
01p6E000000aSHxQAM SiteRegisterController 81.48148148148148% 39,40,43,44,45
01p6E000000aSHzQAM ChangePasswordController 100%
01p6E000000aSI1QAM ForgotPasswordController 88.88888888888889% 15
01p6E000000aSI3QAM MyProfilePageController 87.5% 21,37,38
01p6E000000brQtQAI AccountTriggerHandler 95% 61,63,66,181
01p6E000000cBdhQAE MarketRelationTriggerHandler 78% 38,40,41,42
01p6E000000csObQAI Wrappers 98% 5
01p6E000000aIXsQAM ConsentRest 79% 35,36,54,55,67,69,70,88,89,104,105,106,119,121,125
01p6E000000ak0yQAA SegmentBuilder 100%
01q6E0000004xLIQAY AccountTrigger 100%
01q6E0000004zB0QAI MarketRelationTrigger 80% 13
01p6E000000aUJVQA2 UserBuilder 100%
01p6E000000aJjqQAE ConsentsListRest 94.11764705882352% 48
01p6E000000cnhQQAQ AddressRest 78% 35,36,60,68,70,92,93,110,111,112,126,128,132,149,150,162,164,165
01p6E000000caCxQAI ConsumerRest 84% 17,18,27,28,49,50,51,111,120,146,148,149,159,220,222,225,284,290,297
01p6E000000aIXxQAM ProfileRest 76% 27,28,49,57,59,78,79,91,93,94,111,112,127,128,129,141,143,147
01p6E000000aJjWQAU ProfilesListRest 94.11764705882352% 41
01p6E000000cr6iQAA AddressesListRest 83.33333333333334% 39,52,54
=== Test Results
<snipped>

Combine that with –json and you have the foundation for automating this. So sweet. You could even write a little script to output this any way you like.

Happy scripting…

The best way to learn a platform is to use a platform

Wow what a week it’s been. First week back from vacation and I’m diving right into a sprint of stuff that needs to be delivered to the customer. My task for the week has been to develop a connectivity layer between Salesforce and Dropbox using OAuth. This task has taken me on quite a learning journey.

Now I’ll call myself quite a seasoned programmer but ever since joining Salesforce 9 months ago I’ve had to relearn a lot of stuff. A new programming language in Apex, new framework technologies such as Salesforce Lightning Design System (SLDS) and the Lightning Platform and the entire underlying Salesforce Platform which is enormous. There are just different ways to do stuff… Previously I would have had my choice of any language, technology and framework to solve the issue but now those are kind of given.

Just this afternoon as I was putting the final semi-colons in the core OAuth layer (Apex), the Dropbox specific OAuth layer (Apex), the business specific facade class in front of those layers (Apex) and the management pages for the integration (Lightning), I was reminded of an old quote from an old biochemistry text book of mine: “The best way to learn a subject is to teach the subject”. That continues to hold true, and it is equally true when saying “the best way to learn a platform is to build on the platform”. I’ve learned much about Apex and Lightning in the past week. All the cool stuff you can do, but also some of the crazy stuff that falls into that “you just have to know that”-category. But for all those things it holds true that you have to spend the time to reap the benefits.

For now I’ll just say that although the Apex language has its quirks and the compiler is definitely showing age (and Salesforce knows this – a new compiler is being developed, or at least that’s what I heard in a Dreamforce 2016 session), there is still so much power in the language. Is there stuff you cannot do? Sure! But there are cool interfaces like System.StubProvider which I read and hear little talk about. The built-in support for unit tests and mocking of HTTP requests is awesome and allows you to quickly and easily stub out calls to external services. The runtime screams with performance and I’m pretty impressed by the speed with which my code runs.

Good stuff!

So in closing – I have my OAuth layer done, unit tested and committed. I’ll surely be blogging about how I chose to build it so others may learn from it if they want to spend the time and learn the platform.