Yet another reason for using Platform as a Service (PaaS)

I’m using a Platform as a Service (PaaS) for all my application development — because why wouldn’t I?! Heroku is the platform of choice with the full disclaimer that Heroku is a Salesforce company and I work for Salesforce.

However, the driving reason for my use of Heroku is that, being a PaaS, it provides me with a high level of abstraction over the underlying compute infrastructure. On Heroku I just worry about the app and nothing else. No load balancing. No compute instances. No gluing and regluing stuff together. Just the app.

Among the applications I have on Heroku I use a wide variety of databases, including Heroku Postgres (a fully managed and elastic Postgres instance). A nice aspect of this is that to me Postgres is just another data store; I do not have to worry about operating it. This is great, as I’m much better at writing apps than managing an always-on, load balanced and highly available database cluster.

Yesterday I received a notice about upcoming maintenance and one section in the email made me take notice so I thought I would share:

Your database must undergo maintenance.

At that time, we will create a hidden follower, wait for it to catch up, push the hidden follower’s creds to your app, unfollow it from your former leader and repoint any followers you may have to properly follow your new leader.

Let’s break that down really quickly… When they — Heroku — do the maintenance for me (and I won’t have to lift a finger), they will:

  • Create a hidden follower (database)
  • Wait for it to catch up (to allow uninterrupted operations)
  • Push the hidden follower’s creds to your app (so the app uses the new follower database)
  • Unfollow it from your former leader (to ensure smooth operations)
  • Repoint any followers you may have to properly follow your new leader (again to ensure smooth operations)

All this without me doing anything. Again, I just worry about my app, which by the way will continue running without any change. How cool is that?! I just worry about the app.

Just another reason for using a PaaS…

Simultaneous INSERTs with Heroku Connect in the same transaction

So I had some fun working with Heroku Connect the other day, writing some data from Heroku to Salesforce using Postgres. On Heroku that is easily done using Heroku Connect, which is a turn-key solution we have for exactly that. Easy to use and easy to set up. The way it works is that you configure a connection from Heroku to Salesforce, map it to a Postgres database on Heroku, select the objects and fields from Salesforce to synchronize, and Heroku Connect does the rest. Powerful stuff!

So what did I do and what did I learn…

In Salesforce I have two custom objects – a parent custom object (Parent__c) and a child custom object (Child__c) with a master-detail relation to the parent (Parent__r / Parent__c). Both the Parent and Child custom objects have an external ID (External_ID__c) to make writing back from Heroku easy.

Once the objects are mapped using Heroku Connect I get 2 new tables in the “salesforce” schema in Postgres (“salesforce.parent__c” and “salesforce.child__c”) with the selected fields.
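
A quick way to sanity check the mapping is simply to query one of the tables from the app. Below is a minimal node-postgres sketch; it assumes the connection string is in the standard DATABASE_URL config var and that the column names follow the lowercased convention Heroku Connect uses (run “\d salesforce.parent__c” in psql to see exactly what was created):

const { Pool } = require("pg");
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// list a few parent records - sfid is the Salesforce record ID column added
// by Heroku Connect, the rest are the mapped (lowercased) custom fields
pool.query("SELECT sfid, external_id__c, text__c FROM salesforce.parent__c LIMIT 5")
   .then(rs => console.log(rs.rows))
   .catch(err => console.error(err));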

What I wanted to do was – in a transaction – write a new parent record and the associated child records. I’m using node.js, but since I’m using raw SQL this shouldn’t be too different from any other language. What I did do, however, was use promises, which is important for this post.

Below is some code to illustrate what I did. Basically the code does the following:

  1. Starts a transaction (BEGIN)
  2. Inserts the parent record (INSERT INTO salesforce.parent__c…)
  3. Creates a promise for each of the 3 child record inserts (INSERT INTO salesforce.child__c…) and waits for all of them with Promise.all (i.e. the inserts are issued concurrently)
  4. Commits the transaction (COMMIT) if all is good or rolls it back (ROLLBACK)
// setup a node-postgres (pg) connection pool - assumes the standard Heroku DATABASE_URL config var
const { Pool } = require("pg");
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// data to work with
const objData = {
   "id": "ffe16a92-0f22-4a16-947a-461d307b4905", 
   "text": "Some Parent Text...", 
   "children": [
      {"id": "ffe16a92-0f22-4a16-947a-461d307b5906", "text": "Child1"},
      {"id": "ffe16a92-0f22-4a16-947a-461d307c4907", "text": "Child2"},
      {"id": "ffe16a92-0f22-4a16-947a-461d307b3904", "text": "Child3"}
   ]
};

// start a new tx
pool.query("BEGIN").then(rs => {
   // insert parent record
   return pool.query(`INSERT INTO salesforce.parent__c 
      (External_ID__c, Text__c) 
      VALUES 
      ('${objData.id}', '${objData.text}');`);
}).then(rs => {
   // map each child to a promise to insert child record
   const promises = objData.children.map(child => {
      return pool.query(`INSERT INTO salesforce.child__c 
         (parent__r__external_id__c, external_id__c, text__c) 
         VALUES 
         ('${objData.id}', '${child.id}', '${child.text}')`);
   });
   return Promise.all(promises);
}).then(rs => {
   // commit tx
   console.log("Commiting tx");
   return pool.query("COMMIT");
}).catch(err => {
   // roll tx back
   console.log(`Rolling back tx: ${err.message}`);
   return pool.query("ROLLBACK");
})

The issue was that I kept seeing 2 errors in the log when the code ran, as shown below. The funny thing is that I didn’t see any information about the transaction being rolled back.

2019-03-16T14:32:20-07:00 event="record 8 ERROR ↑SALESFORCE [SOAP] Foreign key external ID: ffe16a92-0f22-4a16-947a-461d307b4905 not found for field External_ID__c in entity Parent__c" addon_id=ba3d4995-092d-4095-aa5e-7ac41981bd93 object_type=sync object_id=ba3d4995-092d-4095-aa5e-7ac41981bd93 state=POLLING_DB_CHANGES level=debug  
2019-03-16T14:32:20-07:00 event="record 9 ERROR ↑SALESFORCE [SOAP] Foreign key external ID: ffe16a92-0f22-4a16-947a-461d307b4905 not found for field External_ID__c in entity Parent__c" addon_id=ba3d4995-092d-4095-aa5e-7ac41981bd93 object_type=sync object_id=ba3d4995-092d-4095-aa5e-7ac41981bd93 state=POLLING_DB_CHANGES level=debug

Okay, so clearly the inserts fail because the foreign key of the child records cannot be resolved, i.e. the ID pointing from salesforce.child__c to salesforce.parent__c. But why only 2 errors and not 3? (I was attempting to insert 3 child records.) I added an intermediate COMMIT before inserting the child records, started a new transaction for them, and then all was well. So it had to do with the transaction… In plain Postgres the solution would be to defer the referential constraint check, but I couldn’t do that here. Why? Because Heroku Connect doesn’t use foreign key constraints, as can be seen by inspecting the child table (psql: “\d salesforce.child__c”); instead it seems to do the referential integrity check using a trigger on the tables it creates. I’m guessing this is part of the magic that is Heroku Connect.
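
For reference, this is roughly what deferring the check would look like on a plain Postgres schema you control yourself; it does not apply to the Heroku Connect tables since there is no foreign key constraint to defer. The table and constraint names below are made up for the sketch:

// assumes a schema you own where the foreign key was declared DEFERRABLE, e.g.
//   ALTER TABLE child ADD CONSTRAINT child_parent_fk
//     FOREIGN KEY (parent_id) REFERENCES parent (id) DEFERRABLE INITIALLY IMMEDIATE;
const { Pool } = require("pg");
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

const insertOutOfOrder = async () => {
   const client = await pool.connect();
   try {
      await client.query("BEGIN");
      // postpone the foreign key check until COMMIT
      await client.query("SET CONSTRAINTS child_parent_fk DEFERRED");
      await client.query("INSERT INTO child (id, parent_id) VALUES (1, 42)");
      await client.query("INSERT INTO parent (id) VALUES (42)");
      await client.query("COMMIT"); // the constraint is checked here
   } catch (err) {
      await client.query("ROLLBACK");
      throw err;
   } finally {
      client.release();
   }
};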

I played around a bit more with the code and finally solved the issue by doing the inserts in turn, resolving the promises sequentially. So instead of using Promise.all (which fires off all the inserts concurrently) I ran each insert only after the previous one had finished, and then it worked.
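
Roughly, the sequential version looks like the sketch below: a reduce-based chain where each child INSERT only starts once the previous one has resolved. My actual code differed a bit, and the parameterized query style is just to keep the sketch tidy:

// chain the child inserts so they run one after the other instead of concurrently
const insertChildrenSequentially = (pool, objData) => {
   return objData.children.reduce((chain, child) => {
      return chain.then(() => pool.query(
         `INSERT INTO salesforce.child__c 
            (parent__r__external_id__c, external_id__c, text__c) 
            VALUES ($1, $2, $3)`,
         [objData.id, child.id, child.text]));
   }, Promise.resolve());
};

Plugging something like that in where the Promise.all call was is what made the inserts go through.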

My understanding of this must be as follows. The triggers created by Heroku Connect that handle the underlying integration to Salesforce use some log tables (as described in the documentation), but the triggers must also employ some locking of the salesforce.parent__c table, and hence the other inserts into salesforce.child__c running at the same time fail. Also, this doesn’t cause my transaction to roll back, which from my point of view is unfortunate as I really want the inserts to be atomic, hence the use of a transaction in the first place. In this case there was a solution in doing the inserts sequentially. I’ll follow up internally in Salesforce and see if I can provide a better answer than the above.

Individual Object enablement in Salesforce Scratch orgs – now available!

I’m so happy that one of my scratch org pet peeves finally made it into the product with the Spring ’19 release. Until now it has not been possible to enable the Individual Object when creating new scratch orgs, and IMO this has been a cause of many issues, discussions and workarounds in customer projects. But all of this is a thing of the past now! Starting with Spring ’19 you may add the below orgPreferenceSettings entry to your scratch org definition file:

"settings": {
    "orgPreferenceSettings": {
      "consentManagementEnabled": true
    }
 }
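
For context, here is roughly what a complete scratch org definition file with the setting could look like (orgName and edition are just placeholder values):

{
  "orgName": "my-scratch-org",
  "edition": "Developer",
  "settings": {
    "orgPreferenceSettings": {
      "consentManagementEnabled": true
    }
  }
}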

The setting hasn’t made it into the documentation as of this writing, but I know that it’s in the works. Now go and create scratch orgs with the Individual Object enabled!

Calling a Swagger service from Apex using openapi-generator

For a demo I needed to make calls to the Petstore API from Apex using a Swagger / OpenAPI definition. Unfortunately the Apex support that External Services in Salesforce provides is only callable from Flows, so I needed another approach. Using openapi-generator, installed via Homebrew, it went relatively smoothly anyway.

It’s always nice to stand on the shoulders of giants, so the examples provided by Rene Winkelmeyer (Connecting to Swagger-backed APIs with Clicks or Code) came in handy. Unfortunately they didn’t work as-is, as the prefix for the generated classes was different, plus the petId was 1 and not 100. Anyway, again easy to fix.

Below are my steps to get it working using Homebrew and Salesforce DX.

$ brew install openapi-generator
$ openapi-generator generate \
-i https://raw.githubusercontent.com/openapitools/openapi-generator/master/modules/openapi-generator/src/test/resources/2_0/petstore.yaml \
-g apex \
-o /tmp/petstore_swagger/
$ cd /tmp/petstore_swagger
$ sfdx force:org:create \
-v my_devhub \
-f config/project-scratch-def.json -a petstore_swagger
$ sfdx force:source:push -u petstore_swagger
$ sfdx force:apex:execute -u petstore_swagger
>> Start typing Apex code. Press the Enter key after each line,
>> then press CTRL+D when finished.
OASClient client = new OASClient();
OASPetApi api = new OASPetApi(client);
Map<String,Object> params = new Map<String, Object>();
Long petId = Long.valueOf('1');
params.put('petId', petId);
OASPet pet = api.getPetById(params);
System.debug(pet);
<Ctrl-D>

Backing iOS up to a secondary drive on macOS

So my wife got a new iPhone yesterday, which of course means me playing tech support, backing the old one up and restoring onto the new one. My wife has a MacBook Air with a somewhat limited internal drive, so there wasn’t enough space to back up the old iPhone (128 GB). Easy, I thought! I’ll just move the iTunes library folder to an external USB drive and the backup will go there. Wrong! The backups are not kept in the iTunes library, so that didn’t work.

After a little research it turned out that iOS backups are kept in ~/Library/Application Support/MobileSync/Backup, with each backup being a folder in that directory. Hmmm… Maybe a symlink would work? Oh yes! So here’s what I did:

  1. Delete older backups from iTunes using Preferences/Devices
  2. Quit iTunes
  3. Break open a terminal
  4. Move to the folder containing the Backup folder:
    cd ~/Library/Application\ Support/MobileSync
  5. Remove the folder (now empty):
    rm -rf Backup
  6. Create a symbolic link (symlink) to the directory on the external drive that will hold the backup (here the USB drive is called MM):
    ln -s /Volumes/MM/iOSbackups/ ./Backup
  7. Open iTunes and backup the iPhone and restore onto the new iPhone
  8. Quit iTunes
  9. Reverse the process (after having ejected the USB drive):
    cd ~/Library/Application\ Support/MobileSync
    rm Backup
    mkdir Backup
  10. Done!


Salesforce Canvas Apps

A Salesforce Canvas app is an often overlooked, easy way to integrate existing apps into Salesforce. A Canvas app is inlined into the Salesforce user interface, and it requires only a very small change to your app to have it play nice with Salesforce. In theory you could get away without any change, but usually you’d like to know who the calling user is. What’s really great about a Canvas app is that this information is POST’ed to the application at invocation, together with an OAuth access_token to allow authenticated callbacks to Salesforce. To implement this you need to:

  1. Support POST at a URL you specify
  2. Render the application from here or redirect the user after the POST has been received
  3. Receive and handle the signed request

The signed request is a base64 encoded blob in two parts separated by a period. It looks very much like a JSON Web Token (JWT). To verify it you compute a keyed hash (HMAC) using the SHA-256 algorithm, with the client secret of the Connected App from Salesforce as the secret. In node.js that is done like so:

const ourSignature = Buffer.from(crypto.createHmac(algorithm, clientSecret).update(objPart).digest()).toString('base64')
The algorithm is “sha-256”, the client secret is a string and objPart is the object part of the signed request.
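
To sketch the whole flow in an Express app, a minimal example is shown below. The route path, the form parameter name and the environment variable are my assumptions, so adjust them to your own setup; note that node’s crypto module names the algorithm "sha256".

const express = require("express");
const crypto = require("crypto");

const app = express();
app.use(express.urlencoded({ extended: true })); // the Canvas POST is form encoded

// hypothetical path - use whatever Canvas App URL you configured in Salesforce
app.post("/canvas", (req, res) => {
   // the signed request arrives as "<signature>.<base64 encoded JSON>"
   const [signaturePart, objPart] = req.body.signed_request.split(".");

   // recompute the keyed hash over the object part using the Connected App's client secret
   const ourSignature = crypto
      .createHmac("sha256", process.env.CANVAS_CLIENT_SECRET)
      .update(objPart)
      .digest("base64");
   if (ourSignature !== signaturePart) {
      return res.status(401).send("Invalid signature");
   }

   // the object part is base64 encoded JSON carrying the calling user's details
   // and the OAuth access_token for authenticated callbacks to Salesforce
   const envelope = JSON.parse(Buffer.from(objPart, "base64").toString("utf8"));
   console.log(envelope);
   res.send("Canvas app verified the signed request");
});

app.listen(process.env.PORT || 3000);
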
To make it even easier I’ve created a repo showing how it’s done in node.js in an Express app. The source including an example app is available at https://github.com/lekkimworld/salesforce-oauth-express-middleware. The repo also contains a test app (canvas-test-app) that is easily deployable to Heroku.