Clone database between apps on Heroku

When doing development on Heroku using review apps, pipelines and all the other goodness it’s sometimes required to test against a clone of a production database. Doing that is pretty easy on Heroku once you know how, even without moving the data down to your own laptop and then back up to Heroku. It’s done by creating a backup, generating a public but unguessable URL for the backup image and then restoring from that URL.

Below are the steps in the Heroku CLI to clone a database from an app called “my-production-app” to an app called “my-development-app”. Step no. 3 creates a new Heroku Postgres add-on attached to “my-development-app” on a plan that can hold the same amount of data as the database on “my-production-app”.

$ heroku pg:backups:capture --app my-production-app
Starting backup of postgresql-aerodynamic-55964… done

Use Ctrl-C at any time to stop monitoring progress; the backup will continue running.
Use heroku pg:backups:info to check progress.
Stop a running backup with heroku pg:backups:cancel.

Backing up DATABASE to b003… done

$ heroku pg:backups:url --app my-production-app b003
https://xfrtu.s3.amazonaws.com/112d4cf2-xxxx-475b-xxxx-ac809400fa78…

$ heroku addons:create heroku-postgresql:hobby-basic --app my-development-app
Creating heroku-postgresql:hobby-basic on ⬢ my-development-app… $9/month
Database has been created and is available
! This database is empty. If upgrading, you can transfer
! data from another database with pg:copy
Created postgresql-octagonal-31270 as HEROKU_POSTGRESQL_YELLOW_URL
Use heroku addons:docs heroku-postgresql to view documentation

$ heroku pg:backups:restore --app my-development-app https://xfrtu.s3.amazonaws.com/112d4cf2-xxxx-475b-xxxx-ac809400fa78… HEROKU_POSTGRESQL_YELLOW

Once restored you may promote the database you restored into using the “heroku pg:promote” command and/or delete the backup you created, as shown below.
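
If you want the restored database to become the primary DATABASE_URL for “my-development-app”, promote it by the attachment name reported when the add-on was created, e.g.:

$ heroku pg:promote HEROKU_POSTGRESQL_YELLOW --app my-development-app

And to clean up the backup on the production app afterwards: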

$ heroku pg:backups:delete --app my-production-app b003
▸ WARNING: Destructive Action
▸ This command will affect the app my-production-app
▸ To proceed, type my-production-app or re-run this command with --confirm
▸ my-production-app

2-legged vs 3-legged OAuth

Had a question come in from a customer that centered around understanding the difference between 2-legged and 3-legged OAuth so I thought I would write a little about it. In short, 2-legged and 3-legged OAuth refer to the number of players involved in the OAuth dance, but let’s explain each.

3-legged OAuth is used when you as a user want to allow an application (e.g. Salesforce) to access a service (e.g. Azure) on your behalf without the application (Salesforce) knowing your credentials for the service (Azure). This is accomplished by me (the user) being redirected by the application (Salesforce) to the service (Azure) where I log in directly. I then obtain a code that the application (Salesforce) can use to obtain an access token for the service (Azure) out of band (i.e. the application contacts the service directly). The key is that I, the user, am involved because I need to authenticate to the service (Azure), and the application is afterwards able to impersonate me towards the service.
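
To make the out-of-band exchange concrete, the last leg is just a POST of the code to the service’s token endpoint. A minimal sketch with curl, where the endpoint, client id/secret, redirect URI and code are all placeholder values:

# exchange the authorization code for an access token (generic OAuth 2.0 authorization code grant)
curl -X POST https://login.example.com/oauth2/token \
  -d grant_type=authorization_code \
  -d code=CODE_FROM_THE_REDIRECT \
  -d redirect_uri=https://myapp.example.com/oauth/callback \
  -d client_id=MY_CLIENT_ID \
  -d client_secret=MY_CLIENT_SECRET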

2-legged OAuth is used when an application needs to access a service using a service account (e.g. an App Registration as it’s called in Azure). The key here is that the application has all the information it needs to authenticate to the service. In 2-legged OAuth the application makes a single call to the service to basically exchange credentials (username/password, client_id/client_secret or a JSON Web Token (JWT)) for an access token. As an aside there is usually no way to get a refresh token issued in a 2-legged OAuth dance, which is fair as the application could just perform the dance again to get a new access token, hence no need for a refresh token.
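
By comparison the entire 2-legged dance can be a single call. Here is a sketch using the OAuth 2.0 client credentials grant, again with a placeholder endpoint and credentials:

# exchange client credentials directly for an access token (OAuth 2.0 client credentials grant)
curl -X POST https://login.example.com/oauth2/token \
  -d grant_type=client_credentials \
  -d client_id=MY_CLIENT_ID \
  -d client_secret=MY_CLIENT_SECRET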

Looking back towards Salesforce, Named Credentials are the way we recommend customers manage credentials for accessing services outside Salesforce. In a Named Credential you use 3-legged OAuth if you select “OAuth 2.0” as the “Authentication Protocol” and 2-legged OAuth if you select “JWT”.

Awarded an award

During our regional (Nordic) kickoff last week (while I was skiing with my family) I was proud and happy to hear that I’d been awarded the Outstanding SE Achievement award. Picked up the award today including the accompanying two bottles of champagne. Very nice.

Outstanding SE Achievement FY20

(SE is short for Solution Engineer)

Using Heroku Postgres from local development setup (take 2)

Previously wrote about this (Using Heroku Postgres from local development setup) but starting with version 8 of the node-postgres (pg) module, implicitly disabling certificate verification is deprecated and will be removed. To avoid the deprecation message add “rejectUnauthorized=false” to the URL as well, so a full URL in your .env file (or similar) will be:

DATABASE_URL=postgres://foo:bar@ec2-55-222-96-152.eu-west-1.compute.amazonaws.com:5432/baz?rejectUnauthorized=false&ssl=true

Yet another certification bites the dust!

Yay!

Tonight I took and passed the exam to become Salesforce Certified Sharing and Visibility Designer. This is one of the two missing certifications on my track to become a Salesforce Certified Application Architect. Very happy I passed and must say that it is among the hardest Salesforce certifications I’ve taken so far.

Not necessarily because of the topic but more because the questions asked are very long and at times very hard to decipher. Many questions seem to test your ability to read and understand English more than your ability to grasp sharing and visibility topics. Also, for many of the questions you need to select 2 or 3 answers instead of the single radio-button answer of many other certifications.

Generate a Java Keystore (JKS) which is importable in Salesforce

Salesforce only supports the Java Keystore (JKS) format for importing private/public key pairs (with certificate) into a Salesforce org. Certificates and private/public key pairs are important when using JSON Web Tokens (JWTs) for outbound integration flows, as the JWT needs to be signed using the private key.

If working with Named Credentials for an outbound JWT token flow you need to import a private/public key pair into Salesforce using “Certificate and Key Management” in Setup. Alternatively you could also use a self-signed certificate generated in Salesforce.

Whatever you do you need a valid keystore. Below are the commands I use to generate a private/public key pair with openssl and then use keytool (the Java keystore tool) to import it into a Java keystore that Salesforce will accept.

# generate private/public keypair
openssl req -newkey rsa:2048 -nodes -keyout private_key.pem -x509 -days 365 -out certificate.pem

# write certificate to a binary file (some systems need binary format)
openssl x509 -outform der -in certificate.pem -out public_key.der

# get the public key from the certificate
openssl x509 -in certificate.pem -pubkey > public_key.pem

# import certificate into Java Key Store (JKS)
# !!! Be sure to trust the certificate - otherwise it's not imported
keytool -importcert -file certificate.pem -keystore keystore.jks -alias mycertificate -storetype jks

# create a PKCS12 keystore with private/public keypair
openssl pkcs12 -inkey private_key.pem -in certificate.pem -export -out keystore.p12 -name mykey

# import keypair into Java keystore
keytool -importkeystore -destkeystore keystore.jks -srckeystore keystore.p12 -srcstoretype pkcs12 -destalias mykey -srcalias mykey
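
Before importing the keystore via “Certificate and Key Management” it can be worth sanity-checking that both the trusted certificate and the private key entry made it in; keytool will prompt for the keystore password you chose above:

# list the entries in the keystore
keytool -list -keystore keystore.jks -storetype jks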

Use node.js LTS for Azure Functions

Did a proof-of-concept for an Azure Function in node.js that uses Redis Cache for session storage, as the function runtime is 100% managed and using a memory store doesn’t make sense. Once I had this I wanted to play with the local development support, but that didn’t work as I was using node.js v. 13 and the Azure tooling only works with an LTS version of node.js (meaning 12.x at the time of this writing). To fix it I had to uninstall my current version of node.js and switch to v. 12.x as follows (I’m using Homebrew to manage dependencies):

brew uninstall node
brew install node@12

Then I had to update my PATH as the node binary is not in /usr/local/bin anymore but rather in /usr/local/opt/node@12/bin. Once that was done the Azure tooling for local application development worked like a charm.
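
For reference, the PATH change amounts to something like the line below in your shell profile (the path is where Homebrew puts the keg-only node@12 formula on my machine; adjust to your setup):

# make the node@12 binaries take precedence over any other node install
export PATH="/usr/local/opt/node@12/bin:$PATH"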

The proof-of-concept is at https://github.com/lekkimworld/poc-azure-functions-with-session.

Utility to generate JWTs for use with Salesforce

Whenever you work with JSON Web Tokens (JWTs), generating them for testing is always a hassle as they are usually required to expire quite quickly (like in the order of minutes). To make that easier I wrote a small utility in node.js to generate JWTs compatible with the Salesforce OAuth 2.0 JWT Bearer Flow.

Code is on Github at https://github.com/lekkimworld/salesforce-jwt-generator
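
Once you have a generated JWT the actual flow towards Salesforce is a single POST of the signed JWT to the OAuth token endpoint, roughly like below with curl (the assertion being the JWT produced by the utility):

# exchange a signed JWT for a Salesforce access token (OAuth 2.0 JWT Bearer Flow)
curl -X POST https://login.salesforce.com/services/oauth2/token \
  --data-urlencode "grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer" \
  --data-urlencode "assertion=GENERATED_JWT"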

YMMV!