Deprecation and End Of Support for Platform API Legacy Versions

So this is finally happening. When I write “finally” it’s because I am sure this will lead to removing quite a lot of code in the Salesforce core application. With Summer ’21 we are deprecating a bunch of API versions (see the release notes for more info), which means support for those API versions will finally be removed in Summer ’22.

Below are the API versions that will be deprecated and ultimately removed:

SOAP API: 7.0, 8.0, 9.0, 10.0, 11.0, 11.1, 12.0, 13.0, 14.0, 15.0, 16.0, 17.0, 18.0, 19.0, 20.0
REST API: 20.0
Bulk API: 16.0, 17.0, 18.0, 19.0, 20.0

Consider yourself notified… 🙂

Salesforce API Postman Collection

If you develop with the Salesforce APIs, the following will save you a lot of time and make your life a lot easier. In our ongoing effort to make the Salesforce APIs easier to consume, our Developer Relations team has built a Postman collection with 200+ requests. I know I’ve spent a lot of time maintaining my own collection for demos – that time is over!

The collection covers the following Salesforce APIs:

  • Auth
  • Bulk (V1 & V2)
  • Async Query
  • Rest
  • UI
  • Tooling
  • Metadata
  • Composite
  • Chatter
  • Connect

The collection and instructions are available on GitHub, and there is even a webinar to help you get going.

Trying to explain Person Accounts in Salesforce

Person Accounts in Salesforce keep confusing developers who are not at home on the platform due to their special behaviour. The purpose of the repo (https://github.com/lekkimworld/salesforce-personaccount-field-reference) is to hold some examples of how to work with Person Accounts in an org using Apex and the Bulk API, to try and illustrate a few points.

What ARE PersonAccounts

First off, knowing WHAT a Person Account is is important. In Salesforce we normally talk about Accounts and Contacts, with the Account being the company entity (i.e. Salesforce.com Inc.) and the Contact being the people that we track for that company (i.e. Marc Benioff, Parker Harris etc.). It means that we have to have an Account and a Contact to track a person in Salesforce. But what if that doesn’t make any sense, like when tracking individuals for B2C commerce or similar? Meet the PersonAccount.

Please Note: There is no such object as PersonAccount in Salesforce. There are only Account and Contact but in the following I’ll use PersonAccount to reference this special case for Account.

PersonAccount is a special kind of Account that is both an Account AND a Contact, giving you the possibility to treat an individual as an Account. The secret to understanding PersonAccounts is knowing that when you create an Account and specify a special record type, Salesforce automatically creates both an Account AND a Contact record, links them, and thus creates the PersonAccount. Salesforce automatically makes the fields that are normally available on the Contact (including custom fields) available on the Account. The only thing you need to do is follow a few simple rules, which are listed below.

Please Note: When using PersonAccounts you should always access the Account and never the associated Contact.

Referencing Fields

Because there is both an Account and a Contact for a PersonAccount there are some special rules to follow when referencing fields. This goes for any access, whether that be Apex, the REST API or the Bulk API. The rules are pretty easy and are as follows (a small example follows after the list):

  1. Always reference the Account object
  2. When creating a PersonAccount create an Account specifying the record type ID of the PersonAccount record type configured in Salesforce. Doing this makes the Account a PersonAccount.
  3. Fields from Account are available on Account (as probably expected):
    1. Standard fields from Account: referenced using their API name as usual (e.g. Site, Website, NumberOfEmployees)
    2. Custom fields from Account: referenced using their API name as usual (e.g. Revenue__c, MyIntegrationId__c)
  4. Fields from Contact are available directly on Account:
    1. Standard fields from Contact: the API name of the field is prefixed with “Person” (e.g. Contact.Department becomes Account.PersonDepartment, Contact.MobilePhone becomes Account.PersonMobilePhone) UNLESS we are talking FirstName and LastName, as they keep their names (i.e. Contact.FirstName becomes Account.FirstName, Contact.LastName becomes Account.LastName)
    2. Custom fields from Contact: the field API name suffix is changed from __c to __pc (e.g. Contact.Shoesize__c becomes Account.Shoesize__pc)
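
To make the rules concrete, here is a minimal sketch of creating a PersonAccount through the REST API from plain Java (11+). The instance URL, access token and record type ID are placeholders for values from your own org, and Shoesize__pc is the hypothetical custom field used in the rules above.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CreatePersonAccount {
    public static void main(String[] args) throws Exception {
        String instanceUrl = "https://yourInstance.my.salesforce.com"; // placeholder
        String accessToken = "00D...";                                 // placeholder OAuth access token
        String recordTypeId = "012...";                                // ID of your PersonAccount record type

        // note the field names: FirstName/LastName keep their names, other standard
        // Contact fields get the "Person" prefix and custom Contact fields use __pc
        String body = "{"
                + "\"RecordTypeId\": \"" + recordTypeId + "\","
                + "\"FirstName\": \"John\","
                + "\"LastName\": \"Doe\","
                + "\"PersonDepartment\": \"Sales\","
                + "\"PersonMobilePhone\": \"+45 12345678\","
                + "\"Shoesize__pc\": 42"
                + "}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(instanceUrl + "/services/data/v51.0/sobjects/Account"))
                .header("Authorization", "Bearer " + accessToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}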

IBM Connections application development state of the union – part 1

IBM Connections has been on the market for a long time now and has always been a really strong player when it comes to application development. I thought it was time to review where we are application development wise, over what will probably be a couple of posts. First off is APIs…

IBM Connections is, and has always been, strong from an API point of view – there is an API for almost all areas of the product and there always has been. I think IBM Connections was the first product (maybe still the only one) built by IBM Collaboration Solutions using an API-first approach, or at least with a strong emphasis on APIs.

The APIs are extensive and are pretty straightforward to use. The main caveat about the IBM Connections APIs is that they were designed a looooong time ago and haven’t been updated since. I went to the documentation to check and the majority of the APIs haven’t changed one bit since v. 2.5. This means they are still using the once cool Atom publishing protocol with lots of XML and lots of namespaces. Stability is very nice, but an XML based API is really appdev unfriendly these days and the use of namespaces makes it even worse and more difficult to handle – you either handle the complexity or you simply choose to parse without namespaces. Extracting data is done using either XPath or string parsing, neither of which is easy or performs well, and again the namespaces make XPath notoriously difficult.
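
To illustrate the namespace pain, below is a minimal Java sketch of pulling entry titles out of an Atom feed with namespace-aware XPath. The feed file name is purely illustrative, but the NamespaceContext boilerplate is what every consumer of an Atom based API ends up writing.

import java.util.Iterator;
import javax.xml.XMLConstants;
import javax.xml.namespace.NamespaceContext;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class AtomTitles {
    public static void main(String[] args) throws Exception {
        // namespace aware parsing is required for namespace qualified XPath to work
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true);
        Document doc = factory.newDocumentBuilder().parse("connections-feed.xml"); // illustrative file

        // every single query has to drag the prefix-to-namespace mapping along
        XPath xpath = XPathFactory.newInstance().newXPath();
        xpath.setNamespaceContext(new NamespaceContext() {
            public String getNamespaceURI(String prefix) {
                return "atom".equals(prefix) ? "http://www.w3.org/2005/Atom" : XMLConstants.NULL_NS_URI;
            }
            public String getPrefix(String namespaceURI) { return null; }              // not needed here
            public Iterator<String> getPrefixes(String namespaceURI) { return null; }  // not needed here
        });

        NodeList titles = (NodeList) xpath.evaluate(
                "/atom:feed/atom:entry/atom:title", doc, XPathConstants.NODESET);
        for (int i = 0; i < titles.getLength(); i++) {
            System.out.println(titles.item(i).getTextContent());
        }
    }
}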

This being said there is one exception to the XML based nature of the APIs. When Activity Streams were added in IBM Connections 4.5 they were brought to market with a JSON based API, and they remain the only component in the IBM Connections suite using a JSON based API. Due to its roots in OpenSocial the API was however hard to grasp, and I don’t think it got much traction despite my best efforts to make it approachable. My presentation on how to use the Activity Stream remains one of my most popular presentations at conferences.

When this is said and done, I think the most important thing IBM could do for IBM Connections API-wise would be to update the APIs. In today’s world a JSON based API is really expected and is the de facto way of accessing data. Do away with XML and namespaces and adopt a standards based JSON approach to messages being sent to the server and returned from the server. Of course the legacy Atom based API should be left in, but it should really be augmented with a JSON ditto as soon as possible.

Besides the APIs there is also a set of SPIs (Service Provider Interfaces) for stuff like hooking into the event subsystem, discovering features etc. The event SPI is pretty well designed and is very useful. However it seems that it was never really polished, completed and designed for customer / partner consumption, as much of the functionality is reserved for IBM widgets and was never documented or supported. Other pieces of the SPIs can only run within the same JVM as IBM Connections, which really makes them unusable from a partner perspective.

The worst thing about the APIs and SPIs is however what is not there…

  • There is no support for programmatic user registration or user deletion / deactivation.
  • There is no support for feature discovery from outside the JVM.
  • There is no support for getting a complete member list for a community based on (LDAP) directory groups. The community membership API will return a list of members, some of which may be groups, and if that’s the case, good luck. Your best / only option is to use WIM (WebSphere Identity Manager) to resolve the group and then manually combine that result with the direct community members. Of course, if you are using IBM Domino as the LDAP server, start by figuring out how IBM mangled the group UID before resolving it in WIM – that’s really a can of worms and maybe worth a blog post of its own.
  • There is no support for a super-user that can access data on all users’ behalf.
  • There is no support for easily reusing the IBM Connections header bar when using IBM Connections on-premises.

I don’t want to finish this post being too pessimistic, but there is really room for improvement from the API / integration standpoint. IBM needs to step up to the plate, update the platform and make it current from an API perspective. Oh, and while you are at it: document the APIs, create up-to-date examples and actually make the documentation discoverable using Google…

Calendar integration example using OnTime Group Calendar API

The Problem

A Danish insurance company was running a CRM system to control and maintain customer relationship information and plan meetings for its insurance agents. Because the CRM system wasn’t integrated with their IBM Domino infrastructure running their mail and calendar, the insurance agents in effect had two calendars – one in IBM Notes holding their company appointments and one in the CRM system holding their external customer meetings. This meant that the insurance agents effectively stopped using their calendars (or maintained a third, non-company, calendar) as there was no single place to see all appointments. Besides this, the organization as a whole was unable to plan internal meetings with the insurance agents as their calendars didn’t reflect their actual whereabouts.

The Solution

To remedy this they decided to use the OnTime Group Calendar API to integrate the two systems using web services via an intermediate Enterprise Service Bus (ESB). The OnTime Group Calendar API web services are hosted directly on IBM Domino and perform extremely well. After implementing the solution the insurance agents only need to maintain their calendar in Notes, as it reflects their true calendar showing internal, external and personal appointments and meetings.

The solution provides true two-way synchronization, so any appointment planned from the CRM system shows up in the calendar in Notes. If the user reschedules the appointment, the corresponding appointment in CRM is automatically updated, and if the appointment is deleted, the appointment gets cancelled in the CRM system as well, with a follow-up activity being created to make sure a new meeting is planned. The personal calendar in Notes is also updated once a meeting is marked completed in the CRM system, to allow for automated expense reporting based on the personal calendars in Notes. As an added benefit of using the OnTime Group Calendar API, all insurance agents are now able to use their Notes client, their iNotes webmail or their mobile device to do their job, resulting in true mobility and added flexibility.

Below is an architectural drawing showing how it all integrates using Domino as a central application server.



Domino rules as an API application server platform

Over the last 1.5 years we at OnTime have been making a serious investment into rewriting the backend for the OnTime Group Calendar product. The most fundamental change has been the move from a traditional Notes/Domino application (if it ever was such) to a server/API-first approach. This means that there’s now a layer of separation between the business logic on the server and the business logic in the clients, namely the API. The API is now king!! It means that most, if not all, knowledge about the server operation and data storage has been removed from the clients, and server and clients may evolve independently of one another.

Another key benefit of an “API first” approach is that new exciting UIs and uses of the OnTime data have started to pop up, as it’s now easy to get going. As an application developer you do not have to worry about the OnTime backend or even worry about Domino. As long as you have HTTP access and can use JSON you’re all set.

An example of where you’ve seen the power of the API approach – maybe without thinking about it – is Mail and Calendar in Notes 8+. Think about how adding Java views to the “mail and calendar experience” in Notes 8+ has allowed it to leapfrog the rest of the client. This is because there’s a layer of separation between the data and the views, which allows them to evolve separately and at different paces.

Redesigning OnTime to an “API first” approach hasn’t been a smooth ride all along, and sure, there have been things we’ve changed along the way and the API backend has been rewritten more than once. The key however is that the interface to the API has only changed once, which really wasn’t so much a change as a move to a new, additional, data format for the response messages. This has been key as it allowed the clients to stay constant while the API changed.

The server side API has been written in Java all along, which has been a very good choice for us. The request and response messages started out being plain text in our own proprietary format, which worked out well in the beginning, but it became clear that the end user (programmer) still needed a fair amount of knowledge about OnTime to use the API. Over the last 7-9 months we’ve made a transition from first plain text in/plain text out, over plain text in/JSON out, to a place where we’re now completely JSON in/JSON out. Taking it slow has been important so as not to break any clients, meaning our own clients or any customer API programs that might be out there now that we’ve opened up the API for purchase.

One thing we’ve learned however is that API versions in the “real world” need to coexist and we really couldn’t rip and replace. We’ve now moved to a model where the user of the API sends an expected API version number along with the requests to signal what API version he/she expects to be returned, and the API will honor that. This means that we’ve managed to move along while still maintaining backwards compatibility. Furthermore we are quite confident that we will be able to further evolve the API while shielding customers and clients.
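
As a rough illustration of the pattern – and not the actual OnTime implementation, the field and class names here are hypothetical – a version-aware handler boils down to something like this: the client states the API version it expects, and the server picks a matching response serializer.

public class VersionedRequestHandler {

    // hypothetical serializer abstraction - one implementation per supported response format
    interface ResponseSerializer {
        String serialize(Object result);
    }

    static class PlainTextSerializer implements ResponseSerializer {
        public String serialize(Object result) { return String.valueOf(result); }            // legacy plain text format
    }

    static class JsonSerializer implements ResponseSerializer {
        public String serialize(Object result) { return "{\"result\":\"" + result + "\"}"; } // current JSON format
    }

    public String handle(int requestedApiVersion, Object result) {
        // honor the version the client asked for so existing clients keep working
        ResponseSerializer serializer =
                requestedApiVersion <= 1 ? new PlainTextSerializer() : new JsonSerializer();
        return serializer.serialize(result);
    }
}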

The real beauty of the API is the Domino application server though.

We have implemented and evolved the API from day one using nothing more than a standard Domino server – running the HTTP task if HTTP access to the API is required – and standard Java agents. No fancy OSGi plugins here (yet!) – everything is plain ol’ Domino. Keeping everything on Domino has also allowed us to easily leverage the API in an integration project we’re doing for a Danish customer at the moment (more on that in a later post). How? Using the web services framework available in Domino of course. Nothing fancy there. Just add a web service endpoint to respond to the SOA server and we were golden – it too scales extremely well.
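
For those who haven’t used the web services framework in Domino: a provider is just a Java (or LotusScript) class whose public methods are exposed as SOAP operations, with Domino generating the WSDL for you. The class and operation below are hypothetical and only meant to show how little is needed.

// Hypothetical Domino web service provider - configured as the PortType class
// in Domino Designer; each public method becomes a SOAP operation.
public class CalendarSyncService {
    public String getAppointments(String userName, String startDate, String endDate) {
        // the real implementation would call into the OnTime Group Calendar backend here
        return "{\"user\":\"" + userName + "\",\"appointments\":[]}";
    }
}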

Building on Domino hasn’t limited us in any way yet – quite the contrary. It has allowed us to add additional application endpoints using the ease-of-development approach of Domino, and it has allowed us to scale well beyond thousands of users (think 15,000+ users) without problems. All this while still allowing us to seamlessly service clients from Domino 7 and upwards. So sweet.

Below is the API stack we’re using – as you can see, Domino features prominently at the bottom of the stack.




So start building your own APIs for your Domino apps. It will be very much needed to get the full potential from Notes 8.5.4 Social Edition and the embedded experiences support in there. It’s all nothing if there is no API to read from. If only there were a Domino OAuth provider… Oh well – I’m sure it will all be there in good time.

Latest pet project

My latest pet project has been working with the OnTime Group Calendar API and trying to look into new product ideas and use-cases for calendar data. One thing that came up really quickly was a way to expose all the calendars in OnTime in iCal format, and using the API and ical4j it took all of about 10 minutes to do. So now we have the option to expose each user in OnTime Group Calendar as an iCal feed honoring the access rights of the logged-in user. Now that the data is available as iCal as well, remixing and reusing the data in custom scenarios is extremely easy, and the number of clients grew from our 6 to basically unlimited.
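
To give an idea of how little code this takes, below is a minimal sketch using the classic ical4j API. The event data is made up, and in the real agent each appointment coming back from the OnTime Group Calendar API would be mapped to a VEvent like this, with the result written to the HTTP response instead of a file.

import java.io.FileOutputStream;
import java.util.Date;

import net.fortuna.ical4j.data.CalendarOutputter;
import net.fortuna.ical4j.model.Calendar;
import net.fortuna.ical4j.model.DateTime;
import net.fortuna.ical4j.model.component.VEvent;
import net.fortuna.ical4j.model.property.CalScale;
import net.fortuna.ical4j.model.property.ProdId;
import net.fortuna.ical4j.model.property.Uid;
import net.fortuna.ical4j.model.property.Version;

public class ICalFeed {
    public static void main(String[] args) throws Exception {
        // calendar container with the required header properties
        Calendar calendar = new Calendar();
        calendar.getProperties().add(new ProdId("-//Example//iCal feed//EN"));
        calendar.getProperties().add(Version.VERSION_2_0);
        calendar.getProperties().add(CalScale.GREGORIAN);

        // one made-up event - in the agent each appointment becomes a VEvent like this
        Date start = new Date();
        Date end = new Date(start.getTime() + 60 * 60 * 1000); // one hour later
        VEvent meeting = new VEvent(new DateTime(start), new DateTime(end), "Customer meeting");
        meeting.getProperties().add(new Uid("20121015T101500-0001@example.com"));
        calendar.getComponents().add(meeting);

        // write the iCalendar data - the agent streams this to the HTTP response instead
        new CalendarOutputter().output(calendar, new FileOutputStream("feed.ics"));
    }
}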

Below is a screenshot example of adding OnTime calendars as subscribed calendars on my iPad.




Now, using multiple overlays on my iPad (or my iPhone) is just as horrible as using multiple Lotus Notes overlays (or other clients), which only justifies why we develop client UIs tailor-made for the purpose of group calendaring. But… The overlay option is very nice and allows me to quickly discover what a colleague is doing while not in another OnTime client. It even allowed me to expose my business calendar very easily to my wife. No more manual weekend calendar sync – win!!

I’m not saying this will ever become a part of OnTime Group Calendar but it sure is nice.

The code is written as an agent so it was very easy to deploy, and using an Internet Site rewrite rule I was able to get a nice URL per user. So now we have a URL like http://example.com/ical/jdoe@example.com for each user.

Now off to look for more ways to use the API.

Notes 8.5.3 and Sametime Web API woes

For one of our products we are using the Sametime Web API, which is a great way to Sametime-enable applications outside the Notes client. Basically it uses the web container running within Notes Standard to provide an HTTP API for Sametime awareness. Great. We had an issue however after upgrading from Notes 8.5.2 to Notes 8.5.3 where Sametime awareness stopped working, but thanks to the design partner program and the help of Khuan Hoe Kong from IBM we’re now back in business. As this issue may bite others I thought I would post it here.

Our problem was that we were using the “mystatus” and “getShortStatus” API calls to get awareness status, that is, we were using local calls like http://localhost:59449/stwebapi/mystatus/ and http://localhost:59449/stwebapi/getShortStatus. These calls returned JSON we could use to get awareness status, and as previously mentioned they worked fine in Notes 8.5.2, but after going to Notes 8.5.3 (and hence a newer version of the embedded Sametime client) they stopped working. Apparently these calls were considered a security risk, so they were blocked and needed to be unblocked in order to work. Below is the response from IBM.

For security reasons, many of the WEBApi functions are disabled by default, including the mentioned two. To enable the function, add a preference to the plugin_customization.ini in the following format:

com.ibm.collaboration.realtime.webapi.<function>Enabled=true

For example, to enable the getstatusshort, you would add this preference:

com.ibm.collaboration.realtime.webapi.getstatusshortEnabled=true

To enabled all the Web API functions you can use the global override:

com.ibm.collaboration.realtime.webapi/enableAllWebApisOverride=true

The details of this change may be found in ST8.5.1 SDK.

While the above solution works, it requires the customer to configure the client and hence won’t work for us. The funny thing is that while the getstatusshort method is blocked for “security reasons”, the getstatus method works just fine. The only difference is that the getstatus method returns more info. Makes no sense.

My main reason for considering this a bad approach is that I get an HTTP code 200 back but the response is empty/invalid. That seems wrong. A better REST solution would be to return a 403 Forbidden with an explanation why, which would in my mind have been the correct solution. I hope that can be incorporated into a future release.

But it works now so I will go back under my rock.