GDPR

So I’m not usually a guy who enjoys legalese and toying with paragraphs, but I must admit that the GDPR interests me, both as a consumer and as a professional. As a consumer I find it a great initiative to protect my rights and privacy, and I find the privacy regulations and the added responsibility put on service providers to be a welcome change. With the economic penalties outlined in the legislation, the GDPR has to be respected. And I think it will be – maybe once the initial battles have been fought.

As a professional I have a different take on it. While also interesting, the burden put on companies is very big, and the challenges that have to be solved can seem somewhat insurmountable. Thinking about data in CRM, ERP, file shares, web site logs, e-commerce and data from POS terminals, to name but a few, makes this potentially a very big thing. What does it mean to allow transparency and data portability? What does it mean to be forgotten? With an IP address being considered PII (personally identifiable information), even core systems like web site logs and tracking systems become subject to change. How do I even figure out where these pieces of information are stored? It’s indeed a great challenge. At least for B2C companies – it will most likely be much less burdensome for B2B.

To make matters worse, the GDPR legislation was adopted by the EU on 27 April 2016 and becomes enforceable from 25 May 2018 after a two-year transition period. And yet we are only really starting to take it seriously now. How can that be? I’m starting to see this as the next Year 2000 problem, but whereas Y2K was taken seriously a long way out, this seems to have been mostly ignored. At least from where I sit. It will be very interesting to follow.

The project I’m on now is actually about transitioning a series of black-box consumer signup systems into a transparent Salesforce Service Cloud installation for a customer, while ensuring double opt-in and keeping records of consent. We are on a pretty tight schedule to be ready for 25 May, but it’s looking okay as the scope is pretty well defined. If this had been for the entirety of the customer data it would have been much worse. Now, the project is much bigger than this, but it’s interesting how it took the GDPR to get them going – maybe that was a good thing, as it probably helped their business case internally.

There once was a product called Hannover…

<rant>

Once upon a time – think 2006’ish – a complete revamp of a product was unveiled. The revamp went under the codename Hannover, after the name of the city it was unveiled in. The revamp was to blow the competition away and make the supplier of the product rule the world with the new product platform, new technologies and all the amazing stuff the client would be able to do. And it was amazing. It was like magic and provided access to new and amazing features, and it exposed great APIs that allowed developers to build sweet apps to bridge the gap between the proprietary world of yesteryear and the new internet era. It was built on a proven open source platform using a proven industry standard programming language that many developers knew. It could be said that the language was the lingua franca of its time. To make it even better, the client would be backwards compatible and run all the apps of its predecessors – like all the way back to the very first version of the product from the good ol’ DOS days. In many ways it was almost too good to be true.

It did not, however, quite turn out the way the supplier had hoped. There was a problem with all this goodness. Not in the product. Not in the ambitions. Not in the chosen platform. In many respects it was a good idea and a good launch, and the product delivered in most – if not all – of the areas where it had promised new and amazing solutions.

The problem was in the application developer support. They failed the product. Or maybe more to the point – the supplier failed the developers.

For the last 5-10 years nothing much had happened on the platform. Sure, the platform had adopted JavaScript and Java, and sure, it had brought incremental improvements to the appdev experience. A new feature here. A new simple action there. But nothing massive. Then the supplier threw this completely new way of developing apps on the market. The change was massive, and all the supplier would talk about was all these new capabilities – why wouldn’t they? The problem was that they had lulled developers to sleep with stories of declarative programming, and of how visual programming and laying out elements on screen was enough. One client to rule them all. Simple actions and formulas to solve complex issues. But then, all of a sudden, developers were expected – from one day to the next – to grasp component development (called JSR 168 portlets at the time), data coming from different backend sources, UI threads, async programming, regexes and low-level widget development.

I’ll go out on a limb and state that the product was a failure. Sure, customers migrated to the new client, but many did so hesitantly. And it took a long time. Many never reaped any of the benefits of the new platform and ended up jumping ship.

Some developers did make the jump, however, as they were real developers. But many did not. It sounded too complex to them, and it was. It didn’t align with the world they knew. They were businesspeople trained to be developers – not developers by trade, which was really a requirement. They chose to ignore the revamped product and all the features it brought. So all the good effort, energy, time and money put into the product fell by the wayside. Sure, it was used by some (including yours truly), but for many it was never adopted. So sad…

But why is this important and why am I writing it now?

Well, to be completely honest, it’s just a brain dump of thoughts. I find it interesting how often, as time passes, I see the same pattern reappear. Different products, different ways they try to reinvent themselves and different results. In my opinion, completely reinventing the way things are done in a product – without making absolutely sure you either address the new message to the right audience or make sure the story is complete when told – is doomed to fail. Of course some products are better than others, and some suppliers are better at listening than others. But it’s interesting to watch.

But boy, that Hannover product could have ruled the world if the supplier had been better at preparing developers beforehand and making sure they got on board.

</rant>

That moment when grown men start crying…

OK, I admit it. I cried. A little… There are many emotional moments in a life, but seeing the successful launch of the SpaceX Falcon Heavy is one of them. I cannot help but be impressed by the work it must have taken to a) design the thing, b) build the thing, c) launch the thing and d) bring the freaking boosters back to Earth. Oh, and the secret e) the social media coverage and small touches, from the Tesla Roadster on board, Starman, the “Don’t Panic” text on the screen, the plaque of names and, last but not least, “Life on Mars?” by David Bowie playing alongside. Wow! Just wow!

I highly recommend watching the video of the launch with the intro, or if you’re not quite up for it, the actual launch from T-10 seconds below it.

Salesforce Release Notes

So I’m reading through the Salesforce Spring 18 release notes for a customer project starting later this week, looking for specifics on changes related to GDPR. I’m so happy for search, as the release notes are a whopping 459 pages. What?! Seriously? Oh yes…

The way I tend to go through the release notes is to get the PDF version, as I find it easier to skim and search through. The release notes are also available in HTML, which can be good if you know you’re only looking for, say, Lightning-related or AppDev-related changes.

Only other piece of advice – make sure you’re sitting comfortably and have coffee at the ready!

Salesforce Spring 18 release notes (PDF)
Salesforce Spring 18 release notes (HTML)

Hello WordPress World!

So I’ve finally made the move from the old trusted blogging platform I’ve been using since 2004 (!!) to WordPress.com. So no more hosting it myself, but instead paying a provider to host it in the cloud (WordPress.com). I’ve never really used WordPress besides looking at it a bit, so this should be a fun adventure.

The conversion was done by yours truly using a script I wrote to convert from the old proprietary XML format to the WordPress format. The script is written in TypeScript and can be found on GitHub if anyone is interested. To start out with, the script would convert posts, comments, tags, images and files, but I was finding the import of images pretty unstable, so for the final import I scrapped that part and just uploaded all images and files manually.

While I’ve been testing, the support from WordPress.com has been very cool and very helpful. Thank you.

The final stats for the conversion were as follows:
Posts: 1633
Comments: 2542
Files: 142
Images: 402

All this since 2004, from my first post ever, on adding additional tuners to my MythTV box, to my last on the Pebble platform in 2018, on unsigned integers on Arduino.

Thank you, Pebble. I’ll soon be putting the old server to rest in the recycle bin on some old VMware server somewhere. We had fun. You’ve followed along for the last 14 years – let’s see what the future brings.

Note to self – subtracting unsigned integers

I was having a small Arduino problem this weekend involving counters, so I had to do some research on subtracting unsigned integers from one another in the context of counter rollover. There is an interesting solution to that problem using modulus UINT_MAX+1 (unsigned subtraction is well defined and wraps around). Thank you to this answer on Stack Overflow (Is unsigned integer subtraction defined behavior?). Below is my test code.

#include <stdio.h>

unsigned char counter = 0;
unsigned char previousCounter = 0;
unsigned int interval = 47;

int main () {
   for (int x = 0; x < 1000; x++) {
      unsigned char currentCounter = counter++; // wraps back to 0 after 255
      // unsigned subtraction is well defined - the difference is computed modulo UCHAR_MAX+1,
      // so the check works even after the counter has rolled over
      if ((unsigned char)(currentCounter - previousCounter) >= interval) { // check for rollover
         printf("Reached interval at %d (%d), %d\n",
            currentCounter, previousCounter, (unsigned char)(currentCounter - previousCounter));
         previousCounter = currentCounter;
      }
   }
   return 0;
}

Solving my Sonos morning alarm problem

I jumped on the Sonos bandwagon a while back and have a couple of Sonos players around the house. One in the bathroom, one in the kitchen, one in the living room etc. I’ve been using them with alarms in the morning (weekdays only), so that the bathroom one tuned in to local radio (streamed, of course) at 5.30am, the kitchen one joined in at 6am and then the upstairs one at 6.30am when it’s time to wake the kids. The problem with this setup is that Sonos does not offer a way to let alarm players join the same group, meaning that eventually / sometimes they will be slightly apart in the streaming of the audio, which is really annoying. I’d been looking for a way around this for a while but didn’t find one until now.

My solution has been to use node-sonos-http-api running on a Raspberry Pi, together with cron. Very simple really. Basically, node-sonos-http-api offers up a local HTTP server on port 5005 that accepts GET requests to perform actions on players, such as playing, tuning in to TuneIn stations, joining groups etc. So I have a couple of cron jobs that first thing make sure the bathroom player leaves any other group it might be in, set the volume and then tune in to the radio station. Later the kitchen player’s volume is set and the player joins the bathroom player, and so on. Very cool, and it works like a charm.
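To give an idea of what that looks like, here is a sketch of the crontab. The room names, volumes and TuneIn station id are made up for illustration, and the URL paths assume node-sonos-http-api’s /{room}/{action} scheme – check the project’s README for the exact actions your version supports.

```shell
# Weekday mornings only (Mon-Fri). Each job chains curl requests with &&
# so later actions only run if the earlier ones succeeded.

# 5:30 - bathroom leaves any old group, sets its volume, tunes in to the radio
30 5 * * 1-5 curl -s http://localhost:5005/Bathroom/leave && curl -s http://localhost:5005/Bathroom/volume/15 && curl -s http://localhost:5005/Bathroom/tunein/s24861

# 6:00 - kitchen sets its volume and joins the bathroom player's group
0 6 * * 1-5 curl -s http://localhost:5005/Kitchen/volume/20 && curl -s http://localhost:5005/Kitchen/join/Bathroom

# 6:30 - upstairs joins in to wake the kids
30 6 * * 1-5 curl -s http://localhost:5005/Upstairs/volume/25 && curl -s http://localhost:5005/Upstairs/join/Bathroom
```

Because the later players join the first one’s group instead of streaming independently, they stay perfectly in sync.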

The biggest problem turned out to be getting node and npm installed on the Raspberry Pi 2 I’m using, but downloading the binaries manually and installing them by hand (unzipping really) instead of using apt-get did the trick. Then I simply run the node-sonos-http-api server on startup and have cron do the requests using curl, chaining multiple requests together using && where necessary, i.e. when I need to change volume and join a group in one cron job.
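For the manual node install, the steps were along these lines. The version number is just an example from around that time – pick a current armv7l build from nodejs.org/dist for your Pi.

```shell
# Download the official ARM binary build of node (includes npm)
wget https://nodejs.org/dist/v8.9.4/node-v8.9.4-linux-armv7l.tar.xz

# "Install" is really just unpacking and copying into /usr/local
tar -xf node-v8.9.4-linux-armv7l.tar.xz
sudo cp -R node-v8.9.4-linux-armv7l/* /usr/local/

# Verify that both node and npm are on the path
node -v && npm -v
```

From there it’s a plain `npm install` in the node-sonos-http-api directory and a startup entry to launch the server on boot.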