Note to self – script to update all my CLI stuff

npm_update_all_global() {
   # list globally installed npm packages as JSON, extract the package names and update each one
   npm list -g --json | jq -r ".dependencies | keys[]" | while read -r line; do
      npm -g update "$line"
   done
}
brew_update_all() {
   brew update && brew upgrade
}
heroku_update_all() {
   heroku update
}
sfdx_update_all() {
   sfdx update
}
lekkim_update_all_cli() {
   npm_update_all_global
   brew_update_all
   heroku_update_all
   sfdx_update_all
}

Loving streams in node.js

Node.js is a great platform for writing scripts. Besides being JavaScript and having access to npm, it lends itself very well to data processing as it’s completely async unless you specifically tell it not to be. One of the best aspects of node.js as a data processing language is, in my opinion, the concept of streams and using streams to process data. Using streams can drastically lower memory consumption by processing data as it comes down the stream instead of keeping everything in memory at any one time. Think SAX instead of DOM parsing.

In node.js using streams is easy. Basically data flows from Readable streams to Writable streams. Think producers of data and consumers of data. Buffering is handled automatically (at least in the built-in streams) and if a downstream consumer stops processing, the upstream producer stops producing – this is known as backpressure. Elegant and easy. Readable streams can be things like files or network sockets, and Writable streams things like files or network sockets… Or stdout, which in node.js also implements the Writable stream API. Working with streams is like being a plumber, so piping (using the pipe method) is how you connect streams.

An example always helps – the one below reads from alphabet.txt and pipes the data to stdout.

const fs = require('fs')
const path = require('path')

fs.createReadStream(path.join(__dirname, 'alphabet.txt'))
  .pipe(process.stdout)
> a
> b
> c

A simple example, but it works on small and large files alike without much difference in memory consumption.

Sometimes processing is required, and for that we use Transform streams (basically streams that can both read and write). Say we want to uppercase all characters. That’s easily done by piping through a Transform stream and then on to the Writable stream (stdout):

const {Transform} = require('stream')
const fs = require('fs')
const path = require('path')

fs.createReadStream(path.join(__dirname, 'alphabet.txt'))
  .pipe(new Transform({
    transform(chunk, encoding, callback) {
      // chunk is a Buffer 
      let data = chunk.toString().toUpperCase() 
      callback(null, data) 
    }
  }))
  .pipe(process.stdout)
> A
> B
> C

It’s easy to see how streams are very composable and how easy it is to add processing steps. The pipeline could even be assembled at runtime. The above examples use strings but streams can also work on objects if required.
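
For illustration, here’s a minimal sketch of an object mode pipeline (the sample objects and the letter property are just made up for the example):

const {Readable, Transform} = require('stream')

// a Readable in object mode pushing plain objects instead of Buffers
const source = new Readable({
  objectMode: true,
  read() {
    this.push({letter: 'a'})
    this.push({letter: 'b'})
    this.push(null) // signal end of stream
  }
})

source
  .pipe(new Transform({
    objectMode: true,
    transform(obj, encoding, callback) {
      // obj is a plain object, not a Buffer
      callback(null, obj.letter.toUpperCase() + '\n')
    }
  }))
  .pipe(process.stdout)
> A
> B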

Streams are beautiful but can take some time to master. I highly recommend reading up on streams and getting to know them. The “Node.js Streams: Everything you need to know” post is very nice and provides a good overview.

Happy coding!

Simplifying usage of Salesforce Lightning Design System using NPM and Express

Using the Salesforce Lightning Design System (SLDS) is a great and easy way to add some super nice styling to your app. It comes with some nice defaults and a responsive grid system like other frameworks such as Bootstrap. Where SLDS really shines is of course if you are already using Salesforce (I assume you’re already on Lightning right?!!?!?) or if you are going to. And again, who isn’t. Anyways… Using SLDS makes your apps really look like Salesforce, which is nice for Salesforce Lightning Components or for an app using Lightning Out to host Lightning apps in external applications.

I often use SLDS for another use-case, which is quickly doing a mockup for a new Lightning Component. Doing it in SLDS can often be way quicker than building it by hand from scratch or attempting to draw the component in PowerPoint or another tool.

Previously I’ve been using the download option for SLDS, i.e. grabbing the package off the website, expanding it and copying it into my app. Today I tried out the NPM solution, as I often use node.js and Express when I need to mock stuff up. Installing SLDS is as easy as doing an “npm install @salesforce-ux/design-system --save” and you’re all set. Mostly. The problem is of course that the assets you need end up in the node_modules directory, which is not exposed outside the app. The way I get around that is to add a static rule to my Express server.js as follows:

const express = require('express');
const app = express();
app.use('/slds', express.static(__dirname + '/node_modules/@salesforce-ux/design-system/assets'));
app.listen(process.env.PORT || 3000);

Then loading SLDS assets into my app is as easy as:

<link rel="stylesheet" type="text/css" href="/slds/styles/salesforce-lightning-design-system.css" />

This works great if/when you push your app to Heroku as well, and it has the added benefit of easy version management using NPM and package.json.
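
Since SLDS is now just another dependency in package.json, upgrading is a matter of bumping a version number (the version below is only an example):

"dependencies": {
  "@salesforce-ux/design-system": "^2.4.0"
}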

JSONata looks very nice

While JSON is a very nice and concise data format, it lacks the structure and query capabilities that XML has with XPath. Oftentimes querying JSON leads to line after line of code doing proper error checking, retrieving the proper value and – if need be – falling back to a default value. Meet JSONata! JSONata is a query language plus so much more. I invite you to look at the slides from the recent IBM tech talk on the matter or visit the JSONata Exerciser to try it out.

JSONata is also available as an NPM module.
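
A minimal sketch of what that looks like (the sample data is mine, and note that evaluate() is synchronous in the 1.x versions of the module):

const jsonata = require('jsonata')

const data = {example: [{value: 4}, {value: 7}, {value: 13}]}

// compile the expression once, then evaluate it against the data
const expression = jsonata('$sum(example.value)')
console.log(expression.evaluate(data))
> 24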

Software Dependency Management and the associated risks

Being a Maven convert and a guy who likes to dabble in programming, I find this topic very interesting, albeit not one I’ve thought much about – and I guess that’s true for most of us. Or let’s put it another way: once you start using Maven, npm, pip or whatever other dependency management tool fits the job, you think of dependency management as a done deal. Not having to download a jar / package manually makes it easier and thus, for some reason, less worrisome to add a dependency. Until this morning, when I read a great post titled Developer Supply Chain Management by Ted Neward. If you’re a programmer and use Maven or npm or pip or any other automated dependency management tool you really should read it.

And if you use it as part of your product development cycle you should read it. Twice… And then act – part of which is talking to the rest of the team about it.

Thinking about dependency management and how to store dependencies should probably come back front and center, and this should be a lesson to us all. If nothing else, implement a local – dare I say on-premises – caching dependency and/or artifact server so that all dependencies are cached, stored and backed up locally (in a datastore you control). And enforce that all automated build servers download through that artifact server, so that every dependency that goes into a build is known, cached and kept.
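
With npm, for example, routing all package downloads through such a server is a one-line client configuration (the URL below is a placeholder for your own artifact server):

# .npmrc on developer machines and build servers
registry=https://artifacts.example.com/npm/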

It’s definitely something to think about.