Because software solutions rarely operate in a vacuum, integration is a necessary fact of life for many developers. Sometimes it’s easy. Anyone who has integrated an application into Slack, for example, will have been treated to an incredibly smooth experience. In many cases it’s as simple as filling in a form (a URL or two, an authentication key) and hitting the Submit button. That’s plain awesome.
But then you have the complex integrations, the ones you find in the Fortune 5000, that might involve weaving together third-party products numbering in the double digits. These are the environments where the degree of customization and the complexity of the business logic you can implement are determined by the openness and comprehensiveness of each vendor’s API.
When the time comes to stitch together those APIs and communicate with the outside world, the process of integration moves well past configuration and starts looking a lot like code. This is where external scripting tools and API work come in. Historically, this has been done using DOS batch files, Linux shell scripts, Perl, Python, and so on.
However, there’s a new tool of choice for these scenarios: Node.js. Why should you use Node.js for linking third-party APIs, publishing external APIs, integrating private and public cloud environments, automating deployments, and other glue-code projects? Here are five reasons why Node.js is exceptionally valuable for integrating complex and hybrid environments.
1. Node.js is a lingua franca
Node.js runs JavaScript, arguably the closest thing the software world has to a lingua franca today. And because Node.js code is interpreted at runtime rather than compiled ahead of time, it remains human readable, a key reason scripting languages are such a boon for integrators. Node.js offers extremely good performance to boot.
2. Node.js has modules, lots of modules!
An enormous part of the popularity of Node.js is the vast community of contributors, enabled by the externalization of reusable code as modules, which can then be incorporated using a require() statement. The simplest take the format:

var myName = require('external-module');
The number of published modules is staggering. If you’re trying to write code to interface with a popular third-party application, chances are someone has already done it and published a module and documentation that enable you to achieve your objective in a few lines of code. MySQL and MongoDB are good examples.
This is incredibly important; after all, the goal is to have an effective system up and running with the minimum of additional code (no matter how much fun coding in Node.js can be).
Vendors too are getting in on the game. It’s now very common to see wrappers and clients for tools and applications available as Node modules. Twitter’s client “SDK” is an excellent example.
Here’s an example of a module we publish to provide an easy way for users to build and send events to our Moogsoft AIOps system:

var MoogEvent = require('node-moog').MoogEvent;

var myEvent = new MoogEvent();
myEvent.description = 'My new description';
Node modules are a fantastic way for vendors to expose their APIs and functionality, and to have them incorporated natively in Node.js code.
3. JSON is native
JSON is human readable, which is useful both to the humans who are trying to deploy expeditiously and to those who maintain the result. Even when you’re trying to debug the condensed form of JSON, you can draw on a variety of online tools that make the work easier. My favorite is JSONLint.
JSON is far easier to interpret than XML and is gradually edging out XML as the de facto data interchange format for the web. Cloud-first solution providers almost universally employ JSON as the default payload.
An example payload:

{"title": "example"}

can be handled thusly:

var message = JSON.parse(payload);
console.log(message.title); // Will output "example"
This ability to handle data interchange between external systems in the same way you would handle data within the code dramatically speeds integration efforts.
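A quick round-trip sketch of that symmetry (the object and its field names here are purely illustrative):

```javascript
// Build a native object, serialize it for the wire, then parse it back.
var outgoing = { title: 'example', severity: 3 };

var wire = JSON.stringify(outgoing);   // '{"title":"example","severity":3}'
var incoming = JSON.parse(wire);

console.log(incoming.title);                          // example
console.log(incoming.severity === outgoing.severity); // true
```

What goes over the wire to an external system and what lives inside your code are, for practical purposes, the same thing.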
4. REST is native too
OK, so REST is not native, per se, but for all intents and purposes it might as well be.
Node.js has native support for HTTP/HTTPS, so it’s simple to do a GET or a POST to a RESTful endpoint. Even if a vendor doesn’t offer a Node.js module to do the job for you, it might at least offer sample code you can cut and paste.
In the worst-case scenario, if the vendor is being steadfastly “language agnostic,” it will almost certainly offer you an example using curl. With a little practice, it’s easy to figure out how to translate the curl arguments into a Node.js HTTP or HTTPS request.
Of course, you can turn to a variety of Node modules that will provide fully featured REST connections while hiding the more complex workings. (Node.js’ HTTP/HTTPS API is actually very low-level in order to ensure there are no functional limitations.)
Why is REST so important? In the same way JSON is becoming the de facto data-interchange format, RESTful web services are rapidly becoming the de facto web-friendly protocol—so much so, that for many vendors, REST has become a synonym for API.
At Moogsoft, our bots have built-in REST capability, so integration with other applications and web services that offer REST endpoints is a breeze. We’ve also implemented a RESTful server, so external applications can interact with the system.
Speaking of which, Node.js’ HTTP/HTTPS module also offers server capabilities, so a Node.js application can listen for, and respond to, REST methods.
If you want to take advantage of an application’s outgoing REST support and offer a complex and rich web service, it’s worth looking at the Express Node module, which makes writing web servers quick and easy. The Express framework powers many of the internet’s most significant websites.
REST and its event-driven cousin, the webhook, are not only great for building intersystem APIs, but also for creating commands and tools. Look to them for chatops integration as well.
5. Node.js modules are easy to publish and install
I mentioned how modules enrich Node.js and support a thriving developer community. Another key feature that makes Node.js so compelling for the systems integrator is how easy it is to publish and access the modules.
Thanks to Node.js’ built-in package manager NPM, distributing and accessing Node modules is supremely easy. A contributor creates a package.json file containing details and dependencies, then simply “pushes” the module to the NPM public repository, where it becomes immediately available.
Downloading the module is equally simple; you can access ours from the command line:
$ npm install node-moog
As an old Unix guy, I’m very comfortable with the command line; in fact it’s my default. Node.js is ideal for creating command-line tools too. Arguments are easy to process (much like shell scripts and batch files) and NPM takes care of installation.
If you’ve created a CLI tool called myTool, for example, enter the following:
$ npm install -g myTool
This command will install myTool globally, and it will be immediately available from the command line. Thus, Node.js is particularly useful for creating tools like sandbox wrappers for chatops commands and for creating scripts for HA scenarios, archiving, reporting, and so on.
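A minimal sketch of such a tool (the myTool name comes from above; the --greet flag is invented for illustration, and a real package would also declare a "bin" entry in its package.json so that npm install -g puts the script on the PATH):

```javascript
#!/usr/bin/env node

// process.argv holds the arguments, much as $1, $2 ... do in a shell script.
// Slice off "node" and the script path to get just the user's arguments.
var args = process.argv.slice(2);

if (args[0] === '--greet') {
  console.log('Hello, ' + (args[1] || 'world'));
} else {
  console.log('usage: myTool --greet [name]');
}
```

Running myTool --greet Alice would then print Hello, Alice, with no argument-parsing boilerplate beyond a slice of process.argv.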
New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to firstname.lastname@example.org.
This story, “5 reasons Node.js rules for complex integrations” was originally published by InfoWorld.