July 05, 2016

10 Tips For Optimizing Node.js Applications

by Jscrambler
Tags: Node.js

In this post, you will learn 10 useful tips for optimizing a Node.js application.

Always use asynchronous functions

Okay, why should we always write asynchronous functions? Because this is the best part of Node.js: asynchronous functions perform non-blocking I/O, so the CPU is not left idle while waiting for reads and writes to finish. For an application that performs a lot of I/O, this simple habit makes the servers work harder and better, because a server doing non-blocking I/O can handle other requests while one of them is waiting on I/O. See the example:

var fs = require('fs');

// Performing a blocking I/O: nothing else runs until the file is read
var file = fs.readFileSync('/etc/passwd', 'utf8');
console.log(file);

// Performing a non-blocking I/O: the process stays free while reading
fs.readFile('/etc/passwd', 'utf8', function(err, file) {
    if (err) throw err;
    console.log(file);
});

Use the async module for better function organization

One of the challenges of working with asynchronous functions is handling multiple chained callbacks. Long chains of callbacks make the code ugly and difficult to read; this is the classic callback hell.

var fs = require('fs');

// A common callback hell example
fs.readFile('/etc/passwd', 'utf8', function(passwdErr, passwd) {
    if (passwdErr) throw passwdErr;
    fs.readFile('/etc/hosts', 'utf8', function(hostsErr, hosts) {
        if (hostsErr) throw hostsErr;
        fs.mkdir(__dirname + '/test', function(dirErr) {
            if (dirErr) throw dirErr;
            var data = passwd + hosts;
            fs.writeFile(__dirname + '/test/data', data, 'utf8',
                function(err) {
                    if (err) throw err;
                    console.log('Done!');
                });
        });
    });
});

To avoid this callback hell, you can use the async module, a great library with plenty of helpers for composing multiple asynchronous functions. See this example, which eliminates the callback hell:

// You must install async first: npm install async
var async = require('async');  
var fs = require('fs');

async.waterfall([  
    function(callback) {
        fs.readFile('/etc/passwd', 'utf8', callback);
    },
    function(passwd, callback) {
        fs.readFile('/etc/hosts', 'utf8', function(err, hosts) {
            if (err) {
                return callback(err);
            }
            var data = passwd + hosts;
            return callback(null, data);
        });
    },
    function(data, callback) {
        fs.mkdir(__dirname + '/test', function(err) {
            if (err) {
                return callback(err);
            }
            return callback(null, data);
        });
    },
    function(data, callback) {
        fs.writeFile(__dirname + '/test/data', data, 'utf8', callback);
    }
], function(err) {
    if (err) {
        console.log(err);
        return;
    }
    console.log('Done!');
});

At first sight this code looks longer than the callback version, but async.waterfall organizes the tasks in a flat array and passes each result to the next function, which reads much better than deeply nested callbacks.
Another useful feature of async is async.parallel(), which executes asynchronous tasks in parallel.

Use ES6 Generators to organize asynchronous functions

There is another way to avoid callback hell: ES6 generators. A generator can pause at each yield and resume when the asynchronous result is ready, which lets asynchronous code read almost like synchronous code. Note that a generator does nothing on its own: the asynchronous calls must yield Promises, and a small runner has to resume the generator when each one settles. Here is how the previous example looks with this approach:

var fs = require('fs');

// Turn a callback-style function into one that returns a Promise
function promisify(fn) {
    return function() {
        var args = Array.prototype.slice.call(arguments);
        return new Promise(function(resolve, reject) {
            fn.apply(null, args.concat(function(err, result) {
                err ? reject(err) : resolve(result);
            }));
        });
    };
}

var readFile = promisify(fs.readFile);
var mkdir = promisify(fs.mkdir);
var writeFile = promisify(fs.writeFile);

function* fileTask() {
    var passwd = yield readFile('/etc/passwd', 'utf8');
    var hosts = yield readFile('/etc/hosts', 'utf8');
    var data = passwd + hosts;
    yield mkdir(__dirname + '/test');
    yield writeFile(__dirname + '/test/data', data, 'utf8');
    console.log('Done!');
}

// Minimal runner: resumes the generator when each yielded Promise resolves
function run(generator) {
    var task = generator();
    (function step(value) {
        var result = task.next(value);
        if (!result.done) result.value.then(step);
    })();
}

run(fileTask);

Use Node.js to only send data

Node.js servers perform best when they send only data instead of full HTML pages, which is why the platform became so popular for REST APIs. In general this kind of application works with JSON, a format native to JavaScript; since Node.js runs JavaScript, serializing and parsing JSON is cheap and there is no template rendering step, so keeping the traffic between server and clients as plain JSON improves the server's performance.

Use Nginx or Apache for static servers

Don’t waste your Node servers by letting them serve static files; servers like Nginx and Apache handle that task better than Node.js. The reason is simple: Node.js shines at processing data, while Nginx and Apache offer extensive configuration options for serving static files, along with useful caching strategies. As said before, use your Node servers for processing data.
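As an illustrative sketch (the paths and ports are assumptions, not a recommendation for your setup), an Nginx configuration along these lines serves static assets directly from disk and proxies everything else to the Node application:

```nginx
server {
    listen 80;

    # Serve static assets directly, with client-side caching
    location /static/ {
        root /var/www/myapp;
        expires 7d;
    }

    # Forward everything else to the Node.js application
    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
    }
}
```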

Avoid cookies and sessions

Cookies and sessions are techniques for storing temporary state on the server, and keeping that state is expensive. Today it is very common to build stateless APIs that rely on token authentication such as JWT or OAuth. These tokens are kept on the client side, which frees the servers from managing state.

Use cluster module for parallel processing

The cluster module is native to Node.js. It forks multiple processes of the application, with a master process acting as a load balancer that distributes incoming requests among the worker processes. This technique optimizes your servers by putting every CPU core to work in parallel.

Enable Streaming responses

The stream module is native too, and it allows large data to be streamed in a response. This is very useful when the server needs to send videos, audio, or any other large payload, because a stream sends the data in small chunks instead of buffering all of it in memory, which keeps the server's memory usage under control.

Always use the latest stable version

This tip may sound obvious, but always use the latest stable version of Node.js, because each release brings improvements to the V8 JavaScript engine, which often include better memory and CPU optimizations.

Optimize… but don’t forget to protect

Remember, optimization is essential, but it is not enough. You have to think about security too.

Protect Your Node.js App

At Jscrambler we help companies all over the world optimize and protect their Node.js applications. Sign up for our free trial here and let us know if you have any questions. We have a team of experts ready to help you!