Connecting Databases in Node JS

 
    Every web server needs to store data, either in files or in RAM. Each has drawbacks. RAM is limited and costly, and it is volatile memory, meaning the data is lost when the device is switched off.
Files, on the other hand, make reading and writing a time consuming process. To change a single line of a file you would need to rewrite the whole file again and again.

As a solution for efficient storage, databases were designed. A database is another system that writes data to your disk, but not in the form of plain files. This makes manipulating data less CPU intensive.

This article aims to introduce you to the method of connecting various databases with your NodeJS app.

In any basic NodeJS application you will use one or more of the following three kinds of databases:
  • SQL (or Relational) Databases - MySQL, PostgreSQL etc.
  • NoSQL Databases - MongoDB, DynamoDB etc.
  • In-Memory Databases - Redis
Adding databases to your NodeJS application is very easy. We will look at some of the modules you can use for connecting databases.

P.S. Before running the examples, make sure your databases are running on their default ports (MongoDB - 27017, Redis - 6379, MySQL - 3306).

MongoDB
     The npm repository hosts some excellent MongoDB connectors. A famous one is the mongoose library, which we will also use in future articles. Install the mongoose module by running

> npm install --save mongoose                                                                   

in your project root directory. 

Now open your server.js file and write the following code.

server.js

"use strict";

const express = require('express');
const path = require('path');
const mongoose = require('mongoose');

var app = express();
let PORT = 3000;

function connectToDatabases(){
  mongoose.connect('mongodb://127.0.0.1/myDatabase', function(){
    console.log("Connected to MongoDB database myDatabase");
  });
}

connectToDatabases();

app.listen( PORT, function(){
    console.log("Server started listening on port: "+PORT);
});

Here we create the server as before and add another function, connectToDatabases(), which initializes all the databases, and then call it.
The mongoose module provides the mongoose.connect() method to connect to any MongoDB database. It accepts the URL where the database is hosted and a callback function, which is called after the connection is established.

On running this you should see something like:


Note that the database connection completes after the server has started. This is because the connection is established in the background.
Once established, the connection is shared across the entire mongoose library.

Here you connect to the URL:

mongodb://127.0.0.1/myDatabase                                                              

Here the mongodb:// protocol tells the driver that this is a MongoDB server, which listens on port 27017 by default. It is the same as:

mongodb://127.0.0.1:27017/myDatabase                                                            

where we specify the port explicitly. myDatabase is the database name which we will use for storing data.

If you have enabled authentication on your database, then you can connect using:

mongodb://username:password@127.0.0.1:27017/myDatabase                        
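These URL pieces can also be assembled programmatically. Here is a hypothetical helper (buildMongoUrl is our own name, not part of mongoose) that sketches building the connection string, URL-encoding the credentials so special characters survive:

```javascript
// Hypothetical helper: assembles a MongoDB connection string from parts.
function buildMongoUrl({ host, port = 27017, db, user, password }) {
  // Credentials are URL-encoded so characters like '@' survive in the URL.
  const auth = user
    ? encodeURIComponent(user) + ':' + encodeURIComponent(password) + '@'
    : '';
  return 'mongodb://' + auth + host + ':' + port + '/' + db;
}

console.log(buildMongoUrl({ host: '127.0.0.1', db: 'myDatabase' }));
// mongodb://127.0.0.1:27017/myDatabase
```

The same helper produces an authenticated URL when user and password are supplied.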

Mongoose is a very popular and large module that is used extensively in many apps. Its usage will therefore be dealt with in later articles. 

Redis
Using redis is fairly simple compared to MongoDB. We will use the redis module for NodeJS. Install it using:

> npm install --save redis                                                                          

You can connect to your redis instance from the server.js file as follows:

"use strict";

const express = require('express');
const path = require('path');
const redis = require('redis');

var app = express();
let PORT = 3000;

function connectToDatabases(){
  let client = redis.createClient({host: '127.0.0.1', port: 6379});
  client.on('connect', function(){
    console.log("Redis Connected");
  });
  client.auth("MyDatabasePassword");
  global.client = client;
}

connectToDatabases();

app.listen( PORT, function(){
    console.log("Server started listening on port: "+PORT);
});

We first require the redis module and connect to the database using the redis.createClient() function, passing the host and port where the database is hosted. The connection is stored in the client object, which fires a ‘connect’ event when the connection is successful. 

If your database has a password, you can log in using client.auth(), which accepts the password for authentication. The problem with this module is that executing queries requires the client object. Hence we make the client object global so that all files can access it without needing to reconnect.
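If you prefer to avoid globals, the same sharing can be achieved by caching the client in module scope. This is a hypothetical sketch of the pattern (getClient is our own name; the stub factory stands in for redis.createClient):

```javascript
// Hypothetical alternative to a global: cache the client in module scope.
// Every file that require()s this module then shares the same instance.
let cachedClient = null;

function getClient(createFn) {
  if (!cachedClient) {
    cachedClient = createFn(); // e.g. function(){ return redis.createClient(...); }
  }
  return cachedClient;
}

// Demo with a stub factory: both calls return the same object,
// so only one "connection" is ever created.
let created = 0;
const a = getClient(function () { return { id: ++created }; });
const b = getClient(function () { return { id: ++created }; });
console.log(a === b, created); // true 1
```

Exporting getClient from a db.js-style module gives every file the same client without polluting the global object.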

Querying using this module is very simple. Suppose you want to store a key value pair:


client.set(<key>, <string value>, function(err, reply){
  // err is null if successful
  // reply contains the response from the database
});

// For storing hset
client.hset(<key>, <set_key>, <value>, function(err, reply) {...});

To retrieve the values:
client.get(<key>, function(err, values){...});

There are many other functions available, described in the official documentation. These methods are very simple to understand.
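For hashes, a plain JavaScript object maps onto hset calls field by field. The helper below is a hypothetical sketch of that mapping (toHashFields is our own name, not part of the redis module):

```javascript
// Hypothetical helper: flattens an object into [field, value] pairs
// suitable for storing as a redis hash via client.hset(key, field, value, cb).
function toHashFields(obj) {
  return Object.keys(obj).map(field => [field, String(obj[field])]);
}

const user = { name: 'Alice', age: 30 };
console.log(toHashFields(user));
// [ [ 'name', 'Alice' ], [ 'age', '30' ] ]
```

Note that redis stores every value as a string, which is why the helper stringifies each value.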

MySQL
The module for MySQL is very similar to the redis one. Install the module by using:

> npm install --save mysql                                                                                      

Write the following in server.js file:

"use strict";

const express = require('express');
const path = require('path');
const mysql = require('mysql');

var app = express();
let PORT = 3000;

function connectToDatabases(){
  let connection = mysql.createConnection({
    host: '127.0.0.1',
    user: '<username>',
    password: '<databasePassword>',
    database: 'myDatabase'
  });
  connection.connect();
  global.connection = connection;
}

connectToDatabases();

app.listen( PORT, function(){
    console.log("Server started listening on port: "+PORT);
});

As with redis, we create a connection object by passing the host, username, password and database name to the mysql library, and then make this connection object global.

To execute various queries on the database use:


connection.query("SELECT * FROM Person", function(err, results){ .. });

results is an array containing the rows of the result in table form.
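Concretely, each element of results is one row, with columns accessible by name. The sample data below is hypothetical, standing in for what connection.query() would return:

```javascript
// Sample rows shaped like the `results` argument of connection.query()
// (hypothetical data, not from a real database).
const results = [
  { id: 1, name: 'Alice' },
  { id: 2, name: 'Bob' }
];

// Each element is one row; columns are accessed by name.
const names = results.map(row => row.name);
console.log(names); // [ 'Alice', 'Bob' ]
```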

This is how you connect to various databases in NodeJS. There are connectors for other databases as well, which you can find on npmjs. Without such connectors you would need to write low-level database interfacing code yourself.

A detailed article on Mongoose will be covered in the next article.

Serving Static Files using Express in Node JS

 
Static files form a main component of the World Wide Web we know. Every site has at least one HTML page. To serve these files to the client you can use servers like Apache, or any server of your choice. Here we will see how to send files from NodeJS. We could send files through a plain http server, but this article demonstrates the use of ExpressJS, as you will be using it more. 

First of all setup a project structure as shown below:



The index.html file inside the public directory contains some HTML code. You can write whatever HTML you want. Lazy?? Just copy-paste this page’s source code. 

Since we will be using ExpressJS, make it a habit to install the following modules when creating a project:
  • express
  • body-parser
  • cookie-parser
cookie-parser is used to parse request cookies into the req.cookies object. Just remember this; we’ll use it later.

Naive Approach
In the server.js file, write the following:

server.js

"use strict";
const express = require('express');
const path = require('path');

var app = express();
let PORT = 3000;

var bodyParser = require('body-parser');
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({extended: true}));

app.get('/', function(req, res){
  res.sendFile(path.join(__dirname, 'public', 'index.html'));
});

app.listen( PORT, function(){
    console.log("Server started listening on port: "+PORT);
});

In the / route handler, we use the res.sendFile method to send the file to the client. This method requires an absolute path, which is why we use the path module.

Start the server and visit localhost:3000 in your web browser. You should see something like this:

But wait. No styling has been done even though we included a css file. Check the developer console of your browser.


It says that index.css was not found. But it was there in the folder, right?

The problem is that when the HTML references any CSS or JS file, the browser actually sends another GET request to the server, at the route http://<domain name>/index.css, and we don’t have any handler set for the /index.css route. To solve this add the following in the server.js file:


app.get('/index.css', function(req, res){
  res.set('Content-Type', 'text/css');
  res.sendFile(path.join(__dirname, 'public', 'index.css'));
});

Here we create a handler for the /index.css route which sends the CSS file. We also need to specify the Content-Type header so that the file is interpreted correctly.
Now run the server and check the site:


You can add a similar endpoint for index.js. When a request is made to these endpoints, the server reads the requested file and sends it with the correct Content-Type header.

Now restart the server and go to “http://localhost:3000”. It works fine now.

This is a tedious task. If there are hundreds of static files, you would need to write an endpoint for each one. That becomes very time consuming and boring, and a waste of developer time. To solve this problem, the developers of ExpressJS added support for serving static files directly from express, without having to write an endpoint for each file. We only write endpoints for the URLs we need to customise; all the dependent URLs, such as CSS, JS, images, audio and video, are handled by ExpressJS. 

Pretty cool…!!! Let’s see how to do this.

Adding Express Static Magic

This is simple to do. Modify the server.js file as below, i.e. remove all the CSS and JS endpoints and tell the app to use the express.static() function.


"use strict";

const express = require('express');
const path = require('path');

var app = express();
let PORT = 3000;

var bodyParser = require('body-parser');
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({extended: true}));

app.use(express.static(path.join(__dirname, 'public')));

app.get('/', function(req, res){
  res.sendFile(path.join(__dirname, 'public', 'index.html'));
});

app.listen( PORT, function(){
    console.log("Server started listening on port: "+PORT);
});

Observe that we removed all the unnecessary endpoints from our app. Instead, we tell the app to use the static feature of ExpressJS. This function accepts one argument -- the folder in which the app should search for static files. Since all our static files are stored in the public folder, we pass the path to the public directory as the argument to the static function.

Now whenever a request is made to our server, it first checks whether the URL matches a file in the public directory. If it finds a file, the file is sent with the correct headers. If not, the app looks for other handlers for the URL. If no handler is specified, the server returns a 404 - Not Found status to the client.

We have kept the GET / endpoint in our code because we want to send the HTML page when a request is made to the / URL. If we removed this handler, you would need to go to “http://localhost:3000/index.html” to see your HTML page.

The express.static() function is a middleware exposed by the express module. We will see more about middlewares in the later tutorials.

Is it necessary to put the files in public folder only?
No. You can name the folder whatever you want; just adjust the argument to express.static() accordingly. It is recommended to put all the static files in a single folder. If you serve static files from the root folder, you risk exposing your server code and other project files.

This is how you can easily serve static files to the clients using ExpressJS.


Express Framework in Node JS

 
Express Framework
     Previously we saw how to build a basic web server using the http module of NodeJS, but it had some drawbacks regarding code maintainability. Hence a library/module for NodeJS was developed -- ExpressJS -- which can handle the routes and handlers of your server. 

As we said, ExpressJS is a NodeJS module but not a core one. This means we need to install express for our application.

Install ExpressJS by running “ npm install express --save ” in the terminal in your project directory. That is all we need to do. 

Now we will create the same server we previously built using ExpressJS.
Change your server.js file to:

server.js

"use strict";

const express = require('express');
var app = express();
let PORT = 3000;

app.get('/articles', function(req, res){
    res.end("Code to display articles here");
});

app.get('/stats', function(req, res){
  res.end("Stats will be displayed here once we complete our app");
});

app.use(function(req, res){
  res.end("More logic to be implemented");
});

app.listen( PORT, function(){
    console.log("Server started listening on port: "+PORT);
});

Here we require the express module and create a new instance of express, which will be our server. On this server, we specify a handler for each URL. 

All the handlers should be assigned to their routes before app.listen() is called. The order in which they are declared also matters, as routes are matched from top to bottom: the first handler whose route matches is executed, independent of any later handlers.
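This top-to-bottom, first-match behaviour can be sketched with a tiny dispatcher (hypothetical, not express internals):

```javascript
// Tiny sketch of top-to-bottom route matching (hypothetical, not express internals).
const routes = [
  { path: '/articles', handler: () => 'articles page' },
  { path: '/stats',    handler: () => 'stats page' }
];

function dispatch(url) {
  // The first route whose path matches wins; later routes are never consulted.
  const route = routes.find(r => r.path === url);
  return route ? route.handler() : 'More logic to be implemented';
}

console.log(dispatch('/articles')); // articles page
console.log(dispatch('/missing'));  // More logic to be implemented
```

The fallback branch plays the role of the app.use() catch-all in the server code above.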

Restart your server and open your web browser and visit the previous urls. You should get the same responses as before.



As with the http module, the handler function receives req, res and an optional next argument, which you can use to achieve various things. The req object stores information about the request, like the URL, params and body, while the res object stores info about the response. 

Different request methods
     In the previous example we saw how to handle a GET request. Now we need to handle POST requests. In a POST request the body is sent as a string; for ease of processing we will parse it into a JSON object. For parsing we will use another module - body-parser. Install the module using

> npm install body-parser --save                                                               

Now we need to tell the Express app to use this module. So in the server.js file, before we declare the routes, add the following lines:

let PORT = 3000;

// Add this code
var bodyParser = require('body-parser');
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({extended: true}));
// Till here

app.get('/articles', function(req, res){
...

For testing our POST request we will use an app called Postman (you can download it from https://www.getpostman.com/).

Add a handler to accept a POST request:


app.post('/post-request', function(req, res){
  res.json(req.body);
});

Once we use the body-parser module, the req.body object in the handlers contains the body of the POST request. In this handler we send the body of the request back as the response in JSON form.

 Restart the server and send a post request:



Hit send and you should see a response like:



You can send other requests like PUT, DELETE etc. apart from GET and POST using Postman, and handle them using Express handlers.


Web Server on Node JS

 
     In the previous articles we saw various features of NodeJS. Now it is time to sew everything together and start building our web server. Before we start, we should know what role the URL plays in a server.

For this you should know the different parts of a URL:
The domain is a human-readable address of the web page; it is translated to the actual IP address of the server by DNS. The route of the URL decides what to execute on the server. Query and params carry the data which we send via the URL.

Simple Web Server
To create a basic web server, we will use the http module. This is one of the core NodeJS modules used for building a web server. It lets you inspect the routes and params of a request and execute functions accordingly. 

Create a server.js file with the following code:

server.js

"use strict";
const http = require('http');
let PORT = 3000;

let server = http.createServer( function(req, res) {
    res.end("You have visited "+req.url);
});

server.listen( PORT, function(){
    console.log("Server started listening on port: "+PORT);
});

Now run this code, you should see something like this:


Note that the script does not exit and keeps running. To see the server in action, open your favorite web browser and visit http://localhost:3000 or http://127.0.0.1:3000 depending on how your system is configured. You should see something like this:

Now let us understand the code.
First we require the http module to use it in our application. We then define a PORT on which the server should listen. A port is a communication endpoint in an operating system; various processes interact with each other using ports. When developing your web servers, try to use ports greater than 1024, as lower port numbers are reserved by the operating system.

Then we create a HTTP server using http.createServer() which accepts a callback function that acts as the handler for our server. All the HTTP requests sent to our server will be handled by this callback function. The logic inside this handler function will be executed on every request. 

The server.listen() function takes two arguments -- PORT number and callback function. The callback function is executed when the process is started and is listening on the specified PORT. It will be executed only once in the process lifetime i.e. until the server crashes or is stopped.
Try changing the last part of the URL and observe how the response changes.

The req.url property returns the route part of the URL, not the complete URL, since the domain name depends on the system where the server is hosted. 
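If you need to separate the route from the query portion of req.url, the built-in WHATWG URL class can do the splitting (the base origin below is just a placeholder, since req.url carries no host):

```javascript
// req.url contains only the route and query, e.g. '/articles?page=2'.
// The WHATWG URL class splits it; the base origin is just a placeholder.
const reqUrl = '/articles?page=2';
const parsed = new URL(reqUrl, 'http://localhost');

console.log(parsed.pathname);                 // /articles
console.log(parsed.searchParams.get('page')); // 2
```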

To add some logic to our server, modify the handler function as follows:


let server = http.createServer( function(req, res) {
    if(req.url === '/articles'){
        res.end("Code to display articles here");
    }else if(req.url === '/stats'){
        res.end("Stats will be displayed here once we complete our app");
    }else{
        res.end("More logic to be implemented");
    }
});

Restart the server by terminating the node process and starting it again, then visit the different routes in your browser:

  


This is how a Web Server is built on NodeJS. 

This is all well and good, but a large web application will have many routes with different methods and different logic. Using the http module for such large applications makes the code very complicated. In the next article we will see how to use ExpressJS to simplify the server code.



Streams in Node JS

 
     Streams are flows of data from one part of code to another. NodeJS provides the stream module to create streams, but it is cumbersome to use directly, so we will demonstrate the power of streams through one of their applications. The fs module we already know uses streams internally. This article shows you why and how to use them.

     Previously we saw File IO in NodeJS. File IO is a very time consuming task (for a computer). It requires searching a location on the disk, reading properties, reading 1s and 0s, determining what the data is, processing it, and so on. The File IO methods we have seen so far work fine when the file is relatively small, say up to 100 MB. If the file is large, the IO time becomes large, which can stall the system if care is not taken.


     Our current model works as shown above. The file we need to process is first brought into RAM as a whole. This takes a finite amount of time (say 20s in the example above). Then the data is processed by some function of ours, which also takes time (say 4s at a speed of 50MB/s). So after a user clicks a button, he would need to wait a total of 24 seconds before seeing anything. This is not desirable.

     Say the file contains an article. Instead of the above method, we can read it line by line or paragraph by paragraph. Loading the whole file still takes the same amount of time, but the user sees the first paragraph within a short time (say 0.12s), and while he is reading it, the other paragraphs arrive one by one. The user perceives the system as fast because his wait time is reduced.

Hope this gives you an idea of how streams work.

In NodeJS, streams are event emitters which emit different events, but only a few are frequently used, namely: 
  • data
  • end
We will use the same myTextFile.txt in our code. You can make that file large so that you can see the effect. 

Reading using Streams
Change the server.js code to read using stream.

server.js
"use strict";
var fs = require('fs');
var readable = fs.createReadStream('./myTextFile.txt');

var string = "";
readable.on('data', function(chunk){
     string += chunk;
});

readable.on('end', function(){
    console.log(string);
    console.log("Finished reading file");
});


Running this code will give you an output like:
     We first create a readable stream from myTextFile.txt using the fs module. Whenever a chunk of data is ready to be processed, the stream emits a ‘data’ event. On every data event we append the current chunk to our string; you can add your processing logic inside this handler so that the data is processed in chunks. After the stream has finished reading the file, it emits an ‘end’ event, at which point we display the data on the console. 

Similar to the readable stream, there is a writeable stream. This is very helpful when data is transferred over the network and you need to write it into a file: instead of waiting for the data to be transmitted completely before writing, you can write it to the file as soon as it arrives.

Example of writeable stream

"use strict";
var fs = require('fs');
var writeable = fs.createWriteStream('./myTextFile.txt');

writeable.write("This is some new text");
writeable.end();

In practice, however, you will rarely use streams to write files like this. Instead you will use a combination of a readable and a writeable stream to read from a large file and write to another file. 

"use strict";
var fs = require('fs');
var readable = fs.createReadStream('./myTextFile.txt');
var writeable = fs.createWriteStream('./myTextFileCopy.txt');

readable.on('data', function(chunk){
     writeable.write(chunk);
});

readable.on('end', function(){
    console.log("Finished copying");
});

Streams also provide a pipe function to simplify the above pattern. The example can be written as: 

"use strict";
var fs = require('fs');
var readable = fs.createReadStream('./myTextFile.txt');
var writeable = fs.createWriteStream('./myTextFileCopy.txt');
readable.pipe(writeable);

Pipe allows you to directly transfer the chunks from one stream to another.


File IO in Node JS

 
     In any system, file input/output services are a crucial functionality. In NodeJS, file IO is required for serving HTML pages, letting clients download files, or building other tools with NodeJS apart from web servers. Following are some of the file IO methods frequently used by developers.

The examples shown consider the following file structure:


The server.js file will hold our code to execute and myTextFile.txt will contain some text which we will read or write. For file IO, NodeJS provides the fs module, which is included in the NodeJS core modules; you do not need to npm install it.

myTextFile.txt
Hello..!! I am inside myTextFile.txt                                                                     

Reading a file
The fs module provides two functions to read files. One is synchronous, i.e. it runs on the main thread (a blocking operation), and the other is asynchronous. Here is the basic syntax for using them:

Synchronous Reading
"use strict";
const fs = require('fs');
var readFileMainThread = function(){
   var content = fs.readFileSync('./myTextFile.txt');
   console.log(content.toString());
};
readFileMainThread();

Asynchronous Reading
"use strict";
const fs = require('fs');
var readFileAsync = function(){
   fs.readFile('./myTextFile.txt', function(err, content){
       console.log(content.toString());
   });
};
readFileAsync();

In both cases, the content is returned as a Buffer. To display the actual content you need to convert it into a string. Buffers hold the exact bits in which the data is stored on the computer -- 1s and 0s. When you print a buffer to the console, it shows hexadecimal numbers whose binary representation is how the data is stored on the disk.
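A quick way to see this is to build a Buffer by hand and print it both raw and decoded:

```javascript
// A Buffer holds raw bytes; toString() decodes them as UTF-8 text.
const buf = Buffer.from('Hi');

console.log(buf);            // <Buffer 48 69>  (hexadecimal byte values)
console.log(buf.toString()); // Hi
```

0x48 and 0x69 are simply the ASCII codes of 'H' and 'i'.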

Writing a file
Writing is also similar to reading. The fs module provides a synchronous method as well as an asynchronous method to write something to a file.

Synchronous Writing
"use strict";
const fs = require('fs');
var writeFileMainThread = function(){
   var content = "This is my new content to write to a file";
   fs.writeFileSync('./myTextFile.txt', content);
};
writeFileMainThread();

Asynchronous Writing
"use strict";
const fs = require('fs');
var writeFileAsync = function(){
   var content = "This content will also be written to a file. But Asynchronously";
   fs.writeFile('./myTextFile.txt', content, function(err){
       if(err){ /* Some error has occurred while writing */ }
       else { /* Successful */ }
   });
};
writeFileAsync();

In both cases, the first argument is the path of the file to which the content is written. If the file does not exist, it is created automatically. The second argument is the content to be written; here we pass a string instead of a buffer (as is returned when reading).

Checking Existence of Files and Folders
Many a time you save some dynamic data into a file. If you then try to read a non-existent file, your application will crash. To prevent this, it is good practice to first check whether the file exists. Although the fs module provides both synchronous and asynchronous methods for this, the synchronous one is more commonly used. This is how you use it:

"use strict";
const fs = require('fs');
var readFileAfterChecking = function(){
    if(fs.existsSync('./myTextFile.txt')){
       var content = fs.readFileSync('./myTextFile.txt').toString();
       console.log(content);
    }else{
       console.log("File does not exist");
    }
};
readFileAfterChecking();

The fs.existsSync() function returns a boolean: true if the file exists, else false. It can be used to check the existence of both files and folders; just pass the folder path as the argument.

Creating a directory
Sometimes your program needs to create a directory, whether in a server or a command line tool. The fs module provides functions to create a directory on your disk. Again, both synchronous and asynchronous variants are available.

"use strict";
const fs = require('fs');
var createDirectorySync = function(){
    if(!fs.existsSync('./myDirectory')){
       fs.mkdirSync('./myDirectory');
    }else{
       console.log("Directory already exists");
    }
};
createDirectorySync();

The fs.mkdirSync() function will throw if the folder already exists. Hence it is advisable to call it only when the directory does not exist.

Please note that to create a folder, its parent folder must exist, i.e. if you are trying to create the /folder1/child1 folder, you need to ensure that folder1 exists. The fs.mkdir function will not create the whole path by itself if it does not exist.

Reading Contents of a Directory
The fs module provides two functions to read the contents of a directory. The synchronous one is frequently used together with JavaScript's forEach() function.

Synchronous Method
"use strict";
const fs = require('fs');
var readDirectorySync = function(){
    if(fs.existsSync('./myDirectory')){
       fs.readdirSync('./myDirectory').forEach(function(content){
           console.log(content);
       });
    }else{
       console.log("Directory does not exist");
    }
};
readDirectorySync();


Asynchronous Method

"use strict";
const fs = require('fs');
var readDirectoryAsync = function(){
    if(fs.existsSync('./myDirectory')){
       fs.readdir('./myDirectory', function(err, contents){
           contents.forEach(function(content){
              console.log(content);
           });
       });
    }else{
       console.log("Directory does not exist");
    }
};
readDirectoryAsync();

The fs.readdirSync() function returns an array containing the names of the files and folders present in the directory. IT JUST RETURNS THE NAMES. You have to determine whether a particular name belongs to a folder or a file. Once we get the array of contents, we iterate over it with JavaScript's forEach() function.

Checking if file or folder
In the previous function we had a problem: fs does not directly tell us which entry is a directory and which is a file. To find out, you need another function, fs.statSync().

"use strict";
const fs = require('fs');
const path = require('path');
var readDirectorySync = function(){
    if(fs.existsSync('./myDirectory')){
       fs.readdirSync('./myDirectory').forEach(function(content){
           var stat = fs.statSync(path.join('./myDirectory', content));
           if(stat.isDirectory()){
               console.log(content + " is a directory");
           }else{
               console.log(content + " is a file");
           }
       });
    }else{
       console.log("Directory does not exist");
    }
};
readDirectorySync();

The fs.statSync() function returns an object having an isDirectory() function, which returns true if the passed path is a folder and false otherwise. fs.statSync() requires the full path of the entry you are testing; otherwise it won't work.

Deleting a file
The fs module provides a function fs.unlink to delete a FILE (only a file) from the disk. Again it has two variants, synchronous and asynchronous.

"use strict";
const fs = require('fs');
var deleteFileAfterChecking = function(){
    if(fs.existsSync('./myTextFile.txt')){
       fs.unlinkSync('./myTextFile.txt');
       // File deleted
    }else{
       console.log("File does not exist");
    }
};
deleteFileAfterChecking();

If you try to delete a nonexistent file, the script will crash, so be careful.

Deleting a folder
The fs module has a function fs.rmdir which accepts the path of the directory to be removed as its argument. It has two variants - synchronous and asynchronous. Note: this function can only delete empty directories. If the directory is not empty, you need to remove its contents first, using fs.unlink on files and fs.rmdir on child directories.


"use strict";
const fs = require('fs');
var deleteFolderAfterChecking = function(){
    if(fs.existsSync('./myFolder')){
       fs.rmdirSync('./myFolder');
       // Folder deleted
    }else{
       console.log("Folder does not exist");
    }
};
deleteFolderAfterChecking();

That is a lot about file IO in NodeJS.

Global Events in NodeJS

 
       We have seen what events and event emitters are in NodeJS, and how to create and use our own events. This article exposes you to some of the predefined events in NodeJS which play a crucial role in your application's life cycle. 

Following are some of the global events in NodeJS.

Open your server.js file and try the following snippets to see each event in action. These events are emitted on the process global object. To register a listener for an event, we use

process.on(<event name>, listenerFunction)

exit
This event is emitted whenever a NodeJS process is terminating, either because the event loop has run out of work to do or because process.exit() was called explicitly.


"use strict";

var i = 0;
setInterval(function(){
  console.log(i);
  i++;

  if(i==2){
    process.exit();
  }
  
}, 1000);

process.on('exit', function(code){
  console.log("Exiting: "+code);
});


SIGINT
        SIGINT stands for SIGNAL INTERRUPT. This event is triggered whenever the user interrupts a Node process by pressing Ctrl+C (on Windows, Linux and Mac alike). The listener you define overrides the default behavior, i.e. stopping the process. That is why we add a process.exit() inside the listener; otherwise the process would never stop on an interrupt.


"use strict";

var i = 0;
setInterval(function(){
  console.log(i);
  i++;
}, 1000);

process.on('SIGINT', function(){
  console.log("Exiting process");
  process.exit(0);
});


It will keep printing integers until we interrupt it by pressing Ctrl+C.


On pressing Ctrl+C, our listener is executed, which first prints the message defined in the listener. After this the process exits because of process.exit().

warning
       This event is emitted whenever the current NodeJS process emits a warning, which you can do yourself with the process.emitWarning() function. Many database modules use this event when some configuration mismatch has occurred. Prefer it over console.log() when developing modules of your own.


"use strict";

var i = 0;
setInterval(function(){
  console.log(i);
  i++;
  if(i==3){
    process.emitWarning("i has reached 3");
  }
}, 1000);

process.on('warning', function(){
  console.log("warning received");
});


As you can see, when i becomes 3, a warning is emitted which prints (node:22536) Warning: i has reached 3. After this our listener is called, which prints warning received. The (node:22536) prefix makes the message stand out from your other logs; 22536 is the process id the OS assigned to the NodeJS process, and it will be different every time you run this.
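process.emitWarning() also accepts a second argument giving the warning a type, which replaces the default Warning label in the printed prefix. A small sketch (the type name ConfigWarning is illustrative):

```javascript
"use strict";

process.on('warning', function(warning){
  // The listener receives an Error-like object with name and message
  console.log(warning.name + ": " + warning.message);
});

// The second argument customizes the warning type shown in the prefix
process.emitWarning("config value missing", "ConfigWarning");
```

This prints (node:<pid>) ConfigWarning: config value missing on stderr, followed by our listener's own output.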

uncaughtException
      This event is emitted whenever a function throws an error that is not caught with a try-catch block. It is very powerful, as it lets you keep your app from crashing even after an error has occurred, and to perform some custom action with the error, like logging it or sending it to the developer team. If you catch an error yourself, this event is not emitted.


"use strict";

process.on('uncaughtException', function(err){
  console.log(err);
});

var validString = `{"website": "compiletimeerror.com"}`;
var invalidString = `{"website" =  "compiletimeerror.com"}`;

var validJSON = JSON.parse(validString);
var invalidJSON = JSON.parse(invalidString);


As you can see, since invalidString is not a valid JSON object, NodeJS throws an error when you try to parse it. Since we are not catching this error, it shows up in uncaughtException. Note that the listener must be registered before the error is thrown; without such a listener, the process crashes with the default stack trace. Since we do not want a server to crash on error, we can write cleanup or restart logic within this uncaughtException event handler.

Now see the same thing with try catch.


"use strict";

var validString = `{"website": "compiletimeerror.com"}`;
var invalidString = `{"website" =  "compiletimeerror.com"}`;

try{
  var validJSON = JSON.parse(validString);
  var invalidJSON = JSON.parse(invalidString);
}catch(err){
  console.log("Some error");
}

process.on('uncaughtException', function(err){
  console.log(err);
});

In this example we surround the parsing functions with a try-catch block. Whenever some error occurs, the try-catch block catches it and prevents the server from crashing. In such cases, since the exception is handled, the uncaughtException event is not triggered.


As you can see, the catch block executes and prints "Some error" instead of the error message.

These are the most frequently used process events in NodeJS. There are others, less commonly used; you can find more information about them in the official NodeJS documentation.

