Handling HTTP POST Requests with Express.js


Learn how to access the body of an HTTP POST request using the Express.js framework and body-parser module.

Forms are a common component in web applications. When a user submits a form, that data is sent to the back-end for processing. To process that data, the web server must understand how to access it. Popular web server languages include Java, .NET, PHP, Python and Node.js. In this article, we’ll learn how to access the POST data sent to a Node.js web server using the Express.js framework. To get started, you can go ahead and clone the following GitHub repository: Handling POST requests with Express and Node.js.

You’ll find instructions on how to run the code on the GitHub page.


The package.json for this project is pretty straightforward, and we’ll only need the body-parser and express Node.js modules. We also create a scripts property so that running the example code requires a simple command: npm start.

Requiring the modules we need – Example # 1:

In Example # 1, we’ve imported the Node.js modules that we need. The Express module takes care of the heavy lifting with regard to fulfilling web requests. NOTE: If you’re not familiar with the Express Node.js module, please see my earlier blog post on this subject:  Set up a Node / Express Static Web Server in Five Minutes.

We also import the body-parser Node.js module, which has the critical role of parsing the body of an HTTP request. When it comes to processing a POST request, this is important. And the path Node.js module helps Express construct a file path.
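
A minimal sketch of those require statements might look like this (the variable names are illustrative, not necessarily the ones used in the repo):

```javascript
// Pull in the modules described above
var express = require('express');
var bodyParser = require('body-parser');
var path = require('path');

// Create the Express application instance
var app = express();
```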

bodyParser.json and bodyParser.urlencoded – Example # 2:

Now, here in Example # 2, we tell express to use the bodyParser.json middleware, which provides support for parsing of application/json type post data. We also tell express to use the bodyParser.urlencoded middleware, which provides support for the parsing of application/x-www-form-urlencoded type post data.
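
Hedging a bit on the exact options used in the repo, that middleware setup likely looks something like this:

```javascript
// Parse JSON request bodies (application/json)
app.use(bodyParser.json());

// Parse URL-encoded request bodies (application/x-www-form-urlencoded)
app.use(bodyParser.urlencoded({ extended: false }));
```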

Creating the Node.js web server – Example # 3:

In Example # 3, we use express.static to set up the static assets folder, the main purpose of which is to help the working example function in a browser, with minimal effort. For more information on express.static, please see my earlier blog post on Express mentioned above. In this example, we use the app.post method, which tells the Express module to wait for an HTTP request at the /form route that leverages the POST HTTP verb. So, when the user sends a POST request to the /form route, Node.js executes the provided callback, which is the second argument passed to the app.post method.

The app.post callback takes two arguments, the first of which is the request object (i.e. “req”). The second is the response object (i.e. “res”). We use the res.setHeader method to set the Content-Type header to application/json, which tells the user’s browser how to properly handle the returned data from the request.

NOTE: We wrap the rest of the callback code in a setTimeout, the purpose of which is to mimic a slow internet connection. Otherwise, the working example will move too fast for most to comfortably follow.

Inside the setTimeout, we use the res.send method to send the response body back to the user; here we’re sending a serialized JSON object. To construct this object, we access the body property of the req object (i.e. the request object), which is why we have implemented the bodyParser.json middleware. This is what allows us to parse the properties of the request body. In this example, we are expecting firstName and lastName POST parameters, which allows us to access the req.body.firstName and req.body.lastName properties to build the JSON for our response.
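
Pieced together from the description above, the server code probably looks roughly like this (the static folder name, the exact log statement and the 2000 ms delay are assumptions):

```javascript
// Serve the demo page from a static folder (folder name is an assumption)
app.use(express.static(path.join(__dirname, 'www')));

// Wait for POST requests to the /form route
app.post('/form', function (req, res) {
  res.setHeader('Content-Type', 'application/json');

  // Log the parsed POST parameters to the terminal
  console.log(req.body);

  // Artificial delay, to mimic a slow connection
  setTimeout(function () {
    // req.body is available because of the body-parser middleware
    res.send(JSON.stringify({
      firstName: req.body.firstName,
      lastName: req.body.lastName
    }));
  }, 2000);
});

app.listen(3000);
```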

To see this code in action, just follow these steps:

  1. Clone the GitHub repository: https://github.com/kevinchisholm/video-code-examples/tree/master/node-express/handling-POST-requests-with-express
  2. Follow the instructions in the readme to set up the code
  3. Point your browser to: http://localhost:3000
  4. In the web page, enter some text into the two input boxes, and then click the “Submit” button
  5. Notice the logging statement in your Node.js terminal
  6. Notice that the text you entered is displayed in a browser message

You might also want to take a look at the Network tab in your Web Developer Tools, which allows you to see the actual network request that goes to the web server. You’ll be able to inspect the POST data sent, and the JSON data returned.

Viewing the working code example

Here’s what happens when you submit the data in the browser:

  1. The JavaScript in www/js/form-handler.js makes an AJAX POST call to the route: /form.
  2. The object sent in the POST request is: {firstName: XXX, lastName: XXX}. (NOTE: “XXX” is whatever value was entered into the form’s text inputs.)
  3. Our Node.js web server intercepts the HTTP request to /form.
  4. Our Node.js web server parses the body of the HTTP request and constructs a JSON object.
  5. The response to the AJAX call is this JSON object.
  6. The browser then displays the data from this JSON object.

Nothing too fancy here, just illustrating the “round trip” of our HTTP POST request.
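
For reference, the client-side call in www/js/form-handler.js likely resembles the following sketch (the element IDs and the use of XMLHttpRequest are assumptions; the repo may do this differently):

```javascript
// Grab the form values (element IDs are hypothetical)
var firstName = document.getElementById('first-name').value;
var lastName = document.getElementById('last-name').value;

var xhr = new XMLHttpRequest();
xhr.open('POST', '/form');
xhr.setRequestHeader('Content-Type', 'application/json');

xhr.onload = function () {
  // The server echoes back a JSON object built from req.body
  var data = JSON.parse(xhr.responseText);
  alert('Hello ' + data.firstName + ' ' + data.lastName);
};

xhr.send(JSON.stringify({ firstName: firstName, lastName: lastName }));
```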


In this article, we learned how to handle POST requests with the Express Node.js module, and we talked about the need for bodyParser.json and bodyParser.urlencoded. We also learned how to listen for a POST request to a specific route, and how to access the POST parameters in the HTTP request body. Now, while the working example is simple, it does allow you to inspect every step of the process. If you look at your browser’s network tab, you can see the HTTP POST request go out, and then return. What happens during the server-side processing of that request is what you see in our Node.js code: server.js.

So, a lot to digest at first, but I’m hoping that this will get you started with your next form-based Node.js application!

Introduction to Express.js, the Node.js web application framework


Express.js provides the kind of abstraction that lets you stay out of the weeds and focus on your application code.

While the low-level nature of Node can be an asset, it can also be somewhat of a curse because when you’re serving static assets, it can be tedious to detect routes and serve the correct static assets for an entire web page. Some examples of static assets are CSS files, images or JavaScript files. Now, the good news is, Express is a Node module that provides abstraction for these kinds of challenges. It’s popular, it’s used by large companies, and there’s strong community support for it, all of which make this a solid choice.

Why Express?

The main goal of Express is to provide an application framework, and getting started is simple. Take a look at the code samples, which you can clone from the following GitHub repository: Introduction to Express.js, the Node.js web application framework. You’ll find instructions on how to run the code on the GitHub page.


The package.json for this project is simple: the single dependency is express.

The get() Method – Example # 1

In Example # 1, we call the get() method of the app variable, and we pass the string “/” to the get method. This tells Express.js how we want to handle any GET request to the root of the application. (NOTE: GET is an HTTP verb; other examples are POST and PUT.) In Example # 1, we are sending back a simple HTML page, and we have created the HTML by constructing a string that represents that HTML code. This happens in the “HTML” variable. We then call the send() method of the response object, sending our HTML back to the user. Now run Example # 1 in your terminal with the command node example-1.js, then navigate to http://localhost:3000/ in your browser. There you will see “This is Example # 1”.
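
A sketch of what Example # 1 describes (the variable name and the exact markup string are assumptions):

```javascript
var express = require('express');
var app = express();

// Handle GET requests to the root of the application
app.get('/', function (req, res) {
  // Build the page by constructing a string of HTML
  var html = '<!DOCTYPE html><html><body><h1>This is Example # 1</h1></body></html>';
  res.send(html);
});

app.listen(3000);
```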

The use() method – Example # 2

Example # 2 is much shorter than Example # 1 because we have not hard-coded our HTML. Instead, we have used the use method of the app object, which tells Express which folder we want to use for serving static assets. As a result, our JavaScript code is cleaner, and easier to read. Also, we’ve separated concerns. In other words, instead of hard-coding our HTML in our JavaScript file, we’ve put HTML where it belongs: in an HTML file.

Now notice how the web page in Example # 2 has an image. I included that to point out how Express handles this for us, even though we never had to write any JavaScript code that specifically waits for an image request. There is also a CSS file being served. In both cases, Express understands that www is our public web folder and it serves up static assets as needed, which certainly saves us a lot of time. Now run Example # 2 in your terminal with the command node example-2.js, then navigate to http://localhost:3000/ in your browser. There you will see “This is www/index.html”, which is a major improvement, as the HTML that the user sees is actually served from a static HTML file.
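
The whole of Example # 2 likely boils down to something like this (the www folder name comes from the article; the use of path.join here is my own assumption):

```javascript
var express = require('express');
var path = require('path');
var app = express();

// Serve everything inside the www folder (HTML, CSS, images) as static content
app.use(express.static(path.join(__dirname, 'www')));

app.listen(3000);
```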

Adding Handlers for a Second Route – Example # 3

In Example # 3, we use the get() method to add a handler for when the user requests “/about“. In this case, we serve up “/www/about.html“, which is just one example, but we could have added any specific route handlers needed. See the sketch below for roughly what this handler looks like. Now run Example # 3 in your terminal with the command node example-3.js, and navigate to http://localhost:3000/ in your browser. There you will see “This is www/index.html”. Now, click “About” in the upper-right-hand corner, to display the “About Us” page. You can then click “Home” and “About” over and over, to switch routes, because our JavaScript code in example-3.js handles both of these routes.
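
The extra route handler might look like this sketch, added to the code from the previous example (res.sendFile and the path handling are my assumptions; the repo may do this differently):

```javascript
// Handle GET requests for the /about route
app.get('/about', function (req, res) {
  res.sendFile(path.join(__dirname, 'www', 'about.html'));
});
```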


In this article, we learned the absolute basics of Express, but in doing so, we also got to see how simple it is to use. In our discussion we saw examples of the get() method, as well as the use() method. I’m hoping that this was enough to illustrate the power behind this Node.js web application framework, and that it will get you well on your way to enjoying its usefulness.

Create a Node Websocket Server in Five Minutes


Leveraging Express.js and the ws NPM module, it is possible to create a Node Websocket Server in less than ten lines of code.

The Websocket protocol provides full-duplex communication channels over a single TCP connection. In the past, web clients had to employ long-polling or the repeated pinging of a server in order to achieve this kind of “push” functionality. Now, Websocket technology eliminates the need for such outdated techniques. So, when a Websocket client connects to the server, a persistent connection is created, and the Websocket server can then push notifications to all connected clients. It is possible for the clients to send messages to the Websocket server as well, but I’ll cover that in a later article.

In this article, I’ll explain the bare-minimum code needed to create a Node Websocket server that can broadcast all incoming messages to connected clients. This should take us about five minutes, and less than ten lines of code. The beauty of Express.js is that it takes care of the heavy lifting with regard to the actual web server. The ws NPM module also plays a starring role in that it handles the Websocket communication layer, allowing us to expose an endpoint that accepts connections and messages from clients. Plus, we can broadcast messages to connected clients.


Above is the contents of package.json. There are only two dependencies: the Express.js framework and the ws module.

The Node Websocket Server – Example # 1

So, here in Example # 1 we have the entire working application. On line #s 3 through 9 we create our dependencies. I’ve grouped things in a way that I hope makes sense, but I’ll just point out that on a high level there are two things happening here. We require the modules that we need as constants: http, express, and WebSocket. Also, we create the constants app, server and websocketServer. These constants are the results of expressions. Now if you’ve ever worked with Express.js before, the app constant should be familiar to you; it’s simply an instance of the Express framework. The server constant is the result of calling http.createServer(), passing it our Express.js instance (i.e. “app”). And finally, the constant websocketServer represents our Websocket server.

Now let’s jump ahead for a moment to line # 30, where we start our web server. It’s not that there’s much going on here; it’s just that I wanted to point out that the server is started by calling the server.listen method, passing it the port to listen on (i.e. 3000). The second argument (the anonymous function) is optional.

Now let’s go back up to the top of the file. As you can see, the rest of the code is surprisingly simple. We create two event handlers, the first of which takes care of each Websocket client connection, and the second of which processes each message that it receives from that client. On line # 12, we have the first event handler. We use the “on” method of the websocketServer instance to handle an incoming connection. This is somewhat similar to creating a handler for a GET or POST request in Express.js.

We pass the event as the first argument (i.e. “connection”), and then a function as the 2nd argument. The anonymous function that we provide contains the code that we want executed for each new Websocket client connection. This function also receives a Websocket client as its first argument. We have named this variable: “webSocketClient”. On line # 14 we provide some feedback to the Websocket client by sending it the first Websocket message: { “connection” : “ok”}. This is for demonstration purposes only, just so that we can see right away that the connection has been established.

Now inside of the anonymous callback, we set up the second event handler, which will process each message that this client receives. And similar to the connection event handler, we use the “on” method of the webSocketClient variable to handle an incoming message. We pass the event as the first argument (i.e. “message”), and then a function as the 2nd argument. The anonymous function that we provide contains the code that we want executed for each message received by this Websocket client.

Broadcasting the Message to All Websocket Clients

On line # 20, we start the process of broadcasting the incoming message to all Websocket clients. Using the forEach method of the websocketServer.clients list, we iterate over the connected Websocket clients. And for each iteration, we provide a callback function. This callback function receives the currently iterated Websocket client as its first argument. So, we then use the send method of that client object, and send the incoming message (i.e. by sending one message to many recipients, we are “broadcasting” that message).
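
Here is a rough sketch of the whole server as described above (line numbers won’t match the original exactly, and I’ve omitted the Express static-folder setup for brevity):

```javascript
const http = require('http');
const express = require('express');
const WebSocket = require('ws');

const app = express();
const server = http.createServer(app);
const websocketServer = new WebSocket.Server({ server });

// Fires once for each new Websocket client connection
websocketServer.on('connection', (webSocketClient) => {
  // Immediate feedback, so we know the connection was established
  webSocketClient.send('{ "connection" : "ok"}');

  // Fires for every message received from this client
  webSocketClient.on('message', (message) => {
    // Broadcast the incoming message to every connected client
    websocketServer.clients.forEach((client) => {
      client.send(message.toString());
    });
  });
});

// Start the web server on port 3000
server.listen(3000, () => console.log('Listening on port 3000'));
```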

Testing the Websocket Server

Now if you copy all of the code from Example # 1 into a file and then execute it, you’ll have a running Websocket server on port # 3000. But that isn’t enough. Now we want to test our websocket server, and an easy way to do this is to use the Smart Websocket Client plugin for Google Chrome.

So go ahead and click this link to install the plugin, and once you’ve installed it, start the plugin by clicking the icon in the upper-right-hand corner of your Chrome browser.

Smart Websocket Client

Once the Smart Websocket Client is running, enter http://localhost:3000 in the address bar and then click the “Connect” button. You should see { “connection” : “ok”} in the lower window, indicating that a Websocket connection was successfully established (see example # 2).

Connecting Success Message

Example # 2

In the top window, enter any text, click the “Send” button, then you’ll see your message appear in the lower window. Now open a few more instances of the Smart Websocket Client and follow the same steps. If you place your Chrome browser tabs side by side, you’ll see that every message you’ve sent has been broadcast to every Websocket client. Congratulations!  You’ve just built a working Node Websocket server.

Example # 3

Now earlier in this article, I promised that we could create our Websocket server in less than ten lines of code. Example # 1 clocks in at 32 lines of code, but this is because I used whitespace and comments to make the code as readable as possible. So, in Example # 3, I’ve provided the condensed version of our Node Websocket server. This code is not very pretty, but as promised, it is a fully functional Node Websocket server that’s set up in less than ten lines of code.

Set up a Node / Express Static Web Server in Five Minutes



Setting up Node and Express as a Simple Lightweight Web Server for Your Single Page Application is Very Easy to Do.

Sometimes you just need a local web server. Sure, you could use MAMP, but installing Apache, MySQL and PHP seems like overkill in some cases. I have used MAMP for years and it is terrific. In particular, when you need to run PHP locally and / or connect to a MySQL database, it’s the cat’s pajamas. But that was the standard 10 years ago. Nowadays, it’s very common to build a single page web application where all of the assets are static, data is pulled in from a REST endpoint and all of the heavy lifting is done in the browser via JavaScript. In these kinds of scenarios, a static Node / Express server is a simple, easy and lightweight approach.

In this article I’ll explain the steps needed to set up a Node / Express Static Web Server. And the good news is that on this high level, the required steps are very simple. First, you’ll need to require the express and path modules. After that, you’ll create an instance of express, then set the port that the web-server will use. The next step, and the key-ingredient here is the express.static method. This tells Express.js that you want to serve static content from a specific folder. In that one line of code, you’ve done the majority of the configuration work.

So, not only will Express serve up static content from that folder, it can do so for any subfolders as well. You can specify any folder in your project as the static web folder. And the beauty of it is that any folder outside of the one you specify will be hidden from public view, so your application code will be safe. When you pass the express.static method to the use method of your Express instance, you provide the details that Express needs to serve your static content. Then finally, you use the listen method of your Express instance to start the web server. We’ll take a closer look at the express.static method in Example # 2.

Now, I just want to remind you here that this article pertains to the specific occasions in which you need to serve static web assets locally. In other words, using a Node / Express static web server can be a very simple way to satisfy your need for a local web server, but may not be the best approach for your production needs. Technically, you could take the code that is detailed in this article and deploy it to your production server, and in theory it would work just fine. For this article, however, I’m just going to concentrate on providing a fast and simple way to get a local web server running so that you can test your front-end code (e.g. HTML, CSS or JavaScript).

The code samples can be downloaded here: https://github.com/kevinchisholm/node-express-static-web-server

Example # 1 – package.json

In Example # 1, we have the contents of package.json. Nothing too special going on here. But just note that our only dependency is the express module. Also, in the scripts property, I’ve set up the start command to execute the app.js file in the node web-server folder. This way, we can simply type npm start in the terminal, instead of node web-server/app.js (just a bit less typing).

Example # 2 – The Express Static Web Server

In Example # 2, we have the entire contents of our web server: 15 lines of code (and nearly 25% of that is comments!). The magic happens on line # 10: We call the app.use method and pass it express.static, which in turn takes the folder to serve as its argument. So this tells Express that we want to set a static folder. We then use the path.join method to tell Express where all static assets should be served from. In our case, it is the www folder. The two arguments passed to the path.join method are __dirname, which tells us the absolute path to the folder within which the current script is found, and then “../www”, which is a relative path to the www folder.
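
A sketch of the server described above (comments and the exact line count will differ from the repo, and the port number is an assumption):

```javascript
// web-server/app.js — a minimal static web server (sketch)
var express = require('express');
var path = require('path');

var app = express();
var port = 3000;

// Serve everything in the sibling www folder as static content
app.use(express.static(path.join(__dirname, '../www')));

app.listen(port, function () {
  console.log('Static server listening on port ' + port);
});
```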

Now, as I mentioned earlier, anything outside of your static folder is protected from public view. This means that while the folder you specify when calling the express.static() method (i.e. “../www”) is publicly viewable, any folder that is a sibling or ancestor of that folder is not available publicly. This is not a critical factor when working locally (i.e., developing), but it does matter in production. In other words, you wouldn’t want your application code to be viewable to the general public. Nor would you want to make available any sensitive information that’s in your application code, such as a secret key or other credentials. So, as you can see, this is one of the key strengths of Express: the ability to not only define your public/static folder in one line of code, but to also protect all of the other folders by default.

Express does all of the heavy lifting

A little earlier, I used the word magic. We both know that none of this is actually magic, but it sure feels like it. If you’ve ever created a Node web server manually, then you know two things: 1) It’s really easy, 2) It’s really tedious once you get past “Hello World”.  But Express hides all the tedium and makes serving static assets as easy as 1-2-3.

HTTP Headers

There is one downside here. Express does not set the appropriate content-type headers for the HTTP requests.  This is not fatal in most cases because this approach is simply meant to provide a very fast and easy way to set up a static web server. The actual web server works just fine, but keep in mind that content-type headers for files such as JPEG, PNG, CSS or JS will not be set accordingly. If that is a problem, then a simple static web server is probably not what you need and you should consider a more robust approach. So, hopefully, if you do need a simple static web server, this article was what you needed to get up and running quickly.


There are multiple options when it comes to setting up a static web server. One advantage to leveraging Node and Express.js, however, is that as a developer, you probably already have Node installed on your machine. So, in this case, you won’t need to install any additional software. You can simply import the Express framework, write about a dozen lines of code, and you have a static web server. This is probably not a server that you would use in production, but as you can see, it can easily solve the problem of quickly serving web content on your local machine. If you need to write moderately complex dynamic application logic, then you might need something a bit more advanced than what was discussed here. But for a basic static web server, this approach should get you going (hopefully in less than five minutes : – )

Yikes! AWS Node / NPM ERR! enoent ENOENT: no such file or directory package.json


AWS’s Node deployment keeps telling me that it cannot find package.json, but it’s there! – Fortunately, this problem is easily solved.

AWS makes deploying your Elastic Beanstalk easy. Compress your build files, upload the ZIP and then deploy that application version. Lovely. But sometimes your application goes into a “warning” or “degraded” state, and then a visit to the application with a browser yields: “502 Bad Gateway“. Errrggggg…..

At this point, you look in the logs and see a cryptic message that says something like: “enoent ENOENT: no such file or directory package.json“. You double-triple-quadruple-check and yes, package.json is in fact very much alive and well. So, of course, your next thought is: “WTF???”

I have run into this problem a few times and in each case, the problem was me: I zipped-up a folder, instead of the contents of a folder.

Do not compress an entire folder

Compressing the “myProject” folder does not fix the package.json problem

Let’s say your Node application is in a folder named: “myProject“. If you are compressing that folder, then this is your problem. You don’t want to compress a folder because when AWS un-zips that file, it will not know to look in the “myProject” folder that is created when the file is un-zipped.

Compress ALL of the items in your project folder

Compressing the root files fixes package.json problem

What you want to do is: select EVERY file in the root of that folder (i.e. your Node application’s root folder), and then compress THOSE files. This will create a ZIP file that, when un-zipped, creates the file structure that AWS expects. Now AWS will find package.json. This should solve the problem.

Compressing the root files fixes package.json problem

In the image above, I have zipped up the contents of the “myProject” folder, and created Archive.zip.

Upload the zipped file


Now, back in your AWS console, you can use the “Upload and Deploy” button to upload your ZIP file, and then deploy it.

Setting Your AWS-Hosted Node Application’s Port


When working locally, using an arbitrary port number is fine, but you need to get that port from the environment when deploying your Node application to AWS.

Technically, you can code-up a Node web server in less than ten lines of code. Most likely, your application will require a few more lines of code than that. But my point here is: getting a basic Node web server running is not terribly difficult.

Example # 1

In Example # 1, I used port # 3000, but I could have used virtually any valid port number. When working locally this is, for the most part, a non-issue. As long as no other application is using the port you want to use, you choose one and then use it. Easy. This example does little more than say “Hello!”, but the point I’m trying to make is that your main JS file ends with the server.listen method, and you need to pass it a port number.
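
A minimal sketch of the kind of server Example # 1 describes:

```javascript
var http = require('http');

// A bare-bones server that just says "Hello!"
var server = http.createServer(function (req, res) {
  res.end('Hello!');
});

// Hard-coding the port works locally, but not on Elastic Beanstalk
server.listen(3000);
```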

But when you attempt to deploy this code to your AWS Elastic Beanstalk instance, you will get a “502 Bad Gateway” error in your browser. The reason for this is: you don’t know which port should be used when calling the server.listen method. The great thing about AWS is that it provides a layer of abstraction for those kinds of details. In other words, AWS takes care of details such as which port to listen on. The downside here is that you have no way of knowing exactly which port that will be when you deploy your code.

Example # 2

In Example # 2, we set a variable named: port. We attempt to assign the value of process.env.PORT to that variable. If that value is falsy, then we set it to 3000. The reason this works is: if our code is running on our AWS instance, then process.env.PORT will automatically be set, and we will listen on that port. If we are running our code locally, then process.env.PORT will be undefined (or “falsy”). So, then our port variable will have a value of 3000. This way, our code can run successfully on our AWS instance, or locally.
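
The fix is tiny; something like this sketch:

```javascript
// Use the port AWS provides, or fall back to 3000 when running locally
var port = process.env.PORT || 3000;

server.listen(port);
```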

Node.js Hosting Links


The good news is: there are a lot of Node hosting services out there. The bad news is: there are a lot of Node hosting services out there.

Installing Node locally is easy. Cloning an existing Node application from GitHub and running it locally is easy. Creating your own Node application and running it locally is easy. But, choosing a hosting solution for your Node application is definitely not easy.

Below is a list of companies that offer Node hosting services. I do not claim to have every possible company listed here. But I’ve done my best to list the ones that I know about and will update this page any time I learn about another one worth mentioning.

Title: Openshift

Link: openshift.com

Description: I’ve used Openshift.com quite a bit and for the most part have been very happy with their service. They offer a free plan that definitely includes what you need to get up-and-running.

Title: Heroku

Link: heroku.com

Description: Heroku was the first Node hosting service I knew about. I’ve not used it in a while but I was always very happy with it. Setup and deployment was fairly pain-free, as was adding services such as MongoDB.

Title: Amazon Web Services

Link: Deploying Node.js Applications to AWS Elastic Beanstalk

Description: AWS is a big topic. But in general, it’s really easy to get a Node instance up-and-running with Elastic Beanstalk.

Title: Nodejitsu

Link: nodejitsu.com

Description: I’ve not tried Nodejitsu but have heard good things about them.

Title: zeit.co

Link: zeit.co/now

Description: This is a new one to me, but their setup looks super-simple.

Title: Node.js on Google Cloud Platform

Link: cloud.google.com/nodejs

Description: Although Google still has not caught up with Amazon yet, they are serious about their cloud offerings. I’ve not tried their Node hosting but have confidence that it is at worst, solid.

Title: Node.js Hosting

Link: a2hosting.com/nodejs-hosting

Description: Another new one to me, but their packages look very affordable.

Title: Node.js One Click Install | Cloud Hosting – GoDaddy

Link: godaddy.com

Description: Godaddy now offers a “Cloud” service that supports Node hosting.

Helpful Node.js Education Links


Node.js is growing fast. This is a great problem. While it means that JavaScript lovers have a rosy future, it can sometimes be difficult to keep up with what is going on with Node.js.

Here is a list of links that you might find helpful in your Node.js travels. In each case, I’ve provided a brief description of the link / organization / article, so that you have a sense of where you are headed. If you feel that there is a Node.js link that I should have included in this article, please contact me at: kevin@kevinchisholm.com.

Critical Node.js Links


Title: nodejs.org

Link: nodejs.org

Description: Since this is the home page for Node.js, you cannot go wrong here.


Title: Node.js v6.6.0 Documentation

Link: nodejs.org/api

Description: The official documentation for Node.js. Very well organized and easy to read. Arguably the most important Node.js documentation you can read if you are getting started.

Title: npm – Home of the package manager for JavaScript.

Link: www.npmjs.com

Description: It’s hard to imagine doing anything with node without the use of NPM. This is the official home page of NPM, and a great starting point.


Title: Homebrew. The better way to install Node.js on Mac OSX

Link: brew.sh

Description: I’m being a little opinionated here (ok. I’m being a lot opinionated). But, for Mac users, Homebrew is the way to go when you install Node.js (sorry Windows users, you are stuck with scoop : – )


Title: Built in Node.js – startups, apps, projects using Node

Link: builtinnode.com

Description: A great way to learn about who is using Node.

Node.js Newsletters


Title: npm Weekly

Link: www.npmjs.com/npm-weekly

Description: Find out what npm has been working on, thinking about, and talking about every week. A great newsletter if you are into NPM.


Title: node weekly

Link: nodeweekly.com

Description: A free, once–weekly e-mail round-up of Node.js news and articles. Another awesome newsletter if you are into Node.

Other Helpful Node.js Links


Title: Node Tutorials on scotch.io

Link: scotch.io/tag/node-js

Description: scotch.io tutorials are very easy to read. The site is in general a great resource for learning about a number of web development technologies. Fortunately, they are passionate about Node!

Setting AWS-Node.js Stormpath keys


process.env can be used to set the environment variables you need when using the Stormpath API in your AWS-hosted Node application.

Stormpath provides amazing abstraction when it comes to authentication. There are certainly other services like this, but when it comes to security, Stormpath is not only popular, but well respected. This comes as no surprise as they simply make authentication easy.

I have to say that their documentation is for the most part very good. If Node is your thing, they make it very easy to get up and running with their API. Their mailing list is also quite helpful. At least once or twice per week, I receive emails that link to interesting articles on their blog.

Recently, I was trying to set up a Node.js/Express.js application, leveraging their express-stormpath Node module. I was thinking to myself: “…hmmmm. There must be a step where I have to configure my secret key or something like that”. After some quality time with Google, I came across this article, which suggested the following:



(where “xxx” is your actual key)

Well, that is fine for working locally, but I knew if I wanted to deploy this as an AWS Elastic Beanstalk application, I needed to actually set these values somewhere.

Using process.env

I set the three environment variables I needed to be properties of process.env:
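
Something along these lines; the variable names below are illustrative placeholders, not necessarily the exact names that express-stormpath reads, so check the module’s documentation for the real ones:

```javascript
// NOTE: hypothetical variable names — confirm the real ones in the express-stormpath docs
process.env.STORMPATH_API_KEY_ID = 'XXX';
process.env.STORMPATH_API_KEY_SECRET = 'XXX';
process.env.STORMPATH_APPLICATION_HREF = 'XXX';
```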

(where “XXX” is your actual key)

I took a look in the source code for the express-stormpath Node module and could see that it seemed to want to find these on process.env, so I think this approach should be fine. I’m still in the process of getting this Node.js/Express.js application up and running, but if you are faced with the same challenge, hopefully this helped you.

Share Node.js code with JSApp.us


JSApp allows you to write Node.js code in a browser, run it, and also share it with others.

One of the things that makes front-end development so much fun is that you can easily create and share code. With tools such as JSFiddle, you can create an example web page and then send that JSFiddle URL to someone. Or you can even send someone the URL of a JavaScript file that you created so that they can just run $.getScript(yourJavaScriptURL) to inject your code in their page. And there are plenty of other clever ways to share / demo front-end code without a lot of fuss.

But what about Node?

Well, it’s not always so easy with Node, right? It’s server-side code, so you can’t just send someone a URL of your Node.js file to inject in their page. Github really saves the day in this case; you can create a public repo, and then send someone the Github repo URL. But that still requires the recipient to have at least git installed on their computer. And as we all know, once something takes more than 2 clicks, you start to lose your audience. That said, anyone with a reasonable attention span and a genuine interest in your code will follow the few clicks needed to clone your repo and run your code, but for quick little snippets, it still feels like overkill sometimes.

For example, I like to write blog posts here about Node. In some cases, it does make sense to create a Github repo, especially if you have to leverage package.json, and the app requires file access, etc. But what about little examples? Just 10-20 lines of code to demonstrate a concept? Or even a simple working example?

Enter JS App!

When you navigate to jsapp.us, you immediately see some sample Node.js code. You can delete it and write your own. Then,  you simply click “test” in the sidebar (or CTRL + b), and a new browser window opens with your Node.js code running!

If you create a profile (free), you can save your code and share it with others. This is one of the most clever things I’ve seen in a long time. You can also go back and edit your files, re-name them, delete them. Really fun stuff.

If you need to create a quickie Node.js app and a Github repo would be overkill, JSApp might be just the tool you need. It’s been a while since I was this impressed by something I stumbled upon.


Renaming a file with Node.js


Learn how to rename a file with Node.js, in ten lines of code.

I was fleshing out a few ideas on a project today and found myself trying to figure out how to rename a file with Node.js. I have no doubt that there are better ways to go about this, but I thought I’d document my findings.

The first thing I realized is that the low-level nature of Node.js offers a great deal of power, but also with that power comes the need to handle all of the details. For me, this meant getting references to four things:

  1. The name of the old file
  2. The name of the new file
  3. The full path to the old file
  4. The full path to the new file (which does not exist yet)

The first two items are easy: I just had to provide a name for the old file, and decide what to call the new file. The last two items involved a bit more effort. I needed to do three things:

  1. Get the path of the folder that contains the old file
  2. Add a trailing slash to that path
  3. Set a permanent reference to the old file

So, to accomplish all of these tasks, I decided to use the filepath Node.js module.

Example # 1A

 Example # 1B

Example # 1A shows the contents of package.json. There is only one dependency: the filepath Node.js module.

In Example # 1B, I first set references to the file system module, as well as the filepath module. Next, I provided strings for the names of the old and new files. The filepath module is then used to get the path to the current folder; I set that to the fpFolder variable (adding a trailing slash to that string, which will be needed when we append file names to that string).

The variable fpFile is used as a permanent reference to the old file (this will come in handy for Example # 3.) Finally, I build the full file paths for the old and new files. After that, a couple of console.log statements to make sure all of this work is correct.

Example # 1C

Example # 1C shows the output of the two console.log statements. Each path will vary for you a bit, so I simply put “[YOUR LOCAL PATH TO]” for the folder structure that leads up to that file in the github repo that you cloned (see “How to Demo” below).

This example does not actually rename a file. So, now I will use the file system module to make that change.

How to Demo:

  • Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  • Navigate to: JavaScript/node-js/node-modules/fs/fs-rename
  • Execute the following command in a terminal prompt: npm install
  • Execute the following command in a terminal prompt: node filepath-1.js

Example # 2

In Example # 2, I use the file system module’s rename method to rename the file: “re-name-me.txt.” This method takes three arguments: a path to the old file, a path to the new file and a callback. The callback takes one argument: an error object. Inside of the callback, I check for the error object, and then output the path of the new file. So now, follow the instructions below to see this code in action. After you execute the code, the file: “re-name-me.txt” will be renamed to: “ive-been-renamed.txt.”
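
In case it helps, here is a stripped-down sketch of that rename step using only the core fs module (the file names follow the article; the path-building code from Example # 1B is omitted, and simple relative paths are an assumption):

```javascript
var fs = require('fs');

// Paths built earlier in the program (shown here as simple relative paths)
var oldPath = './re-name-me.txt';
var newPath = './ive-been-renamed.txt';

fs.rename(oldPath, newPath, function (err) {
  if (err) { throw err; }
  console.log('Renamed to: ' + newPath);
});
```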

In order to run Example # 2 again, you’ll need to manually rename the file: “ive-been-renamed.txt” back to: “re-name-me.txt”. After a few times back and forth, this got pretty tedious and I started to think that there must be a way to toggle the file back and forth. Meaning: If the file has been renamed, change it back to the original name, and so forth.

How to Demo:

  • Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  • Navigate to: JavaScript/node-js/node-modules/fs/fs-rename
  • Execute the following command in a terminal prompt: npm install
  • Execute the following command in a terminal prompt: node filepath-2.js

Example # 3A

In Example # 3A, I use the ternary operator when setting the final path for the old and new files. In each case, I check to see if the old file exists, and then depending on the answer, I set each path accordingly.
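
A sketch of that toggle logic, using fs.existsSync in place of the filepath module’s existence check (that substitution is my own assumption):

```javascript
// If the old file still exists, rename it to the new name; otherwise rename it back
var oldExists = fs.existsSync('./re-name-me.txt');

var fromPath = oldExists ? './re-name-me.txt' : './ive-been-renamed.txt';
var toPath   = oldExists ? './ive-been-renamed.txt' : './re-name-me.txt';

fs.rename(fromPath, toPath, function (err) {
  if (err) { throw err; }
  console.log('Renamed to: ' + toPath);
});
```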

Example # 3B

Example # 3B is the full code for the final version of this file. I combined all var statements and cleaned up the code a bit. When you follow the instructions below, you’ll see that you can keep executing node filepath-2.js over and over, and the text file will toggle between the old name and the new name.

How to Demo:

  • Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  • Navigate to: JavaScript/node-js/node-modules/fs/fs-rename
  • Execute the following command in a terminal prompt: npm install
  • Execute the following command in a terminal prompt: node filepath-2.js


As I mentioned, there are probably a number of ways to do this that are more efficient. Everything I detailed here was the result of a few minutes with Google. Hopefully, this article either got you where you needed to go, or pointed you in the right direction.

Helpful Links for Renaming a File with Node.js



Getting started with the filepath Node.js module


When you need to reference and work with the local file system in your Node.js program, the filepath module is quite a handy tool.

Even if your Node.js program is a web-server of some sort, working with the local file system is somewhat inevitable. While Node.js does provide low-level file system access (see the Node.js fs module), abstraction is always helpful, particularly when dealing with absolute paths.

The filepath Node.js module is a very helpful utility for simple access to file paths. You’ll need only a package.json file with this module as a dependency, an “npm install” command, and then you are up and running. This article provides a quick introduction to a few of the most common methods.

Example # 1A

Example # 1B:

In Example # 1, we first create the FP variable, which references the filepath module. Then we create the path variable, which holds the return value of the FP object’s newPath method. And finally, we output the path in the console. Example # 1B shows the terminal output when we use console.log to view the path variable. This path will vary for each user so I simply put “[YOUR LOCAL PATH TO]” for the folder structure that leads up to that file in the github repo that you cloned (see “How to Demo” below).
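
Based on that description, filepath-1.js is probably little more than this (the method name follows the article; check the module’s docs before relying on it):

```javascript
var FP = require('filepath');

// Create a path object for the current location
var path = FP.newPath();

console.log(path);
```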

How to Demo:

  1. Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  2. Navigate to: JavaScript/node-js/filepath
  3. Execute the following command in a terminal prompt: node filepath-1.js

Example # 2

Example # 2 demonstrates the list method. The only real difference between this code and Example # 1, is the new variable “files”, which receives the value of the list method, when called on our path variable. The files variable ends up as an array. Each element in the array is an object whose “path” property is a string that points to a file in the current directory.
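
Roughly, again following the article’s description of the API:

```javascript
var FP = require('filepath');

// Each element of the returned array is a path object for a file in this directory
var files = FP.newPath().list();

console.log(files);
```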

How to Demo:

  1. Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  2. Navigate to: JavaScript/node-js/filepath
  3. Execute the following command in a terminal prompt: node filepath-2.js

Example # 3A

Example # 3B

Example # 3C

Example # 3D

In Example # 3A, we see the recurse method in action. Just as the name implies, the recurse method will recursively list all of the files in the current directory. As a result, if one of those files is a folder, then it will list all of the files in that folder, and so on. This method differs from the previous two examples in that it takes a callback. The callback is a bit like a forEach call; it iterates over all of the files or folders in the path, and calls the callback for each one. Inside of the callback, the path variable is the current path being iterated over.

Example # 3C is the output from the code in Example # 3A.

In Example # 3C, we use the toString() method of the path object so that instead of a bunch of objects that we would need to handle, we just get the values we are after; the string representation of the path to that file or folder.
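
A sketch of that version (Example # 3C), per the description above:

```javascript
var FP = require('filepath');

// Recursively walk the current directory; the callback fires once per file or folder
FP.newPath().recurse(function (path) {
  // toString() gives just the string representation of each path
  console.log(path.toString());
});
```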

Example # 3D is the output from the code in Example # 3C.

How to Demo:

  1. Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  2. Navigate to: JavaScript/node-js/filepath
  3. Execute the following command in a terminal prompt: node filepath-3.js


The filepath Node.js module has much more to offer than was demonstrated here. Hopefully, this article has demonstrated how easy it is to get started with filepath.

Helpful Links for the filepath Node.js module



Getting started with the uglify-js Node.js module


Learn how to easily implement minification and file concatenation right in your Node.js program.

There is no doubt that tools such as grunt and gulp provide powerful front-end tooling, particularly for large-scale applications. But in some cases, you may want to “roll your own”. If you want to minify and / or concatenate files from your Node.js application, the uglify-js module offers a simple syntax yet plenty of muscle-power.

So, if you want to get serious, a quick package.json file and “npm install” command are all you need to get started. Once these two tasks are taken care of, you can minify one or more files, and concatenate the output to a new file. In this article, I will show you how to do just that in less than 20 lines of code.

Example # 1A

Example # 1B

Example # 1C

 Example # 1D

In Example # 1A we have the contents of the file: package.json. This tells npm that our program depends on the “uglify-js” module. Examples # 1B, 1C and 1D are the contents of the files we will “uglify”. The actual code has no significance. We just want to have a reference so that once we have uglified the files, we can see the difference in the output.

Example # 2A

 Example # 2B

In Example # 2A, we minify the file: file-1.js. In this case, the minified code is simply shown in the console. Example # 2B shows the minified code. It’s hard to imagine a case where we would want to minify code, but only show the result in a terminal window. Realistically, we need to write the output of the minified file to a new file.

How to Demo:

  1. Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  2. Navigate to: JavaScript/node-js/uglify-js
  3. Execute the following command in a terminal prompt: npm install
  4. Execute the following command in a terminal prompt: node uglify-1.js

Example # 3


In Example # 3, we have the content of uglify-2.js. Here, we’ve moved things into a more real-world context; we save the result of the minification to a physical file. Now notice that after you execute node uglify-2.js in your terminal, there is a new file named: output.min.js, which is a minified version of file-1.js.

The first change we made was to add a reference to the “fs” module, which provides access to the file system in Node.js. The console.log statement was left in, just so you can still see the output in the console. Below that, we call the writeFile method of the fs object. We pass it three arguments:

  1. the name of the file that will contain the result of the minification process (i.e. the minified code)
  2. the content for that file (i.e. the minified code), and
  3. a callback. The callback takes one argument: an error object. In the callback, we check to see if there was an error, and if not, we send a success message to the console.

In this Example, the callback is optional as it has nothing to do with the minification process and only provides messaging as to the status of the minification attempt.

Although Example # 3 is more of a real-world context than Example # 2, it is still a bit unrealistic as we only minify one file. In a typical production workflow, one would likely want to minify and concatenate more than one file.
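
Here is a sketch of what uglify-2.js likely looks like, assuming the older uglify-js 2.x API in which minify() accepts file paths directly (the newer 3.x API takes source code strings instead):

```javascript
var UglifyJS = require('uglify-js');
var fs = require('fs');

// With uglify-js 2.x, minify() can be passed a file path (or an array of paths)
var result = UglifyJS.minify('file-1.js');

// Show the minified code in the console
console.log(result.code);

// Write the minified code to a new file
fs.writeFile('output.min.js', result.code, function (err) {
  if (err) { throw err; }
  console.log('output.min.js was created');
});
```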

How to Demo:

  1. Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  2. Navigate to: JavaScript/node-js/uglify-js
  3. Execute the following command in a terminal prompt: npm install
  4. Execute the following command in a terminal prompt: node uglify-2.js

Example # 4

Example # 4 shows the contents of uglify-3.js. The only change we have made is in the call to UglifyJS.minify. Instead of passing it a string, we pass an array. Each element in the array is a path to a file we want to minify and then concatenate into the output file. In this case all of the files are in the same folder as our program, so there is no folder structure (i.e. just the file names). You can take the same exact steps to demo this example, and when you do, you will see that the file output.min.js contains the minified code of file-1.js, file-2.js and file-3.js.
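
In other words, the only change is likely something like this (again assuming the 2.x API, with UglifyJS required as in the previous sketch):

```javascript
// Minify all three files and concatenate the result into one output file
var result = UglifyJS.minify(['file-1.js', 'file-2.js', 'file-3.js']);
```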


The uglify-js module offers a ton of options, parameters and features. For this article, I wanted to demonstrate how easy it is to set up and use this Node.js module. But if you want to understand the true power of uglify-js, there is a ton of documentation available online. Hopefully this article got you to first base!

Helpful Links for the uglify-js Node.js module




Node.js Basics – Command-Line Arguments


Learn how to access command-line arguments passed to your Node.js file.

When you use Node.js as a command-line tool, you most likely type the following into your terminal application: node some-file.js. Once your application grows to even a modest level of complexity or sophistication, you will probably want to accept arguments on the command line. So, in this article, I will explain the basics of how to gain access to the arguments that are passed to your node application on the command line.

The key to accessing command-line arguments lies in the argv property of the process global. The process object is a global object, which means that you can access it from anywhere in your program. The argv property is an array that contains all arguments provided when you executed your program on the command line.

Example # 1A

Example # 1B

Example # 1C

In Example # 1A, we have the contents of the file: showArgs-1.js. This single line of code simply inspects the argv property of the process object. The output goes straight to the terminal window the moment you execute your program.

In Example # 1B, we have the actual command you use to execute the program; first node (the executable for Node.js), and then the name of the file we want to execute: showArgs-1.js.

Example # 1C shows the output of this program. So, in the console, you will see an array. The first element of that array is node, and the second element is the path to the program. This path will vary for each user so I simply put “[PATH TO]” for the folder structure that leads up to that file in the github repo that you cloned (see “How to Demo” below).
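
showArgs-1.js is likely nothing more than this one line (whether it uses console.dir or console.log is an assumption):

```javascript
// Inspect every command-line argument passed to this program
console.dir(process.argv);
```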

How to Demo:

  1. Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  2. Navigate to: node/cli-arguments-basics
  3. Execute the following command in a terminal prompt: node showArgs-1.js

Example # 2A

Example # 2B

 Example # 2C

In Example # 2B, you’ll see that we passed some additional arguments to our program: “arg2”, “arg3” and “arg4”. I used the numbers 2, 3, and 4 because they make more sense, due to the 0-based nature of this arguments array. Remember: arguments 0 and 1 are node and the file that contains your program, so the additional arguments here are indexed as 2, 3 and 4.

Since we know that the first two arguments will be the same every time, let’s try to get rid of those first two arguments.

How to Demo:

  1. Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  2. Navigate to: node/cli-arguments-basics
  3. Execute the following command in a terminal prompt: node showArgs-1.js arg2 arg3 arg4

Example # 3A

 Example # 3B

 Example # 3C

In Example # 3A, we use the Array.prototype.slice method to remove the first two elements of the arguments array. Specifically, process.argv is an array, so we use its slice method to remove the first two elements of the array, and assign that return value to the variable args. We then inspect the args array using a console.dir statement.

In Example # 3B, we execute our program, passing three arguments: “arg0”, “arg1” and “arg2’. Example # 3C shows the output, which is an array containing only the arguments that we passed to the program.
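
A sketch of showArgs-2.js as described:

```javascript
// Remove the first two elements (the node executable and the script path)
var args = process.argv.slice(2);

console.dir(args);
```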

How to Demo:

  1. Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  2. Navigate to: node/cli-arguments-basics
  3. Execute the following command in a terminal prompt: node showArgs-2.js arg0 arg1 arg2

Example # 4A

Example # 4B

Example # 4C

In Example # 4A, we make the output a bit easier on the eyes. Instead of a simple console.dir statement, we use the forEach method of the args array for iteration. Inside of each iteration, we use a console.log statement to output a more human-friendly message that shows the argument that is contained in that element.
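
showArgs-3.js probably looks something like this (the exact message wording is an assumption):

```javascript
var args = process.argv.slice(2);

// Print a friendlier message for each argument
args.forEach(function (arg, index) {
  console.log('Argument #' + index + ': ' + arg);
});
```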

How to Demo: 

  1. Clone this github repo: https://github.com/kevinchisholm/video-code-examples
  2. Navigate to: node/cli-arguments-basics
  3. Execute the following command in a terminal prompt: node showArgs-3.js arg0 arg1 arg2


While the examples here were quite basic, they are perfectly valid and could be used in your code. There are Node.js modules that provide powerful abstractions when it comes to Node.js command-line arguments. The one I have probably heard about the most is minimist. So, if you are writing an application that has even a moderate level of complexity, you will likely want to use something like minimist. That said, understanding how things work on a low level is always a good idea. Hopefully, this article has provided a helpful introduction to this concept.

Helpful Links for Node.js Command-Line Basics




When did Walmart become so hip?


WalmartLabs is doing some very cool things with Node.js. When the heck did all this happen?

Did you know that Walmart supports nearly 30 open-source modules, most of which are used in production, or that they created their own “private npm” to prevent hacks? Nope, neither did I.

I must admit, Walmart is not a company that comes to mind when I think of leading-edge web development. But like many older large-scale organizations, they have realized that they need to better leverage technology, or lose market share to companies such as Amazon. Well, it sure seems like they are very focused.

I had never heard of WalmartLabs until very recently. I kept noticing that in my Node.js-specific web-surfing, their name started to pop-up. So I took a look, and what I found was impressive.

It seems to me that Walmart has been hiring top engineering talent, and putting them to good use. They are doing some pretty cool stuff with Node and making serious contributions to the open-source community. Below are two videos I have recently viewed that are very much worth checking out. If you are interested in Node.js in the enterprise, these folks have a lot to share.

Cheerios and Fruit Loops: Frontend Node At Walmart by Kevin Decker

Walmart Senior Mobile Web Architect Kevin Decker talks about how Walmart threw out their legacy Java stack, and the many challenges of SPAs. In particular, he provides an in-depth discussion on pre-caching with Phantom.js, Thorax, the Cheerios Library, Fruit Loops (“a sugary cheerios”) and Contextify.

Node.js at Walmart

Walmart Sr. Architect Eran Hammer talks about the server stack that they built on smartOs, hapi – their open-source Node framework, and custom “server partials”. He also discusses their use of Node as an orchestration layer, and some of the challenges of migrating from their legacy Java back-end.


Interesting Links related to WalmartLabs






Book Review: Node for Front-End Developers, by Garann Means


If you are just getting started with server-side JavaScript, “Node for Front-End Developers” offers a fast, high-quality introduction.

The ubiquity of front-end JavaScript is undeniable. Not only has the appetite for web-based content increased dramatically, but so has the appetite for sophisticated user interfaces. More and more, visitors expect web-based content to offer complex interaction and high-performance. The explosion of mobile device use has only exacerbated this dynamic. Ryan Dahl’s Node.js turned the whole concept of JavaScript on its head by providing an open-source tool that allows the language to be leveraged on the server-side, significantly expanding the potential of this language.

Node for Front-End Developers, by Garann Means is a fast introduction to this incredibly powerful technology. The concept of creating a web-server provides a door through which clear and concise explanations present the basic concepts of server-side JavaScript. I found it particularly helpful that for such a short book, topics such as the query string, post data, path data routing, asynchronous events, templating, databases and MVC are well handled.

The book’s length is deceptive; readers will find a wealth of useful information here. While each topic represents a thread that deserves further reading, anyone who is new to Node.js will find Ms. Means’ introduction helpful. Her writing style is both relaxed and professional. From using npm to install modules, to real-time communication with WebSockets, Node for Front-End Developers offers just enough range to excite the reader without ever drowning in detail. Any of the examples can be typed into your favorite text editor and fired up with minimal effort. This is critical when delving into a new topic, and it makes your introduction to Node.js disarming and fun.

  • Title: Node for Front-End Developers
  • Author: Garann Means
  • Publisher: O’Reilly Media
  • Date Published: February 7, 2012
  • Language: English
  • ISBN-10: 1449318835
  • ISBN-13: 978-1449318833

JavaScript Concatenation and Minification with the Grunt.js Task Runner


Combine multiple JS files into one using JavaScript, Node.js and Grunt.

In the article: “Getting Started with Grunt: The JavaScript Task Runner,” we covered the bare-bones basics of how to set up and use Grunt to automate and manage your JavaScript build tasks. In that article, we used a Grunt plugin called “grunt-contrib-uglify” to minify and obfuscate our JS code. But while minification is a common task for any build system, file concatenation is also a technique used to minimize the number of HTTP requests, which helps to improve web page performance. In this article, we will look at the Grunt plugin: grunt-contrib-concat. It exposes a number of very useful options for everything from simple to advanced file concatenation.

Example # 1

In Example # 1, we have the contents of “package.json“. Naturally, Grunt is listed as a dependency, but there is a reference to the “grunt-contrib-concat” plugin as well.
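
A minimal package.json along those lines might look like the sketch below (the project name and version ranges are placeholders, not the exact values from the original example):

```json
{
  "name": "grunt-concat-demo",
  "version": "0.1.0",
  "devDependencies": {
    "grunt": "~0.4.5",
    "grunt-contrib-concat": "~0.5.1"
  }
}
```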

The Project File Structure – And Files to be Concatenated


The Empty Distribution Folder



Example # 2

In Example # 2, we have the JavaScript code that will set up and run the Grunt tasks. There are two sections to this code: the call to the grunt.initConfig() method, and the two commands that load the plugin and register the task.

Grunt Init configuration

The initConfig method belongs to the grunt object. For this article, we are concatenating three JavaScript files. As you can see, the sole property of the object literal passed to the initConfig method is concat. Its value is another object with two properties: “options” and “dist“. The “dist” property provides an object with the critical details for this operation: the location of the source files (“src“) and the location of the combined file to be created (“dest“). Note that the value of the “src” property is an array, which ensures that any number of files can be specified. The paths are relative to the Gruntfile. In this example, the “options” property specifies the string that will be used to separate each concatenated file.

Loading the Grunt Plugin and Registering the task

Under the call to the grunt.initConfig() method, we load the “grunt-contrib-concat” plugin by passing it as a string to the grunt.loadNpmTasks method. This makes the plugin available to the rest of our code. Next, we register the “default” task with a call to the grunt.registerTask() method. The first argument is the name of the task (in this case it is “default“). The second argument is an array containing one or more named tasks.
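
Putting those pieces together, a Gruntfile.js along these lines would do the job (the source file names and separator string are assumptions for illustration; the destination matches the “dist/allScripts.js” output described below):

```javascript
module.exports = function (grunt) {

  grunt.initConfig({
    concat: {
      // The string placed between each concatenated file
      options: {
        separator: ';\n\n'
      },
      // Source files (relative to the Gruntfile) and the combined output file
      dist: {
        src: ['src/file-1.js', 'src/file-2.js', 'src/file-3.js'],
        dest: 'dist/allScripts.js'
      }
    }
  });

  // Make the plugin's tasks available, then register the default task
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.registerTask('default', ['concat']);
};
```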

That is all we need to run our tasks. So now, in the terminal, we simply type: “grunt“. After Grunt has completed running, you should see a new file in the “dist” folder named “allScripts.js”.

The Built File


In this article, we talked about JavaScript file concatenation using Grunt. We discussed configuring the project dependencies in the file: package.json, which is needed in order for our application to work properly. We also reviewed the file: Gruntfile.js, which is where we set the configuration properties for the build and then registered the tasks to run. There is plenty to dive into on the subject of concatenation with Grunt; this article provided the most basic elements of how to set up, configure and run a task that combines multiple JavaScript files into one.

Getting Started with Grunt: The JavaScript Task Runner


Automate and manage your JavaScript build tasks with JavaScript, Node.js and Grunt.

Today, even the most straightforward web-based application will involve JavaScript, and chances are that JS code will not be too simple. In fact, it is almost inevitable that your JavaScript will grow over time and / or span multiple files. When these dynamics are in play, you’ll find yourself spending more and more time organizing and testing your code. Regardless of the details, this kind of maintenance quickly becomes tedious. I think many will agree that when humans have to perform repetitive / tedious tasks, the chance of error increases. Fortunately, this is the exact kind of work that a computer is thrilled to do, and do well. Using a build tool to manage your JavaScript build tasks is not only efficient, but also a smart way to go.

Grunt is promoted as a “Task Runner.” Semantics aside, it is an aptly named tool that provides a significant level of flexibility and power, when it comes to managing and automating your build tasks.

In this article, I will intentionally “scratch the surface.” While there is a great deal of depth to Grunt, getting started is a critical first step. My focus here is to provide you with information that simply enables you to set the minimum configuration required for your first (albeit simple) automated task with Grunt.


In order to use Grunt, you’ll need Node.js and the Grunt CLI (Command-Line Interface). Installing Node is beyond the scope of this article. If you have not installed Node yet, you can start here: http://nodejs.org/

The Grunt Command-Line Interface

The Grunt CLI is used to launch Grunt. While this may sound a bit confusing, installing the Grunt CLI does not actually install the Grunt task runner. The CLI is used to launch whatever version of Grunt is defined in your package.json file (more on that file later). The advantage of this is that your projects can all use different versions of Grunt without concern for conflicts. The CLI does not care, it simply launches whatever version your project uses.
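
Installing the CLI is a single, standard npm command (shown here for reference):

```bash
npm install -g grunt-cli
```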

(You might need to use the sudo command when executing npm install, depending on your account privileges.)

Note that in the example above the “-g” flag is used. This is recommended. It installs the module globally so that it can be executed from any folder on your machine, and need not ever be installed again.


Once you have installed the grunt-cli globally, you can put your project together. The first step is to create a package.json file. The package.json file is a standard part of the Node.js / npm ecosystem. You can learn more about it here: https://npmjs.org/doc/json.html

In this example, we have named this project “Grunt-Test-1” and given it a version number. Both of these values are arbitrary, and completely up to you. But keep in mind that the “name” property is potentially used by Grunt in some cases, so choose a name that makes sense. The “devDependencies” property is an object whose keys are the Node.js modules that your project depends on. The value for each key is the exact (or minimum) version number required for that module.

The Project File Structure

Image # 1: The project’s file structure

Once your package.json file is ready, simply run the following command: “npm install”. Node will refer to your package.json file in order to understand what it needs to install so that your project works correctly. Keep in mind that the “installation” is simply a download that is local to this folder. It is not “installed” on your computer like a native / compiled application.

Example # 1

For this example, we have only two dependencies: grunt, and grunt-contrib-uglify. The “grunt” requirement is critical because we want to use Grunt. Everything else is optional. But if grunt is the only dependency, then there won’t be too much we can do. Grunt plugins allow us to define tasks that can be managed by Grunt. The “grunt-contrib-uglify” plugin provides JavaScript minification and obfuscation. For this article, we will simply minify a JavaScript file and save it with a new name.
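
A package.json along these lines would satisfy the example (the version ranges are illustrative rather than the exact ones used in the original code):

```json
{
  "name": "Grunt-Test-1",
  "version": "0.1.0",
  "devDependencies": {
    "grunt": "~0.4.5",
    "grunt-contrib-uglify": "~0.5.0"
  }
}
```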

Once you have created your package.json file, you can install all required modules with two easy words: “npm install”. Entering that command in your terminal will allow you to leverage the Node Package Manager, otherwise known as “npm” (make sure you are in the root folder of your project). The Node Package Manager will look for a file named package.json in the current folder and read the devDependencies object. For each property of the “devDependencies” object, npm will download the version of that module specified by the value you have provided for the property.

One of the ways in which this approach is so helpful is that it negates the need to commit large files to your version control. Node modules are often a combination of text files (e.g. “.js”, “package.json”, “README”, “Rakefile”, etc.) and executables. Sometimes a combination of just a few dependencies can add several megabytes to your project, if not more. Adding this kind of weight to your code repository not only eats up disk space, but it is also somewhat unintuitive, because version control systems are really meant to store and manage text files that you (or your teammates) will change over time. Node.js modules are managed by the contributors of those modules, and you probably won’t need (or want) to change their source code. So, as long as your project contains the package.json file, whenever you (or anyone on your team) check out that repo, you can run “npm install” to get the exact versions of the modules you need.

The Gruntfile

The Gruntfile is the JavaScript code you create to leverage the power of Grunt. Grunt always looks for a file named: “Gruntfile.js” in the root of your project. If this file does not exist, then all bets are off.

Inside of Gruntfile.js, all of your code will be wrapped in one function: module.exports.

Example # 2

In Example # 2, we have defined the “exports” method of the “module” object. Right now it does nothing, but all of your grunt code will go inside of this function.
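
The skeleton is tiny; a Gruntfile.js at this stage would look something like this:

```javascript
// Gruntfile.js -- the wrapper function that Grunt will call
module.exports = function (grunt) {
  // All Grunt configuration and task registration will go here.
};
```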

Example # 3

In Example # 3, we make a call to the grunt.initConfig method. This method takes an object as its sole argument. In this object, we set properties that provide the details Grunt needs in order to work the way we want. Here we set one property: “pkg”. In order to set the value for that property, we tell Grunt to read the contents of our “package.json” file. The main reason for this is that we can access the metadata stored in package.json and use it in our tasks. The template delimiters <%= %> can be used to reference any configuration property. For example, <%= pkg.name %> would display the name that you have given your project, in this case “Grunt-Test-1”, and we could also access the version number of our project in this manner: <%= pkg.version %>.
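
As a sketch, the Gruntfile at this point would contain something like the following (grunt.file.readJSON is the standard way to pull package.json into the configuration):

```javascript
module.exports = function (grunt) {

  grunt.initConfig({
    // Read package.json so its metadata (e.g. <%= pkg.name %>) is available to tasks
    pkg: grunt.file.readJSON('package.json')
  });

};
```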

Since we will be using the “grunt-contrib-uglify” plugin, let’s update Gruntfile.js so that it is configured to leverage this plugin.


Image # 2: The empty build folder

Example # 4:

In Example # 4, we have added a new property to the config object: “uglify”, which provides the details that the “grunt-contrib-uglify” plugin needs. This property is an object which, in turn, has two properties that are objects. As the name suggests, the “options” property contains options for the “grunt-contrib-uglify” plugin. The “build” property is an object that provides two properties that are critical to the “grunt-contrib-uglify” plugin:

  • src: The location of the file to “uglify”
  • dest: The location where the “uglified” file will be placed
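
A Gruntfile.js at this stage might look like the following sketch (the banner option and the src / dest paths are assumptions for illustration); note that no tasks have been loaded or registered yet:

```javascript
module.exports = function (grunt) {

  grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),
    uglify: {
      options: {
        // A comment banner prepended to the minified file, built from package.json metadata
        banner: '/*! <%= pkg.name %> <%= grunt.template.today("yyyy-mm-dd") %> */\n'
      },
      build: {
        src: 'src/script.js',        // the file to "uglify"
        dest: 'build/script.min.js'  // where the "uglified" file will be placed
      }
    }
  });

};
```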

While we have provided some important details, running Grunt at this point will produce no results because there are no tasks specified. Let’s take care of that next:

Example # 5:

In Example # 5, we have made two changes: We load the “grunt-contrib-uglify” plugin, and then we register the default task, which in this case is: “uglify”. Notice that the “uglify” task is passed to the registerTask() method, as the sole element of an array. This array can contain as many tasks as you like, always represented as a string. The purpose of the “default” task is to let Grunt know that outside of any other named tasks that are registered, this is the list of tasks that should be executed.
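
With those two additions, the complete Gruntfile.js sketch looks like this (again, the banner and the src / dest paths are placeholders):

```javascript
module.exports = function (grunt) {

  grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),
    uglify: {
      options: {
        banner: '/*! <%= pkg.name %> <%= grunt.template.today("yyyy-mm-dd") %> */\n'
      },
      build: {
        src: 'src/script.js',
        dest: 'build/script.min.js'
      }
    }
  });

  // Load the plugin that provides the "uglify" task
  grunt.loadNpmTasks('grunt-contrib-uglify');

  // The task(s) to run when "grunt" is typed in the terminal with no arguments
  grunt.registerTask('default', ['uglify']);

};
```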

Now that we have provided the minimum details needed, we can run Grunt by simply typing “grunt” in the terminal. Once Grunt completes, you should see a new file in the “build” folder that is in the root of your project. That file should be named as per the details provided in the “uglify” section of the object passed to the grunt.initConfig() method. You’ll see that the file has been “uglified,” meaning that it has been minified and obfuscated. The end result should be a significant reduction in file size.


Image # 3: The built file


The example used in this article was very basic (about as basic as you can get!). My goal was to provide the critical base information needed in order to understand, set up, and use Grunt for the first time. There is a great deal of power available in Grunt. While it may not fit everyone’s needs, it can, in many cases, provide a robust and actively maintained open-source framework with which to create simple or complex build processes.


Creating a Simple JSONP API with Node.js and MongoDB


By leveraging the Node.js middleware “express”, we can create functionality for viewing, adding or deleting JSON data.

In a previous article: “Using Mongoose ODM to Connect to MongoDB In Your Node.js Application,” we learned the basics about connecting to a MongoDB database in a Node.js application. Because that article barely skimmed the surface of what is possible, we’ll take a few more baby steps here with our data. And for the sake of brevity, I’ll skim over the Mongoose.js details. If needed, you can refer to the article mentioned above for more details on that.

The goals for this article are:

  • Allow the user to view all data in the database
  • Allow the user to make a JSONP call to get all data
  • Allow the user to add a new name to the Sales database
  • Allow the user to delete all data in the database

When completed, this will be far from a robust or production-ready application, but we will, at minimum, learn how to view / add / delete data in our MongoDB database, using clean URLs in the browser.


In order to use the code in this article, you’ll need the following installed on your computer:

  • Node.js
  • MongoDB

Installation of these components is beyond the scope of this article, but the documentation for each will point you in the right direction.

File Structure

  • app.js
  • package.json

For this article, we will need two files: app.js and package.json. So, in your project folder, create these two empty files. The following sections will explain what to put in them.

Example # 1A

In Example # 1A, we have the contents of package.json. Note that for a more detailed discussion about package.json files, you can search this blog for helpful articles. The two dependencies declared are “mongoose” and “express”. When you use the Node Package Manager to install dependencies, npm will download and install mongoose and express for us.
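
A package.json along these lines would work (the project name and version ranges are placeholders):

```json
{
  "name": "jsonp-api-example",
  "version": "0.0.1",
  "dependencies": {
    "express": "3.x",
    "mongoose": "3.x"
  }
}
```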

Example # 1B

In Example # 1B we see the command needed to install the dependencies for our application. Once you run this command, you will have everything needed to start writing code.
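
From the root of the project folder, that command is simply:

```bash
npm install
```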

Getting Started

Open app.js in your text editor. From this point on, you can copy / paste the code in each example into the app.js (or you can scroll to the bottom of this page and paste the entire code listing in one step).

Example #2

In Example #2, we declare all of the top-level variables we’ll need in our script. Take note of “app”, which will be used to leverage the Express middleware that we listed as a dependency, and “initApp”, which is called at the very end of this script and is used to start the HTTP server.
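
A rough sketch of those declarations might look like the following (the port number, and any names other than “app” and “initApp”, are assumptions):

```javascript
var express  = require('express');
var mongoose = require('mongoose');
var url      = require('url');     // used later to read query-string parameters

var app  = express();              // the Express application instance
var port = 3000;                   // assumed port number

// Called at the very end of the script to start the HTTP server
var initApp = function () {
  app.listen(port);
  console.log('Server listening on port ' + port);
};
```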

Example #3

In Example #3, we have our database implementation. The details are identical to those in the previously mentioned article, so we’ll skip over that.
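
For context, a stripped-down version of that setup might look like this sketch (the connection string and schema fields are assumptions; see the earlier Mongoose article for the original details):

```javascript
// Connect to a local MongoDB database (assumed name: "sales")
mongoose.connect('mongodb://localhost/sales');

// Assumed schema fields for a sales-team member
var salesMemberSchema = new mongoose.Schema({
  firstName: String,
  lastName: String
});

// The model used throughout the rest of the script
var salesMember = mongoose.model('salesMember', salesMemberSchema);
```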

Example #4

In Example #4, we get into something new. If you remember from Example #2, the variable: “app” is an instance of the Express middleware object. We use the .get() method of that object to define what will happen when certain requests are made. When users navigate to the root of our web application, they are presented with a simple message. We accomplish this by passing two arguments to the .get() method: a string representing the requests we want to respond to (i.e. “/”), and an anonymous function. That anonymous function takes two arguments: “req” and “res”, which represent the request that was made, and the response object that we will send back. We use the .end() method of the response object, and pass in the string we want to send to the browser.

The second call to the app.get() method responds to “/json/delete”. It, in turn, calls a function named: utils.deleteAllData(), which will be explained a bit later.
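
As a sketch, those two routes might look like this (the response text is illustrative):

```javascript
// The root of the application: show a simple message
app.get('/', function (req, res) {
  res.end('Welcome! Try /json to view all data, /addUser to add a record, or /json/delete to clear the database.');
});

// Delete all data, then confirm
app.get('/json/delete', function (req, res) {
  utils.deleteAllData();
  res.end('All data has been deleted.');
});
```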

Example #5

In Example #5, we use the app.get() method to respond to the request: “/json”. For this request, we want to show all of the data in the database. We start off by requesting all of the data in the database: salesMember.find({}).exec(). The anonymous function that is passed to the exec() method provides access to an error object (if there is one) and the results of our search. In this case, the result is a collection of documents containing all the data in the database, which we then stringify.

We then use the utils.isJsonCallback() method to determine if the user added a callback function name to the query string. If so, we wrap our database JSON with the named callback. We then deliver the JSON by passing it to the res.end() method.
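
Here is a sketch of that route (the exact arguments taken by the utils helpers are assumptions; the helpers themselves are described in Example #7 below):

```javascript
// Return all documents as JSON, optionally wrapped in a JSONP callback
app.get('/json', function (req, res) {
  salesMember.find({}).exec(function (err, results) {
    if (err) { return res.end('Error retrieving data: ' + err); }

    var output = JSON.stringify(results);

    // If a callback name was supplied in the query string, wrap the JSON in it
    if (utils.isJsonCallback(req.url)) {
      output = utils.wrapDataInCallback(req.url, output);
    }

    res.end(output);
  });
});
```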

Example #6

In Example #6, we respond to a request to add a new user to the database (i.e. “/addUser”). If you remember from the top of the script, the variable “url” allows us to leverage the same-named Node.js module, which provides programmatic access to the URL. We then use the “url” object to access the query string for the new user parameters. Once we have that information, we can leverage code that is nearly identical to the previous article to create a new document in the collection (so just think of this as adding a row to a database table).

Once the new data has been saved, we then end the response with some HTML, informing the user of the successful data addition, and add a link that allows them to view all data or go to the home page.
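
A sketch of that route might look like the following (the query-string parameter names match the assumed schema fields above):

```javascript
// Add a new user, e.g. /addUser?firstName=Jane&lastName=Smith
app.get('/addUser', function (req, res) {
  var query = url.parse(req.url, true).query;

  var newMember = new salesMember({
    firstName: query.firstName,
    lastName: query.lastName
  });

  newMember.save(function (err) {
    if (err) { return res.end('Error saving data: ' + err); }
    res.end('User added. <a href="/json">View all data</a> or <a href="/">return home</a>.');
  });
});
```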

Example #7

Example #7 contains all of the utility functions used throughout our code:

  • utils.isJsonCallback: Returns true if a callback name was provided in the query string
  • utils.getJsonCallbackName: Returns the name of the callback provided in the query string
  • utils.wrapDataInCallback: Returns the JSON data, wrapped in the callback function
  • utils.deleteAllData: Deletes all of the data in the database

Note: At the end of Example #7 you will also see a call to initApp(). This simply starts the HTTP server.
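
Rough sketches of those helpers are shown below (the implementations and argument shapes are assumptions based on the descriptions above), followed by the call that starts the server:

```javascript
var utils = {
  // True if a "callback" parameter is present in the query string
  isJsonCallback: function (requestUrl) {
    return !!url.parse(requestUrl, true).query.callback;
  },
  // The name of the callback provided in the query string
  getJsonCallbackName: function (requestUrl) {
    return url.parse(requestUrl, true).query.callback;
  },
  // Wrap the JSON string in the named callback, e.g. myCallback({...});
  wrapDataInCallback: function (requestUrl, json) {
    return utils.getJsonCallbackName(requestUrl) + '(' + json + ');';
  },
  // Remove every document from the collection
  deleteAllData: function () {
    salesMember.remove({}, function (err) {
      if (err) { console.log('Error deleting data: ' + err); }
    });
  }
};

// Start the HTTP server
initApp();
```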

Example #8

So finally, in Example #8 we have the complete code for our working example. You can start the application by navigating to the root of the folder that contains app.js and entering the following command in the terminal: node app.js.

NOTE: On line # 97 of Example #8, I escape the double quotes that are part of HTML element attributes. I did this only because the syntax-highlighting plugin used to make the code more readable was being particularly difficult for some reason. You will likely need to surround that entire string in single quotes, and remove the escape characters: “\”.


In this article, we leveraged MongoDB, Mongoose and the Express middleware to create a very basic JSONP API. By using the .get() method of the Express object instance, we created functions that respond to specific requests. As a result, we were able to provide the user with functionality to view all data, retrieve all data as a JSONP call, add a user to the database, or delete all data.