It's about time to add some application logic to our project
This is part 8 of the series: How to create your own website based on Docker.

In the last part of the series, we created our "dockerized" mongoDB NoSQL database server to read our persisted entries from, and based on our architecture we decided that only the REST API (which will be based on ioJS) is allowed to talk to our database container.
So now it's about time to create the actual REST API that can be called via our nginx reverse proxy (using api.project-webdev.com) to read a person object from our database. We'll also create a very simple way to create a Person as well as list all available persons. As soon as you've understood how things work, you'll be able to implement more features of the REST API yourself - so consider this a pretty easy example.
Source code
All files mentioned in this series are available on GitHub, so you can play around with them! :)

Technologies to be used
Our REST API will use the following technologies:
- ioJS as JavaScript application server
- hapiJS as REST framework
- mongoose as mongoDB driver, to connect to our database container
- pm2 to run our ioJS application (and restart it if it crashes for some reason)
First things first - creating the ioJS image
Creating the ioJS image is basically the same every time. Let's create a new directory called /opt/docker/projectwebdev-api/ and within this new directory we'll create another directory called app and our Dockerfile:
# mkdir -p /opt/docker/projectwebdev-api/app/

The new Dockerfile is based on the official ioJS Dockerfile, but I've added some application/image specific information, so that we can implement our ioJS application:
# > /opt/docker/projectwebdev-api/Dockerfile
- Added our Ubuntu base image (we're not using Debian Wheezy like the official image does)
- Installed the latest NPM, PM2 and gulp (for later; we're not using gulp for this little demo)
- Added our working directories
- Added some clean up code
- Added PM2 as CMD (we'll talk about that soon)
So just create your /opt/docker/projectwebdev-api/Dockerfile with the following content:
# Pull base image.
# Source: https://github.com/mastix/project-webdev-docker-demo/blob/master/projectwebdev-api/Dockerfile
FROM docker_ubuntubase
ENV DEBIAN_FRONTEND noninteractive
RUN apt-get update
RUN apt-get update --fix-missing
RUN curl -sL https://deb.nodesource.com/setup_iojs_2.x | bash -
RUN apt-get install -y iojs gcc make build-essential openssl node-gyp
RUN npm install -g npm@latest
RUN npm install -g gulp
RUN npm install -g pm2@latest
RUN apt-get update --fix-missing
RUN mkdir -p /var/log/pm2
RUN mkdir -p /var/www/html
# Cleanup
RUN apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
RUN apt-get autoremove -y
RUN ln -s /usr/bin/nodejs /usr/local/bin/node
WORKDIR /var/www/html
CMD ["pm2", "start", "index.js","--name","projectwebdevapi","--log","/var/log/pm2/pm2.log","--watch","--no-daemon"]
Adding our REST API code to our container
Now let's create a simple application that listens to a simple GET request and returns an entry from our mongoDB container. Just to prove that it works, I'll create a REST API that returns a simple Person object that contains an id as well as a first and a last name.

In order to get this object later, I'd have to call http://api.project-webdev.com/person/{id} and it will return that object in JSON format. We'll also add a route to return all persons as well as a route that allows us to add a new person - but we'll cover that in a second.
Since PM2 will only start (and not build) our ioJS application, we have to make sure that NPM (which ships with ioJS/nodeJS) is installed on the server, so that we can build the project there.
So here is my simple flow:
- I create the ioJS application on my local machine
- Then I upload the files to my server
- On my server I use npm install to fetch all dependencies
- PM2 restarts the application automatically if it detects changes
In a later blog posting I will explain how you can setup a Git Push-To-Deploy mechanism which will take care of this automatically, but for this simple application we're doing it manually.
To get started, I'll create a new directory on my local machine (which has ioJS installed) and create a basic application:
# mkdir -p /home/mastixmc/development/projectwebdev-api && cd $_

npm init will ask you a bunch of questions, and then write a package.json for you. It attempts to make reasonable guesses about what you want things to be set to, and then writes a package.json file with the options you've selected. (Info: Every nodeJS/ioJS application needs to have a package.json as descriptor.)
# npm init
# npm install hapi mongoose --save
npm install hapi mongoose --save will download/install hapiJS and mongoose and will save the dependencies in our package.json file, so our server can download them later as well.
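For reference, the resulting package.json will look roughly like this - the exact metadata depends on your answers to npm init, and the version numbers are just examples from around the time of writing, not pinned requirements:

{
  "name": "projectwebdev-api",
  "version": "1.0.0",
  "description": "REST API for the project-webdev demo",
  "main": "index.js",
  "dependencies": {
    "hapi": "^8.4.0",
    "mongoose": "^4.0.2"
  }
}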
Creating the application
In our new directory, we'll create a file called index.js with the following contents (we'll get into details afterwards).

Disclaimer: Since this is just a little example, I hope you don't mind that I've put everything into one file - in a real project, I'd recommend structuring the project properly, so that it scales in larger deployments - but for now, we're fine. Also, I did not add any error checking whatsoever to this code, as it's just for demonstration purposes.

var hapi = require('hapi');
var mongoose = require('mongoose');

// connect to database
mongoose.connect('mongodb://'+process.env.MONGODB_1_PORT_3333_TCP_ADDR+':'+process.env.MONGODB_1_PORT_3333_TCP_PORT+'/persons', function (error) {
  if (error) {
    console.log("Connecting to the database failed!");
    console.log(error);
  }
});

// Mongoose Schema definition
var PersonSchema = new mongoose.Schema({
  id: String,
  firstName: String,
  lastName: String
});

// Mongoose Model definition
var Person = mongoose.model('person', PersonSchema);

// Create a server with a host and port
var server = new hapi.Server();
server.connection({
  port: 3000
});

// Add the route to get a person by id.
server.route({
  method: 'GET',
  path: '/person/{id}',
  handler: PersonIdReplyHandler
});

// Add the route to get all persons.
server.route({
  method: 'GET',
  path: '/person',
  handler: PersonReplyHandler
});

// Add the route to add a new person.
server.route({
  method: 'POST',
  path: '/person',
  handler: PersonAddHandler
});

// Return all persons in the database.
function PersonReplyHandler(request, reply) {
  Person.find({}, function (err, docs) {
    reply(docs);
  });
}

// Return a certain person based on its id.
function PersonIdReplyHandler(request, reply) {
  if (request.params.id) {
    Person.find({ id: request.params.id }, function (err, docs) {
      reply(docs);
    });
  }
}

// Add a new person to the database.
function PersonAddHandler(request, reply) {
  var newPerson = new Person();
  newPerson.id = request.payload.id;
  newPerson.lastName = request.payload.lastname;
  newPerson.firstName = request.payload.firstname;
  newPerson.save(function (err) {
    if (!err) {
      reply(newPerson).created('/person/' + newPerson.id); // HTTP 201
    } else {
reply("ERROR SAVING NEW PERSON!!!"); // HTTP 403
    }
  });
}

// Start the server
server.start();
Now we can copy our index.js and package.json files to our server (/opt/docker/projectwebdev-api/app/), ssh into our server and run npm install within that directory. This will download all dependencies and create a node_modules folder for us. You'll have a fully deployed ioJS application on your Docker host now, which can be used by the projectwebdev-api container, since this directory is mounted into it.
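In case you're wondering what that looks like in practice, the manual "deployment" boils down to a few commands like these - user and host name are of course placeholders for your own setup:

# scp index.js package.json root@your-docker-host:/opt/docker/projectwebdev-api/app/
# ssh root@your-docker-host
# cd /opt/docker/projectwebdev-api/app/
# npm install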
Explaining the REST-API code
So what does this file do? Pretty simple: hapiJS creates a server that will listen on port 3000 - I've also added the following routes including their handlers:
- GET to /person, which will then call a PersonReplyHandler function, that uses Mongoose to fetch all persons stored in our database.
- GET to /person/{id}, which will then call a PersonIdReplyHandler function, that uses Mongoose to fetch a person with a certain id from our database.
- POST to /person, which will then call a PersonAddHandler function, that uses Mongoose to store a person in our database.
A Person consists of the following fields (we're using the Mongoose Schema here):
// Mongoose Schema definition
var PersonSchema = new mongoose.Schema({
  id: String,
  firstName: String,
  lastName: String
});

So the aforementioned handlers (e.g. PersonAddHandler) will make sure that this information is served or stored from/to the database.
Later, when you have set up your nginx reverse proxy, you'll be able to use the following requests to GET or POST persons. But we'll get into that in the last part!
Add a new person:
curl -X POST -H "Accept: application/json" -H "Content-Type: multipart/form-data" -F "id=999" -F "firstname=Sascha" -F "lastname=Sambale" http://api.project-webdev.com/person

Result:

[{
  "_id": "555c827959a2234601c5ddfa",
  "firstName": "Sascha",
  "lastName": "Sambale",
  "id": "999",
  "__v": 0
}]

Get all persons:

curl -X GET -H "Accept: application/json" http://api.project-webdev.com/person/

Result:

[{
  "_id": "555c81f559a2234601c5ddf9",
  "firstName": "John",
  "lastName": "Doe",
  "id": "15",
  "__v": 0
}, {
  "_id": "555c827959a2234601c5ddfa",
  "firstName": "Sascha",
  "lastName": "Sambale",
  "id": "999",
  "__v": 0
}]

Get a person with id 999:

curl -X GET -H "Accept: application/json" http://api.project-webdev.com/person/999

Result:

[{
  "_id": "555c827959a2234601c5ddfa",
  "firstName": "Sascha",
  "lastName": "Sambale",
  "id": "999",
  "__v": 0
}]

You'll be able to do that as soon as you've reached the end of this series! ;)
Explaining the database code
I guess the most important part of the database code is how we establish the connection to our mongodb container. Since we're using container links, we cannot know which IP address our mongodb container will get when it is started. So we have to use the environment variables that Docker provides us:

// connect to database
mongoose.connect('mongodb://'+process.env.MONGODB_1_PORT_3333_TCP_ADDR+':'+process.env.MONGODB_1_PORT_3333_TCP_PORT+'/persons', function (error) {
  if (error) {
    console.log("Connecting to the database failed!");
    console.log(error);
  }
});
Docker uses the <name>_PORT_<port>_<protocol> prefix format to define three distinct environment variables:
- The prefix_ADDR variable contains the IP address from the URL, for example WEBDB_PORT_8080_TCP_ADDR=172.17.0.82.
- The prefix_PORT variable contains just the port number from the URL, for example WEBDB_PORT_8080_TCP_PORT=8080.
- The prefix_PROTO variable contains just the protocol from the URL, for example WEBDB_PORT_8080_TCP_PROTO=tcp.
If the container exposes multiple ports, an environment variable set is defined for each one. This means, for example, that if a container exposes 4 ports, Docker creates 12 environment variables - 3 for each port.
In our case the environment variables look like this:
- MONGODB_1_PORT_3333_TCP_ADDR
- MONGODB_1_PORT_3333_TCP_PORT
- MONGODB_1_PORT_3333_TCP_PROTO
Where MONGODB is the name and PORT is the port number we've specified in our docker-compose.yml file:
mongodb:
  build: ./mongodb
  expose:
    - "3333"
  volumes:
    - ./logs/:/var/log/mongodb/
    - ./mongodb/db:/data/db

Docker Compose also creates environment variables with the name DOCKER_MONGODB, which we are not going to use, as it might happen that we switch from Docker Compose to something else in the future.
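If you want to double-check which of these variables are actually available inside the running REST API container, you can list them with docker exec. The container name below is just a guess based on Docker Compose's default <project>_<service>_1 naming - check docker ps for the real one on your machine:

# docker exec docker_projectwebdevapi_1 env | grep MONGODB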
So Docker provides the environment variables and ioJS uses the process.env object to access them. We can therefore create a mongodb connection URL that looks like this:
mongodb://172.17.0.82:3333/persons

...which will be the link to our Docker container that runs mongodb on port 3333. Connection established!
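To make this a bit more tangible, here's a tiny standalone sketch of the same idea - the fallback values are purely hypothetical and only there so you could run the snippet outside of a linked container:

// build the mongodb connection URL from Docker's link variables
var host = process.env.MONGODB_1_PORT_3333_TCP_ADDR || '127.0.0.1'; // fallback only for local testing
var port = process.env.MONGODB_1_PORT_3333_TCP_PORT || '3333';
var url = 'mongodb://' + host + ':' + port + '/persons';
console.log('Connecting to: ' + url);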
Running ioJS in production mode
As mentioned before, in order to start our REST API application (and automatically restart it when we update the application files or it crashes for some reason), we're using PM2, which is configured via command line parameters in our CMD instruction (see our Dockerfile):

CMD ["pm2", "start", "index.js","--name","projectwebdevapi","--log","/var/log/pm2/pm2.log","--watch","--no-daemon"]

So what does this command do?
- "pm2", "start", "index.js" starts our application from within our WORKDIR (/var/www/html/).
- "--name","projectwebdevapi" names our application projectwebdevapi.
- "--log","/var/log/pm2/pm2-project.log" logs everything to /var/log/pm2/pm2-project.log (and since this is a mounted directory it will be stored on our docker host in /opt/docker/logs - see our docker-compose.yml file).
- "--watch" watches our WORKDIR (/var/www/html/) for changes and will restart the application if something has changed. So you'll be able to update the application on your docker host and the changes will be reflected on the live site automatically.
- "--no-daemon" runs PM2 in the foreground so the container does not exit and keeps running.
That's pretty much it - now, whenever you start your container later (in our case Docker Compose will start it), PM2 will start your application and will make sure that it keeps running.
In the next part we'll create the frontend application that calls our new REST-API!