CRON-free with MASA.io

Published on Apr 14, 2016

At the last KNPLabs hackathon, three teams worked hard for two days: the RADTeam polished the KNP RAD Components, the GauffreTeam worked on a new version of Gaufrette that uses PHP streams to their full potential, and a third team built a toy project in order to experiment with new technologies.

They decided to build a “cron-as-an-HTTP-service”: users can schedule HTTP calls for given times, and the service triggers each request at the scheduled time and then stores the response. The goal was to get out of their comfort zone and play with new technologies.


There is no good project without a good name, so they chose “Masa”, which means “time” in Malay. And because the project had so much swag, of course they quickly added the essential .io suffix ;)

A global overview of Masa.io

Business as usual with Docker and Docker Compose: a single YAML file defines a fleet of Docker containers, each containing an atomic part of the app, and they communicate with each other over the network.

Here is a diagram of the app components and Docker containers. For each component, the chosen technology is displayed in a yellow label. A green pin indicates a component built from scratch for Masa.

In order of appearance on the diagram: Antoine and Pierre worked on the Symfony API, Olivier on the polling Node.js server, Laurent on the Go worker, and Emma on the reporter Node.js server.

All components live in a single git repository. As everybody was working on a different part of the application, sharing one repository was no problem at all: the only shared resource was the “docker-compose.yml” file.
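To give an idea, here is a minimal sketch of what such a docker-compose.yml could look like. Service names, images and ports are made up for the example; this is not the actual Masa.io file.

```yaml
# Hypothetical sketch of the shared Compose file, not the real Masa.io configuration.
version: '2'

services:
  api:
    build: ./api          # Symfony REST API
    ports:
      - "8080:80"
    depends_on:
      - mongodb
  polling:
    build: ./polling      # Node.js poller
    depends_on:
      - mongodb
      - rabbitmq
  worker:
    build: ./worker       # RabbitMQ CLI consumer launching the Go binary
    depends_on:
      - rabbitmq
  reporter:
    build: ./reporter     # Node.js reporter
    depends_on:
      - mongodb
  mongodb:
    image: mongo:3.2
  rabbitmq:
    image: rabbitmq:3-management
    ports:
      - "15672:15672"     # management UI
```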

Now, let’s have a quick look at each component!

The API

Pierre and Antoine built the API with Symfony. As the goal was to get out of their comfort zone, they did not use any ready-to-use solution like FOSRestBundle, nor the Doctrine MongoDB ODM to handle MongoDB data.

Instead, they implemented a raw MongoDB <-> structured value object mapper and called these mapper methods from the REST controllers.
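To give an idea of what such a handmade mapper can look like, here is a tiny hypothetical sketch assuming the mongodb/mongodb PHP library; class names, fields and the storage format are invented for the example, not the actual Masa.io code.

```php
<?php
// Hypothetical sketch of a raw MongoDB <-> value object mapper (not the actual Masa.io code).

final class Task
{
    private $id;
    private $url;
    private $scheduledAt;

    public function __construct($id, $url, \DateTimeImmutable $scheduledAt)
    {
        $this->id = $id;
        $this->url = $url;
        $this->scheduledAt = $scheduledAt;
    }

    public function getUrl() { return $this->url; }
    public function getScheduledAt() { return $this->scheduledAt; }
}

final class TaskMapper
{
    private $tasks;

    public function __construct(\MongoDB\Collection $tasks)
    {
        $this->tasks = $tasks;
    }

    // Hydrate a value object from a raw MongoDB document.
    public function find($id)
    {
        $doc = $this->tasks->findOne(['_id' => new \MongoDB\BSON\ObjectId($id)]);

        return $doc ? new Task(
            (string) $doc['_id'],
            $doc['url'],
            new \DateTimeImmutable('@' . $doc['scheduledAt'])   // assumes a Unix timestamp field
        ) : null;
    }

    // Dehydrate a value object into a raw document and insert it.
    public function insert(Task $task)
    {
        $this->tasks->insertOne([
            'url'         => $task->getUrl(),
            'scheduledAt' => $task->getScheduledAt()->getTimestamp(),
            'status'      => 'scheduled',
        ]);
    }
}
```

The controllers then simply call the mapper and serialize the value objects, keeping the MongoDB details in one place.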

The API is self-documented with the use of the excellent NelmioApiDocBundle.

The interesting parts of working on this component:

  • Discovering the Doctrine ODM, then switching to a handmade solution in order to be closer to MongoDB and experiment with it.
  • Playing with Docker & Docker Compose

The polling component

The polling app is a Node.js server which queries the MongoDB server every second. MongoDB returns every record whose scheduled date is less than or equal to now and which is not flagged as “pending” or “processed”.

For each of these records, a message is sent to RabbitMQ, and the record is flagged as “pending”.
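To make it concrete, here is a rough, hypothetical sketch of such a polling loop using Mongoose and amqplib; model, field, queue and host names are assumptions, not the actual Masa.io code.

```js
// Hypothetical sketch of the polling loop (model, field and queue names are assumptions).
var mongoose = require('mongoose');
var amqp = require('amqplib');

var Task = mongoose.model('Task', new mongoose.Schema({
  url: String,
  method: String,
  scheduledAt: Date,
  status: { type: String, default: 'scheduled' } // scheduled | pending | processed
}));

// "mongodb" and "rabbitmq" would be the Docker Compose service names.
mongoose.connect('mongodb://mongodb/masa');

amqp.connect('amqp://rabbitmq').then(function (connection) {
  return connection.createChannel();
}).then(function (channel) {
  return channel.assertQueue('tasks').then(function () {
    setInterval(function () {
      // Every second, fetch tasks that are due and not already picked up.
      Task.find({
        scheduledAt: { $lte: new Date() },
        status: { $nin: ['pending', 'processed'] }
      }).exec().then(function (tasks) {
        tasks.forEach(function (task) {
          // Publish the task to RabbitMQ, then flag it as "pending".
          channel.sendToQueue('tasks', Buffer.from(JSON.stringify(task)));
          task.status = 'pending';
          task.save();
        });
      });
    }, 1000);
  });
});
```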

The interesting parts of working on this component:

  • Using Mongoose, the “MongoDB only” ODM for Node.js. Olivier had been using the Knex/Bookshelf duo before, but he always had the feeling that it was a lesser version of Doctrine ORM. Working with Mongoose was different! It is a really good piece of software: entity definitions are simple and intuitive, and it’s fun to define the model schema (yeah, it can be fun, pinky swear! :-) ). The Promise-based API is fluent, and everything works as expected. He now understands why so many people like working with MongoDB on Node.js projects!
  • The 'node-mongoose-fixtures' module makes creating MongoDB fixtures a breeze.
  • Following a suggestion from Pierre, Olivier tried a Unix-like app structure, with /boot, /etc and lib/ directories. This is not how he usually structures his Node.js apps, but this “Unix-like naming convention” works well and is easy to pick up for anybody used to Unix / Linux.

The worker component

For this part, Laurent experimented with the Go language for the first time. It is really different from what we’re used to as mainly PHP and JavaScript users.

Although this was completely new territory, Laurent was able to build a Go worker pretty quickly: a CLI tool that receives arguments such as the target URL, the HTTP verb to use and custom HTTP headers, triggers the HTTP request, then sends the result to another component (the reporter).

As a modern programming language born in a web company, Go has the nice property of shipping with a pretty complete standard library, including all the APIs needed to work with HTTP.

For example, because this worker is launched by a RabbitMQ CLI consumer, the data it receives is base64-encoded. No problem! In a few minutes, Laurent was able to decode it with Go’s standard library, in a language he had never used until the day before…
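As an illustration, a stripped-down, hypothetical version of such a worker could look like this; the message format and the reporter URL are assumptions, not Laurent’s actual code.

```go
// Hypothetical sketch of the worker: decode the base64 payload handed over by the
// RabbitMQ CLI consumer, perform the scheduled HTTP call, then report the result.
package main

import (
	"bytes"
	"encoding/base64"
	"encoding/json"
	"io/ioutil"
	"log"
	"net/http"
	"os"
)

// Assumed message format published by the polling component.
type Task struct {
	ID      string            `json:"id"`
	URL     string            `json:"url"`
	Method  string            `json:"method"`
	Headers map[string]string `json:"headers"`
}

func main() {
	// The CLI consumer passes the message body as a base64-encoded argument.
	raw, err := base64.StdEncoding.DecodeString(os.Args[1])
	if err != nil {
		log.Fatal(err)
	}

	var task Task
	if err := json.Unmarshal(raw, &task); err != nil {
		log.Fatal(err)
	}

	// Build and trigger the scheduled HTTP request.
	req, err := http.NewRequest(task.Method, task.URL, nil)
	if err != nil {
		log.Fatal(err)
	}
	for name, value := range task.Headers {
		req.Header.Set(name, value)
	}

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	body, _ := ioutil.ReadAll(resp.Body)

	// Send the result to the reporter component (the URL is an assumption).
	report, _ := json.Marshal(map[string]interface{}{
		"taskId": task.ID,
		"status": resp.StatusCode,
		"body":   string(body),
	})
	http.Post("http://reporter:3000/reports", "application/json", bytes.NewReader(report))
}
```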

Although Laurent was not thrilled by Go’s paradigms and syntax, it’s quite impressive to see such a quick learning curve for a low-level language that compiles to standalone binary executables.

The interesting parts of working on this component:

  • Discovering a new programming language
  • Easy to deploy, as a single executable file is produced by the Go compiler

The reporter component

Last but not least, the reporter built by Emma is a Node.js server that listens over HTTP for reports coming from the worker processes. It stores each result in MongoDB and updates the task record with a “processed” flag.

Working on this was a good way for Emma to discover Mongoose too, as well as Express, the de facto Node.js HTTP micro-framework, and JavaScript Promises.
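For the curious, a minimal, hypothetical Express endpoint doing that kind of job could look like this; route, model and field names are assumptions, not Emma’s actual code (they simply mirror the worker sketch above).

```js
// Hypothetical sketch of the reporter endpoint (route and field names are assumptions).
var express = require('express');
var bodyParser = require('body-parser');
var mongoose = require('mongoose');

// "mongodb" would be the Docker Compose service name.
mongoose.connect('mongodb://mongodb/masa');

var Report = mongoose.model('Report', new mongoose.Schema({
  taskId: String,
  status: Number,
  body: String
}));
var Task = mongoose.model('Task', new mongoose.Schema({ status: String }, { strict: false }));

var app = express();
app.use(bodyParser.json());

// Workers POST their results here once the scheduled HTTP call has been made.
app.post('/reports', function (req, res) {
  Report.create(req.body)
    .then(function () {
      // Mark the original task as processed.
      return Task.findByIdAndUpdate(req.body.taskId, { status: 'processed' }).exec();
    })
    .then(function () { res.sendStatus(201); })
    .catch(function (err) { res.status(500).send(err.message); });
});

app.listen(3000);
```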

The interesting parts of working on this component:

  • Building a real Node.js app, based on rock-solid components like Express and Mongoose
  • Playing with Docker & Docker Compose

Outro

The team really enjoyed working together on the Masa.io toy project! In only two days they were able to ship several applications, each using different technologies but linked together by Docker’s networking features. Micro-services, did you say?

Even though each of them was working on a specific component, they discussed the architecture a lot, and the most interesting way to build each part. Even though we were clearly out of our comfort zones, we managed to get the full API -> MongoDB -> Polling -> RabbitMQ -> Worker -> Reporter workflow running by the end of the hackathon!

Enjoy!

Written by

Eve Vinclair-Berkemeier

People Manager @ KNPLabs

Scrum Mistress - AFOL at home and at work :D Helping hand for client projects and internal organization of our teams at KNP.
