#micro #node #javascript

Record analytics events with a Zeit "micro" + SQLite3 Node.js application

micro-analytics-events - A Node.js microservice to record analytics events to SQLite3 with Zeit’s “micro”

In order to finally switch off Google Analytics, I need to replace my goal conversion tracking somehow.

I’ve got pageview data using Simple Analytics, which also has an events system.

In order to use events, you have to get cookie consent and those banners don’t look nice.

Instead I built micro-analytics-events, a Node.js microservice to record analytics events.

It uses micro because it’s lightweight and easy to get started with. It stores things with SQLite3 because I intend to host it for free on Glitch.

It’s a very simple application: if you send a POST request with an event_type, it records it. If you send a GET request with optional start, end and event_type query params, it returns { "events": [] }, an object with an events key that’s an array of { event_type, date } objects.
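To make that concrete, here’s roughly what a client interaction looks like (the deployment URL and event values are placeholders):

// in a browser or anywhere fetch is available, inside an async function
// record an event
await fetch('https://analytics.example.com', {
  method: 'POST',
  headers: {'Content-Type': 'application/json'},
  body: JSON.stringify({event_type: 'signup'})
});

// read events back
const res = await fetch(
  'https://analytics.example.com?start=2019-07-20&end=2019-07-21&event_type=signup'
);
await res.json();
// -> { events: [ { event_type: 'signup', date: '2019-07-20T10:30:00.000Z' } ] }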


Creating a SQLite3 database with Node.js

We’re using the sqlite3 module.

Our query is very straightforward:

CREATE TABLE IF NOT EXISTS events (
  id INTEGER PRIMARY KEY,
  date STRING NOT NULL,
  event_type STRING NOT NULL
);

If we parse that out, we’ll create an events table that takes a string date and event_type. It will also use SQLite’s default auto-incrementing integer as its primary key, which is fine since we’re not intending to look rows up by id.

SQLite3 and JavaScript Date

The reason we’re using STRING for date is that SQLite3 doesn’t have any built-in Date type.

SQLite does not have a storage class set aside for storing dates and/or times.

Instead, the built-in Date And Time Functions of SQLite are capable of storing dates and times as TEXT, REAL, or INTEGER values.

SQLite3 Docs Date and Time Datatype

We can save our Dates in SQLite3 as:

  • TEXT as ISO8601 strings ("YYYY-MM-DD HH:MM:SS.SSS")
  • REAL as Julian day numbers
  • INTEGER as Unix Time, the number of seconds since 1970-01-01 00:00:00 UTC

In JavaScript land:

  • that ISO string is what JavaScript’s Date#toISOString outputs.
  • “Unix Time” is roughly what JavaScript’s Date#valueOf() outputs (coercing the Date to a Number does the same), except valueOf gives milliseconds since the epoch rather than seconds — see the sketch after this list.
  • No idea about Julian numbers but I don’t intend to use them
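A quick sketch of the ISO string and Unix Time mappings in Node (the date value is just illustrative):

const now = new Date('2019-07-21T10:30:00.000Z');

now.toISOString(); // '2019-07-21T10:30:00.000Z' — the TEXT/ISO form we'll store
now.valueOf(); // 1563705000000 — milliseconds since the epoch
Number(now); // 1563705000000 — coercing to a Number does the same
Math.floor(now.valueOf() / 1000); // 1563705000 — Unix Time in seconds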

Creating a database connection

We create a db.js file which just loads up our database and re-exports it:

const sqlite = require('sqlite3');

const db = new sqlite.Database('.data/main.db');

module.exports = db;

Now if you’ve never seen database connection logic… it’s usually way messier and more async in nature than this.

Since SQLite3 just uses a file, well, there are no callbacks/awaits to deal with.
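That said, if you do want to surface file-open errors, the sqlite3 Database constructor accepts an optional callback; a minimal sketch of db.js with it:

const sqlite = require('sqlite3');

// the callback fires once the database file has been opened (or has failed to open)
const db = new sqlite.Database('.data/main.db', err => {
  if (err) console.error('Failed to open .data/main.db', err);
});

module.exports = db;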

Full setup.js

Our setup.js file looks like the following:

const db = require('./db');

const tables = `CREATE TABLE IF NOT EXISTS events (
  id INTEGER PRIMARY KEY,
  date STRING NOT NULL,
  event_type STRING NOT NULL
);`;
db.run(tables);

This is quite literally the query we defined above, dumped into a template string and passed to db.run.

db.run, as the name suggests, runs the query.

Hooking into npm start

In order to avoid nasty issues with the server starting up with no database created or without the right schema, we’ll hook into the npm start script.

npm start is the usual command that Platform as a Service (PaaS) providers (Heroku, Glitch, now.sh) use to start up Node.js projects.

We’ll use the fact that npm scripts run as follows:

prestart, start, poststart: Run by the npm start command.

npm scripts documentation

So if we want something to run before start we just call it prestart in package.json:

{
  "scripts": {
    "prestart": "node setup.js",
    "//": "other-things"
  }
}

When we run npm start or yarn start, we’ll run node setup.js first, ensuring we have our database set up the way it should be.

Resetting the database

For local development, it can be useful to delete the database and re-bootstrap it.

That’s where the following script comes in:

{
  "scripts": {
    "reset": "rm .data/main.db && node setup.js",
    "//": "other-things"
  }
}

Dev vs production

index.js is the production entrypoint:

const micro = require('micro');
const handler = require('./handler');

const port = process.env.PORT || 3000;

const server = micro(handler);

server.listen(port, () => {
  console.log(`Server running on port ${port}`);
});

We run the application in production mode with node index.js.

handler.js is a micro handler, which is just an exported async function. It can be as simple as:

module.exports = () => 'Hello world';

That means that to run in dev mode, we can use the micro-dev package like so: npx micro-dev handler.js.

We can set the dev script to micro-dev handler.js and the start script to node index.js. Our full package.json looks like this:

{
  "scripts": {
    "dev": "micro-dev handler.js",
    "prestart": "node setup.js",
    "start": "node index.js",
    "reset": "rm .data/main.db && node setup.js"
  },
  "dependencies": {
    "micro": "^9.3.4",
    "sqlite3": "^4.1.0"
  },
  "devDependencies": {
    "micro-dev": "^3.0.0"
  }
}

Handling POST requests

Our handler will switch on req.method:

const {sendError, createError} = require('micro');
module.exports = (req, res) => {
  switch(req.method) {
    case 'POST':
    case 'GET':
    default:
      return sendError(req, res, createError(501, 'Not Implemented'));
  }
}

Parsing request data using micro.json()

Let’s handle POST requests first:

const {json, sendError, createError} = require('micro');

const handlePost = async req => {
  try {
    const data = await json(req);
    const eventType = data.event_type;
    const date = new Date().toISOString();
    console.log(eventType, date);
    return 'logged';
  } catch (error) {
    return 'noop';
  }
};

module.exports = (req, res) => {
  switch(req.method) {
    case 'POST':
      return handlePost(req, res);
    case 'GET':
    default:
      return sendError(req, res, createError(501, 'Not Implemented'));
  }
}

We just leverage micro.json() to load the request body and extract eventType.

If anything goes wrong with the body parsing, we just return a 200 'noop'.

Inserting into SQLite3 with prepared statements

We update handlePost to the following:

const handlePost = async req => {
  try {
    const data = await json(req);
    const eventType = data.event_type;
    const date = new Date().toISOString();
    db.run(`INSERT INTO events (event_type, date) VALUES (?, ?)`, [
      eventType,
      date
    ]);
    return 'logged';
  } catch (error) {
    return 'noop';
  }
};

We’re using a SQLite3 prepared statement to avoid SQL injection:

db.run(`INSERT INTO events (event_type, date) VALUES (?, ?)`, [
  eventType,
  date
]);
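As an aside, node-sqlite3 also supports named placeholders bound from an object (the style we’ll use for the SELECT queries later); an equivalent sketch:

db.run(`INSERT INTO events (event_type, date) VALUES ($event_type, $date)`, {
  $event_type: eventType,
  $date: date
});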

Handling GET requests

Create a new handleGet function in handler.js.

const querystring = require('querystring');
const handleGet = async req => {
  const {url} = req;
  const q = url.split('?');
  const {
    start,
    end,
    event_type
  } = querystring.parse(q.length > 1 ? q[1] : '');
  const events = [];

  return {events};
}

Update the exported handler to call our new handleGet function.

module.exports = (req, res) => {
  switch(req.method) {
    case 'POST':
      return handlePost(req, res);
    case 'GET':
      return handleGet(req, res);
    default:
      return sendError(req, res, createError(501, 'Not Implemented'));
  }
}

The quirks of callback-based APIs in 2019: wrapping SQLite’s db.all

Let’s add a select function to our database module (db.js):

db.select = function(sql, values) {
  return new Promise((resolve, reject) => {
    db.all(sql, values, (err, rows) => {
      if (err) return reject(err);
      return resolve(rows);
    });
  });
};

module.exports = db;

select leverages db.all but promisifies the API (hence the returned Promise).

It supports plain SQL-only queries as well as prepared statements.
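For example, both of these work with the wrapper above (the 'signup' event type is a placeholder):

// inside an async function
// plain SQL-only query
const allEvents = await db.select('SELECT event_type, date FROM events', []);

// prepared statement with named parameters
const signups = await db.select(
  'SELECT event_type, date FROM events WHERE event_type = $event_type',
  {$event_type: 'signup'}
);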

SQLite3 date comparisons with the date/datetime functions

When no event_type is passed as a query param, we’ll use the following prepared statement (which requires a { $start: ISODate, $end: ISODate } values object):

SELECT event_type, date
  FROM events
  WHERE datetime(date) >= datetime($start)
    AND datetime(date) <= datetime($end);

This pretty much checks whether the date of a row, when treated as a datetime, is greater than or equal to start and less than or equal to end.

datetime is a SQLite3 built-in that supports ISO strings.
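For instance (output shown as a comment):

SELECT datetime('2019-07-21T10:30:00.000Z');
-- 2019-07-21 10:30:00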

If event_type is passed, we’ll use the following prepared statement (which requires a { $start: ISODate, $end: ISODate, $event_type: String } values object):

SELECT event_type, date
  FROM events
  WHERE datetime(date) >= datetime($start)
    AND datetime(date) <= datetime($end)
    AND event_type = $event_type;

Update handleGet in handler.js:

const handleGet = async req => {
  const {url} = req;
  const q = url.split('?');
  const {
    start,
    end,
    event_type
  } = querystring.parse(q.length > 1 ? q[1] : '');
  const values = {
    $start: new Date(start).toISOString(),
    $end: new Date(end).toISOString()
  };
  let eventTypeClause = '';
  if (event_type) {
    values.$event_type = event_type;
    eventTypeClause = `AND event_type = $event_type`;
  }

  const events = await db.select(
    `SELECT event_type, date
     FROM events
      WHERE datetime(date) >= datetime($start)
        AND datetime(date) <= datetime($end)
        ${eventTypeClause}`,
    values
  );

  return {events};
}

Clever defaults and overcoming datetime quirks

The GET endpoint only works when you pass it bounds as query parameters.

Let’s change that by defaulting start and end. Update handleGet and add defaultStart and defaultEnd in handler.js:

const defaultStart = () => {
  const d = new Date();
  d.setDate(d.getDate() - 1);
  return d.toISOString();
};

const defaultEnd = () => new Date().toISOString();

const handleGet = async req => {
  const {url} = req;
  const q = url.split('?');
  const {
    start = defaultStart(),
    end = defaultEnd(),
    event_type
  } = querystring.parse(q.length > 1 ? q[1] : '');
  const values = {
    $start: new Date(start).toISOString(),
    $end: new Date(end).toISOString()
  };
  let eventTypeClause = '';
  if (event_type) {
    values.$event_type = event_type;
    eventTypeClause = `AND event_type = $event_type`;
  }

  const events = await db.select(
    `SELECT event_type, date
     FROM events
      WHERE datetime(date) >= datetime($start)
        AND datetime(date) <= datetime($end)
        ${eventTypeClause}`,
    values
  );

  return {events};
}

The default end is now the current date and time, and the default start is the previous day.

On top of these defaults, we want to make sure all of the final day’s events are included. Add the following function:

const processEnd = dateString => {
  const date = new Date(dateString);
  date.setHours(23, 59, 59, 999);
  return date.toISOString();
};
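As a rough illustration of what processEnd does (assuming the process runs with a UTC local timezone, since setHours works in local time):

// with TZ=UTC, local time and UTC agree
processEnd('2019-07-21');
// -> '2019-07-21T23:59:59.999Z', the very last millisecond of that day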

And update handleGet to use it:

const handleGet = async req => {
  const {url} = req;
  const q = url.split('?');
  const {
    start = defaultStart(),
    end = defaultEnd(),
    event_type
  } = querystring.parse(q.length > 1 ? q[1] : '');
  const values = {
    $start: new Date(start).toISOString(),
    $end: processEnd(end)
  };
  let eventTypeClause = '';
  if (event_type) {
    values.$event_type = event_type;
    eventTypeClause = `AND event_type = $event_type`;
  }

  const events = await db.select(
    `SELECT event_type, date
     FROM events
      WHERE datetime(date) >= datetime($start)
        AND datetime(date) <= datetime($end)
        ${eventTypeClause}`,
    values
  );
  return {events};
};

Adding basic API authorization

We don’t want just anyone creating and reading events from our analytics API, so let’s secure it.

We’ll create the micro equivalent of an Express middleware, except in micro it’s just a function that takes a function and returns a (req, res) => {} handler.
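The general shape of the pattern, using a hypothetical withLogging decorator:

// a "middleware" in micro: take a handler, return a decorated handler
const withLogging = fn => (req, res) => {
  console.log(req.method, req.url);
  return fn(req, res);
};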

Create a check-auth.js file which, if API_SECRET is set in the environment, checks that the Authorization header ends with it:

const {sendError, createError} = require('micro');

const {API_SECRET} = process.env;

const checkAuth = fn => (req, res) => {
  if (!API_SECRET) {
    return fn(req, res);
  }

  if (
    req.headers.authorization &&
    req.headers.authorization.endsWith(API_SECRET)
  ) {
    return fn(req, res);
  }

  return sendError(req, res, createError(401, 'Unauthorized'));
};

module.exports = checkAuth;

To consume this, import check-auth.js into handler.js and update the export to be:

const checkAuth = require('./check-auth');

module.exports = checkAuth(async (req, res) => {
  switch (req.method) {
    case 'GET':
      return handleGet(req, res);
    case 'POST':
      return handlePost(req, res);
    default:
      return sendError(req, res, createError(501, 'Not Implemented'));
  }
});
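With API_SECRET set to, say, s3cret (a placeholder), a client now needs to send something like the following (the deployment URL is also a placeholder):

fetch('https://analytics.example.com', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    // checkAuth only requires the header to *end* with API_SECRET,
    // so a Bearer-style header works
    Authorization: 'Bearer s3cret'
  },
  body: JSON.stringify({event_type: 'signup'})
});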

Allowing CORS in a micro app

Another classic middleware scenario is allowing CORS, since we probably want the events we record to be sent from a frontend somewhere.

We’ll read ALLOWED_ORIGIN from the environment and default it to '*' (not the safest default).

We also want to make sure we respond with a 200 when the request is a pre-flight OPTIONS request, and to allow the Authorization header using 'Access-Control-Allow-Headers'.

This is the content of cors.js:

const {ALLOWED_ORIGIN = '*'} = process.env;
const cors = fn => (req, res) => {
  res.setHeader('Access-Control-Allow-Origin', ALLOWED_ORIGIN);
  res.setHeader(
    'Access-Control-Allow-Headers',
    'Origin, X-Requested-With, Content-Type, Accept, Authorization'
  );
  if (req.method === 'OPTIONS') {
    return 'ok';
  }

  return fn(req, res);
};

module.exports = cors;

To consume this, import cors.js into handler.js and update the export to be:

const cors = require('./cors');
module.exports = cors(
  checkAuth(async (req, res) => {
    switch (req.method) {
      case 'GET':
        return handleGet(req, res);
      case 'POST':
        return handlePost(req, res);
      default:
        return sendError(req, res, createError(501, 'Not Implemented'));
    }
  })
);

Adding a /health endpoint

Having an endpoint that 200s on demand can be useful for things like uptime monitoring.

We just want to check that the path is /health and respond with a 200 in that case. Again, this is a case for a middleware, which is just a function that decorates a passed handler.

In handler.js:

const health = fn => (req, res) => {
  if (req.url === '/health') {
    return 'OK';
  }

  return fn(req, res);
};

module.exports = health(
  cors(
    checkAuth(async (req, res) => {
      switch (req.method) {
        case 'GET':
          return handleGet(req, res);
        case 'POST':
          return handlePost(req, res);
        default:
          return sendError(req, res, createError(501, 'Not Implemented'));
      }
    })
  )
);

Deploying to Glitch

This was designed to run on Glitch, hence the use of SQLite3.

You can remix it directly from glitch.com/~interesting-protocol, or deploy from the repository github.com/HugoDF/micro-analytics-events (or your fork) using Glitch’s GitHub import feature.

In Glitch GitHub importer syntax, the repository would be HugoDF/micro-analytics-events.

Further attempted work

To make queries potentially faster on the date field, I attempted to add an index on datetime(date), which is not supported on the version of SQLite that Glitch runs.

The reason the index needs to be on datetime(date) and not date is that we never actually query against date; we always use the datetime SQLite function.
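For reference, such an expression index would look something like the following (the index name is arbitrary); it needs a newer SQLite than Glitch provided at the time:

CREATE INDEX IF NOT EXISTS events_datetime_idx
  ON events (datetime(date));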

A retrospective on previous “micro” art

I still stand by “Simple, but not too simple: how using Zeit’s micro improves your Node applications”.

micro exposes a great function-based composition model that’s easy to work with and predictable. A middleware is just a “handler decorator” that runs the passed handler if relevant.
