read24 - Server: MySQL Database Connector

July 10, 2020

Roger Ngo

The limitation that we have within the system is that all data is currently stored in-memory. There is no mechanism to regularly flush data to disk, and doing so often enough to keep the data store fresh would be impractical. If the server application terminates unexpectedly, then all data which had not been saved to the mockDb.json file would be lost.

A better solution is to store the data in a persistent store, such as a SQL database. Better still, I can create an abstraction layer that hides the details of which datastore the server is accessing.

This way, the application does not depend on the type of store it is accessing, as long as the store implements a specific interface which the abstraction layer can call.

For now, I will just be switching between two types of database storage modes:

  1. In-memory - This is the type the application is using now.
  2. MySQL - For when data must be persisted.

To begin, install the necessary npm dependencies: mysql and @types/mysql.

npm install mysql --save
npm install @types/mysql --save-dev

Then it is time for some heavy-duty refactoring. The need has arisen that I must get fancier!

In types.ts, I will redefine DataRow to be DataType to consist of the minimal required properties of a data row:

export interface DataType {
    id: number;
    dateCreated?: number;
    dateUpdated?: number;
}

Then, DataRow is a new TypeScript intersection type that has all the properties of DataType, BookType, ClassroomType, etc.

export type DataRow = DataType & 
    QuizTokenType & 
    BookType & 
    ClassroomType & 
    QuestionType & 
    QuizQuestionType & 
    RatingType & 
    StudentType & 
    StudentAnswerType & 
    UserType;
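To see why the connector methods end up working with Partial&lt;DataRow&gt;, here is a self-contained sketch using pared-down, hypothetical versions of a few of the interfaces (the real ones live in types.ts):

```typescript
// Pared-down stand-ins for the real interfaces in types.ts.
interface DataType { id: number; dateCreated?: number; dateUpdated?: number; }
interface BookType { title: string; author: string; }
interface StudentType { firstName: string; lastName: string; }

// The intersection carries every property of every member...
type DataRow = DataType & BookType & StudentType;

// ...so a row from any single table can only satisfy Partial<DataRow>,
// since it has no values for the other tables' columns.
const bookRow: Partial<DataRow> = { id: 1, title: 'Hatchet', author: 'Gary Paulsen' };
```

This is why Partial&lt;DataRow&gt; shows up throughout the connector signatures below.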

Spoiler alert: BaseResource will need to be moved and refactored, but I will come back to that.

Now, the approach is to have a common database connector which the in-memory, MySQL, NoSQL, etc. data stores should implement, so that database operations are abstracted away and the server need not be concerned about which database engine is running under the hood.

Under db, create a new file db_connector.ts. For now, it will have the limited set of function definitions I have used frequently thus far.

import { DataRow } from "./types";

export abstract class DbConnector {
    public abstract insert(tableName: string, data: Partial<DataRow>): Promise<number>;
    public abstract update(tableName: string, data: Partial<DataRow>): Promise<number>;
    public abstract find(tableName: string, id: number): Promise<Partial<DataRow>>;
    public abstract select(tableName: string, whereFunc?: (o: Partial<DataRow>) => boolean): Promise<Partial<DataRow>[]>;
}

Notice that the return types are now Promise objects. With MySQL, or any network-based database operation, we do not want to block the Node.js event loop while waiting on the database. The methods are therefore async, allowing callers to proceed with other logic safely while results are processed by the database.

Before the MySQL connector is implemented, I will need to refactor _MemoryDb to extend DbConnector and properly implement the abstract base methods.

The changes consist mostly of converting the methods in _MemoryDb to return a resolved Promise. For example, the insert method will return a resolved Promise of 1, indicating one row inserted.

public insert(tableName: string, data: Partial<DataRow>): Promise<number> {
    let maxId = 0;
    const table = db.data[tableName];

    for (const row of table) {
        if (row.id > maxId)
            maxId = row.id;
    }

    maxId++;

    data.id = maxId;

    db.data[tableName].push(data);

    return Promise.resolve(1);
}

Similarly, for find, a resolved Promise with the DataRow to be returned:

public find(tableName: string, id: number): Promise<Partial<DataRow>> {
    return Promise.resolve(db.data[tableName].find((o: DataRow) => o.id === id));
}
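The update and select methods follow the same pattern. As a self-contained sketch (with a stand-in db object; the real one lives in memory.ts), select might look something like this:

```typescript
type DataRow = { id: number; [key: string]: unknown };

// Stand-in for the in-memory store from memory.ts.
const db: { data: Record<string, DataRow[]> } = {
    data: { books: [{ id: 1, title: 'Hatchet' }, { id: 2, title: 'Holes' }] }
};

function select(tableName: string, whereFunc?: (o: Partial<DataRow>) => boolean): Promise<Partial<DataRow>[]> {
    const rows = db.data[tableName];
    // No predicate means the whole table; otherwise filter in memory.
    return Promise.resolve(whereFunc ? rows.filter(whereFunc) : rows);
}
```

Callers then await select('books', o => o.title === 'Holes') just like the other methods.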

Now that that is done, I can implement a MySQL connector called _MySqlDb which implements the same DbConnector.

First, create a connection to the database. Right now, since I am still in the development stage, I will just leave the credentials as-is. But once I am serious, they must be moved somewhere safer.

const connection = mysql.createConnection({
    host: 'localhost',
    user: 'read24_dev',
    password: 'password',
    database: 'read24_dev',
    insecureAuth: true
});

Then start the connection:

connection.connect();

Also, provide a hook on SIGINT to safely terminate the connection:

process.on("SIGINT", () => {
    console.log('terminating sql connection');
    connection.end((err) => 
        err
            ? process.exit(1) 
            : process.exit());
});

Using the documentation found on the npm page, I query the database by building a list of columns and values, and constructing the query manually. Once that is done, connection.query is called to perform the SQL query.

For example, here is the insert method:

public insert(tableName: string, data: Partial<DataRow>): Promise<number> {
    const list = this._buildList(data);
    const values = [];
    for(const col in data) {
        values.push(data[col.toString()]);
    }

    const formatted = mysql.format(`INSERT INTO ${TableMapping[tableName]} SET ${list}`, values);

    const promise = new Promise<number>((resolve, reject) => {
        const query = connection.query(formatted, (err, results) => {
            if (err)
                return reject(err);

            console.log(query.sql);
            console.log('INSERT results', results);

            return resolve(1);
        });
    });

    return promise;
}
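The find method wraps connection.query the same way, using a parameterized SELECT. Here is a sketch; the connection is stubbed so the snippet runs standalone, but the real code would use the mysql connection created earlier:

```typescript
type DataRow = { id: number; [key: string]: unknown };

// Stub standing in for the real mysql connection so this runs standalone.
const connection = {
    query(_sql: string, values: unknown[], cb: (err: Error | null, results: DataRow[]) => void) {
        // Pretend the database returned one matching row.
        cb(null, [{ id: values[0] as number, title: 'Hatchet' }]);
    }
};

const TableMapping: Record<string, string> = { books: 'books' };

function find(tableName: string, id: number): Promise<Partial<DataRow>> {
    return new Promise((resolve, reject) => {
        // The `?` placeholder lets the driver escape the value for us.
        connection.query(
            `SELECT * FROM ${TableMapping[tableName]} WHERE id = ? LIMIT 1`,
            [id],
            (err, results) => err ? reject(err) : resolve(results[0]));
    });
}
```

The real mysql driver does support this connection.query(sql, values, callback) overload, so only the stub above is hypothetical.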

Just like the insert method in _MemoryDb, a Promise&lt;number&gt; is returned. If you are wondering what the TableMapping object is, it is just a mapping from the typical camel-case in-memory table names to the table names I plan to use for the SQL database.

const TableMapping = {
    books: 'books',
    choices: 'choices',
    classrooms: 'classrooms',
    questions: 'questions',
    quizQuestions: 'quiz_questions',
    quizTokens: 'quiz_tokens',
    ratings: 'ratings',
    students: 'students',
    studentAnswers: 'student_answers',
    teachers: 'teachers',
    teacherClassrooms: 'teacher_classrooms'
};

I am now able to instantiate a generic DatabaseConnector which can be used to access either the MySQL or in-memory data store. Under the db folder, create connector.ts to instantiate the DatabaseConnector depending on the environment being used.

import { MemoryDb } from "./memory";
import { DbConnector } from "./db_connector";
import { MySqlDb } from "./sql";

export let DatabaseConnector: DbConnector = null;

if (process.env.DATA_SOURCE === 'mysql') {
    DatabaseConnector = MySqlDb;
    console.log('Using the mysql database');
}
else {
    DatabaseConnector = MemoryDb;
    console.log('Using the in-memory database');
}

Notice that the DATA_SOURCE environment variable must be set. I will circle back to this at a later time to make setting these variables easier. But for now, running export DATA_SOURCE=mysql in the terminal before npm run start is fine.

Remember BaseResource? Let's pull that out of types.ts into its own file under db called base_resource.ts.

Mostly everything is the same, except that a new utility method is introduced to convert a resource instance into a DataRow for writing to the database. It is called serializeForDb:

public serializeForDb(): Partial<DataRow> {
    const result = {};
    for (const prop in this) {
        if (typeof this[prop] === 'function' || prop === '_tableName')
            continue;

        result[prop.toString()] = this[prop];
    }

    return result;
}
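To illustrate what gets filtered out, here is a toy resource (not the real BaseResource) running the same loop:

```typescript
class FakeBook {
    private _tableName = 'books';   // excluded by name
    public id = 1;
    public title = 'Hatchet';
    public helper = () => 42;       // excluded because it is a function

    public serializeForDb(): Record<string, unknown> {
        const result: Record<string, unknown> = {};
        for (const prop in this) {
            if (typeof this[prop] === 'function' || prop === '_tableName')
                continue;

            result[prop.toString()] = this[prop];
        }
        return result;
    }
}
```

new FakeBook().serializeForDb() yields only { id: 1, title: 'Hatchet' }, which is exactly the shape the connectors expect.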

I can then refactor insert and update to use this method so that less code is repeated. For example, using the new DatabaseConnector, the insert method becomes:

public async insert() {
    if (this.id > 0)
        throw new Error(`Cannot insert this row because it already has an ID value: ${this.id}.`);

    this.dateCreated = Date.now();

    const toSave = this.serializeForDb();

    await DatabaseConnector.insert(this._tableName, toSave);

    return this;
}
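update gets the same treatment. A sketch of how it might look, with the connector stubbed so the snippet stands alone (the guard condition is the inverse of the one in insert):

```typescript
// Stub in place of the real DatabaseConnector from connector.ts.
const DatabaseConnector = {
    update: (_tableName: string, _data: Record<string, unknown>) => Promise.resolve(1)
};

class Resource {
    public id = 0;
    public dateUpdated = 0;
    private _tableName = 'books';

    public serializeForDb(): Record<string, unknown> {
        return { id: this.id, dateUpdated: this.dateUpdated };
    }

    public async update() {
        // Inverse of the insert guard: only a row that already exists can be updated.
        if (this.id === 0)
            throw new Error('Cannot update this row because it has no ID value yet.');

        this.dateUpdated = Date.now();
        await DatabaseConnector.update(this._tableName, this.serializeForDb());
        return this;
    }
}
```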

Notice that the method is now async because the DatabaseConnector.insert operation is asynchronous.

The other big change I must call out in this class is the load method. It is now asynchronous since it makes a call to DatabaseConnector.find. This is a big change, as the old pattern of loading a resource by ID:

new ResourceClass().load(id)

Now becomes:

await new ResourceClass().load(id)
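Under the hood, a load along these lines awaits DatabaseConnector.find and copies the returned columns onto the instance. A sketch, with the connector stubbed so it runs standalone:

```typescript
// Stub in place of the real DatabaseConnector from connector.ts.
const DatabaseConnector = {
    find: (_tableName: string, id: number) =>
        Promise.resolve({ id, title: 'Hatchet' } as Record<string, unknown>)
};

class Resource {
    public id = 0;
    public title = '';
    private _tableName = 'books';

    public async load(id: number) {
        const row = await DatabaseConnector.find(this._tableName, id);
        // Copy each fetched column onto this instance, then return it
        // so calls can be chained: await new Resource().load(3)
        Object.assign(this, row);
        return this;
    }
}
```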

I will have to make many changes in the resource classes and the routes themselves to accommodate this change. For example, within QuizToken, the findByToken method becomes a static async method because it now uses the async DatabaseConnector.select method, along with BaseResource.load.

public static async findByToken(token: string) {
    const id = (await DatabaseConnector.select('quizTokens', (o: QuizTokenType) => o.token === token))[0].id;

    return await new QuizToken().load(id);
}

Now, the /quiz/book/question endpoint can be modified to be an async route with the necessary changes:

app.post('/quiz/book/question', async (req, res) => {
    const quizToken = req.body.quizToken;
    const choiceId = req.body.choiceId;

    if (!quizToken || !choiceId)
        return res.status(400).json({message: 'Must provide both the quiz token, and choice.'});

    const qt = await QuizToken.findByToken(quizToken);
    const c = await new Choice().load(choiceId);

    const studentAnswer = new StudentAnswer({
        quizToken: qt.token,
        studentId: qt.studentId,
        questionId: c.questionId,
        choiceId: c.id
    });
    await studentAnswer.insert();

    return res.status(200).json(studentAnswer.json());
});

Most of the other routes and resource classes will need the same type of changes (as seen with await new Choice().load(choiceId) above).

Finally, before testing everything, create all the tables for the SQL database and seed them with the same data found in mockDb.json. I won't dump all of that here, but it can be found under scripts in the code repository.

After testing, I can run a student report that makes all of its accesses against the SQL database:

{
    "student": {
        "id": 1,
        "classroomId": 1,
        "firstName": "Stu",
        "middleName": null,
        "lastName": "Dent",
        "grade": 5,
        "userId": 1
    },
    "data": [
        {
            "passed": false,
            "book": {
                "title": "Where the Red Fern Grows",
                "author": "Wilson Rawls",
                "wordCount": 75528,
                "lexile": 700
            },
            "quiz": {
                "token": "92e07760-3d29-455b-b5b0-3d118da3f634",
                "dateCreated": 1594384513233,
                "totalQuestions": 10,
                "totalCorrect": 5
            }
        },
        {
            "passed": true,
            "book": {
                "title": "Where the Red Fern Grows",
                "author": "Wilson Rawls",
                "wordCount": 75528,
                "lexile": 700
            },
            "quiz": {
                "token": "4610a426-1c93-445e-907f-9900369f9ee2",
                "dateCreated": 1594384681736,
                "totalQuestions": 10,
                "totalCorrect": 10
            }
        }
    ]
}

Next time, I will create administration type routes. This will make it easier to create data. We are getting closer to front-end development!