Application Development

Files Up and Files Down

Use node-oracledb to support uploading and downloading files from your REST APIs.

By Dan McGhan

July/August 2019

Storing images, documents, and other files in Oracle Database is nothing new. But it wasn’t until support for binding and fetching buffers was added to node-oracledb that a simple solution for this emerged in the Node.js world. In this article, I’ll show you how to use Node.js to upload files from a client into the database—and then download them back out.

Why Store Files in the Database?

You might be wondering why you should use the database to store files (unstructured data) when you could just use the file system. This is a great question. Here are a few important reasons:

To better relate unstructured data to structured data. For example, you can associate images of receipts with detail lines in an expense reporting application.

To simplify backup/recovery. It often makes more sense to reuse your existing database backup/recovery strategies than to invent new ones for the file system.

To leverage advanced database security features. From end-to-end encryption to advanced access controls, Oracle Database has a lot to offer when it comes to security.

To index files and make them searchable. Many document types can be made searchable with “Google-like” search results via Oracle Text.

As you can see, there are some good use cases for storing files in the database, so let’s get started!

Setup and API Requirements

In this article, I assume you’re running everything from within the Database App Development VM with Node.js and Git installed. (See this post for instructions on setting up this type of environment.) You may use any other setup you wish, but you’ll need to adapt the instructions that follow accordingly.

I’ll pick up the code where I left off in the series on building a REST API with Node.js. To get access to the code for that series on GitHub, run the following lines of code from a terminal:

cd ~
git clone https://github.com/oracle/oracle-db-examples.git
cd oracle-db-examples/javascript/rest-api/part-5-manual-pagination-sorting-and-filtering/hr_app

This will clone the oracle-db-examples repo from GitHub and then change directories to the code for the last article of that REST API series. This will be your starting point for building out the API for uploading and downloading files with Node.js.

Run the following lines of code in a tool such as Oracle SQL Developer to create the table in the HR schema that will be used to store uploaded files:

create table jsao_files (
  id           number generated always as identity not null,
  file_name    varchar2(255) not null,
  content_type varchar2(255) not null,
  blob_data    blob,
  constraint jsao_files_pk primary key (id)
);

With the table created, you’re now ready to start building out the API. To demonstrate a few capabilities beyond simply uploading and downloading files, I’ll define a couple business rules in the API. For uploads, the maximum size of the file should be 50 MB. For downloads, browsers should be able to download the file with the name saved when the file was uploaded. Consider adding more business rules once you have the API working correctly.

Routing Logic

The first change you will make to the Node.js API is to add another route and related handlers to the router. Open the services/router.js file, and add the following line below the existing variable declarations (line 4):

const files = require('../controllers/files.js');

The files.js controller file doesn't exist yet. You'll create it in a subsequent step, but keep in mind that your application will not start until the file exists.

Next, add the following lines below the existing call to router.route:

router.route('/files/:id?')
  .get(files.get)
  .post(files.post);

The new route being defined is /files, with an optional id parameter in the path. HTTP get and post requests will be routed to the respective get and post functions from the files controller module.
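The `?` in `:id?` makes the path parameter optional, so the same route matches both `/files` and `/files/42`. A rough illustration of that matching with a plain regular expression (a simplification of what Express actually compiles from the route string):

```javascript
// Simplified stand-in for Express's compiled pattern for '/files/:id?' —
// the trailing optional group captures the id segment when it's present.
const filesRoute = /^\/files(?:\/([^/]+))?\/?$/;

console.log(filesRoute.exec('/files'));    // matches; id group is undefined
console.log(filesRoute.exec('/files/42')); // matches; id group is '42'
console.log(filesRoute.exec('/nope'));     // null (no match)
```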

Controller Logic for Uploads

Create a new file in the controllers directory named files.js. Open the file, and add the following lines of code (and don’t forget to save your changes afterward):

const files = require('../db_apis/files.js');

async function post(req, res, next) {
  try {
    const maxFileSize = 1024 * 1024 * 50; // 50 MB
    let contentBuffer = [];
    let totalBytesInBuffer = 0;
    let contentType = req.headers['content-type'] || 'application/octet-stream';
    let fileName = req.headers['x-file-name'];

    if (!fileName) {
      res.status(400).json({error: `The file name must be passed in via the x-file-name header`});
      return;
    }

    req.on('data', chunk => {
      contentBuffer.push(chunk);
      totalBytesInBuffer += chunk.length;

      if (totalBytesInBuffer > maxFileSize) {
        req.pause();

        res.header('Connection', 'close');
        res.status(413).json({error: `The file size exceeded the limit of ${maxFileSize} bytes`});

        req.connection.destroy();
      }
    });

    req.on('end', async function() {
      contentBuffer = Buffer.concat(contentBuffer, totalBytesInBuffer);
      
      try {
        const fileId = await files.create(fileName, contentType, contentBuffer);

        res.status(201).json({fileId: fileId});
      } catch (err) {
        console.log(err);

        res.header('Connection', 'close');
        res.status(500).json({error: 'Oops, something broke!'});

        req.connection.destroy();
      }
    });
  } catch (err) {
    next(err);
  }
}

module.exports.post = post;

The controller logic above starts by declaring and initializing some variables. Note that contentBuffer is an array in which data will be buffered during a file upload. Also, the code expects the file’s name and content type to be passed in via HTTP headers, because the HTTP body is reserved for the file content.

An event listener is added to the request’s data event. This event will be triggered one or more times as data is transferred from the client to the server. The callback function takes the data passed in (a Buffer by default) and pushes it into the contentBuffer array. The rest of the logic in the function is used to track the number of bytes in the buffer. If the maximum file size (maxFileSize) of 50 MB is exceeded, an error message will be sent to the client and the connection will be destroyed.

Next an event listener is added to the end event of the request. This event will fire after all the data events, when there is no more data to read. The callback function takes the contentBuffer array and converts it into a scalar Buffer object. The buffer, filename, and content type are then passed to the files database API to insert the file into the table. If successful, a “201 Created” status code and file ID are sent back to the client in the response.
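The accumulate-then-concatenate pattern used by the two listeners can be sketched in isolation (the string chunks here are hypothetical stand-ins for what the data event delivers):

```javascript
// Simulate the data events: push each chunk and track the byte count.
const chunks = [];
let totalBytes = 0;

for (const piece of ['Hello, ', 'world!']) {
  const chunk = Buffer.from(piece); // data events deliver Buffers by default
  chunks.push(chunk);
  totalBytes += chunk.length;
}

// Simulate the end event: collapse the array into a single Buffer.
// Passing the total length lets Node allocate the result in one step.
const content = Buffer.concat(chunks, totalBytes);

console.log(content.toString()); // 'Hello, world!'
```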

With the controller logic in place, you can now add the related database logic.

Database Logic for Uploads

Create a new file named files.js in the db_apis directory. Open the file, and add the following lines of code:

const oracledb = require('oracledb');
const database = require('../services/database.js');

const createSql =
 `insert into jsao_files (
    file_name,
    content_type,
    blob_data
  ) values (
    :file_name,
    :content_type,
    :content_buffer
  ) returning id into :id`;

async function create(fileName, contentType, contentBuffer) {
  const binds = {
    file_name: fileName,
    content_type: contentType,
    content_buffer: contentBuffer,
    id: {
      type: oracledb.NUMBER,
      dir: oracledb.BIND_OUT
    }
  };
  
  const result = await database.simpleExecute(createSql, binds);
  
  return result.outBinds.id[0];
}

module.exports.create = create;

This module starts by bringing in node-oracledb (to access some constants) and the custom database module (for the simpleExecute method). The SQL insert statement assigned to createSql is very simple—and that’s the beauty of buffering!

In the create function, all you need to do is create a binds object and execute the SQL. Note that the content_buffer bind variable is just a simple “in bind” that takes the entire file contents as a buffer. The id bind variable is defined as an “out bind,” and its value is returned to the controller logic after the insert is complete.
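The indexing at the end of create deserves a note: when simpleExecute passes the node-oracledb result through unchanged, each out bind in result.outBinds comes back as an array, because a DML statement with RETURNING ... INTO can affect multiple rows. A minimal sketch of that shape (the id value here is made up):

```javascript
// Hypothetical result object mirroring what execute() returns for the
// insert above: each out bind is an array, one element per affected row.
const result = { outBinds: { id: [47] } };

// A single-row insert affects one row, so the new key is at index 0.
const fileId = result.outBinds.id[0];

console.log(fileId); // 47
```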

Controller Logic for Downloads

Now that the logic for uploading files is in place, it’s time to add the logic for downloading files. Return to the controllers/files.js file, and append the following lines of code to the bottom of the file:

async function get(req, res, next) {
  try {
    const id = parseInt(req.params.id, 10);

    if (isNaN(id)) {
      res.status(400).json({error: `The id of the file must be provided`});
      return;
    }

    const rows = await files.get(id);

    if (rows.length === 1) {
      res.status(200);

      res.set({
        'Cache-Control': 'no-cache',
        'Content-Type': rows[0].content_type,
        'Content-Length': rows[0].file_length,
        'Content-Disposition': 'attachment; filename="' + rows[0].file_name + '"'
      });

      res.send(rows[0].blob_data);
    } else {
      res.status(404).end();
    }
  } catch (err) {
    next(err);
  }
}
module.exports.get = get;

The get function starts by parsing the ID of the file to be retrieved. If it’s not parsed correctly, an error message will be returned to the client; otherwise the ID is passed to the get function of the files database API.

Next the length of the array returned from the get call is checked. If the array has exactly one row, a “200 OK” status message will be sent to the client. Various headers are added to the response to satisfy the requirement of working with browsers. Finally, the actual BLOB fetched from the database is sent to the client.
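One caveat on the Content-Disposition header: per RFC 6266, file names containing spaces or special characters should be quoted, and non-ASCII names need the filename* extended syntax. A hypothetical helper (not part of the article's code) that handles the simple quoted case might look like this:

```javascript
// Build a Content-Disposition value for ASCII file names. Names with
// non-ASCII characters would need RFC 5987 percent-encoding via the
// filename* parameter, which this sketch does not attempt.
function contentDisposition(fileName) {
  // Escape embedded double quotes, then wrap the name in quotes.
  const safeName = fileName.replace(/"/g, '\\"');
  return `attachment; filename="${safeName}"`;
}

console.log(contentDisposition('annual report.pdf'));
// attachment; filename="annual report.pdf"
```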

Database Logic for Downloads

The last part of the code needed for the API to work is the get function in the files database API. Open the db_apis/files.js file again, and add the following lines of code to the bottom of the file:

const getSql =
 `select file_name "file_name",
    dbms_lob.getlength(blob_data) "file_length",
    content_type "content_type",
    blob_data "blob_data"
  from jsao_files
  where id = :id`;

async function get(id) {
  const binds = {
    id: id
  };
  const opts = {
    fetchInfo: {
      blob_data: {
        type: oracledb.BUFFER
      }
    }
  };
  const result = await database.simpleExecute(getSql, binds, opts);
  return result.rows;
}
module.exports.get = get;

As was the case with the insert statement, the SQL query assigned to getSql is very simple. All that was necessary to get it to work in the get function was to define the binds and opts objects. The only bind variable was an “in bind” for the ID. As to the options, I’m using the fetchInfo option to change the return type for the blob_data column to be a buffer (a stream object is returned for BLOBs by default).

Testing the API

With all the work completed, it’s time to try the new additions to the API! Run the following commands in a terminal:

cd ~/oracle-db-examples/javascript/rest-api/part-5-manual-pagination-sorting-and-filtering/hr_app
npm install
node .

That should start the Node.js application.

If you get an “ORA-01017: invalid username/password;” error message, it indicates that the environment variables referenced in the config/database.js file have not been set or are incorrectly set. Run the following lines of code in a terminal, and then try to start the app again:

echo "export HR_USER=hr" >> ~/.bashrc
echo "export HR_PASSWORD=oracle" >> ~/.bashrc
echo "export HR_CONNECTIONSTRING=0.0.0.0/orcl" >> ~/.bashrc
source ~/.bashrc

To test an upload, open a new terminal window (while the API is running in the previous terminal) and run the following lines of code:

cd ~
curl -X "POST" "http://localhost:3000/api/files" \
     -H 'x-file-name: runTimeClickHere.png' \
     -H 'Content-Type: image/png' \
     --data-binary '@runTimeClickHere.png'

The curl command issues a POST request to the new API endpoint and sends an image file that was in the Oracle user’s home directory as the body of the request. This simulates the type of HTTP request a browser would issue when using XMLHttpRequest.send() with a file reference. You should receive a response such as {"fileId":1}, which indicates that the file was successfully inserted and its ID value is 1.
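For reference, the browser-side equivalent of that curl command can be sketched as a function that builds fetch options from a File object (the endpoint and header names match the API built above; actually sending the request is shown only as a comment, since it requires a browser and a running server):

```javascript
// Build the upload request a browser would send. The `file` argument
// stands in for a File object from an <input type="file"> element.
function buildUploadRequest(file) {
  return {
    url: 'http://localhost:3000/api/files',
    options: {
      method: 'POST',
      headers: {
        'Content-Type': file.type,
        'x-file-name': file.name,
      },
      body: file, // fetch sends the file's bytes as the request body
    },
  };
}

// In a browser: const req = buildUploadRequest(file);
//               fetch(req.url, req.options).then(res => res.json());
```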

Next you can test-download the file by putting the previously returned ID value at the end of the URL path. Run the following command in a terminal:

curl "http://localhost:3000/api/files/1" --output my-file.png

That curl command will take the contents of the HTTP response body and write them to a file named my-file.png. The new file is a binary copy of the original file.

Although curl doesn’t pay attention to the response headers by default, you can confirm that they are working by opening the Firefox web browser and navigating to http://localhost:3000/api/files/1. When prompted, select Save File and then click OK. Finally, use the file explorer to navigate to your downloads directory, where you’ll find a file that has the same name as the original (as shown in Figure 1). If you can see the image, then both uploads and downloads are working correctly.


Figure 1: Checking the downloads directory

Hopefully you’ll agree that the node-oracledb driver’s ability to bind and fetch buffers can make quick work of adding file upload and download capabilities to your Node.js APIs. If you’d like to learn more about uploading and downloading files with Node.js and Oracle Database, including how to work with streams and use a sample client app, visit this series at the JavaScript and Oracle blog.

Next Steps

LEARN more about JavaScript and Oracle.

TRY Oracle Cloud.

Illustration by Wes Rowell