Streamlining Large File Uploads with Chunk Uploads in Node.js and Express.js

22 FEB 2024

In today’s digital realm, efficiently managing large file uploads is a common necessity for many web applications. Whether it’s media files, documents, or other sizable data, conventional upload methods often fall short, particularly when dealing with sluggish or unstable network connections. To overcome these challenges, developers frequently turn to chunked file uploads, a technique that divides large files into smaller, more manageable segments for smoother and more reliable transfers. In this article, we’ll delve into the implementation of chunked file uploads in Node.js and Express.js to enhance both developer workflows and user experiences.

Understanding Chunked File Uploads

Chunked file uploads involve breaking down a large file into smaller “chunks” or segments, which are then uploaded to the server individually. This approach offers several benefits:

  1. Resumable Uploads: In the event of an upload failure, only the affected chunk needs to be re-uploaded, rather than the entire file, reducing bandwidth consumption and improving upload reliability.
  2. Lower Memory Footprint: Processing smaller chunks consumes less memory on both the client and server sides, making it ideal for handling large files without overwhelming system resources.
  3. Real-time Progress Tracking: Developers can provide users with real-time feedback on upload progress, enhancing the overall user experience during lengthy uploads.
  4. Enhanced Reliability: Chunked uploads are less susceptible to timeouts and network failures, as smaller segments are easier to transmit and manage.
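As a concrete illustration of the splitting step, here is a minimal sketch (the function name is my own, not from any library) that computes the byte ranges the chunks would cover; the last chunk is simply whatever remains:

```javascript
// Compute [start, end) byte ranges for splitting a file of fileSize bytes
// into chunks of at most chunkSize bytes each.
function chunkRanges(fileSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
}
```

For example, a 2,500-byte file with 1,000-byte chunks yields three ranges, the last one only 500 bytes long.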

Implementation in Node.js and Express.js

Let’s explore how to implement chunked file uploads using Node.js and Express.js.

1. Project Setup

Start by initializing a new Node.js project and installing the necessary dependencies, including Express.js and any additional libraries required for file handling.

mkdir chunk-upload-example
cd chunk-upload-example
npm init -y
npm install express multer

2. Setting Up the Server

Create a new JavaScript file (e.g., server.js) and set up a basic Express server to handle file uploads.


const express = require('express');
const multer = require('multer');
const app = express();

const PORT = process.env.PORT || 3000;
app.use(express.static('public'));
app.use(express.json());
app.use(express.urlencoded({ extended: true }));
const storage = multer.diskStorage({
  destination: function (req, file, cb) {
    cb(null, './uploads');
  },
  filename: function (req, file, cb) {
    cb(null, file.originalname);
  }
});

const upload = multer({ storage: storage });
app.post('/upload', upload.single('file'), (req, res) => {
  // Handle single file upload
  res.send('File uploaded successfully');
});

app.listen(PORT, () => {
  console.log(`Server is running on port ${PORT}`);
});

3. Implementing Chunked Uploads

Modify the file upload endpoint to handle multipart/form-data requests containing file chunks. Utilize a library like multer to handle file uploads and reassemble the chunks on the server.

const fs = require('fs');

const CHUNKS_DIR = './chunks';
app.post('/upload/chunk', upload.single('file'), (req, res) => {
  const { file, body: { totalChunks, currentChunk } } = req;
  const chunkFilename = `${file.originalname}.${currentChunk}`;
  const chunkPath = `${CHUNKS_DIR}/${chunkFilename}`;
  // Move the uploaded chunk from multer's destination into the chunks directory
  fs.rename(file.path, chunkPath, (err) => {
    if (err) {
      console.error('Error moving chunk file:', err);
      res.status(500).send('Error uploading chunk');
    } else if (+currentChunk === +totalChunks) {
      // All chunks have been uploaded; assemble them into a single file
      assembleChunks(file.originalname, totalChunks)
        .then(() => res.send('File uploaded successfully'))
        .catch((err) => {
          console.error('Error assembling chunks:', err);
          res.status(500).send('Error assembling chunks');
        });
    } else {
      res.send('Chunk uploaded successfully');
    }
  });
});

async function assembleChunks(filename, totalChunks) {
  const writer = fs.createWriteStream(`./uploads/${filename}`);
  for (let i = 1; i <= totalChunks; i++) {
    const chunkPath = `${CHUNKS_DIR}/${filename}.${i}`;
    // Append this chunk to the final file without closing the write stream,
    // so the next iteration can keep writing to it
    await new Promise((resolve, reject) => {
      const reader = fs.createReadStream(chunkPath);
      reader.on('error', reject);
      reader.on('end', resolve);
      reader.pipe(writer, { end: false });
    });
    // Remove the chunk once its contents have been appended
    await fs.promises.unlink(chunkPath);
  }
  writer.end();
}

4. Client-Side Implementation

Implement client-side logic to split the file into chunks with File.slice() and send them sequentially with the Fetch API. Note that fetch does not expose upload progress for an individual request, so progress feedback is best reported at chunk granularity as each chunk completes.

const fileInput = document.getElementById('file-input');
fileInput.addEventListener('change', async (event) => {
  const file = event.target.files[0];
  const chunkSize = 1024 * 1024; // 1 MB per chunk
  const totalChunks = Math.ceil(file.size / chunkSize);
  let startByte = 0;
  for (let i = 1; i <= totalChunks; i++) {
    const endByte = Math.min(startByte + chunkSize, file.size);
    const chunk = file.slice(startByte, endByte);
    await uploadChunk(chunk, file.name, totalChunks, i);
    console.log(`Uploaded chunk ${i} of ${totalChunks}`);
    startByte = endByte;
  }
  console.log('Upload complete');
});

async function uploadChunk(chunk, filename, totalChunks, currentChunk) {
  const formData = new FormData();
  // Pass the original filename explicitly; otherwise the Blob is sent as
  // "blob" and the server cannot group chunks by file name
  formData.append('file', chunk, filename);
  formData.append('totalChunks', totalChunks);
  formData.append('currentChunk', currentChunk);
  const response = await fetch('/upload/chunk', {
    method: 'POST',
    body: formData
  });
  if (!response.ok) {
    throw new Error('Chunk upload failed');
  }
}
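Because the loop above awaits each chunk in turn, driving a progress bar reduces to simple arithmetic over completed chunks. A minimal sketch (the function name is illustrative):

```javascript
// Percentage of the upload completed after a given number of chunks,
// suitable for driving a <progress> element or status text.
function uploadProgress(completedChunks, totalChunks) {
  return Math.round((completedChunks / totalChunks) * 100);
}
```

Calling it with the loop counter after each successful uploadChunk gives the user steadily increasing feedback even for multi-gigabyte files.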

Conclusion

Implementing chunked file uploads in Node.js and Express.js can significantly improve the reliability and performance of large file transfers in web applications. By dividing files into smaller segments and uploading them asynchronously, developers can ensure smoother uploads and a more seamless user experience overall. With the detailed guide provided in this article, integrating chunked file uploads into your Node.js and Express.js projects becomes a straightforward task, empowering your applications to handle large file uploads with ease and efficiency.

Happy coding!

Do look out for these other articles:

  1. What Is Microservices? — Learn All About Microservice Architecture
  2. 10 microservices design patterns for better architecture