How I Automated MongoDB Backups with Node.js
Automate MongoDB Backups with Node.js: A Step-by-Step Guide
I remember the day when one of my projects lost critical data due to an accidental deletion. It was a painful lesson, but it pushed me to automate my MongoDB backups. Let me walk you through how I set up an automated backup system using Node.js—something that has saved me from countless potential disasters.
Why Automate Backups?
Initially, I took manual backups by running the mongodump command whenever I remembered. The problem? I didn't always remember. Automating backups ensures your data is saved regularly without manual intervention, protecting you from accidental deletions, data corruption, or hardware failures.
Setting Up Your Project
The first thing I did was create a dedicated project for my backup script:
$ mkdir mongodb-backup
$ cd mongodb-backup
$ npm init -y
Then, I installed the one external dependency the script needs (path and child_process are built into Node.js, so they don't need to be installed):
$ npm install fs-extra
Writing the Backup Script
I created a file named backup.js and added the following code. Because the script uses ES module import syntax, I also added "type": "module" to my package.json.
Importing Dependencies
import { exec } from 'child_process';
import fs from 'fs-extra';
import path from 'path';
import { fileURLToPath } from 'url';
These imports provide the necessary tools: child_process to execute shell commands, fs-extra for file system operations, path to manage file paths, and fileURLToPath from url to work out the script's own directory, since ES modules don't provide __dirname.
Configuring Backup Settings
const __dirname = path.dirname(fileURLToPath(import.meta.url)); // ES modules have no built-in __dirname
const DB_NAME = "my_database"; // Replace with your database name
const BACKUP_DIR = path.join(__dirname, "backups");
const MAX_BACKUPS = 5; // Number of backups to keep
const TIMESTAMP = new Date().toISOString().replace(/[:.]/g, '-');
Here, I first reconstruct __dirname (ES modules don't provide it), then define the database name, the directory where backups will be stored, and the number of backups to retain. The TIMESTAMP ensures each backup has a unique name.
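If you'd rather not hard-code these values, the same settings can come from environment variables. Here's a minimal sketch of what the first two could look like instead; the MONGO_DB_NAME and BACKUP_MAX variable names are my own placeholders, not part of the original script:
// Optional variant: read settings from environment variables, with fallbacks.
// MONGO_DB_NAME and BACKUP_MAX are assumed names, not used elsewhere in this post.
const DB_NAME = process.env.MONGO_DB_NAME || "my_database";
const MAX_BACKUPS = Number(process.env.BACKUP_MAX) || 5;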
Defining Backup Paths
const backupPath = path.join(BACKUP_DIR, `backup-${TIMESTAMP}`);
const archivePath = `${backupPath}.tar.gz`;
These lines construct the paths for the raw database dump and its compressed archive.
Ensuring the Backup Directory Exists
fs.ensureDirSync(BACKUP_DIR);
This ensures the backup directory exists before writing files to it.
Executing the Database Dump
const dumpCommand = `mongodump --db=${DB_NAME} --out=${backupPath}`;
console.log("Starting MongoDB backup...");
exec(dumpCommand, (error) => {
  if (error) {
    console.error("Backup failed:", error.message);
    return;
  }
  console.log("Backup completed, creating archive...");
Here, the mongodump command creates a backup of the database. If successful, the script proceeds to compress the backup.
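If your database requires authentication or runs on a remote host, mongodump also accepts a full connection string through its --uri flag. Here's a rough sketch of how the dumpCommand line could be built instead, assuming the connection string is supplied in a MONGO_URI environment variable (my own addition, not part of the original script):
// Variant of the dumpCommand line above: prefer a connection string when provided.
// MONGO_URI is an assumed environment variable, e.g. "mongodb://user:pass@host:27017/my_database".
const uri = process.env.MONGO_URI;
const dumpCommand = uri
  ? `mongodump --uri="${uri}" --out=${backupPath}`
  : `mongodump --db=${DB_NAME} --out=${backupPath}`;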
Compressing the Backup
  const tarCommand = `tar -czf ${archivePath} -C ${BACKUP_DIR} backup-${TIMESTAMP}`;
  exec(tarCommand, (tarError) => {
    if (tarError) {
      console.error("Compression failed:", tarError.message);
    } else {
      console.log(`Backup archived: ${archivePath}`);
      // Remove the uncompressed dump folder
      fs.removeSync(backupPath);
      // Delete older backups beyond the set limit to manage storage
      cleanOldBackups();
    }
  });
}); // closes the mongodump callback opened in the previous section
This part compresses the raw backup into a .tar.gz archive, deletes the uncompressed version, and triggers the cleanup function. The final }); closes the exec callback that was opened in the previous section.
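As a side note, mongodump can also write a compressed archive directly with its --archive and --gzip flags, which skips the separate tar step entirely. A rough sketch of that variant (the file ends in .gz rather than .tar.gz, so cleanOldBackups would need to filter on that extension instead):
// Alternative sketch: let mongodump produce a single compressed file itself.
const gzArchivePath = path.join(BACKUP_DIR, `backup-${TIMESTAMP}.gz`);
exec(`mongodump --db=${DB_NAME} --archive=${gzArchivePath} --gzip`, (error) => {
  if (error) {
    console.error("Backup failed:", error.message);
    return;
  }
  console.log(`Backup archived: ${gzArchivePath}`);
});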
Removing Old Backups
function cleanOldBackups() {
  const backups = fs.readdirSync(BACKUP_DIR)
    .filter(file => file.endsWith(".tar.gz"))
    .sort((a, b) => fs.statSync(path.join(BACKUP_DIR, b)).mtime - fs.statSync(path.join(BACKUP_DIR, a)).mtime);

  if (backups.length > MAX_BACKUPS) {
    backups.slice(MAX_BACKUPS).forEach(file => {
      fs.removeSync(path.join(BACKUP_DIR, file));
      console.log(`Deleted old backup: ${file}`);
    });
  }
}
This function ensures that only the latest five backups are kept by sorting the backup files by modification time and deleting the oldest ones when the limit is exceeded.
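Because the timestamp is already embedded in each file name, another option (just a sketch, not what the script above does) is to sort the names themselves; the ISO-style timestamps sort correctly as plain strings, so no stat call per file is needed:
// Alternative: newest-first by sorting the file names, which encode the timestamp.
function cleanOldBackupsByName() {
  const backups = fs.readdirSync(BACKUP_DIR)
    .filter(file => file.endsWith(".tar.gz"))
    .sort()      // oldest first, thanks to the ISO-style timestamp in the name
    .reverse();  // newest first
  backups.slice(MAX_BACKUPS).forEach(file => {
    fs.removeSync(path.join(BACKUP_DIR, file));
    console.log(`Deleted old backup: ${file}`);
  });
}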
Running the Backup Script
To run the script, I simply executed:
$ node backup.js
This created a timestamped backup and ensured that only the latest five backups were kept, automatically removing older ones.
Automating Backups with Cron Jobs
Running the script manually was not an option—I needed automation. So, I set up a cron job (on Linux/macOS) to run the script daily at midnight.
Linux/macOS (Cron Job)
Open the terminal and type:
$ crontab -e
Add the following line:
0 0 * * * node /path/to/your/project/backup.js >> /path/to/logs/backup.log 2>&1
This ensures that any errors or output are logged for debugging.
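One caveat: cron usually runs with a very minimal PATH, so the plain node command may not be found. Using the absolute path to the binary (which node will show it) is safer; for example, assuming node lives at /usr/local/bin/node:
0 0 * * * /usr/local/bin/node /path/to/your/project/backup.js >> /path/to/logs/backup.log 2>&1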
Windows (Task Scheduler)
For Windows users, Task Scheduler can automate the backup process:
Open Task Scheduler.
Create a new task.
Set the trigger to run daily at your preferred time.
Set the action to run node.exe with the script's path as an argument.
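If you prefer the command line over the GUI, the same schedule can be created with schtasks; here's a rough sketch, where the task name and paths are placeholders you'd adjust:
schtasks /create /tn "MongoDBBackup" /tr "node C:\path\to\your\project\backup.js" /sc daily /st 00:00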
Conclusion
Since setting up this automation, I no longer worry about data loss. Every morning, I check my logs and see that my backups ran smoothly overnight. If you're managing a MongoDB database, I highly recommend automating your backups—you never know when you'll need them.
Happy coding, and may your data always be safe! 🚀