This guide provides a complete walkthrough for building the foundation of a production-ready bulk SMS broadcast application using Node.js, Express, and the Sinch SMS REST API. We'll cover everything from project setup and core messaging functionality to error handling, security, and deployment.
By the end of this tutorial, you will have a robust API capable of accepting lists of recipients (identified by a group ID) and a message, and then efficiently broadcasting that message via SMS using Sinch. This solves the common need for businesses to send notifications, alerts, or marketing messages to large groups of users simultaneously.
Project Overview and Goals
- What we'll build: A Node.js/Express backend service with an API endpoint to trigger bulk SMS broadcasts based on recipient groups.
- Problem solved: Automating the process of sending the same SMS message to multiple recipients efficiently and reliably.
- Technologies:
- Node.js: Asynchronous JavaScript runtime for building scalable network applications.
- Express: Minimalist web framework for Node.js, simplifying API creation.
- Sinch SMS API: Used for the actual sending of SMS messages via their robust infrastructure. We will use the REST API directly.
- dotenv: To manage environment variables securely.
- node-fetch: To make HTTP requests to the Sinch API.
- winston: For flexible logging.
- express-validator: For request input validation.
- express-rate-limit: To protect the API from abuse.
- Prerequisites:
- Node.js and npm (or yarn) installed.
- A Sinch account with API credentials (Service Plan ID, API Token) and a provisioned phone number or Alphanumeric Sender ID.
- Basic familiarity with Node.js, Express, and REST APIs.
- A code editor (like VS Code).
- A tool for testing APIs (like Postman or curl).
System Architecture
The basic flow is as follows:
- An authorized client (e.g., a frontend application, another backend service, or a testing tool) sends a POST request to our Express API endpoint (`/api/broadcast`) specifying a recipient group and message (an example request is shown after this list).
- The Express application validates the request (input data, API key).
- The application retrieves recipient phone numbers associated with the group ID (from a simulated data source in this guide).
- The application constructs one or more request payloads for the Sinch SMS Batch API. Note: Sinch limits how many recipients a single batch may contain, so the service splits very large groups into multiple smaller batches ("chunking").
- The application sends the request(s) to the Sinch API using `node-fetch`, potentially with retries on failure.
- Sinch processes the batch request(s) and sends SMS messages to the recipients.
- Our application logs the process and responds to the client.
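For example, once the service is running, a broadcast could be triggered with a request like the one below. The group ID is a placeholder, and the `x-api-key` header name is an assumption; the actual header checked by the auth middleware is defined when that middleware is implemented later in the guide.

```bash
curl -X POST http://localhost:3000/api/broadcast \
  -H "Content-Type: application/json" \
  -H "x-api-key: YOUR_SECRET_INTERNAL_API_KEY" \
  -d '{"groupId": "beta-testers", "message": "Hello from the broadcast API!"}'
```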
1. Setting up the Project
Let's initialize our Node.js project and install the necessary dependencies.
- Create Project Directory: Open your terminal or command prompt and create a new directory for the project, then navigate into it.

```bash
mkdir sinch-bulk-sms
cd sinch-bulk-sms
```
- Initialize Node.js Project: This creates a `package.json` file to manage dependencies and project metadata.

```bash
npm init -y
```
- Install Dependencies: We need Express for the server, `dotenv` for environment variables, `node-fetch` for calling the Sinch API, `winston` for logging, `express-validator` for input validation, and `express-rate-limit` for security.

```bash
npm install express dotenv node-fetch@2 winston express-validator express-rate-limit
```

  - Note: We install `node-fetch@2` specifically because version 3+ uses ES Modules by default, while this guide uses the CommonJS (`require`) pattern common in many Express setups. If you prefer ES Modules (`import`), you can use the latest `node-fetch` (v3+) and adjust the import syntax (`import fetch from 'node-fetch';`) throughout the guide. The two styles are compared below.
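For clarity, this is what the two import styles look like; all code in this guide uses the first form.

```javascript
// CommonJS (used throughout this guide, works with node-fetch v2)
const fetch = require('node-fetch');

// ES Modules (only if you install node-fetch v3+ and set "type": "module" in package.json)
// import fetch from 'node-fetch';
```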
- Set up Project Structure: Create the following directories for better organization:

```bash
mkdir src data
cd src
mkdir config controllers middleware routes services utils logs
cd ..
```

  - `src/`: Contains all our source code.
  - `src/config/`: For configuration files (like logger setup).
  - `src/controllers/`: Handles incoming requests and outgoing responses.
  - `src/middleware/`: For Express middleware (auth, validation, error handling).
  - `src/routes/`: Defines the API routes.
  - `src/services/`: Contains business logic, like interacting with the Sinch API.
  - `src/utils/`: Utility functions.
  - `src/logs/`: Directory where log files will be stored.
  - `data/`: Contains sample data files (used in the simplified data layer).
- Create Environment File: Create a file named `.env` in the project root (`sinch-bulk-sms/`). This file stores sensitive credentials and configuration. Never commit this file to version control.

```
# .env
# Server Configuration
PORT=3000
NODE_ENV=development # or production

# Sinch API Credentials
# Get these from your Sinch Dashboard: https://dashboard.sinch.com/sms/api/rest
SINCH_SERVICE_PLAN_ID=YOUR_SERVICE_PLAN_ID
SINCH_API_TOKEN=YOUR_API_TOKEN
SINCH_FROM_NUMBER=YOUR_SINCH_VIRTUAL_NUMBER_OR_SENDER_ID # e.g., +15551234567 or YourBrand

# Internal API Security
INTERNAL_API_KEY=YOUR_SECRET_INTERNAL_API_KEY # Generate a strong random key
```

  - Purpose: Using `.env` keeps sensitive data out of your codebase and enables different configurations per environment (development, production). A startup check for these variables is sketched after this list.
  - How to get Sinch credentials:
    - Log in to your Sinch Customer Dashboard.
    - Navigate to SMS -> APIs.
    - Your Service plan ID and API token are displayed here. Copy them into your `.env` file.
    - Ensure you have a purchased number or configured Sender ID under Numbers or Sender IDs and add it as `SINCH_FROM_NUMBER`.
    - Generate a secure, random string for `INTERNAL_API_KEY`.
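It can also help to fail fast at startup when any of these variables are missing. The following is a minimal, optional sketch; the `src/config/validateEnv.js` module is an addition not built elsewhere in this guide.

```javascript
// src/config/validateEnv.js (hypothetical helper; checks the variables defined above)
const REQUIRED_VARS = [
  'SINCH_SERVICE_PLAN_ID',
  'SINCH_API_TOKEN',
  'SINCH_FROM_NUMBER',
  'INTERNAL_API_KEY',
];

function validateEnv() {
  const missing = REQUIRED_VARS.filter((name) => !process.env[name]);
  if (missing.length > 0) {
    // Fail fast so a misconfigured deployment never starts silently
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
}

module.exports = { validateEnv };
```

If you adopt something like this, call `validateEnv()` right after `require('dotenv').config()` in `server.js` so configuration problems surface before the server starts listening.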
- Create `.gitignore`: Create a `.gitignore` file in the project root to prevent committing sensitive files and unnecessary directories.

```
# .gitignore

# Dependencies
node_modules/

# Environment variables
.env*
!.env.example

# Logs
src/logs/*.log # Corrected path to match logger config
# Log files from potential older configs or tools
logs/
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Build output
dist/
build/

# OS generated files
.DS_Store
Thumbs.db

# Data files (if sensitive or large)
data/
```
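Because the `.gitignore` above whitelists `.env.example`, you can optionally commit a template so collaborators know which variables to set. The values below are placeholders, not working credentials.

```
# .env.example - safe to commit; contains no real secrets
PORT=3000
NODE_ENV=development
SINCH_SERVICE_PLAN_ID=your-service-plan-id
SINCH_API_TOKEN=your-api-token
SINCH_FROM_NUMBER=+15551234567
INTERNAL_API_KEY=replace-with-a-long-random-string
```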
- Create Main Server File (`server.js`): Create `server.js` in the project root (`sinch-bulk-sms/`).

```javascript
// server.js
require('dotenv').config(); // Load environment variables first

const express = require('express');
const rateLimit = require('express-rate-limit');
const logger = require('./src/config/logger'); // We will create this next
const broadcastRoutes = require('./src/routes/broadcastRoutes');
const { errorHandler } = require('./src/middleware/errorHandler');
const { apiKeyAuth } = require('./src/middleware/authMiddleware');

const app = express();
const PORT = process.env.PORT || 3000;

// Middleware
app.use(express.json()); // Parse JSON bodies
app.use(express.urlencoded({ extended: true })); // Parse URL-encoded bodies

// Basic Rate Limiting
const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100, // Limit each IP to 100 requests per windowMs
  message: 'Too many requests from this IP, please try again after 15 minutes',
  standardHeaders: true, // Return rate limit info in the `RateLimit-*` headers
  legacyHeaders: false, // Disable the `X-RateLimit-*` headers
});
app.use(limiter);

// Health Check Route (optional but good practice)
app.get('/health', (req, res) => {
  res.status(200).json({ status: 'UP', timestamp: new Date().toISOString() });
});

// API Routes - Apply API key auth middleware here
app.use('/api', apiKeyAuth, broadcastRoutes); // All routes under /api require API key

// Centralized Error Handling Middleware - Must be last
app.use(errorHandler);

// Start Server and store the server instance
const server = app.listen(PORT, () => {
  logger.info(`Server running on port ${PORT} in ${process.env.NODE_ENV} mode`);
});

// Graceful Shutdown Handling (Optional but recommended for production)
const gracefulShutdown = (signal) => {
  logger.info(`${signal} signal received: closing HTTP server`);
  server.close(() => {
    logger.info('HTTP server closed');
    // Add cleanup logic here (e.g., close database connections)
    // Ensure cleanup completes before exiting
    process.exit(0);
  });

  // Force shutdown after a timeout if server.close() hangs
  setTimeout(() => {
    logger.error('Could not close connections in time, forcefully shutting down');
    process.exit(1);
  }, 10000); // 10 seconds timeout
};

process.on('SIGTERM', () => gracefulShutdown('SIGTERM'));
process.on('SIGINT', () => gracefulShutdown('SIGINT')); // Handle Ctrl+C
```
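Once the remaining modules required at the top of `server.js` (logger, routes, middleware) exist from the following steps, you can start the server and verify the health endpoint. The timestamp in the sample output is illustrative.

```bash
node server.js

# In another terminal:
curl http://localhost:3000/health
# {"status":"UP","timestamp":"2024-01-01T12:00:00.000Z"}
```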
- Configure Logger (`src/config/logger.js`): Set up Winston for logging.

```javascript
// src/config/logger.js
const winston = require('winston');
const path = require('path');
const fs = require('fs');

// Determine log directory and ensure it exists
const logDir = path.join(__dirname, '../logs'); // Store logs in src/logs
if (!fs.existsSync(logDir)) {
  fs.mkdirSync(logDir);
}

const logFormat = winston.format.combine(
  winston.format.timestamp({ format: 'YYYY-MM-DD HH:mm:ss' }),
  winston.format.errors({ stack: true }), // Log stack traces
  winston.format.splat(),
  winston.format.json() // Log in JSON format
);

const logger = winston.createLogger({
  level: process.env.NODE_ENV === 'production' ? 'info' : 'debug', // More verbose in dev
  format: logFormat,
  defaultMeta: { service: 'sinch-bulk-sms-api' },
  transports: [
    // - Write all logs with level `error` and below to `error.log`
    // - Write all logs with level `info` and below to `combined.log`
    new winston.transports.File({
      filename: path.join(logDir, 'error.log'),
      level: 'error',
    }),
    new winston.transports.File({ filename: path.join(logDir, 'combined.log') }),
  ],
  exceptionHandlers: [
    // Log unhandled exceptions to a separate file
    new winston.transports.File({ filename: path.join(logDir, 'exceptions.log') })
  ],
  rejectionHandlers: [
    // Log unhandled promise rejections
    new winston.transports.File({ filename: path.join(logDir, 'rejections.log') })
  ]
});

// If we're not in production then log to the `console` with a simpler format
if (process.env.NODE_ENV !== 'production') {
  logger.add(new winston.transports.Console({
    format: winston.format.combine(
      winston.format.colorize(),
      winston.format.printf(info => `${info.timestamp} ${info.level}: ${info.message} ${info.stack ? '\n' + info.stack : ''}`)
    ),
    level: 'debug', // Ensure console shows debug messages in dev
  }));
}

module.exports = logger;
```

  - Why Winston? It's highly configurable, enabling logging to multiple destinations (console, files, external services) with different formats and levels. This is crucial for debugging and monitoring in production.
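Throughout the rest of the code, this logger is used like any Winston instance. For example, an info entry with metadata ends up in `combined.log` as a JSON line; the field values and ordering below are illustrative.

```javascript
const logger = require('./src/config/logger');

logger.info('Broadcast queued', { groupId: 'beta-testers', recipients: 42 });
// Written to src/logs/combined.log roughly as:
// {"level":"info","message":"Broadcast queued","groupId":"beta-testers","recipients":42,"service":"sinch-bulk-sms-api","timestamp":"2024-01-01 12:00:00"}
```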
2. Implementing Core Functionality (Sinch Service)
This service encapsulates the logic for interacting with the Sinch API.
- Create Sinch Service File (`src/services/sinchService.js`):

```javascript
// src/services/sinchService.js
const fetch = require('node-fetch');
const logger = require('../config/logger');

const SINCH_API_URL = 'https://us.sms.api.sinch.com/xms/v1'; // Or use EU endpoint: https://eu.sms.api.sinch.com
const SERVICE_PLAN_ID = process.env.SINCH_SERVICE_PLAN_ID;
const API_TOKEN = process.env.SINCH_API_TOKEN;
const FROM_NUMBER = process.env.SINCH_FROM_NUMBER;

// Define Sinch batch size limit (check Sinch docs for current value)
const SINCH_BATCH_SIZE_LIMIT = 1000;

/**
 * Sends a bulk SMS message using the Sinch Batch SMS API.
 * Handles chunking if the recipient list exceeds Sinch limits.
 * @param {string[]} recipients - Array of E.164 formatted phone numbers.
 * @param {string} message - The text message body.
 * @returns {Promise<object[]>} - An array of response bodies from Sinch API for each batch sent.
 * @throws {Error} - If the API call fails or returns an error status after retries, or if config is missing.
 */
async function sendBulkSms(recipients, message) {
  if (!SERVICE_PLAN_ID || !API_TOKEN || !FROM_NUMBER) {
    logger.error('Sinch API credentials or From Number are not configured in .env');
    throw new Error('Sinch service configuration error.');
  }
  if (!recipients || recipients.length === 0) {
    throw new Error('Recipient list cannot be empty.');
  }

  logger.info(`Initiating bulk SMS send to ${recipients.length} recipients. From: ${FROM_NUMBER}`);

  // --- CHUNKING LOGIC ---
  const recipientChunks = [];
  for (let i = 0; i < recipients.length; i += SINCH_BATCH_SIZE_LIMIT) {
    recipientChunks.push(recipients.slice(i, i + SINCH_BATCH_SIZE_LIMIT));
  }
  logger.info(`Splitting recipients into ${recipientChunks.length} chunk(s) of up to ${SINCH_BATCH_SIZE_LIMIT} each.`);
  // --- END CHUNKING LOGIC ---

  const apiUrl = `${SINCH_API_URL}/${SERVICE_PLAN_ID}/batches`;
  const headers = {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${API_TOKEN}`
  };

  const batchResults = [];
  let overallSuccess = true;

  // Process each chunk
  for (const chunk of recipientChunks) {
    const payload = {
      from: FROM_NUMBER,
      to: chunk,
      body: message,
      // Optional: delivery_report, callback_url, client_reference
    };

    try {
      // Use a helper function for the actual API call with retries (defined below)
      const result = await sendSinchRequestWithRetry(apiUrl, headers, payload, chunk.length);
      batchResults.push(result);
      logger.info(`Successfully sent batch via Sinch. Batch ID: ${result.id}, Recipients in batch: ${chunk.length}`);
    } catch (error) {
      logger.error(`Failed to send a batch of ${chunk.length} recipients: ${error.message}`, { stack: error.stack });
      batchResults.push({ error: error.message, recipients_in_batch: chunk.length }); // Record error for this batch
      overallSuccess = false;
      // Decide if you want to stop on first error or continue with other chunks
      // For this example, we continue but track the failure.
    }
  }

  if (!overallSuccess) {
    // Throw an error if any batch failed, potentially including details
    // from batchResults. This indicates partial failure.
    throw new Error(`One or more SMS batches failed to send. Check logs for details. Results: ${JSON.stringify(batchResults)}`);
  }

  logger.info(`All ${recipientChunks.length} batch(es) processed successfully.`);
  return batchResults; // Return array of successful batch responses
}

// --- Helper function for API call with retries ---
const MAX_RETRIES = 3;
const INITIAL_RETRY_DELAY_MS = 500;

async function sendSinchRequestWithRetry(apiUrl, headers, payload, recipientCount) {
  let attempts = 0;
  while (attempts < MAX_RETRIES) {
    attempts++;
    logger.debug(`Attempt ${attempts}/${MAX_RETRIES} to send Sinch batch (${recipientCount} recipients).`);
    try {
      const response = await fetch(apiUrl, {
        method: 'POST',
        headers: headers,
        body: JSON.stringify(payload),
        timeout: 15000 // Increased timeout for potentially larger requests
      });

      // Success Case
      if (response.ok) {
        const responseBody = await response.json();
        logger.info(`Sinch API call successful on attempt ${attempts}. Batch ID: ${responseBody.id}`);
        return responseBody; // Return successful response
      }

      // Handle specific retryable errors (rate limits, server errors)
      if (response.status === 429 || response.status >= 500) {
        const responseBodyText = await response.text();
        logger.warn(`Sinch API returned retryable status ${response.status} on attempt ${attempts}. Body: ${responseBodyText}. Retrying...`);
        if (attempts >= MAX_RETRIES) {
          throw new Error(`Sinch API request failed after ${MAX_RETRIES} attempts with status ${response.status}. Last Body: ${responseBodyText}`);
        }
        const delay = INITIAL_RETRY_DELAY_MS * Math.pow(2, attempts - 1);
        logger.info(`Waiting ${delay}ms before next retry.`);
        await new Promise(resolve => setTimeout(resolve, delay));
        continue; // Next attempt
      }

      // Handle non-retryable client errors (4xx except 429)
      // Read the body once as text, then try to parse it as JSON
      const rawErrorBody = await response.text();
      let errorBody;
      try {
        errorBody = JSON.parse(rawErrorBody); // Try parsing JSON
      } catch (e) {
        errorBody = rawErrorBody; // Fallback to raw text
      }
      logger.error(`Sinch API non-retryable error: ${response.status} ${response.statusText}`, { errorBody });
      throw new Error(`Sinch API request failed with status ${response.status}: ${JSON.stringify(errorBody) || 'Unknown client error'}`);

    } catch (error) {
      logger.error(`Error during Sinch API call attempt ${attempts}:`, { message: error.message, stack: error.stack });

      // Retry on specific network errors
      if ((error.type === 'request-timeout' || error.code === 'ECONNREFUSED' || error.code === 'ENOTFOUND' || error.code === 'ECONNRESET') && attempts < MAX_RETRIES) {
        logger.warn(`Network error detected on attempt ${attempts}. Retrying...`);
        const delay = INITIAL_RETRY_DELAY_MS * Math.pow(2, attempts - 1);
        logger.info(`Waiting ${delay}ms before next retry.`);
        await new Promise(resolve => setTimeout(resolve, delay));
        continue; // Next attempt
      }

      // If max retries reached or non-retryable error, re-throw
      if (attempts >= MAX_RETRIES) {
        logger.error(`Failed Sinch API call after ${MAX_RETRIES} attempts.`);
      }
      // Ensure we throw an Error object
      throw (error instanceof Error ? error : new Error(JSON.stringify(error)));
    }
  }
  // Safeguard: Should not be reached if logic is correct
  throw new Error('Failed to send SMS batch via Sinch after exhausting retries.');
}
// --- End Helper function ---

module.exports = { sendBulkSms };
```
  - Why `node-fetch`? Provides a standard, Promise-based way to make HTTP requests.
  - Why a separate service? Promotes separation of concerns, making the controller cleaner and the Sinch interaction reusable and testable.
  - Chunking Implemented: The code splits the recipient list into chunks based on `SINCH_BATCH_SIZE_LIMIT` and sends multiple requests if necessary. This is essential for handling large lists reliably.
  - Retry Logic: The helper function `sendSinchRequestWithRetry` handles retries with exponential backoff for network issues and specific Sinch error codes (429, 5xx).
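To exercise the service in isolation before the API layer exists, a small throwaway script can call `sendBulkSms` directly. The `scripts/send-test.js` path and the test numbers below are hypothetical additions, not part of the project structure built in this guide.

```javascript
// scripts/send-test.js (hypothetical helper; run from the project root: node scripts/send-test.js)
require('dotenv').config();
const { sendBulkSms } = require('../src/services/sinchService');

// Replace with real, opted-in test numbers before running
const testRecipients = ['+15551230001', '+15551230002'];

sendBulkSms(testRecipients, 'Test broadcast from sinch-bulk-sms')
  .then((results) => {
    console.log('Batches accepted by Sinch:', results.map((r) => r.id));
  })
  .catch((err) => {
    console.error('Broadcast failed:', err.message);
    process.exit(1);
  });
```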
3. Building the API Layer (Routes and Controller)
Now, let's define the API endpoint that clients will use to trigger broadcasts.
- Create Broadcast Controller (`src/controllers/broadcastController.js`): This version assumes the `groupId` approach from Section 6.

```javascript
// src/controllers/broadcastController.js
const { validationResult } = require('express-validator');
const { sendBulkSms } = require('../services/sinchService');
const { getNumbersByGroupId } = require('../services/dataService'); // Import data service
const logger = require('../config/logger');

async function handleBroadcastRequest(req, res, next) {
  // 1. Validate Input from request body
  const errors = validationResult(req);
  if (!errors.isEmpty()) {
    logger.warn('Broadcast request validation failed', { errors: errors.array() });
    // Return only the error messages for clarity
    return res.status(400).json({ errors: errors.array().map(e => e.msg) });
  }

  const { groupId, message } = req.body;

  try {
    // 2. Get recipients from data source
    logger.info(`Fetching recipients for group ID: ${groupId}`);
    const recipients = await getNumbersByGroupId(groupId);

    if (!recipients) {
      logger.warn(`Attempted broadcast to non-existent group: ${groupId}`);
      return res.status(404).json({ errors: [`Recipient group '${groupId}' not found.`] });
    }
    if (recipients.length === 0) {
      logger.warn(`Attempted broadcast to empty group: ${groupId}`);
      return res.status(400).json({ errors: [`Recipient group '${groupId}' is empty.`] });
    }

    // 3. Validate data fetched from the service (e.g., E.164 format)
    // Note: This check is performed here because the data comes from our service,
    // not directly from user input validated by express-validator.
    const invalidNumbers = recipients.filter(num => typeof num !== 'string' || !/^\+?[1-9]\d{1,14}$/.test(num));
    if (invalidNumbers.length > 0) {
      logger.error('Data source contains invalid phone numbers!', { groupId, invalidNumbers: invalidNumbers.slice(0, 5) }); // Log only a few examples
      // Do not expose internal data issues directly to client
      return res.status(500).json({ errors: ['Internal error: Invalid recipient data format encountered.'] });
    }

    // 4. Call the Sinch Service (which now handles chunking and retries)
    logger.info(`Processing broadcast request for group '${groupId}' (${recipients.length} recipients).`);
    const sinchResponses = await sendBulkSms(recipients, message); // Returns array of batch results

    // 5. Send Success Response
    // Consolidate batch IDs for the response
    const batchIds = sinchResponses.map(r => r.id).filter(id => !!id);
    res.status(200).json({
      message: `Bulk SMS broadcast initiated successfully for group '${groupId}'.`,
      batch_ids: batchIds, // Include all successful batch IDs
      total_recipient_count: recipients.length,
      group_id: groupId,
      batches_sent: sinchResponses.length
    });
  } catch (error) {
    // 6. Pass error to centralized handler
    logger.error(`Failed to process broadcast request for group '${groupId}': ${error.message}`, { stack: error.stack });
    // Let the errorHandler middleware format the response
    next(error);
  }
}

module.exports = { handleBroadcastRequest };
```
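To make the E.164 check in step 3 concrete, here is how that regex behaves on a few sample inputs (the numbers are illustrative):

```javascript
const E164_REGEX = /^\+?[1-9]\d{1,14}$/;

console.log(E164_REGEX.test('+15551234567'));  // true  - E.164 with leading +
console.log(E164_REGEX.test('15551234567'));   // true  - the + is optional in this pattern
console.log(E164_REGEX.test('+015551234567')); // false - cannot start with 0
console.log(E164_REGEX.test('555-123-4567'));  // false - no dashes or spaces allowed
```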
- Create Broadcast Routes (`src/routes/broadcastRoutes.js`): This version assumes the `groupId` approach from Section 6.

```javascript
// src/routes/broadcastRoutes.js
const express = require('express');
const { body } = require('express-validator');
const { handleBroadcastRequest } = require('../controllers/broadcastController');

const router = express.Router();

// Define validation rules for the broadcast endpoint
const broadcastValidationRules = [
  body('groupId')
    .isString().withMessage('groupId must be a string.')
    .trim()
    .notEmpty().withMessage('groupId cannot be empty.')
    .isLength({ min: 1, max: 100 }).withMessage('groupId must be between 1 and 100 characters.'), // Added length constraint
  body('message')
    .isString().withMessage('Message must be a string.')
    .trim()
    .notEmpty().withMessage('Message cannot be empty.')
    // Standard SMS limits apply per segment (e.g., 160 GSM, 70 UCS-2).
    // Sinch handles concatenation up to ~1600 chars, but pricing might be per segment.
    .isLength({ max: 1600 }).withMessage('Message exceeds maximum length (1600 characters).')
];

// POST /api/broadcast - Endpoint to trigger a bulk SMS send
router.post('/broadcast', broadcastValidationRules, handleBroadcastRequest);

module.exports = router;
```
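When validation fails (for example, an empty message), the controller's step 1 returns HTTP 400 with the validator messages defined above:

```json
{
  "errors": [
    "Message cannot be empty."
  ]
}
```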
  - Why `express-validator`? Provides clean, declarative validation of incoming request body data.
  - Why a Controller? Separates routing from request handling logic.
  - E.164 Validation: The validation of phone number format now correctly resides in the controller after fetching the numbers from the data source, as it's validating data integrity, not direct user input structure.
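For reference, a successful broadcast request returns a 200 response shaped like the following; the group ID, batch ID, and counts are illustrative values:

```json
{
  "message": "Bulk SMS broadcast initiated successfully for group 'beta-testers'.",
  "batch_ids": ["01HEXAMPLEBATCHID"],
  "total_recipient_count": 42,
  "group_id": "beta-testers",
  "batches_sent": 1
}
```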
- Integrate Routes in `server.js`: (Already done in Step 1.7, shown here for context.)

```javascript
// server.js (relevant part)
const broadcastRoutes = require('./src/routes/broadcastRoutes');
const { apiKeyAuth } = require('./src/middleware/authMiddleware'); // Will create next

// ... other middleware ...

// API Routes - Protected by API Key Auth
app.use('/api', apiKeyAuth, broadcastRoutes);

// ... error handler ...
// ... server start ...
```
4. Integrating with Third-Party Services (Sinch Credentials)
We already set up `.env` and `sinchService.js` to use these variables.
- Obtaining Credentials: As detailed in Step 1.5, get credentials from the Sinch Dashboard.
- Secure Handling: Use `dotenv` for local development. In production, inject environment variables securely via your hosting platform's mechanisms (e.g., Heroku Config Vars, AWS Secrets Manager/Parameter Store, Docker secrets). Never commit `.env` files or hardcode credentials.
- Environment Variables: `SINCH_SERVICE_PLAN_ID`, `SINCH_API_TOKEN`, `SINCH_FROM_NUMBER`.
- Resilience: The `sinchService` now includes retries. For higher resilience (see the sketch after this list):
  - Circuit Breaker: Consider libraries like `opossum` to temporarily halt calls to Sinch if it consistently fails, preventing cascading failures.
  - Alternative Provider (Advanced): For critical systems, configure a backup SMS provider.
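To illustrate the circuit breaker idea, here is a minimal sketch using `opossum` to wrap the existing `sendBulkSms` function. This wrapper is not part of the code built in this guide, and the thresholds are arbitrary example values.

```javascript
// Hypothetical circuit breaker around the Sinch service (npm install opossum)
const CircuitBreaker = require('opossum');
const { sendBulkSms } = require('./src/services/sinchService');

const breaker = new CircuitBreaker(sendBulkSms, {
  timeout: 30000,               // Consider a call failed if it takes longer than 30s
  errorThresholdPercentage: 50, // Open the circuit when half of recent calls fail
  resetTimeout: 60000,          // After 60s, allow a trial request through
});

breaker.on('open', () => console.warn('Sinch circuit opened - pausing SMS sends'));
breaker.on('close', () => console.info('Sinch circuit closed - resuming SMS sends'));

// Use breaker.fire() wherever sendBulkSms() was called directly, e.g.:
// const results = await breaker.fire(recipients, message);
```

While the circuit is open, calls fail immediately instead of piling up against an unresponsive provider, which protects both your service and Sinch.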
5. Error Handling, Logging, and Retry Mechanisms
Robust error handling and logging are essential.
- Centralized Error Handler (`src/middleware/errorHandler.js`): Catches errors passed via `next(error)`
.// src/middleware/errorHandler.js const logger = require('../config/logger'); // Note: Using err.message.includes() is somewhat brittle. // For more robust production systems, consider defining custom error classes // (e.g., class SinchApiError extends Error {}) and checking `instanceof`. function errorHandler(err, req, res, next) { // Log the full error details internally logger.error('An error occurred and was caught by the centralized handler:', { errorMessage: err.message, errorStack: err.stack, errorStatus: err.status, // Include status if available on the error object requestUrl: req.originalUrl, requestMethod: req.method, requestIp: req.ip, }); // Determine appropriate status code and user-facing message let statusCode = err.status || 500; // Use error status if set, otherwise default to 500 let message = 'An unexpected error occurred. Please try again later.'; // Customize based on error type or message content if (err.message.includes('Sinch service configuration error')) { statusCode = 500; message = 'Service configuration error. Please contact support.'; } else if (err.message.includes('Sinch API request failed')) { statusCode = 502; // Bad Gateway - upstream failure message = 'Failed to communicate with the SMS provider.'; } else if (err.message.includes('Network error communicating with Sinch API')) { statusCode = 504; // Gateway Timeout message = 'Network error while communicating with the SMS provider.'; } else if (err.message.includes('One or more SMS batches failed')) { statusCode = 500; // Indicate partial or full failure in sending message = 'There was an issue sending one or more SMS batches.'; } else if (statusCode === 400) { // Handle validation errors passed via next(err) if any message = err.message || 'Bad Request.'; } else if (statusCode === 401 || statusCode === 403) { message = err.message || 'Access denied.'; } else if (statusCode === 404) { message = err.message || 'Resource not found.'; } // Add more specific error checks as needed // Ensure status code is in the valid HTTP range if (statusCode < 400 || statusCode > 599) { logger.warn(`Invalid status code (${statusCode}) detected in error handler. Defaulting to 500.`); statusCode = 500; } // Send the standardized JSON error response res.status(statusCode).json({ status: 'error', statusCode: statusCode, message: message, // Optionally include error details ONLY in development ...(process.env.NODE_ENV === 'development' && { error_details: err.message, stack_preview: err.stack?.split('\n').slice(0,