Receiving Infobip Delivery Reports with Fastify and Node.js
A guide to building a secure webhook endpoint using Node.js and Fastify to receive, validate, and store Infobip SMS delivery reports.
Track the status of your sent SMS messages in real time by setting up a webhook endpoint with Fastify to receive delivery reports directly from Infobip. This guide provides a step-by-step walkthrough for building a webhook handler foundation in Node.js.
We'll cover everything from initial project setup and core logic implementation to security considerations, database integration, error handling, deployment, and testing.
Project Overview and Goals
This guide details how to build a reliable webhook endpoint using Node.js and the Fastify framework to receive and process SMS delivery status updates sent by Infobip.
Problem Solved: When you send SMS messages via an API like Infobip's, you often need confirmation of whether the message was successfully delivered, failed, or is still pending. Relying solely on the initial API response isn't enough for final status tracking. Infobip provides delivery report webhooks, pushing status updates to an endpoint you specify. This project builds that endpoint.
Key Goals:
- Create a Fastify application to listen for incoming HTTP POST requests from Infobip.
- Securely validate incoming webhook requests to ensure they originate from Infobip.
- Parse the delivery report payload sent by Infobip.
- Store the relevant status information (e.g., message ID, status, timestamp) in a database.
- Implement robust logging and error handling.
- Provide guidance on deployment and testing.
Technologies Used:
- Node.js: The JavaScript runtime environment.
- Fastify: A high-performance, low-overhead web framework for Node.js, chosen for its speed, extensive plugin ecosystem, and developer experience.
- Infobip API: Used for sending SMS messages (assumed) and configured to send delivery reports via webhooks.
- Prisma: A modern database toolkit for Node.js (used here for database interaction examples).
- `@prisma/client`: The runtime client for Prisma.
- `dotenv`: For managing environment variables.
- `pino`: Fastify's default high-performance JSON logger.
Prerequisites:
- A working Node.js environment (LTS version recommended).
- `npm` or `yarn` package manager.
- An active Infobip account with API access. You should know how to send SMS messages via their API.
- Access to a database (PostgreSQL is used in examples, but adaptable).
- A way to expose your local development server to the internet (e.g., `ngrok`) for testing webhooks, or a deployed environment.
- Basic understanding of JavaScript, Node.js, REST APIs, and webhooks.
Final Outcome: A functional Node.js application capable of securely receiving, validating, processing, and storing Infobip SMS delivery reports, providing a solid foundation for production deployment.
1. Setting up the project
Let's initialize our Node.js project and install the necessary dependencies.
- Create Project Directory: Open your terminal, create a new directory for the project, then navigate into it.

  ```bash
  mkdir infobip-webhook-handler
  cd infobip-webhook-handler
  ```

- Initialize npm Project: Initialize the project using npm. You can accept the defaults or customize them.

  ```bash
  npm init -y
  ```

  This creates a `package.json` file.

- Install Dependencies: We need Fastify, `dotenv` for environment variables, `@fastify/sensible` for utilities, `fastify-plugin` for writing plugins, the Prisma client for database interaction, and the Prisma CLI plus `pino-pretty` and `nodemon` for development.

  ```bash
  # Runtime dependencies
  npm install fastify dotenv @fastify/sensible fastify-plugin @prisma/client

  # Development dependencies
  npm install --save-dev pino-pretty prisma nodemon
  ```

  - `fastify`: The core web framework.
  - `dotenv`: Loads environment variables from a `.env` file.
  - `@fastify/sensible`: Adds useful decorators and error handling utilities.
  - `fastify-plugin`: Helper for writing Fastify plugins (used by the Prisma plugin in the next section).
  - `@prisma/client`: The runtime Prisma client library.
  - `pino-pretty`: Formats Pino logs during development (dev dependency).
  - `prisma`: The Prisma CLI for migrations and client generation (dev dependency).
  - `nodemon`: Auto-restarts the server during development (dev dependency).

- Configure `package.json` Scripts: Add scripts to your `package.json` for running the application:

  ```json
  {
    "scripts": {
      "start": "node src/server.js",
      "dev": "nodemon --watch src --exec \"node src/server.js | pino-pretty\"",
      "db:migrate": "prisma migrate dev",
      "db:generate": "prisma generate"
    }
  }
  ```

  - `start`: Runs the application directly with Node.
  - `dev`: Runs the application under `nodemon` for auto-restarts on file changes and pipes logs through `pino-pretty`.
  - `db:migrate`: Applies database migrations using Prisma during development.
  - `db:generate`: Generates the Prisma client from your schema.

- Create Project Structure: Organize your project files:

  ```plaintext
  infobip-webhook-handler/
  ├── prisma/
  │   └── schema.prisma
  ├── src/
  │   ├── routes/
  │   │   └── webhooks.js
  │   ├── plugins/
  │   │   └── prisma.js
  │   ├── server.js
  │   └── app.js
  ├── .env
  ├── .gitignore
  └── package.json
  ```

- Create `.gitignore`: Add common Node.js ignores:

  ```plaintext
  # .gitignore
  node_modules
  .env
  dist
  npm-debug.log*
  yarn-debug.log*
  yarn-error.log*
  # Optional: store only migration definitions, not generated SQL
  # /prisma/migrations/*/*.sql
  ```

- Create `.env` File: This file holds your environment variables. Do not commit it to version control.

  ```dotenv
  # .env
  PORT=3000
  HOST=0.0.0.0
  LOG_LEVEL=info
  NODE_ENV=development

  # Infobip configuration
  # Obtain from Infobip or define yourself
  INFOBIP_WEBHOOK_SECRET=your_strong_shared_secret_here

  # Database configuration (example for PostgreSQL)
  DATABASE_URL="postgresql://user:password@localhost:5432/infobip_webhooks?schema=public"
  ```

  - `INFOBIP_WEBHOOK_SECRET`: A secret string shared between your application and Infobip, used to verify webhook signatures. You must define it here and configure the same value in your Infobip webhook settings.
  - `DATABASE_URL`: Connection string for your database. Adjust accordingly.

- Initialize Prisma: Set up Prisma in your project.

  ```bash
  npx prisma init
  ```

  This creates the `prisma/schema.prisma` file and adds a placeholder `DATABASE_URL` to `.env` if one wasn't already present. Configure `prisma/schema.prisma` as shown in the next section.
2. Creating a database schema and data layer (Prisma)
We need a way to store the delivery status updates. We'll use Prisma for this.
- Define Prisma Schema: Open `prisma/schema.prisma` and define the data source, generator, and a model to store status updates.

  ```prisma
  // prisma/schema.prisma
  generator client {
    provider = "prisma-client-js"
  }

  datasource db {
    provider = "postgresql" // Or your chosen database: mysql, sqlite, sqlserver, mongodb
    url      = env("DATABASE_URL")
  }

  model MessageStatus {
    id          String   @id @default(cuid()) // Unique DB identifier
    messageId   String   @unique              // Infobip's message ID
    status      String                        // e.g., DELIVERED_TO_HANDSET, UNDELIVERABLE
    groupName   String                        // e.g., DELIVERED, UNDELIVERABLE, PENDING
    description String?                       // Detailed status description
    errorCode   Int?                          // Error code if applicable
    receivedAt  DateTime @default(now())      // Timestamp when webhook was received
    updatedAt   DateTime @updatedAt           // Timestamp of last update to this record

    @@index([messageId])
    @@index([status])
    @@index([receivedAt])
  }
  ```

  - This schema defines a `MessageStatus` table.
  - `messageId` is marked unique, as we typically want only the latest status for a given message.
  - Adjust fields based on the data you actually need from the Infobip payload. Consult Infobip's documentation for the exact structure of the delivery report.

- Run Initial Migration: Create the database table from the schema. Make sure your database server is running and reachable via the `DATABASE_URL` in `.env`.

  ```bash
  npx prisma migrate dev --name init
  ```

  This command:
  - Creates the SQL migration file in `prisma/migrations/`.
  - Applies the migration to your database.
  - Generates the Prisma client from your schema (`node_modules/.prisma/client`).

- Create Prisma Plugin for Fastify: Create a Fastify plugin to instantiate the Prisma client and make it available to your routes.

  ```javascript
  // src/plugins/prisma.js
  'use strict'

  const fp = require('fastify-plugin')
  const { PrismaClient } = require('@prisma/client')

  async function prismaPlugin (fastify, options) {
    const prisma = new PrismaClient({
      log: process.env.NODE_ENV === 'development'
        ? ['query', 'info', 'warn', 'error']
        : ['warn', 'error']
    })

    await prisma.$connect()
    fastify.log.info('Prisma client connected.')

    fastify.decorate('prisma', prisma)

    fastify.addHook('onClose', async (instance) => {
      instance.log.info('Disconnecting Prisma client...')
      await instance.prisma.$disconnect()
      instance.log.info('Prisma client disconnected.')
    })
  }

  module.exports = fp(prismaPlugin)
  ```

  - This plugin creates a `PrismaClient` instance and connects it when the Fastify server starts.
  - It decorates the Fastify instance with `fastify.prisma`, making the client available in request handlers. Wrapping the plugin in `fastify-plugin` (`fp`) lifts the decorator out of the plugin's encapsulation context.
  - It disconnects the client gracefully on server shutdown via the `onClose` hook.
3. Implementing core functionality & API layer
Now, let's build the main application logic and the webhook endpoint.
- Configure Fastify Application (`src/app.js`): Set up the core Fastify application and register plugins and routes.

  ```javascript
  // src/app.js
  'use strict'

  require('dotenv').config()

  const Fastify = require('fastify')
  const sensible = require('@fastify/sensible')
  const prismaPlugin = require('./plugins/prisma')
  const webhookRoutes = require('./routes/webhooks')

  async function build (opts = {}) {
    const app = Fastify({
      logger: {
        level: process.env.LOG_LEVEL || 'info'
        // In development, output is piped through pino-pretty by the "dev" npm script.
      },
      ajv: {
        customOptions: {
          // Allow properties not defined in the schema (Infobip might add fields).
          // For production, consider a strict schema with removeAdditional: true.
          removeAdditional: false,
          coerceTypes: false,
          allErrors: true
        }
      },
      ...opts
    })

    // Custom JSON parser that exposes the raw request body for signature
    // verification. It replaces Fastify's default JSON parser.
    app.addContentTypeParser('application/json', { parseAs: 'buffer' }, function (req, body, done) {
      try {
        req.rawBody = body // Store the raw buffer
        // Now parse the JSON from the buffer for handler use
        const json = JSON.parse(body.toString('utf8'))
        done(null, json)
      } catch (err) {
        err.statusCode = 400 // Bad Request if JSON parsing fails
        done(err, undefined)
      }
    })

    // Register plugins
    app.register(sensible) // Provides httpErrors, assert, etc.
    app.register(prismaPlugin)

    // Register routes
    app.register(webhookRoutes, { prefix: '/webhooks' })

    // Basic health check route
    app.get('/health', async (request, reply) => {
      try {
        // Optional: check DB connectivity
        await app.prisma.$queryRaw`SELECT 1`
        return { status: 'ok', timestamp: new Date().toISOString(), db: 'connected' }
      } catch (dbError) {
        request.log.error({ err: dbError }, 'Health check failed - DB connection error')
        reply.code(503) // Service Unavailable
        return { status: 'error', timestamp: new Date().toISOString(), db: 'disconnected' }
      }
    })

    return app
  }

  module.exports = { build }
  ```

  - Loads environment variables using `dotenv`.
  - Initializes Fastify with logging options (`pino-pretty` formatting is applied by the `dev` script, not in code).
  - Crucially, adds a custom content type parser for `application/json`. This parser does two things:
    - Stores the original raw buffer of the request body on `req.rawBody`. This is essential for verifying the webhook signature later, because the signature is calculated over the raw, unparsed body.
    - Parses the incoming JSON payload so your handler can access it easily (`done(null, json)`).
  - Registers `@fastify/sensible` and our `prismaPlugin`.
  - Registers the webhook routes under the `/webhooks` prefix.
  - Includes a basic `/health` check endpoint.
- Create Server Entry Point (`src/server.js`): This file builds and starts the Fastify server.

  ```javascript
  // src/server.js
  'use strict'

  require('dotenv').config()

  const { build } = require('./app')

  const start = async () => {
    const host = process.env.HOST || '127.0.0.1'
    const port = parseInt(process.env.PORT || '3000', 10)

    let server
    try {
      server = await build()
      await server.listen({ port, host })

      // Handle graceful shutdown
      const signals = ['SIGINT', 'SIGTERM']
      signals.forEach(signal => {
        process.on(signal, async () => {
          server.log.info(`Received ${signal}, shutting down gracefully...`)
          await server.close() // Prisma disconnect is handled by the plugin's 'onClose' hook
          process.exit(0)
        })
      })
    } catch (err) {
      // Log errors during startup
      if (server) {
        server.log.error(err, 'Server startup error after build')
      } else {
        console.error('Server startup error before build:', err)
      }
      process.exit(1)
    }
  }

  start()
  ```

  - Builds the Fastify app using `app.js`.
  - Starts the server listening on the configured host and port.
  - Includes basic graceful shutdown handling for `SIGINT` and `SIGTERM`.
- Implement Webhook Route (`src/routes/webhooks.js`): Define the endpoint that Infobip will call.

  ```javascript
  // src/routes/webhooks.js
  'use strict'

  const crypto = require('node:crypto')

  // !! IMPORTANT PLACEHOLDER !!
  // This schema MUST match the actual payload structure provided by Infobip
  // for delivery reports. Consult the official Infobip documentation.
  // Using an accurate schema enables validation and helps Fastify optimize parsing.
  const deliveryReportSchema = {
    body: {
      type: 'object',
      required: ['results'],
      properties: {
        results: {
          type: 'array',
          items: {
            type: 'object',
            required: ['messageId', 'status'],
            properties: {
              messageId: { type: 'string' },
              to: { type: 'string' },
              sentAt: { type: 'string', format: 'date-time' }, // Example format, verify actual
              doneAt: { type: 'string', format: 'date-time' }, // Example format, verify actual
              smsCount: { type: 'integer' },
              price: {
                type: 'object',
                properties: {
                  pricePerMessage: { type: 'number' },
                  currency: { type: 'string' }
                }
              },
              status: {
                type: 'object',
                required: ['groupId', 'groupName', 'id', 'name', 'description'],
                properties: {
                  groupId: { type: 'integer' },
                  groupName: { type: 'string' }, // e.g., PENDING, UNDELIVERABLE, DELIVERED, REJECTED
                  id: { type: 'integer' },
                  name: { type: 'string' }, // e.g., PENDING_WAITING_DELIVERY, UNDELIVERABLE_NOT_DELIVERED
                  description: { type: 'string' }
                }
              },
              error: { // Optional error details - verify structure
                type: 'object',
                required: ['groupId', 'groupName', 'id', 'name', 'description', 'permanent'],
                properties: {
                  groupId: { type: 'integer' },
                  groupName: { type: 'string' },
                  id: { type: 'integer' },
                  name: { type: 'string' },
                  description: { type: 'string' },
                  permanent: { type: 'boolean' }
                }
              }
              // Add any other fields documented by Infobip
            }
          }
        }
      }
    }
  }

  async function webhookRoutes (fastify, options) {
    // Hook for signature verification (applied only to routes in this file)
    fastify.addHook('preHandler', async (request, reply) => {
      const secret = process.env.INFOBIP_WEBHOOK_SECRET
      if (!secret) {
        request.log.error('CRITICAL: INFOBIP_WEBHOOK_SECRET environment variable is not configured. Cannot verify webhook signature.')
        // Fail closed for security
        throw fastify.httpErrors.internalServerError('Webhook security configuration is missing.')
      }

      // !! Verify header name !! Check Infobip documentation for the correct signature header.
      // Common example: 'x-infobip-signature'
      const signatureHeader = request.headers['x-infobip-signature']
      if (!signatureHeader) {
        request.log.warn('Missing X-Infobip-Signature header from incoming request.')
        throw fastify.httpErrors.unauthorized('Missing webhook signature header')
      }

      if (!request.rawBody) {
        // This should not happen if the content type parser is set up correctly
        request.log.error('FATAL: Raw request body (request.rawBody) is missing. Check Fastify content type parser setup in app.js.')
        throw fastify.httpErrors.internalServerError('Internal server configuration error processing request body.')
      }

      try {
        // !! Verify algorithm !! Ensure 'sha256' matches the algorithm specified by Infobip.
        const hmac = crypto.createHmac('sha256', secret)
        const digest = Buffer.from(hmac.update(request.rawBody).digest('hex'), 'utf8')
        const receivedSignature = Buffer.from(signatureHeader, 'utf8')

        // Use timingSafeEqual to prevent timing attacks
        if (receivedSignature.length !== digest.length || !crypto.timingSafeEqual(digest, receivedSignature)) {
          request.log.warn(`Invalid webhook signature received. Header: ${signatureHeader}`)
          throw fastify.httpErrors.unauthorized('Invalid webhook signature')
        }
        request.log.info('Webhook signature verified successfully.')
      } catch (error) {
        if (error.statusCode === 401) {
          // Re-throw unauthorized specifically
          throw error
        }
        // Log the underlying error for debugging
        request.log.error({ err: error }, 'Error occurred during signature verification process.')
        throw fastify.httpErrors.internalServerError('Signature verification failed due to an internal error.')
      }
    })

    fastify.post('/infobip', { schema: deliveryReportSchema }, async (request, reply) => {
      // Signature verification already passed if we reach here (preHandler hook)
      request.log.info({ payload: request.body }, 'Received verified Infobip delivery report')

      const results = request.body.results

      // Process each result in the payload.
      // Use Promise.allSettled for resilience: errors for individual results
      // do not stop the processing of others in the batch.
      const processingPromises = results.map(async (result) => {
        const { messageId, status, error } = result

        if (!messageId) {
          request.log.warn({ result }, 'Skipping result: Missing required field "messageId"')
          return { messageId: null, status: 'skipped', reason: 'Missing messageId' }
        }

        try {
          // Use upsert: update if messageId exists, otherwise create a new entry.
          // This makes the endpoint idempotent regarding database state.
          const updatedStatus = await fastify.prisma.messageStatus.upsert({
            where: { messageId: messageId },
            update: {
              status: status.name, // Use specific status name
              groupName: status.groupName,
              description: status.description,
              errorCode: error ? error.id : null // Store error code if present (verify 'error.id')
              // updatedAt is handled automatically by @updatedAt
            },
            create: {
              messageId: messageId,
              status: status.name,
              groupName: status.groupName,
              description: status.description,
              errorCode: error ? error.id : null
              // receivedAt is handled automatically by @default(now())
            }
          })
          request.log.info({ messageId: updatedStatus.messageId, status: updatedStatus.status }, 'Message status updated/created in DB.')
          return { messageId, status: 'processed' }
        } catch (dbError) {
          request.log.error({ err: dbError, messageId }, 'Failed to update/create message status in DB')
          // Depending on requirements, you might want to retry or flag this
          // for manual review (e.g., push to a dead-letter queue).
          return { messageId, status: 'failed', reason: dbError.message }
        }
      })

      // Wait for all database operations to settle (complete or fail)
      const outcomes = await Promise.allSettled(processingPromises)
      request.log.info({ outcomes }, 'Finished processing batch of delivery reports.')

      // Filter outcomes to see if any failed (optional, for logging/alerting)
      const failedOutcomes = outcomes.filter(o => o.status === 'rejected' || (o.status === 'fulfilled' && o.value.status === 'failed'))
      if (failedOutcomes.length > 0) {
        request.log.warn({ count: failedOutcomes.length, details: failedOutcomes }, 'Some delivery reports in the batch failed processing.')
      }

      // CRITICAL: Acknowledge receipt to Infobip with a 200 OK status.
      // Failure to respond with 2xx might cause Infobip to retry sending the
      // same webhook, leading to duplicate processing attempts.
      reply.code(200).send({ received: true, processed: outcomes.length })
    })
  }

  module.exports = webhookRoutes
  ```

  - Schema: Defines the expected structure (`deliveryReportSchema`). This is explicitly marked as a placeholder; you must replace it with the actual schema from Infobip's documentation.
  - Signature Verification (`preHandler`):
    - Runs before the main route handler.
    - Checks for `INFOBIP_WEBHOOK_SECRET`.
    - Gets the signature from headers (assuming `X-Infobip-Signature`; verify this header name).
    - Checks for `request.rawBody`.
    - Calculates the HMAC signature (assuming `sha256`; verify this algorithm) using the secret and `request.rawBody`.
    - Compares signatures using `crypto.timingSafeEqual`.
    - Throws appropriate HTTP errors (401 Unauthorized or 500 Internal Server Error) on failure.
  - Main Handler (`fastify.post('/infobip', ...)`):
    - Logs the received payload.
    - Iterates through the `results` array.
    - Uses `fastify.prisma.messageStatus.upsert` for idempotent database updates/creates.
    - Handles database errors gracefully using `Promise.allSettled`.
    - Sends a `200 OK` response back to Infobip.
4. Integrating with Infobip
Your application is ready to receive webhooks, but you need to tell Infobip where to send them and provide the necessary secret.
- Obtain Public URL: Your webhook endpoint must be accessible from the public internet.
  - Development: Use a tool like `ngrok` to expose your local server. Run `ngrok http 3000` (if your app runs on port 3000). `ngrok` provides a temporary public HTTPS URL (e.g., `https://<unique-subdomain>.ngrok.io`). Note: Free `ngrok` URLs change each time you restart it and have usage limitations; they are suitable only for temporary development testing.
  - Production: Use the stable, public URL of your deployed application (e.g., `https://your-app-domain.com`). This URL must be HTTPS.
- Configure Infobip Webhook:
  - Log in to your Infobip account/portal.
  - Navigate to the section for configuring API settings, applications, or webhooks. Look for terms like "Delivery Reports," "Callbacks," or "Webhooks." (This might be under an "Applications" section or general "API Settings.")
  - Find the option to set a URL for receiving delivery reports (often called "Delivery Report URL" or similar).
  - Enter the full public HTTPS URL of your endpoint: `https://<your-public-url>/webhooks/infobip`.
  - Look for a field to configure webhook security or a "Secret Key." Enter the exact same secret string you defined in your `.env` file for `INFOBIP_WEBHOOK_SECRET`.
  - Ensure the signature algorithm selected in Infobip (e.g., HMAC-SHA256) exactly matches the one used in your `preHandler` code (`crypto.createHmac('sha256', ...)`).
  - Save the configuration in Infobip.
- API Key Security: While this guide focuses on receiving webhooks, remember that your sending application needs an Infobip API key. Store this key securely (e.g., in an environment variable) and never commit it to version control. The `INFOBIP_WEBHOOK_SECRET` is equally sensitive and should be treated like a password.
5. Implementing proper error handling, logging, and retry mechanisms
- Error Handling:
  - `@fastify/sensible` provides convenient HTTP error generation (e.g., `fastify.httpErrors.unauthorized()`).
  - The signature verification `preHandler` catches authentication/configuration errors early.
  - The main handler uses `try...catch` within the mapping loop (via `Promise.allSettled`) to handle potential database errors for individual status updates without crashing the entire request.
  - Fastify's default error handler catches unhandled exceptions and sends a generic 500 response. Customize using `fastify.setErrorHandler()` if needed.
- Logging:
  - Fastify uses `pino` for efficient JSON logging, configured in `src/app.js`. Set `LOG_LEVEL` via `.env`.
  - Log key events: receiving a webhook, signature verification success/failure, database update success/failure, batch processing completion, configuration errors.
  - In development, pipe to `pino-pretty` (the `dev` script in `package.json`) for readability. In production, ingest structured JSON logs into a log management system (e.g., Datadog, ELK stack, Splunk, Loki).
- Retry Mechanisms:
  - Infobip Retries: If your endpoint fails to respond with `200 OK` (e.g., downtime, crash, or a signature failure causing a 4xx/5xx), Infobip may retry sending the webhook. Check their documentation for specific retry policies (timing, number of attempts).
  - Application Idempotency: Because Infobip might retry, your endpoint must be idempotent: processing the same webhook multiple times should yield the same final state. Using `prisma.upsert` keyed on the unique `messageId` achieves idempotency at the database level. The signature check prevents processing unauthorized duplicates.
  - Internal Retries: For transient database errors within your handler, you could implement limited retry logic (e.g., using `async-retry`). However, it is often better to log the error clearly and rely on idempotency plus Infobip's retries, or push failed updates to a dead-letter queue for separate investigation. Ensure your endpoint still returns `200 OK` quickly even if some internal processing fails, to prevent unnecessary Infobip retries.
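If you do opt for limited internal retries, a dependency-free backoff helper can be sketched as follows. The attempt count and delays are illustrative, and the `upsert` wiring shown in the trailing comment is hypothetical.

```javascript
// Minimal retry-with-exponential-backoff helper for transient errors.
// attempts and baseDelayMs are illustrative defaults; tune for your workload.
async function withRetry (fn, { attempts = 3, baseDelayMs = 100 } = {}) {
  let lastError
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn()
    } catch (err) {
      lastError = err
      // Back off between attempts (100 ms, 200 ms, 400 ms, ...), but not after the last one.
      if (attempt < attempts - 1) {
        await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** attempt))
      }
    }
  }
  throw lastError
}

// Hypothetical usage inside the webhook handler:
// await withRetry(() => fastify.prisma.messageStatus.upsert({ /* ... */ }))
```

Keep the total retry budget well under Infobip's webhook timeout so the endpoint can still acknowledge quickly.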
6. Adding security features
- Webhook Signature Verification: Mandatory. Implemented in the `preHandler` hook using HMAC (algorithm verified against Infobip docs) and a strong, unique shared secret. Ensures authenticity and integrity.
- Input Validation: Fastify's schema validation (`deliveryReportSchema`) provides basic structure validation. Crucially, ensure this schema matches Infobip's actual payload. Set `removeAdditional: false` cautiously; for stricter security, define a precise schema and consider setting `removeAdditional: true`, or explicitly handle unexpected properties.
- HTTPS: Mandatory. Always use HTTPS for the webhook endpoint URL configured in Infobip and exposed by your server. `ngrok` provides this for development. In production, ensure your load balancer or server terminates TLS/SSL.
- Rate Limiting: Protect against DoS or abuse with `@fastify/rate-limit`.

  ```bash
  npm install @fastify/rate-limit
  ```

  ```javascript
  // src/app.js - register the plugin
  // ... other requires
  const rateLimit = require('@fastify/rate-limit')

  async function build (opts = {}) {
    const app = Fastify({ /* ... */ }) // Existing Fastify setup

    // Register rate limiting early, before routes if possible
    await app.register(rateLimit, {
      max: 100,              // Example: max 100 requests per window per IP
      timeWindow: '1 minute'
      // Consider a key generator based on something Infobip might send if source IPs vary
      // keyGenerator: function (request) { return request.ip } // Default is IP
    })

    // ... rest of app setup, including routes
    return app
  }
  ```

- Secrets Management: Store `INFOBIP_WEBHOOK_SECRET` and `DATABASE_URL` securely in environment variables. Never hardcode or commit them. Use a secrets management system in production (e.g., HashiCorp Vault, AWS Secrets Manager, Doppler, or platform-specific secrets).
- Least Privilege (Database): The database user specified in `DATABASE_URL` should have only the minimum required permissions (e.g., `SELECT`, `INSERT`, `UPDATE` on the `MessageStatus` table), not broad administrative rights.
- Disable Introspection Endpoints: In production, ensure no debugging or introspection endpoints (like a GraphQL playground, if used elsewhere) are accidentally exposed publicly.
7. Handling special cases relevant to the domain
- Payload Variations: Infobip might send different fields or structures for different statuses (e.g., errors vs. success). Your schema and parsing logic must be robust; check Infobip's documentation whenever the payload changes.
- Duplicate Webhooks (Idempotency): Handled by `upsert` and the unique constraint on `messageId`. Signature verification prevents processing unauthorized duplicates.
- Out-of-Order Webhooks: It's possible to receive status updates out of sequence (e.g., `DELIVERED` before `ACCEPTED`). The `upsert` logic always stores the most recently received state for a given `messageId`. If sequence matters critically, compare timestamps (`doneAt` from Infobip or `receivedAt` from your system) before overwriting.
- Large Payloads / Batching: The example code handles the `results` array. If batches become extremely large, consider the performance implications and ensure your database transactions or operations can handle the batch size efficiently.
- Time Zones: Be mindful of time zones when storing and comparing timestamps (`sentAt`, `doneAt`, `receivedAt`). Store timestamps in UTC (`DateTime` in Prisma typically maps to `timestamp with time zone` in PostgreSQL) and convert as needed for display or comparison.
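If strict ordering matters, one possible guard is to overwrite only when the incoming report is newer. This is a sketch; it assumes you extend the schema to also store Infobip's `doneAt` timestamp, which this guide's schema does not include by default.

```javascript
// Decide whether an incoming delivery report should overwrite the stored status.
// Overwrite when nothing is stored yet, or when the incoming report is newer.
function shouldOverwrite (storedDoneAt, incomingDoneAt) {
  if (!storedDoneAt) return true    // first report seen for this messageId
  if (!incomingDoneAt) return false // incoming report carries no timestamp; keep what we have
  return new Date(incomingDoneAt) > new Date(storedDoneAt)
}

console.log(shouldOverwrite(null, '2024-01-01T10:00:00Z'))                    // true
console.log(shouldOverwrite('2024-01-01T10:00:00Z', '2024-01-01T09:59:00Z')) // false (late, older report)
console.log(shouldOverwrite('2024-01-01T10:00:00Z', '2024-01-01T10:01:00Z')) // true
```

In the handler you would read the stored record first and skip the `upsert`'s update branch when this returns `false`; note this turns the single `upsert` into a read-then-write, so consider a transaction if concurrent reports for one `messageId` are possible.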
Frequently Asked Questions
How to set up Infobip delivery reports webhook?
Set up a webhook endpoint using Fastify and Node.js to receive real-time SMS delivery reports from Infobip. This involves creating a Fastify application, validating incoming requests, parsing the payload, and storing the status information in a database. The guide provides a step-by-step process for implementing this.
What is the purpose of Infobip delivery reports?
Infobip delivery reports provide real-time updates on the status of your sent SMS messages, such as whether they were delivered, failed, or are pending. This allows for more accurate tracking than relying solely on the initial API response when sending messages.
Why use Fastify for Infobip webhooks?
Fastify is a high-performance Node.js web framework chosen for its speed and developer-friendly plugin ecosystem. Its efficiency makes it ideal for handling real-time updates from webhooks like those from Infobip.
When should I use Infobip delivery report webhooks?
Use Infobip delivery report webhooks when you need real-time tracking of SMS message statuses beyond the initial API sending response. They are crucial for applications requiring confirmation of delivery or handling failures.
Can I use a different database with Fastify for Infobip?
Yes, while the example uses PostgreSQL with Prisma, you can adapt the project to other databases. The provided Prisma schema can be modified to work with MySQL, SQLite, SQL Server, or MongoDB by changing the provider in the schema file.
How to verify Infobip webhook signature in Node.js?
Verify the signature using an HMAC-SHA256 algorithm with a shared secret configured in your Infobip account and your .env file. The signature is calculated based on the raw request body, so ensure your Fastify setup stores this raw body before parsing the JSON.
What is the role of Prisma in handling Infobip delivery reports?
Prisma acts as an ORM (Object-Relational Mapper), simplifying database interactions. It's used to define the database schema, manage migrations, and provide a client for querying and storing delivery report data in the database.
How to handle errors with Infobip delivery reports using Fastify?
Implement robust error handling using try...catch blocks within the webhook handler, especially within database operations. Use Fastify's HTTP error utilities (@fastify/sensible) for clear error responses, and implement logging for tracking issues.
How to secure Infobip delivery report webhook endpoint?
Secure your endpoint by verifying the HMAC-SHA256 signature of incoming requests, using HTTPS, validating input against a schema, and implementing rate limiting to prevent abuse. Proper secrets management and least privilege database access are also essential.
What are the prerequisites for setting up this Infobip webhook handler?
You need a working Node.js environment, npm or yarn, an active Infobip account with API access, database access, a way to publicly expose your server (like ngrok for development), and a basic understanding of JavaScript, Node.js, REST APIs, and webhooks.
What is the importance of rawBody in Infobip webhook verification?
The rawBody contains the original, unparsed request body which is crucial for calculating the HMAC-SHA256 signature. Ensure your Fastify application's content type parser stores this raw body before any JSON parsing occurs.
How to manage Infobip webhook secrets securely?
Store sensitive information like the INFOBIP_WEBHOOK_SECRET and DATABASE_URL in environment variables. Never hardcode or commit them to version control. In production, use a dedicated secrets management system.
How to handle duplicate Infobip delivery reports?
The provided code handles duplicates using Prisma's upsert functionality, ensuring idempotent database updates based on the unique messageId. This, combined with signature verification, prevents processing unauthorized or unintended duplicates.
How does the Infobip webhook handler handle large batches of delivery reports?
The handler processes results in batches using an array. For extremely large batches, consider potential performance impacts on database transactions and optimize accordingly. Ensure your database operations are efficient for handling bulk updates.