When integrating Shopify webhooks with an internal API protected by the Kong API Gateway, one of the first—and most critical—challenges you’ll encounter is authentication. Shopify signs its webhook payloads using an HMAC-SHA256 signature, which is delivered in a custom HTTP header (x-shopify-hmac-sha256). Unfortunately, this doesn’t align neatly with standard authentication mechanisms, especially those built into API gateways like Kong.
I explored several architectural options to securely and reliably handle this data flow. Below is a breakdown of the five approaches I considered, along with their pros, cons, and feasibility.
Option 1: Use Kong’s Native HMAC Authentication Plugin
Kong offers an HMAC authentication plugin that validates requests signed with an HMAC signature. However, it expects the signature in the standard Authorization (or Proxy-Authorization) header.
Conflict: Shopify sends its signature in a custom header (x-shopify-hmac-sha256) and does not allow customization of its webhook headers.
Verdict: ❌ Not viable out-of-the-box.
To use this plugin, you’d need to either:
- Build an intermediary service that rewrites the header, or
- Use a third-party webhook relay (e.g., Hookdeck, Pipedream) to normalize the request before it hits Kong.
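For illustration, here is roughly what attaching the stock plugin to a route looks like in declarative configuration (service, route, and values are hypothetical). The plugin then expects the signature inside the Authorization or Proxy-Authorization header in its own hmac credential format, which Shopify will never send.

```yaml
# kong.yml -- hypothetical route guarded by the stock hmac-auth plugin.
# The plugin reads the signature from the Authorization (or Proxy-Authorization)
# header, a format Shopify's webhooks cannot be configured to produce.
_format_version: "3.0"
services:
  - name: internal-api            # placeholder service name
    url: http://internal-api.local
    routes:
      - name: shopify-webhooks
        paths:
          - /webhooks/shopify
        plugins:
          - name: hmac-auth
            config:
              algorithms:
                - hmac-sha256
              validate_request_body: true
              enforce_headers:
                - date
                - request-line
```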
Option 2: Develop a Custom Kong Plugin
A tailored Lua plugin could parse Shopify’s x-shopify-hmac-sha256 header, validate the payload using your shared secret, and reject invalid requests at the gateway layer.
Pros:
- Keeps validation logic within Kong, maintaining consistency.
- Reduces load on your downstream API.
- Enables early rejection of malicious or malformed requests.
- Fits well in architectures where the gateway is the first and only line of defense.
Cons:
- Long-term maintenance: Must track Kong updates and ensure plugin compatibility.
Verdict: ✅ A clean, efficient solution—if you have the in-house expertise.
In my own implementation, I chose this option. The custom plugin validates the Shopify HMAC directly in Kong, then forwards verified payloads to an internal API. That API publishes the payload to a message queue (e.g., RabbitMQ), where a worker service consumes, processes, and delivers it to its final destination. This design keeps the integration tight, observable, and secure—without introducing extra network hops.
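To make the idea concrete, here is a minimal sketch of such a plugin's access phase, not the exact code from my implementation: the plugin name, the shopify_secret config field, and the priority value are assumptions, and it relies on lua-resty-openssl, which ships with recent Kong releases.

```lua
-- handler.lua -- minimal sketch of a custom Shopify HMAC plugin (access phase).
-- Assumes a companion schema.lua defining a `shopify_secret` config field.
local openssl_hmac = require "resty.openssl.hmac"

local ShopifyHmacHandler = {
  PRIORITY = 1000, -- controls where it runs relative to other plugins; tune as needed
  VERSION  = "0.1.0",
}

function ShopifyHmacHandler:access(conf)
  local signature = kong.request.get_header("x-shopify-hmac-sha256")
  -- Shopify signs the raw, unparsed request body; it must fit in memory,
  -- which is fine for webhook-sized payloads.
  local body = kong.request.get_raw_body()

  if not signature or not body then
    return kong.response.exit(401, { message = "Missing Shopify HMAC signature or body" })
  end

  -- Recompute HMAC-SHA256 over the raw body with the app's shared secret
  -- and compare it to the base64 digest Shopify sent in the header.
  local mac = openssl_hmac.new(conf.shopify_secret, "sha256")
  mac:update(body)
  local expected = ngx.encode_base64(mac:final())

  -- A constant-time comparison is preferable in production.
  if expected ~= signature then
    return kong.response.exit(401, { message = "Invalid Shopify HMAC signature" })
  end
  -- Valid requests fall through and are proxied to the upstream API.
end

return ShopifyHmacHandler
```

Requests that pass this check continue on to the internal API, which hands the payload to the queue-based pipeline described above.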
Option 3: IP Address Whitelisting
This approach would restrict the Kong endpoint to only accept traffic from Shopify’s known IP ranges.
Conflict: Shopify does not publish a static list of webhook source IPs. Their infrastructure uses dynamic IPs that can change without notice.
Verdict: ❌ Not viable.
Relying on IP whitelisting would result in intermittent webhook failures and is not recommended by Shopify.
Option 4: Open Gateway Endpoint + App-Level Authentication
Remove authentication from Kong entirely and let your API handle signature validation in application code.
How it works:
- Kong route is left open (no authentication).
- Your API validates the x-shopify-hmac-sha256 header using the Shopify shared secret.
Major Risk:
While rate limiting can be configured in Kong to mitigate abuse, the rate-limiting plugin only counts requests that have passed authentication; failed authentication attempts are not counted, leaving the endpoint potentially vulnerable to brute-force or denial-of-service attacks targeting the authentication mechanism.
Verdict: ⚠️ High risk for mission-critical integrations.
Only consider this option if webhook loss is tolerable or if you have robust retry and failure-reconciliation mechanisms. That said, for non-public or internal-only APIs with low exposure, this approach can be acceptable—especially when development speed outweighs strict gateway-layer security.
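For context, the gateway side of this option might be declared roughly as follows; the service, route, and limits are placeholders, and the actual HMAC check lives in the application, essentially the same verification shown in the Option 2 sketch.

```yaml
# kong.yml -- hypothetical open route with rate limiting but no auth plugin.
# The application behind it must verify the x-shopify-hmac-sha256 header itself.
_format_version: "3.0"
services:
  - name: internal-api              # placeholder service name
    url: http://internal-api.local
    routes:
      - name: shopify-webhooks-open
        paths:
          - /webhooks/shopify
        plugins:
          - name: rate-limiting
            config:
              minute: 120   # placeholder limit
              policy: local # per-node counters; use redis for a cluster-wide limit
```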
Option 5: Create an Intermediary Service (Middleware)
Build a lightweight, dedicated service that:
- Receives Shopify webhooks directly.
- Validates the x-shopify-hmac-sha256 signature.
- Forwards valid requests to your Kong-protected API, now carrying standardized authentication (e.g., an API key, JWT, or HMAC in the expected format). A sketch of such a relay follows at the end of this option.
Pros:
- Decouples Shopify-specific logic from your core API.
- Full control over validation, logging, retries, and error handling.
- Secure: Your main API remains protected by Kong.
- Can add queueing, idempotency, or transformation as needed.
Cons:
- Additional service to deploy, monitor, and maintain.
- Slightly increased latency (usually negligible).
Verdict: ✅ A robust, enterprise-friendly pattern—especially when integrating multiple third-party webhooks or when your team prefers to keep gateway logic generic.
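As a rough illustration of the relay itself, the sketch below stays in Lua and uses OpenResty; lua-resty-http and lua-resty-openssl are assumed to be installed, and the secret, upstream URL, and API key are placeholders. Any language or framework your team already operates would work just as well.

```lua
-- relay.lua -- hypothetical OpenResty handler (content_by_lua_file) that verifies
-- Shopify's signature and forwards valid payloads to the Kong-protected API using
-- an auth scheme Kong understands natively (here, the key-auth plugin's apikey header).
local http = require "resty.http"
local openssl_hmac = require "resty.openssl.hmac"

local SHOPIFY_SECRET = "replace-with-shopify-app-secret"                -- placeholder
local KONG_URL       = "https://kong.internal.example/webhooks/shopify" -- placeholder
local KONG_API_KEY   = "replace-with-kong-consumer-key"                 -- placeholder

ngx.req.read_body()
local body = ngx.req.get_body_data()
local signature = ngx.req.get_headers()["x-shopify-hmac-sha256"]

if not body or not signature then
  return ngx.exit(ngx.HTTP_UNAUTHORIZED)
end

-- Recompute the HMAC-SHA256 digest over the raw body and compare it to Shopify's header.
local mac = openssl_hmac.new(SHOPIFY_SECRET, "sha256")
mac:update(body)
if ngx.encode_base64(mac:final()) ~= signature then
  return ngx.exit(ngx.HTTP_UNAUTHORIZED)
end

-- Forward the verified payload to Kong with standardized authentication.
local httpc = http.new()
local res, err = httpc:request_uri(KONG_URL, {
  method  = "POST",
  body    = body,
  headers = {
    ["Content-Type"] = ngx.req.get_headers()["content-type"] or "application/json",
    ["apikey"]       = KONG_API_KEY,
  },
})

if not res then
  ngx.log(ngx.ERR, "failed to forward Shopify webhook: ", err)
  return ngx.exit(ngx.HTTP_BAD_GATEWAY)
end

-- Relay the upstream status (and body) back to Shopify so its retry logic behaves correctly.
ngx.status = res.status
ngx.say(res.body or "")
```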
Final Thoughts: There’s No One-Size-Fits-All
The “best” option depends on your threat model, team capabilities, deployment constraints, and system criticality:
- If your API is public-facing and you want minimal infrastructure, Option 4 (app-level auth) might suffice—provided you accept the risks.
- If you prefer strict separation of concerns and plan to scale to multiple SaaS integrations, Option 5 (intermediary service) offers long-term flexibility.
- But if you run a Kong-centric architecture, have Lua/plugin experience, and want to keep authentication at the edge, Option 2 (custom plugin) is not only viable—it can be elegant and efficient.
In my case, Option 2 struck the right balance: it allowed me to receive Shopify webhook data directly on Kong Gateway, validate the payload inline, and seamlessly pass clean messages into a queue-based processing pipeline—all without adding extra services or compromising security. Here are the details of that implementation.
Choose the path that aligns best with your architecture, your team, and your reliability requirements.