
Building a Real-Time Vendor Referral Tracking System: Architecture & Implementation

Over a recent development session, we built and deployed a complete vendor referral program infrastructure for the JADA network. This article details the technical implementation, architectural decisions, and DevOps patterns used to track referral clicks in real time across seven vendor partners.

What We Built

The JADA Vendor Referral Program is a lightweight tracking system that allows partner vendors to share branded referral links, automatically logs click events to DynamoDB, and seamlessly redirects guests to the main site with UTM parameters for analytics attribution. The system went live with seven vendors:

  • Solare
  • Smallgoods
  • Puesto
  • Board & Brew
  • Whole Foods
  • Total Wine
  • Gianni Buonomo Vintners

Each vendor received a unique referral link following the pattern https://shipcaptaincrew.queenofsandiego.com/ref/{vendor_code}.

Technical Architecture

Endpoint Implementation

The core tracking logic was implemented as a new GET endpoint in the existing ShipCaptainCrew Lambda function. The handle_referral_click() function was added to the deployed Lambda codebase and handles the complete referral flow:

GET /ref/{code}
→ DynamoDB write (vendor_name, timestamp, click_count)
→ GA4 UTM redirect to queenofsandiego.com
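The flow above can be sketched as a minimal Python handler. The function name handle_referral_click() comes from the deployed code, but the destination URL, UTM parameter values, and event shape shown here are assumptions for illustration:

```python
from urllib.parse import urlencode

# Assumed destination; the article only says "queenofsandiego.com".
DESTINATION = "https://queenofsandiego.com/"

def build_redirect_url(vendor_code: str) -> str:
    """Build the GA4 UTM redirect URL for a vendor referral click."""
    params = {
        "utm_source": vendor_code,        # e.g. "solare"
        "utm_medium": "referral",         # assumed campaign taxonomy
        "utm_campaign": "vendor_referral",
    }
    return f"{DESTINATION}?{urlencode(params)}"

def handle_referral_click(event):
    """Sketch of the Lambda handler: log the click, then 302-redirect."""
    code = event["pathParameters"]["code"]
    # record_click(code)  # DynamoDB write, covered in the next section
    return {
        "statusCode": 302,
        "headers": {"Location": build_redirect_url(code)},
    }
```

A 302 (rather than 301) keeps CDNs and browsers from caching the redirect, so every click still reaches the tracking code.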

Rather than creating a separate Lambda, we extended the existing function to keep operational overhead minimal and maintain unified logging. This decision reduced cold-start latency and simplified IAM role management—the Lambda already had write permissions to our DynamoDB tables.

Data Storage Strategy

Click events are persisted to a DynamoDB table with the following structure:

  • Partition Key: vendor_name (e.g., "solare", "smallgoods")
  • Sort Key: timestamp (ISO-8601 format)
  • Attributes: click_count (running aggregate), source_ip (for analytics)

We chose DynamoDB over RDS because referral tracking is write-heavy, tolerates eventual consistency, and doesn't require complex relational queries. The on-demand billing model means we pay only for actual clicks—perfect for a new program with unpredictable volume.
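A click event shaped for the table above might look like the following sketch. The attribute names match the article; the table name and the boto3 wiring are assumptions:

```python
from datetime import datetime, timezone

def build_click_item(vendor_name: str, source_ip: str) -> dict:
    """Shape one click event for the referral table.

    Partition key: vendor_name; sort key: timestamp (ISO-8601).
    """
    return {
        "vendor_name": vendor_name,                           # partition key
        "timestamp": datetime.now(timezone.utc).isoformat(),  # sort key
        "click_count": 1,        # one per event; aggregated at read time
        "source_ip": source_ip,  # kept for analytics
    }

# Persisting it is then a single call (table name is hypothetical):
# boto3.resource("dynamodb").Table("scc_referral_clicks").put_item(
#     Item=build_click_item("solare", "203.0.113.7"))
```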

Infrastructure & Deployment

Route53 Routing

No new DNS records were needed. The /ref/{code} path routes through the existing shipcaptaincrew.queenofsandiego.com DNS entry, which is already aliased to the SCC Lambda's function URL in Route53.

CloudFront Caching Strategy

The /ref/* paths bypass CloudFront caching to ensure real-time click tracking. In the CloudFront distribution (ID: E*****), we configured cache behaviors with:

  • Path pattern: /ref/*
  • Cache policy: CachingDisabled (TTL: 0)
  • Origin request policy: AllViewerAndWhitelistCloudFrontHeaders

This ensures every click reaches the Lambda function immediately, avoiding stale data in edge caches. After deploying the referral endpoint, we invalidated the CloudFront distribution cache using:

aws cloudfront create-invalidation \
  --distribution-id E***** \
  --paths "/ref/*"

S3 & Preview Infrastructure

To manage vendor outreach and approval workflows, we created a preview page stored in S3:

  • File: /tmp/vendor-referral-outreach.html (local dev)
  • S3 Bucket: queenofsandiego-assets (existing bucket)
  • S3 Key: vendor-referral-outreach/preview-20250108.html
  • CloudFront URL: https://assets.queenofsandiego.com/vendor-referral-outreach/preview-20250108.html

The preview page embeds all three pending vendor email templates (HTML formatted) for internal review and approval before SES sends them to the vendor contacts.
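The date-stamped key and CloudFront URL from the list above can be derived programmatically; a sketch, with the upload call shown as a comment since the bucket wiring is environment-specific:

```python
from datetime import date

BUCKET = "queenofsandiego-assets"            # existing bucket, per the article
CDN_BASE = "https://assets.queenofsandiego.com"

def build_preview_key(d: date) -> str:
    """Date-stamped S3 key for a vendor outreach preview page."""
    return f"vendor-referral-outreach/preview-{d:%Y%m%d}.html"

def preview_url(key: str) -> str:
    """Public CloudFront URL for a given S3 key."""
    return f"{CDN_BASE}/{key}"

# Upload would then be one boto3 call; setting ContentType matters so
# CloudFront serves the file as a page rather than a download:
# boto3.client("s3").upload_file(
#     "/tmp/vendor-referral-outreach.html",
#     BUCKET,
#     build_preview_key(date.today()),
#     ExtraArgs={"ContentType": "text/html"},
# )
```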

Vendor Outreach & Approval Workflow

Rather than sending emails ad hoc, we built an approval workflow by creating a dashboard task card linked to the preview page.

Dashboard Task Creation

A new task card was created in the progress dashboard system using the internal update_dashboard.py script:

python update_dashboard.py \
  --task-type "vendor_referral_approval" \
  --title "Review & Approve Vendor Referral Emails" \
  --description "Preview 3 vendor outreach emails for Smallgoods, Puesto, and Board & Brew" \
  --preview-url "https://assets.queenofsandiego.com/vendor-referral-outreach/preview-20250108.html" \
  --status "pending_review"

This task is visible in the internal dashboard and provides a single source of truth for vendor outreach status. The embedded preview URL allows reviewers to inspect the exact email templates before approval.

Key Technical Decisions

Why Extend the Existing Lambda vs. Creating a New Function

Combining the referral endpoint with the existing SCC Lambda reduced operational complexity:

  • Single CloudWatch Logs stream for all SCC traffic
  • Unified IAM role—no new role creation or permissions debugging
  • Faster deployment iteration—no new function aliases or environment variables
  • Lower cost—fewer concurrent Lambda instances and reserved concurrency units

Why DynamoDB Over RDS

DynamoDB's schemaless design and on-demand billing align perfectly with referral tracking requirements:

  • No database connection pooling needed (Lambda cold starts are faster)
  • Pay-per-request pricing scales elegantly if one vendor drives 10,000 clicks
  • TTL policies can auto-purge old click records after 90 days
  • No need for complex JOIN queries—just partition by vendor_name
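The 90-day auto-purge mentioned above works by writing an epoch-seconds expiry attribute on each item and registering that attribute as the table's TTL. A sketch; the attribute name expires_at is an assumption:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)

def with_ttl(item: dict, now: datetime) -> dict:
    """Add a DynamoDB TTL attribute (epoch seconds) 90 days out.

    The attribute name 'expires_at' is hypothetical; whichever name is
    chosen must also be enabled on the table (UpdateTimeToLive).
    """
    expires = now + RETENTION
    return {**item, "expires_at": int(expires.timestamp())}
```

DynamoDB deletes expired items in the background at no cost, typically within a day or two of expiry, which is fine for analytics retention.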

Why Route Referrals Through ShipCaptainCrew vs. Main Site

Using shipcaptaincrew.queenofsandiego.com as the referral endpoint provides several benefits:

  • Keeps tracking logic in a single, dedicated Lambda function
  • Allows future vendor-specific logic without cluttering the main site router
  • Shorter redirect chain (1 hop: /ref → UTM redirect, vs. 2 hops through main site)
  • Simpler CloudFront cache invalidation (only SCC distribution needs updating)

Monitoring & Verification

The implementation was verified through multiple checks:

  • Lambda Logs: CloudWatch Logs group /aws/lambda/shipcaptaincrew for handle_referral_click()