
Automating Event Calendar Sync and Boat Operations Dispatch: A Multi-Platform Integration Case Study

What Was Done

Over this development session, we tackled three parallel infrastructure challenges for the Queen of San Diego events platform:

  • Replaced manual Google Apps Script (GAS) calendar management with a programmatic Lambda-backed API for event synchronization
  • Built an automated boat cleaning dispatch system that orchestrates tasks across multiple platforms without manual intervention
  • Integrated email notification workflows to keep stakeholders informed of system status and operational changes

The work required coordinating across four distinct systems: Google Calendar, AWS Lambda/API Gateway, Google Apps Script, and multiple email platforms. The goal was to eliminate manual calendar entry errors and reduce the cognitive load on event coordinators.

Technical Details: Calendar Sync Architecture

The existing calendar management relied on Google Apps Script files located at /Users/cb/Documents/repos/sites/queenofsandiego.com/rady-shell-events/apps-script-replacement/CalendarSync.gs. This script was performing Google Calendar API calls directly, but lacked a scalable query interface and didn't integrate with our event platform's Lambda infrastructure.

The solution: We mapped the GAS implementation to a Lambda-backed API Gateway endpoint that serves as the single source of truth for calendar operations.

API Design Pattern

Rather than directly calling Google Calendar from the web tier, we implemented an action-based Lambda interface. The Lambda function (deployed via /Users/cb/Documents/repos/sites/queenofsandiego.com/tools/shipcaptaincrew/lambda_function.py) accepts structured requests with an action parameter:


POST /calendar
Content-Type: application/json
Authorization: Bearer {dashboard_token}

{
  "action": "add-calendar-event",
  "title": "Sea Scout Wednesday Hold",
  "start_time": "2024-04-28T18:00:00Z",
  "end_time": "2024-04-28T20:00:00Z",
  "calendar_id": "{calendar_id}"
}

The Lambda function validates the action against a whitelist of allowed operations before executing any Google Calendar API calls. This keeps arbitrary or malformed operations from ever reaching the Calendar API and ensures auditability: every calendar modification is logged with its action context.
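The whitelist check described above can be sketched as a minimal handler. The action names and response shapes here are assumptions for illustration, not the deployed function's exact interface:

```python
import json

# Hypothetical whitelist mirroring the action-based interface; the real
# Lambda's action names may differ.
ALLOWED_ACTIONS = {
    "add-calendar-event",
    "update-calendar-event",
    "delete-calendar-event",
    "list-calendar-events",
}

def lambda_handler(event, context):
    """Validate the requested action before touching the Calendar API."""
    body = json.loads(event.get("body") or "{}")
    action = body.get("action")

    if action not in ALLOWED_ACTIONS:
        # Unknown actions are rejected outright; nothing executes downstream.
        return {
            "statusCode": 400,
            "body": json.dumps({"error": f"unsupported action: {action}"}),
        }

    # Log the action context for auditability before dispatching.
    print(f"calendar action accepted: {action}")
    return {"statusCode": 200, "body": json.dumps({"accepted": action})}
```

Because validation happens before any Google API call, a rejected request leaves no calendar state to roll back.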

Credentials and Authentication Flow

Google Calendar API credentials are stored as Lambda environment variables (not in code). The deployment process reads from a centralized repos.env file that contains:

  • Google Service Account JSON (for headless API access)
  • Calendar IDs for each managed calendar
  • Dashboard authentication tokens for request validation

This allows the Lambda function to authenticate against Google Calendar without storing secrets in version control. Google Calendar access is scoped to the service account, while the Lambda execution role carries only the minimal AWS IAM permissions it needs, following the principle of least privilege.
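The deployment step that reads repos.env might look like the following sketch. The file's actual layout isn't shown in this write-up, so this assumes a conventional dotenv-style KEY=VALUE format, and the variable names in `lambda_env_subset` are illustrative:

```python
def load_repos_env(path="repos.env"):
    """Parse dotenv-style KEY=VALUE lines into a dict.

    Blank lines and '#' comments are skipped; this is a deployment-time
    helper sketch, not the project's actual loader.
    """
    env = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip().strip('"')
    return env

def lambda_env_subset(env, keys=("GOOGLE_SERVICE_ACCOUNT_JSON",
                                 "CALENDAR_ID",
                                 "DASHBOARD_TOKEN")):
    """Pick only the variables the calendar Lambda actually needs (names assumed)."""
    return {k: env[k] for k in keys if k in env}
```

Injecting only the subset each function needs, rather than the whole file, keeps one function's compromise from exposing every credential.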

Technical Details: Boat Operations Dispatch System

The boat cleaning system required coordination across three subsystems:

  • /Users/cb/Documents/repos/tools/dispatch_boat_cleaner.py — Main dispatch orchestrator
  • /Users/cb/Documents/repos/tools/platform_inbox_scraper.py — Monitors external platform inboxes for booking confirmations
  • Boat platform APIs (GetMyBoat, Boatsetter) — External booking platforms

The workflow is event-driven: when a boat booking is confirmed on an external platform, the scraper detects it and triggers the dispatcher, which generates a cleaning task and sends notifications to the operations team.

Scraper Architecture

The platform_inbox_scraper.py script polls the external booking platforms for confirmation messages on a configurable interval. Rather than embedding secrets in raw HTTP requests, it reads platform-specific credentials from the environment:


# Read credentials from the secure store (Lambda environment variables)
BOATSETTER_API_KEY = os.getenv('BOATSETTER_API_KEY')
GETMYBOAT_API_KEY = os.getenv('GETMYBOAT_API_KEY')

# Query both platforms for bookings confirmed in the last 24 hours
bookings = {
    'boatsetter': fetch_boatsetter_bookings(BOATSETTER_API_KEY, lookback_hours=24),
    'getmyboat': fetch_getmyboat_bookings(GETMYBOAT_API_KEY, lookback_hours=24)
}

The scraper is deployed via /Users/cb/Documents/repos/tools/deploy_inbox_scraper.sh, which:

  • Packages the script and dependencies into a Lambda layer
  • Creates a CloudWatch Events rule to invoke it every 30 minutes
  • Stores bookings in DynamoDB for deduplication (prevents duplicate dispatch tasks)

Deduplication is critical—without it, a network retry could trigger multiple cleaning requests for the same booking.
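In production the deduplication is a DynamoDB conditional write (something like `ConditionExpression="attribute_not_exists(booking_id)"`); the in-memory stand-in below shows the same idempotency contract without any AWS dependency, with all names being illustrative:

```python
class SeenBookings:
    """In-memory stand-in for the DynamoDB dedup table."""

    def __init__(self):
        self._seen = set()

    def mark_new(self, booking_id: str) -> bool:
        """Return True only the first time a booking_id is recorded."""
        if booking_id in self._seen:
            return False
        self._seen.add(booking_id)
        return True

def dispatch_once(store, booking, dispatch_fn):
    """Invoke dispatch_fn at most once per booking, even across retries."""
    if store.mark_new(booking["booking_id"]):
        dispatch_fn(booking)
        return True
    return False
```

A network retry that replays the same booking hits the "already seen" branch and produces no second cleaning task, which is exactly the property the conditional write guarantees at the database level.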

Dispatcher Logic

The dispatch_boat_cleaner.py script receives booking data and:

  1. Validates the booking has all required fields (boat ID, customer email, date)
  2. Calculates the cleaning window (24 hours before next scheduled booking, or next day if no future booking)
  3. Formats a task payload compatible with the operations dashboard
  4. Sends notifications via SES to the operations team with the task details

The dispatcher doesn't directly invoke external cleaning services. Instead, it creates an operational task that a team member can approve. This human-in-the-loop design prevents accidental double-booking of cleaning crews.
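The four dispatcher steps can be sketched as follows. The field names and task-payload shape are assumptions; the real dispatch_boat_cleaner.py may use different identifiers:

```python
from datetime import datetime, timedelta
from typing import Optional

REQUIRED_FIELDS = ("boat_id", "customer_email", "date")  # assumed field names

def validate_booking(booking: dict) -> list:
    """Step 1: return the list of missing required fields (empty means valid)."""
    return [f for f in REQUIRED_FIELDS if not booking.get(f)]

def cleaning_window_start(booking_end: datetime,
                          next_booking_start: Optional[datetime]) -> datetime:
    """Step 2: 24 hours before the next booking, or the next day if none."""
    if next_booking_start is not None:
        return next_booking_start - timedelta(hours=24)
    return booking_end + timedelta(days=1)

def build_task(booking: dict, window_start: datetime) -> dict:
    """Step 3: shape a payload for the operations dashboard (fields illustrative)."""
    return {
        "task_type": "boat_cleaning",
        "boat_id": booking["boat_id"],
        "window_start": window_start.isoformat(),
        # Human-in-the-loop: the task waits for a team member's approval.
        "requires_approval": True,
    }
```

Step 4 (the SES notification) is omitted here since it is a straightforward send of the task payload to the operations list.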

Email Campaign Infrastructure

To support stakeholder communication, we built a campaign scheduler system:

  • /Users/cb/Documents/repos/tools/campaign_scheduler.py — Orchestrates scheduled email sends
  • /Users/cb/Documents/repos/tools/campaign_schedule.json — Configuration for blast timing and templates
  • /Users/cb/Documents/repos/tools/templates/rady_shell_blast1.html — Email templates for the Queen of San Diego events

The campaign scheduler reads a JSON configuration that specifies:


{
  "campaigns": [
    {
      "name": "sea_scout_wednesday",
      "schedule_cron": "0 10 * * 3",
      "template": "templates/rady_shell_blast1.html",
      "recipients_source": "dynamodb_table:sea_scout_subscribers",
      "send_method": "aws_ses"
    }
  ]
}

This configuration-driven approach allows non-engineers to adjust campaign timing without touching code. The campaign scheduler is deployed via Lambda scheduled rules (CloudWatch Events), executing the orchestration on the specified cron schedule.
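A minimal loader for that configuration might validate the fields shown in the JSON example above before scheduling anything. The required-key set follows the example; any further validation (like the five-field cron check) is an assumption of this sketch:

```python
import json

def load_campaigns(config_text: str) -> list:
    """Parse campaign_schedule.json-style config and check required keys."""
    required = {"name", "schedule_cron", "template",
                "recipients_source", "send_method"}
    config = json.loads(config_text)
    campaigns = config.get("campaigns", [])
    for campaign in campaigns:
        missing = required - campaign.keys()
        if missing:
            raise ValueError(f"{campaign.get('name', '?')}: missing {sorted(missing)}")
        # A standard cron expression has five fields: minute hour dom month dow.
        if len(campaign["schedule_cron"].split()) != 5:
            raise ValueError(f"{campaign['name']}: bad cron expression")
    return campaigns
```

Failing fast on a malformed config is what makes it safe for non-engineers to edit the JSON: a typo surfaces as a clear error at load time rather than as a silently skipped blast.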

Infrastructure and Deployment Strategy

All Lambda functions are deployed using shell scripts that:

  • Package Python code with pip dependencies
  • Create or update Lambda functions via AWS CLI
  • Attach execution IAM roles with minimal required permissions
  • Inject environment variables from repos.env

Example deployment pattern:


#!/bin/bash
set -e

FUNCTION_NAME="boat-cleaner-dispatcher"
HANDLER_FILE="dispatch_boat_cleaner.py"

# Package dependencies
pip install -r requirements.txt -t ./build/
cp $HANDLER_