
Replacing Google Apps Script with Lambda: Automating Calendar Sync and Service Dispatch

This session involved migrating critical automation from Google Apps Script (GAS) to AWS Lambda while simultaneously tackling operational challenges around service scheduling and communications. The work revealed interesting patterns in how to bridge legacy spreadsheet-driven workflows with modern serverless infrastructure.

The Problem Statement

We had two interconnected issues:

  • Calendar sync complexity: Google Apps Script maintaining CalendarSync.gs was fragile, polling-based, and difficult to debug
  • Service dispatch gaps: Manual boat cleaning coordination was breaking down; no systematic way to dispatch contractors or track status

The session involved rewriting the calendar sync logic as a Lambda function while simultaneously building out a boat cleaning dispatch system in Python.

Architecture: From GAS to Lambda

Previous Pattern (GAS)

The original CalendarSync.gs file lived in the Rady Shell Events Apps Script replacement project at:

/Users/cb/Documents/repos/sites/queenofsandiego.com/rady-shell-events/apps-script-replacement/CalendarSync.gs

This script used:

  • Time-based triggers (polling intervals) to check Google Sheets
  • Direct sendEmail() calls scattered throughout the code
  • Tight coupling to Google Sheets API and Gmail API
  • Hard-coded polling intervals that were inefficient

New Pattern (Lambda + API Gateway)

We identified a Lambda function already running calendar operations and exposed it via API Gateway. The function accepts JSON payloads with an action parameter. We discovered the supported action names by downloading and parsing the Lambda code:

// Example action names found:
- add-calendar-event
- list-calendar-events
- update-calendar-event
- delete-calendar-event
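Dispatching on an action field like this typically reduces to a small routing table inside the handler. The following is a minimal sketch of that pattern; the handler bodies, response shapes, and event parsing are illustrative assumptions, not the deployed code:

```python
import json

def add_calendar_event(payload):
    # Would call the Google Calendar API here; stubbed for illustration.
    return {"status": "created", "title": payload.get("title")}

def list_calendar_events(payload):
    # Would query Google Calendar here; stubbed for illustration.
    return {"status": "ok", "events": []}

# Map action names (as discovered in the Lambda code) to handlers.
ACTIONS = {
    "add-calendar-event": add_calendar_event,
    "list-calendar-events": list_calendar_events,
}

def lambda_handler(event, context=None):
    # API Gateway proxy events carry a JSON string in "body";
    # direct invocations may pass the payload dict itself.
    payload = json.loads(event["body"]) if isinstance(event.get("body"), str) else event
    action = payload.get("action")
    if action not in ACTIONS:
        return {"statusCode": 400,
                "body": json.dumps({"error": f"unknown action: {action}"})}
    return {"statusCode": 200, "body": json.dumps(ACTIONS[action](payload))}
```

Unknown actions fail fast with a 400 rather than silently no-oping, which made probing the API safe while we enumerated the action names.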

The Lambda function was already authenticated to Google Calendar API via stored credentials in its environment. Rather than rewriting from scratch, we:

  1. Called the existing Lambda endpoint directly using curl with auth token from repos.env
  2. Verified the endpoint responded correctly
  3. Batched calendar operations (added 7 Sea Scout Wednesday holds in a single session)
  4. Removed polling logic from GAS entirely

Calendar Event Bulk Operations

Added recurring calendar holds for Sea Scout meetings using the Lambda API:

curl -X POST https://api-gateway-endpoint/calendar \
  -H "Authorization: Bearer $DASHBOARD_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "action": "add-calendar-event",
    "title": "Sea Scout Wednesday - [Location]",
    "startTime": "2025-05-07T19:00:00Z",
    "endTime": "2025-05-07T21:00:00Z",
    "recurrence": "WEEKLY"
  }'
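Rather than repeating that curl call seven times, the weekly holds can be generated programmatically. A sketch of how the batch might be built in Python; the payload fields mirror the curl example above, and the two-hour duration and weekly cadence are taken from it, but the exact API contract is an assumption:

```python
from datetime import datetime, timedelta

def wednesday_hold_payloads(first_wednesday: datetime, weeks: int = 7):
    """Build one add-calendar-event payload per weekly Sea Scout hold."""
    payloads = []
    for i in range(weeks):
        start = first_wednesday + timedelta(weeks=i)
        end = start + timedelta(hours=2)   # 19:00-21:00 holds
        payloads.append({
            "action": "add-calendar-event",
            "title": "Sea Scout Wednesday - [Location]",
            "startTime": start.strftime("%Y-%m-%dT%H:%M:%SZ"),
            "endTime": end.strftime("%Y-%m-%dT%H:%M:%SZ"),
        })
    return payloads

# Each payload would then be POSTed to the API Gateway endpoint, e.g.:
# requests.post(ENDPOINT, json=payload,
#               headers={"Authorization": f"Bearer {token}"})
```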

Why Lambda over GAS:

  • Decoupling: Calendar operations no longer tied to spreadsheet changes
  • Scalability: Can handle bursts of event creation without time limits
  • Observability: CloudWatch logs for all calendar operations
  • No polling: Event-driven rather than time-triggered

Boat Cleaning Dispatch System

Created two new Python tools in /Users/cb/Documents/repos/tools/:

1. dispatch_boat_cleaner.py

Main dispatch logic. This script:

  • Reads boat cleaning requirements from a task queue
  • Looks up contractor availability (from stored contractor database)
  • Sends dispatch notifications via email or SMS
  • Logs dispatch events to a DynamoDB table for audit trail

Key decision: Stored contractor details separately rather than in the main task table, allowing for quick lookups and rate limit management.
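The core of the dispatch flow can be sketched as: pull a task, match a contractor by availability, and emit a dispatch record. The field names and first-available matching rule below are illustrative assumptions, not the script's exact implementation:

```python
def pick_contractor(task, contractors):
    """Return the first contractor available on the task date, or None."""
    for c in contractors:
        if task["date"] in c["available_dates"]:
            return c
    return None

def dispatch(task, contractors):
    contractor = pick_contractor(task, contractors)
    if contractor is None:
        # Surfaced for manual follow-up rather than silently dropped.
        return {"task_id": task["id"], "status": "unassigned"}
    # In the real script, this is where the SES/SMS notification is sent
    # and the audit record is written to DynamoDB.
    return {"task_id": task["id"], "status": "dispatched",
            "contractor": contractor["name"]}
```

Keeping the matching step a pure function made it easy to run locally against the test contractor database before wiring up notifications.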

2. platform_inbox_scraper.py

Bridges GetMyBoat and Boatsetter platforms. Scrapes incoming booking requests and normalizes them into a common schema before dispatch:

// Normalized booking schema:
{
  "platform": "getmyboat|boatsetter",
  "boat_id": "...",
  "requested_date": "ISO8601",
  "duration_hours": int,
  "renter_contact": "email|phone",
  "special_requirements": [...]
}
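Each platform gets its own small adapter that maps raw fields into the common schema above. A sketch for the GetMyBoat side; the raw field names (`listing_id`, `trip_start`, etc.) are assumptions about that platform's payload:

```python
def normalize_getmyboat(raw: dict) -> dict:
    """Map a raw GetMyBoat booking into the common dispatch schema."""
    return {
        "platform": "getmyboat",
        "boat_id": raw["listing_id"],
        "requested_date": raw["trip_start"],          # already ISO 8601
        "duration_hours": int(raw["trip_hours"]),
        "renter_contact": raw.get("renter_email") or raw.get("renter_phone"),
        "special_requirements": raw.get("notes", []),
    }
```

A parallel `normalize_boatsetter()` adapter would do the same for Boatsetter, so the dispatch side never sees platform-specific shapes.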

Why separate the scraper:

  • Platform APIs change frequently; isolation makes updates easier
  • Can schedule scraping independently of dispatch (cron vs event-driven)
  • Easier to test and debug platform-specific issues

3. deploy_inbox_scraper.sh

Deployment wrapper script that:

  • Validates Python syntax
  • Runs unit tests against mock platform responses
  • Uploads to Lambda (or EC2 cron, depending on frequency)
  • Updates CloudWatch event rules if schedule changed

Integration Points with Existing Infrastructure

Email delivery: Using AWS SES (Simple Email Service) instead of GAS's built-in `sendEmail()`. We removed all direct `sendEmail()` calls from CalendarSync.gs and replaced them with SES calls:

# In Lambda or dispatch scripts:
import boto3

boto3.client('ses').send_email(
  Source='operations@queenofsandiego.com',
  Destination={'ToAddresses': [contractor_email]},
  Message={
    'Subject': {'Data': 'Dispatch: Boat Cleaning Required'},
    'Body': {'Html': {'Data': html_body}}
  }
)

Calendar storage: Events still land in Google Calendar, but no longer polled by GAS. Lambda maintains the source of truth.

Contractor database: Lives in a private S3 bucket (`queen-ops-data/contractors/`) with versioning enabled; a copy is also checked into git as encrypted JSON.
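Reading the contractor database then becomes a simple versioned-object fetch. A minimal sketch; the object key and JSON layout are assumptions, and passing the client in keeps the loader testable without AWS access:

```python
import json

BUCKET = "queen-ops-data"
KEY = "contractors/contractors.json"   # assumed object key under the prefix above

def load_contractors(s3_client, bucket=BUCKET, key=KEY):
    """Fetch and parse the contractor list. With bucket versioning on,
    a bad write can be rolled back by re-reading an older VersionId."""
    body = s3_client.get_object(Bucket=bucket, Key=key)["Body"].read()
    return json.loads(body)

# Usage with a real client:
# import boto3
# contractors = load_contractors(boto3.client("s3"))
```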

Key Decisions and Trade-offs

  • Why not replace GAS entirely? Sheets integration still works for certain workflows. We kept GAS but removed polling logic and email functions.
  • Why Python for dispatch? Easier to maintain complex business logic (contractor matching, availability checks) in Python than in Apps Script's JavaScript environment.
  • Why Lambda over EC2 cron? Inbox scraper runs only when needed (event-driven from platform webhooks when available), scaling to zero between requests.
  • DynamoDB for audit logs: Immutable record of every dispatch decision, useful for contractor disputes and performance tracking.
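The append-only audit trail reduces to one DynamoDB item per dispatch decision, with a condition expression rejecting overwrites so records stay immutable. A sketch; the table schema and attribute names are assumptions:

```python
from datetime import datetime, timezone

def log_dispatch(table, task_id: str, contractor: str, decision: str):
    """Write one immutable audit record for a dispatch decision."""
    item = {
        "task_id": task_id,
        "logged_at": datetime.now(timezone.utc).isoformat(),
        "contractor": contractor,
        "decision": decision,
    }
    # attribute_not_exists makes the write fail rather than silently
    # overwrite an existing record for the same key.
    table.put_item(Item=item,
                   ConditionExpression="attribute_not_exists(task_id)")
    return item

# Usage with a real table:
# import boto3
# table = boto3.resource("dynamodb").Table("dispatch-audit")
# log_dispatch(table, "task-123", "contractor-a", "dispatched")
```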

Deployment and Testing

Verified the dispatch system works end-to-end:

  • Created test boat cleaning task
  • Ran dispatch_boat_cleaner.py locally with test contractor database
  • Confirmed email delivery