Automating Boat Cleaning Dispatch and Calendar Integration for Multi-Site Operations
This session focused on solving a critical operational problem: coordinating boat cleaning services across multiple booking platforms without relying on a manual third-party service. With FancyHands unable to provide reliable support, we built an automated dispatch system that centralizes cleaning requests from those platforms and syncs them to a shared calendar.
The Problem: Multi-Platform Scheduling Without Central Coordination
The Queen of San Diego operates across multiple boat rental platforms (GetMyBoat, Boatsetter) and maintains a custom events calendar in Google Calendar. Previously, cleaning requests arrived through disparate channels with no unified dispatch mechanism. The FancyHands service was supposed to bridge this gap, but after cancelling it we needed an alternative that could:
- Aggregate cleaning requests from multiple boat booking platforms
- Centralize dispatch decisions in a single system
- Sync scheduled cleanings back to Google Calendar for crew visibility
- Provide audit trails and status tracking
Solution Architecture: Script-Based Dispatch with Calendar Sync
We implemented a three-layer system: Python-based platform ingestion, a Python dispatch engine, and Google Apps Script calendar synchronization:
Layer 1: Platform Ingestion
File: /Users/cb/Documents/repos/tools/platform_inbox_scraper.py
This script polls boat rental platforms for new bookings and cleaning requirements. The design prioritizes:
- Credential isolation: Platform credentials stored in environment variables, never in source code
- Structured output: Produces JSON objects with standardized fields (booking_id, platform, date_required, notes)
- Error resilience: Implements retry logic with exponential backoff for API timeouts
- Rate limiting: Respects platform API quotas by batching requests
The scraper runs on a schedule (currently a manual trigger; it can be automated via cron or Lambda) and writes its output to a standardized location for the dispatch layer to consume.
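A minimal sketch of the ingestion pattern follows, assuming a hypothetical endpoint, environment variable, and response fields (the real logic lives in platform_inbox_scraper.py):

```python
import json
import os
import time

import requests  # assumed HTTP client; the real scraper's dependencies may differ

def fetch_with_backoff(url: str, token: str, max_retries: int = 4) -> dict:
    """GET a platform endpoint, retrying on timeouts with exponential backoff."""
    for attempt in range(max_retries):
        try:
            resp = requests.get(
                url, headers={"Authorization": f"Bearer {token}"}, timeout=10
            )
            resp.raise_for_status()
            return resp.json()
        except requests.exceptions.Timeout:
            if attempt == max_retries - 1:
                raise
            time.sleep(2 ** attempt)  # 1s, 2s, 4s, ...

def normalize(platform: str, raw: dict) -> dict:
    """Map a platform-specific booking into the standardized dispatch schema."""
    return {
        "booking_id": raw["id"],              # assumed platform field names
        "platform": platform,
        "date_required": raw["end_date"],     # cleaning follows the rental's end
        "notes": raw.get("notes", ""),
    }

if __name__ == "__main__":
    # Credentials come from the environment, never from source code.
    token = os.environ["GETMYBOAT_TOKEN"]  # hypothetical variable name
    raw = fetch_with_backoff("https://api.example.com/bookings", token)  # placeholder URL
    records = [normalize("getmyboat", b) for b in raw.get("bookings", [])]
    print(json.dumps(records, indent=2))
```

The standardized output means the dispatch layer never needs to know which platform a request came from.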
Layer 2: Dispatch Management
File: /Users/cb/Documents/repos/tools/dispatch_boat_cleaner.py
This is the decision engine. It receives cleaning requests and determines which cleaner gets assigned based on:
- Cleaner availability (queried from a shared availability source)
- Geographic proximity to the boat location
- Specialization (some cleaners handle interior detailing, others handle engine maintenance)
- Historical performance metrics (stored in DynamoDB for quick lookup)
Output is a dispatch assignment that feeds directly into the calendar sync layer.
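A sketch of the assignment logic, with illustrative field names and weights (the production rules live in dispatch_boat_cleaner.py):

```python
from dataclasses import dataclass

@dataclass
class Cleaner:
    name: str
    available: bool
    distance_miles: float  # proximity to the boat's marina
    specialties: set[str]  # e.g. {"interior", "engine"}
    rating: float          # historical performance score, 0.0-5.0

def score(cleaner: Cleaner, required_specialty: str) -> float:
    """Rank a cleaner for a job; higher is better. Weights are illustrative."""
    if not cleaner.available or required_specialty not in cleaner.specialties:
        return float("-inf")  # hard constraints: must be free and qualified
    return cleaner.rating * 2.0 - cleaner.distance_miles * 0.1

def assign(cleaners: list[Cleaner], required_specialty: str) -> Cleaner | None:
    best = max(cleaners, key=lambda c: score(c, required_specialty), default=None)
    if best is None or score(best, required_specialty) == float("-inf"):
        return None  # nobody qualified; escalate to manual dispatch
    return best
```

Hard constraints (availability, specialization) are enforced before soft scoring, so an unqualified cleaner can never win on rating alone.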
Layer 3: Calendar Synchronization
File: /Users/cb/Documents/repos/sites/queenofsandiego.com/rady-shell-events/apps-script-replacement/CalendarSync.gs
This Google Apps Script handles bidirectional sync between dispatch assignments and Google Calendar:
- Reads dispatch queue from a shared Sheet (source of truth)
- Creates calendar events with cleaner contact info, boat location, and specific instructions
- Sets event color codes by status (yellow=pending assignment, green=confirmed, blue=completed)
- Monitors calendar for manual overrides (if a human manually changes an event, CalendarSync respects the change)
The script runs on a polling interval (configurable, currently set to 15-minute checks) and uses the Google Calendar API to maintain synchronization without requiring direct database access.
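Because CalendarSync.gs runs in Apps Script, the production code is JavaScript; for illustration only, the status-to-color mapping and the override rule look roughly like this, sketched in Python with assumed field names:

```python
# Sketch only: the production logic lives in CalendarSync.gs.
STATUS_COLORS = {
    "pending": "5",     # Google Calendar colorId 5 = yellow (Banana)
    "confirmed": "10",  # colorId 10 = green (Basil)
    "completed": "9",   # colorId 9 = blue (Blueberry)
}

def desired_event(assignment: dict) -> dict:
    """Build the calendar event body for a dispatch assignment (assumed schema)."""
    return {
        "summary": f"Cleaning: {assignment['boat']}",
        "location": assignment["marina"],
        "description": f"Cleaner: {assignment['cleaner']}\n{assignment['notes']}",
        "colorId": STATUS_COLORS[assignment["status"]],
    }

def should_update(existing_event: dict, last_sync_iso: str) -> bool:
    """Respect manual overrides: skip events a human edited after our last sync.

    Assumes RFC 3339 timestamps in UTC, so string comparison orders correctly.
    """
    return existing_event.get("updated", "") <= last_sync_iso
```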
Integration with Existing Infrastructure
Credentials Management: All platform credentials are stored in /repos.env and loaded via the python-dotenv library. The dispatch script never writes credentials to disk or to logs.
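For reference, the loading pattern is just this (the variable name is hypothetical):

```python
import os

from dotenv import load_dotenv

load_dotenv("/repos.env")  # populate os.environ from the shared env file
boatsetter_token = os.environ["BOATSETTER_TOKEN"]  # hypothetical variable name
```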
Calendar API Access: CalendarSync.gs authenticates via Apps Script's built-in CalendarApp service, which uses the executing Google account's permissions. For programmatic access from Lambda functions, we use a service-account credential stored in AWS Secrets Manager and referenced via an environment variable.
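A sketch of the Lambda-side access pattern, assuming google-api-python-client and boto3; the secret name and environment variables are illustrative:

```python
import json
import os

import boto3
from google.oauth2 import service_account
from googleapiclient.discovery import build

def get_calendar_client():
    """Build a Calendar API client from a service-account key held in Secrets Manager."""
    secrets = boto3.client("secretsmanager")
    secret_name = os.environ["GCAL_SECRET_NAME"]  # hypothetical env var
    key_info = json.loads(
        secrets.get_secret_value(SecretId=secret_name)["SecretString"]
    )
    creds = service_account.Credentials.from_service_account_info(
        key_info, scopes=["https://www.googleapis.com/auth/calendar"]
    )
    return build("calendar", "v3", credentials=creds)

def lambda_handler(event, context):
    service = get_calendar_client()
    upcoming = service.events().list(
        calendarId=os.environ["CALENDAR_ID"],  # hypothetical env var
        singleEvents=True,
        orderBy="startTime",
    ).execute()
    return {"count": len(upcoming.get("items", []))}
```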
Data Persistence: Dispatch history is stored in a JSON file at /Users/cb/Documents/repos/tools/campaign_schedule.json for audit purposes. For production scaling, this should migrate to DynamoDB for better query performance and automatic backups.
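The eventual DynamoDB write would be a small change, something like the following (table name and key schema are assumptions):

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("dispatch-history")  # hypothetical table name

def record_dispatch(assignment: dict) -> None:
    """Persist one dispatch decision for auditing (replaces the local JSON append)."""
    table.put_item(
        Item={
            "booking_id": assignment["booking_id"],    # assumed partition key
            "dispatched_at": assignment["timestamp"],  # assumed sort key
            "cleaner": assignment["cleaner"],
            "status": assignment["status"],
        }
    )
```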
Deployment Strategy
File: /Users/cb/Documents/repos/tools/deploy_inbox_scraper.sh
Deployment is handled via shell scripts that:
- Validate Python syntax and import dependencies
- Copy scripts to their runtime location (Lambda for scrapers, GAS for calendar sync)
- Inject environment-specific configuration (API endpoints, polling intervals)
- Run a smoke test against sandbox environments before production deployment
The current setup uses a hybrid approach: the Python scripts can run locally, on EC2, or as Lambda functions, while CalendarSync.gs runs in the Apps Script runtime on Google's servers, adding no infrastructure for us to manage.
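The syntax-and-import validation step can be as simple as this Python check, run by the deploy script before copying anything (a sketch; it assumes the scripts keep side effects behind an `if __name__ == "__main__"` guard):

```python
import importlib.util
import py_compile

# Compile and import each script once; syntax errors or missing dependencies fail fast.
for path in ("platform_inbox_scraper.py", "dispatch_boat_cleaner.py"):
    py_compile.compile(path, doraise=True)  # syntax check
    spec = importlib.util.spec_from_file_location(path.removesuffix(".py"), path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # raises ImportError if a dependency is absent
    print(f"OK: {path}")
```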
Why This Approach Over Alternatives
FancyHands Replacement: Third-party task management services add cost, introduce dependency risk, and create audit gaps. By building internal dispatch, we maintain control over data and decision logic.
Direct API vs. Scraping: Some boat platforms expose only limited APIs. Scraping covers platforms where no usable API exists, at the cost of being more brittle when page structures change. For platforms with good API support, the architecture allows switching to direct API calls by modifying only the platform adapter layer.
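That adapter seam might look like this (a sketch; class and method names are assumptions):

```python
from typing import Protocol

class PlatformAdapter(Protocol):
    """Anything that can yield standardized booking records for the dispatch layer."""
    def fetch_bookings(self) -> list[dict]: ...

class GetMyBoatScraper:
    """Scraping-based adapter for platforms without a usable API."""
    def fetch_bookings(self) -> list[dict]:
        return []  # scrape and normalize; see platform_inbox_scraper.py

class BoatsetterApi:
    """Direct-API adapter for platforms with stable, documented endpoints."""
    def fetch_bookings(self) -> list[dict]:
        return []  # call the platform API and normalize

def collect(adapters: list[PlatformAdapter]) -> list[dict]:
    return [booking for adapter in adapters for booking in adapter.fetch_bookings()]
```

Dispatch code depends only on the protocol, so swapping a scraper for a direct API call is a one-class change.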
Google Calendar as Source of Truth: Crew members already use Google Calendar. Syncing to Calendar (rather than asking them to check a separate dashboard) reduces context switching and ensures cleaners see assignments in their existing workflow.
Testing and Validation
Before deploying to production:
- Verified dispatch script logic against synthetic booking data
- Tested CalendarSync.gs in a sandbox Google Calendar to ensure event creation works
- Confirmed the Lambda API endpoint returns calendar events in the correct format
- Validated that manual calendar edits persist and don't get overwritten on next sync cycle
Next Steps and Scaling Considerations
- Cleaner Mobile Apps: Build a simple React Native app so cleaners receive push notifications when assigned jobs, with GPS tracking for navigation
- Photo Verification: Integrate image upload into the dispatch flow so cleaners can submit before/after photos that auto-sync to a shared S3 bucket
- Metrics Dashboard: Export dispatch history to a BI tool (Tableau or Metabase) to track cleaner utilization, turnaround times, and cost per cleaning
- Machine Learning: Once historical data accumulates, train a simple model to predict optimal cleaner assignment based on past performance
- Webhook Integration: Replace polling with webhooks from boat platforms so dispatch happens in real-time rather than on a schedule
The foundation is in place to scale this system horizontally as cleaning volume grows. The architecture separates concerns (platform ingestion, dispatch logic, calendar sync) so each layer can be optimized independently.