Automating Boat Cleaning Dispatch with Python and Lambda: Replacing Manual Service Coordination
What Was Done
This session replaced a manual, email-based boat cleaning coordination process with an automated dispatch system. The previous workflow relied on FancyHands (a task outsourcing service) to coordinate cleaning appointments—which proved unreliable and costly. The new system uses Python scripts to manage dispatch directly, integrated with existing infrastructure (Lambda, SES, calendar APIs), eliminating the intermediary and reducing operational friction.
The Problem: Manual Coordination at Scale
Before this change, the workflow for scheduling boat cleaning went like this:
- Create task in FancyHands platform
- Wait for human review and contractor assignment
- Email back-and-forth to confirm dates/times
- Manual calendar entry after confirmation
- Track completion via email threads
This introduced latency, per-task cost, and a dependency on an external service's reliability. When FancyHands cancelled a task, the entire workflow stalled until someone noticed and intervened.
Technical Architecture
Core Components
The new system consists of three main pieces:
- Dispatch Script (/Users/cb/Documents/repos/tools/dispatch_boat_cleaner.py): Parses cleaning requests, validates boat/date/time data, and triggers downstream actions
- Inbox Scraper (/Users/cb/Documents/repos/tools/platform_inbox_scraper.py): Monitors email for cleaning requests from booking platforms (GetMyBoat, Boatsetter, direct inquiries), normalizes the data, and queues dispatch jobs
- Deployment Script (/Users/cb/Documents/repos/tools/deploy_inbox_scraper.sh): Packages and deploys the scraper to Lambda for scheduled execution
Integration Points
The dispatch system integrates with existing infrastructure:
- Calendar Sync (Google Calendar): CalendarSync.gs (Google Apps Script) receives dispatch events via the Lambda calendar API and updates the shared calendar automatically
- Email Notification: AWS SES sends confirmation emails to boat owners and cleaning coordinators with appointment details
- Gmail API: Monitors multiple email inboxes (booking platforms, direct inquiries) for new cleaning requests
- Credentials Management: Platform credentials (GetMyBoat, Boatsetter API keys) stored in repos.env and loaded by the scraper at runtime
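The credentials-loading step can be sketched as a minimal env-file parser that populates the process environment before the scraper runs. This is illustrative: `load_env` and its parsing rules are assumptions, not the project's actual loader, and the real env file path is whatever repos.env resolves to.

```python
import os

def load_env(path):
    """Parse simple KEY=VALUE lines into os.environ.

    Blank lines and '#' comments are skipped; existing environment
    variables are not overwritten (setdefault).
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

At runtime the scraper would call `load_env(...)` once at startup, then read `os.environ["GETMYBOAT_API_KEY"]` and friends as described below.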
Why This Architecture
Async Event Processing
The inbox scraper runs on a scheduled Lambda trigger (e.g., every 15 minutes) rather than real-time webhooks. This was chosen for several reasons:
- Eliminates need for webhook receivers on public endpoints
- Natural batching: multiple booking requests processed in one invocation
- Easier debugging: can replay scraper with same email state
- Lower Lambda costs: single invocation processes 5-10 emails vs. 5-10 webhook triggers
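The batching behavior above can be sketched as a scheduled Lambda handler that drains every platform inbox in a single invocation. The `scrape_platform_inbox` stub here is a placeholder for the real scraper (its shape follows the signature documented later in this section); the handler name and return value are illustrative assumptions.

```python
def scrape_platform_inbox(platform_name):
    # Stub standing in for the real scraper: returns zero or more
    # normalized cleaning requests for one platform.
    return [{"source": platform_name, "request_id": f"{platform_name}-1"}]

def handler(event, context):
    """Scheduled Lambda entry point.

    The EventBridge payload carries no data of interest; the run is
    purely time-driven, and one invocation processes every pending
    request across all platforms.
    """
    batch = []
    for platform in ("getmyboat", "boatsetter", "gmail"):
        batch.extend(scrape_platform_inbox(platform))
    # Downstream, each item in `batch` would be handed to the dispatcher.
    return {"processed": len(batch)}
```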
Separation of Concerns
Three distinct scripts rather than one monolith:
- Scraper focuses on ingestion: extracting structured data from unstructured email
- Dispatcher focuses on orchestration: validating, sequencing, and triggering actions
- Deployment is its own concern: can be versioned independently from runtime logic
This allows the dispatcher to be called from multiple sources (direct API, CLI for testing, future webhook) without coupling it to email format.
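One way this multi-source design shows up in practice is a thin CLI wrapper around the dispatcher for local testing. The flag name and entry point below are assumptions for illustration, not the repo's actual interface:

```python
import argparse
import json

def build_parser():
    """CLI front end for the dispatcher (hypothetical flag names)."""
    parser = argparse.ArgumentParser(
        description="Dispatch a cleaning request from the command line"
    )
    parser.add_argument(
        "--request-json",
        required=True,
        help="Cleaning request as a JSON string, same schema as the scraper output",
    )
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    request = json.loads(args.request_json)
    # dispatch_cleaning(request) would be invoked here, exactly as when
    # the scraper or a future webhook calls it.
    print(request)
```

Because the dispatcher only ever sees the normalized dict, the CLI, the scraper, and any future webhook all remain interchangeable callers.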
Implementation Details
Inbox Scraper Function Signature
# Pseudo-code structure
def scrape_platform_inbox(platform_name):
    """
    Args:
        platform_name: 'getmyboat' | 'boatsetter' | 'gmail'
    Returns:
        List[{
            'source': platform_name,
            'request_id': str,
            'boat_id': str,
            'date': ISO8601,
            'time_start': HH:MM,
            'duration_hours': int,
            'requester_name': str,
            'requester_email': str,
            'notes': str
        }]
    """
The scraper reads credentials from environment and calls each platform's API or Gmail API to fetch new messages. Email parsing uses regex patterns specific to each platform's booking confirmation format.
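The per-platform parsing step can be sketched with a named-group regex. The message format below is invented for illustration; the real GetMyBoat confirmation emails will have their own wording, and the pattern would be tuned accordingly:

```python
import re

# Hypothetical GetMyBoat confirmation format; each platform gets its own pattern.
GETMYBOAT_PATTERN = re.compile(
    r"Booking confirmed for (?P<boat>.+?) on (?P<date>\d{4}-\d{2}-\d{2}) "
    r"at (?P<time>\d{2}:\d{2}) \((?P<hours>\d+)h\)"
)

def parse_getmyboat_email(body):
    """Extract the normalized fields from one email body, or None if no match."""
    match = GETMYBOAT_PATTERN.search(body)
    if match is None:
        return None
    return {
        "boat_id": match.group("boat"),
        "date": match.group("date"),
        "time_start": match.group("time"),
        "duration_hours": int(match.group("hours")),
    }
```

Returning `None` on non-matching emails lets the scraper skip marketing and reminder messages in the same inbox without special-casing them.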
Dispatch Script Orchestration
# Pseudo-code flow
def dispatch_cleaning(cleaning_request):
    # 1. Validate: boat exists, date not in the past, duration reasonable
    if not validate_request(cleaning_request):
        return {'status': 'rejected', 'reason': 'invalid_request'}

    # 2. Check calendar availability (call the Lambda calendar API)
    conflicts = check_calendar(
        boat_id=cleaning_request['boat_id'],
        date=cleaning_request['date']
    )
    if conflicts:
        return {'status': 'conflict', 'conflicts': conflicts}

    # 3. Add to calendar
    event_id = add_calendar_event(cleaning_request)

    # 4. Send confirmation emails (SES)
    send_confirmation_email(cleaning_request['requester_email'], event_id)
    send_coordinator_email(coordinator_email, event_id)  # coordinator address from config

    # 5. Log to dashboard
    return {'status': 'dispatched', 'event_id': event_id}
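The validation step can be made concrete with a small sketch. The specific thresholds (1-8 hour duration) are illustrative assumptions, not documented business rules:

```python
from datetime import date

def validate_request(req):
    """Basic sanity checks mirroring step 1 of the dispatch flow.

    Rejects requests with a missing/unparseable date, a date in the
    past, a missing boat_id, or an implausible duration.
    """
    try:
        requested = date.fromisoformat(req["date"])
    except (KeyError, ValueError):
        return False
    if requested < date.today():
        return False
    if not req.get("boat_id"):
        return False
    # Assumed plausibility window: 1 to 8 hours per cleaning.
    if not 1 <= req.get("duration_hours", 0) <= 8:
        return False
    return True
```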
Deployment to Lambda
The deployment script packages the scraper with its dependencies and uploads to Lambda. The function is triggered by CloudWatch Events (or EventBridge) on a fixed schedule:
- Trigger: CloudWatch Event Rule executing every 15 minutes
- Function: platform_inbox_scraper (deployed to the shared tools Lambda account)
- Environment Variables: GMAIL_CREDENTIALS_PATH, GETMYBOAT_API_KEY, BOATSETTER_API_KEY
- Output: Invokes dispatch_boat_cleaner directly with structured JSON
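The packaging half of the deploy step can be sketched in Python (the actual script is shell). This only zips the handler file; the real deploy also vendors third-party dependencies and pushes the archive to Lambda (e.g. via `aws lambda update-function-code`), both omitted here:

```python
import pathlib
import zipfile

def package_lambda(source_file, out_zip):
    """Zip a single-file Lambda handler into a deployable archive.

    The file is stored at the archive root (arcname = bare filename),
    which is where the Lambda runtime expects the handler module.
    """
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(source_file, arcname=pathlib.Path(source_file).name)
    return out_zip
```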
Calendar Integration
Once a cleaning request is dispatched, the system adds an event to Google Calendar via the existing Lambda calendar API. CalendarSync.gs (at /Users/cb/Documents/repos/sites/queenofsandiego.com/rady-shell-events/apps-script-replacement/CalendarSync.gs) listens for new events and syncs them bidirectionally with the boat-specific calendar.
The calendar action is invoked as:
POST /calendar/api
{
    "action": "add-calendar-event",
    "details": {
        "title": "Boat Cleaning: [Boat Name]",
        "date": "2024-04-28",
        "time_start": "10:00",
        "duration_minutes": 180,
        "description": "Cleaning request from [Platform]",
        "calendar_id": "boat-specific-calendar-id"
    }
}
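On the dispatcher side, building that request body from a scraped cleaning request is a straightforward mapping. Note the unit conversion: the scraper normalizes to `duration_hours` while the calendar API takes `duration_minutes`. The helper name and the `boat_name` field are assumptions for illustration:

```python
def build_calendar_payload(req, calendar_id):
    """Shape the add-calendar-event body from a normalized cleaning request.

    Field names in the output follow the calendar API example above;
    the input dict follows the scraper's output schema.
    """
    return {
        "action": "add-calendar-event",
        "details": {
            "title": f"Boat Cleaning: {req['boat_name']}",
            "date": req["date"],
            "time_start": req["time_start"],
            # Scraper emits hours; the calendar API expects minutes.
            "duration_minutes": req["duration_hours"] * 60,
            "description": f"Cleaning request from {req['source']}",
            "calendar_id": calendar_id,
        },
    }
```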