Deploying a Receipt Upload Portal for quickdumpnow.com While Automating Monthly Port Sheet Generation

This session involved two parallel infrastructure projects: standing up a new receipts management page for a trailer rental business, and completing automation for a recurring monthly charter port sheet distribution system. Both required careful orchestration of static site deployment, Google Sheets integration, OAuth token management, and email delivery pipelines.

Project 1: Receipt Portal Deployment to quickdumpnow.com

The quickdumpnow.com site needed a dedicated landing page for users to submit trailer rental receipts. The infrastructure challenge was that CloudFront was configured with a custom error response that redirects all 404s to the homepage, masking missing S3 objects and making debugging deployment issues difficult.

File Structure and Deployment

The books page was created at:

/Users/cb/Documents/repos/sites/quickdumpnow.com/books/index.html

This follows the standard pattern for pretty URLs in S3 static hosting. The HTML file must be uploaded under two keys so that both URL patterns resolve:

  • books/index.html (serves /books/ via the bucket's index-document rule)
  • books (bare key, serves /books directly, since S3 does no extension-less rewriting)

The deployment command uploaded both variants to the S3 bucket backing quickdumpnow.com. The robots.txt file was also updated to block the books endpoint from search indexing during development:

Disallow: /books

Both files were uploaded via AWS CLI to the S3 bucket (specifics redacted for security), then CloudFront cache was invalidated using the distribution ID to clear the CDN layer.
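The dual-key upload can be sketched with boto3; the bucket name, helper names, and content-type handling below are placeholders for illustration, not the actual redacted deployment commands:

```python
def pretty_url_keys(slug: str) -> list[str]:
    """Both S3 keys needed so /books/ and /books resolve to the same page."""
    return [f"{slug}/index.html", slug]

def deploy_page(local_path: str, slug: str, bucket: str) -> None:
    import boto3  # imported here so the pure helper above stays dependency-free
    s3 = boto3.client("s3")
    for key in pretty_url_keys(slug):
        s3.upload_file(
            local_path, bucket, key,
            # the bare key has no .html extension, so set the type explicitly
            ExtraArgs={"ContentType": "text/html"},
        )

# deploy_page("books/index.html", "books", "<bucket-name-redacted>")
```

Setting Content-Type explicitly matters for the bare key: without an extension, S3 would otherwise default to binary/octet-stream and browsers would download the page instead of rendering it.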

CloudFront Cache Invalidation Strategy

Two separate invalidations were required:

  • /books and /books/* patterns to clear any cached responses
  • /robots.txt to ensure the updated block list propagates

CloudFront invalidations typically complete within 30–60 seconds. The distribution config was verified to confirm no custom error responses were overriding the 200 status for the books page—the 404 redirect was scoped only to root-level requests.
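The two invalidation requests can be sketched as follows; the distribution ID and function names are hypothetical, and the CallerReference scheme is one common choice, not necessarily what was used:

```python
import time

def invalidation_batch(paths: list[str]) -> dict:
    """CloudFront requires a unique CallerReference per invalidation request."""
    return {
        "Paths": {"Quantity": len(paths), "Items": paths},
        "CallerReference": f"deploy-{int(time.time())}",
    }

def invalidate(distribution_id: str, paths: list[str]) -> None:
    import boto3  # imported here so the helper above needs no AWS dependency
    cf = boto3.client("cloudfront")
    cf.create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch=invalidation_batch(paths),
    )

# invalidate("<distribution-id-redacted>", ["/books", "/books/*"])
# invalidate("<distribution-id-redacted>", ["/robots.txt"])
```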

Project 2: Automating Port Sheet Generation and Distribution

The port sheet is a monthly charter ledger that tracks revenue, vessel movements, and crew assignments. Previously, this required manual spreadsheet updates and email assembly. The goal was to fully automate the monthly generation, formatting, and distribution to stakeholders (Maria, Edwin, Caro).

Google Sheets and Apps Script Architecture

The system is built on Google Sheets with Apps Script as the orchestration layer. The core file is:

/Users/cb/Documents/repos/sites/queenofsandiego.com/PortSheetReporter.gs

This Apps Script file reads from the JADA Port Log 2026 Google Sheet (a shared workbook containing monthly tabs). Each month gets its own tab (e.g., "April 2026") with a standard format:

  • Column A: Date
  • Column B: Captain/Charter Name
  • Column C: Vessel
  • Column D: Revenue
  • Columns E–H: Crew assignments

The Python layer reads this structure and generates formatted output. To test the April 2026 entry, a new charter was added for Joseph Zurek at $1,845.72.
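Parsing a row of the monthly tab into a record, given the column layout above, looks roughly like this; the field names are illustrative, not the script's actual identifiers:

```python
def parse_charter_row(row: list) -> dict:
    """Map columns A–H to a charter record (layout described above)."""
    date, captain, vessel, revenue, *crew = row
    return {
        "date": date,
        "captain": captain,
        "vessel": vessel,
        "revenue": float(str(revenue).replace("$", "").replace(",", "")),
        "crew": [c for c in crew if c],  # columns E–H, blanks dropped
    }

row = ["2026-04-12", "Joseph Zurek", "JADA", "$1,845.72", "Ana", "", "Luis", ""]
entry = parse_charter_row(row)  # revenue parses to 1845.72
```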

OAuth Token Management and Calendar Integration

A critical challenge emerged: the port sheet script needs to read from Google Drive (the Port Log sheet) and also verify dates against the JADA sailing calendar in Google Calendar. These require different OAuth scopes, and tokens can expire.

Two credential files are in use:

  • Drive credentials: Scope includes sheets and drive.readonly
  • Calendar credentials: Scope includes calendar.readonly

A new Python script was created to handle calendar token refresh:

/Users/cb/Documents/repos/tools/reauth_jada_calendar.py

This script:

  1. Checks for an existing refresh token in the local token file
  2. If the token is stale, uses the refresh token to obtain a new access token
  3. Tests the new token by querying the JADA sailing calendar
  4. Stores the refresh token for future use

The reauth script runs locally and updates an environment variable (GCAL_REFRESH_TOKEN) that is then pushed to AWS Lambda, where the actual port sheet generation runs.

Local Development and Testing

The main port sheet logic lives in Python:

/Users/cb/Documents/repos/tools/jada_port_sheet.py

This script was modified multiple times to:

  • Read the Port Log sheet structure and extract month/year from the active tab
  • Query the calendar token to validate charter dates
  • Format the port sheet as an HTML email body
  • Send via SMTP to the recipient list

During local testing, a port 8765 conflict arose (likely a stale Python process from a previous test run). This was resolved by identifying and killing the background process before restarting the script.
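A quick stdlib check of this kind can confirm whether the port is actually held before hunting for the stale process:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """True if something is accepting connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0  # 0 means the connect succeeded

# if port_in_use(8765): find the holder with `lsof -ti :8765` and kill it
```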

Excel Format Preservation

The actual port sheet is stored as an Excel workbook (.xlsx format; openpyxl cannot read the legacy binary .xls format) with multiple sheets: a "Template" tab and historical monthly tabs. Python's openpyxl library was used to read and write the Excel structure while preserving formatting and row heights.

The script:

  1. Downloads the existing Excel file from Google Drive
  2. Reads the "Template" sheet to understand the expected layout
  3. Creates a new month tab (e.g., "April 2026")
  4. Copies template formatting and writes new entries
  5. Uploads the updated file back to Drive

This preserves version history in Google Drive and ensures all stakeholders reference the same source of truth.
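Steps 2–4 can be sketched with openpyxl as below. The sheet names match the layout described above, but the function names are hypothetical and the copy is simplified (copy_worksheet carries over values and most formatting; the Drive download/upload steps are omitted):

```python
import calendar

def month_tab_name(year: int, month: int) -> str:
    """Tab naming convention used by the workbook, e.g. 'April 2026'."""
    return f"{calendar.month_name[month]} {year}"

def add_month_tab(path: str, year: int, month: int, rows: list[list]) -> None:
    from openpyxl import load_workbook  # third-party; helper above is stdlib-only
    wb = load_workbook(path)
    template = wb["Template"]
    new = wb.copy_worksheet(template)   # duplicates values and cell styles
    new.title = month_tab_name(year, month)
    for row in rows:
        new.append(row)
    wb.save(path)
```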

Email Delivery and Final Deployment

The port sheet is emailed to three recipients. The email construction happens in the Python script, which:

  • Formats the port sheet data as an HTML table with inline CSS
  • Includes the month and year in the subject line
  • Attaches the updated Excel file
  • Sends via SMTP (SMTP server details and credentials redacted)
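The assembly above can be sketched with the standard library's email package; the subject format, table columns, and function name are assumptions based on the description, and the SMTP send itself is omitted:

```python
from email.message import EmailMessage

def build_port_sheet_email(month: str, year: int, rows: list[dict],
                           xlsx_bytes: bytes) -> EmailMessage:
    # HTML table with inline CSS, one row per charter entry
    cells = "".join(
        f"<tr><td style='padding:4px'>{r['date']}</td>"
        f"<td style='padding:4px'>{r['captain']}</td>"
        f"<td style='padding:4px'>${r['revenue']:,.2f}</td></tr>"
        for r in rows
    )
    msg = EmailMessage()
    msg["Subject"] = f"Port Sheet: {month} {year}"
    msg.set_content("This report requires an HTML-capable mail client.")
    msg.add_alternative(f"<table>{cells}</table>", subtype="html")
    msg.add_attachment(
        xlsx_bytes, maintype="application",
        subtype="vnd.openxmlformats-officedocument.spreadsheetml.sheet",
        filename=f"port_sheet_{month.lower()}_{year}.xlsx",
    )
    return msg
```

The resulting message is multipart/mixed: a plain-text/HTML alternative pair plus the Excel attachment, which is what most mail clients expect for a report of this shape.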

The April 2026 port sheet with the Zurek charter ($1,845.72) was successfully generated and sent to Maria, Edwin, and Caro.

Key Decisions and Trade-offs

  • Hybrid Python + Apps Script: Apps Script handles Google Workspace integration, Python handles heavy lifting (Excel manipulation, scheduling). This avoids vendor lock-in while keeping Google APIs close to their native client libraries.
  • Environment Variables for Secrets: Tokens are stored as Lambda environment variables rather than hardcoded, allowing rotation without code changes.
  • Refresh Token Caching: The refresh token is stored locally and only exchanged for a new access token when the current one has expired, keeping OAuth round-trips to a minimum.