Deploying a Receipt Upload Portal and Automating Port Sheet Generation for Multi-Site Infrastructure

This session focused on two parallel tracks: standing up a new receipts collection portal for a trailer rental business and automating recurring port sheet generation with Google Calendar integration. Both required careful coordination across S3, CloudFront, Google Apps Script, and Lambda—and exposed some interesting architectural patterns worth documenting.

The Problem

https://quickdumpnow.com/books was returning the site homepage instead of a dedicated receipts page. Meanwhile, the JADA port sheet—a monthly charter log sent to stakeholders—was being generated manually, with no systematic way to track updates or maintain historical records.

Part 1: Deploying the Books/Receipts Portal

File Structure and Content Delivery

The local source lives at /Users/cb/Documents/repos/sites/quickdumpnow.com/books/index.html. The deployment required uploading to S3 with a specific key structure to enable CloudFront's pretty URL routing:

  • s3://quickdumpnow-web/books/index.html — standard object key for nested access
  • s3://quickdumpnow-web/books/ — bare directory key for root-level pretty URL requests

Why both? With a plain S3 REST origin, CloudFront forwards /books/ as the literal object key books/; S3 only falls back to an index document when served through its website endpoint. Uploading the content under both keys ensures /books, /books/, and /books/index.html all resolve correctly. This is a common pattern when CloudFront sits in front of S3 without a Lambda@Edge rewrite layer.
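The dual upload can be sketched with boto3. The bucket name comes from the listing above; the `keys_for_pretty_url` helper is a hypothetical illustration of the key-mapping rule, not a utility from the repo:

```python
BUCKET = "quickdumpnow-web"
LOCAL_INDEX = "index.html"  # the books page, relative to wherever you run this

def keys_for_pretty_url(prefix: str) -> list[str]:
    """Return the S3 keys that must both exist so /<prefix>/ and
    /<prefix>/index.html resolve against a plain S3 REST origin."""
    prefix = prefix.strip("/")
    return [f"{prefix}/index.html", f"{prefix}/"]

def deploy(bucket: str = BUCKET) -> None:
    import boto3  # imported lazily so the pure helper is usable without AWS deps
    s3 = boto3.client("s3")
    for key in keys_for_pretty_url("books"):
        # Same bytes under both keys; ContentType matters so browsers render HTML
        s3.upload_file(LOCAL_INDEX, bucket, key,
                       ExtraArgs={"ContentType": "text/html"})
```

The helper makes the convention explicit: any future pretty-URL section (say /invoices) gets the same two-key treatment.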

robots.txt Configuration

Updated s3://quickdumpnow-web/robots.txt to block indexing of the books portal (not yet ready for public discovery). This required:

  • Adding disallow rule: User-agent: * / Disallow: /books
  • Re-uploading to S3 with Cache-Control: max-age=3600 so CloudFront and browsers refresh the file within an hour (crawlers refetch robots.txt on their own schedule, typically within a day)
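The upload can be sketched as below, assuming the file only needs this one rule; if the live robots.txt carries other directives, merge rather than overwrite:

```python
# Minimal robots.txt blocking the books portal from indexing
ROBOTS_TXT = "\n".join([
    "User-agent: *",
    "Disallow: /books",
    "",
])

def upload_robots(bucket: str = "quickdumpnow-web") -> None:
    import boto3  # lazy import: only needed for the actual upload
    boto3.client("s3").put_object(
        Bucket=bucket,
        Key="robots.txt",
        Body=ROBOTS_TXT.encode("utf-8"),
        ContentType="text/plain",
        CacheControl="max-age=3600",  # matches the one-hour refresh window above
    )
```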

CloudFront Cache Invalidation

Invalidated the following paths on the QuickDumpNow CloudFront distribution:

  • /books
  • /books/
  • /books/index.html
  • /robots.txt

A custom error response on the CloudFront distribution (404 → homepage redirect) was previously masking the missing object. Invalidation clears the cached redirect, forcing CloudFront to refetch from origin and discover the newly uploaded objects.
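The invalidation request can be sketched with boto3 (the distribution ID is a placeholder, not from the session; CallerReference just needs to be unique per request):

```python
import time

PATHS = ["/books", "/books/", "/books/index.html", "/robots.txt"]

def invalidation_batch(paths: list[str]) -> dict:
    """Build the CreateInvalidation request body; a timestamp-based
    CallerReference is sufficient for one-off manual runs."""
    return {
        "Paths": {"Quantity": len(paths), "Items": list(paths)},
        "CallerReference": f"books-deploy-{int(time.time())}",
    }

def invalidate(distribution_id: str) -> None:  # distribution ID is a placeholder
    import boto3  # lazy import so the batch builder stays testable offline
    boto3.client("cloudfront").create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch=invalidation_batch(PATHS),
    )
```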

Why Not a Lambda@Edge Rewrite?

Given that this is a simple static portal, the dual S3 key approach was preferred over a Lambda@Edge origin request rewrite because:

  • Cost: S3 uploads are negligible; Lambda@Edge adds per-request charges
  • Latency: No extra compute on the request path
  • Maintenance: One fewer Lambda function to version-control and monitor

If the receipts portal evolves into a backend-driven system (form processing, database writes), we'd reconsider this decision.

Part 2: Automating the JADA Port Sheet

The Architecture

The port sheet automation spans three codebases:

  • Google Apps Script: /Users/cb/Documents/repos/sites/queenofsandiego.com/PortSheetReporter.gs — reads the Google Sheet, formats data, triggers email
  • Python CLI tools: /Users/cb/Documents/repos/tools/jada_port_sheet.py and reauth_jada_calendar.py — local utilities for sheet inspection and OAuth token refresh
  • Google Calendar + Sheets API: Credentials stored in Lambda environment variables; calendar events trigger the Apps Script

Sheet Structure and Data Model

The source of truth is the "JADA Port Log 2026" Google Sheet, which uses a multi-tab design:

  • Template tab: Column headers and formatting rules
  • Month tabs (e.g., "March", "April"): Individual charter entries with fields: Date, Boat Name, Captain, Destination, Duration, Crew Count, Gross, Expenses, Net

Entries are appended programmatically; the April 2026 charter (Joseph Zurek, $1,845.72 total) was added using the Google Sheets API with the correct tab ID (sheetId extracted from the sheet metadata).
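An append of that kind can be sketched with google-api-python-client. Only the name and the $1,845.72 figure come from the session; the other field values, the gross/expense split, and the spreadsheet ID are illustrative assumptions, and the column order follows the tab schema above:

```python
def charter_row(date, boat, captain, destination, duration,
                crew, gross, expenses) -> list[str]:
    """Build one row in the month-tab column order; Net is derived
    as Gross - Expenses rather than entered by hand."""
    return [date, boat, captain, destination, duration, str(crew),
            f"{gross:.2f}", f"{expenses:.2f}", f"{gross - expenses:.2f}"]

def append_charter(service, spreadsheet_id: str, row: list[str],
                   tab: str = "April") -> None:
    # values.append targets the tab by name; USER_ENTERED lets Sheets
    # parse dates and numbers the same way manual entry would
    service.spreadsheets().values().append(
        spreadsheetId=spreadsheet_id,
        range=f"'{tab}'!A:I",
        valueInputOption="USER_ENTERED",
        insertDataOption="INSERT_ROWS",
        body={"values": [row]},
    ).execute()
```

Deriving Net in code keeps the arithmetic consistent across programmatic and manual entries.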

OAuth Token Refresh and Calendar Integration

The port sheet generation is triggered by a Google Calendar event. To avoid token expiry mid-run, we implemented reauth_jada_calendar.py:

python reauth_jada_calendar.py
# Uses local OAuth credentials to refresh the calendar API token
# Persists new refresh_token to disk
# On success, pushes GCAL_REFRESH_TOKEN environment variable to Lambda

This script:

  • Checks the existing token's expiry and scopes
  • Compares client ID in the calendar credentials file against the drive credentials to ensure they match
  • Handles the OAuth 2.0 authorization code flow on localhost:8765, first killing any stale process still holding the port
  • Validates the new token against the Calendar API before pushing to Lambda
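The first two checks above can be sketched with google-auth. File paths and the scope are hypothetical stand-ins for whatever the repo actually uses:

```python
import json

def client_ids_match(calendar_creds_path: str, drive_creds_path: str) -> bool:
    """Both credential files must carry the same OAuth client ID before
    a refresh token minted for one is reused with the other."""
    def client_id(path: str) -> str:
        with open(path) as f:
            data = json.load(f)
        # Desktop-app client secrets nest under "installed";
        # authorized-user token files keep client_id at the top level
        return data.get("installed", data).get("client_id")
    return client_id(calendar_creds_path) == client_id(drive_creds_path)

def refresh_calendar_token(token_path: str,
                           scopes=("https://www.googleapis.com/auth/calendar",)):
    # lazy imports: google-auth is only needed for the actual refresh
    from google.oauth2.credentials import Credentials
    from google.auth.transport.requests import Request
    creds = Credentials.from_authorized_user_file(token_path, list(scopes))
    if creds.expired and creds.refresh_token:
        creds.refresh(Request())          # exchanges refresh_token for a new access token
        with open(token_path, "w") as f:  # persist to disk, as the script does
            f.write(creds.to_json())
    return creds
```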

Why this complexity? Google OAuth access tokens expire after roughly an hour, while the port sheet runs on a monthly schedule. Keeping a long-lived refresh_token in Lambda lets each run mint a fresh access token instead of failing on expiry.

Email Delivery

The Apps Script invokes sendEmail() to deliver the formatted port sheet to stakeholders (Edwin, Maria, Caro). Email is sent from the Google Workspace account associated with the Sheet; recipients are defined in the script as a hardcoded array (could be parameterized in a future iteration).

Key Decisions and Trade-offs

  • Spreadsheet as Database: Using Google Sheets instead of a relational database keeps operational friction low (non-technical stakeholders can inspect/audit the data), but sacrifices querying flexibility. A future refactor to a proper backend would enable richer analytics.
  • Monthly Tab Structure: Organizing by month aids readability and aligns with business operations, but complicates year-over-year comparisons. Consider a "Unified" tab with helper formulas for aggregation.
  • Apps Script vs. Cloud Functions: Apps Script is tightly integrated with Google Sheets and Calendar; Cloud Functions would require additional credential management. Apps Script was the right choice here.
  • Lambda vs. Cloud Tasks for Token Refresh: Lambda's environment variable injection (via Terraform or AWS CLI) is simpler than Cloud Tasks' secret manager integration for this use case.

Testing and Validation

Before sending the port sheet, we validated the pipeline:

  • Dry-run