Deploying a Receipt Management System for quickdumpnow.com and Automating Monthly Port Sheet Generation

Overview

This session involved two parallel infrastructure initiatives: deploying a receipt management landing page for a trailer rental business at quickdumpnow.com/books, and completing the automation pipeline for monthly port sheet generation and distribution to stakeholders. Together they required careful coordination of static site hosting, CloudFront caching behavior, and serverless function deployment.

Part 1: Deploying the Books Receipt Management Page

The Problem

The quickdumpnow.com/books endpoint was returning the site homepage instead of a dedicated receipt management page. The local development files existed at /Users/cb/Documents/repos/sites/quickdumpnow.com/books/index.html, but hadn't been deployed to production S3 storage. Additionally, CloudFront's error handling configuration was masking the missing object by defaulting to the homepage.

Deployment Strategy

The solution required three coordinated steps:

  • S3 Object Upload: Deploy the books index page to two S3 keys within the quickdumpnow distribution bucket to support both directory and pretty URL routing patterns:
    • s3://quickdumpnow-site/books/index.html (directory routing)
    • s3://quickdumpnow-site/books (pretty URL)
  • Robots.txt Update: Modified /Users/cb/Documents/repos/sites/quickdumpnow.com/robots.txt to block the /books path from search engine crawling, preventing duplicate content issues during the development phase.
  • CloudFront Cache Invalidation: Issued invalidation requests for both /books/* and /books patterns plus robots.txt to ensure edge caches in all regions refreshed within 30–60 seconds.
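The three steps above map to AWS CLI commands roughly as follows. This is a sketch: the bucket name comes from the keys above, but the distribution ID (`E2EXAMPLE123`) is a placeholder, and local file paths are abbreviated.

```shell
# 1. Upload to both keys; the extensionless copy needs an explicit content type
aws s3 cp books/index.html s3://quickdumpnow-site/books/index.html \
    --content-type text/html
aws s3 cp books/index.html s3://quickdumpnow-site/books \
    --content-type text/html

# 2. Push the updated robots.txt
aws s3 cp robots.txt s3://quickdumpnow-site/robots.txt \
    --content-type text/plain

# 3. Invalidate the affected paths at the edge (distribution ID is a placeholder)
aws cloudfront create-invalidation --distribution-id E2EXAMPLE123 \
    --paths "/books/*" "/books" "/robots.txt"
```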

Root Cause: CloudFront Error Responses

CloudFront's distribution was configured with custom error responses that rewrite 404s to the homepage. This is a common pattern for single-page applications, but it masks deployment issues: when the origin returns an error for a missing key (an S3 REST origin serves a 403 rather than a 404 if the bucket doesn't grant ListBucket), CloudFront's custom error response rule catches it and serves the root index page instead, typically with a 200 status. This meant the exact S3 keys CloudFront requests had to exist before the change would be visible.
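The relevant piece of the distribution config looks roughly like this (field names follow CloudFront's `DistributionConfig` API shape; the specific values shown are assumed, not read from the live distribution):

```json
{
  "CustomErrorResponses": {
    "Quantity": 1,
    "Items": [
      {
        "ErrorCode": 404,
        "ResponsePagePath": "/index.html",
        "ResponseCode": "200",
        "ErrorCachingMinTTL": 10
      }
    ]
  }
}
```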

Part 2: Automating the Port Sheet Pipeline

Background and Architecture

The port sheet system tracks sailing charters, fuel expenses, and port visits for a commercial sailing operation. Previously, this was a manual spreadsheet update process. The automation pipeline includes:

  • A Google Apps Script file at queenofsandiego.com/PortSheetReporter.gs serving as the report generator
  • A Python CLI tool (/tools/jada_port_sheet.py) for local testing and data validation
  • A new Python authentication module (/tools/reauth_jada_calendar.py) handling OAuth 2.0 token refresh
  • AWS Lambda integration for monthly scheduled execution

The Core Challenge: Authentication Across Services

The system needed to read from both Google Sheets (JADA Port Log 2026) and Google Calendar (to fetch charter event data), then generate and distribute a formatted port sheet via email. This required managing OAuth 2.0 refresh tokens across multiple Google scopes:

  • https://www.googleapis.com/auth/spreadsheets (read JADA Port Log)
  • https://www.googleapis.com/auth/calendar.readonly (fetch charter events)
  • https://www.googleapis.com/auth/gmail.send (send port sheet emails)

Implementation: The reauth_jada_calendar.py Module

Created a dedicated reauthentication script that handles the OAuth 2.0 authorization code flow:

# Sketch of the module's structure (real client secrets and tokens redacted);
# assumes the google-auth and google-auth-oauthlib packages.
from google_auth_oauthlib.flow import InstalledAppFlow
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request

SCOPES = [
    "https://www.googleapis.com/auth/spreadsheets",
    "https://www.googleapis.com/auth/calendar.readonly",
    "https://www.googleapis.com/auth/gmail.send",
]

def authorize_with_browser():
    """Run the browser-based OAuth flow; a local server on port 8765 receives the code."""
    flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
    creds = flow.run_local_server(port=8765)  # opens browser, exchanges code for tokens
    return creds.refresh_token

def refresh_access_token(refresh_token, client_id, client_secret):
    """Exchange the stored refresh token for a fresh access token (used by Lambda)."""
    creds = Credentials(None, refresh_token=refresh_token,
                        token_uri="https://oauth2.googleapis.com/token",
                        client_id=client_id, client_secret=client_secret)
    creds.refresh(Request())
    return creds.token

The script uses a local HTTP server on port 8765 to receive the OAuth callback, which meant handling stale processes left bound to that port by earlier runs. After a successful authorization flow, the refresh token is stored securely and deployed to AWS Lambda as an environment variable.
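Stale processes holding port 8765 were one failure mode; a quick check like the following (a hypothetical helper, not part of the module as described) can detect that before starting the flow:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0
```

If this returns True before the OAuth flow starts, a previous run likely left its callback server alive and the stale process should be killed first.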

Port Sheet Data Structure

The JADA Port Log 2026 spreadsheet structure required careful mapping:

  • Template Tab: Contains column headers and formatting specifications for each month
  • Month Tabs (March, April, etc.): Data rows in format: Date, Captain, Destination, Fuel Cost, Charter Revenue, Notes
  • Data Preservation: When generating new month tabs, existing data must be preserved exactly as formatted

The Python script reads this structure using the Google Sheets API and validates data types before appending new charter entries.
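A minimal sketch of the row-level validation, assuming the six-column layout above (the helper name and the exact checks are illustrative, not the script's actual code):

```python
from datetime import datetime

COLUMNS = ["Date", "Captain", "Destination", "Fuel Cost", "Charter Revenue", "Notes"]

def validate_row(row):
    """Validate and normalize one port-log row before it is appended to a month tab."""
    if len(row) != len(COLUMNS):
        raise ValueError(f"expected {len(COLUMNS)} columns, got {len(row)}")
    date, captain, destination, fuel, revenue, notes = row
    datetime.strptime(date, "%Y-%m-%d")  # raises ValueError unless ISO-formatted
    if not captain or not destination:
        raise ValueError("Captain and Destination are required")
    # Money columns must parse as numbers before hitting the Sheets API
    return [date, captain, destination, float(fuel), float(revenue), notes]
```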

Processing the Latest Charter Entry

For the charter completed yesterday, captained by Joseph Zurek, with total revenue of $1,845.72:

  • Queried Google Calendar for the charter event (date/time)
  • Retrieved destination and other metadata from event details
  • Validated fuel costs and charter revenue figures
  • Appended formatted row to the April 2026 tab: 2026-04-XX | Joseph Zurek | [Destination] | [Fuel] | 1845.72 | [Notes]
  • Triggered email send to stakeholders (Edwin, Maria, Caro) with the updated port sheet
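Gmail's `users.messages.send` endpoint expects a base64url-encoded RFC 822 message in a `raw` field. A sketch of building that payload (the helper name and example addresses are placeholders, not the stakeholders' real addresses):

```python
import base64
from email.message import EmailMessage

def build_gmail_payload(sender, recipients, subject, body):
    """Build the {'raw': ...} request body that Gmail's users.messages.send expects."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = subject
    msg.set_content(body)
    return {"raw": base64.urlsafe_b64encode(msg.as_bytes()).decode("ascii")}
```

The resulting dict is passed as the request body to `service.users().messages().send(userId="me", body=payload)` in the Google API client.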

Key Infrastructure Decisions

Why Store Refresh Tokens in Lambda Environment Variables

Rather than requesting user authorization on every scheduled execution, the refresh token is persisted and passed to Lambda. This allows the monthly cron job to run without human intervention while staying within the OAuth 2.0 model: Google refresh tokens remain valid until revoked or invalidated by the provider (note that apps left in "Testing" publishing status have refresh tokens that expire after seven days, so the consent screen must be published to production).
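Inside the Lambda handler, the token is read from the environment and fed to the refresh step. The variable and function names here are illustrative, not taken from the deployed code:

```python
import os

def load_refresh_token(env_var: str = "JADA_REFRESH_TOKEN") -> str:
    """Read the persisted refresh token from the Lambda environment, failing loudly if absent."""
    token = os.environ.get(env_var)
    if not token:
        raise RuntimeError(f"{env_var} is not set; re-run the browser authorization flow")
    return token
```

Failing loudly here is deliberate: a missing token means the browser-based authorization flow must be re-run locally, which the scheduled job cannot do on its own.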

Why Dual S3 Key Upload for the Books Page

When CloudFront is pointed at the S3 REST API endpoint (as here, rather than the S3 website endpoint), there is no built-in directory index behavior like a traditional web server's. Uploading to both books/index.html and books (with Content-Type set to text/html) ensures that both /books/ and /books requests resolve correctly, regardless of client URL normalization.

Why Python for Port Sheet Logic, Google Apps Script for Reporting

Google Apps Script (GAS) serves as the report template and formatting engine (familiar to non-technical stakeholders), while Python handles the orchestration, OAuth token management, and API coordination—areas where dynamic language capabilities and library ecosystems are stronger.

Testing and Validation

  • Dry-run the port sheet pipeline with test data before production deployment
  • Validate CloudFront cache behavior by checking response headers and timing
  • Verify robots.txt is correctly blocking /books
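The last two checks can be done from the command line. The `x-cache` header values (Hit/Miss from cloudfront) are CloudFront's; exact output depends on the edge location serving the request.

```shell
# Should now return the books page, not the homepage
curl -sI https://quickdumpnow.com/books | grep -iE 'HTTP|x-cache|content-type'

# Confirm the /books crawl block is live
curl -s https://quickdumpnow.com/robots.txt | grep -i 'disallow'
```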