Deploying a Receipt Management System for quickdumpnow.com and Automating Port Sheet Generation with Google Apps Script
This post covers two concurrent infrastructure and automation projects: standing up a receipt management interface at quickdumpnow.com/books and refactoring the JADA port sheet generation pipeline to use Google Apps Script with calendar token refresh automation.
Project 1: Receipt Management UI at quickdumpnow.com/books
What Was Done
Deployed a new receipt upload interface at https://quickdumpnow.com/books to support trailer rental business documentation. The deployment involved:
- Creating and updating /Users/cb/Documents/repos/sites/quickdumpnow.com/books/index.html with a custom HTML interface
- Modifying /Users/cb/Documents/repos/sites/quickdumpnow.com/robots.txt to block the /books path from search indexing
- Uploading to S3 and invalidating the CloudFront cache
- Debugging CloudFront's custom 404 error handler, which was redirecting /books to the homepage
Technical Details: S3 and CloudFront Configuration
The quickdumpnow.com infrastructure uses a common pattern: S3 origin with CloudFront distribution and custom error responses.
The Problem: When accessing https://quickdumpnow.com/books, CloudFront returned the homepage instead of the books landing page. This occurred because:
- The S3 object at books/index.html existed but wasn't accessible via the pretty URL /books
- CloudFront had a custom error response rule redirecting all 404s to the root homepage (a common pattern for single-page applications)
- S3 doesn't automatically route directory requests to index.html the way a traditional web server does
The Solution: Deploy the index.html file to two S3 keys to support both directory-style and explicit file access:
# Upload both keys to support /books and /books/ access patterns
aws s3 cp books/index.html s3://quickdumpnow-prod/books/index.html
aws s3 cp books/index.html s3://quickdumpnow-prod/books
After uploading, we invalidated the CloudFront distribution cache for both the directory pattern and the explicit filename:
aws cloudfront create-invalidation --distribution-id [DIST_ID] --paths "/books" "/books/*" "/robots.txt"
Why This Approach: S3 + CloudFront doesn't have built-in directory index routing like Apache or Nginx. By uploading to both the directory and explicit keys, we ensure CloudFront can serve the content regardless of whether the client requests /books or /books/. The robots.txt modification prevents search engines from indexing the receipt management interface, keeping it as an internal tool.
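The dual-key rule generalizes to any pretty URL on the site. As a minimal sketch (the helper names are hypothetical, not from the deployment scripts), the S3 keys and CloudFront invalidation paths for a given pretty URL can be derived like this:

```python
def s3_keys_for_pretty_url(path: str) -> list[str]:
    """Return both S3 keys needed so CloudFront can serve a
    directory-style URL (e.g. /books) from an S3 origin that has
    no built-in directory index routing."""
    path = path.strip("/")
    return [f"{path}/index.html", path]


def invalidation_paths(path: str) -> list[str]:
    """CloudFront paths to invalidate after re-uploading the keys,
    covering both /books and /books/ access patterns."""
    path = "/" + path.strip("/")
    return [path, f"{path}/*"]
```

Feeding "/books" through these helpers reproduces the two `aws s3 cp` targets and the invalidation paths used above.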
Infrastructure Changes
- S3 Bucket: quickdumpnow-prod (existing)
- CloudFront Distribution: E[DIST_ID] (existing, configured with custom error responses)
- Route53: No changes required; DNS already points to CloudFront
- robots.txt Update: Added disallow rule for /books to prevent indexing
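The disallow rule in robots.txt presumably takes the standard form (exact file contents not shown in the deployment notes):

```
User-agent: *
Disallow: /books
```

Note that robots.txt only discourages well-behaved crawlers from indexing /books; it is not an access control mechanism.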
Project 2: Automated Port Sheet Generation with Google Apps Script and Calendar Token Refresh
What Was Done
Refactored the JADA port sheet generation system to:
- Read charter and port data from the JADA Port Log 2026 Google Sheet (tabs: Template, monthly sheets)
- Dynamically append monthly data with proper formatting (preserving Excel formatting from the template)
- Handle Google Calendar API authentication via refresh token stored in AWS Lambda environment
- Create a standalone reauthentication utility to refresh expired calendar tokens
- Send completed port sheets via email to stakeholders (Edwin, Maria, Caro)
Technical Details: Multi-Language Automation Stack
Core Scripts:
- /Users/cb/Documents/repos/tools/jada_port_sheet.py — Main orchestration script that reads Google Sheets, formats data, and triggers email notifications
- /Users/cb/Documents/repos/tools/reauth_jada_calendar.py — OAuth 2.0 refresh token utility for calendar API authentication
- /Users/cb/Documents/repos/sites/queenofsandiego.com/PortSheetReporter.gs — Google Apps Script (GAS) file deployed to the queenofsandiego.com project for email sending and spreadsheet operations
Data Source: JADA Port Log 2026 Google Sheet with multiple tabs:
- Template Tab: Contains the formatted header row and structure for new monthly data
- Monthly Tabs (e.g., April): Populated dynamically with charter entries, including columns: Date, Charter Name, Captain, Crew, Amount, Notes
The Python script reads the sheet structure, preserves the template formatting, and appends monthly entries. For the test entry, we added:
Date: [Charter Date]
Charter: Joseph Zurek Charter
Captain: [Captain Name]
Crew: [Crew Members]
Amount: $1,845.72
Notes: [Additional Details]
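Before a charter entry can be appended, it has to be ordered into the Template tab's column layout. A minimal sketch of that mapping (the function name and currency formatting are assumptions, not taken from jada_port_sheet.py):

```python
# Column order from the JADA Port Log 2026 Template tab
COLUMNS = ["Date", "Charter Name", "Captain", "Crew", "Amount", "Notes"]


def entry_to_row(entry: dict) -> list[str]:
    """Order a charter entry dict into the Template tab's column
    order, rendering numeric amounts as currency strings.
    Missing fields become empty cells."""
    row = []
    for col in COLUMNS:
        val = entry.get(col, "")
        if col == "Amount" and isinstance(val, (int, float)):
            val = f"${val:,.2f}"
        row.append(str(val))
    return row
```

A row built this way can then be passed to the Sheets API's `values.append` call against the appropriate monthly tab.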
Authentication Architecture: OAuth 2.0 Refresh Tokens
Challenge: The port sheet script needs to authenticate with Google Calendar API to fetch events and send emails. Access tokens expire every hour, so we need persistent refresh tokens.
Solution: Three-tier authentication system:
- Local Development: reauth_jada_calendar.py implements a local OAuth 2.0 flow listening on localhost:8765. It:
  - Directs the browser to the Google OAuth consent screen
  - Captures the authorization code from the redirect
  - Exchanges the code for access + refresh tokens
  - Saves the refresh token to a local config file
- Lambda Runtime: Refresh token stored in the AWS Lambda environment variable GCAL_REFRESH_TOKEN
- Script Execution: jada_port_sheet.py reads the refresh token and uses it to obtain a fresh access token before each API call
Why This Approach: Refresh tokens are long-lived credentials that don't require user interaction to renew. Storing one in Lambda environment means the port sheet can run on a schedule (e.g., daily or weekly) without manual re-authentication. The local reauthentication script only runs when the refresh token expires or needs rotation.
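The refresh-token exchange itself is a single POST to Google's token endpoint. As a hedged sketch of what jada_port_sheet.py does at startup (the helper name and the client-credential parameters are assumptions; only the GCAL_REFRESH_TOKEN variable name comes from the Lambda configuration above), the request body looks like this:

```python
import os
from urllib.parse import urlencode

# Google's OAuth 2.0 token endpoint
TOKEN_URI = "https://oauth2.googleapis.com/token"


def refresh_request_body(client_id: str, client_secret: str) -> str:
    """Build the x-www-form-urlencoded body that exchanges the
    long-lived refresh token (read from the Lambda environment)
    for a short-lived access token."""
    return urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": os.environ["GCAL_REFRESH_TOKEN"],
        "grant_type": "refresh_token",
    })
```

POSTing this body to TOKEN_URI returns a JSON payload whose `access_token` field is then attached as a Bearer token to each Calendar API call.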
OAuth Flow Implementation
The reauth_jada_calendar.py script uses the Google Python client library:
# Pseudocode structure (no actual credentials shown)
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ["https://www.googleapis.com/auth/calendar.readonly"]

# For local reauthentication: run the browser consent flow on
# localhost:8765 and capture the resulting credentials
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
creds = flow.run_local_server(port=8765)
# creds.refresh_token is the value stored in GCAL_REFRESH_TOKEN