Deploying a Receipt Upload System for quickdumpnow.com and Automating Port Sheet Data Sync
This session involved two parallel projects: standing up a receipt management interface for a trailer rental business, and debugging an automated port sheet synchronization system that pipes charter data into Google Sheets. Both required careful coordination of static asset deployment, CloudFront caching strategies, and OAuth token refresh logic.
Receipt Upload System Deployment
The quickdumpnow.com books page needed to transition from a static landing page to a functional receipt upload interface. The site structure already existed at /Users/cb/Documents/repos/sites/quickdumpnow.com/, with a books/ directory ready for deployment.
File Structure and Deployment Strategy
- Modified `/books/index.html` to match the existing site design language
- Updated `robots.txt` to block the `/books` path from search indexing (since it's an authenticated resource for internal use)
- Deployed to the S3 bucket behind the quickdumpnow.com distribution using a dual-key upload pattern: one key for `books/index.html` and one for the directory-style pretty URL `books/`
The dual-key approach is critical here. S3 doesn't automatically serve `index.html` when you request a directory path without explicit configuration. By uploading the same content to both the `books/index.html` and `books/` keys, we ensure that both https://quickdumpnow.com/books and https://quickdumpnow.com/books/ resolve correctly, avoiding redirect chains.
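A minimal sketch of the key-derivation side of this pattern (the bucket name and `put_object` call in the comment are illustrative placeholders, not taken from the session):

```python
def pretty_url_keys(page_path: str) -> list:
    """Return the pair of S3 keys that let both /books and /books/
    serve the same index.html without redirect chains."""
    base = page_path.strip("/")                  # e.g. "books"
    return [f"{base}/index.html", f"{base}/"]    # explicit file key + pretty-URL key

# Each key would then receive identical HTML content, e.g. (hypothetical client):
#   s3.put_object(Bucket="quickdumpnow.com", Key=key, Body=html,
#                 ContentType="text/html")
```

Setting an explicit `ContentType` matters for the `books/` key; without it, S3 defaults to `binary/octet-stream` and browsers may download the page instead of rendering it.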
CloudFront Configuration Issue
Initial testing showed requests to /books returning the homepage instead of the receipt page. This indicated the CloudFront distribution had a custom error response configured—likely routing all 404s back to the root index. While this is excellent UX for a marketing site (users with typos don't see error pages), it masks deployment issues during development.
The fix involved:
- Verifying the distribution's custom error response settings (Status Code: 404 → 302 redirect to /)
- Running an explicit CloudFront cache invalidation for both the `/books` and `/books/*` patterns
- Waiting for invalidation completion (~30–60 seconds) before verifying the live endpoint
Command executed:
```bash
aws cloudfront create-invalidation \
  --distribution-id [DISTRIBUTION_ID] \
  --paths "/books" "/books/*" "/robots.txt"
```
The `robots.txt` change prevents search engines from indexing this internal tool, which is important for keeping the site's crawl budget focused on public-facing content.
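The blocking rule itself is a single prefix `Disallow` (sketched here; the rest of the site's `robots.txt` isn't shown in this session):

```
User-agent: *
Disallow: /books
```

Because `robots.txt` matching is prefix-based, this one line covers `/books`, `/books/`, and everything beneath it.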
Port Sheet Automation and Calendar Token Refresh
In parallel, the automated port sheet system that feeds charter data into the JADA Port Log sheets was experiencing authentication failures. The system consists of three components:
- `/Users/cb/Documents/repos/tools/jada_port_sheet.py` — Main data sync script that reads from Google Calendar and writes to Google Sheets
- `/Users/cb/Documents/repos/tools/reauth_jada_calendar.py` — Token refresh utility for maintaining OAuth credentials
- `/Users/cb/Documents/repos/sites/queenofsandiego.com/PortSheetReporter.gs` — Apps Script that triggers the sync and handles response formatting
Root Cause: Stale OAuth Tokens
Google's OAuth 2.0 refresh tokens occasionally become invalid, particularly after credential file changes or extended idle periods. The initial troubleshooting revealed:
- The existing `GCAL_REFRESH_TOKEN` in the Lambda environment was no longer valid
- Two credential JSON files existed with potentially mismatched client IDs
- The calendar token file contained an expired refresh token that couldn't be refreshed
The solution required re-authenticating from scratch. The `reauth_jada_calendar.py` script implements a local OAuth flow:
```bash
python3 reauth_jada_calendar.py
```
This script:
- Launches a local HTTP server on port 8765 to receive the OAuth callback
- Opens the user's browser to Google's authorization endpoint with the Calendar API scope
- Exchanges the authorization code for fresh access and refresh tokens
- Saves the refresh token to a file for persistence
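The authorization-URL step of that flow can be sketched as follows; the scope and redirect path are assumptions, since the script's internals aren't reproduced in this session:

```python
import urllib.parse

AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"
CALENDAR_SCOPE = "https://www.googleapis.com/auth/calendar.readonly"  # assumed scope

def build_auth_url(client_id: str, port: int = 8765) -> str:
    """Build the consent URL the script opens in the browser; the local
    HTTP server listening on `port` receives the authorization code."""
    params = {
        "client_id": client_id,
        "redirect_uri": f"http://localhost:{port}/",
        "response_type": "code",
        "scope": CALENDAR_SCOPE,
        "access_type": "offline",   # required to receive a refresh token
        "prompt": "consent",        # forces Google to issue a new refresh token
    }
    return AUTH_ENDPOINT + "?" + urllib.parse.urlencode(params)
```

The `access_type=offline` and `prompt=consent` parameters are the key details: without them, Google may return only a short-lived access token and skip re-issuing the refresh token this re-auth exists to obtain.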
Infrastructure Gotcha: Port Conflicts
During testing, port 8765 was already in use by a stale Python process. Rather than changing the callback redirect URI in the OAuth credential (which would require updating Google Cloud Console configuration), we killed the existing process:
```bash
lsof -ti:8765 | xargs kill -9
```
This is a small operational detail, but important: when integrating local OAuth flows into CI/CD or automated tooling, you need to either bind to an ephemeral port (0), implement port retry logic, or ensure cleanup between runs.
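The ephemeral-port fallback can be sketched like this; note that falling back to a different port only works when the OAuth client type permits arbitrary localhost ports (Google's desktop-app client type does):

```python
import socket

def pick_callback_port(preferred: int = 8765) -> int:
    """Try the preferred OAuth callback port first; fall back to an
    OS-assigned ephemeral port if it is already bound. (The bind/close
    check is a sketch and has a small race window before reuse.)"""
    for port in (preferred, 0):  # 0 asks the OS for any free port
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind(("127.0.0.1", port))
        except OSError:
            s.close()
            continue
        chosen = s.getsockname()[1]
        s.close()
        return chosen
    raise RuntimeError("no free port available")
```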
Token Management and Lambda Deployment
Once the new refresh token was obtained, it was pushed to AWS Lambda's environment variables:
```bash
aws lambda update-function-configuration \
  --function-name [PORT_SHEET_LAMBDA_NAME] \
  --environment "Variables={GCAL_REFRESH_TOKEN=[NEW_TOKEN]}"
```
The Lambda function uses this refresh token to request fresh access tokens on each invocation, avoiding the need to redeploy when access tokens expire; this is the standard pattern for long-lived integrations with external APIs. One caveat: `--environment` replaces the function's entire variable map, so any other existing environment variables must be included in the same call.
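The per-invocation exchange can be sketched like this; the `GCAL_CLIENT_ID` and `GCAL_CLIENT_SECRET` variable names are assumptions, since only `GCAL_REFRESH_TOKEN` appears in the session:

```python
import json
import os
import urllib.parse
import urllib.request

TOKEN_URL = "https://oauth2.googleapis.com/token"  # Google's OAuth 2.0 token endpoint

def build_refresh_request(client_id: str, client_secret: str,
                          refresh_token: str) -> bytes:
    """Encode the refresh-token grant body sent to the token endpoint."""
    return urllib.parse.urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    }).encode()

def get_access_token() -> str:
    """Exchange the long-lived refresh token (from the Lambda environment)
    for a short-lived access token, on every invocation."""
    body = build_refresh_request(
        os.environ["GCAL_CLIENT_ID"],      # assumed variable name
        os.environ["GCAL_CLIENT_SECRET"],  # assumed variable name
        os.environ["GCAL_REFRESH_TOKEN"],
    )
    with urllib.request.urlopen(urllib.request.Request(TOKEN_URL, data=body)) as resp:
        return json.loads(resp.read())["access_token"]
```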
Port Sheet Entry Format and Data Sync
With authentication restored, we added the charter entry for the $1,845.72 transaction. The port sheet system reads from Google Calendar events and maps them into the JADA Port Log spreadsheet structure.
Data Flow:
- Charter booking appears in Google Calendar (via external system or manual entry)
- `jada_port_sheet.py` queries the calendar using the refresh token
- Script parses event titles/descriptions for vessel name, client, and amount
- Data is appended to the appropriate month tab in the JADA Port Log 2026 sheet
- `PortSheetReporter.gs` (Apps Script) handles any post-processing or notifications
The entry format follows the existing template structure in the spreadsheet, ensuring consistency with historical records. The script intelligently creates month tabs if they don't exist, allowing the system to handle year-round data without manual sheet setup.
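As an illustration of the parsing and tab-naming steps (the actual patterns in `jada_port_sheet.py` aren't shown here, so both the amount regex and the English-month tab convention are assumptions):

```python
import re
from datetime import date
from typing import Optional

AMOUNT_RE = re.compile(r"\$([\d,]+\.\d{2})")  # assumed format: $1,845.72

def parse_amount(text: str) -> Optional[float]:
    """Pull a dollar amount out of an event title or description."""
    m = AMOUNT_RE.search(text)
    return float(m.group(1).replace(",", "")) if m else None

def month_tab_name(d: date) -> str:
    """Tab name for a charter date; assumes tabs are named by month."""
    return d.strftime("%B")  # e.g. "March"
```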
Key Decisions and Trade-offs
- Dual S3 Key Upload: Simpler here than enabling S3 static website hosting for index documents, which would require bucket-level changes and could have side effects on other paths
- CloudFront 404 Custom Error: Good for user experience, but requires explicit invalidation during deployments—we accepted this as a known limitation
- Token Refresh in Lambda: Avoids credential drift by refreshing on each invocation rather than caching tokens indefinitely
- Local OAuth Flow: More secure than embedding long-lived service account keys, and doesn't require additional GCP service account infrastructure