Building an Automated Technical Blog System for Multi-Site Session Documentation
This session established a fully automated infrastructure to generate granular technical blog posts across four separate domain properties—queenofsandiego.com, sailjada.com, dangerouscentaur.com, and burialsatseasandiego.com—capturing implementation details directly from Claude development session transcripts.
The Problem and Design Philosophy
The requirement was to create a transparent audit trail of all technical work performed across these four properties, with enough detail that stakeholders (like Sergio) could understand exactly what was implemented, why, and how. This isn't a marketing blog—it's an engineering journal that automatically publishes whenever a session ends, extracting technical specifics from session transcripts without exposing credentials or sensitive data.
System Architecture
The solution consists of four primary components:
- Session Hook (`/Users/cb/.claude/hooks/tech_blog_stop.sh`): Executes when any Claude Code session stops, triggering the blog generation pipeline
- Infrastructure Initializer (`/Users/cb/Documents/repos/tools/tech_blog_init.py`): Provisions S3 buckets, CloudFront distributions, ACM certificates, and DNS records for each domain's tech subdomain
- Blog Generator (`/Users/cb/Documents/repos/tools/tech_blog_generator.py`): Parses session transcripts in JSONL format, filters technical content, and generates HTML blog posts
- Navigation Integration: Updated the Ship's Papers menu on each main site to link to its respective tech blog
Infrastructure Setup Details
S3 and CloudFront Architecture
For each domain, we provisioned a dedicated S3 bucket following the pattern `tech-[domain]-blog`:
- `tech-queenofsandiego-blog`: Serves tech.queenofsandiego.com via CloudFront distribution
- `tech-sailjada-blog`: Serves tech.sailjada.com via CloudFront distribution
- `tech-dangerouscentaur-blog`: Serves tech.dangerouscentaur.com via the existing wildcard CloudFront distribution (E2Q4UU71SRNTMB on the dc-sites bucket)
- `tech-burialsatseasandiego-blog`: Serves tech.burialsatseasandiego.com via CloudFront distribution
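The naming convention above is mechanical, so the initializer can derive every bucket name from the apex domain. A minimal sketch of that mapping (the helper name is illustrative, not taken from the actual `tech_blog_init.py`):

```python
# Sketch of the tech-[domain]-blog naming convention described above.
# tech_bucket_name is a hypothetical helper for illustration.
def tech_bucket_name(domain: str) -> str:
    """Map an apex domain to its tech-blog S3 bucket name."""
    label = domain.split(".")[0]  # drop the TLD, e.g. "queenofsandiego"
    return f"tech-{label}-blog"

DOMAINS = [
    "queenofsandiego.com",
    "sailjada.com",
    "dangerouscentaur.com",
    "burialsatseasandiego.com",
]

for d in DOMAINS:
    print(tech_bucket_name(d))  # e.g. tech-queenofsandiego-blog
```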
Each bucket is configured with:
- Versioning enabled for audit trail purposes
- Block public access disabled (CloudFront origin access only)
- Website hosting enabled with `index.html` as the index document
- CORS headers configured for potential future API integration
TLS Certificate Strategy
The project leveraged existing wildcard certificates where available:
- `*.queenofsandiego.com`: Existing wildcard ACM certificate in us-east-1
- `*.sailjada.com`: Existing wildcard ACM certificate in us-east-1
- dangerouscentaur.com: Existing wildcard CloudFront distribution already handling `*.dangerouscentaur.com`
- burialsatseasandiego.com: New ACM certificate provisioned with DNS validation via the GoDaddy API (domain registrar)
For burialsatseasandiego.com specifically, the provisioning script automatically detected the GoDaddy nameservers and added the DNS validation CNAME record (_[hash].tech.burialsatseasandiego.com) to the GoDaddy account via API, eliminating manual DNS configuration.
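The validation step amounts to translating the CNAME that ACM hands back into the record shape GoDaddy's v1 records API expects (record names relative to the zone apex). A hedged sketch, assuming GoDaddy's public `PATCH /v1/domains/{domain}/records` endpoint; the helper and the example hash values are illustrative, not the actual script's code:

```python
# Illustrative helper (not the actual tech_blog_init.py code) that converts
# an ACM DNS-validation CNAME into a GoDaddy v1 records API payload.
# "_abc123" / "_xyz789" below are hypothetical placeholder values.

def godaddy_validation_record(acm_name: str, acm_value: str, zone: str) -> dict:
    """Build the GoDaddy record dict for an ACM validation CNAME."""
    # ACM returns a fully qualified record name like
    # "_abc123.tech.burialsatseasandiego.com." -- strip the trailing dot
    # and the zone suffix to get the zone-relative record name.
    relative = acm_name.rstrip(".")
    suffix = "." + zone
    if relative.endswith(suffix):
        relative = relative[: -len(suffix)]
    return {"type": "CNAME", "name": relative, "data": acm_value, "ttl": 600}

record = godaddy_validation_record(
    "_abc123.tech.burialsatseasandiego.com.",
    "_xyz789.acm-validations.aws.",
    "burialsatseasandiego.com",
)
print(record)
# The payload is then sent as a one-element JSON array to
#   PATCH https://api.godaddy.com/v1/domains/{zone}/records
# with an "Authorization: sso-key KEY:SECRET" header.
```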
DNS Resolution
Domain DNS resolution mapped as follows:
- queenofsandiego.com and sailjada.com: Route53 hosted zones with ALIAS records pointing to the CloudFront distributions
- dangerouscentaur.com: Namecheap DNS with CNAME record `tech.dangerouscentaur.com CNAME [CloudFront domain]`
- burialsatseasandiego.com: GoDaddy DNS with CNAME record `tech.burialsatseasandiego.com CNAME [CloudFront domain]`
Session Transcript Processing
Claude Code sessions produce JSONL (JSON Lines) formatted transcripts. The blog generator:
- Reads the session transcript file from `~/.claude/projects/[project-path]/sessions/`
- Parses each line as a JSON object, extracting tool-use entries (file operations, commands executed)
- Filters entries relevant to each domain (looking for file paths in site repos and tools directories)
- Categorizes work: infrastructure changes, content updates, script development, configuration management
- Strips credentials using regex patterns for common secret formats
- Generates chronological HTML with file paths, command examples, and reasoning
- Uploads to respective S3 buckets with appropriate Content-Type headers
- Invalidates CloudFront caches to ensure immediate availability
The transcript parsing is domain-aware—it examines file paths like /Users/cb/Documents/repos/sites/sailjada.com/ to route posts to the correct tech blog.
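The parse-and-route step can be sketched in a few lines. This is a minimal illustration, assuming a JSONL entry carries a `file_path` field; the real `tech_blog_generator.py` may use different keys:

```python
import json

# Minimal sketch of the domain-aware transcript routing described above.
# The "file_path" field name is an assumption for illustration.
SITE_ROOT = "/Users/cb/Documents/repos/sites/"

def route_entries(jsonl_text: str) -> dict:
    """Group tool-use entries by the site whose repo their file path touches."""
    by_domain: dict = {}
    for line in jsonl_text.splitlines():
        if not line.strip():
            continue  # JSONL: one JSON object per non-empty line
        entry = json.loads(line)
        path = entry.get("file_path", "")
        if path.startswith(SITE_ROOT):
            # The first path component after sites/ is the domain,
            # e.g. "sailjada.com" -> routes to tech.sailjada.com
            domain = path[len(SITE_ROOT):].split("/")[0]
            by_domain.setdefault(domain, []).append(entry)
    return by_domain

sample = "\n".join([
    json.dumps({"tool": "Edit", "file_path": SITE_ROOT + "sailjada.com/index.html"}),
    json.dumps({"tool": "Bash", "file_path": ""}),
])
print(route_entries(sample))  # only the sailjada.com entry is routed
```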
Safety and Credential Filtering
The generator implements multiple layers of credential scrubbing:
- Regex patterns matching API key formats (e.g., AWS key IDs, UUID-style tokens)
- Environment variable value redaction before logging command output
- Exclusion of entries containing known sensitive file paths (e.g., `.aws/credentials`)
- Manual review prompt before uploading posts containing potential secrets
This ensures technical transparency without exposing authentication material.
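The regex layer can be sketched as follows. The exact patterns in `tech_blog_generator.py` aren't shown in this post, so these two are representative examples of the formats mentioned above (AWS access key IDs and UUID-style tokens):

```python
import re

# Representative scrubbing patterns; the production list is assumed to be longer.
PATTERNS = [
    re.compile(r"\bAKIA[0-9A-Z]{16}\b"),             # AWS access key ID format
    re.compile(r"\b[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-"
               r"[0-9a-f]{4}-[0-9a-f]{12}\b", re.I),  # UUID-style token
]

def scrub(text: str) -> str:
    """Replace anything matching a known secret format with [REDACTED]."""
    for pattern in PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text

print(scrub("export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE"))
# → export AWS_ACCESS_KEY_ID=[REDACTED]
```

Pattern-based scrubbing is a best-effort layer, which is why the pipeline backstops it with sensitive-path exclusion and a manual review prompt.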
Navigation Integration
Updated the Ship's Papers dropdown menu in /Users/cb/Documents/repos/sites/queenofsandiego.com/index.html to include a "Tech Blog" link. Each domain's main-site navigation was similarly updated to link to its respective tech blog subdomain.
Provisioning and Testing
The tech_blog_init.py script was run against all four domains, creating:
- S3 buckets with default index.html placeholder pages
- CloudFront distributions with appropriate origin configurations and caching behaviors
- DNS records (Route53 ALIAS or registrar CNAME as applicable)
- Certificate validation records (automatically created at GoDaddy for burialsatseasandiego.com)
All four tech blogs were verified as live and accessible within minutes of provisioning.
Session Hook Integration
The Stop hook (tech_blog_stop.sh) is registered in Claude Code settings and executes automatically when sessions terminate. It:
- Reads the current session transcript from the project directory
- Invokes the blog generator with domain auto-detection