
Building Multi-Site Automated Technical Blog Infrastructure with Session Capture and Auto-Publishing

This session established a fully automated technical blogging system across four domain properties, with infrastructure to capture development work, generate granular technical posts, and publish them to dedicated tech subdomains. The system hooks into Claude's session lifecycle to extract detailed engineering notes and transform them into formatted technical articles—no manual intervention required.

System Architecture Overview

The implementation consists of four key components:

  • Session Capture Hook: A Stop hook that executes when Claude sessions end, extracting session transcripts from the local `.claude/projects/` directory
  • Blog Generator: Python script that parses session transcripts, extracts tool use patterns and commands, and generates structured HTML blog posts
  • Infrastructure Provisioning: Initialization script that creates S3 buckets, CloudFront distributions, and DNS records for each tech blog domain
  • Deployment Pipeline: Automated S3 upload and CloudFront invalidation to publish posts immediately after generation

Infrastructure Created

Four tech blog properties were provisioned with identical infrastructure patterns:

  • tech.queenofsandiego.com – S3 bucket `qos-tech-blog`, CloudFront distribution, Route53 DNS alias pointing to CloudFront
  • tech.sailjada.com – S3 bucket `jada-tech-blog`, CloudFront distribution, Route53 DNS alias
  • tech.dangerouscentaur.com – S3 bucket `dc-tech-blog`, CloudFront distribution, Namecheap CNAME record (this domain's DNS is managed in Namecheap, not GoDaddy)
  • tech.burialsatseasandiego.com – S3 bucket `bats-tech-blog`, CloudFront distribution, GoDaddy DNS CNAME record
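
For reference, the site-to-infrastructure mapping above can be captured in a small Python dict. This is a sketch: the bucket names and DNS providers come from this post, while the structure itself (and the `SITES` name) is hypothetical; distribution IDs and certificate ARNs live in the project memory file and are omitted here.

```python
# Hypothetical mapping of tech blog properties to their provisioned resources.
# Bucket names and DNS providers are from the provisioning notes above.
SITES = {
    "tech.queenofsandiego.com":      {"bucket": "qos-tech-blog",  "dns": "route53"},
    "tech.sailjada.com":             {"bucket": "jada-tech-blog", "dns": "route53"},
    "tech.dangerouscentaur.com":     {"bucket": "dc-tech-blog",   "dns": "namecheap"},
    "tech.burialsatseasandiego.com": {"bucket": "bats-tech-blog", "dns": "godaddy"},
}
```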

Each distribution uses:

  • S3 origin configured with public read access via bucket policy
  • CloudFront cache behavior with 5-minute TTL for index.html and 24-hour TTL for static assets
  • Existing wildcard ACM certificates (*.queenofsandiego.com and *.sailjada.com), already issued and active, covering the first two subdomains
  • New ACM certificates for tech.dangerouscentaur.com and tech.burialsatseasandiego.com with DNS validation via appropriate DNS providers
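
The public-read bucket policy referenced above follows the standard S3 policy document shape. A minimal sketch (the `public_read_policy` helper is hypothetical; applying it with boto3 is shown only as a comment since it requires AWS credentials):

```python
import json

def public_read_policy(bucket: str) -> str:
    """Build an S3 bucket policy JSON document granting anonymous
    read access to every object in the bucket."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }],
    })

# Applying it (requires boto3 and credentials):
# boto3.client("s3").put_bucket_policy(
#     Bucket="qos-tech-blog", Policy=public_read_policy("qos-tech-blog"))
```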

Session Capture and Blog Generation Workflow

The Stop hook (/Users/cb/.claude/hooks/tech_blog_stop.sh) executes this sequence:

1. Extract the session ID from the active Claude session
2. Locate the JSONL transcript in ~/.claude/projects/[project-path]/sessions/
3. Parse tool_use entries to identify:
   - Files modified (Write/Edit operations)
   - Commands executed (shell_command entries)
   - Projects/buckets/distributions affected
4. Invoke blog generator with transcript path and target site
5. Generator creates HTML post with granular technical details
6. Upload post to appropriate S3 bucket
7. Invalidate CloudFront distribution cache
8. Create index.html redirect if needed
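
Step 3 of this sequence — pulling tool activity out of the JSONL transcript — might look like the following sketch. The field names (`message`, `content`, `tool_use`, `input`) are assumptions about the transcript schema, and `extract_activity` is a hypothetical helper, not the hook's actual code:

```python
import json
from pathlib import Path

def extract_activity(transcript: Path) -> dict:
    """Walk a JSONL session transcript and collect tool_use activity:
    files touched by Write/Edit and commands run as shell_command entries."""
    files, commands = set(), []
    for line in transcript.read_text().splitlines():
        if not line.strip():
            continue
        entry = json.loads(line)
        for block in entry.get("message", {}).get("content", []):
            if isinstance(block, dict) and block.get("type") == "tool_use":
                name = block.get("name", "")
                args = block.get("input", {})
                if name in ("Write", "Edit") and "file_path" in args:
                    files.add(args["file_path"])
                elif name == "shell_command" and "command" in args:
                    commands.append(args["command"])
    return {"files_modified": sorted(files), "commands": commands}
```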

The blog generator (/Users/cb/Documents/repos/tools/tech_blog_generator.py) processes the JSONL transcript and structures output as:

- Title (extracted from session context or inferred from major changes)
- What Was Done (bullet list of modified files with paths)
- Technical Details (commands run, AWS resources touched, config changes)
- Infrastructure Changes (S3/CloudFront/Route53/GoDaddy updates)
- Key Decisions (why certain approaches were chosen)
- What's Next (follow-up tasks or dependencies)
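
A minimal renderer for that section structure could be sketched as below. `render_post` is a hypothetical helper for illustration; the real generator at tech_blog_generator.py presumably produces richer markup:

```python
from html import escape

# Section order mirrors the generator's output structure described above.
SECTIONS = ["What Was Done", "Technical Details", "Infrastructure Changes",
            "Key Decisions", "What's Next"]

def render_post(title: str, sections: dict) -> str:
    """Render the structured sections as a minimal static HTML post,
    escaping all extracted text and skipping empty sections."""
    parts = [f"<h1>{escape(title)}</h1>"]
    for heading in SECTIONS:
        items = sections.get(heading)
        if not items:
            continue
        parts.append(f"<h2>{escape(heading)}</h2>")
        parts.append("<ul>" +
                     "".join(f"<li>{escape(i)}</li>" for i in items) +
                     "</ul>")
    return ("<!DOCTYPE html>\n<html><body>\n" +
            "\n".join(parts) + "\n</body></html>")
```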

The transcript parser specifically redacts sensitive data:

  • AWS credentials, access keys, secret keys
  • API tokens and authentication headers
  • GoDaddy API credentials and domain auth codes
  • Private IP addresses and internal service endpoints
  • Any content from reference_godaddy_credentials.md or similar sensitive memory files
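
A redaction pass along these lines can be sketched with regular expressions. The patterns below are illustrative, not the generator's actual rules: the `AKIA` access-key prefix and the 10.0.0.0/8 private range are real conventions, while the other patterns are assumptions about what credential strings look like in a transcript.

```python
import re

# Illustrative redaction rules, applied in order.
REDACTIONS = [
    (re.compile(r"AKIA[0-9A-Z]{16}"), "[REDACTED_AWS_KEY]"),           # AWS access key IDs
    (re.compile(r"(?i)(secret[_ ]?key\s*[:=]\s*)\S+"), r"\1[REDACTED]"),
    (re.compile(r"(?i)(authorization:\s*)\S.*"), r"\1[REDACTED]"),     # auth headers
    (re.compile(r"(?i)(sso-key\s+)\S+"), r"\1[REDACTED]"),             # GoDaddy API keys
    (re.compile(r"\b10\.\d{1,3}\.\d{1,3}\.\d{1,3}\b"), "[REDACTED_IP]"),  # 10/8 private IPs
]

def redact(text: str) -> str:
    """Apply every redaction pattern to the transcript text."""
    for pattern, replacement in REDACTIONS:
        text = pattern.sub(replacement, text)
    return text
```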

Integration with Ship's Papers Navigation

The main site navigation at /Users/cb/Documents/repos/sites/queenofsandiego.com/index.html was updated to include a "Ship's Tech Blog" link in the Ship's Papers dropdown menu, giving stakeholders like Sergio, who want deep visibility into what's being built, an immediate entry point.

The same navigation link is mirrored across all four properties:

  • queenofsandiego.com points to tech.queenofsandiego.com
  • sailjada.com points to tech.sailjada.com
  • dangerouscentaur.com points to tech.dangerouscentaur.com
  • burialsatseasandiego.com points to tech.burialsatseasandiego.com

CloudFront and DNS Deployment Details

For the Route53-managed domains (queenofsandiego.com and sailjada.com), alias records were created directly:

Route53 Alias Record:
  Name: tech.queenofsandiego.com
  Type: A (IPv4)
  Alias Target: [CloudFront distribution domain]
  Evaluate Target Health: false
  Routing Policy: Simple
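
In boto3 terms, that alias record corresponds to a change batch like the following sketch. The `Z2FDTNDATAQYW2` constant is the fixed hosted zone ID AWS publishes for all CloudFront alias targets; the `alias_change_batch` helper itself is hypothetical:

```python
# Fixed hosted zone ID used for every CloudFront distribution alias target.
CLOUDFRONT_HOSTED_ZONE_ID = "Z2FDTNDATAQYW2"

def alias_change_batch(record_name: str, cf_domain: str) -> dict:
    """Build the ChangeBatch for route53.change_resource_record_sets
    matching the alias record shown above."""
    return {
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": record_name,
                "Type": "A",
                "AliasTarget": {
                    "HostedZoneId": CLOUDFRONT_HOSTED_ZONE_ID,
                    "DNSName": cf_domain,
                    "EvaluateTargetHealth": False,
                },
            },
        }]
    }

# Applying it (requires boto3 and credentials):
# boto3.client("route53").change_resource_record_sets(
#     HostedZoneId="<hosted zone id>",
#     ChangeBatch=alias_change_batch("tech.queenofsandiego.com",
#                                    "<distribution>.cloudfront.net"))
```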

For Namecheap-managed dangerouscentaur.com and GoDaddy-managed burialsatseasandiego.com, CNAME records point to CloudFront:

CNAME Record:
  Name: tech
  Value: [CloudFront distribution domain name]
  TTL: 3600 seconds

The GoDaddy integration required programmatic DNS updates via the GoDaddy REST API, with credentials stored securely in the project memory files and excluded from git tracking.
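
That programmatic update can be sketched as a PUT against GoDaddy's v1 domains API. The endpoint shape and `sso-key` authorization header follow GoDaddy's published API; the helper is hypothetical and only assembles the request pieces, so no network call is made here:

```python
import json

def godaddy_cname_request(domain: str, name: str, target: str,
                          api_key: str, api_secret: str):
    """Build the URL, headers, and body for upserting a CNAME record
    via the GoDaddy v1 domains API (verify against current docs)."""
    url = f"https://api.godaddy.com/v1/domains/{domain}/records/CNAME/{name}"
    headers = {
        "Authorization": f"sso-key {api_key}:{api_secret}",
        "Content-Type": "application/json",
    }
    body = json.dumps([{"data": target, "ttl": 3600}])
    return url, headers, body

# Sending it would be: requests.put(url, headers=headers, data=body)
```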

Configuration and Infrastructure State

Infrastructure configuration was written to /Users/cb/.claude/projects/-Users-cb-Documents-repos/memory/project_tech_blogs.md, documenting:

  • S3 bucket names and regions
  • CloudFront distribution IDs (for invalidation API calls)
  • ACM certificate ARNs
  • DNS provider mapping (Route53 vs. Namecheap vs. GoDaddy)
  • Hosted zone IDs for Route53 domains

This reference ensures the blog generator can target the correct infrastructure without hardcoding sensitive values.

Key Technical Decisions

Why S3 + CloudFront instead of a traditional blog engine: Static HTML distribution from S3 through CloudFront provides global edge caching, zero infrastructure to maintain, and sub-second response times. Posts are generated as static HTML files, making them lightweight and resilient.

Why hook into session lifecycle: Capturing work at the moment sessions end ensures no engineering work goes undocumented. Developers don't need to remember to manually trigger blog generation—it's automatic.

Why JSONL transcripts: Claude's session transcripts already include precise structured data about tool usage, commands, and file changes. Parsing these avoids requiring developers to document their work manually; the transcript itself is the record of what happened.