```

Building Auto-Generated Technical Blog Infrastructure Across Four Domains

This session involved designing and implementing a comprehensive technical documentation system that automatically generates granular blog posts across four separate domains: tech.queenofsandiego.com, tech.dangerouscentaur.com, tech.sailjada.com, and tech.burialsatseasandiego.com. The goal was to create an audit trail of all engineering work with enough detail for stakeholders to understand the technical decisions being made.

Architecture Overview

The system consists of three primary components:

  • Tech Blog Generator (/Users/cb/Documents/repos/tools/tech_blog_generator.py) — Parses Claude session transcripts and converts them into structured HTML blog posts
  • Infrastructure Initializer (/Users/cb/Documents/repos/tools/tech_blog_init.py) — Provisions S3 buckets, CloudFront distributions, and DNS records for each tech blog domain
  • Stop Hook (/Users/cb/.claude/hooks/tech_blog_stop.sh) — Executes at the end of each Claude Code session to automatically generate and publish blog posts

Infrastructure Provisioning Strategy

Each tech blog required different DNS and CDN approaches based on existing wildcard certificates and domain registrars:

  • queenofsandiego.com and sailjada.com — Both have existing AWS ACM wildcard certificates (*.queenofsandiego.com and *.sailjada.com). These were leveraged to create S3 origin buckets and CloudFront distributions without additional certificate provisioning. Route53 hosted zones were used for DNS.
  • dangerouscentaur.com — Reuses the existing wildcard CloudFront distribution (E2Q4UU71SRNTMB) that points to the dc-sites S3 bucket. A CNAME record for tech.dangerouscentaur.com was added at Namecheap, the domain's DNS registrar.
  • burialsatseasandiego.com — DNS is managed at GoDaddy. A new S3 bucket and CloudFront distribution were provisioned, with ACM certificate validation handled via GoDaddy API integration.

This mixed-registrar approach required conditional logic in the infrastructure init script to detect which registrar manages each domain's DNS and apply the appropriate configuration mechanism.
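
The registrar-aware branching could be sketched along these lines. The registrar table, function name, and record details below are illustrative assumptions, not the actual tech_blog_init.py implementation:

```python
# Illustrative sketch of registrar-aware DNS dispatch. The mapping and
# helper name are assumptions for this example, not the real init script.

REGISTRARS = {
    "queenofsandiego.com": "route53",
    "sailjada.com": "route53",
    "dangerouscentaur.com": "namecheap",
    "burialsatseasandiego.com": "godaddy",
}

def dns_record_plan(domain: str, target: str) -> dict:
    """Return a registrar-specific plan for pointing tech.<domain> at CloudFront."""
    registrar = REGISTRARS.get(domain)
    if registrar is None:
        raise ValueError(f"unknown domain: {domain}")
    name = f"tech.{domain}"
    if registrar == "route53":
        # AWS-managed zone: use an alias A record in the hosted zone.
        return {"registrar": registrar, "type": "A", "alias": True,
                "name": name, "target": target}
    # Namecheap and GoDaddy both receive a plain CNAME via their own APIs.
    return {"registrar": registrar, "type": "CNAME",
            "name": name, "target": target}
```

For example, dns_record_plan("dangerouscentaur.com", "d1234.cloudfront.net") yields a CNAME plan routed through the Namecheap path, while the Route53 domains get alias records.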

Blog Generation Pipeline

The tech blog generator reads Claude Code session transcripts in JSONL format from /Users/cb/.claude/projects/-Users-cb-Documents-repos/ and extracts relevant information:

  • File modifications — Tracks all created and edited files with relative paths
  • Commands executed — Captures CLI operations and AWS API calls (with credential filtering)
  • Tool interactions — Records code generation, refactoring, and debugging sessions
  • Decision rationale — Extracts explanations for architectural choices

The generator filters out sensitive data including AWS credentials, API keys, database passwords, and personal information before publishing. Output is formatted as semantic HTML with proper heading hierarchy for accessibility and SEO.
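
A minimal version of the transcript-extraction step described above might look like the following. The JSONL field names ("tool", "file_path", "command") are assumptions for the sketch and may not match Claude Code's actual transcript schema:

```python
import json
from pathlib import Path

# Illustrative transcript parser; field names are assumed, not Claude Code's
# real JSONL schema.

def extract_events(transcript_path: Path) -> dict:
    """Collect file modifications and shell commands from a session transcript."""
    events = {"files": [], "commands": []}
    for line in transcript_path.read_text().splitlines():
        if not line.strip():
            continue  # skip blank lines between records
        record = json.loads(line)
        tool = record.get("tool")
        if tool in ("Write", "Edit"):
            events["files"].append(record.get("file_path"))
        elif tool == "Bash":
            events["commands"].append(record.get("command"))
    return events
```

The real generator layers the credential filtering and HTML rendering on top of an extraction pass like this one.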

Integration with Site Navigation

The "Ship's Papers" menu in the main navigation (/Users/cb/Documents/repos/sites/queenofsandiego.com/index.html) was updated to include a "Technical Blog" link pointing to tech.queenofsandiego.com. Similar navigation updates were made across all four site properties. This makes the technical documentation discoverable by stakeholders reviewing project progress.

Key Technical Decisions

Why auto-generation instead of manual blogging? Manual documentation creates friction and doesn't scale. By hooking into the development environment itself, every significant action is automatically captured. This ensures consistency and prevents the common problem of outdated documentation.

Why granular detail? High-level summaries hide the decision-making process. Including exact file paths, function names, and infrastructure resource IDs lets engineers like Sergio understand not just what was done but why it was done, which is essential for code review and knowledge transfer.

Why separate domains for each property? Each tech blog is independently owned by the corresponding business entity. This maintains clear separation of concerns and allows fine-grained analytics tracking specific to each operation.

Why filter credentials? The system uses Claude Code's built-in credential filtering mechanisms combined with additional regex-based scrubbing to ensure no secrets leak into public-facing documentation. All API calls are logged with parameter values visible but actual keys redacted.
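
As a rough illustration, the additional regex-based scrubbing pass might resemble the following. The patterns shown are examples only (the AWS access key ID format is real; the rest of the rule set is assumed):

```python
import re

# Example scrub rules; the production rule set is broader than this sketch.
SECRET_PATTERNS = [
    # AWS access key IDs follow a known "AKIA" + 16 uppercase/digit format.
    (re.compile(r"AKIA[0-9A-Z]{16}"), "[REDACTED_AWS_KEY_ID]"),
    (re.compile(r"(?i)(aws_secret_access_key\s*[=:]\s*)\S+"), r"\1[REDACTED]"),
    (re.compile(r"(?i)(api[_-]?key\s*[=:]\s*)\S+"), r"\1[REDACTED]"),
    (re.compile(r"(?i)(password\s*[=:]\s*)\S+"), r"\1[REDACTED]"),
]

def scrub(text: str) -> str:
    """Replace anything that looks like a credential before publishing."""
    for pattern, replacement in SECRET_PATTERNS:
        text = pattern.sub(replacement, text)
    return text
```

Keeping the parameter name visible while redacting only the value preserves the audit trail described above without exposing the secret itself.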

Infrastructure Resources Created

For each tech blog, the following AWS resources were provisioned:

  • S3 bucket named tech-[domain]-blog configured for static website hosting
  • CloudFront distribution with origin pointing to S3, configured for HTTPS with ACM certificate
  • DNS records (Route53 for AWS-managed zones, Namecheap CNAME for dangerouscentaur, GoDaddy CNAME for burialsatseasandiego)
  • CloudFront invalidation rules to ensure fresh content on deployment

Example command executed for QOS tech blog:

aws s3api create-bucket \
  --bucket tech-queenofsandiego-blog \
  --region us-west-2 \
  --create-bucket-configuration LocationConstraint=us-west-2

aws cloudfront create-distribution \
  --distribution-config file://tech-qos-distribution.json

(create-distribution takes the full distribution configuration as a single --distribution-config argument rather than individual flags; the JSON file, named illustratively here, specifies the S3 origin, the tech.queenofsandiego.com alternate domain name, and the ACM certificate ARN.)

Session-Based Publishing Workflow

At the end of each development session, the Stop hook:

  1. Reads the current session transcript from Claude Code's project memory
  2. Filters credentials and sensitive data using regex patterns
  3. Generates an HTML blog post with semantic markup
  4. Determines which domain(s) are affected by the changes
  5. Uploads the post to the appropriate S3 bucket
  6. Invalidates the CloudFront distribution to purge cache
  7. Logs success/failure to /Users/cb/.claude/logs/tech_blog_publish.log

The hook runs asynchronously so it doesn't block session termination. Posts are timestamped and indexed by date for chronological browsing.
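
The upload-and-invalidate steps (5 and 6 above) could be sketched with boto3 roughly as follows; the key layout and helper names are assumptions, not the hook's actual code:

```python
import time

def invalidation_batch(key: str) -> dict:
    """Build a CloudFront invalidation request for a post and the index page."""
    return {
        "Paths": {"Quantity": 2, "Items": [f"/{key}", "/index.html"]},
        "CallerReference": str(time.time()),  # must be unique per request
    }

def publish_post(html: str, bucket: str, distribution_id: str, key: str) -> None:
    """Upload a generated post to S3, then purge the CloudFront cache for it."""
    import boto3  # assumes the AWS SDK for Python is installed and credentials configured

    s3 = boto3.client("s3")
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=html.encode("utf-8"),
        ContentType="text/html",
    )
    cloudfront = boto3.client("cloudfront")
    cloudfront.create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch=invalidation_batch(key),
    )
```

Invalidating the index page alongside the new post keeps the chronological listing fresh without purging the whole distribution.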

What's Next

Future enhancements include: automated email notifications to stakeholders when new posts are published, RSS feed generation for tech blog subscribers, search indexing of technical content, and enhanced analytics to track which engineering decisions drive business outcomes (specifically booking conversions for the sailing operations).

Additionally, addressing the incorrect imagery on the burialsatseasandiego.com fleet pages has been tracked as a priority task. The system is now in place to document such fixes in real-time as they're implemented.

```