Injecting Structured Data Into Concert Event Pages: A Multi-Domain Schema Strategy
This session focused on a critical SEO gap: 12 concert event pages across multiple subdomains were completely missing structured data markup. Without Event and LocalBusiness JSON-LD schemas, search engines couldn't understand what these pages represented, severely limiting their visibility in concert discovery, local search, and rich snippet displays. Here's how we identified the problem, built an automated solution, and deployed it across a distributed infrastructure.
The Problem: Zero Structured Data on High-Value Pages
The event subdomains under queenofsandiego.com host concert and performance pages with rich content: dates, locations, performer details, ticket information, and venue names. Yet when auditing pages like:
- Event pages under /Users/cb/Documents/repos/sites/queenofsandiego.com/rady-shell-events/ subdirectories
- Event pages across paulsimonradyshell.queenofsandiego.com, steelypanradyshell.queenofsandiego.com, and other concert subdomains
None contained <script type="application/ld+json"> markup. This meant Google couldn't parse event dates, venues, or performer information for knowledge panel enrichment or special search result formatting. The fix required both a robust injection script and careful deployment across multiple S3 buckets and CloudFront distributions.
Technical Solution: Automated Schema Injection Script
We created /Users/cb/Documents/repos/tools/inject_structured_data.py to solve this systematically. The script needed to:
- Parse existing HTML files without corrupting them
- Generate valid Event JSON-LD and LocalBusiness schemas from page content
- Insert schemas into the <head> section in the correct location
- Handle multiple event subdomains with different file structures
The approach extracts metadata from HTML (event names from <h1>, dates from content, venue names from page body) and constructs properly formatted JSON-LD blocks before the closing </head> tag. Rather than requiring manual edits to 12 pages, the script processes all pages in a single pass.
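The injection step can be sketched as follows. This is a minimal illustration of the insert-before-`</head>` approach, not the actual inject_structured_data.py; the function name and sample schema are assumptions for the example. Note the idempotency guard, so re-running the script doesn't duplicate markup:

```python
import json

def inject_jsonld(html: str, schema: dict) -> str:
    """Insert a JSON-LD <script> block immediately before </head>.

    Skips pages that already carry ld+json markup so the injection
    stays idempotent across re-runs.
    """
    if 'application/ld+json' in html:
        return html  # already has structured data; leave untouched
    block = (
        '<script type="application/ld+json">\n'
        + json.dumps(schema, indent=2)
        + '\n</script>\n'
    )
    # Inject just before the closing </head> tag, first occurrence only
    return html.replace('</head>', block + '</head>', 1)

page = "<html><head><title>Paul Simon</title></head><body></body></html>"
event = {"@context": "https://schema.org", "@type": "Event", "name": "Paul Simon"}
print(inject_jsonld(page, event))
```

Because the guard makes the operation idempotent, the script can safely be pointed at a directory where some pages were already processed.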
Why this architecture: Structured data injection at build time is risky—it requires touching production markup. By scripting the injection, we create an auditable, repeatable process that can be re-run if schemas need updates, and we can version-control the script itself while keeping page files pristine until final deployment.
Schema Design: Event + LocalBusiness Pattern
Each injected page includes two linked schemas:
Event schema includes:
- name (concert/performance title)
- startDate / endDate (ISO 8601 format)
- location (LocalBusiness reference)
- performer (Person schema with name)
- url (canonical event page URL)
- description (from page content)
LocalBusiness schema includes:
- name (venue name)
- address (venue address from page)
- telephone (box office number if available)
- url (venue website or event page)
This dual-schema approach ensures that both the event itself and the venue are discoverable in search and can be independently featured in knowledge panels or carousel results.
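The linkage works by giving the venue node an `@id` that the event's `location` references. A sketch of the combined payload (all dates, addresses, and phone numbers below are illustrative placeholders, not the real venue details):

```json
[
  {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Paul Simon at the Rady Shell",
    "startDate": "2025-06-14T19:30:00-07:00",
    "endDate": "2025-06-14T22:00:00-07:00",
    "url": "https://paulsimonradyshell.queenofsandiego.com/",
    "description": "An evening concert at the Rady Shell.",
    "performer": { "@type": "Person", "name": "Paul Simon" },
    "location": { "@id": "#venue" }
  },
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "@id": "#venue",
    "name": "The Rady Shell",
    "address": "123 Example Way, San Diego, CA",
    "telephone": "+1-555-0100",
    "url": "https://venue.example.com/"
  }
]
```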
Infrastructure: Multi-Domain Deployment Challenge
The concert pages live across multiple event subdomains, each with its own S3 bucket and CloudFront distribution. The deployment process required:
- S3 Bucket Identification: Locating buckets for each event subdomain (e.g., paulsimonradyshell.queenofsandiego.com maps to a specific S3 bucket used for that domain's static hosting)
- CloudFront Distribution Mapping: Finding the correct distribution ID for each domain to invalidate cached versions after upload
- Batch Upload: Using AWS CLI to sync updated files to the appropriate bucket
- Cache Invalidation: Creating distribution invalidations to ensure users receive updated pages immediately
Rather than manually uploading to 6-8 different buckets, we scripted the process:
```bash
# Example workflow (actual bucket/distribution names differ)
aws s3 sync ./updated_event_pages s3://event-subdomain-bucket/ \
  --exclude "*" \
  --include "*.html" \
  --cache-control "public, max-age=3600"

aws cloudfront create-invalidation \
  --distribution-id ABCD1234EFGH \
  --paths "/*"
```

Note that `--cache-control` (not `--metadata`) is required here: `--metadata` would only set custom `x-amz-meta-*` headers, leaving the actual Cache-Control response header unset.
This approach ensures consistency: all 12 pages receive the same schema treatment, all are uploaded with identical metadata headers, and all caches are cleared simultaneously to prevent stale content.
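Driving those two commands across every subdomain reduces to a small loop over a subdomain-to-infrastructure mapping. A sketch of that wrapper, with a hypothetical mapping (the bucket names and distribution IDs below are placeholders, and the real script may differ):

```python
import subprocess

# Hypothetical subdomain -> (bucket, distribution ID) mapping; real values differ.
DEPLOY_TARGETS = {
    "paulsimonradyshell.queenofsandiego.com": ("s3://paulsimon-events-bucket", "ABCD1234EFGH"),
    "steelypanradyshell.queenofsandiego.com": ("s3://steelypan-events-bucket", "WXYZ5678IJKL"),
}

def deploy(pages_dir: str, dry_run: bool = True) -> list:
    """Build the aws-cli commands for every subdomain; run them unless dry_run."""
    commands = []
    for subdomain, (bucket, dist_id) in DEPLOY_TARGETS.items():
        # Sync only the updated HTML, with an explicit Cache-Control header
        commands.append([
            "aws", "s3", "sync", f"{pages_dir}/{subdomain}", f"{bucket}/",
            "--exclude", "*", "--include", "*.html",
            "--cache-control", "public, max-age=3600",
        ])
        # One wildcard invalidation per distribution clears all cached pages
        commands.append([
            "aws", "cloudfront", "create-invalidation",
            "--distribution-id", dist_id, "--paths", "/*",
        ])
    if not dry_run:
        for cmd in commands:
            subprocess.run(cmd, check=True)
    return commands
```

The dry-run default lets the command list be reviewed (and logged for the audit trail) before anything touches production buckets.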
File Organization and Tracking
During execution, we maintained careful audit trails:
- Original pages remained untouched until final S3 upload
- The injection script output was validated against schema.org specifications
- Progress was logged in the CMO master task dashboard (/tmp/progress_dashboard.html)
- Each deployment batch was recorded with timestamp and distribution invalidation confirmation
The dashboard received 10 updates during this session to track: schema injection completion, S3 upload status, CloudFront invalidations, and validation results.
Key Decisions and Trade-offs
Decision 1: Inject at deployment time, not build time
Building schemas into HTML templates would require modifying core rendering logic. Instead, we inject after pages are finalized, reducing risk of breaking existing page rendering.
Decision 2: Use LocalBusiness + Event (not just Event)
Event schema alone doesn't capture venue information richly enough for venue discovery. The LocalBusiness + Event combination enables both event-centric searches ("concerts near me") and venue-centric searches ("Rady Shell events").
Decision 3: Batch CloudFront invalidations
Rather than invalidate after each page upload, we batch all pages into a single wildcard invalidation /*. This is more cost-effective and ensures all pages reach CDN edge locations simultaneously.
Validation and Next Steps
Post-deployment validation includes:
- Using Google's Rich Results Test on live pages to verify schema parsing
- Checking Google Search Console structured data reports to confirm indexing
- Monitoring click-through rates on event pages in search results over the next 2-4 weeks
- Verifying that venue names and event dates appear in rich snippets
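Before leaning on Google's tooling, a quick local smoke check can confirm each deployed page actually carries both schema types. A minimal sketch (regex-based extraction is an assumption for brevity; a real check might use an HTML parser):

```python
import json
import re

def extract_jsonld(html: str) -> list:
    """Pull every JSON-LD block out of a page and parse it."""
    pattern = re.compile(
        r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
        re.DOTALL,
    )
    schemas = []
    for raw in pattern.findall(html):
        data = json.loads(raw)  # raises ValueError on malformed JSON
        schemas.extend(data if isinstance(data, list) else [data])
    return schemas

def check_event_page(html: str) -> bool:
    """True if the page carries both an Event and a LocalBusiness schema."""
    types = {s.get("@type") for s in extract_jsonld(html)}
    return {"Event", "LocalBusiness"} <= types
```

Running this against each live URL (fetched with any HTTP client) catches missing or malformed blocks before waiting on Search Console reports.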
Future work involves extending this pattern to other high-value pages (booking confirmation pages, venue directories, performance reviews) and integrating schema generation into the continuous deployment pipeline so future pages receive schemas automatically.
This deployment demonstrates a scalable approach to SEO infrastructure: identify gaps systematically, script the solution, test on a bounded set of pages, and deploy consistently across distributed infrastructure using infrastructure-as-code patterns.