Building Production SEO in a 29MB Binary
Case Study: SERVER-SIDE_RENDERING | Stack: Go/SQLite
I recently implemented a complete SEO infrastructure for this site—server-side meta tag injection, Open Graph images, and dynamic metadata—all within a single Go binary. The entire process took one focused session, and the results speak for themselves: 100/100 SEO score on Google PageSpeed Insights.
This is a case study in pragmatic architecture. No microservices, no complex build pipelines, no CDN dependencies for core functionality. Just clean execution guided by a clear mental model of what needed to happen.
The Problem: Invisible to Search Engines
Despite having quality content, my blog posts were effectively invisible to Google and social media platforms. The issue was simple but critical:
// The Source Code Reality
<title>Journal | Hugo Palma</title>
<meta property="og:description" content="">
<!-- No specific metadata, no OG images -->
My JavaScript was updating the browser tab title dynamically, but crawlers can't rely on that. Googlebot can execute JavaScript, but only in a deferred second rendering wave, and social media scrapers don't execute it at all. They see the raw HTML the server sends, so every blog post looked identical: generic title, empty description, no preview images.
The Architecture: Hybrid SSR-SPA
Before diving into the SEO implementation, it's worth understanding the unique architecture of this site. I call it a Hybrid SSR-SPA pattern—it behaves like a Single Page Application for users, but renders like a traditional server-side application for search engines.
// The Hybrid Pattern
- Initial Load: Server renders full HTML with proper SEO headers injected
- Client Navigation: Vanilla JavaScript router handles routing for smooth SPA-like transitions
- Deep Links: Every URL works perfectly when hit directly (bots, social media, bookmarks)
- Modular Templates: Go templates act as injectable modules (index.html + journal.html + jobs.html)
This isn't Next.js or Nuxt. This is vanilla JavaScript + Go templates achieving the same result with zero framework dependencies. The site feels like a SPA to users, but looks like proper SSR to search engines.
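The deep-link half of the pattern can be sketched in a few lines of Go. The handler and template names here are illustrative, not the site's actual code; only the <title> line of the layout is reproduced:

```go
package main

import (
	"html/template"
	"net/http"
	"strings"
)

// pageTmpl stands in for the site's composed index.html layout.
var pageTmpl = template.Must(template.New("index").Parse(
	`<title>{{if .Title}}{{.Title}}{{else}}Hugo Palma{{end}}</title>`))

// renderPage produces the server-side HTML for a route. Bots and
// direct visits get real markup, not an empty shell waiting for JS.
func renderPage(title string) string {
	var b strings.Builder
	pageTmpl.Execute(&b, map[string]interface{}{"Title": title})
	return b.String()
}

// journalHandler covers the deep-link case: any /journal/ URL renders
// complete HTML; the client-side router only takes over afterwards.
func journalHandler(w http.ResponseWriter, r *http.Request) {
	w.Write([]byte(renderPage("Journal | Hugo Palma")))
}
```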
The Infrastructure Context
What makes this architecture even more impressive is what it's running on. This entire system operates on a Linode 1vCPU / 2GB RAM instance that hosts:
- This 29MB Go binary (web server with embedded assets)
- Puppeteer-based job scraper (headless Chrome automation)
- AI-powered job scorer (LLM API integration)
- SQLite databases (agency.db + jobs.db)
Two additional lightweight containers handle email services and NPM builds. That's it. No Kubernetes, no microservices sprawl, no DevOps theater.
The SEO solution needed to fit within this constraint—no separate CDN dependencies, no complex build pipelines, no additional services. Just clean, embedded assets and server-side template injection.
The SEO Implementation: Three Components
With the architectural context established, here's how I implemented production-grade SEO:
- Database Schema: Store metadata as JSON in the metadata column:
{
  "title": "Why I Built This Website",
  "description": "Detailed look at the 'Metal-Up' mandate...",
  "keywords": "Go, SQLite, Performance, Web Development",
  "og_image": "/static/images/og/why-i-built.jpg"
}
- Go Backend: Extract metadata from the database and inject it into the HTML template before sending to the client
- Embedded Assets: OG images bundled directly into the binary via //go:embed
Implementation Details
1. Backend Metadata Extraction
I modified the /journal/ handler in main.go to unmarshal the JSON metadata and populate template variables:
var seoMeta SEOMetadata
seoMeta.Image = "https://hugopalma.work/static/og-image.jpg" // Fallback
if activeBlog != nil && activeBlog.Metadata != "" {
    if err := json.Unmarshal([]byte(activeBlog.Metadata), &seoMeta); err != nil {
        log.Printf("invalid metadata JSON: %v", err) // keep the fallbacks
    }
    if seoMeta.Title != "" {
        pageTitle = seoMeta.Title
    }
}

data := map[string]interface{}{
    "Title":       pageTitle,
    "SeoDesc":     seoMeta.Description,
    "SeoImage":    seoMeta.Image,
    "SeoKeywords": seoMeta.Keywords,
    // ... other template data
}
2. Template Placeholders
The index.html layout already had the placeholders in place:
<title>{{if .Title}}{{.Title}}{{else}}Hugo Palma{{end}}</title>
{{if .SeoDesc}}<meta name="description" content="{{.SeoDesc}}">{{end}}
{{if .SeoKeywords}}<meta name="keywords" content="{{.SeoKeywords}}">{{end}}
<meta property="og:title" content="{{if .Title}}{{.Title}}{{end}}">
<meta property="og:description" content="{{if .SeoDesc}}{{.SeoDesc}}{{end}}">
{{if .SeoImage}}<meta property="og:image" content="{{.SeoImage}}">{{end}}
3. OG Image Pipeline
I created custom thumbnails for each blog post and processed them using ImageMagick:
magick input.png -resize 1200x630! static/images/og/output.jpg
The images were then embedded into the binary automatically via the existing //go:embed static directive. No separate deployment step needed—the images ship with the binary.
4. AI-Assisted Metadata Generation
Here's where the "Architect and AI" philosophy really shines. For blog posts without metadata, I use AI to generate it automatically:
// The Metadata Generation Workflow
- Read the blog post content from the database
- Prompt AI: "Generate SEO metadata for this content in JSON format"
- Specify exact schema: title, description, keywords, og_image
- AI analyzes content and returns structured JSON
- Insert metadata into database automatically
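The workflow above could be sketched like this. askLLM is a placeholder for whatever LLM client the scorer already uses, and the prompt wording is illustrative, not the site's actual prompt:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// askLLM is a placeholder for the real LLM API call.
type askLLM func(prompt string) (string, error)

const schemaHint = `Respond only with JSON: {"title":"...","description":"...","keywords":"...","og_image":"..."}`

// generateMetadata prompts the model for SEO metadata and validates
// the reply as JSON before anything is written to the database.
func generateMetadata(post string, ask askLLM) (map[string]string, error) {
	reply, err := ask(fmt.Sprintf("Generate SEO metadata for this content. %s\n\n%s", schemaHint, post))
	if err != nil {
		return nil, err
	}
	meta := map[string]string{}
	if err := json.Unmarshal([]byte(reply), &meta); err != nil {
		return nil, fmt.Errorf("model returned invalid JSON: %w", err)
	}
	return meta, nil
}
```

Validating the reply before the database insert is the important part: a model that wraps its answer in prose or markdown fences fails fast here instead of poisoning the metadata column.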
This is the same pattern I use throughout the site—I define the structure and constraints, the AI fills in the details. The result is consistent, SEO-optimized metadata without manual writing.
For the OG images, I either generate custom thumbnails (as shown above) or default to a professional headshot. The system handles the fallback logic automatically in main.go.
The Results: Perfect SEO Scores
After deploying the changes, I validated the implementation using Google PageSpeed Insights. The results exceeded expectations:
Figure 1: PageSpeed Insights scores across different pages
Homepage
- Performance: 95
- Accessibility: 88
- Best Practices: 100
- SEO: 100
Blog Post (Why I Built)
- Performance: 90
- Accessibility: 93
- Best Practices: 100
- SEO: 100
Most importantly, the OG images now appear correctly in the PageSpeed preview thumbnails, confirming that social media platforms will display proper link previews when the posts are shared.
Performance Insights
Google flagged a few optimization opportunities:
- Render-blocking requests (640ms): CSS loads synchronously. An acceptable tradeoff for simplicity.
- Inefficient cache lifetimes (528KB of assets): Will be resolved by Cloudflare caching in production mode.
- Unused JavaScript (72KB): Chart.js is loaded globally but only used on one page. Acceptable for code simplicity.
These are all conscious architectural decisions. The site is currently in Cloudflare Development Mode, which bypasses caching. Once production caching is enabled, the performance scores will likely jump to 98-100 across the board.
The "Architect and AI" Dynamic
This implementation perfectly demonstrates the philosophy I outlined in my previous post, "Why I'm Skeptical of AGI, but You Should Use AI." I didn't ask the AI to "build SEO for my site." I came to the session with the architecture already mapped out:
// The Mental Model
- Database stores JSON metadata
- Go handler extracts and injects into templates
- Images embedded via go:embed
- Fallback to profile image if specific OG image missing
- Validate with PageSpeed Insights
The AI acted as a Staff Engineer—handling the syntax, catching edge cases, and executing the implementation. But I remained the Chief Architect, making the structural decisions and ensuring the solution aligned with the "Metal-Up" philosophy.
Key Takeaways
- Server-Side Rendering Matters: Client-side JavaScript is invisible to search engines. If you want proper SEO, the metadata must be in the raw HTML.
- Embedded Assets = Zero Dependencies: By using //go:embed, the OG images ship with the binary. No separate CDN, no deployment coordination issues.
- Pragmatic > Perfect: I could lazy-load Chart.js and shave off 72KB, but the complexity isn't worth it for a 95/100 Performance score.
- Architecture First, AI Second: The AI is a force multiplier, but only if you know exactly what you're building. Without the mental model, you get bloated solutions.
The Stack Efficiency
This entire system—web server, job scraper (Puppeteer), AI scorer, and now production-grade SEO—runs on a single vCPU container. The binary is 29MB. The deployment process is scp and supervisorctl restart.
This is what "Metal-Up" looks like in practice. No Kubernetes, no microservices sprawl, no DevOps theater. Just clean, efficient systems that do exactly what they need to do—and nothing more.
SPA Chart Implementation: A Unique Challenge
Because this site uses the Hybrid SSR-SPA pattern, all blog posts are pre-loaded into the DOM for instant client-side navigation. This creates an interesting challenge for Chart.js implementations:
// The Canvas ID Problem
When multiple blog posts contain charts, they all exist in the DOM simultaneously. If two posts use the same canvas ID (e.g., dataWallChart), JavaScript will only find the first one, breaking charts on other posts.
Solution: Every chart must have a unique canvas ID across all blog posts. This post uses seoScoresChart, the AGI post uses dataWallChart, and future posts will need their own unique IDs.
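One way to enforce that discipline automatically (a hypothetical guard, not part of the site's code) is a quick scan over the combined post HTML for repeated canvas ids:

```go
package main

import "regexp"

// canvasID matches the id attribute of <canvas> tags.
var canvasID = regexp.MustCompile(`<canvas[^>]*\bid="([^"]+)"`)

// findDuplicateCanvasIDs returns canvas ids that appear more than once
// across the combined HTML of all posts, since duplicate ids silently
// break Chart.js lookups when every post is in the DOM at once.
func findDuplicateCanvasIDs(html string) []string {
	seen := map[string]int{}
	for _, m := range canvasID.FindAllStringSubmatch(html, -1) {
		seen[m[1]]++
	}
	var dups []string
	for id, n := range seen {
		if n > 1 {
			dups = append(dups, id)
		}
	}
	return dups
}
```

Run at startup (or in a test) against the rendered post templates, a non-empty result fails the build before a broken chart ever ships.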
This is a perfect example of how architectural decisions cascade. The SPA pattern gives us instant navigation, but requires discipline in how we structure content. The tradeoff is worth it—users get a snappy experience, and we just need to be mindful of global scope.
The site is the resume. The code is the proof.