
You set up meta tags. You generated a sitemap. You installed a Sanity SEO plugin. By every guide on the internet, your Sanity site is "optimized for search." So why are you still guessing which pages are actually ranking?
Most Sanity SEO guides stop at setup. They show you how to add title tags and meta descriptions, how to generate sitemaps, and which plugins to install. That is important work. But setup is only half the equation. The other half — measuring what happens after you publish — is almost never discussed.
This guide covers both halves. We will walk through what a complete Sanity SEO workflow looks like: from content modeling and metadata, through structured data and sitemaps, all the way to pulling Google Search Console data directly into your Sanity Studio so you can see what is working and what is not.
The Sanity SEO ecosystem today
Sanity does not ship with built-in SEO features. Unlike WordPress with Yoast pre-installed, every SEO capability needs to be added intentionally. That is actually a strength — you get exactly what you need, nothing you don't — but it means teams need to be deliberate about what they build.
Here is what the ecosystem looks like.
Metadata plugins
Several community plugins handle the basics:
- sanity-plugin-seo adds ready-made fields for meta titles, descriptions, Open Graph, and Twitter Cards
- sanity-plugin-seofields provides similar functionality with validation rules and no-code setup
- metamanager gives granular control over individual HTML meta tags
For content scoring, SlashSEO offers real-time keyword optimization and readability checks inside the Studio, similar to what Yoast does for WordPress.
Schema design pattern
The standard approach is a reusable SEO object type:
```typescript
import {defineField, defineType} from 'sanity'

export const seoType = defineType({
  name: 'seo',
  title: 'SEO',
  type: 'object',
  fields: [
    defineField({
      name: 'title',
      title: 'Meta Title',
      type: 'string',
      validation: (rule) =>
        rule.max(65).warning('May be truncated in search results'),
    }),
    defineField({
      name: 'description',
      title: 'Meta Description',
      type: 'text',
      rows: 3,
      validation: (rule) =>
        rule.max(160).warning('May be truncated in search results'),
    }),
    defineField({
      name: 'ogImage',
      title: 'Open Graph Image',
      type: 'image',
      options: {hotspot: true},
    }),
    defineField({
      name: 'noIndex',
      title: 'No Index',
      type: 'boolean',
      initialValue: false,
    }),
  ],
})
```

Add this to any document type with a single field definition, and editors get consistent SEO controls across every content type.
Sitemaps and structured data
Sanity does not generate sitemaps. That responsibility falls to the frontend framework. With Next.js App Router, the convention is straightforward:
```typescript
// app/sitemap.ts
import type {MetadataRoute} from 'next'

import {client} from '@/sanity/client' // your configured Sanity client (path may differ)

const baseUrl = 'https://example.com'

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const pages: {slug: string; _updatedAt: string}[] = await client.fetch(`
    *[_type in ["page", "post"] && defined(slug.current)] {
      "slug": slug.current,
      _updatedAt
    }
  `)

  return pages.map((page) => ({
    url: `${baseUrl}/${page.slug}`,
    lastModified: new Date(page._updatedAt),
  }))
}
```

Structured data follows the same pattern: content lives in Sanity, JSON-LD rendering happens on the frontend. This separation of concerns is clean, but it also means SEO lives across two layers that need to stay in sync.
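As a sketch of the frontend half of that pattern, a small helper can map a Sanity post to Article JSON-LD. The field names here (`title`, `excerpt`, `slug`, `_updatedAt`) are assumptions about your schema, not a fixed contract:

```typescript
// Minimal sketch; field names are assumptions about your Sanity schema.
interface PostForJsonLd {
  title: string
  excerpt?: string
  slug: string
  _updatedAt: string
}

// Builds a schema.org Article object from a fetched Sanity post.
function buildArticleJsonLd(post: PostForJsonLd, baseUrl: string) {
  return {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: post.title,
    description: post.excerpt,
    dateModified: post._updatedAt,
    mainEntityOfPage: `${baseUrl}/${post.slug}`,
  }
}
```

In a Next.js page component you would serialize this with `JSON.stringify` into a `<script type="application/ld+json">` tag.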
The piece every guide leaves out
Here is what none of the existing Sanity SEO guides talk about: what happens after you hit publish.
You have written a blog post. You have filled in the meta title, crafted a description under 160 characters, added an OG image, and submitted your sitemap to Google. Now what?
If you want to know whether that post is actually ranking, you need to:
- Open Google Search Console in a separate tab
- Navigate to Performance, filter by your URL
- Check impressions, clicks, average position, and CTR
- Switch back to Sanity Studio
- Repeat for every page you want to review
This is the Sanity SEO feedback loop that does not exist by default. And it matters far more than most teams realize.
Why the feedback loop matters
SEO is not set-and-forget. Content decays. A blog post that ranked on page one six months ago might have quietly dropped to page three. A landing page that was getting 500 clicks per week might now get 50. Without a feedback loop, these problems are invisible until someone happens to notice the traffic graph.
The typical Sanity team discovers content decay in one of two ways: someone manually checks Google Search Console (rare), or a quarterly traffic report reveals the damage after months of decline (too late).
Neither is a good outcome. The gap between "content published" and "content performing" is where most Sanity SEO strategies break down.
The context-switching tax
Even teams that do check Google Search Console regularly pay a steep price in context-switching. Every time an editor moves between the Studio and GSC, they lose context. They need to find the right URL, remember which metrics they were looking at, then translate that back into editorial action.
At scale — 50, 100, 500 pages — this workflow becomes unsustainable. The result is that performance data gets checked less and less frequently, and editorial decisions get made without data.
Connecting Sanity and Google Search Console
This is the gap that PageBridge was built to fill. It syncs Google Search Console data directly into Sanity Studio, so editors see search performance alongside the content they are editing.
Instead of switching tabs, you see clicks, impressions, CTR, and average position in the document sidebar. Instead of manually checking for ranking drops, you get automated alerts. Instead of guessing which pages need attention, you get a prioritized queue.
How it works
PageBridge connects to Google Search Console's API and pulls performance data into a Postgres database that you own. A nightly sync (configurable, with manual sync available) keeps the data current. A configurable URL resolver maps Sanity documents to their public URLs, so the right data appears on the right document.
The key detail: PageBridge never touches your content. It only reads search performance metrics from GSC and makes them available inside Sanity. Your data stays yours.
For setup instructions, see the guide How to Connect Google Search Console with Sanity CMS — PageBridge is open source and free.
What you get inside Sanity Studio
Once connected, each document sidebar shows:
Performance metrics — Clicks, impressions, CTR, and average position for the current page. No tab-switching, no manual lookup. The data is just there while you edit.
Content decay detection — Automatic alerts when a page drops in rankings. Instead of discovering a position drop months later, you see it flagged with severity level and timeframe (for example, "Position dropped from 3 to 12 over 30 days").
Striking distance opportunities — Pages ranking between position 4 and 10 with high impressions get surfaced automatically. These are your highest-leverage optimization targets: a small improvement in title tag or content could push them to page one.
Publishing impact — Before-and-after analysis showing how your most recent edits affected search performance. Did that title change improve CTR? Did the content refresh recover lost positions? Now you can see it without leaving the Studio.
CTR anomaly detection — Flags pages where click-through rate is significantly lower than expected for their ranking position. If you are ranking in position 3 but getting fewer clicks than a typical position-6 result, your title and meta description probably need work.
Keyword intent matching — Define target keywords per page. PageBridge cross-references your intent against actual GSC queries and tells you where the gaps are. It also catches keyword cannibalization, where multiple pages compete for the same query.
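PageBridge's exact detection logic is not shown here, but the idea behind a CTR anomaly check can be sketched as a benchmark comparison. The benchmark CTR values below are illustrative placeholders, not real industry averages:

```typescript
// Illustrative benchmark CTRs by average position; real curves vary by query type.
const BENCHMARK_CTR: Record<number, number> = {
  1: 0.28, 2: 0.15, 3: 0.11, 4: 0.08, 5: 0.07,
  6: 0.05, 7: 0.04, 8: 0.03, 9: 0.03, 10: 0.02,
}

// Flags a page whose CTR falls well below what its ranking position predicts.
function isCtrAnomaly(avgPosition: number, ctr: number, threshold = 0.5): boolean {
  const pos = Math.min(10, Math.max(1, Math.round(avgPosition)))
  const expected = BENCHMARK_CTR[pos]
  return ctr < expected * threshold
}
```

A page at position 3 with a 2% CTR would be flagged; the same position with a 10% CTR would not.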
Team alerts without logins
Not everyone needs to be in Sanity Studio to stay informed. PageBridge sends Slack and Discord alerts when revenue pages start slipping. An editor might not check the dashboard every day, but they will see a notification in their team channel that says a key landing page lost five ranking positions this week.
Building the complete Sanity SEO workflow
With the feedback loop closed, here is what a complete Sanity SEO workflow looks like:
1. Content modeling with SEO built in
Design your schemas with a reusable SEO object. Every content type that generates a public URL should have meta title, description, OG image, and indexing controls. Use validation rules to enforce character limits.
2. Frontend rendering
Render meta tags, Open Graph data, and JSON-LD structured data on the frontend using generateMetadata() in Next.js App Router. Use coalesce() in your GROQ queries to fall back from SEO-specific fields to default title and excerpt fields.
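A sketch of that fallback pattern: the GROQ projection uses coalesce() to prefer SEO fields over defaults, and a small helper shapes the result for generateMetadata(). Field names (seo.title, title, seo.description, excerpt) are assumptions about your schema:

```typescript
// GROQ projection: coalesce() falls back from SEO fields to the defaults.
// Field names are assumptions about your schema.
const pageMetaQuery = `*[_type == "page" && slug.current == $slug][0]{
  "metaTitle": coalesce(seo.title, title),
  "metaDescription": coalesce(seo.description, excerpt),
  "noIndex": seo.noIndex == true
}`

interface PageMeta {
  metaTitle: string
  metaDescription?: string
  noIndex: boolean
}

// Shapes the query result into the object generateMetadata() returns.
function toMetadata(page: PageMeta) {
  return {
    title: page.metaTitle,
    description: page.metaDescription,
    robots: page.noIndex ? {index: false} : undefined,
  }
}
```

In app/[slug]/page.tsx, generateMetadata() would fetch pageMetaQuery with the route's slug and return toMetadata(result).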
3. Technical fundamentals
Generate dynamic sitemaps from Sanity data. Implement canonical URLs for syndicated or variant content. Store redirect rules in Sanity so editors can manage URL changes without developer involvement. Optimize Core Web Vitals with Sanity's CDN, image pipeline, and ISR.
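For the redirect piece, one common sketch (the document shape here is an assumption, not a standard Sanity type) stores source/destination pairs in Sanity and resolves them against the request path:

```typescript
// Assumed shape of a 'redirect' document type an editor would manage in Sanity.
interface RedirectDoc {
  source: string      // e.g. '/old-pricing'
  destination: string // e.g. '/pricing'
  permanent: boolean
}

// Finds the first redirect whose source matches the request path,
// ignoring any trailing slashes.
function resolveRedirect(redirects: RedirectDoc[], path: string): RedirectDoc | undefined {
  const normalized = path.replace(/\/+$/, '') || '/'
  return redirects.find((r) => r.source === normalized)
}
```

In Next.js, you would fetch all redirect documents inside the redirects() hook of next.config (or in middleware) and apply this lookup, so editors can manage URL changes without a deploy.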
4. Performance monitoring with Sanity GSC data
Connect PageBridge to pull Google Search Console data into the Studio. Now every content decision is informed by actual search performance. Editors see what is ranking, what is declining, and what needs attention — right where they work.
5. Ongoing optimization
Use the striking distance queue to prioritize quick wins. Monitor the refresh queue for decaying content. Check publishing impact after every major edit. Review CTR anomalies to improve titles and descriptions. Let the data guide the editorial calendar instead of guessing.
This is the workflow that transforms Sanity SEO from a one-time setup into a continuous improvement loop. Setup gets you indexed. The feedback loop keeps you ranking.
Setup is where most Sanity SEO guides end. But SEO is not a launch checklist — it is an ongoing process that requires visibility into what is actually happening in search. Connecting Google Search Console data to your Sanity Studio closes the gap between publishing and performance, turning every content edit into an informed decision.
Get started with PageBridge — it is open source and free.
