Largest Contentful Paint (LCP) measures how quickly the main content of your Sanity-powered site loads. Poor LCP directly impacts SEO rankings and conversion rates.
Target: LCP under 2.5 seconds.
Good: under 2.5s | Needs Improvement: 2.5-4.0s | Poor: over 4.0s
For general LCP concepts, see the global LCP guide.
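These thresholds can also be applied programmatically when you classify your own field data. A minimal sketch — the function name rateLCP is ours, not part of any library:

```javascript
// Bucket an LCP value (in milliseconds) into the Core Web Vitals ratings above.
// rateLCP is a hypothetical helper, useful when aggregating RUM samples.
function rateLCP(ms) {
  if (ms <= 2500) return 'good'
  if (ms <= 4000) return 'needs-improvement'
  return 'poor'
}
```

For example, rateLCP(1800) returns 'good' and rateLCP(4500) returns 'poor'.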
Sanity-Specific LCP Issues
1. Unoptimized Sanity Images
The most common LCP issue is large images served from Sanity's CDN without the Image API's transformation parameters applied.
Problem: Hero images or featured images loading at full resolution.
Diagnosis:
- Run PageSpeed Insights
- Check if Sanity image is the LCP element
- Look for "Properly size images" warning
Solutions:
A. Use Sanity Image URL Builder
Sanity's Image API supports dynamic image transformation:
import imageUrlBuilder from '@sanity/image-url'
import { client } from '@/lib/sanity.client'

const builder = imageUrlBuilder(client)

export function urlFor(source: any) {
  return builder.image(source)
}

// Usage with optimization
export function HeroImage({ image }) {
  const imageUrl = urlFor(image)
    .width(1920)
    .height(1080)
    .quality(80)
    .format('webp')
    .fit('crop')
    .url()

  return (
    <img
      src={imageUrl}
      alt={image.alt}
      width={1920}
      height={1080}
      loading="eager" // For the LCP image
    />
  )
}
B. Implement Responsive Images with Sanity
Use srcset for different screen sizes:
export function ResponsiveSanityImage({ image, alt }) {
  return (
    <img
      src={urlFor(image).width(1920).format('webp').url()}
      srcSet={`
        ${urlFor(image).width(640).format('webp').url()} 640w,
        ${urlFor(image).width(750).format('webp').url()} 750w,
        ${urlFor(image).width(828).format('webp').url()} 828w,
        ${urlFor(image).width(1080).format('webp').url()} 1080w,
        ${urlFor(image).width(1200).format('webp').url()} 1200w,
        ${urlFor(image).width(1920).format('webp').url()} 1920w
      `}
      sizes="100vw"
      alt={alt || image.alt}
      width={1920}
      height={1080}
      loading="eager"
    />
  )
}
C. Next.js Image Component with Sanity
Use Next.js Image component with Sanity:
import Image from 'next/image'
import { urlFor, getBlurDataURL } from '@/lib/sanity.image'

// next.config.js
module.exports = {
  images: {
    domains: ['cdn.sanity.io'],
    loader: 'custom',
    loaderFile: './lib/sanity-image-loader.ts',
  },
}

// lib/sanity-image-loader.ts
// Next.js passes the already-resolved image URL string as `src`,
// so append Sanity Image API parameters to it directly.
export default function sanityImageLoader({
  src,
  width,
  quality,
}: {
  src: string
  width: number
  quality?: number
}) {
  const url = new URL(src)
  url.searchParams.set('w', String(width))
  url.searchParams.set('q', String(quality || 75))
  url.searchParams.set('auto', 'format')
  return url.toString()
}

// Component
export function SanityImage({ image, priority = false }) {
  return (
    <Image
      src={urlFor(image).url()}
      alt={image.alt}
      width={1920}
      height={1080}
      priority={priority} // true for the LCP image
      quality={80}
      placeholder="blur"
      blurDataURL={getBlurDataURL(image)}
    />
  )
}
D. Generate Blur Placeholders from Sanity
Create low-quality placeholders for better perceived performance:
import { urlFor } from '@/lib/sanity.image'

// Tiny, blurred version of the image to show while the full image loads
export function getBlurDataURL(image: any): string {
  return urlFor(image)
    .width(10)
    .quality(10)
    .blur(50)
    .url()
}

// Or use Sanity's built-in low-quality image placeholder (LQIP),
// a base64 data URL stored in the asset's metadata
export async function getStaticProps({ params }) {
  const query = `*[_type == "post" && slug.current == $slug][0] {
    ...,
    mainImage {
      ...,
      asset-> {
        ...,
        metadata {
          lqip
        }
      }
    }
  }`
  const post = await client.fetch(query, { slug: params.slug })

  return {
    props: {
      post,
      blurDataURL: post.mainImage.asset.metadata.lqip,
    },
  }
}
2. Slow GROQ Query Performance
Waiting for Sanity queries on the client causes LCP delays.
Problem: Content fetched on client-side, blocking render.
Diagnosis:
- Check Network tab for Sanity API calls
- Measure time to first byte (TTFB)
- Look for GROQ queries blocking render
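Before reaching for bigger fixes, it helps to make slow queries visible during development. A minimal sketch of a timing wrapper — timedFetch and the 300ms budget are our own conventions, not a Sanity API:

```javascript
// Hypothetical dev-time helper: wrap any async fetcher and warn when it
// exceeds a latency budget, so slow GROQ queries surface in the console.
async function timedFetch(fetcher, label, budgetMs = 300) {
  const start = Date.now()
  const result = await fetcher()
  const elapsed = Date.now() - start
  if (elapsed > budgetMs) {
    console.warn(`[sanity] ${label} took ${elapsed}ms (budget ${budgetMs}ms)`)
  }
  return result
}
```

Usage: `const post = await timedFetch(() => client.fetch(query, params), 'blog post')`.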
Solutions:
A. Use Static Site Generation (SSG)
Pre-render pages at build time:
Next.js:
// pages/blog/[slug].tsx
import { client } from '@/lib/sanity.client'

export async function getStaticProps({ params }) {
  const query = `*[_type == "post" && slug.current == $slug][0]`
  const post = await client.fetch(query, { slug: params.slug })

  return {
    props: { post },
    revalidate: 3600, // ISR: rebuild every hour
  }
}

export async function getStaticPaths() {
  const query = `*[_type == "post"]{ "slug": slug.current }`
  const posts = await client.fetch(query)

  return {
    paths: posts.map(post => ({ params: { slug: post.slug } })),
    fallback: 'blocking',
  }
}
Gatsby:
// gatsby-node.js
exports.createPages = async ({ graphql, actions }) => {
  const { createPage } = actions

  const result = await graphql(`
    query {
      allSanityPost {
        nodes {
          slug {
            current
          }
        }
      }
    }
  `)

  result.data.allSanityPost.nodes.forEach(post => {
    createPage({
      path: `/blog/${post.slug.current}`,
      component: require.resolve('./src/templates/blog-post.tsx'),
      context: { slug: post.slug.current },
    })
  })
}
B. Optimize GROQ Queries
Fetch only necessary fields:
// Bad - fetches everything
*[_type == "post"][0]

// Good - only needed fields
*[_type == "post"][0] {
  _id,
  title,
  slug,
  publishedAt,
  mainImage {
    asset-> {
      url,
      metadata {
        lqip,
        dimensions
      }
    }
  },
  author-> {
    name,
    image
  }
}
C. Use Sanity's CDN
Enable CDN for faster response times:
import { createClient } from '@sanity/client'

export const client = createClient({
  projectId: process.env.NEXT_PUBLIC_SANITY_PROJECT_ID!,
  dataset: process.env.NEXT_PUBLIC_SANITY_DATASET!,
  apiVersion: '2024-01-01',
  useCdn: true, // Enable CDN for faster reads
})

// Note: authenticated requests bypass the CDN, so keep a separate client
// (with a token and useCdn: false) for writes and draft previews.
D. Implement Query Caching
Cache Sanity responses:
// Simple in-memory cache
const queryCache = new Map()

export async function cachedFetch(query: string, params?: any) {
  const cacheKey = `${query}_${JSON.stringify(params)}`

  if (queryCache.has(cacheKey)) {
    return queryCache.get(cacheKey)
  }

  const result = await client.fetch(query, params)
  queryCache.set(cacheKey, result)

  // Clear cache after 5 minutes
  setTimeout(() => queryCache.delete(cacheKey), 5 * 60 * 1000)

  return result
}
// Redis cache for production
import { createClient as createRedisClient } from 'redis'

const redis = createRedisClient({ url: process.env.REDIS_URL })
await redis.connect() // node-redis v4+ requires an explicit connect

export async function fetchWithRedis(query: string, params?: any) {
  const cacheKey = `sanity:${query}:${JSON.stringify(params)}`

  const cached = await redis.get(cacheKey)
  if (cached) {
    return JSON.parse(cached)
  }

  const result = await client.fetch(query, params)

  // Cache for 1 hour
  await redis.setEx(cacheKey, 3600, JSON.stringify(result))

  return result
}
3. Client-Side Rendering (CSR) Delays
CSR requires additional round trips before rendering content.
Problem: Page waits for JavaScript, then fetches Sanity content, then renders.
Diagnosis:
- Check if content appears after JavaScript loads
- Look for "white screen" before content
- Measure time to interactive (TTI)
Solutions:
A. Use Server-Side Rendering (SSR)
Render content on the server:
Next.js:
// pages/blog/[slug].tsx
export async function getServerSideProps({ params }) {
  const query = `*[_type == "post" && slug.current == $slug][0]`
  const post = await client.fetch(query, { slug: params.slug })

  return {
    props: { post },
  }
}
Nuxt:
<script>
export default {
  async asyncData({ $sanity, params }) {
    const query = `*[_type == "post" && slug.current == $slug][0]`
    const post = await $sanity.fetch(query, { slug: params.slug })
    return { post }
  },
}
</script>
B. Implement Streaming SSR (Next.js App Router)
// app/blog/[slug]/page.tsx
import { Suspense } from 'react'
import { client } from '@/lib/sanity.client'

async function BlogContent({ slug }) {
  const query = `*[_type == "post" && slug.current == $slug][0]`
  const post = await client.fetch(query, { slug })

  return <article>{/* Content */}</article>
}

export default function BlogPage({ params }) {
  return (
    <Suspense fallback={<LoadingSkeleton />}>
      <BlogContent slug={params.slug} />
    </Suspense>
  )
}
4. Large Portable Text Content
Processing large Portable Text on the client is slow.
Problem: Portable Text rendering blocks paint.
Diagnosis:
- Check if LCP element is in Portable Text
- Profile Portable Text processing time
- Look for multiple embedded images/references
Solutions:
A. Process Portable Text on Server
// Server-side (Next.js)
import { toHTML } from '@portabletext/to-html'

export async function getStaticProps() {
  const query = `*[_type == "post"][0]`
  const post = await client.fetch(query)

  // Process on server
  const bodyHtml = toHTML(post.body)

  return {
    props: {
      post,
      bodyHtml,
    },
  }
}

// Client-side
export default function BlogPost({ bodyHtml }) {
  return <div dangerouslySetInnerHTML={{ __html: bodyHtml }} />
}
B. Lazy Load Embedded Content
import { Suspense, lazy } from 'react'
import { PortableText } from '@portabletext/react'

const EmbeddedVideo = lazy(() => import('./EmbeddedVideo'))

const components = {
  types: {
    videoEmbed: ({ value }) => (
      <Suspense fallback={<div>Loading video...</div>}>
        <EmbeddedVideo {...value} />
      </Suspense>
    ),
  },
}

export function PortableTextRenderer({ value }) {
  return <PortableText value={value} components={components} />
}
5. Web Fonts Loading
Custom fonts can delay LCP.
Problem: Waiting for font files to load.
Diagnosis:
- Check if text is LCP element
- Look for font-related delays in waterfall
- Check for FOIT (Flash of Invisible Text)
Solutions:
A. Preload Fonts
// app/layout.tsx
export default function RootLayout({ children }) {
  return (
    <html>
      <head>
        <link
          rel="preload"
          href="/fonts/custom-font.woff2"
          as="font"
          type="font/woff2"
          crossOrigin="anonymous"
        />
      </head>
      <body>{children}</body>
    </html>
  )
}
B. Use font-display: swap
@font-face {
  font-family: 'CustomFont';
  src: url('/fonts/custom-font.woff2') format('woff2');
  font-display: swap; /* Show fallback immediately */
  font-weight: 400;
  font-style: normal;
}
Framework-Specific Optimizations
Next.js Optimizations
Image Optimization:
import Image from 'next/image'
import { urlFor } from '@/lib/sanity.image'

export function SanityImage({ image, ...props }) {
  return (
    <Image
      src={urlFor(image).url()}
      alt={image.alt}
      {...props}
    />
  )
}
Incremental Static Regeneration:
export async function getStaticProps() {
  const data = await fetchSanityData()

  return {
    props: { data },
    revalidate: 60, // Rebuild every 60 seconds
  }
}
Gatsby Optimizations
Gatsby Image Plugin:
// gatsby-config.js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-source-sanity',
      options: {
        projectId: process.env.SANITY_PROJECT_ID,
        dataset: process.env.SANITY_DATASET,
        watchMode: false,
        overlayDrafts: false,
      },
    },
    'gatsby-plugin-image',
  ],
}
Use GatsbyImage:
import { GatsbyImage } from 'gatsby-plugin-image'

export function SanityImage({ image }) {
  return (
    <GatsbyImage
      image={image.asset.gatsbyImageData}
      alt={image.alt}
      loading="eager" // For the LCP image
    />
  )
}
Testing & Monitoring
Test LCP
Tools:
- PageSpeed Insights - Lab and field data
- WebPageTest - Detailed waterfall
- Chrome DevTools - Local testing
- Lighthouse CI - Automated testing
Test Different Pages:
- Homepage
- Blog posts
- Product pages (if e-commerce)
- Dynamic content pages
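Lighthouse CI, listed above, can run these checks on every build. A sketch of a budget config that fails CI when LCP regresses past the 2.5s target — the URLs are placeholders for your own pages:

```javascript
// lighthouserc.js - assumed file name for a Lighthouse CI configuration
module.exports = {
  ci: {
    collect: {
      // Placeholder URLs: point these at your own homepage and templates
      url: ['http://localhost:3000/', 'http://localhost:3000/blog/example-post'],
      numberOfRuns: 3, // Median of several runs reduces noise
    },
    assert: {
      assertions: {
        // Fail the build when LCP exceeds 2500ms
        'largest-contentful-paint': ['error', { maxNumericValue: 2500 }],
      },
    },
  },
}
```

Run it with `npx lhci autorun` after your production build.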
Monitor LCP Over Time
Real User Monitoring:
// Track LCP with the web-vitals library (v3+ renamed getLCP to onLCP)
import { onLCP } from 'web-vitals'

onLCP((metric) => {
  // Send to analytics
  if (window.gtag) {
    window.gtag('event', 'web_vitals', {
      event_category: 'Web Vitals',
      event_label: 'LCP',
      value: Math.round(metric.value),
      non_interaction: true,
    })
  }
})
Quick Wins Checklist
Priority optimizations for immediate LCP improvements:
- Use Sanity Image URL Builder with width/quality parameters
- Add explicit width/height to all Sanity images
- Set loading="eager" on the LCP image
- Preload the LCP image in <head>
- Use SSG or ISR instead of CSR for Sanity content
- Optimize GROQ queries (fetch only needed fields)
- Enable the Sanity CDN (useCdn: true)
- Implement query caching (Redis or in-memory)
- Process Portable Text server-side if possible
- Preload critical fonts
- Use Next.js Image component or gatsby-plugin-image
- Enable WebP format via Sanity Image API
- Test with PageSpeed Insights
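One item above that has no example elsewhere in this guide is preloading the LCP image. A sketch of the markup as a string builder — preloadLinkFor is a hypothetical helper; fetchpriority is a standard HTML attribute:

```javascript
// Build a <link rel="preload"> tag for the hero image so the browser starts
// downloading it before the parser discovers the <img> element.
// preloadLinkFor is a hypothetical helper name.
function preloadLinkFor(imageUrl) {
  return `<link rel="preload" as="image" href="${imageUrl}" fetchpriority="high">`
}
```

Emit the tag in the document head, and use the same width/quality/format parameters as the img src so the preloaded URL matches exactly; a mismatched URL downloads the image twice.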
Advanced Optimizations
Edge Caching
Cache Sanity content at the edge:
// Cloudflare Worker example
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event))
})

async function handleRequest(event) {
  const { request } = event
  const cache = caches.default

  let response = await cache.match(request)

  if (!response) {
    response = await fetch(request)

    // Cache Sanity API responses for 1 hour
    if (request.url.includes('api.sanity.io')) {
      const newResponse = new Response(response.body, response)
      newResponse.headers.set('Cache-Control', 'max-age=3600')
      event.waitUntil(cache.put(request, newResponse.clone()))
      return newResponse // The original body stream has been consumed
    }
  }

  return response
}
Sanity Webhooks for Cache Invalidation
Invalidate cache when Sanity content changes:
// pages/api/sanity-webhook.ts
export default async function handler(req, res) {
  // Verify webhook signature
  // ...

  // Invalidate cache
  await invalidateCache(req.body)

  res.status(200).json({ revalidated: true })
}
// Configure in Sanity Studio
// https://www.sanity.io/docs/webhooks
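Inside the handler, the webhook payload can be mapped to the pages that need revalidating before calling Next.js's res.revalidate. A sketch under the assumption that your webhook's GROQ projection includes _type and slug — pathsToRevalidate is a hypothetical helper, and the routes are placeholders:

```javascript
// Map a Sanity webhook payload to the Next.js paths to revalidate.
// Assumes the webhook projection includes _type and slug (an assumption,
// configured in the Sanity webhook settings, not a default).
function pathsToRevalidate(body) {
  if (body._type === 'post' && body.slug?.current) {
    // A changed post affects its own page and the blog index
    return ['/blog', `/blog/${body.slug.current}`]
  }
  return ['/'] // Fall back to the homepage for other document types
}
```

In the handler: `for (const path of pathsToRevalidate(req.body)) await res.revalidate(path)`.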
When to Hire a Developer
Consider hiring a developer if:
- LCP consistently over 4 seconds after optimizations
- Complex Sanity implementation requires refactoring
- Need advanced caching strategies
- Framework-specific expertise required
- Migration to different framework needed
Next Steps
For general LCP optimization strategies, see LCP Optimization Guide.