title: SEO Best Practices: The Complete Technical Checklist
description: Master the technical SEO fundamentals that drive organic rankings. Covers crawlability, indexation, page experience, structured data, security, and monitoring.
Technical SEO is the foundation that determines whether your content can be discovered, understood, and ranked by search engines. This guide covers every critical area of technical SEO, organized in the order you should address them: crawlability first, then indexation, then on-page signals, then performance.
1. Crawlability
Before Google can rank your pages, Googlebot must be able to find and access them.
robots.txt
Your robots.txt file controls which areas of your site search engines can crawl:
```text
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /search
Disallow: /*?sort=
Disallow: /*?filter=
Sitemap: https://example.com/sitemap-index.xml
```
Verify Googlebot can access your important pages using the URL Inspection tool in Search Console.
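For bulk checks, a quick sketch with Python's standard-library `urllib.robotparser` can verify rules before deployment. Note two assumptions: the stdlib parser implements the original robots.txt spec, not Google's `*` wildcard extension, so the parameter-blocking rules above are omitted here; and it applies the first matching rule rather than the longest match, so a blanket `Allow: /` should not precede `Disallow` lines.

```python
from urllib.robotparser import RobotFileParser

# Simplified rules from the example above; wildcard patterns are omitted
# because urllib.robotparser does not support Google's * extension.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /search
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Ordinary content pages stay crawlable; admin paths are blocked.
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))
print(rp.can_fetch("Googlebot", "https://example.com/admin/users"))
```

This is a pre-deployment sanity check only; Search Console's URL Inspection tool remains the authoritative verification, since it reflects Google's own parsing.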
XML Sitemaps
Submit a well-structured sitemap that contains only indexable URLs. Google ignores priority and changefreq -- only loc and accurate lastmod values matter. Segment large sitemaps by content type (pages, blog posts, products) for better Search Console diagnostics.
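A minimal generator along these lines (a sketch using only the stdlib `xml.etree` module; the entry data is hypothetical) emits just the `loc` and `lastmod` fields that Google actually reads:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build a sitemap from (loc, lastmod) pairs.

    Only loc and lastmod are emitted, since Google ignores
    priority and changefreq. lastmod must be a real modification
    date (W3C format), not a generated timestamp.
    """
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("https://example.com/", "2024-05-01"),
    ("https://example.com/blog/post-a", "2024-04-18"),
])
```

In practice you would run one such generator per content type (pages, posts, products) and list the outputs in a sitemap index file.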
Crawl Depth
Keep important pages within 3 clicks of the homepage. Pages at depth 4+ get crawled less frequently and accumulate less PageRank. Use internal links, breadcrumbs, and hub pages to flatten deep architecture.
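Click depth is just breadth-first search distance from the homepage over the internal-link graph. A sketch, assuming you have an adjacency mapping from a crawl (the site structure below is hypothetical):

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS from the homepage: each page's depth is the minimum
    number of clicks needed to reach it via internal links."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "/": ["/blog/", "/products/"],
    "/blog/": ["/blog/post-a"],
    "/blog/post-a": ["/blog/post-b"],
    "/blog/post-b": ["/blog/post-c"],
}
# Pages at depth 4+ are candidates for new hub-page or breadcrumb links.
deep_pages = [p for p, d in click_depths(site).items() if d >= 4]
```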
Crawl Budget
For sites with 10,000+ pages, crawl budget optimization matters:
- Remove or noindex low-value pages (index bloat)
- Fix soft 404s and redirect chains
- Improve server response time (TTFB < 200ms)
- Block parameter-based URLs that create duplicate content
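Redirect chains from the list above are easy to surface programmatically. A sketch, assuming you have a source-to-target redirect mapping (e.g. exported from a crawler; the URLs here are hypothetical):

```python
def redirect_chains(redirects, max_hops=1):
    """Flag URLs whose redirect path takes more than max_hops hops.

    redirects: mapping of source URL -> immediate redirect target.
    A `seen` set guards against redirect loops.
    """
    flagged = {}
    for start in redirects:
        hops, current, seen = 0, start, set()
        while current in redirects and current not in seen:
            seen.add(current)
            current = redirects[current]
            hops += 1
        if hops > max_hops:
            flagged[start] = hops
    return flagged

chained = redirect_chains({
    "/old": "/older",     # /old -> /older -> /new: a 2-hop chain
    "/older": "/new",
    "/legacy": "/home",   # single hop: fine
})
```

Chains waste crawl budget on intermediate fetches; the fix is to point every source directly at the final destination.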
2. Indexation
Getting crawled is step one. Getting indexed is step two.
Canonical Tags
Every page needs a self-referencing canonical tag. When duplicate or near-duplicate content exists, use canonicals to consolidate:
```html
<link rel="canonical" href="https://example.com/preferred-url">
```
Meta Robots
Use noindex to keep low-value pages out of the index:
```html
<meta name="robots" content="noindex, follow">
```
Typical noindex candidates: internal search results, thin tag pages, paginated archives beyond page 3, and thank-you pages.
Index Bloat Prevention
Compare your intended indexable page count with what Google actually indexes (Search Console > Pages). If indexed count exceeds your target by 2x or more, you have bloat that dilutes crawl budget and site quality signals.
Orphan Page Detection
Pages with zero internal links are invisible to crawlers. Run monthly crawl-vs-sitemap comparisons to detect orphaned content and add internal links.
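The comparison itself is a set difference: pages in your sitemap that never appear as a link target in the crawl. A sketch, with hypothetical crawl data:

```python
def find_orphans(sitemap_urls, internal_links):
    """Sitemap pages that receive zero internal links.

    internal_links: mapping of page -> set of pages it links to,
    as produced by a site crawl.
    """
    linked = set()
    for targetsts in ():
        pass  # placeholder removed below
    linked = set()
    for targets in internal_links.values():
        linked.update(targets)
    return sorted(set(sitemap_urls) - linked)

orphans = find_orphans(
    ["/", "/about", "/blog/post-a", "/blog/post-b"],
    {"/": {"/about", "/blog/post-a"}, "/blog/post-a": {"/"}},
)
# /blog/post-b is in the sitemap but no page links to it.
```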
3. On-Page Optimization
Title Tags
- 50-60 characters (Google truncates titles at roughly 580px of display width)
- Include primary keyword near the beginning
- Each page must have a unique title
- Action-oriented and descriptive, not keyword-stuffed
Meta Descriptions
- 120-155 characters
- Include a clear value proposition and call to action
- Target the search query the user typed
- Unique per page (duplicate descriptions are treated as missing)
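Both length rules above are mechanical enough to enforce in a build or audit step. A minimal sketch (character counts only approximate Google's pixel-based truncation, so treat results as warnings, not hard failures):

```python
def audit_meta(title, description):
    """Flag titles and meta descriptions outside the ranges above.

    Returns a list of human-readable issues; empty means both pass.
    """
    issues = []
    if not 50 <= len(title) <= 60:
        issues.append(f"title length {len(title)} outside 50-60 characters")
    if not 120 <= len(description) <= 155:
        issues.append(
            f"description length {len(description)} outside 120-155 characters"
        )
    return issues

problems = audit_meta("Too short", "Also too short")
```

Run this across every page template plus a duplicate check (identical titles or descriptions on multiple URLs) for full coverage of the checklist above.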
Heading Hierarchy
Use a single <h1> per page. Maintain a logical hierarchy (h1 > h2 > h3) without skipping levels. Headings help both users and crawlers understand content structure.
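Both rules can be checked with the stdlib `html.parser` module; a sketch (the markup fed in below is hypothetical):

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect heading levels in document order and report
    hierarchy problems: missing/multiple h1s or skipped levels."""

    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # Matches h1..h6; excludes tags like <hr> or <header>.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

    def issues(self):
        problems = []
        if self.levels.count(1) != 1:
            problems.append("page should have exactly one <h1>")
        for prev, cur in zip(self.levels, self.levels[1:]):
            if cur > prev + 1:
                problems.append(f"skipped level: h{prev} -> h{cur}")
        return problems

audit = HeadingAudit()
audit.feed("<h1>Guide</h1><h2>Part One</h2><h4>Detail</h4>")
```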
Content Quality
Google's Helpful Content system evaluates whether content is written for humans or for search engines. Characteristics of content that ranks well:
- Demonstrates first-hand experience (E-E-A-T)
- Provides comprehensive coverage that satisfies the search intent
- Includes specific data, examples, and actionable steps
- Is regularly updated when information changes
4. Page Experience
Core Web Vitals
The three metrics that matter:
| Metric | Good Threshold | What It Measures |
|---|---|---|
| LCP | < 2.5s | Loading speed (largest element) |
| INP | < 200ms | Interactivity (response to clicks/taps) |
| CLS | < 0.1 | Visual stability (layout shifts) |
Optimize LCP by preloading hero images, deferring non-critical JavaScript, and inlining critical CSS. Fix CLS by setting explicit image dimensions and reserving space for ads and embeds.
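When wiring field data (e.g. from the CrUX API or your own RUM pipeline) into alerting, the table above reduces to a simple threshold check. A sketch, with the "good" bounds hard-coded from the table:

```python
# "Good" upper bounds from the Core Web Vitals table above.
CWV_THRESHOLDS = {"lcp": 2.5, "inp": 200, "cls": 0.1}

def failing_cwv(lcp_s, inp_ms, cls):
    """Return the metrics that miss their 'good' threshold.

    Units: LCP in seconds, INP in milliseconds, CLS unitless.
    """
    values = {"lcp": lcp_s, "inp": inp_ms, "cls": cls}
    return [m for m, v in values.items() if v >= CWV_THRESHOLDS[m]]

# A page with slow loading but fine interactivity and stability:
regressions = failing_cwv(lcp_s=3.1, inp_ms=150, cls=0.02)
```

In production you would assess the 75th-percentile field value per metric, which is what Google evaluates, rather than a single measurement.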
Mobile Usability
Google uses mobile-first indexing. Your mobile experience is your ranking experience. Test at 375px viewport width and verify:
- No horizontal scroll
- Tap targets at least 48x48px
- Text readable without zooming (16px minimum)
- No content hidden behind interstitials
HTTPS
HTTPS is a confirmed ranking signal. Use HSTS with preloading to enforce secure connections and eliminate mixed content warnings.
5. Structured Data
Implement JSON-LD structured data to earn rich results in search:
- BreadcrumbList: On every page for navigation display
- Article: On blog posts and news content
- Product: On e-commerce pages with price and availability
- FAQ: On pages with genuine question-and-answer content
- LocalBusiness: On local business landing pages
Validate with Google's Rich Results Test and monitor in Search Console under Enhancements.
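Generating JSON-LD from structured page data keeps it in sync with the visible content. A sketch for BreadcrumbList, the one type above that belongs on every page (breadcrumb data is hypothetical; the output goes inside a `<script type="application/ld+json">` tag):

```python
import json

def breadcrumb_jsonld(crumbs):
    """Build a schema.org BreadcrumbList payload from (name, url) pairs.

    Positions are 1-based, per the BreadcrumbList specification.
    """
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    }, indent=2)

payload = breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Blog", "https://example.com/blog/"),
])
```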
6. Security
Security headers help prevent the attacks that lead to deindexation:
```text
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
Content-Security-Policy: default-src 'self'; script-src 'self' https://www.googletagmanager.com
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
Referrer-Policy: strict-origin-when-cross-origin
```
A hacked site injected with spam content gets deindexed. Prevention through security headers is far easier than recovery.
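The header set above can be audited automatically against any response-headers mapping (e.g. from `requests` or `httpx`); a minimal sketch with a hypothetical response:

```python
# Header names from the recommended set above (case-insensitive).
REQUIRED_HEADERS = {
    "strict-transport-security",
    "content-security-policy",
    "x-content-type-options",
    "x-frame-options",
    "referrer-policy",
}

def missing_security_headers(response_headers):
    """Return the recommended security headers absent from a response."""
    present = {name.lower() for name in response_headers}
    return sorted(REQUIRED_HEADERS - present)

gaps = missing_security_headers({
    "Content-Type": "text/html",
    "Strict-Transport-Security": "max-age=31536000",
})
```

A scheduled run of a check like this catches header regressions from infrastructure changes between the monthly securityheaders.com scans.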
7. Link Building and Authority
Internal Linking
- Every page needs at least 2-3 contextual internal links
- Link from high-authority pages to pages that need ranking support
- Use descriptive anchor text that matches the target page's topic
- Build topic clusters with hub pages linking to related content
External Link Quality
Focus on earning editorial backlinks through:
- Original research and data
- Comprehensive guides that become reference resources
- HARO/Connectively responses for journalist quotes
- Guest contributions on industry publications
8. Monitoring and Maintenance
Weekly Checks
- Search Console Performance: Compare week-over-week impressions and clicks
- Index coverage: Watch for new errors or drops in indexed page count
- Core Web Vitals: Monitor for regressions from deployments
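The week-over-week comparison is simple enough to automate once Search Console data is exported via its API. A sketch of the core calculation (the alert threshold is an illustrative choice, not a Google recommendation):

```python
def wow_change(current, previous):
    """Week-over-week percentage change for a metric such as clicks."""
    if previous == 0:
        return float("inf") if current else 0.0
    return (current - previous) / previous * 100

# Example: clicks fell from 1000 last week to 850 this week.
delta = wow_change(current=850, previous=1000)
needs_review = delta <= -10  # flag drops of 10%+ for investigation
```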
Monthly Checks
- Full site crawl with Screaming Frog: Check for broken links, redirect chains, orphan pages
- Security headers scan at securityheaders.com
- Backlink profile review for toxic or lost links
Quarterly Checks
- Content audit: Identify pages with declining traffic for refresh
- Competitor analysis: Track ranking changes and content gaps
- Technical debt review: Address accumulated SEO issues
Priority Order for New Sites
If you are starting from scratch, address these in order:
- HTTPS + security headers (day 1)
- robots.txt + XML sitemap (day 1)
- Canonical tags + meta robots (week 1)
- Title tags + meta descriptions (week 1)
- Core Web Vitals optimization (week 2)
- Structured data implementation (week 2)
- Internal linking structure (week 3)
- Content quality and E-E-A-T signals (ongoing)
- Monitoring and alerting setup (week 4)
Every element in this guide is measurable and auditable. The difference between sites that rank and sites that do not is rarely about one factor. It is about consistently executing across all of them.