Technical SEO errors are quiet. Unlike a 404 page or a broken form, they rarely announce themselves. A misconfigured robots.txt can silently block Googlebot from your entire site for weeks before you notice a rankings drop. Here are the 10 checks that should be part of every site audit.
1. Title Tag — Present, Unique, and the Right Length
A missing title tag is an automatic Critical deduction in your audit score. Duplicate titles across pages leave search engines unsure which page to rank for a given query. Aim for 30–60 characters: short enough to display without truncation in most SERPs.
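In markup, that means one unique, descriptive title in the document head. A minimal sketch (the page name and "Acme" brand are placeholders):

```html
<head>
  <!-- Unique per page, primary keyword first, within the 30-60 character range -->
  <title>Technical SEO Checklist: 10 Audit Essentials | Acme</title>
</head>
```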
2. Meta Description
While not a direct ranking factor, the meta description drives click-through rate. A missing description lets Google auto-generate one — usually poorly. Keep it under 160 characters and include your target keyword.
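A hypothetical description for an article like this one, written as a single meta tag and kept under the 160-character cutoff:

```html
<meta name="description" content="Find and fix the 10 most common technical SEO errors, from missing title tags to a misconfigured robots.txt, before they cost you rankings.">
```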
3. H1 Tag — One Per Page
Multiple H1s are not an error in HTML5, but they dilute topical authority signals. Best practice: one H1 per page, containing your primary keyword and closely matching the title tag.
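Putting the title and H1 rules together, a page skeleton might look like this (names are illustrative):

```html
<head>
  <title>Technical SEO Checklist: 10 Audit Essentials | Acme</title>
</head>
<body>
  <!-- Exactly one H1, echoing the title's primary keyword without duplicating it verbatim -->
  <h1>The 10-Point Technical SEO Audit Checklist</h1>
</body>
```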
4. Canonical Tag
Canonical tags tell search engines which URL is the "original" when duplicate content exists across multiple URLs (e.g., ?ref= tracking parameters, HTTPS vs HTTP, trailing slash variants).
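For example, a page reached via a tracking parameter can point crawlers back at the clean URL (domain and path are placeholders):

```html
<!-- On https://example.com/guide?ref=newsletter, consolidate signals to the canonical URL -->
<link rel="canonical" href="https://example.com/guide" />
```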
5. robots.txt
A single incorrectly placed `Disallow: /` line blocks all crawlers from your entire site. Verify your robots.txt is accessible, valid, and doesn't accidentally block Googlebot.
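A safe baseline scopes disallows to specific private paths and never uses a bare `Disallow: /` (the `/admin/` path here is an example):

```text
# https://example.com/robots.txt
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```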
6. XML Sitemap
Sitemaps accelerate indexing of new and updated pages. Your sitemap should list only canonical, indexable URLs — not noindex pages, pagination variants, or redirect chains.
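A minimal sitemap in the sitemaps.org XML format, with one canonical URL entry (URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List canonical, indexable URLs only -->
  <url>
    <loc>https://example.com/guide</loc>
    <lastmod>2026-01-15</lastmod>
  </url>
</urlset>
```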
7. Open Graph Tags
OG tags control how your content appears when shared on social media and in AI-generated previews. At minimum: og:title, og:description, og:image, and og:url.
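The four minimum tags, as they might look for a page like this (all values are placeholders):

```html
<meta property="og:title" content="The 10-Point Technical SEO Checklist" />
<meta property="og:description" content="Find and fix the most common technical SEO errors before they cost you rankings." />
<meta property="og:image" content="https://example.com/images/seo-checklist.png" />
<meta property="og:url" content="https://example.com/technical-seo-checklist" />
```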
8. Structured Data
JSON-LD structured data enables rich results in Google Search (star ratings, FAQs, breadcrumbs, how-to steps). It also dramatically improves your GEO Score by giving AI engines structured, machine-readable content to cite.
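A short JSON-LD block using schema.org's FAQPage type, which is one of the rich-result formats mentioned above (the question and answer are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is the ideal title tag length?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Roughly 30 to 60 characters, so the title displays without truncation in most SERPs."
    }
  }]
}
</script>
```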
9. HTTPS
HTTPS has been a Google ranking signal since 2014. In 2026 it also affects user trust, access to browser features that require a secure context, and whether modern browsers load your resources without mixed-content warnings.
10. llms.txt
The /llms.txt file aims to be for AI crawlers what robots.txt is for traditional search bots. It signals to GPTBot, PerplexityBot, and others which content you want surfaced for training and citation. Sites without it risk missing a growing share of AI-driven referral traffic.
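Per the llmstxt.org proposal, /llms.txt is a Markdown file: an H1 site title, a short blockquote summary, then sections of annotated links. A minimal sketch (site name, paths, and descriptions are placeholders):

```markdown
# Acme

> Acme builds site-audit tooling. This file lists our most citation-worthy pages for AI crawlers.

## Guides

- [Technical SEO Checklist](https://example.com/technical-seo-checklist): the 10 audit checks covered on this site
```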