What Site Audit does
Site Audit crawls your website pages and generates:
- A site health score (overall quality indicator)
- A prioritized issues list (what to fix first)
- Page-level details (meta, canonical, status codes, content metrics)
- Crawl configuration options (rendering, JavaScript, crawl limits)
Best use cases
- New client onboarding audit (first 10 to 100 pages)
- Finding duplicate meta, canonical, redirect chains, broken links
- Checking clean URL patterns (slug rules)
- Validating indexability signals (canonical, duplicates)
- Basic content quality signals (word count, readability, text ratio)
- Comparing before and after improvements (run again after fixes)
Quick start
- Log in
- Connect Google Search Console (recommended)
- Select your Google account
- Choose the property (domain or URL-prefix)
- Configure crawl settings
- Enable browser rendering if your site is JS-heavy
- Enable JavaScript if content loads dynamically
- Set crawl limit (start small, scale up later)
- Run the audit
- Wait for crawling and processing to complete
- Review results
- Check score
- Open the Issues section
- Review page-level details and affected URLs
- Fix the highest impact items first
- Re-run to confirm improvements
Crawl settings explained
Browser rendering
Use this when:
- Your site is built with React, Next.js, Vue, Angular
- Content appears only after the page loads
- You want the crawler to see what a real browser sees
Enable JavaScript
Use this when:
- Important text, links, or metadata are injected by JS
- You want to detect issues that only appear after hydration
Max pages (crawl limit)
Recommended approach:
- Phase 1: 10 pages (smoke test)
- Phase 2: 100 pages (quick wins, top templates)
- Phase 3: 1,000+ pages (full audit, large site)
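The crawl limit simply caps how many pages the crawler visits in one run. As an illustration only (not the product's actual implementation), a breadth-first crawl with a max-pages cap can be sketched like this, with a pluggable `fetch_links` function standing in for real HTTP fetching:

```python
from collections import deque
from typing import Callable, Dict, List

def crawl(start: str, fetch_links: Callable[[str], List[str]], max_pages: int) -> List[str]:
    """Breadth-first crawl that stops once max_pages URLs have been visited."""
    seen = {start}
    queue = deque([start])
    visited: List[str] = []
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        visited.append(url)
        for link in fetch_links(url):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return visited

# Phase 1 smoke test against a tiny in-memory link graph (stand-in for a real site):
site: Dict[str, List[str]] = {"/": ["/a", "/b"], "/a": ["/c"], "/b": [], "/c": []}
pages = crawl("/", lambda u: site.get(u, []), max_pages=10)
```

Raising `max_pages` in phases keeps early runs fast while you validate settings.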
What you get in the report
1) Site Score
A summarized health indicator based on detected problems and their severity. How to use it:
- Treat it as a directional metric
- Your goal is steady improvement, not perfection in one pass
2) Issues list (prioritized)
Issues are grouped and ranked by impact. Common issue types:
- Duplicate meta titles
- Duplicate meta descriptions
- Non-SEO-friendly URLs (slug issues)
- Canonical problems
- Redirect and status code issues
- Duplicate URLs or near-duplicates
- Thin content signals (very low word count)
- Broken internal links
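Several of these checks boil down to grouping pages by a shared attribute. As a minimal sketch (the field names and data shape are assumptions, not the tool's export format), duplicate meta titles can be found by grouping URLs per title:

```python
from collections import defaultdict
from typing import Dict, List

def duplicate_titles(pages: Dict[str, str]) -> Dict[str, List[str]]:
    """Group URLs by meta title and keep only titles used by more than one URL."""
    groups: Dict[str, List[str]] = defaultdict(list)
    for url, title in pages.items():
        groups[title].append(url)
    return {title: urls for title, urls in groups.items() if len(urls) > 1}

crawled = {
    "/shoes": "Buy Shoes | Shop",
    "/boots": "Buy Shoes | Shop",   # duplicate title
    "/about": "About Us | Shop",
}
dupes = duplicate_titles(crawled)
```

The same pattern works for duplicate descriptions or canonical targets.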
3) Page and content metrics
Per page, you can review:
- Word count
- Text ratio (text vs markup)
- Readability signals
- Link counts (internal and external)
- Status codes and redirects
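To make word count and text ratio concrete: text ratio compares visible text length to total markup length. A rough sketch using only the standard library (the tool's own extraction is likely more sophisticated):

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1
    def handle_data(self, data):
        if not self.skip:
            self.parts.append(data)

def page_metrics(html: str) -> dict:
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(" ".join(parser.parts).split())
    words = re.findall(r"\w+", text)
    return {
        "word_count": len(words),
        "text_ratio": round(len(text) / max(len(html), 1), 3),
    }

m = page_metrics("<html><body><p>Hello wide world</p></body></html>")
```

A very low text ratio often points at boilerplate-heavy templates or thin content.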
How to prioritize fixes (recommended order)
Priority 1 (highest impact)
- 4xx pages (broken pages)
- 5xx pages (server issues)
- Redirect chains and loops
- Canonical mismatches that cause duplication
- Duplicate titles on important pages
Priority 2 (high ROI)
- Duplicate meta descriptions on important pages
- Messy URL patterns (query-string spam, unreadable slugs, unnecessary parameters)
- Missing or inconsistent canonical tags
Priority 3 (quality improvements)
- Very thin pages that should rank
- Low readability pages in informational sections
- Pages with weak internal linking
Fix guidance (fast explanations)
Duplicate meta title
Why it matters:
- Confuses search engines about which page should rank
- Can reduce CTR if SERP titles look repetitive
Fix:
- Write unique titles per page template
- Ensure category, product, and blog templates interpolate unique page fields into titles
Duplicate meta description
Fix:
- Use page-specific summaries
- For templates, auto-generate with unique page fields
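"Auto-generate with unique page fields" means composing the title and description from data each page already has. A minimal sketch with a hypothetical product template (field names are illustrative):

```python
def meta_for_product(name: str, category: str, brand: str) -> dict:
    """Build a unique title and description from page-specific fields."""
    return {
        "title": f"{name} | {category} | {brand}",
        "description": (
            f"Shop {name} in our {category} range. "
            f"Fast shipping from {brand}."
        ),
    }

meta = meta_for_product("Trail Runner 5", "Running Shoes", "Acme")
```

Because every product has a distinct name, each rendered page gets distinct meta tags without manual writing.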
Canonical issues
Symptoms:
- Multiple URLs show the same content
- Canonical points to the wrong page, or is missing entirely
Fix:
- Ensure one canonical URL per piece of content
- Normalize trailing slash rules
- Control URL parameters (filter pages, tracking params)
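Normalizing trailing slashes and stripping tracking parameters can be expressed as a pure URL transform. A sketch using the standard library (the tracking-parameter list is an assumption; extend it for your stack):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def canonical_url(url: str) -> str:
    """Lowercase scheme/host, drop tracking params and fragments, strip trailing slash."""
    parts = urlsplit(url)
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query)
                       if k not in TRACKING_PARAMS])
    path = parts.path.rstrip("/") or "/"
    # Path case is preserved on purpose: slugs may be case-sensitive on some servers.
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(), path, query, ""))

c = canonical_url("HTTPS://Example.com/Shoes/?utm_source=news&color=red")
```

Emitting this normalized form as the `rel="canonical"` target keeps parameter and slash variants from competing with each other.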
Non-SEO-friendly URLs
Examples:
- Random IDs, unreadable slugs
- Long query strings for indexable pages
Fix:
- Use short, descriptive slugs
- Remove unnecessary params
- Standardize URL structure across templates
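A "short, descriptive slug" is usually generated from the page title. A common slugify sketch (one reasonable set of rules, not the only valid one):

```python
import re
import unicodedata

def slugify(title: str, max_len: int = 60) -> str:
    """Turn a page title into a short, readable, ASCII-only URL slug."""
    # Fold accented characters to their ASCII base (É -> E), drop the rest.
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    # Lowercase, collapse every run of non-alphanumerics into a single hyphen.
    text = re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")
    return text[:max_len].rstrip("-")

s = slugify("10 Best Running Shoes (2024 Édition)!")
```

Applying one slug rule across all templates is what keeps the URL structure standardized.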
Redirect issues
Fix:
- Update internal links to point at the final destination
- Avoid chains; keep redirects to a single hop
- Use 301 for permanent moves
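Collapsing a chain to a single hop means mapping every redirecting URL straight to its final destination. Given a source-to-target redirect map (an assumed data shape, e.g. exported from your server config), a sketch:

```python
from typing import Dict

def flatten_redirects(redirects: Dict[str, str]) -> Dict[str, str]:
    """Map each redirecting URL directly to its final destination, collapsing chains."""
    final: Dict[str, str] = {}
    for start in redirects:
        seen, url = set(), start
        while url in redirects and url not in seen:  # seen-set guards against loops
            seen.add(url)
            url = redirects[url]
        final[start] = url
    return final

chain = {"/old": "/older", "/older": "/new"}
flat = flatten_redirects(chain)
```

After flattening, `/old` redirects to `/new` in one 301 hop instead of two.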
Recommended workflow for agencies and freelancers
- Run 10-page audit first
- Fix obvious blockers (status codes, canonical disasters)
- Run 100-page audit
- Fix template-level issues (meta duplication, URL rules)
- Run full crawl (1,000+)
- Export findings into tasks for the dev team
- Re-run audits after each fix batch
Troubleshooting
Crawl shows missing content
- Enable browser rendering
- Enable JavaScript
- Increase crawl delay if available
- Confirm pages are not blocked by robots.txt or WAF
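To confirm a page is not blocked by robots.txt, Python's standard library can evaluate the rules for you. A quick sketch (the rules here are fed inline for illustration; in practice point `set_url()` at your live robots.txt and call `read()`):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Inline rules for illustration; normally: rp.set_url("https://example.com/robots.txt"); rp.read()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# "MyAuditBot" is a hypothetical crawler user-agent.
blocked = not rp.can_fetch("MyAuditBot", "https://example.com/admin/login")
ok = rp.can_fetch("MyAuditBot", "https://example.com/products/shoes")
```

If the audited pages come back blocked here, fix robots.txt (or the WAF rules) before re-running the crawl.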
Audit is slow
- Reduce max pages
- Turn off JS if site is mostly static
- Start with a smaller sample (top templates)
Results show many duplicates
- Check trailing slash consistency
- Check www vs non-www
- Check HTTP vs HTTPS
- Check parameters and filter URLs
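All four checks above amount to asking: which crawled URLs differ only in scheme, `www`, or trailing slash? A grouping sketch (requires Python 3.9+ for `str.removeprefix`):

```python
from collections import defaultdict
from typing import Dict, List, Tuple
from urllib.parse import urlsplit

def duplicate_groups(urls: List[str]) -> List[List[str]]:
    """Group URLs that differ only by scheme, www prefix, or trailing slash."""
    groups: Dict[Tuple[str, str, str], List[str]] = defaultdict(list)
    for url in urls:
        parts = urlsplit(url)
        host = parts.netloc.lower().removeprefix("www.")
        path = parts.path.rstrip("/") or "/"
        groups[(host, path, parts.query)].append(url)
    return [group for group in groups.values() if len(group) > 1]

dupes = duplicate_groups([
    "http://example.com/shoes",
    "https://www.example.com/shoes/",
    "https://example.com/boots",
])
```

Each returned group is one logical page reachable under several addresses; pick one variant as canonical and redirect the rest.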
FAQ
Can I run audits without GSC?
Yes. Connecting GSC adds useful context, but crawling works without it.
How often should I run audits?
- After major releases
- Weekly for fast-moving sites
- Monthly for stable sites
- Immediately after fixing technical issues to confirm improvements
What is the ideal first audit size?
10 pages. It helps you validate settings before you crawl 1,000 pages.
Changelog notes (optional to maintain)
- v1: Initial Site Audit module documentation
- v1.1: Added JS and rendering guidance
- v1.2: Added prioritization framework