Google updates its search algorithm thousands of times every year - and organic search still drives 53% of all website traffic. If your site has hidden technical problems, those updates can quietly push your pages off the first page of results without any warning.
An SEO audit is a full health check for your website. Think of it like taking your car in for a service - a mechanic inspects every part to find what is worn, broken, or running below its best. An SEO audit does the same thing for your site, checking whether search engines can find your pages, read them properly, and decide they are worth showing to people.
Search engines use automated programs called bots to scan websites. When something goes wrong - a broken link, a slow-loading page, or a missing instruction file - those bots can get stuck or skip your content entirely. An audit finds exactly where those problems are hiding.
This checklist walks you through every layer of your website's health. You will start by learning how search engines scan your site and how to check whether your pages are actually showing up in Google's index. From there, you will measure how fast your pages load and fix anything that is slowing them down.
You will then look at the words and structure on each page - things like titles, headings, and descriptions - to make sure they clearly tell both readers and search engines what your content is about. After that, you will check the links pointing to your site from other websites, remove any harmful ones, and strengthen the links connecting your own pages together.
Finally, if your business serves a local area, you will learn how to make sure your details appear correctly in local search results and on Google Maps.
None of this requires a computer science degree. Every step in this guide is written for someone starting from zero. By the end, you will have a clear picture of what is holding your site back - and exactly what to fix first.
Inspecting Your Index Status with Google
Google Search Console's URL Inspection Tool shows you exactly whether Google has saved a specific page in its index - the giant library of web pages it draws on to serve search results. If your page isn't in that library, it simply won't appear when anyone searches.
Paste any URL from your site into the tool and hit enter. Google returns one of two verdicts: indexed (Google knows about it and can show it) or not indexed (Google either hasn't found it or chose to ignore it).
Honestly, most beginners are surprised by what they find here. Pages they assumed were live and visible turn out to be completely invisible to Google.
- Open Google Search Console - Go to search.google.com/search-console and select your property. You need to have already verified ownership of your site.
- Paste Your URL - Type or paste the full page address into the inspection bar at the top. Use the exact URL, including https://.
- Read the Index Status - Google tells you whether the page is indexed. If it isn't, it shows a reason - blocked by robots.txt, a noindex tag, or a crawl error.
- Check Your Robots.txt File - Navigate to yourdomain.com/robots.txt in your browser. Visually scan it for any rules that block CSS, JavaScript, or image files. Google needs access to these files to render your pages correctly - blocking them is like handing someone a book with half the pages glued shut.
- Submit Your XML Sitemap - Inside Search Console, go to Sitemaps and submit your sitemap.xml file. Check the report for errors, including duplicate pages, error pages listed in the sitemap, and HTTP URLs appearing on an HTTPS site.
HTTP URLs listed inside an HTTPS sitemap are one of the most common errors Google flags - check your sitemap file directly and confirm every URL starts with https://, not http://.
Your robots.txt file deserves a second look beyond just the obvious blocks. Years ago, SEOs routinely blocked JavaScript files to save crawl resources, but Google now needs those files to fully understand your pages. Any blocked resource is a potential blind spot.
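If you want to test this without eyeballing the file, Python's built-in robots.txt parser can check specific resources against the same rules Googlebot sees. A minimal sketch - the domain and resource paths below are placeholders to swap for your own:

```python
# A minimal sketch: checks whether Googlebot is allowed to fetch a few
# rendering resources. The domain and file paths are placeholders.
from urllib import robotparser

SITE = "https://example.com"

parser = robotparser.RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # downloads and parses the live robots.txt file

# URLs Googlebot must be able to fetch to render your pages correctly
resources = [
    f"{SITE}/wp-content/themes/mytheme/style.css",  # hypothetical CSS file
    f"{SITE}/wp-includes/js/script.js",             # hypothetical JS file
    f"{SITE}/wp-content/uploads/hero-image.jpg",    # hypothetical image
]

for url in resources:
    allowed = parser.can_fetch("Googlebot", url)
    status = "allowed" if allowed else "BLOCKED - fix this rule"
    print(f"{status}: {url}")
```

Any line printed as blocked points straight at the robots.txt rule you need to loosen.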
Fixing index errors gives you a clean foundation - but knowing which pages Google can reach is only half the picture. Understanding how Google moves between those pages, and whether it's wasting time on low-value URLs, shapes everything about how well your site performs in search.
Mapping Every Page for Better Crawling
Map your site poorly, and search engines waste their limited time crawling pages that will never rank - leaving your best content undiscovered.
Crawl budget is the number of pages a search engine will scan on your site within a given period. Google does not crawl every page every day, so you want it spending that budget on pages that actually matter.
Low-value pages eat into that budget fast. Thin content pages, duplicate pages, and old staging URLs all pull search bots away from your real content.
Screaming Frog SEO Spider is the go-to tool for mapping this out. The free version crawls up to 500 URLs, which covers most small sites. Larger sites need the paid version at $279 per year.
Run a crawl and Screaming Frog shows you every page it finds - exactly as a search engine would see your site. From there, you can spot problems that are silently hurting your rankings.
What to Look For During Your Crawl
Start by checking for pages tagged with noindex. A noindex tag tells search engines to ignore that page entirely. That is useful for private pages, but if it is sitting on a product page or blog post by accident, that page will never appear in search results.
Screaming Frog flags every noindex tag it finds across your site. Review each one and confirm it is intentional.
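A quick way to spot-check a handful of important URLs yourself is a short script that looks for noindex in both the meta robots tag and the X-Robots-Tag response header. A minimal sketch, assuming the requests library is installed and using placeholder URLs:

```python
# A minimal sketch: flags any URL carrying a noindex signal. The regex
# assumes the common name-before-content attribute order in the meta tag.
import re
import requests

urls = [
    "https://example.com/",
    "https://example.com/blog/sample-post/",
    "https://example.com/products/sample-product/",
]

meta_robots = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

for url in urls:
    response = requests.get(url, timeout=10)
    header = response.headers.get("X-Robots-Tag", "")
    match = meta_robots.search(response.text)
    tag = match.group(1) if match else ""
    if "noindex" in header.lower() or "noindex" in tag.lower():
        print(f"NOINDEX: {url}  (header: '{header}' / meta: '{tag}')")
    else:
        print(f"indexable: {url}")
```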
Next, identify thin content pages - pages with very little text or value. These drag down your overall site quality in Google's eyes. Either expand them with useful content, merge them into a stronger page, or remove them completely.
- Check for duplicate pages serving the same content at different URLs
- Find old tag pages, category archives, or filtered URLs with no real content
- Spot pages returning 4xx errors that are still listed in your sitemap
- Remove HTTP URLs from your sitemap if your site runs on HTTPS
Your XML sitemap should only list pages you want Google to index and rank. Keeping error pages, duplicate content, or noindex pages inside your sitemap sends mixed signals - and wastes crawl budget.
Honestly, most beginners skip the sitemap cleanup entirely. That single oversight sends search bots chasing dead ends instead of your best pages.
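If you would rather verify the sitemap directly before re-submitting it, a short script can flag both problems at once - HTTP URLs and entries that no longer return a 200. A minimal sketch, assuming a single standard sitemap.xml at the site root and the requests library installed:

```python
# A minimal sketch: downloads sitemap.xml, then flags HTTP URLs and any
# entry that does not answer with a 200. Placeholder domain throughout.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in root.iter(f"{NS}loc")]

for url in urls:
    if url.startswith("http://"):
        print(f"HTTP URL in HTTPS sitemap: {url}")
    # some servers answer HEAD oddly; switch to requests.get if results look wrong
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status}: {url}  - remove or fix this entry")
```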
Once your sitemap is clean and your noindex tags are intentional, search engines move through your site with far less friction - which sets you up to catch the next layer of problems, the broken links and server errors that quietly block bots from reaching your content at all.
Repairing Broken Links and Server Errors
Broken links are not just an annoyance - they are dead ends that stop both visitors and search bots cold. Every dead end wastes your crawl budget and chips away at user trust.
Two error families cause most of the damage. 4xx errors are client-side problems, meaning the page is missing or the URL was typed wrong - a classic example is the familiar 404 "page not found." 5xx errors are server-side failures, meaning your server crashed or choked before it could send anything back.
Fixing these is not optional. Search engines log every error they hit, and a site riddled with them signals poor maintenance. Honestly, clearing errors is the single fastest way to show Google your site is worth crawling regularly.
How to Find and Fix Every Error
- Crawl Your Site First - Run Screaming Frog SEO Spider across your entire site to surface all broken URLs in one report. The free version handles up to 500 URLs; the paid version costs $279 per year and removes that limit.
- Cross-Check in Google Search Console - Open the Page indexing report (formerly called the Coverage report) to see which URLs Google itself flagged as errors. This shows real-world crawl failures, not just theoretical ones.
- Fix 4xx Errors First - Update any internal links pointing to dead pages, and set up a 301 redirect - a permanent redirect - to send users and bots to the correct live page. A 301 passes the SEO value from the old URL to the new one, so you do not lose ranking power.
- Investigate 5xx Errors Separately - These need your hosting provider or developer, because the problem lives on the server itself. Common causes include overloaded servers or broken database connections.
- Eliminate Redirect Chains - A redirect chain happens when URL A sends you to URL B, which sends you to URL C. Each extra hop slows down crawling and dilutes link equity. Point every redirect directly to the final destination URL.
- Break Redirect Loops - A redirect loop is when URL A redirects to URL B, which redirects back to URL A. Bots get stuck and give up entirely. Use Screaming Frog's redirect report to spot and cut these cycles.
After fixing redirects, re-crawl your site to confirm no chains remain. One direct 301 per moved page is the rule - no exceptions.
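For a quick spot check on a few moved URLs without re-crawling everything, a short script can count the hops each redirect takes. A minimal sketch using the requests library - the old URLs below are placeholders:

```python
# A minimal sketch: follows each redirect and reports chain length.
# More than one hop means a chain; requests raises TooManyRedirects
# when it detects a loop.
import requests

old_urls = [
    "http://example.com/old-page/",        # placeholder URLs - use your own
    "https://example.com/moved-article/",
]

for url in old_urls:
    try:
        response = requests.get(url, allow_redirects=True, timeout=10)
    except requests.TooManyRedirects:
        print(f"REDIRECT LOOP: {url}")
        continue
    hops = len(response.history)  # each entry is one redirect hop
    if hops > 1:
        chain = " -> ".join(r.url for r in response.history) + f" -> {response.url}"
        print(f"CHAIN ({hops} hops): {chain}")
    elif hops == 1:
        print(f"OK (single {response.history[0].status_code}): {url} -> {response.url}")
    else:
        print(f"no redirect: {url}")
```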
External broken links matter too. Links pointing out to dead third-party pages make your site look neglected. Remove or replace them during the same audit pass.
Once every URL either loads correctly or points cleanly to a working page via a single 301, your site becomes far easier for search engines to crawl - and far less frustrating for real people to use.
Measuring Your Core Web Vitals Score
Google uses three specific speed metrics to judge whether your website gives users a good experience - and sites that fail these tests rank lower in search results. These three metrics are called Core Web Vitals, and every site owner needs to know them.
Largest Contentful Paint (LCP) measures how long it takes for the biggest visible element on your page to load - usually a hero image or a large heading. A good LCP score is under 2.5 seconds. Anything slower and Google considers your page too sluggish.
First Input Delay (FID) measures how quickly your page responds when a visitor first clicks something, like a button or a link. A fast page reacts in under 100 milliseconds. Slow FID means your page feels frozen, even if it looks fully loaded. Google has since replaced FID with Interaction to Next Paint (INP), which measures responsiveness across the whole visit rather than only the first click - a good INP is under 200 milliseconds - but the principle is the same: a page that reacts slowly feels broken.
Cumulative Layout Shift (CLS) measures visual stability - how much your page jumps around as it loads. A score below 0.1 is good. Higher scores mean buttons and text are shifting position, which causes users to click the wrong thing by accident.
Run PageSpeed Insights on your most important landing page first, not your homepage - landing pages drive conversions, so fixing their Core Web Vitals scores delivers the fastest business impact.
Finding your scores is free and takes under two minutes. Go to Google PageSpeed Insights and paste in your URL. The tool analyses your page and returns a score from 0 to 100, broken down by each Core Web Vital.
Pay close attention to the mobile results tab. Google ranks websites based on their mobile performance first, so a page that passes on desktop but fails on mobile is still a problem for your search rankings.
Each metric shows a clear status: green means passed, orange means needs improvement, and red means failed. Your job is to identify which of the three metrics shows red or orange - that is the one demanding your attention first.
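If you want to pull these scores without opening a browser, Google exposes the same data through the public PageSpeed Insights API. A minimal sketch, assuming the requests library is installed and using a placeholder URL - no API key is needed for occasional checks:

```python
# A minimal sketch: fetches mobile field data for one page from the
# PageSpeed Insights v5 API and prints each metric's category.
import requests

PAGE = "https://example.com/important-landing-page/"  # placeholder URL
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

data = requests.get(API, params={"url": PAGE, "strategy": "mobile"},
                    timeout=60).json()
metrics = data.get("loadingExperience", {}).get("metrics", {})

# Older responses report FIRST_INPUT_DELAY_MS; newer ones report
# INTERACTION_TO_NEXT_PAINT instead, so the sketch checks for both.
for name in ("LARGEST_CONTENTFUL_PAINT_MS",
             "FIRST_INPUT_DELAY_MS",
             "INTERACTION_TO_NEXT_PAINT",
             "CUMULATIVE_LAYOUT_SHIFT_SCORE"):
    metric = metrics.get(name)
    if metric:
        print(f"{name}: {metric['percentile']} ({metric['category']})")
    else:
        print(f"{name}: no field data for this page")
```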
GTmetrix works as a useful second opinion. It gives you a waterfall chart showing exactly which files load slowly, making it easier to pinpoint the specific cause behind a poor LCP or FID score.
Once you know which metric is failing, the most common culprit behind slow LCP scores is oversized images taking too long to download - which is exactly where compressing your media files makes an immediate, measurable difference.
Compressing Heavy Media for Faster Browsing
A single uncompressed image can weigh more than an entire webpage should - and on mobile, that difference costs you visitors before they even see your content. Reducing the "weight" of your media and code is one of the most direct ways to cut load times fast.
Image compression is the process of shrinking a photo's file size without making it look blurry or pixelated. Free tools like Squoosh or TinyPNG do this automatically, and most users cannot tell the difference between a compressed image and the original.
Beyond compression, you also need to resize your media assets properly. Uploading a 4,000-pixel-wide photo for a 600-pixel column is wasteful - the browser downloads all that extra data for nothing. Match your image dimensions to the space they actually fill on screen.
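If you have a folder of photos to process, the Pillow library can resize and compress them in one pass. A minimal sketch, assuming Pillow is installed (pip install Pillow) and using placeholder file names and a hypothetical 600-pixel display width:

```python
# A minimal sketch: shrinks one image to its real display width and saves
# a compressed copy. File names and the 600px limit are placeholders.
from PIL import Image

MAX_WIDTH = 600  # the width the image actually occupies on the page

img = Image.open("hero-original.jpg")
if img.width > MAX_WIDTH:
    new_height = int(img.height * MAX_WIDTH / img.width)
    img = img.resize((MAX_WIDTH, new_height), Image.LANCZOS)

# quality=80 plus optimize=True usually cuts file size sharply with no
# visible difference for photographic content
img.save("hero-compressed.jpg", quality=80, optimize=True)
```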
Alt text matters here too, even though it sounds like a content issue. Image alt text is a short written description attached to every image, and search engines read it to understand what the picture shows. Descriptive alt text with a relevant keyword helps both SEO rankings and accessibility.
Code files cause their own loading problems. Render-blocking resources are JavaScript or CSS files that force the browser to stop and fully load before showing anything on screen. Removing or deferring these files - pushing them to load after the visible page appears - cuts that waiting time significantly.
Minification is the next step for your code files. JavaScript and CSS minification strips out spaces, line breaks, and comments that humans use for readability but browsers do not need. Plugins like Autoptimize handle this automatically on WordPress sites - honestly, it is one of the easiest wins on this entire checklist.
Setting a browser caching policy tells visitors' browsers to save copies of your files locally. Next time that person visits, the browser loads from their device instead of downloading everything again, which makes repeat visits dramatically faster.
- Compress images using Squoosh or TinyPNG before uploading
- Resize media assets to match their display dimensions on the page
- Add descriptive, keyword-relevant alt text to every image
- Identify and defer render-blocking JavaScript and CSS files
- Minify JavaScript and CSS to remove unnecessary characters
- Set a browser caching policy so repeat visitors load pages faster
- Check mobile touch elements - buttons and forms must be large enough to tap easily
Mobile users deserve special attention on touch elements. Buttons and navigation links that are too small or too close together frustrate users and push them away. Google's Mobile-Friendly Test flags these issues directly, so run it after any layout changes.
Drafting Clickable Titles and Meta Descriptions
A library book with a blank spine gets ignored, and your search result works exactly the same way. Your title tag and meta description are the two lines of text Google shows people before they ever visit your site - get them wrong, and nobody clicks.
Title tags have a firm limit of around 60 characters. Go longer and Google cuts your title off mid-sentence, replacing the end with "..." - which looks unprofessional and loses your message. Keep it tight, include your main keyword near the front, and make every word earn its place.
Your meta description is the short summary that appears below the title in search results. Google caps these at roughly 155 characters. Use that space to tell the reader exactly what they will get and add a clear call to action - something like "Learn how to fix it in minutes" works far better than a dry sentence that just restates the page title.
Write your title tag under 60 characters with your primary keyword first, then write a meta description under 155 characters that ends with a direct call to action - this combination consistently increases click-through rates.
Each page also needs one H1 tag - the main on-page heading that tells both readers and search engines what the page covers. One page, one H1. No exceptions.
Finding missing or broken tags across a whole website by hand is impossible at scale. Screaming Frog SEO Spider crawls every page on your site and flags missing title tags, duplicate meta descriptions, and H1 problems in one report. The free version handles up to 500 URLs, and the paid version costs $279 per year.
Once you have written a title, check how it actually looks in a search result before publishing. The Title Tag Preview Tool shows you a live preview of your title and description exactly as Google displays them - so you catch any cut-off text before it goes live.
Checking your work on a live page is just as fast with the MozBar browser extension. Install it, visit any page, and your title tag, meta description, and H1 appear in a toolbar across the top of the screen without opening a single extra tool.
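If you want a scripted spot check on a single page as well, a short script can report the same three things - title length, meta description length, and H1 count. A minimal sketch using the requests library, with a placeholder URL and simple regex matching that assumes conventionally written markup:

```python
# A minimal sketch: fetches one page and reports title length, meta
# description length, and H1 count against the limits in this checklist.
import re
import requests

url = "https://example.com/sample-page/"  # placeholder URL
html = requests.get(url, timeout=10).text

title = re.search(r"<title[^>]*>(.*?)</title>", html,
                  re.IGNORECASE | re.DOTALL)
meta = re.search(
    r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)["\']',
    html, re.IGNORECASE,
)
h1_count = len(re.findall(r"<h1[\s>]", html, re.IGNORECASE))

title_text = title.group(1).strip() if title else ""
meta_text = meta.group(1).strip() if meta else ""

print(f"Title ({len(title_text)} chars, limit ~60): {title_text}")
print(f"Meta description ({len(meta_text)} chars, limit ~155): {meta_text}")
print(f"H1 tags found: {h1_count}  (should be exactly 1)")
```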
- Keep title tags under 60 characters with the target keyword near the start
- Write meta descriptions under 155 characters with a clear call to action
- Use one unique H1 tag per page - never duplicate it across pages
- Run Screaming Frog to find missing or duplicate tags across your whole site
- Use the Title Tag Preview Tool to confirm nothing gets cut off in search results
- Install MozBar for quick, page-by-page checks without leaving your browser
Organizing Content with Proper Header Hierarchy
Headers are your page's skeleton. Search engines read them to understand exactly what your content covers and how the ideas connect.
Every page on your site follows a header hierarchy - a system that runs from H1 down to H6, like chapters and sub-chapters in a book. Each level signals importance, with H1 at the top and H6 at the bottom.
Your H1 is the most important tag on the page. Each page gets exactly one H1, and it must contain your primary keyword - the main topic you want that page to rank for.
Below the H1, your H2 tags act as major section headings. Your H3 tags break those sections into smaller pieces. A well-built page reads like an outline: one big topic at the top, supported by organised sub-topics beneath it.
Placing your primary keyword within the first 100 words of your page content is just as important as the H1 itself. Search engines weight early content more heavily, so get your main topic on the page fast.
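A quick way to see both things at once - the heading outline and whether the keyword lands early - is a short script. A minimal sketch using the requests library, with a placeholder URL and keyword, and simple regex parsing that assumes reasonably clean markup:

```python
# A minimal sketch: prints the H1-H6 outline of a page and checks whether
# the primary keyword appears in the first 100 words of visible text.
import re
import requests

URL = "https://example.com/sample-page/"  # placeholder URL
KEYWORD = "seo audit"                     # placeholder keyword

html = requests.get(URL, timeout=10).text

# Print the heading outline in document order, indented by level
for level, text in re.findall(r"<h([1-6])[^>]*>(.*?)</h\1>", html,
                              re.IGNORECASE | re.DOTALL):
    clean = re.sub(r"<[^>]+>", "", text).strip()
    print(f"{'  ' * (int(level) - 1)}H{level}: {clean}")

# Strip scripts, styles, and tags, then check the first 100 words.
# Navigation text counts here too, so treat the result as a rough check.
body = re.sub(r"<script.*?</script>|<style.*?</style>", "", html,
              flags=re.IGNORECASE | re.DOTALL)
words = re.sub(r"<[^>]+>", " ", body).split()[:100]
found = KEYWORD.lower() in " ".join(words).lower()
print(f"Keyword '{KEYWORD}' in first 100 words: {found}")
```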
One mistake beginners make constantly is keyword stuffing - cramming the same keyword into every heading and paragraph. Google penalises this. Your headers should describe the content in that section naturally, not repeat the same phrase on a loop.
Related to stuffing is a separate problem called keyword cannibalization. This happens when two or more pages on your site target the exact same keyword. They end up competing against each other in search results, which weakens both pages. Each page must target a unique topic.
Honestly, cannibalization is one of the most underdiagnosed problems on websites with more than 20 pages. Most site owners have no idea it is happening until their rankings drop for no obvious reason.
To find the right supporting keywords for your H2s and H3s, use a tool like Clearscope. It analyses top-ranking pages and suggests semantically related terms - words and phrases that signal to search engines your content covers a topic thoroughly.
Screaming Frog, mentioned in the title tag section above, also flags pages with missing or duplicate H1 tags during a crawl. Running that check takes minutes and surfaces structural problems across your entire site at once.
A clean header structure gives search engines a clear map of your page. Build that map deliberately, and ranking becomes a much more predictable process.
Cleaning Up Toxic Backlink Patterns
Semrush Backlink Analytics and Ahrefs both scan every link pointing at your site and flag ones that look suspicious or harmful. Bad links can drag your rankings down just as surely as good links lift them up.
A toxic backlink is a link from a low-quality or spammy site that Google associates with manipulation. Search engines judge your site partly by the company it keeps - links from shady sources signal that your site belongs in the same category.
One of the clearest signs of trouble is a bad ratio between referring domains and total backlinks. If one domain links to you 200 times but you only have 50 total referring domains, something is off. Natural link profiles spread links across many different, unrelated sites.
Link farms are networks of fake websites built purely to sell links. Spotting them is easier than it sounds - look for sites sharing similar IP addresses, identical page templates, and heavy cross-linking between each other. Cross-referencing the linking domains you find in Ahrefs with their registration dates and WHOIS ownership records makes these patterns easier to confirm.
Negative SEO attacks are a real threat worth knowing about. A competitor - or a bad actor - can deliberately point thousands of spammy links at your site to trigger a Google penalty. Checking your backlink profile regularly in Semrush or Ahrefs lets you catch these attacks early.
When you find toxic links, your first move should be to contact the site owner and request removal. Most requests go ignored, which is why Google built the disavow tool.
Here is how to handle toxic links step by step:
- Run a full backlink audit in Semrush Backlink Analytics or Ahrefs.
- Flag links from link farms, low-quality directories, or sites with no real content.
- Check the referring domain to total backlink ratio for unnatural spikes.
- Email site owners requesting link removal.
- Compile unresolved toxic links into a plain text disavow file.
- Upload the file via the Disavow Tool inside Google Search Console.
The Disavow Tool is a last resort - Google's own words, not a marketing phrase. Overusing it on normal links does more harm than good, so only disavow links you are confident are genuinely harmful.
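The disavow file itself is just plain text: one domain: entry or full URL per line, with # marking comments. A minimal sketch that builds one from a list of placeholder domains your audit flagged and site owners would not remove:

```python
# A minimal sketch: writes a disavow.txt in the format Google's Disavow
# Tool expects. The domains below are placeholders.
from datetime import date

toxic_domains = [
    "spammy-link-farm.example",
    "cheap-directory.example",
]

lines = [
    f"# Disavow file generated {date.today().isoformat()}",
    "# Domains contacted for removal with no response",
]
lines += [f"domain:{d}" for d in sorted(toxic_domains)]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print(f"Wrote {len(toxic_domains)} domains to disavow.txt")
```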
Cleaning up external links that hurt your site is only half the picture. The links inside your own site also shape how Google reads your authority - and fixing your internal linking strategy is where that work continues.
Updating Your Internal Linking Strategy
Most websites leak ranking power every single day - not because of bad content, but because their pages don't link to each other properly. Internal linking is the practice of connecting your own pages together, which shares authority between them and helps users find more of your content.
Every link on your site passes a small amount of what SEOs call ranking power - sometimes called "link equity" - from one page to another. Pages buried deep in your site with no links pointing to them get almost none of this power, which means Google treats them as low priority.
Finding and Fixing Orphan Pages
Start by hunting for orphan pages - pages that exist on your site but have zero internal links pointing to them. Google has a much harder time finding these pages, so they rarely rank well even if the content is good.
Run your site through a crawler like Screaming Frog (free up to 500 URLs, paid version at $279/year) and cross-reference the results with your sitemap. Any URL in the sitemap that receives no internal links is an orphan - and it needs a bridge.
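That cross-reference is easy to script. A minimal sketch, assuming a single standard sitemap.xml, absolute or root-relative internal links, and the requests library installed - every sitemap URL that no other page links to gets reported as an orphan:

```python
# A minimal sketch: compares sitemap URLs against the internal link targets
# found on those same pages. Placeholder domain throughout.
import re
import requests
import xml.etree.ElementTree as ET

SITE = "https://example.com"
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

root = ET.fromstring(requests.get(f"{SITE}/sitemap.xml", timeout=10).content)
sitemap_urls = {loc.text.strip().rstrip("/") for loc in root.iter(f"{NS}loc")}

linked = set()
for url in sitemap_urls:
    html = requests.get(url, timeout=10).text
    for href in re.findall(r'href=["\']([^"\']+)["\']', html):
        if href.startswith(SITE):
            linked.add(href.split("#")[0].rstrip("/"))
        elif href.startswith("/"):
            linked.add((SITE + href).split("#")[0].rstrip("/"))

# Only pages in the sitemap are fetched, so treat the output as a shortlist
# to confirm manually rather than a definitive verdict.
for url in sorted(sitemap_urls - linked):
    print(f"ORPHAN (no internal links found): {url}")
```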
Your blog posts are the easiest fix for orphan service pages - add a contextual link from a relevant article directly to the service page you want to rank higher.
Steps to Build a Stronger Internal Link Web
- Audit Your Internal Links - Use Screaming Frog or Ahrefs Site Audit to map every internal link on your site. You need a clear picture of which pages are well-connected and which are isolated.
- Identify Orphan Pages - Compare your crawl data against your sitemap. Flag any page that appears in the sitemap but receives no internal links from other pages.
- Connect Blog Content to Service Pages - Find blog posts that cover topics related to your service pages, then add a natural link from the post to the service page. This builds a direct bridge between your content and your most important pages.
- Use Descriptive Anchor Text - Anchor text is the clickable word or phrase in a link. Instead of writing "click here," write something like "our SEO audit services." Descriptive anchors tell Google exactly what the linked page is about.
- Vary Your Anchor Text - Using the exact same anchor text on every link looks unnatural. Mix in synonyms, partial phrases, and brand names to keep your anchor text diversity healthy.
- Prioritise Deep Pages - Your homepage already gets plenty of links. Focus your internal linking effort on deeper pages - product pages, location pages, and older blog posts - that rarely get linked to naturally.
Building this web of connections keeps users on your site longer because they always have somewhere relevant to go next. Search engines follow the same paths your visitors do, so more connections mean more pages discovered and ranked.
Syncing Your Google Business Profile Data
Your Google Business Profile (GBP) is your storefront on Google's map. When someone searches "coffee shop near me," Google pulls business details from GBP to decide who appears in the Map Pack - the three local results shown above the regular search results.
Getting into that Map Pack starts with one rule: every detail on your profile must be accurate and consistent.
Start With Your Legal Business Name
Your business name on GBP must match your legal registered name - exactly. No extra words, no location tags shoehorned in, no keyword phrases added to boost rankings.
Google calls that last trick keyword stuffing in business names, and it violates their rules directly. If caught, your listing faces penalties or removal. If you genuinely want a keyword in your trade name, file a "doing business as" (DBA) registration - that gives you a legal basis for using it.
NAP Consistency Is Non-Negotiable
NAP stands for Name, Address, and Phone number. These three details must be identical everywhere online - your GBP, your website footer, every directory listing. Even a small difference, like "St." versus "Street," sends mixed signals to Google.
Check your current GBP against your website footer right now. Honestly, most beginners discover at least one mismatch on the first pass.
Categories, Hours, and Photos
Choosing the right primary category is one of the highest-impact decisions on your entire profile. It tells Google what your business fundamentally does. Secondary categories cover additional services you offer.
Update your hours for every public holiday. A listing showing incorrect hours during a bank holiday frustrates customers and damages trust - Google notices when users report wrong information.
Add real photos of your local office or storefront. Profiles with genuine location photos perform better in local results than those using only stock images.
Reviews and Service Area
Respond to every review - positive and negative. A short, professional reply to a one-star review shows Google and potential customers that a real person manages this business.
Define your service area precisely inside GBP settings. Setting an area that is too broad signals to Google that your relevance is vague, which works against you in local rankings.
- Verify your legal business name matches your GBP exactly
- Confirm NAP details are identical across all directories and your website
- Set accurate primary and secondary business categories
- Update holiday hours before every public holiday
- Upload real photos of your physical location
- Respond to all recent reviews, both positive and negative
- Define a precise service area - avoid setting it too wide
Once your profile data is clean and consistent, the next step is reinforcing those local signals directly on your website - which is exactly what embedding maps and local keywords achieves.
Embedding Maps and Local Keywords
Most local businesses lose search rankings not because their service is bad, but because Google cannot confirm they are real. Proving your physical location to search engines takes deliberate, specific steps - and most websites skip half of them.
Local SEO works by sending local signals - small pieces of information scattered across your website that tell Google exactly where your business operates. The more consistent and clear those signals are, the more confidently Google places you in local search results.
Honestly, this is one of the most underrated parts of an SEO audit. Two hours of work here beats months of chasing backlinks.
How to Strengthen Your Local Signals
- Embed a Google Map on Your Contact Page - Go to Google Maps, search your business address, click "Share," then select "Embed a map," and paste the code into your contact page. This single addition confirms your physical location directly to Google's crawlers.
- Check NAP Consistency in Your Footer - Your NAP (Name, Address, and Phone Number) must appear in the footer of every page on your site. More importantly, it must match your Google Business Profile exactly - same abbreviations, same phone format, same spelling. Even "St." versus "Street" creates a mismatch Google notices.
- Optimise Service Pages for Local Keywords - Every main service page needs a specific service-plus-location combination. "Plumber in Bristol" beats "plumber" every time. Review each service page and add the city or region naturally into the title tag, H1, and first paragraph.
- Run a Local Citation Audit - A local citation is any online directory listing your business details, such as Yelp, Yell, or Thomson Local. Use a tool like Semrush or BrightLocal to find every listing and fix any inconsistencies. Wrong phone numbers or old addresses actively damage your local rankings.
- Build High-Quality Local Backlinks - Links from local newspapers, community organisations, or regional business directories carry serious weight. One link from a respected local source outperforms ten generic directory listings.
Each step above targets the same goal: making it effortless for Google to verify your business location. When your map, footer NAP, service pages, citations, and backlinks all point to the same address, search engines stop second-guessing you.
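One way to verify this across your key pages is a short script that checks each part of your NAP against the rendered text of every page. A minimal sketch using the requests library - the NAP details and URLs below are placeholders to replace with your own, copied character for character from your Google Business Profile:

```python
# A minimal sketch: confirms each NAP component appears on every key page.
# The business details and URLs are placeholders.
import re
import requests

NAP_PARTS = [
    "Acme Plumbing",        # business name
    "12 High Street",       # address line - "St." vs "Street" matters here
    "Bristol BS1 4DJ",
    "0117 123 4567",        # phone, in the exact format used on GBP
]

pages = [
    "https://example.com/",
    "https://example.com/contact/",
    "https://example.com/services/",
]

for url in pages:
    html = requests.get(url, timeout=10).text
    # strip tags and collapse whitespace so footer markup does not cause
    # false mismatches
    text = " ".join(re.sub(r"<[^>]+>", " ", html).split())
    missing = [part for part in NAP_PARTS if part not in text]
    if missing:
        print(f"MISMATCH on {url}: missing {missing}")
    else:
        print(f"match: {url}")
```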
After fixing these signals, revisit your Google Business Profile and confirm every detail there matches your website footer exactly. Mismatches between your site and your profile are the most common reason local rankings stall despite doing everything else right.
Conclusion
An SEO audit is not a one-time project you finish and forget. It is an ongoing maintenance routine, like changing the oil in a car - skip it long enough, and things break down quietly until they stop working altogether.
Modern audit tools offer more than 140 technical checks across crawling, speed, content, and links. That number is not meant to overwhelm you. It is meant to show you that there is always something to improve, and that small, consistent fixes add up faster than one big overhaul.
Here are the most important things to take away from this audit:
- Crawling comes first. If Google cannot read your pages, nothing else matters. Check your robots.txt file and submit a clean sitemap in Google Search Console before touching anything else.
- Speed is a ranking signal, not a bonus. Your Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and responsiveness scores (First Input Delay, now superseded by Interaction to Next Paint) directly affect where you appear in search results. Check them in PageSpeed Insights now.
- Every page needs one clear job. One H1, one target keyword, one topic. Pages that try to rank for everything end up ranking for nothing.
- Bad backlinks are a real threat. A link from a low-quality site can drag your rankings down. Use Semrush or Ahrefs to check your backlink profile at least once per quarter.
- Local businesses need two things to match exactly: the Name, Address, and Phone Number on your website must be identical to your Google Business Profile - letter for letter.
Today, open Google Search Console and run the URL Inspection Tool on your five most important pages. Note which ones are indexed and which are not.
Then set a calendar reminder for 90 days from now. Quarterly audits are the recommended standard - not because the rules change every week, but because your site does.
A site that gets checked regularly will always outrank a site that does not.
