Website technical audit

We perform a website technical audit to find the technical errors that are hindering our site's SEO performance, and to fix them properly.

You can think of a technical audit as a health checkup for a website.

Some errors can be fixed without any technical know-how, but others require a web developer's help.

There are hundreds of SEO tools that offer one-click detection of technical errors, but I prefer a manual audit, as online tools are not always accurate enough.

Below are the factors we should check while doing a manual technical SEO audit.

There are various levels of technical audit, but to keep the list simple I am going to set out the factors without grouping them.

1. Findability of website

You need to make sure your website can easily be found by both Googlebot and users. If it's not findable, any effort you put into SEO is wasted.

To check this, you can do the following:

Go to Google Search Console and open the Index Coverage report. It shows how many of your pages are indexed and eligible to appear in search results.

Also search "site:yoursite.com" on Google to see your indexed pages.

You can also use the Screaming Frog tool.

2. Server uptime

This measures how reliably your host and server keep your site online and minimize downtime.

If your site goes offline often, you need to think about changing your hosting.

Free tools such as Pingdom can help you monitor this. Also check uptimerobot.com.

3. Check the robots.txt file

Robots.txt is a text file uploaded to the root directory of a website. It works as a set of directives for search engine crawlers and bots.

It instructs them which folders to crawl and which to skip. We can also hide specific pages or folders using the proper directives.

Having a robots.txt file helps crawlers crawl and index your site more efficiently.

The robots.txt file should be publicly accessible at yoursite.com/robots.txt.
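A minimal sketch of a robots.txt file (the folder names here are just placeholders):

User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://example.com/sitemap.xml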

4. Meta robots tag

Meta robots tags have similar functionality to robots.txt, but they act as indexing directives for individual pages. They are placed within the head tag.

The parameters are follow, nofollow, index and noindex.
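For example, to keep a page out of the index while still letting crawlers follow its links, the head section would contain:

<meta name="robots" content="noindex, follow" />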

5. Client-side errors

Status codes are issued by a server in response to a client's request. When a request fails, the server reports the failure as an HTTP status code.

4xx errors are shown when the error occurs on the client side.

The most common is the 404 error.

The HTTP 404 Not Found Error means that the webpage you were trying to reach could not be found on the server. 

It is a client-side error, which means that either the page has been removed or moved and the URL was not changed accordingly, or the URL was typed incorrectly.

Having too many such pages is a bad signal for SEO and provides a bad user experience.

So these pages need to be de-indexed or permanently redirected to a similar page, so as to keep the valuable links pointing to them.

Check these in Google Search Console.

6. Server-side 5xx errors

These errors take place when a server fails to fulfill a request made by a client.

Common examples include:

500 Internal Server Error

501 Not Implemented

502 Bad Gateway

Check the crawl errors report in Google Search Console. Moving your website to a better hosting provider may solve these issues.

7. HTML sitemap

An HTML sitemap is a single page that lists links to all the pages of a website.

It is created so that bots can reach deep pages easily and link equity gets distributed across the site.

For larger sites and e-commerce sites, this is a must for good SEO practice.
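At its simplest, an HTML sitemap is just an ordinary page of links (the URLs below are placeholders):

<ul>
  <li><a href="/about/">About us</a></li>
  <li><a href="/services/">Services</a></li>
  <li><a href="/blog/">Blog</a></li>
</ul>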

8. XML sitemap

XML sitemaps are created for search engine crawlers so that they can easily discover and index all the URLs on a website.

They include not only URLs but also metadata such as the change frequency, last modified date and relative importance of each URL.
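A single entry in the standard sitemaps.org format looks like this (the URL and values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/sample-page/</loc>
    <lastmod>2020-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>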

An XML sitemap also helps protect your content: search engines see your URLs as soon as they are published, which makes it easier to establish your site as the original source.

There are various types of sitemaps, such as image sitemaps, video sitemaps etc.

Many free online tools are available to create an XML sitemap; generate one and upload it to your site's root folder.

Don't forget to submit it in Google Search Console.

9. Pagination

Some content on a website is split across multiple pages, but the pages should be treated as a single series. This is where pagination tags come in.

We need to include rel="prev" and rel="next" link tags on the respective pages.
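For example, on page 2 of a three-page series, the head section would contain something like this (placeholder URLs):

<link rel="prev" href="https://example.com/blog/page/1/" />
<link rel="next" href="https://example.com/blog/page/3/" />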

10. Custom 404 pages

Custom 404 pages are necessary to avoid a bad user experience and to guide visitors to a similar page when the page they requested is not available.

This helps you retain the traffic.
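On an Apache server (an assumption; other web servers have equivalent settings), a single .htaccess directive points 404s to your custom page:

ErrorDocument 404 /custom-404.html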

11. Breadcrumb navigation

A breadcrumb is a trail of navigational links placed above the content of a page.

It improves the user experience, tells users where exactly they are within the site, and helps them get back to a previous menu, category or page.

It is especially needed for e-commerce sites and other large websites.
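A typical breadcrumb trail in HTML (the category names are placeholders):

<nav>
  <a href="/">Home</a> &gt; <a href="/shoes/">Shoes</a> &gt; Running shoes
</nav>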

12. Top-level navigation

We all know this one: the horizontal or vertical menu bar where links to the important pages are placed in a logical order.

The top-level navigation should be built with plain HTML links; avoid JavaScript or other code that crawlers may not render.

13. Footer analysis

The footer is the bottommost part of a website; we all know it.

We can place important non-sales pages here, like the privacy policy, careers, sitemap, FAQ etc. These are pages that can't be placed in the top-level navigation bar.

It also helps pass link equity evenly to various pages.

14. Site architecture

While analyzing this, we need to focus on one thing: at most how many clicks away from the home page a user has to go to reach any page on your site.

The more clicks it takes, the worse the user experience. Try to keep every page within 3 to 4 clicks of the home page.

The Screaming Frog SEO tool can check this.

15. URL delimiter check

URLs are a major part of SEO. In URLs we have to use hyphens '-' instead of underscores '_': search engines treat hyphens as spaces, while underscores are read as ordinary characters within the URL.

So, to make the URL structure of a page or blog post more user-friendly and readable, we need to use short, clean and specific URLs (using the keyword).
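For example (placeholder URLs):

Good: https://example.com/blog/technical-seo-audit/
Bad: https://example.com/blog/technical_seo_audit/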

16. Check relative URLs and absolute URLs

To work faster, developers often use a shorter form of URL for the internal linking of web pages, called a relative URL. The benefit is that it is short, which keeps the page markup a little lighter.

But from an SEO perspective, it is bad practice.

Absolute URLs are written in full form, e.g. <a href="https://example.com/page/">Anchor text</a>, while the relative form of the same link would be just <a href="/page/">Anchor text</a>.

The absolute form should be used for internal linking.

17. Meta hreflang

Meta hreflang makes search engines understand the language of the content on your website, so that they can show each page to users searching in that language.

Use this if your website has multiple versions of its content in different languages.

<link rel="alternate" href="http://example.com" hreflang="en-us" />

Check the page source to find this tag.
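Each language version should reference all the others (and itself). For an English and a Spanish version, the pair might look like this (placeholder URLs):

<link rel="alternate" href="https://example.com/" hreflang="en-us" />
<link rel="alternate" href="https://example.com/es/" hreflang="es" />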

18. Check various on-page SEO factors

The on-page factors that must be checked include:

Meta tags: the meta title and meta description are shown on the search engine results page.

Keep the meta title within about 65 characters and the meta description within 150-165 characters.

Avoid duplicate meta titles and descriptions, as duplicates have a bad impact on SEO.

The H1 tag tells the search engine what content is present. Use related (LSI) keywords there.
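In the page source these look like the following (all text is placeholder); the title and description go in the head, the H1 in the body:

<title>Technical SEO Audit Checklist | Example Site</title>
<meta name="description" content="A step-by-step checklist for auditing the technical health of your website." />
<h1>Technical SEO Audit Checklist</h1>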

19. Structured data markup

This is a snippet of code placed in the page header. It tells the search engine crawler what your content is all about.

Having this code on your website makes it eligible for rich snippets on the SERP, which may increase your site's CTR.

Test it with Google's structured data testing tool to confirm it is picked up.

You can generate the code online for free.
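A minimal JSON-LD sketch for an article, using the schema.org vocabulary (the headline, name and date are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Audit Checklist",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2020-01-15"
}
</script>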

20. Content analysis

Next comes analyzing the content used on the page. 

Check whether the content is relevant to the keyword and satisfies user intent, and review the grammar and spelling of the content, keyword density, word count etc.

Duplicate, keyword-stuffed or thin content may degrade your site; it can even be de-indexed by Google.

When outsourcing content, it is always good practice to scan it with Copyscape to avoid plagiarism and penalties.

Check for duplicate content with Siteliner or the Screaming Frog SEO tool.

21. Images 

Images used on your site should be hosted on your own server, not on a third party's.

File size should be less than 100 KB.

Images should contain relevant alt attributes (which may include keywords).
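For example (the file name and alt text are placeholders):

<img src="/images/technical-seo-audit.jpg" alt="technical SEO audit checklist" width="600" height="400" />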

22. Redirect issues

Check for redirect issues: 302 vs. 301 redirects, non-preferred to preferred domain redirection, and non-secure to secure version 301 redirection.

Break up any redirect chains.

Any permanent 302 redirect should be changed to a 301, since a 302 doesn't pass link value.

The www to non-www redirect (or vice versa) should be a 301. Otherwise you may face duplicate-website issues.

Also make sure http:// to https:// redirection is enabled.

Check in this tool: redirection-checker.org
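On an Apache server with mod_rewrite (an assumption; Nginx and other servers have equivalents), .htaccess rules like these force both HTTPS and the www version in a single 301 redirect (replace example.com with your domain):

RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]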

23. User experience check

See how users are interacting with your website. 

Analyze Google Analytics data such as bounce rate, exit rate, goal completions, average time spent on pages, brand searches, returning visitors etc., and make the necessary changes.

24. Link profile audit

Check the website's backlink profile. Look at link relevancy, link authority, link diversity, anchor text diversity etc.

There are various types of backlinks:

contextual links

site-wide footer/sidebar links

directory links

resource page links

niche profile links

forum links

relevant blog comment links


Also find the ratio of dofollow to nofollow links.
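A nofollow link carries a rel attribute that asks search engines not to pass link value through it:

<a href="https://example.com/" rel="nofollow">Anchor text</a>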

25. Citation audit

If you are a local business, checking your citations is a must.

Having consistent NAP-W details (name, address, phone and website) is one of the ranking factors in local SEO.
