SEO Audit: Quick Tips & Tricks to Skyrocket Your Keyword Ranking
07 Jun 2018
Half-yearly or yearly website audits are now history: there are hundreds of factors to review to get a website ranked on the first page, and Google keeps getting firmer with penalties, from link spam to content-to-code ratio. If your website does not meet Google's specifications, you may see a steep drop in your keyword rankings and, eventually, your traffic. To make sure everything on the website is in order and meets Google's standards, you should carry out an SEO audit.
But first, let's understand what an SEO audit is, what its types are, and how frequently you should audit your website so that it doesn't lose its spot on the SERPs.
What is SEO Audit?
In short: you may well need a quarterly audit to check the 200 ranking factors outlined by Brian Dean.
These SEO ranking factors are easy to understand for people who have been practicing SEO for the last 2-3 years. You may need additional resources depending on what fixes your website needs; for issues related to indexation, redirection, or the .htaccess file, you may need a web developer.
Now let's study the benefits of an SEO audit.
Benefits of an SEO Audit
- Know where you lack: An SEO audit helps identify the weak spots of a website that hinder it from ranking high on search engines. For example, while carrying out an audit you may come across a content duplication issue or slow page speed. Finding such issues is the first step toward implementing a good SEO strategy.
- Know what competitors are doing: An audit helps uncover the strategies and activities competitors are implementing, and which of them are helping those competitors rank on Google.
What Does an SEO Audit Include?
An SEO audit is an in-depth analysis of a website. Various factors are taken into consideration while carrying out the audit; some of them are mentioned here:
- Technical SEO audit:
- Crawl errors
- Site structure
- Sitemaps (XML or HTML)
- Canonical Issues
- Redirection Issues
- On-page SEO Audit
- Meta Description
- Image Alt
- Internal Linking
- Content quality
- Off-page SEO Audit
- Backlink profile
- Opportunity to create new backlinks
- Content Marketing
- Competitive Analysis
- Keyword Research
What Should One Expect from an SEO Audit?
Now that you have a complete understanding of what an SEO audit is and what it includes, here is a comprehensive checklist to consider while doing the audit. The analysis of the website is broken into 3 steps:
- On-page ranking factors
- Accessibility (technical) factors
- Off-page ranking factors
1. On-page SEO Audit
The characteristics of a website's pages influence its search engine ranking. You check page-level characteristics for each page you audit, and domain-level characteristics for the website as a whole.
The page-level analysis is important for identifying the spots that need optimization, while the domain-level analysis helps estimate the effort needed to make corrections throughout the website.
a. URL Structure
The URL is the entry point to a page. There are a few things to keep in mind while analyzing URLs:
- The length of the URL: keep it between 75 and 115 characters.
- The URL should contain a relevant keyword; it helps define the content of the page properly.
- The URL of each page should be unique and formatted properly. A badly formatted URL will have unnecessary folders, underscores (_), question marks (?), etc.
For example: http://www.example.com/index/folder/?badURL/Format_SEO/

The kind of URL search engines prefer has keywords separated by hyphens (-) and no unwanted folders, e.g. https://www.example.com/url-format-seo/
b. Content Duplication based on URL
Once you have analyzed the URLs and fixed their issues, the next step is to check for content duplication based on URLs. URLs are the unique entry points to the website, and they are often responsible for content duplication.
Content duplication happens when two distinct URLs serve the same page, and search engines treat each URL as a separate page with identical content.
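For illustration, all of the following hypothetical URLs may serve the same content, yet search engines can treat each as a separate page:

```text
https://www.example.com/shoes
https://www.example.com/shoes/
https://www.example.com/shoes?ref=home
https://www.example.com/Shoes
```

Trailing slashes, query parameters, and letter-case differences are the most common culprits, and each duplicate dilutes the ranking signals of the original page.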
c. Title Tag
The identifying characteristic of any page is its title. It's the first thing noticed on search engines and social media, so evaluating titles is an important part of any website audit. There are a few things to consider while evaluating page titles:
- The title should be at most 70 characters long; anything longer is truncated in search engine listings.
- The title tag should describe the content of the page. That way you are not fooling the user, and an accurate title will also help you achieve a higher CTR.
- The title tag is one of the important on-page ranking factors, so make sure you use the targeted keyword right at the start of the title.
- Make sure you have unique titles throughout the website. You can check for duplicate meta tags in your webmaster account by clicking HTML Improvements under Search Appearance.
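Putting these rules together, a title tag might look like this (a hypothetical example: the targeted keyword leads, the brand name follows, and the total stays under 70 characters):

```html
<title>SEO Audit Checklist | Minds Metricks</title>
```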
d. Meta Description
The best practices for the meta description are similar to those for title tags. The meta description is not only a ranking factor; it also affects the click-through rate of the page in search engine results.
For example, the meta description of the Minds Metricks homepage talks about the company and the services it offers, with all the important keywords used properly within the character limit.
e. Canonical Tags
The canonical tag defines the source URL of a given page. Canonical tags are used either to declare a page as its own source or, for duplicate pages, to reference their originating page. The duplicate content issue that arises when multiple URLs point to the same page is tackled with the help of the canonical tag.
Here’s a Canonical Tag example:
<link rel="canonical" href="https://www.mindsmetricks.com/">
f. Images Optimization
It is generally said that a picture is worth 1,000 words to users, but it is different for search engines: they cannot read images. Therefore, it is important to give images metadata, such as alt text, so search engines can understand them.
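The alt attribute is the main piece of image metadata search engines read. A descriptive alt text might look like this (the file name and wording are hypothetical):

```html
<img src="seo-audit-checklist.png" alt="SEO audit checklist infographic">
```

Keep the alt text short and descriptive; stuffing it with keywords can do more harm than good.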
g. Header Tags
Header tags are the HTML markup used to distinguish headings and subheadings on a web page. Header tags run from h1 to h6 and are generally keyword-rich.
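A well-structured page uses one h1 and nests the rest in order, for example (an illustrative sketch):

```html
<h1>SEO Audit Guide</h1>
  <h2>On-page Audit</h2>
    <h3>Title Tags</h3>
    <h3>Meta Descriptions</h3>
  <h2>Off-page Audit</h2>
```

During an audit, flag pages with no h1, multiple h1s, or skipped heading levels.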
2. Accessibility Audit
Your website doesn't exist if it isn't accessible to search engine bots and users. Below are a few things you should keep in mind to make sure your website is accessible.
a. Robots.txt
The robots.txt file instructs search engine crawlers to crawl the pages that are necessary and to skip the ones that are not important. Below is the robots file of Minds Metricks.
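For reference, a typical robots.txt follows this structure (an illustrative sketch with hypothetical rules, not the actual Minds Metricks file):

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.mindsmetricks.com/sitemap.xml
```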
It is suggested to manually check the robots file to make sure it is not restricting any of the important pages from getting crawled.
Wrong creation or submission of the robots.txt file can give you nightmares with traffic and rankings. It has the power to throw your website from the 1st page to the 10th page of Google.
If a site's robots.txt file disallows crawling these resources, it can affect how well Google renders and indexes the page, which can affect the page's ranking in Google search.
We at Minds Metricks faced a similar problem when our blocked resource count suddenly increased to 44.
We saw a considerable drop in rankings once the blocked resources were reported on 8th May 2018: a 4% drop in keyword rankings, which impacted us hugely.
On recognizing the issue, we made changes to the robots.txt, making sure that Googlebot could access all the resources.
The number of blocked resources gradually decreased, helping us regain our rankings.
Always keep an eye on your blocked resources to avoid complications like rendering and indexation issues that lead to a drop in rankings.
b. Robots Meta Tags
The robots meta tag informs crawlers whether they can index a specific page and follow its links. While auditing the website, make sure you check for pages that are blocked by the robots meta tag; such a block can keep important pages out of the index.
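The tag lives in the page's head. This example (illustrative) tells crawlers neither to index the page nor to follow its links:

```html
<!-- Blocks indexing of this page and following of its links -->
<meta name="robots" content="noindex, nofollow">
```

During an audit, search your templates for `noindex` to make sure it only appears on pages you genuinely want hidden.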
c. HTTP Status Code:
It becomes difficult for users and search engines to access the website if URLs return errors such as 4xx (client) and 5xx (server) errors.
Screaming Frog is a tool that helps you identify the HTTP status of all the pages on your site. Pages showing 4xx and 5xx errors should be fixed immediately. If a broken URL's corresponding page is no longer available on your site, redirect the URL to a relevant replacement.
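When you export a crawl report, it helps to automatically separate the error pages from the healthy ones. Here is a minimal sketch in Python (the URL/status pairs are hypothetical; in practice they would come from your crawler's export):

```python
# Triage crawl results by HTTP status code.
def triage(crawl_results):
    """Split (url, status) pairs into errors to fix and healthy pages."""
    errors = [(url, status) for url, status in crawl_results
              if 400 <= status < 600]          # 4xx client / 5xx server errors
    healthy = [(url, status) for url, status in crawl_results
               if status < 400]                # 2xx/3xx responses
    return errors, healthy

results = [
    ("https://www.example.com/", 200),
    ("https://www.example.com/old-page", 404),
    ("https://www.example.com/api", 500),
]
errors, healthy = triage(results)
print(errors)   # the 4xx/5xx URLs that need a fix or a redirect
```

Each URL in the `errors` list should either be repaired or 301-redirected to a relevant replacement.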
Also, while analyzing the website using Screaming Frog, ensure all your redirects are 301s and not 302s, because a 301 (permanent) redirect passes most of the link juice to its destination page.
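On an Apache server, a permanent redirect can be declared in the .htaccess file like this (a sketch with hypothetical paths):

```apache
# Permanent (301) redirect from a removed page to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```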
d. XML Sitemap
The XML sitemap is a roadmap that helps search engine crawlers find all the pages on the website. There are a few things to keep in mind while creating an XML sitemap:
- There is a specific sitemap format that search engines expect. If the website's sitemap is not in that format, crawlers may not crawl the website correctly.
- Submitting the sitemap to your webmaster account is another important step; it notifies crawlers of the sitemap's location.
- Pages that appear only in the sitemap and receive no internal links are orphaned. If these pages still exist on the site, find an appropriate location for them in the site architecture and make sure each receives at least one internal link.
Below is the example of the sitemap of Minds Metricks
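For reference, an XML sitemap in the format search engines expect looks like this (an illustrative sketch, not the actual Minds Metricks sitemap):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.mindsmetricks.com/</loc>
    <lastmod>2018-06-07</lastmod>
  </url>
</urlset>
```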
3. Off-Page Factor
Off-page factors support on-page activities through external sources and help generate rankings from outside the website. Creating backlinks is one of the most important factors in ranking a website. Here's a quick idea of how much each factor accounts for.
a. Backlink Profile
It is highly important to analyze the backlink profile of the website, as the site's authority is determined by the quality of the links associated with it. Different tools are available for backlink data, such as Ahrefs, Backlinko, Majestic SEO, Blekko, and Open Site Explorer.
Here's a quick snapshot of the dashboard from one such tool, which helps you analyze many things about your website: new, old, and lost backlinks; referring domains; anchor text profile; organic traffic; and more.
b. Domain & Page Authority
A website's authority is determined by a combination of factors. To evaluate a website's authority, SEOMoz has provided two important metrics: Domain Authority and Page Authority.
Page Authority predicts how well a single page will perform on search engines, whereas Domain Authority predicts the performance of the entire domain. Both metrics aggregate features such as MozRank, MozTrust, link quality and quantity, and trustworthiness to give an easy way to compare the relative strength of pages and domains.
c. Social Engagement
With the web getting social, a website's success depends on its ability to attract mentions across social platforms. Each social network has a different form of social currency, such as likes, follows, retweets, and +1s. The more social currency your website has, the better, but you should also keep a check on the quality of the profiles sharing your content on social platforms.