Search engines have recently expanded significantly the list of aspects that must be considered when analyzing the quality of a site. First of all, these include: mobile website optimization; regional resource optimization; and the loading speed of pages and other components.
In this article we have tried to collect the factors that you can influence directly on your website, leaving aside external factors such as the quantity and quality of backlinks, guest posts, and promotional articles.
Who this article is for:
- web designers and developers who want to create sites that are optimized for SEO from the start,
- website owners who want to understand SEO themselves in order to increase search traffic.
A well-structured checklist lets you run a thorough website audit, find out which specific factors are hurting performance, and get technical recommendations to apply in further work on the resource.
- Duplicate pages (carefully look for all the duplicates in their various forms and handle them with redirects, by closing them in robots.txt, or with the rel="canonical" attribute). Possible duplicates: http://www.site.ru/index.php (index.html, default.asp), trailing slashes, pagination pages, session variables in the URL, search filters, tags from contextual and other advertising, etc. Other duplicate variants: http:// and https://, /cat/dir/ and /dir/cat/, site.com and test.site.com, /session_id=123, and so on.
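Many of these duplicates can be collapsed programmatically before deciding where to point a 301 redirect or rel="canonical". Below is a minimal Python sketch of URL normalization; the lists of index files and tracking parameters are examples and should be adapted to your site:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Example sets -- extend these for your own site.
TRACKING_PARAMS = {"session_id", "sid", "utm_source", "utm_medium", "utm_campaign"}
INDEX_FILES = {"index.php", "index.html", "default.asp"}

def canonical_url(url):
    """Reduce common duplicate-URL variants to a single canonical form."""
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[4:]
    # Drop index files so /index.php and / collapse to one URL.
    head, _, tail = path.rpartition("/")
    if tail in INDEX_FILES:
        path = head + "/"
    # Strip session variables and tracking parameters.
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    # Force https as the canonical scheme (assumes the site serves HTTPS).
    return urlunsplit(("https", netloc, path or "/", urlencode(kept), ""))
```

With a helper like this, every non-canonical variant found in a crawl can be 301-redirected to its canonical form, with rel="canonical" as a fallback where a redirect is impossible.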
- Check for blank pages (pages with no content). Blank pages should be either:
- closed from indexing (in robots.txt, see below), or
- filled with content.
- "Endless pages" are pages where a parameter can be set to any number, allowing an infinite number of duplicates of any page to be created (this often happens with calendars, pagination, product listings, etc.).
- Robots.txt (robots.txt should specify the Host and Sitemap directives and disallow all service sections, print versions of pages, the admin panel, test domains, URLs left over from a previous CMS, etc.). It is also often recommended to close: user profiles, the new-topic page on forums, the login page, personal accounts, private message pages, search result pages, and the shopping cart of an online store.
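As an illustration, such a robots.txt could look like the following (the paths and domain are hypothetical; note that the Host directive is only honored by some search engines, such as Yandex):

```
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /cart/
Disallow: /search/
Disallow: /print/
Disallow: /*?session_id=

Host: example.com
Sitemap: https://example.com/sitemap.xml
```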
- Check the redirects on the website (remove any that are superfluous). Types of redirects:
- 301: the requested document has permanently moved to a new URL,
- 302: the requested document is temporarily available at another URL.
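Chains of redirects (A to B to C) are a common leftover of site migrations and waste crawl budget. Below is a small Python sketch that follows a redirect map collected from a crawl and reports each chain; the URLs are hypothetical:

```python
def redirect_chain(url, redirects, limit=10):
    """Follow a URL through a {url: (status, target)} redirect map.

    Returns the full chain of URLs; anything longer than two entries
    means more than one hop and is worth collapsing into a single 301.
    """
    chain = [url]
    while url in redirects and len(chain) <= limit:
        _status, url = redirects[url]
        chain.append(url)
    return chain

# Example redirect map, as it might be extracted from a crawl.
redirects = {
    "http://site.ru/old": (301, "http://site.ru/interim"),
    "http://site.ru/interim": (301, "https://site.ru/new"),
}
```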
- Sitemap XML. The sitemap should contain no service pages and no pages blocked by robots.txt or the robots meta tag.
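For reference, a minimal sitemap entry looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```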
- The character encoding is specified in the code.
- Check the presence and uniqueness of the title, description, and keywords tags on every page (all of these elements must differ from page to page). Are any titles, meta descriptions, or keywords missing? How attractive are the titles and descriptions for clicks?
- Try to include the keywords in the title tag; the most popular keyword should be closer to the beginning of the tag.
- Use the "|" symbol to separate the different phrases in the title tag (for example, the page name and the site name).
- The description tag does not affect the ranking of a website directly, but search engines may use its content for the site's snippet in the results. The description can be 100 to 250 characters long, ideally around 155. It is usually one or two meaningful sentences describing the page, including search queries.
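Once a crawler has collected these tags, checking presence and uniqueness is straightforward. A Python sketch, assuming a hypothetical `pages` mapping of URL to extracted meta tags:

```python
from collections import defaultdict

def find_meta_issues(pages):
    """Report pages with missing title/description and shared titles.

    `pages` is assumed to be {url: {"title": ..., "description": ...}},
    as produced by your own crawler.
    """
    issues = {"missing": [], "duplicate_title": []}
    by_title = defaultdict(list)
    for url, meta in pages.items():
        title = (meta.get("title") or "").strip()
        if not title or not meta.get("description"):
            issues["missing"].append(url)
        if title:
            by_title[title].append(url)
    for title, urls in by_title.items():
        if len(urls) > 1:
            issues["duplicate_title"].append(urls)
    return issues

# Example crawl output (made up for illustration).
pages = {
    "/a": {"title": "Home", "description": "Welcome."},
    "/b": {"title": "Home", "description": "Also home?"},
    "/c": {"title": "", "description": "No title here."},
}
```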
- Analyze whether the images on the website are optimized. You can learn how to optimize images in Google's help section.
- Bounce rate (visits that view only one page): make a list of the pages with many bounces and improve their content.
- Find the site's main entry and exit points (it is better to study the content and usability of the main entry points, and to analyze where visitors leave and why).
- How many pages does the average user view? (Perhaps add interesting blocks to pages, such as "see also", etc.)
- Add a favicon to the site.
- Styles and scripts should be loaded in the head as separate files.
- A page can have only one h1 heading, and the h1 should not copy the title. The title tag was created to describe the page and is not part of the page content, while h1 is a heading within the content itself. They carry different meanings, which may nevertheless coincide.
- The alt and title attributes of images should be different. Alt is the alternative text shown if the image fails to load. Title is the caption that pops up when you hover over the picture, and it is used in search.
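For example (the file name and texts are placeholders):

```html
<img src="/images/red-shoes.jpg"
     alt="Red leather running shoes, side view"
     title="Red running shoes">
```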
- Use HTTPS. Google representatives say that moving a site to the HTTPS protocol (with a 2048-bit SSL key) improves its position on search results pages. Google advises web developers to test their HTTPS sites with the Qualys Lab tool.
Technical audit of the website
- Check the website's loading speed. This is one of the important factors affecting a site's ranking in search engines. You can check it with Google PageSpeed or Google Search Console.
Check which items take the longest to load and consider optimization options.
- Check how your website works on mobile devices. This can be done in Google Search Console.
- Check for errors in Google Search Console -> Crawl -> Crawl Errors.
- For some sites it makes sense to run load testing with the service http://loadimpact.com/ (the free test checks how the hosting and the site behave with up to 50 visitors at a time).
- Using a Gzip test tool, make sure that gzip compression is enabled on the server for the website.
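To see why this matters, the Python sketch below compresses a sample HTML payload and reports the size ratio; on a live site you would instead inspect the Content-Encoding response header (the sample markup here is made up):

```python
import gzip

def compression_ratio(body: bytes) -> float:
    """Compressed size divided by original size (smaller is better)."""
    return len(gzip.compress(body)) / len(body)

# A repetitive HTML payload, typical of markup-heavy pages.
html = b"<html><body>" + b"<p>Lorem ipsum dolor sit amet.</p>" * 200 + b"</body></html>"
```

On a live site, a quick manual check is `curl -s -I -H "Accept-Encoding: gzip" https://example.com` and looking for a `Content-Encoding: gzip` header in the response.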
- Check the website for viruses:
- Norton Safe Web, from Symantec: a reputation service whose servers analyze websites to see how they will affect you and your computer, so you can find out whether a site is a safety risk before visiting it.
- AVG Online Web Page Scanner: lets you check the safety of individual web pages you are about to visit. LinkScanner examines the web page in real time to see whether it is hiding any suspicious downloads.
- Check your web pages for broken links.
http://www.siteliner.com/ finds duplicate content, broken links, and more. The free scan is limited to 250 pages.
- Check your Cascading Style Sheets (CSS).
- Check your HTML for errors and warnings.
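If you already crawl the site yourself, a broken-link report is a simple post-processing step. A Python sketch over hypothetical crawl data (a page-to-links map and the HTTP status each target returned):

```python
def broken_links(page_links, statuses):
    """List links pointing at 4xx/5xx targets, grouped by source page.

    `page_links` maps each page to the links found on it; `statuses`
    maps each crawled URL to its HTTP status code. URLs missing from
    `statuses` are treated as broken (404).
    """
    report = {}
    for page, links in page_links.items():
        bad = [link for link in links if statuses.get(link, 404) >= 400]
        if bad:
            report[page] = bad
    return report

# Example crawl data (made up for illustration).
page_links = {"/": ["/about", "/gone"], "/about": ["/"]}
statuses = {"/": 200, "/about": 200, "/gone": 404}
```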
Powerful Website Audit Tools You Should Check
SEO Site Checkup
SEO Site Checkup is a free analysis tool that audits the entire website with 45 checks in 6 different categories (common SEO issues, server & security, mobile usability, social media, and semantic web). Based on these results, the tool shows a general score and the number of failed checks.
Seoptimer
Seoptimer is a free auditing tool that instantly reports critical errors on your website and recommends what you should do to improve your search rankings.
The tool tests your website's effectiveness against 50 parameters, instantly identifies problem areas to fix, and shows all technical SEO mistakes.
SE Ranking Website Audit
This tool is useful not only for identifying website errors at a glance, but also for preparing a list of tasks for website developers, content writers, and web designers to prevent technical issues. It crawls your entire website against over 70 parameters, including domain overview, page, meta, content, image, and link analysis, mobile optimization, usability, and technologies.
Moz Crawl Test
Moz runs its own site crawler that helps webmasters check for critical issues, HTTP status codes, and other useful data. It also identifies duplicate content, errors in title tags, server redirects, and many other factors that can affect website crawlability.
SimilarWeb
With its SimilarWeb platform, the company uses big-data technology to collect, measure, analyze, and provide data on behavioral patterns and user-engagement statistics for websites and mobile applications. A similar tool, Alexa, collects statistics from users who have installed a special toolbar; in SimilarWeb's case, the data comes from the company's own crawler plus data from millions of users with a special toolbar. It is possible to see the list of other websites your visitors have visited along with yours. This lets you learn more about the interests of your audience and stay one step ahead of competitors.
About the CleanTalk service
CleanTalk is a cloud service that protects websites from spam bots. CleanTalk uses protection methods that are invisible to website visitors, which lets you abandon methods that require the user to prove they are human (CAPTCHA, question-and-answer, etc.).