Kristina Green
Oct 24, 2018 | 6 min read

This entire update is dedicated to one single module: SE Ranking’s Website Audit. Why, you might ask, is it taking up all the spotlight? First of all, our entire team has worked long and hard for several months to get this release ready! Secondly, frankly speaking, this is practically a new module, given that it’s packed with additional tools, features, and a brand new design. For these reasons, we’ve put together a detailed overview of each section in the module and in the report so you can get familiar with everything we’ve added to the tool. We are also planning to post a short tutorial video – stay tuned for that update as well. But for now, let’s dive into what’s new in our updated Website Audit.

Usability


Let’s start off with the simple stuff: we have completely revamped the interface of the Website Audit module.

The new design is aimed at helping users better navigate the blocks and quickly assess both the overall technical status of the site and the number of errors that need to be fixed.

When running an SEO audit, you can monitor the crawling process, see the completion percentage, and check your position in the queue if several projects have been created in your account.

Scanning speed

Scanning is now 10 times faster than before! Additionally, users can configure the maximum number of pages, the scanning depth, and the number of requests per second, which can speed up scanning even further.

The Report Section


The main “Report” section has been equipped with graphs that show the dynamics of changes made to the website compared to the previous period.


The set of analyzed parameters remains unchanged:

Health check

Analysis of the site’s main technical features: the site mirror (www vs. non-www), the HTTP to HTTPS redirect, robots.txt, the XML sitemap, duplicate pages, etc.
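For context, the robots.txt check looks at the plain-text file at the root of your domain that tells crawlers what they may visit. A minimal example – the domain and path here are placeholders, not anything SE Ranking generates:

```
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

A missing or misconfigured robots.txt is one of the first things a technical audit flags, since it affects everything crawled afterwards.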

Page analysis

Tracking on-page errors: overly long URLs; pages blocked by robots.txt; pages that are too large; pages with a noindex meta tag, with rel="canonical", with rel="alternate", with a redirect, etc.
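As a reminder, the tags mentioned above all live in a page’s <head>. A quick illustrative snippet (URLs are placeholders) showing what the audit is detecting:

```html
<head>
  <!-- Tells search engines not to index this page -->
  <meta name="robots" content="noindex">
  <!-- Points crawlers at the preferred version of a duplicated page -->
  <link rel="canonical" href="https://example.com/page/">
  <!-- Declares an alternate language version of the page -->
  <link rel="alternate" hreflang="de" href="https://example.com/de/page/">
</head>
```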

Meta analysis

Checking titles and meta descriptions for uniqueness, compliance with character limits, and duplicates.
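For reference, these are the two tags being checked – the values below are made up, and the commonly recommended lengths (roughly 50–60 characters for titles, 150–160 for descriptions) are general SEO guidance, not fixed limits:

```html
<title>Website Audit Tool for SEO – Example Store</title>
<meta name="description" content="Run a technical SEO audit of your site:
  find crawl errors, duplicate pages, and meta tag issues in one report.">
```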

Content analysis

Audit of HTML headings (h1-h6), content volume and uniqueness.

Links analysis

Monitoring the inbound and outbound links for each page, with recommendations on using the nofollow attribute, anchor texts for key queries, etc.

Images analysis

Feedback on image alt texts and image optimization tips.
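Alt text is set directly on the img tag; a quick example (the file name and text are made up) of what the audit checks for:

```html
<img src="/images/audit-report-overview.png"
     alt="Website audit report showing errors grouped by section"
     width="1200" height="630">
```

Descriptive alt text helps both accessibility and image search, which is why a missing alt attribute is a standard audit warning.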

Optimization

Checking how well the mobile and desktop versions of the site are optimized against the latest search engine recommendations.

Usability and technologies

Checking for a branded favicon, correct markup, and a 404 error page on the site, as well as analyzing the site’s loading speed and security status.

Generating an XML sitemap

A new feature has been added to the “Report” section: you can now quickly generate an XML sitemap so that search engine crawlers can find the list of pages to be indexed. When generating a sitemap, you can choose the types of pages to include, specify the page change frequency, and set the priority for different crawl depths.

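The generated file follows the standard sitemap protocol from sitemaps.org. A minimal sketch with the change-frequency and priority fields mentioned above – the URLs are placeholders, not actual SE Ranking output:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/blog/post/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

In practice, pages closer to the root (lower crawl depth) typically get a higher priority value, which is exactly what the depth-based priority setting controls.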

The Crawled Pages Section

In this brand new section, you will find all of your site’s crawled pages, external links, and images, as well as their analysis against the most important SEO parameters.

Crawled pages

Analyzing each page separately ensures that not a single warning is missed. If an error is found on a page, the platform highlights the parameter that needs to be fixed.


The “Crawled pages”, “External links” and “Crawled images” subsections now have filters that let you conveniently work with just the selection you need. For example, you can easily filter pages by a specific error type and work only with those pages. You can create filters for one or several parameters and export the results in the .xls file format.


External links

Here you will find all the links on your website that point to external resources, along with the results of their analysis against the following parameters: server response, presence of the nofollow attribute, anchor text, crawl depth, and the web page that links out to the external resource.

Crawled images

Here you will find all the images placed on your site, along with the results of their analysis against key parameters such as server response, alt text, size, and the web page where the image was found.

The Compare Crawls Section

Once two or more audits have been completed for a project, you can compare their results. You can see the points that improved and the ones that got worse.

You can choose the audit dates for comparison and see the trends of all analyzed indicators in an easy-to-interpret form.


The Settings Section

The “Settings” section gives you the freedom to create convenient crawling conditions, specify the audit frequency, limits and restrictions, upload your own lists of pages to be audited, etc.

Schedule

Here you can create a schedule that will tell the platform when to run audits.

The following frequency settings are available: weekly, monthly, or manual (i.e., you restart the audit yourself at any convenient time). You can also choose the audit date and time.


Source of pages for audit

Under settings, you can choose which pages the system should crawl:

  • all pages of your site, the way Google or Yandex bots would crawl them;
  • pages including or excluding subdomains;
  • only pages listed in the XML sitemap;
  • pages from an XML sitemap you upload;
  • your own list of pages uploaded in the .TXT or .CSV file format (if, for example, you need to crawl new pages or pages blocked in robots.txt).
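A manually uploaded page list is simply one URL per line. For instance, a .TXT file might look like this (the URLs are placeholders):

```
https://example.com/new-landing-page/
https://example.com/spring-campaign/
https://example.com/blocked-by-robots/
```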

Rules for scanning pages

You can select from predefined rules for crawling your web pages or create your own. For example, you can specify whether to take robots.txt directives into account, or whether to ignore certain URL parameters. Here you can exclude all URL variable values or set the exclusion parameters yourself.
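Ignoring URL parameters matters because the same page is often reachable under many parameterized addresses, which would otherwise each be crawled as a separate page. For example, these placeholder URLs all resolve to one page:

```
https://example.com/shoes/
https://example.com/shoes/?utm_source=newsletter
https://example.com/shoes/?sessionid=a1b2c3&utm_source=newsletter
```

Excluding tracking and session parameters keeps such variants from inflating your crawled-page count and duplicate-content warnings.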


Parser settings

In this section, you can choose a crawling bot, as well as provide access to pages that are blocked for web crawlers.


Limits and restrictions

You can set the maximum crawling depth, the number of requests per second, and the number of web pages to be crawled according to your data plan.

You can set different limits for each site under your account.

Report setup

When auditing website parameters, SE Ranking relies on current search engine recommendations. In the “Report setup” section, you can change the parameters the platform takes into account when crawling sites and compiling reports – for example, the maximum length of the title meta tag or the maximum number of redirects.


Run an audit of your sites in the new “Website audit” module and let us know what you think in the comments below.

 

