We’ve scanned over 80,000 websites with our Website Audit tool within the last 12 months. Upon analyzing the reports on each one of them, we compiled a list of the most common SEO issues.
While some of these issues can harm your SEO, others are more like reminders that there are things you should take care of on your site.
This post is for everyone who prefers to learn from other people’s mistakes.
Check out what problems you most likely have on your website and learn how to fix them.
How we did the research and the results we got
87,028 websites — the exact number of websites we analyzed in our research. All of them were scanned within the past year, so all the numbers are fresh and relevant.
But I want to note that, in our research, we counted each issue only once per website. So, if a website has the same issue on multiple pages, it’s included in the stats only once.
For this reason, our data reflects the “popularity” of tech problems among the analyzed websites, regardless of their region, business niche or size.
We’ve decided to merge all tag-related issues under one roof, because usually if you have problems with one tag, you have problems with all of them.
This is raw data that desperately needs detailed commentary. Below, you’ll find an explanation along with tips on every issue that appears in our stats (and most likely appears on your website too).
#1. Images with missing alt text
The most common SEO problem detected by our Website Audit is the missing alt tag. So, why is it a problem at all?
The thing is, while Google is very good at understanding the text content of a page, it’s still quite a challenge for it to figure out the images.
The alt text (alternative text) is an HTML attribute designed to explain the meaning of images to search robots as well as make them more accessible to users.
In 2018, Google Images’ search was the second biggest search channel, falling right behind the classic Google SERP. In fact, the alt text is one of the main elements of image optimization and your ticket to getting your pictures associated with the proper keywords. This is especially important for eCommerce websites, since each offered item is illustrated with an image.
In other words, alt texts help search engines better understand and rank your images.
The tag also comes in handy for users when the picture doesn’t load up for some reason, and also for people with impaired vision who have trouble perceiving visual information.
Then why do so many websites have issues with missing alt tags? Because adding them takes extra time and effort. Most website owners simply overlook this issue because lots of CMSs don’t provide the option of adding alt texts by default. In such cases, you need to add the tag to the HTML code of your pages manually.
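One way to find images without alt text is to scan your pages’ HTML yourself. Here’s a minimal sketch using Python’s standard html.parser — the sample markup and file paths are, of course, hypothetical:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> with a missing or empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # alt is absent or empty
                self.missing.append(attrs.get("src", "(no src)"))

html = """
<img src="/shoes.jpg" alt="Red running shoes">
<img src="/bag.jpg">
<img src="/hat.jpg" alt="">
"""

checker = MissingAltChecker()
checker.feed(html)
print(checker.missing)  # → ['/bag.jpg', '/hat.jpg']
```

Note that the checker flags empty `alt=""` too; for purely decorative images an empty alt is actually fine, so treat the output as a to-review list, not a to-fix list.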
#2. H-tags up to funny business
Although Google claims that even the h1 tag is not that critical for SEO, users still appreciate h-tags, since they help readers scan content and enhance the reading experience.
The h1 tag helps both search engines and users understand at a glance what the page is about, while h2 tags structure the content of the page and make it easier to scan within several seconds. Your blog articles or product landing pages will most likely require h1 and h2 tags.
Also, pay attention to duplicate h1 tags. They can confuse search engines or make your pages compete for the same keyword. But if your h1 tags duplicate headings of supporting pages or pages of different language versions — it’s definitely not something to worry about.
Here’s the main takeaway from h-tags: Everything that makes your pages better for users makes a difference to Google. Besides, properly optimized h-tags help Google extract information from your page for Featured Snippets.
So, when you see h-tag problems on your website, the first thing you should look at is the page that has these issues. If it’s a target page in your SEO strategy, then, of course, you need to take “health measures” to keep your h-tags spotless, as in no empty tags, no duplicates, and properly optimized.
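To spot empty or duplicate h1 tags yourself, you can collect every h1’s text and compare. A minimal sketch using Python’s standard html.parser (the sample page is made up):

```python
from html.parser import HTMLParser

class H1Auditor(HTMLParser):
    """Collects the text of every <h1> to spot empty or duplicate headings."""
    def __init__(self):
        super().__init__()
        self.h1s = []
        self._in_h1 = False

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self._in_h1 = True
            self.h1s.append("")

    def handle_endtag(self, tag):
        if tag == "h1":
            self._in_h1 = False

    def handle_data(self, data):
        if self._in_h1:
            self.h1s[-1] += data.strip()

page = "<h1>Best Trail Shoes</h1><p>...</p><h1>Best Trail Shoes</h1>"
auditor = H1Auditor()
auditor.feed(page)
empty = [h for h in auditor.h1s if not h]
duplicates = len(auditor.h1s) != len(set(auditor.h1s))
print(len(auditor.h1s), empty, duplicates)  # → 2 [] True
```

Two identical h1 tags on one page, as in this sample, is exactly the duplicate case worth reviewing; an h1 count of zero would mean the tag is missing altogether.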
#3. Pages with a low word count
64 percent of the analyzed websites have pages with fewer than 250 characters of text. Of course, website types differ — gallery-style websites, for example, won’t have much text content on most of their pages, just a caption for each image. Just take a look at the websites of Founded agency, Awwwards or CSSwinner. They get the job done with a minimal amount of text on their homepages.
But 64 percent of websites cannot all be galleries. So, this can only mean that most of these websites have empty pages (or almost empty). Why would you need a page containing nothing? Even if your main content is multimedia files, they still need some text to accompany them, because that’s what search engines understand best.
If you have pages with a low word count, check whether you need them at all. And if you do, expand their content to get a higher ranking.
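If you want a quick self-check before deciding what to expand, a rough visible-word count is enough. A minimal sketch (the sample page is hypothetical, and the regex-based tag stripping is a simplification, not a full HTML renderer):

```python
import re

def visible_word_count(html: str) -> int:
    """Rough word count: drop script/style blocks, strip tags, split on whitespace."""
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return len(text.split())

page = "<html><body><h1>Gallery</h1><p>Just a caption.</p></body></html>"
count = visible_word_count(page)
print(count)  # → 4
```

Run this over your pages and sort by count — the pages at the bottom of the list are the ones to either expand or remove.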
#4. Missing or empty meta description
Google says that while it sometimes uses the description meta tag for snippets, it doesn’t use it in rankings.
Should we then forget about it? No! We should focus on the “uses the description meta tag for the snippets” part.
Since it’s largely used for snippets in Google’s SERP, the meta description affects the CTR of your links featured in the search results. If optimized properly, the main keywords will be highlighted in the description, which signals to the user: this page is what you are looking for.
Neil Patel summed it up the right way: Stop thinking about meta descriptions as a ranking factor, and start thinking about them as a conversion factor.
Using plugins helps you remember to add a description tag to the pages you are promoting. For example, on WordPress websites, you can use the Yoast plugin. It adds a snippet editor to your admin panel, where you can check whether the length of your description is fine and whether it does, in fact, include the target keyword.
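If you’re not on WordPress, the same check is easy to script. A minimal sketch with Python’s standard html.parser — the sample head markup and the 160-character limit are assumptions (Google truncates snippets by pixel width, not a fixed character count):

```python
from html.parser import HTMLParser

class MetaDescriptionChecker(HTMLParser):
    """Records the content of <meta name="description"> if the page has one."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "description":
                self.description = attrs.get("content", "")

head = '<head><meta name="description" content="Hand-made leather bags, free shipping."></head>'
checker = MetaDescriptionChecker()
checker.feed(head)
ok = bool(checker.description) and len(checker.description) <= 160
print(checker.description, ok)
```

A `None` description means the tag is missing entirely; an empty string means it’s there but blank — both cases end up in our “missing or empty meta description” stat.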
#5. Pages with no inbound internal links
It’s quite worrying to see that almost half of the analyzed websites (42 percent) have pages with no links connecting them to other pages of the same website, because internal links serve three important goals:
- They enable users to navigate your website. No inbound internal links to the page — no chance the users will find it.
- They help structure the website content hierarchy. By leading users link by link from page to page, you are pushing them to follow your predefined conversion path.
- They help distribute link juice across the website.
Having pages without internal links can be justified, for example, if you create a landing page for a particular promo campaign such as a Black Friday or Christmas sale. Such pages only exist for the duration of the sale. As a rule, people find such a landing page by following a link posted on social media or sent in a newsletter, so there’s no need for links leading to it from other website pages.
Other than that, it’s simply wrong to have isolated pages from the SEO perspective. Search engines won’t find them while scanning your website, and users won’t get there when browsing your site.
The question is, why would you need to have a page like this then?
If the Website Audit identifies isolated pages on your website, the first thing you have to do is decide whether the page is valuable. If so, revise your website structure and figure out the best way to build links to this page from other sections and pages of your website. By “the best way” I mean the way users and search engines can get to the page without applying extra effort.
If there’s no point in having the page, it’s better to delete it altogether.
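Finding such isolated (orphan) pages is a simple graph check once you have a list of your pages and the internal links between them. A minimal sketch — the page list and links are made up, and excluding the homepage from the check is an assumption:

```python
def orphan_pages(pages, links):
    """pages: set of URL paths on the site; links: (from_path, to_path) pairs.
    Returns pages with no inbound internal links, homepage excluded."""
    linked_to = {to for _, to in links}
    return sorted(p for p in pages - linked_to if p != "/")

pages = {"/", "/shop", "/blog", "/old-promo"}
links = [("/", "/shop"), ("/", "/blog"), ("/shop", "/")]
print(orphan_pages(pages, links))  # → ['/old-promo']
```

In practice, the page list would come from your sitemap or CMS export, and the link pairs from a crawl — the audit tool does both steps for you.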
#6. External links with 4xx status
In most cases, a link with a 4xx status is a link that leads to a page with a 404 error, meaning it’s a broken link. This usually happens when the destination website removes the linked page or changes its URL without redirecting users to the new one. As a result, you have a link leading to nowhere, which is ridiculously annoying for users. Google also won’t appreciate you sending users to dead pages — if you have plenty of broken outbound links, it signals to search engines that your content may be outdated.
The thing is, if you don’t monitor your outbound link status, you won’t notice a dead link. So the best thing to do is to run a Website Audit regularly. You can choose how often you want to automatically get audit reports — this will help you spot broken links before users start complaining. And the final step: Remove the link or change it.
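A rough self-check between audits is easy to script with Python’s standard library. In the sketch below, `fetch_status` is a hypothetical helper for live checks, while the demo runs on made-up sample statuses rather than real requests:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def fetch_status(url: str, timeout: int = 10) -> int:
    """Returns the HTTP status of a URL via a HEAD request (0 on network failure)."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code
    except URLError:
        return 0

def broken_links(statuses: dict) -> list:
    """Filters a {url: status} map down to the 4xx links."""
    return sorted(u for u, s in statuses.items() if 400 <= s < 500)

# Sample statuses instead of live requests:
statuses = {"https://example.com/ok": 200,
            "https://example.com/gone": 404,
            "https://example.com/forbidden": 403}
print(broken_links(statuses))  # → ['https://example.com/forbidden', 'https://example.com/gone']
```

One caveat: some servers answer HEAD requests differently than GET, so a live checker may need to retry 405 responses with a GET.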
If right now you’re feeling a bit frustrated with the websites you link out to (why in the world do they keep error pages up without redirects?!) — hold on. There’s a big chance you also have 404 pages that you don’t know about.
#7. Pages with 4xx response
Over a third of the analyzed websites (38 percent) have pages with a 4xx response. In most cases, a 4xx response appears for a 404 error, and having pages with the 404 error is not that terrible in itself. However, you should take care of every internal link to such pages. Linking out to dead pages within the website is much worse than having broken outbound links.
An internal broken link is one of the Four Horsemen of the “bounce rate” Apocalypse, so you need to either delete links to pages with the 404 error or set up a 301 redirect.
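To sketch the redirect option: a 301 setup boils down to a map from removed paths to their replacements. The paths and the `resolve` helper below are hypothetical — in practice you’d configure this in your server or CMS rather than in application code, but the logic is the same:

```python
# Hypothetical redirect map: old (removed) paths -> their replacements.
REDIRECTS = {
    "/old-pricing": "/pricing",
    "/summer-sale-2019": "/sales",
}

def resolve(path: str):
    """Returns (status, location): 301 if the path was moved, else serve as-is.
    (Simplified: assumes every unmapped path still exists.)"""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/old-pricing"))  # → (301, '/pricing')
print(resolve("/pricing"))     # → (200, '/pricing')
```

The 301 status matters: it tells search engines the move is permanent, so the old URL’s ranking signals are passed to the new one.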
As was mentioned above, a page with a 404 response doesn’t harm your SEO directly, it’s all about links and user comfort. And to reduce user frustration as a result of seeing an error page, you should make your 404-page look cute!
There are lots of design options to do that, so set your imagination free 🙂
#8. Frame is present
The problem is that most site owners don’t even know what a frame is, even though 38 out of 100 websites have one.
A frame is an HTML element that lets you display the content of another website on your own web page. In other words, it embeds one page within another. The main purpose of the tag is to reuse a piece of another website’s content without duplicating it — for example, when embedding a YouTube video, a Twitter post (e.g. this BuzzFeed post has lots of elements embedded via iframe), GIFs, or even a PDF file.
SlideShare uses iframes for presentations and Gmail uses the iframe-tag for many things, for example, to embed Hangout chats. Besides, with frames, we can insert tools for user analytics (e.g. Google Tag Manager) or dynamic maps into our pages.
The key downside of iframes is that you don’t control them — neither how the content behaves nor how secure it is. Your page’s CSS doesn’t apply to content within the iframe. So, if you embed content from a malicious website, you can jeopardize not only that particular page but your entire website.
Besides, having a frame tag means having several URLs on the same page, which can be confusing for search engines. Google stated that it doesn’t guarantee that they associate framed content with the page containing the frame.
Finally, framed content is terrible for user experience. The look and style of such elements are not adjustable, so most of them look awkward on the page. On top of that, framed content is not flexible for users — they can’t resize it and have to interact with it within a small framed region on the page. It’s also hardly adaptable for mobile users, which is no longer tolerated by search engines.
Our Website Audit is smart and knows that not all frames are actual errors, so it doesn’t count iframes with URLs from YouTube, Facebook, Vimeo, and Google Tag Manager as mistakes.
If the audit found frames on your website, don’t freak out. Just check to see if the tags embed secure URLs and look decently on your page. But in the future, try to avoid using frames. Include them only if there’s no other option.
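You can run a similar check yourself by comparing iframe src hosts against an allowlist. A minimal sketch — the `TRUSTED_HOSTS` set here is an illustrative assumption, not our tool’s actual list, and the sample page is made up:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Hosts this hypothetical check treats as safe embed sources:
TRUSTED_HOSTS = {"www.youtube.com", "player.vimeo.com", "www.facebook.com"}

class IframeChecker(HTMLParser):
    """Collects iframe src URLs that point outside the trusted hosts."""
    def __init__(self):
        super().__init__()
        self.untrusted = []

    def handle_starttag(self, tag, attrs):
        if tag == "iframe":
            src = dict(attrs).get("src") or ""
            if urlparse(src).netloc not in TRUSTED_HOSTS:
                self.untrusted.append(src)

page = ('<iframe src="https://www.youtube.com/embed/abc123"></iframe>'
        '<iframe src="https://sketchy.example/widget"></iframe>')
checker = IframeChecker()
checker.feed(page)
print(checker.untrusted)  # → ['https://sketchy.example/widget']
```

Anything in the untrusted list deserves a manual look: either the host should be added to your allowlist or the embed should go.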
#9. No redirect from HTTP to HTTPS page version
Since Google announced HTTPS as a ranking signal, lots of websites have moved to the secure protocol. At first, it was critical for websites storing and dealing with sensitive information (such as users’ personal data, credit card details, etc.). Then Google started marking HTTP websites as non-secure, which pushed more and more websites to transition to the secure version.
Today, statistics show that a third of websites have trouble redirecting users from HTTP to HTTPS. And why is this a big deal?
The thing is that if you don’t set a correct redirection, you won’t get the boost promised by search engines for HTTPS websites. Besides, there’s a chance that your pages will lose their ranking positions in the SERP.
So, what you should do is find every HTTP page you forgot to redirect to HTTPS. Google strongly recommends using 301 redirects on a URL by URL basis. It makes your migration plans clear to the search giant and shows that you’re doing everything the same way on your secure version as on the HTTP one: you have the same content, you noindex the same pages, and set the same rules in the robots.txt file for both versions. This is what Google calls a clean migration.
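If you want to verify your redirects by hand, the “clean” per-URL check boils down to three conditions. A minimal sketch — the function name and sample URLs are assumptions, and in a live check the status and Location header would come from real HTTP responses:

```python
from urllib.parse import urlparse

def is_clean_https_redirect(http_url: str, status: int, location: str) -> bool:
    """A 'clean' migration redirect: the HTTP URL answers 301 and points to
    the same path on the https:// version of the same host."""
    if status != 301:
        return False
    src, dst = urlparse(http_url), urlparse(location)
    return (dst.scheme == "https"
            and dst.netloc == src.netloc
            and dst.path == src.path)

print(is_clean_https_redirect("http://example.com/shop", 301,
                              "https://example.com/shop"))  # → True
print(is_clean_https_redirect("http://example.com/shop", 302,
                              "https://example.com/shop"))  # → False
```

A 302 fails the check on purpose: a temporary redirect doesn’t tell Google the move is permanent, which is exactly what a migration needs to communicate.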
#10. Missing XML sitemap
An XML sitemap is a file with a list of a website’s URLs and information about how they are related to each other. It helps search engines spot every page on your website, even isolated ones or those without links from other sites.
Technically, you don’t need an XML sitemap to get your pages indexed because search engines scan your website link by link. But there’s always a “but”.
First, are you sure your pages are linked properly and that Google will be able to find all of your pages by scanning them link by link?
Well, an XML sitemap is a way to introduce all of your pages to search engines. Even if your website is a million-page e-commerce portal, it becomes an open book for Google if it has a sitemap.
Also, if your site is full of media content (images, videos) that you want to get indexed, adding them to your XML sitemap will speed up the process.
If our Website Audit found that your website is missing an XML sitemap, you should generate one and submit it to Google. The former you can do right in our Website Audit panel with a single click.
Note that our audit bot searches for the sitemap by looking in your robots.txt file and by adding sitemap.xml to your domain. It can miss the sitemap if you keep it in a format other than XML or at a non-traceable URL.
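For reference, a bare-bones sitemap is simple enough to generate yourself. A minimal sketch with Python’s standard xml.etree — the URLs are placeholders, and a real sitemap would typically also carry lastmod dates:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Builds a minimal XML sitemap (loc entries only) for a list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap(["https://example.com/", "https://example.com/shop"])
print(sitemap_xml)
```

Save the output as sitemap.xml at your site root, reference it in robots.txt (`Sitemap: https://example.com/sitemap.xml`), and submit it in Google Search Console.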
Using the Website Audit tool is like running a regular checkup when you spot minor health problems and prevent them from developing into something bigger. Some issues require immediate reaction, some may have a deferred effect, but the worst mistake is always the one you don’t know about.
I strongly advise you to develop the habit of checking your website’s technical condition to make sure that none of the small mistakes mentioned here can harm your long-term efforts.