Guide to Google Penalties: Full list and ways to avoid them

Written by
Maria Yefimenko
May 30, 2021
20 min read

Google is constantly working on its algorithms to improve search quality and give users the best possible answers to their search queries. However, it would be naive to think that algorithms can do the whole job flawlessly. 

In addition to algorithms, Google also has a Search Quality team that manually reviews websites showing signs of unethical behavior. Deciding whether a particular website has violated the rules is this team's full-time job. Webmasters should therefore take seriously any notification about manual actions found on their site.

AUDIT YOUR WEBSITE
Check your website's quality and get suggestions on how to improve it to avoid penalties

In this article, we’ll go over all Google penalties and explain how to fix them to reverse the damage.

But first, let’s get the basics straight.

What is a Google Penalty

Google penalties are like penalties in professional sports: their main purpose is to punish sites that break the rules. When a website, whether intentionally or not, violates the quality guidelines, Google can use algorithms to downgrade it or issue a manual action.

Google update or penalty

The most common mistake is to think that algorithms and penalties are the same thing. For example, Panda and Hummingbird are not penalties but algorithms, which are based on instructions and complete a particular task automatically.    

Marketers often mistake algorithms for penalties for the following reason: when Google decides that a website doesn’t meet its quality criteria, the site’s rankings seem to be pushed down, which also happens when a website is penalized. However, Google hasn’t literally pushed your site down; it has simply rewarded other, higher-quality sites and scored yours fairly.

Unlike programs, people can deal with non-standard situations and are good at understanding context. So when algorithms mistakenly find or miss violations, humans can play a decisive role in determining whether a website plays fair. If a reviewer decides that particular pages violate the webmaster quality guidelines, Google issues a manual action against the site.

If a website has been flagged as one that tried to manipulate search results, it can be ranked lower or excluded from Google Search.

Algorithms vs Penalties

Although manual actions are considered to be quite precise in detecting spam or other bad behavior, they have one significant pitfall: they will remain unchanged until a reconsideration request is successfully processed. That’s why it is so important to treat these notifications seriously and immediately take action. 

Why Google Penalties exist

When a website’s behavior is clearly affecting search results, Google’s team steps in and takes action against the following bad SEO techniques:

  1. User-generated spam
  2. Spammy free host
  3. Structured data issue
  4. Unnatural links to your site 
  5. Unnatural links from your site
  6. Thin content with little or no added value
  7. Cloaking and/or sneaky redirects
  8. Pure spam
  9. Cloaked images
  10. Hidden text and/or keyword stuffing
  11. AMP content mismatch
  12. Sneaky mobile redirects
  13. News and Discover policy violations

Each penalty can affect websites differently and should be treated accordingly. Let’s first discuss how you can check if your site has any manual actions.

Find out if you have manual actions against your site

Google Search Console is the fastest way to find out if your site has been penalized. Besides being a very handy tool for checking a website’s performance and indexation issues, GSC is also a great penalty checker. If you haven’t added your site to Google Search Console yet, follow this step-by-step guide. Otherwise, choose your property and click on the Manual actions tab.

Manual actions report

If you have no issues, you’ll see the following cheerful green checkmark:

No issues detected

If your site has penalties, they will be displayed in red in the same window.

1 issue detected

So, how do you know if a penalty is affecting your site and your traffic?

Google Analytics will help you with this. Knowing the date when a manual action was issued, you can compare your rankings and search traffic before and after the penalty. Abrupt changes in search traffic or big fluctuations in your rankings around that date may well be caused by the manual action. That’s why it is recommended that you run Google keyword ranking checks on a regular basis to identify any notable drops in your ranking positions.
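
If you export your daily organic traffic from Google Analytics, a few lines of code are enough to quantify the drop. Below is a minimal sketch, assuming a hypothetical CSV export named ga_organic_sessions.csv with date and sessions columns (both names are illustrative), compared around an example manual action date.

```python
# Compare average daily organic sessions before and after a manual action date.
# The CSV file and its column names are hypothetical examples of a GA export.
import pandas as pd

ACTION_DATE = "2021-05-30"  # example date of the manual action notification

df = pd.read_csv("ga_organic_sessions.csv", parse_dates=["date"])
before = df[df["date"] < ACTION_DATE]["sessions"].mean()
after = df[df["date"] >= ACTION_DATE]["sessions"].mean()

print(f"Average daily sessions before: {before:.0f}")
print(f"Average daily sessions after:  {after:.0f}")
print(f"Change: {(after - before) / before:+.1%}")
```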

Changes in search traffic

Learn what Google Analytics is and how to set it up to track the impact of penalties on your site in our GA guide.

With the help of SE Ranking’s Analytics & Traffic tool, you can track your organic traffic, see how your rankings have changed compared to a selected period, and check the sources of your website traffic along with other metrics, such as the total number of sessions, page views, bounce rate, and more. All the details are conveniently placed on one page and visualized as graphs and tables. You can try this tool for free during the 14-day trial period.

SE Ranking organic traffic

Note that Google will notify you about a manual action via the Google Search Console message center. So if your search rankings drop but you don’t see any notifications, the cause is probably an algorithm.

Basic principles of Google’s Webmaster guidelines 

Google’s Webmaster Guidelines were created so that webmasters know what Google expects from a site in order to rank it. Any website owner or SEO specialist should regularly check these guidelines to make sure their websites are clean from Google’s perspective. There are general guidelines, which describe how to help Google find, index, and rank your site, and quality guidelines, which describe prohibited actions that can lead to manual spam actions. Let’s look at both of them.

General guidelines include three instructions: 1) how to help Google find your pages, 2) how to help Google understand your pages, 3) how to help visitors use your pages.

So basically the first tip is about making sure that crawlers and humans can reach any of your website’s pages. 

After making your site reachable, you should try to make the content as SEO-friendly as possible: that includes useful and quality content on pages, descriptive <title> elements, alt attributes, clear page hierarchy, etc. 

And the last thing is to make your pages user-friendly: see if all links go to real pages, optimize the page loading time, make your site adaptable to all kinds of devices, etc.
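
As a quick sanity check of the basics above, you can script a simple audit of a single page: does it have a descriptive <title>, do images carry alt attributes, and do its links lead to real pages? A minimal sketch using requests and BeautifulSoup; the URL is just a placeholder.

```python
# Quick check of the basics mentioned above: a descriptive <title>, alt
# attributes on images, and links that resolve to real pages.
# The URL is just a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

url = "https://www.example.com/"
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

title = soup.find("title")
print("Title:", title.get_text(strip=True) if title else "MISSING")

for img in soup.find_all("img"):
    if not img.get("alt"):
        print("Image without alt text:", img.get("src"))

for a in soup.find_all("a", href=True):
    link = urljoin(url, a["href"])
    if link.startswith("http"):
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
        if status >= 400:
            print("Broken link:", link, "->", status)
```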

Go to Webmaster Guidelines to check the updated information.

Google also advises the following basic principles of quality guidelines:

  • Optimize your pages for users in the first place, then for search engines.
  • Don’t try to trick your users.
  • Avoid unethical methods to improve rankings. 
  • Make your website different from your competitors and valuable for users. 

General steps to fix a manual action on your site

After receiving a notification about identified issues, you will see a list of steps to take to resolve the particular problem. These recommendations differ depending on the penalty.

To see this information, open the manual action description and check which pages are flagged. Then click Learn more to read about the issue in detail and find steps to resolve it. 

If you had multiple manual actions, make sure you have fixed all of the issues. Also, allow Google’s crawlers to access the affected pages: check that they are not hidden from indexing with a noindex directive in the robots meta tag or the X-Robots-Tag HTTP header. With the help of the URL Inspection tool, you can easily check whether Google can index your pages.
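
If you prefer to double-check this outside of Search Console, a short script can look for both kinds of noindex signals. A minimal sketch, assuming the affected page's URL; it only inspects the response headers and the robots meta tag.

```python
# Check whether a page blocks indexing via a robots meta tag or the
# X-Robots-Tag HTTP header. The URL is just a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/affected-page"
response = requests.get(url, timeout=10)

# X-Robots-Tag is sent as an HTTP response header
header = response.headers.get("X-Robots-Tag", "")
if "noindex" in header.lower():
    print("Blocked by X-Robots-Tag header:", header)

# The robots meta tag lives in the page's <head>
soup = BeautifulSoup(response.text, "html.parser")
meta = soup.find("meta", attrs={"name": "robots"})
if meta and "noindex" in meta.get("content", "").lower():
    print("Blocked by robots meta tag:", meta.get("content"))
```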

To identify the problem, you can export the data over long spans of time and see when the decline took place. Then, find URLs or groups of URLs that lost traffic. With this information, you can analyze what happened before the decline and what actions could have resulted in penalties.

URL Inspection tool

When you’ve eventually fixed all the issues, contact Google to review the manual action and revoke the penalty—this is called a reconsideration request. Before doing this, make sure that you can provide supporting documents that prove your actions and intentions. 

For example, if you were buying paid links, you should not only remove and disavow them but also convince Google that such actions will never happen again and that you won’t return to spamming. To do this, provide the analyst team with the detailed steps you’ve taken to make your site compliant with Google’s Webmaster Guidelines.

How long to wait for a manual action removal

Unlike algorithms, Google penalties last until you take care of them. Once you have resolved the issues and submitted a reconsideration request, the final decision may take from a couple of days to several weeks. Google will notify you via email when they start looking into the matter and when they complete the review.  

Google’s analyst John Mueller has also mentioned that once a manual action is revoked, the website is instantly free of the penalty. However, it can take a while for Google Search to reprocess the site and reflect the change.

Full list of Google Penalties and how to fix them

If Google thinks that you violated any of its guidelines, some of your pages will get a manual action. Depending on what rule has been broken, you might get one of the following notifications.

User-generated spam

We bet you’ve seen plenty of annoying and inappropriate posts in forums or blog comments that are actually hidden advertisements. This type of content can be generated automatically or manually and is submitted by site visitors.

How to fix

  1. Find pages on your site where users could add spam content.
  2. Find profiles with suspicious usernames (for example, commercial names) and posts that are not related to the page content: advertisements, automatically generated posts, etc.
  3. Search your website for spammy keywords that are unrelated to your site’s niche. For example, search Google for [site:website.com casino] to find the word “casino” on your pages (see the sketch after this list).
  4. Delete all off-topic content.
  5. Go to the Manual Actions report and select Request Review.
  6. Check your messages for the review status.
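
To automate step 3 for a handful of pages, you can fetch them and scan for off-topic terms. A minimal sketch; the page URLs and the keyword list are purely illustrative.

```python
# Scan a few pages for spammy, off-topic keywords (step 3 above).
# Page URLs and the keyword list are illustrative examples.
import requests

pages = [
    "https://www.example.com/blog/post-1",
    "https://www.example.com/blog/post-2",
]
spam_keywords = ["casino", "viagra", "payday loan"]

for url in pages:
    text = requests.get(url, timeout=10).text.lower()
    found = [kw for kw in spam_keywords if kw in text]
    if found:
        print(f"{url}: possible spam keywords {found}")
```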

Spammy free host

Websites that use free hosting services often attract a lot of spam users and advertising comments. If a large portion of the pages on a free web host is spammy, Google may issue a manual action against all the websites that use this hosting service.

How to fix

  1. Check your free hosting service for spam signals. That can be a lot of ad sections, redirects, spammy keywords, etc.
  2. Delete any spammy profiles from your service.
  3. Report the manual action to the technical team at your hosting service.
  4. If other sites on the hosting server use spammy techniques, it’s better to choose a more secure and reliable service.
  5. Go to the Manual Actions report and select Request Review.

Structured data issue

Structured data is “behind-the-scenes” markup that describes a page’s content in a machine-readable way and tells the search engine what’s on the page.

Rich snippets are one example of what structured data enables. If you want your article or video to get more visual appeal in search, you can use structured data to describe what is on your page. Still, if a website uses structured data to mislead visitors, that may result in a manual action: for example, when an online shoe shop provides one price in the structured data and a different one on the page itself, or when a website displays rating scores and stars but doesn’t have a voting system anywhere.
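
One way to catch the price mismatch scenario described above is to compare the price declared in the page’s JSON-LD Product markup with the price rendered for visitors. A minimal sketch; the URL and the CSS class of the visible price element (product-price) are assumptions made for illustration.

```python
# Compare the price declared in JSON-LD Product markup with the price shown
# to visitors. The URL and the "product-price" class are illustrative.
import json
import requests
from bs4 import BeautifulSoup

url = "https://shop.example.com/shoes/model-x"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# Price declared in the structured data
markup_price = None
for script in soup.find_all("script", type="application/ld+json"):
    if not script.string:
        continue
    data = json.loads(script.string)
    if isinstance(data, dict) and data.get("@type") == "Product":
        markup_price = data.get("offers", {}).get("price")

# Price rendered for users (assumed to sit in an element with this class)
visible = soup.find(class_="product-price")
visible_price = visible.get_text(strip=True) if visible else None

print("Structured data price:", markup_price)
print("Visible price:       ", visible_price)
```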

How to fix

  1. Make sure that you do not violate the General structured data guidelines and check the following:
  • all content is visible to readers
  • there is no irrelevant or misleading content
  • your pages don’t contain content that engages in illegal activities
  • the structured data truly represents the page content
  2. Remove any markup that violates the guidelines.
  3. Go to the Manual Actions report and select Request Review.

Unnatural links to your site

Link schemes are aimed at manipulating search engines so that websites get better rankings. However, if Google discovers that your site has engaged in any spammy link tactics, it can negatively impact your rankings. Deceptive techniques include paying for links, excessive link exchanges, using automated programs to generate links to your site, and more. Please see the guidelines on link schemes for more details.

How to fix

  1. Go to Google Search Console and download the list of links to your site: Top linking sites > Export.
Top linking sites
  2. Go through this list and see if any of the links violate Google’s guidelines. For large sites, first look at the properties that link to you the most, or at recently created links.
  3. After discovering all the unnatural links, contact the webmasters of the linking sites and ask them to remove the bad links. If you cannot get certain links removed, you can use the Disavow links tool in GSC (see the sketch after this list). Note that it’s much better to remove the links than to add them to a disavow file, as that gives you a better chance of getting the manual action revoked.
  4. After removing or disavowing the links, choose Request Review and add the list of the links you’ve deleted as well as an explanation of why you could not delete certain links.
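
If you end up disavowing, the file you upload is plain text: lines starting with # are comments, whole domains are prefixed with domain:, and individual URLs are listed as-is. A minimal sketch that assembles such a file; the domains and URLs below are made up.

```python
# Build a disavow.txt file in the format the Disavow links tool expects:
# "#" for comment lines, "domain:" for whole domains, full URLs otherwise.
# The domains and URLs below are made-up examples.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["https://random-blog.example/comment-spam-page.html"]

lines = ["# Links we could not get removed manually"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```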

You can also identify low domain rating sites and spam domains that link to your site with the help of SE Ranking’s Backlink Checker tool. Here, you can filter URLs by domain and quickly see how many of them come from a particular website. Domain Trust, anchor texts, and other metrics will help you spot potentially spammy sites.

Backlink Checker
CHECK ANY SITE'S BACKLINKS
Check which domains link out to any website and get ideas for enhancing your backlink profile.

Unnatural links from your site

This manual action is applied to properties that have an extreme volume of outbound links to other websites or use deceptive anchor text.

How to fix

  1. Find links on your site that violate Google’s linking guidelines, such as paid links. You can export the full list of links with the help of the Website Audit tool. Select your project and go to Website Audit > Found Links. Here you can choose the External tab to see all the links that point from your site and filter them by status code, link type (image, CSS, hyperlink, etc.), and other parameters.
Found links
  2. Remove these links or change them by adding rel="nofollow" (see the sketch after this list).
  3. After taking these steps, select Request Review in GSC and describe your actions: what bad links you’ve removed and what quality content you’ve added.
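
To spot outbound links that still pass link equity, you can list every external anchor on a page that lacks rel="nofollow". A minimal sketch using requests and BeautifulSoup; the URL is a placeholder.

```python
# List outbound links on a page that do not carry rel="nofollow".
# The URL is just a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

url = "https://www.example.com/blog/sponsored-post"
site_host = urlparse(url).netloc
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    host = urlparse(a["href"]).netloc
    if host and host != site_host:   # external link
        rel = a.get("rel") or []     # BeautifulSoup returns rel as a list
        if "nofollow" not in rel:
            print("Followed external link:", a["href"])
```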

Thin content with little or no added value

Thin content, or low-quality pages, add little or no value to users. For example, a manual action may be issued against sites that publish automatically generated texts, poor-quality guest posts, republished content, etc. So hiring someone to churn out several badly written articles is not a good idea and can get you penalized.

How to fix

  1. Check your content for plagiarism.
  2. Read the information on these web pages and think if it brings any value to users. 
  3. Look for thin content that contains affiliate links and auto-generated posts.
  4. After identifying these pages, try to evaluate them and think about how you can improve the quality of the content. 
  5. Change your pages so that they become more valuable for visitors. 
  6. Choose Request Review and send examples of the low-quality texts you removed and the useful articles you added.

Cloaking and/or sneaky redirects

Showing different content to users and to search engines is not considered good practice and can lead to penalties. Your pages shouldn’t contain invisible or hidden text, show a Flash-only page to users while serving text to crawlers, or redirect users to spam domains.
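
A rough way to spot user-agent-based cloaking is to request the same URL as a regular browser and as Googlebot and compare the responses. The sketch below does just that; the URL is a placeholder, and cloaking keyed on IP addresses or JavaScript will not be caught this way.

```python
# Rough check for UA-based cloaking: fetch the same page with a regular
# browser User-Agent and with Googlebot's User-Agent, then compare.
# The URL is a placeholder; sophisticated cloaking won't be caught this way.
import requests

url = "https://www.example.com/flagged-page"
browser_ua = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
googlebot_ua = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}

as_user = requests.get(url, headers=browser_ua, timeout=10).text
as_bot = requests.get(url, headers=googlebot_ua, timeout=10).text

if as_user != as_bot:
    print("Responses differ: the page may be cloaking "
          f"({len(as_user)} vs {len(as_bot)} characters of HTML)")
else:
    print("Responses are identical for both user agents")
```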

How to fix

  1. Use the URL Inspection tool in Google Search Console to check the pages that were reported as cloaked or as redirecting users. It lets you see whether Google can access a page and which content isn’t showing up as expected.
  2. Visit your site and compare what you see with what is shown by the URL Inspection tool.
  3. Remove parts of your site that show different content to search engines and users. 
  4. Find and remove URLs that bring users to spam sites.
  5. Go to the Manual Actions report and select Request Review.

Pure spam

Pure spam is considered the severest manual action of all, as it is reserved for the spammiest websites engaged in “black hat” practices, such as churn-and-burn SEO. Even if just one page contains automatically generated content, Google issues this manual action against the entire site. Recovering from Pure Spam takes tremendous effort: you need to make sure nothing is left on the site that could violate the Webmaster Guidelines. Basically, it’s like creating a brand new site with high-quality content.

How to fix

  1. Check the quality guidelines and update your site by removing any signs of spam: automatically generated content, link schemes, pages with no original content, sneaky redirects, and other aggressive practices.
  2. When you file a reconsideration request, describe all the steps you’ve taken to clean up your site. For example, if you bought the site from someone else and only then discovered the spam, describe the situation in detail so that Google can investigate it.

Cloaked images

Image cloaking can take different forms: for example, some of your pages may serve images that are covered with a block of text or another image, or serve different images to the search engine and to visitors.

How to fix

  1. Check if your website shows the same pictures to visitors and in search results. 
  2. If you are sure that your images are the same when looking in Google Search and directly on your site, then submit a reconsideration request.  

Hidden text and/or keyword stuffing 

This manual action is issued against sites that try to manipulate Google’s algorithms by “stuffing” a page with keywords to get it ranked. This is called keyword stuffing, and Google strictly prohibits it. Another unethical practice is adding content for crawlers that is hidden from users.
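
One quick, if limited, way to hunt for hidden text is to flag elements that carry inline styles such as display:none or font-size:0 but still contain text. A minimal sketch; it doesn’t parse external CSS, and the URL is a placeholder.

```python
# Flag elements hidden with inline styles (display:none, visibility:hidden,
# font-size:0) that still contain text. Inline styles only; external CSS
# is not checked. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/flagged-page"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

hidden_markers = ("display:none", "display: none",
                  "visibility:hidden", "visibility: hidden",
                  "font-size:0", "font-size: 0")

for tag in soup.find_all(style=True):
    style = tag["style"].lower()
    text = tag.get_text(strip=True)
    if text and any(marker in style for marker in hidden_markers):
        print("Hidden text candidate:", text[:80])
```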

How to fix

  1. Find content that is visible to crawlers but hidden from visitors with the URL Inspection tool.
  2. Select all the text on a page by pressing Ctrl + A (Command + A on Mac) to reveal the invisible content.
  3. Either delete this content or make it visible to both humans and the search engine.
  4. Go to your site and check the main content, <title> tags, and alt text for repeated keywords or phrases that don’t make sense.
  5. Delete such words and make sure there is no keyword stuffing left on your pages.
  6. Submit a reconsideration request.

AMP content mismatch

AMP (Accelerated Mobile Pages) is a framework designed by Google to help develop pages that, unlike regular mobile pages, are fast, user-friendly, and high-performing across all devices.

These pages should contain the same essential information as their canonical web pages. Even if the text is not identical, make sure the topic remains the same. An AMP page should also allow users to accomplish the same tasks as the canonical page.
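
A first thing worth verifying is the pairing itself: the AMP page should declare the right canonical URL, and the canonical page should link back to the AMP version via rel="amphtml". A minimal sketch; the URLs are placeholders.

```python
# Verify that an AMP page declares the right canonical URL and that the
# canonical page links back to the AMP version. The URLs are placeholders.
import requests
from bs4 import BeautifulSoup

def find_link(soup, rel_value):
    """Return the href of the first <link> whose rel contains rel_value."""
    for link in soup.find_all("link", href=True):
        if rel_value in (link.get("rel") or []):
            return link["href"]
    return None

amp_url = "https://www.example.com/article/amp/"
amp_soup = BeautifulSoup(requests.get(amp_url, timeout=10).text, "html.parser")

canonical_url = find_link(amp_soup, "canonical")
print("AMP page declares canonical:", canonical_url)

if canonical_url:
    canon_soup = BeautifulSoup(requests.get(canonical_url, timeout=10).text, "html.parser")
    print("Canonical links back to AMP:", find_link(canon_soup, "amphtml") or "MISSING")
```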

How to fix

  1. Check if the AMP is referencing the right canonical page.
  2. Make the content of a canonical page and AMP similar. 
  3. Make sure that users and Google see the page in the same way. Use the URL Inspection tool to check the AMP and the canonical page. 
  4. After fixing the issues, choose Request Review.

Sneaky mobile redirects

Redirecting mobile users to different content worsens the user experience and violates Google’s guidelines. These can be redirects to other sites that Google can’t access, or outright malicious redirects.
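
You can get a first indication of sneaky mobile redirects by requesting the flagged URL with a desktop and a smartphone User-Agent and comparing where each request ends up. A minimal sketch; the URL is a placeholder, and redirects triggered by JavaScript won’t be detected without a real browser.

```python
# Request the same URL with desktop and mobile User-Agents and compare the
# final URLs after redirects. The URL is a placeholder; JavaScript-based
# redirects won't be caught without a real browser.
import requests

url = "https://www.example.com/flagged-page"
desktop_ua = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
mobile_ua = {"User-Agent": "Mozilla/5.0 (Linux; Android 12; Pixel 6) Mobile"}

desktop_final = requests.get(url, headers=desktop_ua, allow_redirects=True, timeout=10).url
mobile_final = requests.get(url, headers=mobile_ua, allow_redirects=True, timeout=10).url

print("Desktop lands on:", desktop_final)
print("Mobile lands on: ", mobile_final)
if desktop_final != mobile_final:
    print("Mobile users end up on a different URL: possible sneaky redirect")
```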

How to fix

If you’re unintentionally engaging in this activity:

  1. Go to the Security Issues report and check if your site has been hacked. 
  2. If it is not hacked, look for and remove any third-party scripts that are out of your control. Then check your site from a mobile device to make sure there are no more sneaky redirects.

If you’re intentionally engaging in this activity:

  1. Remove the redirects from the flagged pages.
  2. Visit these pages from a smartphone and confirm your fix.
  3. Select Request Review on the Manual Actions report.

News and Discover policy violations

Content policies for Google News and Discover forbid adult, harassing, hateful, misleading, and other types of content that violate the guidelines. Check all the violation types that might result in a manual action against your site.

How to fix

  1. Review and change your pages according to the policy by removing the forbidden content.
  2. After fixing the issues, choose Request Review.

Conclusion

Manual actions can be a big headache and can affect a website’s search visibility, traffic, and profit. That’s why you should regularly run checks in GSC and make sure that your site complies with the Webmaster Guidelines. If you think your site has lost search visibility, start by checking the Manual Actions report in Google Search Console and then take further steps.

Try not to take manual actions as the end of the world. Although they can cost you traffic and rankings, any website can recover. If you’ve experienced an unexpected drop in search, take it as an opportunity to make sweeping changes. Once you overcome the shock, there are many ways to restore your search rankings, visibility, and other metrics, or even make them better than before.

