Staying at the top of the ranking ladder is tough. Your competitors will always try to do things better than you and take you down a few steps. Or at least that’s the case if they’re playing by the rules. If your rivals are desperate and believe in the “all is fair in love and war” rule, they may try to throw you off the ranking ladder altogether by resorting to negative SEO methods.
Negative SEO refers to manipulative techniques aimed at lowering a website’s rankings. It comes in all shapes and sizes, but the goal is always the same – to make Google believe that you are the one violating the rules and disregarding Google’s guidelines. The question here is whether the search giant is smart enough to recognize such attempts early on and protect you from unethical rivals.
Let’s take a look at the most popular negative SEO practices and check how Google algorithms deal with such issues. Moreover, I’ll share some tips on how to react to every type of attack and which actions to take to stay safe.
Dealing with spammy links pointing to your website
Whenever the topic of negative SEO comes up, the discussion often revolves around spammy backlinks. This is when a competitor builds links from so-called link farms or PBNs to your website hoping that Penguin will take you down. So, does it really work?
Many SEO specialists believe it does. You just need to build thousands of spammy backlinks and use the “right” anchor. Since Google penalizes websites for overusing anchor text that is an exact match to the keyword they want to rank for, competitors can deliberately build thousands of such links to your website.
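To get a sense of what a suspicious anchor profile looks like, you can sanity-check your own anchor distribution after exporting it from a backlink tool. Here’s a minimal Python sketch – the anchor list and keyword are made up for illustration, and any alarm threshold you pick is a judgment call, not a Google-documented number:

```python
from collections import Counter

def anchor_ratio(backlinks, target_keyword):
    """Return the share of backlinks whose anchor text exactly
    matches the target keyword (case-insensitive)."""
    if not backlinks:
        return 0.0
    anchors = [a.strip().lower() for a in backlinks]
    exact = sum(1 for a in anchors if a == target_keyword.lower())
    return exact / len(anchors)

# Hypothetical anchors exported from a backlink checker
anchors = ["click here", "best coffee beans", "best coffee beans",
           "best coffee beans", "example.com", "homepage"]
ratio = anchor_ratio(anchors, "best coffee beans")
print(f"exact-match anchor share: {ratio:.0%}")  # 50%
```

If the exact-match share suddenly spikes without you building anything, that’s a signal worth investigating.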
Bombarding a website with adult backlinks is another popular approach. In the past, such attacks were actually pretty successful. In 2014, a podcast site about WordPress, WP Bacon, was attacked by thousands of links with the “porn movie” anchor text. The outcome was devastating: in just 10 days the website lost its top ranking spot, ultimately falling 50+ spots for most target keywords.
Later on, WP Bacon reported that it had managed to get out of Google’s disgrace by disavowing every bad link. At this point, many articles would start telling you in detail how to spot an influx of bad links and how to disavow them. I’ll just say that any backlink checking tool can help you discover bad links: you can get a paid solution or simply use GSC, which is readily available but may lack some data. You can then use Google’s Disavow tool to tell Google which links it should disregard.
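For reference, a disavow file is just a plain-text list of URLs and domains, one per line, with `#` for comments – for example:

```
# Spammy links spotted in March 2020
http://spam.example.com/bad-page.html
domain:linkfarm.example.net
```

The `domain:` prefix disavows every link coming from that domain at once, which is usually what you want for link farm sites.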
But what about WP Bacon, you may ask. According to the information from the Web Archive, at the end of July 2014, the project was closed.
It is not clear whether negative SEO is to blame here, as the website’s rankings clearly started recovering after the links were disavowed. But that’s not the question that matters.
What we need to understand is whether disavowing was effective back in 2014 and whether it is still necessary to manually disavow links in 2020. To answer these questions, let’s first see how Google algorithms have evolved over time.
What Google says about spammy backlink attacks
Google’s stance on spammy backlink attacks has gone through an evolution. Prior to 2003, Google stated there was “nothing a competitor can do to harm your ranking”. After that, Google representatives kept repeating they were working hard to make Google algorithms resistant to negative SEO attacks. The bottom line is that succeeding with such attempts is difficult, but still not impossible.
In 2014 (around the time WP Bacon was attacked) John Mueller of Google said:
“It’s a tricky situation and not something where I’d say that we can guarantee that we always get it 100% right. But, from the cases I’ve looked at, I think we’ve done a pretty good job.”
So, back in 2014, though reluctantly, Google admitted that its algorithms were not yet perfect at intercepting spammy link attacks.
Five years down the road, Gary Illyes seems to be much more confident in Google’s capabilities to protect websites from negative SEO. When speaking at Pubcon Florida in March 2019, he claimed that out of the hundreds of negative SEO cases he looked at, none were actually negative SEO. So, let’s see what the grounds are for his confidence.
In the last five years, Penguin has become real-time, which means Google is now constantly hunting for bad links. Besides, Google is now pretty good at singling out bad links as it distinguishes between different link communities. Spammy websites normally link out to each other and to “good” websites, but “good” websites don’t link out to spammy ones. This way, bad websites get isolated, and, as a result, Google disregards links coming from these sites.
Finally, Google is also pretty good at discriminating between the bad backlinks you’ve built on your own and those built by competitors. To put it simply, if a website has a long history of manipulative link building, Google will notice it, and if a website gets bombarded with bad links, Google will blame the website and not its competitors.
So, let’s finally answer the question I raised earlier. Is there a need to disavow backlinks in 2020? Is it still possible to successfully hit a website with spammy link attacks?
To disavow or not to disavow
Let’s start with Google’s official position on the matter. Currently, the search giant discourages the active use of the Disavow tool. According to Google, there’s no need to disavow every time you see a bad link pointing to your website. The tool should only be used if you know for sure the links are spammy because you built them yourself – Google is supposed to catch all the rest.
At the same time, SEO forums are still full of threads where people claim they lost their rankings due to a spammy link attack and seek advice. In some cases, fellow forumers suggest keeping calm and relying on Google algorithms, plus checking whether the website is sending the wrong signals to Google for some other reason. Others still advise disavowing bad links.
So, to disavow or not to disavow? Well, even though Google claims disavowing is obsolete in most cases, it won’t harm your website in any way – all you lose is some of your time. If you believe your past linking patterns could backfire on you, do disavow. If there’s no way the search giant can associate you with manipulative link building practices, but a few backlinks from adult sites are keeping you up at night, you might as well disavow them. Sometimes taking any action just feels good. And at least you’ll be able to report to your boss or your customer that the website’s backlink profile is now adult-link-free 🙂
Guard your most precious links
In addition to spammy backlink building, there’s one more link-related practice you should be aware of. It’s not a massive thing and it won’t hit you that badly, but I still want you to be forewarned. So, what I’m talking about is the practice of robbing you of your best backlinks.
Your competitors may get in touch with the owner of the website where you have a backlink, pretend to be you, and ask them to remove the link because there’s no need for it anymore or it no longer fits your SEO strategy. Another approach is to ask the donor site to replace the backlink to your website with one to the competitor’s own site, offering the donor something of value in return. In most cases, website owners will ignore such requests, but sometimes your competitors may get lucky.
Anyway, wouldn’t you love to know if you’ve lost one of your most precious links? SE Ranking Backlink Monitoring tool can help you with this one – just upload a list of your best backlinks to the tool and it will keep an eye on them for you. Check the status of the links regularly to make sure they were not removed, or configure the platform to get weekly reports. And if you see that one of your links has disappeared, contact the website owner ASAP.
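If you’d rather roll your own check, the core of such monitoring is simple: fetch each donor page and verify your link is still in it. Here’s a minimal Python sketch using only the standard library (in practice you’d download the page with an HTTP client; the HTML string here is made up for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href attribute found in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def backlink_present(html, my_domain):
    """Return True if any link on the page points to my_domain."""
    parser = LinkCollector()
    parser.feed(html)
    return any(my_domain in link for link in parser.links)

page = '<p>Read the full guide at <a href="https://mysite.example/guide">mysite</a>.</p>'
print(backlink_present(page, "mysite.example"))  # True
```

Run it against each donor page on a schedule and alert yourself whenever the check flips to False.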
Stolen content may bring you down
Building spammy links is the most popular negative SEO tactic, but there are other ways for competitors to hurt your website. You may see your rankings drop because someone has stolen your content. This technique is rather hard to carry out, and here’s why.
If someone simply copies your content pretending they are the creator, it won’t fool Google and won’t impact your rankings. Google “remembers” which site published a piece of content first and knows whether your website offers original or copied content. Pages with stolen content will never rank high for target keywords and may not even get indexed at all.
To hurt your rankings, competitors have to steal your newly created texts before they are indexed, republish them, and make Google index their version first (for this to happen, their website should be more authoritative than yours). While rather hard to execute, such attacks happen every now and then, especially in highly competitive and somewhat shady industries like gambling or essay writing. So, what happens is you publish a new piece of content and then find your competitor ranking for the content you’ve created.
Going after content thieves
Now, let’s see what should be done to mitigate the consequences of such attacks. First of all, you’ll have to figure out that your content was copied. If you run a small website, you’ll probably notice on your own that the content you’ve published isn’t getting the attention you expected. In this case, you can manually check whether the content was copied using Copyscape.
For larger websites, it may be more efficient to use a monitoring tool like Awario to set up alerts for exact matches of your content pieces published over the Internet.
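Alongside tools like Copyscape or Awario, you can run a quick programmatic check of your own with Python’s standard library. This sketch compares your original text against a suspect copy; the 0.8 threshold is purely illustrative, and the texts are made up:

```python
import difflib

def similarity(original, suspect):
    """Rough similarity ratio between two texts, from 0.0 to 1.0."""
    return difflib.SequenceMatcher(None, original, suspect).ratio()

original = "Negative SEO refers to manipulative techniques aimed at lowering a website's rankings."
suspect = "Negative SEO refers to manipulative tricks aimed at lowering a site's rankings."

if similarity(original, suspect) > 0.8:  # threshold is illustrative
    print("possible stolen copy - review manually")
```

A ratio near 1.0 means a near-verbatim copy; lightly paraphrased scrapes score lower, so treat this as a first-pass filter, not a verdict.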
If you’ve discovered your content’s been stolen and it ranks higher than your own page, here’s how you can try to handle the issue.
First, try to contact the website owner and ask them to remove the stolen copy from their website. If there’s no way to get in touch with the website owner or negotiations don’t produce any results, you can file a DMCA takedown notice to make Google remove the stolen content from the SERP. It is worth noting, though, that DMCA enforcement is not an easy process and often involves copyright litigation. Besides, if the website that has stolen your content is outside US jurisdiction, a DMCA notice will be of little use.
This is why prevention is better than cure, and I strongly recommend taking measures that will protect your website from scraping – this is how competitors get their hands on your content. You have a lot of options here. You can start by tweaking your website code to disable RSS and limiting bot access to the XML sitemap – just don’t reference your XML sitemap in the robots.txt file and instead submit it directly to Google via Search Console. You can try out Google’s own anti-scraping tool called reCAPTCHA Enterprise (still in beta). Finally, you can invest in advanced paid solutions like Cloudflare Bot Management. If your website runs on WordPress, you can also get one of the security plugins like Wordfence or Blackhole for Bad Bots.
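As an illustration, a robots.txt along these lines closes the feed endpoints without advertising your sitemap (the paths are assumptions for a typical WordPress install, and keep in mind robots.txt is advisory only – malicious bots simply ignore it, which is why the server-side tools above still matter):

```
# No Sitemap: line here on purpose - submit it via Search Console instead
User-agent: *
Disallow: /feed/
Disallow: /comments/feed/
```

Well-behaved crawlers will skip the feeds; for everything else you need bot management on the server side.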
Sometimes stolen content is just stolen content
Once you protect your website from scraping, you should not really worry about someone stealing your content. But before we move on to the next negative SEO tactic, there’s one more thing I want to clarify. Stolen content does not necessarily mean a negative SEO attack – black hat SEO practitioners may scrape your content, modify it a bit, add loads of spammy links, and publish it on hacked or link farm websites simply because they need some content to publish and won’t create it on their own.
Copyscape will flag such websites and users often feel concerned about such duplicate pages as they may rank for some random snippets of text. So, if you copy one sentence from your text and Google it in quotation marks, you’ll see the website that has stolen your content in SERP right below your own website. While unpleasant, such cases should not really worry you.
Google treats text snippets as gibberish and ranks them differently than “normal” keywords. Overall, this kind of copied content won’t really harm your website. These pages will never outrank you, and should not really rank for the target keywords at all.
What you can do is notify the site owner if the website was hacked. You can also report the page full of spammy links to Google so that it gets deindexed.
Keeping malicious bots at bay
Now, let’s get back to malicious-bot-related dangers. I’ve already mentioned that blocking such bots from accessing your website can save you from content scraping. This preventive measure can also protect you from such negative SEO practices as click fraud and heavy crawling. Let’s take a look at each technique more closely.
Click fraud means competitors use bots to spoil your CTR. Bots pick your page in the SERP and quickly leave it to check other search results. That way, Google will think users don’t find your page useful, and you’ll lose rankings.
The technique is rather controversial as, firstly, Google does not admit that CTR impacts rankings and, secondly, it claims to be quite good at detecting botnet traffic. Nevertheless, real-life experiments suggest the technique may actually work.
Heavy crawling is the practice of aggressively crawling your website with numerous bots to increase the server load. This results in slower loading speed, and in the worst-case scenario it may even lead to a website crash. In the latter case, Google bots won’t be able to crawl your website, your crawl budget will be wasted, and if such episodes persist, you may even fall out of Google’s index.
So, let me once again stress the necessity of protecting your website from malicious bots. The paid WordPress plugins and Cloudflare protection I recommended for scraping prevention will work here as well. Generally, good bot protection solutions guard against most kinds of bad bots, including advanced ones that mimic human behavior.
Don’t rely solely on Google’s algorithms that are supposed to stop automated traffic, as those involved in black hat SEO keep looking for new ways of outsmarting Google. Make sure to add an extra layer of protection to your website to feel safe.
Making your website hack-proof
Now comes our last argument on why website security should be your first priority. Malicious bots can do your website lots of harm, but the most devastating issues may arise if you get hacked. As long as your competitors have no access to your website, their hands are tied. But once they find a loophole in your website protection, they can abuse it in so many ways. Let’s take a look at the most popular practices of the kind.
Putting your website on Panda’s radar or out of Google’s sight
Once your competitors gain full control over your website, the typical scenarios are the following. The blunter way to hurt you is to alter your website content to make it look spammy – that way you can get hit by the Panda algorithm. This approach works well against larger websites, as it’s harder for the website owner to notice that some pages have turned into unreadable trash with lots of spammy links. Smaller websites have better chances of spotting and fixing changes before it’s too late.
Still, small website owners should not feel relieved. Hackers may be sneakier and hide some bad links within the code of your website so that you won’t be able to easily spot them. Meanwhile, Google will definitely notice those links.
Another trick your competitors may use is altering your robots.txt file in a way that forbids Google from crawling your website, getting you dropped from the index. You’ll definitely notice this kind of attack but, unfortunately, won’t be able to regain your rankings overnight.
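Because a tampered robots.txt can silently deindex you, it’s worth checking that the file hasn’t changed. A minimal sketch: hash the known-good content and compare on a schedule (in practice you’d fetch your live robots.txt with an HTTP client from a cron job; the file content below is an assumption for illustration):

```python
import hashlib

def fingerprint(robots_txt):
    """SHA-256 hex digest of the robots.txt content."""
    return hashlib.sha256(robots_txt.encode("utf-8")).hexdigest()

# Hash of the robots.txt you know to be correct
KNOWN_GOOD = fingerprint("User-agent: *\nAllow: /\n")

def robots_changed(current_content):
    """True if the live robots.txt no longer matches the known-good copy."""
    return fingerprint(current_content) != KNOWN_GOOD

print(robots_changed("User-agent: *\nAllow: /\n"))     # False
print(robots_changed("User-agent: *\nDisallow: /\n"))  # True
```

If the check fires, restore the file and revoke any compromised credentials before worrying about rankings.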
I bet right now you would like to hear about the different ways you can protect your website from hacking. But let me first describe a few scenarios where a hacked website is not a sign of a negative SEO attack. Remember how black-hat SEO practitioners may steal your content for the sake of stealing? Well, they may just as well hack your website and use it to boost their own sites. No personal offense intended.
Falling victim to black-hat SEO hackers
Here, once again, there are a few scenarios. In the most common one, hackers turn your website into a low-quality backlink donor, filling your pages with dozens of spammy links to the hackers’ projects.
So, if you see that your website got new pages on topics unrelated to your business niche and those pages feature low-quality/stolen/machine-generated content, know that your website has been hacked. Change your login credentials immediately and get rid of spammy pages.
If you have a really authoritative website, black-hatters may use it to pass some link juice to their own website. Here’s how it works.
- First, just like with the previous scenario, they would hack your website and create a page or lots of pages related to their niche – content quality may be better in this case.
- Next, they would build links to the page(s) – often spammy ones – to make the pages rank for their target keywords.
- Finally, they would redirect the page(s) they’ve created to their own website.
Marie Haynes of Moz describes here how one of her websites was once hacked and, as a result, link juice was sent to a Michael Kors affiliate website.
So, even though black-hatters may hack your website with no intent of dropping your rankings, their actions may still get your website flagged by Google algorithms, which will drag you down. In order to prevent this from happening, you have to build unbeatable protection against hackers on your website.
The first thing you should remember is to always keep your CMS up to date – older versions may have vulnerabilities that hackers can abuse. The strong password rule also applies here. Enabling 2-step verification with a Google Authenticator plugin is also a good idea. Finally, on top of this, you can use security plugins to create a truly impenetrable security wall.
Keeping an eye on your GMB listing
This point is probably of interest to local businesses trying to rank in local search, so if this is not your case, go ahead and skip this part.
Alright, if you run a local business like a café or a flower shop, you should have created a Google My Business profile to rank in local search. GMB signals are currently the most important factor impacting your rankings in local pack/finder, so you want your business profile to be filled with accurate information and feature positive customer reviews. Your competitors, on the other hand, may be eager to tweak your GMB data and add fake negative reviews. It can impact both your business reputation and local rankings.
In the past, there were cases when competitors altered a business’s name, phone number, and address or suggested their own version of its working hours. Once Google allowed new businesses to indicate their future opening date, there was a lot of abuse, with competitors setting a future opening date for long-established businesses.
Google is constantly trying to make it hard for wrongdoers to hurt local businesses. The editing procedure has been updated and now you’ll have to submit some photo evidence to have your GMB edit suggestion verified.
At the same time, your competitors can still ruin your business reputation and hinder your rankings by posting dozens of fake negative reviews. Naturally, Google has clear guidelines for GMB reviews where fake and defamatory content is strictly forbidden. So, what you can do is watch out for fake reviews and, once spotted, flag them for Google. Ideally, several people should report a fake review so that it gets removed faster. In the meantime, a good practice is to respond to such reviews to show customers who may bump into them that the accusations are far from the truth.
Overall, my recommendation to all GMB users is to check your inbox for email alerts from GMB daily and to regularly check your profile itself – sometimes you just won’t get email notifications.
The very thought of losing everything you’ve achieved over years of hard work because of one nasty negative SEO attack is terrifying. Sadly, there’ll always be easy gain seekers trying to drag you down by using some dirty means. The good news is that Google is constantly improving its algorithms to make them resistant to all kinds of manipulations. Besides, there’s a lot you can do yourself to not let dodgy competitors set you back – having a hack-proof website with strong protection against bad bots really makes a difference.
Chances are you’ll never have to deal with negative SEO attacks – that’s what we sincerely wish for all of our readers. If your competitors ever tried to come at you with negative SEO practices, do share your experience in the comments section – we’d love to know which negative SEO tactics you encountered and how you dealt with them.