When Negative SEO Happens
Should you be excited when you notice that your site suddenly has a bunch of new backlinks?
Definitely not if they are bad backlinks, piles of them pointing at your site. And you have no clue who placed them in the first place. It could be your competitors.
Those bad backlinks can harm your organic rankings in search engines. Should you be worried?
What You Could Do About Negative SEO
First of all, you should know that if the number of bad backlinks is overwhelmingly large compared to the number of good links, search engines may penalize your site and your organic rankings will drop.
It can take weeks or even months to recover from a negative SEO attack. That is both bad news and good news: recovery is possible, but it takes a fair amount of time and effort.
Fortunately, there are steps to prevent negative SEO attacks.
#1 Step, Regular Site Health Checkups
Use Google Search Console (formerly Google Webmaster Tools) to get email notifications whenever a foul-play issue comes up, including suspicious backlinks.
#2 Step, Monitor and Report Fake Reviews
This mostly applies to local businesses, although any website can implement it.
Review sites like Yelp, UrbanSpoon, TripAdvisor, and Amazon are authoritative sites in the eyes of Google. That means the quality of the link you get from them depends on how your business is weighed against other businesses in your industry.
Also, the higher your business ranks on those review sites, the higher your business website is likely to rank in search engines, because search engines like Google take your standing on those authoritative sites seriously.
Therefore, you have to monitor all reviews on those sites, and report the fake ones as soon as possible, before they can harm your business rankings.
This will help you not only protect yourself from negative SEO but also maintain a good brand image in general.
#3 Step, Disavow Bad Links
This final step makes sure that Google doesn’t count those bad backlinks when the algorithm is deciding where to rank your pages.
However, this can be a risky approach if used improperly. Follow Google's guide and still proceed with caution.
Basically, you create a text file that lists all the URLs or domains that should be ignored, submit it through Google's disavow tool, and Google will apply it the next time it crawls those pages.
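A disavow file is plain text, one entry per line: either a full URL or a `domain:` prefix to disavow an entire domain, with `#` lines treated as comments. A minimal sketch might look like this (the domains and URLs below are made-up examples, not real spam sources):

```
# Sites linked to us in bulk after the attack in March
# Owner did not respond to link-removal requests
domain:spammy-links-example.com
domain:bad-neighborhood-example.net

# Individual pages we want ignored, but not the whole domain
http://forum-example.org/thread/12345
http://blog-example.info/cheap-links-post
```

Save it as a UTF-8 or 7-bit ASCII text file and upload it through the disavow links tool in Google Search Console. Disavow whole domains only when you are confident nothing good links from them; otherwise list individual URLs.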