A few days ago I found out that someone is trying to tank the search rankings of one of my top keywords by building thousands of spam links to its designated page. Obviously, that douche has plenty of time to waste.
What is Negative or Reverse SEO?
Negative SEO is the practice of downgrading a site or page's search rankings by implementing tactics that violate Google's Webmaster Guidelines and trigger negative ranking factors/signals, so that the target gets penalized or hit by algorithmic updates.
Robin Good: At SMX Advanced this week, Matt Cutts mentioned that Google is considering offering a tool that would let webmasters disavow certain links.
Here's what he said as reported by WebProNews:
"The story of this year has been more transparency, but we’re also trying to be better about enforcing our quality guidelines.
People have asked questions about negative SEO for a long time.
Our guidelines used to say it’s nearly impossible to do that, but there have been cases where that’s happened, so we changed the wording on that part of our guidelines.
Some have suggested that Google could disavow links.
Even though we put in a lot of protection against negative SEO, there’s been so much talk about that that we’re talking about being able to enable that, maybe in a month or two or three."
...
Once Google launches this tool, assuming that it actually does, it will be very interesting to see how the rankings shake out.
It should be an indication of just how important links actually are these days.
As you may know, Google has sent out a ton of Webmaster Tools warnings this year, and such a tool would help users take quick “manual action” on links rather than spend a ton of time sending link removal requests to other sites."
"My gut reaction is that something bigger happened here than just a Panda data refresh, but I honestly can’t prove that."
From SEOMoz: "On June 5th of 2012, at around 9:00am Central Daylight Time, I spotted what appeared to be a major Google algorithm update in the wild.
Unfortunately, I was alone… and the photos all turned out blurry… ok, and I had had a few beers.
Still, that doesn’t mean it didn’t happen. This is the true story of an update that I honestly believe we missed, and why we’re just not as good at spotting them as we like to think."
Robin Good: If you have been submitting your blog, web site or RSS feed indiscriminately to tens of web directories, you may want to give a serious read to this report from SEOMoz.
According to the research, Google has started banning and penalizing many web directories, as many of these offer next to no value at all to final readers and were created mostly to serve as a promotional/visibility/linkback resource.
It looks like Google has decided to stop allowing these sites to pass "juice" to its listed member sites by penalizing them and in the worst cases (tens of them) to completely ban the directory from the SERPs.
From the SEOMoz original article: "...Google deindexed several directories a few weeks ago.
This event left us wondering if there was a rhyme to their reason. So we decided to do some intensive data collection of our own and try to figure out what was really going on.
We gathered a total of 2,678 directories from lists like Val Web Design, SEOTIPSY.com, SEOmoz's own directory list (just the web directories were used), and a few others, and the search for clues began.
Out of the 2,678 directories, only 94 were banned – not too shabby.
However, there were 417 additional directories that had avoided being banned, but had been penalized."
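For a sense of scale, those counts translate into rough percentages. A quick back-of-the-envelope calculation using the figures quoted above:

```python
total = 2678      # directories examined
banned = 94       # deindexed entirely
penalized = 417   # penalized but not banned

print(f"banned:    {banned / total:.1%}")                        # about 3.5%
print(f"penalized: {penalized / total:.1%}")                     # about 15.6%
print(f"unscathed: {(total - banned - penalized) / total:.1%}")  # about 80.9%
```

In other words, only a small minority of the directories surveyed were banned outright, but roughly one in six took some form of hit.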
From the original article: "The good news, whether you were hit by Penguin the first time or this time, is that you can recover.
We’ve now seen that this can happen, and since we know that Google will continue to push data refreshes for Penguin, there should be plenty of chances to do so.
Just think about all the Panda refreshes we’ve seen since February 2011.
We recently reported on WPMU, a seemingly quality site with plenty of fans on social media channels, which got hit by the first Penguin update. The site has now made a full recovery."
Josh Bachynski reports in this video that the biggest surprise for most people will be the fact that Penguin has NOTHING to do with your backlinks, as it only targets on-page factors.
Key take-aways from the video:
1) You need to fix on-page issues as the top priority
2) Penguin-based negative SEO is not possible
3) No need to delete links - Google is already taking care of that by devaluing them
4) Add quality links to your key content in ways that make them look "natural" to Google (30% exact match query, 30% partial match, 30% url-based, 10% generic/other stuff)
5) Do not overoptimize - Google knows what your page is about - don't overdo it with keywords. Check with Google Webmaster Tools and see what Google thinks your page is about.
6) Try always to look and be as "natural" as you can be.
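The 30/30/30/10 anchor-text mix suggested in point 4 can be audited with a quick script. This is a rough sketch: the simple substring-matching rules and all names below are my own illustration (the video suggests the mix, not any particular tool), so treat the categories as approximations.

```python
from collections import Counter

def classify_anchor(anchor, keyword, url):
    """Bucket a backlink's anchor text into one of the four categories."""
    anchor_l, keyword_l = anchor.lower(), keyword.lower()
    if anchor_l == keyword_l:
        return "exact"       # exact-match query anchor
    if keyword_l in anchor_l or any(w in anchor_l for w in keyword_l.split()):
        return "partial"     # partial-match anchor
    if url.lower() in anchor_l:
        return "url"         # URL-based anchor
    return "generic"         # "click here", brand names, etc.

def anchor_mix(anchors, keyword, url):
    """Return the fraction of anchors falling into each category."""
    counts = Counter(classify_anchor(a, keyword, url) for a in anchors)
    return {cat: counts.get(cat, 0) / len(anchors)
            for cat in ("exact", "partial", "url", "generic")}

mix = anchor_mix(["blue widgets", "best blue widget deals",
                  "example.com", "click here"],
                 keyword="blue widgets", url="example.com")
print(mix)  # each category is 0.25 for this toy sample
```

Comparing the resulting fractions against the suggested 0.3/0.3/0.3/0.1 targets shows at a glance whether a link profile is skewed toward exact-match anchors.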
Robin Good: Here is an excellent set of web traffic graphs showcasing the impact on traffic of 16 different Google penalties, updates and filters.
From the ScreamingFrog website: "...you can argue the virtue and accuracy of any data set, but internally we have found this measure to be extremely useful as a guide for competitive landscape analysis and trends."
From the official press release: "This special Panda and Penguin report fully examines this issue with examples, videos, links and helpful information on the topic. A whole range of sources were used in gathering data for the report, including some YouTube videos released by Matt Cutts, a Google spokesperson.
The report discusses many of the triggers or problems which could result in one's site being penalized by Panda or Penguin. The report also offers many different ways to recover from these updates, if one's site was affected.
In addition, this handy report looks at the broader issue of marketing on the web in the wake of Panda and Penguin, and at how webmasters and marketers must now adjust both SEO and marketing tactics if they want to prosper on the new web, especially if they have been penalized by these updates. For those affected by the Penguin and/or Panda update, this report may prove very beneficial and helpful."
Robin Good: If you have ever been wondering whether Google ever makes algo decisions based on its own economic benefits, what Google Fellow Amit Singhal said at the recent opening keynote at SMX London should put your doubts to rest.
"Singhal was adamant: “no revenue measurement is included in our evaluation of a rankings change.”
Listening to him explain how excited he gets about search improvements and how changes are evaluated, you realize there’s no spin here. He’s absolutely telling the truth."
From WebProNews: "...“We’ve been wanting to work on this for a long time, but our data scientist was previously tied up on other items (and we’ve just hired a research assistant for the project),” Fishkin tells us.
“The original catalyst was the vast quantity of emails and questions we get about whether a page/site is ‘safe’ to acquire links from, or whether certain offers (you know the kind – ‘$100 for 50 permanent text links guaranteed to boost your Google rankings!’) were worthwhile.”
“Tragically, there’s a lot of money flowing from people who can barely afford it, but don’t know better to spammers who know that what they’re building could hurt their customers, and Google refuses to take action to show which spam they know about,” he continues. “Our eventual goal is to build a metric marketers and site owners can use to get a rough sense of a site’s potential spamminess in comparison to others.”
“A score (or scores) of some kind would (eventually, assuming the project goes well) be included in Mozscape/OSE showing the spamminess of inlinks/outlinks,” he explained in the Google+ announcement."
Robin Good: Basic guide to the tools you can use to identify all of the backlinks and the keywords used by other sites to link to your web site, and to the few simple steps you need to take.
"Until Google's Penguin update, all SEOs were busy building links. Now they are busy removing those spam links; isn't that funny? At least, it is not funny for those who became victims of that update."
From the WebProNews article: "Another thing on the quality guidelines list is: “Don’t create multiple pages, subdomains, or domains with substantially duplicate content.”
Of course, like the rest of the guidelines, this is nothing new, but in light of the Penguin update, it seems worth examining the guidelines again, if for no other reason than to provide reminders or educate those who are unfamiliar.
Duplicate content seems like one of those that could get sites into trouble, even when they aren’t intentionally trying to spam Google.
Even Google says in its help center article on the topic, “Mostly, this is not deceptive in origin.”
“However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic,” Google says. “Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.”
Google lists the following as steps you can take to address any duplicate content issues you may have:"
From the article: "Today the Google Analytics team announced that we will start seeing backlink URLs in their newly released Social Reports. According to the announcement post, written by Ilya Grigorik, Software Engineering Manager, Google Analytics (and PostRank Founder):
“These reports provide another layer of social insight showing which of your content attracts links, and enables you to keep track of conversations across other sites that link to your content.
Most website and blog owners had no easy mechanism to do this in the past, but we see it as another important feature for holistic social media reports.
When you know what your most linked content is, it is then also much easier to replicate the success and ensure that you are building relationships with those users who actively link to you the most.”
A new infographic detailing the differences between the Google Panda and Penguin updates and how you can optimise your website to avoid being penalized by either of them.
Robin Good: If you are ready for some special kind of good news, here is a Panda recovery story worth reading. It took a lot of effort, perseverance and patience, but in the end full recovery was achieved.
From the original article: "In the fall of 2011, I was contacted by the Director of Marketing for a B2B company.
The company’s website had been hammered by Panda, and he didn’t know what to do. I could tell very quickly that his team was truly baffled.
The company and website have been around for a long time, the site contains a boatload of ultra-high quality content, and used to rank for thousands of keywords.
The Director of Marketing made sure to point me to their top articles, whitepapers, blog posts, etc. after our initial conversation. I can tell you that he was right; they had a ton of great content.
In addition, the site’s link profile was not only clean, but it was ridiculously impressive.
They had earned tens of thousands of links, many from relevant and powerful sites in their industry.
Needless to say, I was fascinated by this story, and I was eager to begin assisting them.
Although the company will remain anonymous, I received approval to write this post covering the details and key learnings.
Everyone involved agreed that there are some great points here for others hit by Panda, so they were cool with me covering what happened."
Search Engine Land reports from SMX Advanced in Seattle: "Is it a penalty? Or is it just a change to Google’s algorithm?
That’s been one of the hot topics in search marketing in recent months thanks to the Panda and Penguin updates, and it was one of the topics of discussion tonight at our SMX Advanced conference in Seattle.
During the annual “You & A with Matt Cutts” keynote session, Google’s web spam chief told Search Engine Land Editor-In-Chief Danny Sullivan that Google’s definition of a “penalty” is when manual action is taken against a site — and that Google doesn’t use the term “penalty” as much as they say “manual action.”
Cutts went on to say that neither Panda nor Penguin are penalties; they’re both algorithm updates.
He also mentioned — and this will be good news to many search marketers — that Google is considering offering a tool that allows webmasters to disavow certain links, but that may be months away if it happens.
...
Other topics included why some spam reports aren’t acted on, whether Google+ and +1 votes are a strong SEO signal right now and much more."
Read the full report blog with all of the Q&A here:
From WebProNews: "This week, Google posted a new Webmaster Help video featuring Matt Cutts talking about a potential duplicate content issue. This time, he even broke out the whiteboard to illustrate his points.
Specifically, Cutts addressed the user-submitted question:
"Many sites have a press release section, or a news section that re-posts relevant articles. Since it’s all duplicate content, would they be better off removing these sections even with plenty of other unique content?
“The answer is probably yes, but let me give you a little bit of color about the reasoning for that,” Cutts says in the video.
“So a lot of the times at Google, we’re thinking about a continuum of content, and the quality of that content, and what defines the value add for a user.
So let’s draw a little bit of an axis here and think a little bit about what’s the difference between high quality guys versus low quality guys...”"
A newly granted patent describes a machine learning process that identifies features from known data sources to compare against unknown sources from very large data sets.
From the original article: "Google was granted a patent today that could be used to collect a seed set of data about features associated with different types of mushrooms, to “determine whether a specimen is poisonous based on predetermined features of the specimen.”
The patent also describes how that process could be used to help filter email spam based upon the features found within the email, or to determine whether images on a page are advertisements, or to determine categories of pages on the Web on the basis of textual features within those pages."
"This patent presents a way of examining features on a seed set of known pages, and developing comparisons of those features with features found on an unknown set to determine a classification of those pages based upon the examined features.
It also allows for the introduction of new features to be used while the classification process is ongoing."
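The seed-set idea the patent describes (learn feature statistics from a small labeled set, then score unknown items against them) can be sketched with a tiny Naive Bayes classifier. The data, the feature sets, and the choice of Naive Bayes here are all illustrative assumptions; the patent does not prescribe this particular model.

```python
from collections import Counter
import math

def train(seed):
    """seed: list of (features, label) pairs, where features is a set of strings."""
    class_counts = Counter(label for _, label in seed)
    feat_counts = {label: Counter() for label in class_counts}
    for feats, label in seed:
        feat_counts[label].update(feats)
    return class_counts, feat_counts

def classify(item_feats, model, vocab_size=100):
    """Score an unknown item against each class; return the best label."""
    class_counts, feat_counts = model
    total = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for label, count in class_counts.items():
        score = math.log(count / total)  # class prior
        denom = sum(feat_counts[label].values()) + vocab_size
        for f in item_feats:
            # add-one smoothing so unseen features don't zero out a class
            score += math.log((feat_counts[label][f] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

# Invented toy seed set: labeled examples with known features.
seed = [
    ({"buy", "cheap", "pills"}, "spam"),
    ({"viagra", "cheap"}, "spam"),
    ({"recipe", "flour", "bake"}, "ham"),
    ({"bake", "oven"}, "ham"),
]
model = train(seed)
print(classify({"cheap", "pills"}, model))  # -> spam
```

The same shape of process, with different features, covers the patent's other examples: email words for spam filtering, visual features for ad detection, or textual features for page categorization.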
Barry Schwartz reports on SearchEngineRoundtable: "About a month ago, we polled our readers asking how they were impacted by the Google Penguin update.
We have well over 1,000 responses and I wanted to share them with you. Keep in mind, those who were negatively impacted are probably more likely to take the poll.
That being said, 65% said they were negatively impacted by Penguin, while only 13% said they were positively impacted.
This is way more than the Panda update, where only 40% said they were negatively impacted."
From Search Engine Land: "No matter that it’s late Friday night on the start of a three-day holiday weekend in the U.S., Google has just pushed out the first update to its recent webspam-fighting Penguin algorithm. Let’s call it Penguin 1.1.
Google’s Matt Cutts announced the news a short time ago on Twitter, calling it a “data refresh” that impacts less than one-tenth of a percent of English-language searches.
Minor weather report: We pushed 1st Penguin algo data refresh an hour ago.
Affects <0.1% of English searches. Context: goo.gl/4f7Pq — Matt Cutts (@mattcutts) May 26, 2012"
From the original article by Barry Schwartz on Search Engine Land: "Duane Forrester, Senior Product Manager at Microsoft Bing, wrote a blog post on the Bing Search blog named Penguins & Pandas Poetry.
The post is about Google’s latest Penguin update and how SEOs and webmasters need to be better prepared for such updates.
The basic advice is simple: SEOs and webmasters need to do one thing – diversify.
Duane is not just saying "focus on Bing and make sure your site does well there too."
Yes, if you lose all your Google traffic, ranking well in Bing is nice, but since Bing only has about 30% market share, you are still missing out on a lot of traffic.
Robin Good: If you have been wondering whether you have been hit by one of Google's recent algorithm updates (Google Panda and Google Penguin), this in-depth article by Glenn Gabe provides lots of valuable insight and specific advice on how to verify whether your site has been hit by an algo update and specifically by which one.
"Based on how Google rolled out Penguin and Panda recently, I’m finding it’s common for webmasters to be confused about which algorithm update hit their websites.
Penguin 1.0 and the latest Panda updates were so close that it’s easy to believe you were hit by one, when in fact, it could have been the other.
Use the techniques I listed in this post to help you determine which update really hit your site..."
"It’s been about two weeks since Google launched its Penguin Update. Google’s happy the new spam-fighting algorithm is improving things as intended."
From the original article by Danny Sullivan at Search Engine Land some key points I extracted:
"...Penguin, like Panda, is a filter that gets refreshed from time-to-time. Penguin is not constantly running but rather is used to tag things as spam above-and-beyond Google’s regular spam filtering on a periodic basis.
...To further confuse matters, some who lost traffic because of Penguin might not be victims of a penalty at all. Rather, Google may have stopped allowing some links to pass credit, if they were deemed to be part of some attempt to just manipulate rankings. If sites were heavily dependent on these artificial links, they’d see a drop just because the link credit was pulled, not because they were hit with a penalty.
...if you know that you were hit by Penguin (because your traffic dropped on April 24):
-> Clean up on-page spam you know you’ve done
-> Clean up bad links you know you’ve been involved with, as best you can
-> Wait for news of a future Penguin Update and see if you recover after it happens
-> If it doesn’t, try further cleaning or consider starting over with a fresh site
-> If you really believe you were a false positive, file a report as explained here
From the original article on WebProNews: "Google has already launched another Panda update. By already, I mean since the Penguin update.
After the Penguin update was announced, and Searchmetrics put out its lists of winners and losers, Google revealed that there had actually been a Panda update a few days prior, and that this was strongly influencing those lists.
The update reportedly hit on Friday, April 27. With all the Penguin chaos out there, one has to wonder how much the Panda update has skewed webmaster analysis.
Barry Schwartz over at Search Engine Land reports that he has confirmed as much with Google, sharing the following statement from the company:
"We’re continuing to iterate on our Panda algorithm as part of our commitment to returning high-quality sites to Google users.
This most recent update is one of the over 500 changes we make to our ranking algorithms each year."