Robin Good's insight:
If you are trying to verify whether your site has been penalized by one of Google's anti-spam algorithms, the best way to proceed is to look at the traffic reports in your analytics tool and check whether, at any point over the last year, your site had a sudden drop in visitors from one day to the next.
If you do see such a sudden, sharp drop in your traffic stats, then it is time to check the Google Algorithm Change History page by SEOmoz, which chronologically lists all of the Google algorithm updates, week by week, since the year 2000. If your sudden traffic drop corresponds to the date of a Google algorithm change, then you know the likely cause.
Very useful. 9/10
Check it out here: http://www.seomoz.org/google-algorithm-change
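The correlation check described above can be sketched in a few lines of code. This is a minimal illustration: the visit counts, the 50% drop threshold, and the 2-day matching window are all made-up assumptions; only the Penguin rollout date (April 24, 2012) is a real update date.

```python
from datetime import date

# Hypothetical daily visit counts, as exported from an analytics tool
daily_visits = {
    date(2012, 4, 22): 5100,
    date(2012, 4, 23): 5050,
    date(2012, 4, 24): 1900,  # sudden drop
    date(2012, 4, 25): 1850,
}

# Known algorithm update dates (Penguin rolled out April 24, 2012)
algo_updates = {date(2012, 4, 24): "Penguin 1.0"}

def find_drops(visits, threshold=0.5):
    """Return dates where traffic fell by more than `threshold` vs. the previous day."""
    days = sorted(visits)
    drops = []
    for prev, cur in zip(days, days[1:]):
        if visits[cur] < visits[prev] * (1 - threshold):
            drops.append(cur)
    return drops

def match_updates(drops, updates, tolerance_days=2):
    """Pair each traffic drop with any algorithm update within `tolerance_days`."""
    matches = []
    for d in drops:
        for u, name in updates.items():
            if abs((d - u).days) <= tolerance_days:
                matches.append((d, name))
    return matches

drops = find_drops(daily_visits)
print(match_updates(drops, algo_updates))
```

A match between a drop date and an update date is circumstantial evidence, not proof, but it tells you which update's documentation to read first.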
Robin Good's insight:
Excellent guide by Kent Lewis about what Google Panda and Penguin are, what their symptoms are, and what you can do to recover from these kinds of Google penalties.
Informative. Useful. 8/10
Robin Good: PlagSpotter is a free web service which allows you to rapidly identify which specific sentences of your articles have been duplicated elsewhere on the web and at which specific URLs they have been republished.
After analyzing a specific URL, PlagSpotter presents you with a report that includes a copy of the text published at the URL you specified, in which the sentences that have been duplicated elsewhere have been highlighted.
By clicking on any of the highlighted sentences you can get a list of the URLs where that specific content has been duplicated.
From the official site: "PlagSpotter is an online duplicate content checker that allows users to put their webpage URLs in to start an Internet duplicate content scanning and detection process.
The user can get a list of sites that duplicate the original source by displaying excerpts of the plagiarized text in a sentence by sentence format."
Free to use.
Try it now: http://www.plagspotter.com/
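PlagSpotter's crawling is proprietary, but the "sentence by sentence" comparison idea it describes can be sketched simply. This is a naive illustration with hypothetical article text and page URLs, not PlagSpotter's actual method:

```python
import re

def sentences(text):
    """Split text into rough sentences (naive: split after ., ! or ?)."""
    return [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]

def duplicated_sentences(original, suspect_pages):
    """Map each sentence of `original` to the URLs of pages that also contain it.

    `suspect_pages` is a {url: page_text} dict, e.g. from a crawl.
    """
    hits = {}
    for sent in sentences(original):
        urls = [url for url, body in suspect_pages.items()
                if sent.lower() in body.lower()]
        if urls:
            hits[sent] = urls
    return hits

# Hypothetical article and two pages to check it against
article = "Content curation takes skill. Good curators add context."
pages = {
    "http://example.com/copy": "As we said, good curators add context.",
    "http://example.com/other": "Totally unrelated text.",
}
print(duplicated_sentences(article, pages))
```

A real service would also normalize punctuation and tolerate small edits (e.g. via shingling or fuzzy matching) rather than requiring exact substring matches.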
"Wondering if some human at Google has reviewed your web site and decided it deserves to be penalized in Google’s search results? Google’s now reporting such cases nearly 100% of the time."
Robin Good: Danny Sullivan reports on Search Engine Land from the Pubcon conference where Matt Cutts said: “We’ve actually started to send messages for pretty much every manual action that we do that will directly impact the ranking of your site.”
“If there’s some manual action taken by the manual web spam team that means your web site is going to rank directly lower in the search results, we’re telling webmasters about pretty much about all of those situations,” he added.
This article reviews the two types of actions that Google applies most of the time: a) manual and b) algorithmic. Google prefers to refer to them as “actions” rather than penalties. Here are their key traits and characteristics:
Robin Good: This is pretty interesting.
“Given the fact that Google updates impacted at maximum 12-13 percent of U.S. searches, how is it that 40 percent of SEOs and website owners are reporting an impact?”
How is the overall impact of a Google Panda or Penguin update/data refresh measured, beyond Google’s own numbers?
PM Digital’s Clay Cazier proposes a method of measurement using Google Organic Click Turbulence and invites SEOs to participate.
"The purpose of his research was to determine whether Panda and Penguin actually had the negative impact reported by SEOs. Early in 2012, digital marketers were surveyed to determine which of Google’s search changes had affected their business. Fifty-four percent voted for Panda. In May, 65 percent of SEOs reported less traffic after April’s Penguin update.
Do opinion-based surveys reveal the true state of search after an algorithm change, though?"
Here is the hot take:
"This fear and doubt Google has put into organic with these updates has certainly resulted in increases in paid activity.
There may be an echo-chamber effect, where activity in forums and on blogs results in decision-makers moving budget to paid”
This is what most savvy SEOs and webmasters have been saying all along. It has nothing to do with fear and uncertainty. Google pushed out the quality sites specifically to get the site owners to pay for clicks.
If you can afford to pay an SEO top dollar and afford to pay for premium content, you can afford to pay Google for clicks. If Google takes your $100,000 investment and pushes it to page 3 of results you are left with only one thing to do, pay for clicks."
Read the full article here: http://searchenginewatch.com/article/2216573/Google-Panda-Penguin-A-New-Way-for-SEOs-to-Measure-True-Impact
(Thanks to Giuseppe Mauriello for suggesting this article)
From the original article on SER: "As you know by now, Thursday/Friday of last week, we had two pretty important Google updates.
We had an EMD update on Friday and a Panda update begin on Thursday and continue throughout this week.
The question webmasters and SEOs are asking themselves, if they saw a drop in Google traffic, is... Was it the EMD update that hit me or the Panda update that hit me?
Clearly, if you have a keyword-less domain, for example, rustybrick.com, and if you were hit (was not), then you know it was Panda and not EMD.
But what if you are searchengineland.com and you were hit (was not), how do you know if it was EMD or Panda?"
Find out by reading the full article here: http://www.seroundtable.com/google-emd-panda-15792.html
Check also this forum discussion inside the Google Webmaster Help Forum: http://productforums.google.com/forum/#!category-topic/webmasters/y4VsfZ7fo_c
From the original article on SEL: Google’s Matt Cutts just announced a new Google algorithm change via Twitter. He says it will reduce low-quality “exact-match” domains in search results.
The head of Google web spam fighting team Matt Cutts announced on Twitter that Google will be rolling out a “small” algorithm change that will “reduce low-quality ‘exact-match’ domains” from showing up so highly in the search results.
Robin Good: A new patent awarded to Google promises to give the search engine tools to confuse spammers and to gradually distinguish good original content from web sites "optimized" to game Google.
From the original article: "Google is planning to take action on websites which try to manipulate the organic rankings by making changes to their site or building links within a specified time period. This time period is measured and compared using a method more appropriately known as ranking documents.
Now Google would not fall prey to the tactics used by webmasters to manipulate rankings; instead, it would use the power of ranking documents to correctly measure the kind of changes a webmaster made to his/her site and the corresponding reactions in ranking.
This technique confuses a spammer and helps genuinely popular websites to rank well.
This is a strict warning to those webmasters who constantly change the on-page elements of their website. Google would be able to find those elements which had remained unidentified by Google’s spam prevention algorithm."
Robin Good: Excellent guide by Neil Patel on how to combat and fight back against content scrapers.
To find out how to investigate, identify, block and report whoever attempts to do this without your own consent, I highly recommend reading this guide.
Informative. Resourceful. 8/10
Robin Good: Although most website owners think that their web site is fine as it is, if you prompt them to go and check what Google Webmaster Tools says about their web site load times, they are generally surprised to discover that their site is slower than most others, often by a large margin.
Thus, if your speed test indicators inside Google Webmaster Tools say that your site is slower than 70% or more of the other sites out there, it is time to do something serious about this, as Google has clearly long announced that slow sites will be filtered, demoted and not given as much visibility as fast ones (that's what I'd still call a penalization).
Here is a great resource article that provides you with all of the info you need to check and verify whether you are doing the best to make your web site as fast as it can be.
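A first, rough version of this check can be scripted yourself. The sketch below times a page download with Python's standard library; the 3-second "median" threshold is an arbitrary illustrative assumption (Webmaster Tools reports your actual percentile), and it falls back to a hypothetical measurement when offline:

```python
import time
import urllib.request

def time_fetch(url, timeout=10):
    """Return the seconds taken to download the page at `url`."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start

def slower_than_most(load_seconds, median_seconds=3.0):
    """Compare a load time against an assumed median across sites.
    The 3.0s figure is illustrative, not Google's actual number."""
    return load_seconds > median_seconds

try:
    secs = time_fetch("http://www.example.com/")
except OSError:
    secs = 3.4  # hypothetical measurement, used when the network is unavailable
print(f"Loaded in {secs:.2f}s; slower than most sites: {slower_than_most(secs)}")
```

Note that this measures raw HTML download only; real page speed includes images, scripts and rendering, which is why the full resource article below is worth reading.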
Well done. Informative. Resourceful. 8/10
Robin Good: If you are curious to see how a negative SEO spammer can be tracked and uncovered, here is a good story to read.
From the original article: "Google recently updated its claims regarding the ability of other webmasters to affect your rankings via negative SEO.
While questions about the efficacy of negative SEO continue to exist, it does not seem to be slowing down the growth of what is arguably the most contemptible part of the search industry.
On July 9th, a good friend of mine reached out to me with a problem. As a very risk-averse webmaster, he constantly plunges into the numbers, especially anchor text diversity, in order to make sure his site is as penalty-proof as possible.
The latest data in SEOmoz's MozScape revealed a massive shift towards anchor text over-optimization for several primary terms. It took only a few minutes to identify the culprit."
The rest of the article contains a detailed and illustrated story of how the author was able to backtrack the attacker(s) and how he went about searching for hints.
From the official announcement: "...we’re announcing the Disavow Links feature in Bing Webmaster Tools.
Use the Disavow Links tool to submit page, directory, or domain URLs that may contain links to your site that seem "unnatural" or appear to be from spam or low quality sites.
This new feature can be easily found in the Configure Your Site section of the navigation."
Robin Good's insight:
If you are trying to better understand whether your site has been losing traffic due to Google Penguin, and possibly due to some low-quality links pointing to your site, here is an informative article summarizing the types of links you need to avoid in order to steer clear of any risk of being hit by Google Penguin.
Robin Good: If your site has suddenly disappeared from search engine result pages, you may well have overlooked one of these 17 typical mistakes, which can get your site de-indexed in a matter of seconds.
Nothing new under the sun in this list, but always a good reminder to look at if you are new to web publishing or if you have been recently penalized and are wondering what you may have done wrong.
If you are a seasoned SEO, let this be a reminder—or a crib sheet you can forward to anyone who is suggesting you do WHATEVER it takes to rank them.
Here’s a list of absolute “don’ts” where ranking is concerned.
Good reminder for everyone. 7/10
Robin Good: If you have been wondering whether your lowering traffic is a consequence of Google penalties or whether your blog or website is at risk of being soon hit by one of Google filtering algorithms, here is a great set of questions to ask yourself before deciding on what course of action to take.
From the original article: "Let’s say you’re a sole proprietor who’s hired someone to do SEO for you.
Or maybe you manage a marketing team, and SEO has always been one of those things you wish you had time for but decided to outsource instead.
How do you know that the SEO you're outsourcing is truly legitimate and won’t result in your website being considered spammy?
...So, how do you know if your website is spammy?
And what should you do to make sure it isn't considered *GULP* web spam?"
Among others, four typical traits of web sites "at risk" of being classified as spammy are:
1) Low use of social media
2) Over-optimized content, use of keywords, text manipulation
3) Many 404s and broken links
4) Lots of ads on the page
But there are at least four more that you should be aware of.
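Trait 3 in the list above is the easiest to audit yourself. A minimal sketch, using a hypothetical crawl result (a URL-to-status-code map) and an arbitrary 5% warning threshold of my own choosing:

```python
def broken_link_report(statuses, warn_ratio=0.05):
    """Given {url: http_status} from a hypothetical site crawl, flag the
    site if more than `warn_ratio` of its pages return 4xx/5xx errors."""
    broken = [url for url, status in statuses.items() if status >= 400]
    ratio = len(broken) / len(statuses)
    return {"broken": broken, "ratio": ratio, "at_risk": ratio > warn_ratio}

# Hypothetical crawl of a small site: one dead page out of four
crawl = {
    "http://example.com/": 200,
    "http://example.com/about": 200,
    "http://example.com/old-post": 404,
    "http://example.com/tag/seo": 200,
}
print(broken_link_report(crawl))
```

In practice you would feed this from a real crawler or from the crawl-errors report in Google Webmaster Tools rather than a hand-built dict.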
Useful. Good reminder to those too concerned with SEO and too little with creating unique, valuable content. 8/10
Robin Good: Google has launched its own version of (Bing) Disavow Links tool which has been designed to help webmasters reduce the negative impact of inbound spammy links when these cannot be easily removed by simply contacting the original web site owner.
N.B.: "Google recommends to remove your unnatural backlinks by either removing the links yourself or by sending a request to the concerned webmaster for the removal of the links. If still there remains some extra backlinks which you are unable to remove then you must use the Disavow tool."
Find out more about it here: http://click.xydo.com/toolbar_view/2038/7/25
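For reference, the file the disavow tool accepts is a plain UTF-8 text file with one URL or one `domain:` rule per line, and `#` for comments. The domains below are placeholders, not real examples:

```text
# Removal requests sent to this site's webmaster on 10/1/2012, no reply
domain:spam-site-a.example
# Individual pages we could not get taken down
http://spam-site-b.example/links/page1.html
http://spam-site-b.example/links/page2.html
```

A `domain:` line disavows every link from that domain, while a bare URL disavows only links from that specific page.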
Good, rational analysis by Gianluca Fiorelli of the key reasons that could trigger the new EMD (Exact Match Domain) filter on your website (English-language websites only), and why most of our negative emotional reactions to it could be viewed from a more rational standpoint.
Extracted from the original article intro: "I don’t deny that Google should have tried to refine this new algorithmic update better, having somehow repeated the same “black or white” mistake made with Penguin, but its purpose is surely laudable, and the real complaint should be why the hell Google waited so long before rolling it out, if it is true, as Bill Slawski with good reason suspects, that it was patented almost 10 years ago.
It’s this unjustified delay that is causing even respectable, non-spammy EMD sites to drop like flies.
A few days ago Dr. Pete wrote an excellent “instant post” describing and commenting on what the Mozcast metrics are saying about EMDs and the update. If you have not read it yet, I really suggest you do.
And read the comments too: a sort of anthology of everything that can be said against Google and Matt Cutts. Somehow it was like reading a webmaster forum inside SEOmoz.
The most common reaction was something like this:
"Hey Google, my EMD was totally fine. It had gazillions pages of content with gazillion words. It was all White Hat and legit and useful and You – tricky b*ast*rd – You screw it!"
Surely it is an understandable reaction. I’d react the same way if I were seeing my site falling from the first page into the Index Limbo.
But, let’s try to analyze what that kind of reaction actually tells us."
From the original article: "Google has refreshed its algorithm, this time with Panda update number 20.
This update affects 2.4% of English queries, while other languages see a 0.5% impact that's rarely noticeable. The Panda 20 update went live on September 27th, 2012. The first Panda update went live in February 2011.
What to do next?
You don't need to panic about this update. This refresh has been done to improve the quality of search results, so your strategy as a search engine optimizer should be to present the most relevant content to the user in an appropriate manner. It's better to stop overdoing SEO and take natural steps towards making your website look genuine. Concentrate on adding fresh content to the website, as Panda loves fresh content."
Your site must appeal to people.
From the original article on FastCompany: "Both the Panda and Penguin updates contained very clear messages for marketers: stop focusing on technology and tricks and start focusing on people. If your website appeals to people, it will appeal to Google's algorithms too.
But the Panda and Penguin messages go deeper. With them, the search engines are openly acknowledging that a website isn't the only place on the Web that a brand needs to maintain a strong presence.
The interactive exchanges that people have with each other and with the brand--online--are happening in the social media channel, and the search engines are placing an increasing importance on how these conversations influence their views on brands and how their websites should rank.
This means that a brand can no longer rely on a well-optimized website to earn Google's attention.
A brand must be a conversationalist, going where the people are and engaging them in discussion, and by doing that earn a wonderful reputation.
Smart brands are doing this by fully leveraging each social channel's particular properties".
Full article here: http://www.fastcompany.com/3000283/seo-isnt-what-you-think-it
Via Antonino Militello, Deanna Dahlsad
Robin Good: Stefan, from Pandacode just wrote to me saying:
"Try it here:
Yes, it is part of a public relations campaign from Microsoft but I really liked the way the test is set up.
You can try it for 5 search queries and then see the result.
Have fun trying."
"An article recently published on Fast Company has caused a bit of a stir in the content marketing and search engine optimization (SEO) communities. Written by Veronica Fielding, CEO of Digital Brand Expressions, it explains how the recent Panda and Penguin Google algorithm updates mean that social engagement rather than search engine trickery yields top results.
Social media, including Facebook, Twitter, Pinterest, Google+, YouTube and LinkedIn, also plays a role in the updated algorithms but not in the exact manner Fielding described in her Fast Company article.
First, some social-media channels weigh far more than others, mainly because of the technical barriers that search crawlers encounter when indexing them. The weight of these channels is based on how much information Google can crawl without being stonewalled by the social channels.
So, let’s take a look at what Panda, Penguin and social media really mean for brands".
Via Antonino Militello
Google has confirmed they pushed out a Panda refresh this past Monday. This update affected less than 1% of search queries and is a “minor” Panda refresh.
Robin Good: If you are wondering how your web site can be easily penalized by Google, here is a great review of the most popular and frequent types of penalizations.
Sujan Patel, co-founder of Single Grain, an SEO agency based in San Francisco, has put together this useful annotated list of search engine penalties, which brings together both the recent new algorithm updates Google has introduced and the classic troublemakers.
From the original article: "Have you seen a recent drop in your website’s traffic levels? Perhaps you’ve received a notification of unnatural SEO practices in your Google Webmaster Tools account?
Unfortunately, SEO penalties can happen to any website, at any time. While it is possible to repair the damage incurred by these negative effects, it’s ultimately much more effective to take a proactive stance on penalty prevention by avoiding the following known penalty causes:..."
"...keep in mind that things change all the time in the SEO world – so this list shouldn’t be construed as the “end all, be all” of penalties your site might experience in 2012."
Useful reminder. 7/10
Robin Good: Useful advice from John Doherty on today's Whiteboard Friday at SEOMoz for using internal links and links among multiple web properties in ways that are not going to be penalized by Google Penguin.
Video and full text transcription: http://www.seomoz.org/blog/smarter-internal-linking-whiteboard-friday