From the original article by Lisa Buyer on Search Engine Watch: "Contrary to popular belief, Google says the Penguin intent is to help the overall search experience versus put legitimate businesses in jeopardy of losing precious web traffic and bottom line sales.
Unfortunately, innocent bystanders report they are taking a hit with little defense against Google, the largest search engine boasting 66.4 percent of the search market share and not to be ignored.
How can a business protect itself from the potential crush of Penguin or the next Google algorithm change? There is something to be said for not putting all your SEO eggs in Google's basket.
Deep SEO Inhale... Long Social Media Exhale
There is life beyond Google for gaining online visibility. The opportunities are greater than ever to take part in some healthy SEO living from other organic marketing sources in places like social media networks.
Read on for advice from veteran online marketers."
Good alternatives for when a Google update hits and you are about to give up.
Robin Good: If you are looking for key answers about which triggers have unleashed the new Google Penguin on your site, here is some very good food for thought.
From the original article: "One common factor thus far appears to be the signals of links that are pointing to your website, early analysis indicates."
If you want to understand better what types of links are the cause of this new penalization, check whether any of the link types listed in the article are pointing to your penalized web site.
From the article intro: "Having overly optimized web pages could soon get your websites in some hot water with Google and their search results. It has recently been announced that Google will start to penalize websites that engage in over-optimization practices.
In this week's Whiteboard Friday, we will be covering some changes that you should be making to your SEO practices in order to avoid this type of penalization.
...
This week we've been hearing a lot of chatter in the SEO blogosphere and on Twitter and on the forums about this new potential Google penalty that's coming down the line around over-optimization.
...
But before this penalty hits, for goodness sake, SEO folks, let's make these changes to our websites because we could be in real trouble if we don't impact these things beforehand.
I think these are some of the most likely candidates to be hit by Google's over-optimization penalty, some of the most likely patterns they're going to try and match against in this upcoming change. So let's talk through them."
From the original article: "This question should interest every entrepreneur and anyone who is active in one way or another on social media. These different options will give you the necessary support.
Imagine that you’ve spent years building a business and growing a website. You launch a promotion to email a list of potential customers with exciting new opportunities to save lots of money by doing business with you. Did you email too many people, or did you email the wrong people? Did someone turn you in as a “spammer” to one of the many spammer blacklist organizations out there?
Everyone hates real spammers, and the last thing you want to do is get labeled as one. That’s one kind of blacklist. The other kind is worse – the search engine blacklist. That’s the one that is basically a death sentence for your site because Google and other search engines stop crawling your site or even listing it in search results. No blacklist is good to get, because ISPs and many content filtering services access Internet blacklists to figure out not only what email to block, but also what websites to block or to mark as potentially dangerous.
How do you know you’re on the Google List or any other database of blacklisted sites?"
From the original article: "For the past few months, Google has been working on a new penalty that targets sites that overly optimize for search engines.
Matt Cutts said the new over optimization penalty will be introduced into the search results in the upcoming month or next few weeks.
The purpose is to “level the playing field,” Cutts said. To give sites that have great content a better shot at ranking above sites that have content that is not as great but do a better job with SEO."
Also check out this video from Matt Cutts, dating back to 2009, which illustrates how profoundly Google's take on this issue has changed, if not reversed altogether: http://www.youtube.com/watch?v=Bz0KQNPDUoc
A must-listen. 8/10
Here is a full-text transcription (and the audio recording) of exactly what Matt Cutts said: http://selnd.com/FTwher
February 24, 2011 was a day that will live in infamy for the team here at Viewpoints. That was the day of the Google Panda update. Up until that point we had enjoyed four years of consistent traffic growth to Viewpoints.com.
From the article: "The truth of the matter was that it could have been any and all of these. Although we had hundreds of thousands of great reviews, we had probably let some slip through that did not deserve to be published. And we had not paid enough attention to speed, ad density or other hallmarks of a good user experience.
So we set out to fix our Google problem but at the same time, we resolved to think bigger and longer term and use this opportunity to create a better user experience, regardless of what Google was looking for.
So now you can judge the results for yourself. We have spent the last 12 months as a team of 25+ professionals reinventing Viewpoints. From March to June we removed 40% of the ads across the site. We improved the speed of the site by 3x. We moderated out 80,000+ reviews that did not meet minimum quality standards.
Unfortunately these changes had only a nominal impact on traffic.
So we decided it was time for a more radical approach."
From the article intro: "We know how hard the Google Panda update has hit some site owners.
As many as 87% are still suffering from the Panda update, 10% of you lost your jobs and many more are afraid they will; tons of owners let people go, and some even lost their homes.
Panda along with the recession and housing slump has not been fun for many.
I spotted this WebmasterWorld thread with someone who was a top AdSense publisher, with a staff and a great life.
His site was hit by Panda and his life changed forever.
His story is at once sad, uplifting and a bit scary."
Robin Good: Search Engine Land celebrates one year of Google Panda with a cute infographic summarizing the Google algorithm traits and its history on the web over the last 12 months.
If you have not yet heard about Google Panda, and you have a web site, it is a good idea to start getting familiar with this stuff. It can save you lots of time and frustration.
From the article: "This is a guide on how to detect and recover from Google's Panda algorithm update, based on our Panda fighting experience at SEOgadget in 2011."
Key takeaways:
1) The world is not as it once was.
Crap websites trying to masquerade as decent websites are being hunted down and sunk below the quality line.
2) Google owes you nothing.
Tactics to just barely raise your quality enough to recover your rankings are unlikely to pay dividends. You may well find yourself a loser again the next time the quality bar is raised.
3) Google is judging you.
Google is going to continue to raise the quality bar with future updates. When your competitors improve their websites, you will be weakest and in line for the chop at the next quality update.
From the article: "We've asked when will Google roll the Panda algorithm more seamlessly into their algorithms - where Google does not have to manually press a button to run the Panda algorithm, but rather where it runs all by itself (I am pretty sure I am oversimplifying it).
Google made an announcement late Friday afternoon with 17 search quality updates.
"High-quality sites algorithm improvements. [launch codenames "PPtl" and "Stitch", project codename "Panda"] In 2011, we launched the Panda algorithm change, targeted at finding more high-quality sites.
We improved how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines. We also released a minor update to refresh the data for Panda."
...
It seems like this means Google has made Panda a bit more integrated into the mainstream algorithm, allowing it to possibly run more frequently and who knows, maybe more real-time?
From the article: "Google today listed changes it made to its algorithm in January.
As previously discussed, the biggest takeaway from that (at least in my opinion) was an increased focus on freshness through not only updates to the “Freshness Update,” but also through changes to universal search, which focus on the queries that deliver news results.
The company also addressed a recent Panda tweak:
High-quality sites algorithm improvements. [launch codenames “PPtl” and “Stitch”, project codename “Panda”] In 2011, we launched the Panda algorithm change, targeted at finding more high-quality sites. We improved how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines. We also released a minor update to refresh the data for Panda."
Robin Good: If you are curious to see which sites have been badly hit by the new Penguin Google filtering algorithm, here is interesting information.
"The official goal was to “take care of” over-optimized websites containing too many unnatural links, automated content (spinning), keyword stuffing etc. Google is trying to kill webspam altogether.
The impact on all keyword queries is about 3.1%, which, compared to Panda (around 12%), is much lower.
But Google said more short-head/visible keywords should be affected."
Interestingly, the web sites that were most negatively impacted fell above all into four specific categories:
"a) Database-driven websites – they mainly aggregate information and use large database systems to create as many pages as possible. Sites such as songlyrics.com, great-quotes.com, cubestat.com or lotsofjokes.com fall into this pattern.
b) Press portals and feed aggregators such as pressabout.us, newsalloy.com and bloglines.com were also affected, which makes sense from a Google point of view since these are the website types that are very often created by very aggressive (possibly overly aggressive) SEOs and often contain similar content.
c) A couple of heavily template-based websites were also affected – ticketnetwork.com/ticketcity.com, hotelscombined.com and customerservicenumbers.com fit Google’s anti-SEO bill perfectly when it comes to automatically generated (possibly also spun) content.
d) Furthermore, a lot of sites that copy or rehash other people’s content (or are used by their users to do that) were demoted – examples include major sites such as digg.com, folkd.com and pastebin.com."
Robin Good: Google is about to release globally, and for all languages, a new algorithm change that will significantly penalize web sites utilizing "black hat" SEO techniques to rank inside Google search engine result pages.
From the article: "In the next few days, Google will be launching an important algorithm change targeted at webspam.
The change will decrease rankings for sites that are violating Google’s existing quality guidelines.
This algorithm represents another improvement in Google's efforts to reduce webspam and promote high quality content.
...
Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.
The change will go live for all languages at the same time.
For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice. The change affects roughly 3% of queries in languages such as German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages. For example, 5% of Polish queries change to a degree that a regular user might notice."
"Did you know that hundreds of Googlers work around the clock to make sure the ads you see on Google are safe?
David Baker, Engineering Director at Google, explains more about our fight against scam ads and our process for keeping you safe." (from Google)
From Search Engine Roundtable: "Google has uncovered a bit on how they manage ad safety with a new blog post yesterday afternoon. Google announced they have shut down about 800,000 advertisers and blocked over 130 million ads from showing up on their network.
They have come a long way in this regard and it is a constant battle between Google and rogue advertisers.
Google has documented some of the general steps they take to detect bad ads and advertisers.
(1) They monitor ad and landing page content to detect scams, malware and such.
(2) Manual reviews of ads after they are flagged by an algorithm."
From the article: "Since the Panda update, more and more people are trying to control their Google index and prune out low-quality pages.
I’m a firm believer in aggressively managing your own index, but it’s not always easy, and I’m seeing a couple of common mistakes pop up.
One mistake is thinking that to de-index a page, you should block the crawl paths. Makes sense, right? If you don’t want a page indexed, why would you want it crawled?
Unfortunately, while it sounds logical, it’s also completely wrong."
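The article's point can be sketched with a minimal, hypothetical example (the path and markup below are illustrative, not from the original): blocking a URL in robots.txt stops Googlebot from ever fetching the page, so it can never see a noindex directive placed on it.

```html
<!-- WRONG: with "Disallow: /old-page/" in robots.txt, Googlebot never
     fetches the page, so it cannot see any noindex directive on it
     (the URL may even linger in the index as a bare reference). -->

<!-- RIGHT: leave the page crawlable and put this in its <head>;
     on the next crawl the page is dropped from the index: -->
<meta name="robots" content="noindex, follow">
```

In short: to de-index a page, keep the crawl path open and let the meta robots tag do the work.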
From the original article: "Understanding the way that search engines like Google and Bing crawl your sites for duplicate content is not always easy to follow.
Google Panda evaluates the quantity and quality of the content housed on your site and assigns a value to your website or section of your website.
It’s important to note that just because you’ve updated your content, changes will not necessarily be reflected by Google until they update your Panda rank."
Robin Good: Andy Atkins-Krüger at Search Engine Land has written a very interesting and insightful report directed both at Google and at owners of multilingual websites which may have been affected by Google Panda.
According to the reasoning put forward in this article, there is a possibility that Google's recent emphasis on requiring webmasters to adopt the canonical and hreflang tags for multilingual websites may be a rather clumsy Panda fix.
"Canonicals and Hreflang tags are visible on the page to Panda and say “Please leave me in – I’m not just a duplicate and have a specific local market purpose and this is the market.”
"Many large websites rely on machine translation (not a good solution for SEO at any time) and they are particularly affected by Panda.
Google, if you disagree with me, please explain why all of this extra coding is suddenly needed."
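For readers unfamiliar with the tags under discussion, here is a minimal sketch of what the canonical and hreflang markup looks like; the domain and paths are hypothetical placeholders.

```html
<!-- In the <head> of the English page of a multilingual site:
     declare its canonical URL and point search engines at the
     language alternates. -->
<link rel="canonical" href="http://example.com/en/page/" />
<link rel="alternate" hreflang="en" href="http://example.com/en/page/" />
<link rel="alternate" hreflang="de" href="http://example.com/de/page/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```

Each language version carries its own canonical plus the full set of alternates, which is exactly the extra per-page coding overhead the author is questioning.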
From the article intro: "I wasn’t expecting this to come until early March, since the month isn’t even over yet, but Google has gone ahead and released its monthly list of updates: 40 changes for February.
While we’ll take a deeper look into the list soon, it’s worth noting right off the bat that there is a Panda update listed.
Late last week, in light of Panda’s one-year anniversary, I asked Google if the Panda adjustment from January’s list had been the most recent adjustment to Panda.
The response I received from a spokesperson was:
“We improved how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines.
We also released a minor update to refresh the data for Panda.”
This was basically what the company said in January. Now, in today’s list for February, Google says:
“This launch refreshes data in the Panda system, making it more accurate and more sensitive to recent changes on the web.”
So between January’s and February’s Panda news, it sounds like Panda is more ingrained into how Google indexes the web than ever before, and may even be pickier about quality."
Google has confirmed a new Panda update at the same time that it’s announcing 40 search updates that happened in February (or are in progress right now).
Here’s what Google says about its latest Panda-related change:
Panda update. This launch refreshes data in the Panda system, making it more accurate and more sensitive to recent changes on the web.
This sounds very similar to Panda 3.2, which happened in mid-January and was described only as a “data refresh” and not related to new or changed ranking signals.
Robin Good: This is an excellent and thorough guide on how to find out whether you have been hit by Panda, and on how to recover from Google's most feared automatic penalization algorithm to date, based on the Panda fighting experience at SEOgadget in 2011.
From the article:
"Panda is about dealing with bad content, not bad links. Bad content comes in different flavours: duplicate, weak, thin and template.
Panda acts like a domain wide penalty: your whole site is affected and your good stuff is dragged down by your bad stuff.
Web crawler accessibility issues affect how search engines see, and therefore assess, your content.
Often, badly designed Information Architectures compound the problems with already weak content.
Large sites that have many pages, templated content and lots of sub-categories are the most at risk.
If you haven’t been monitoring and fixing your accessibility issues, as highlighted in Google Webmaster Tools, you are at risk."
Robin Good: Although Google declared a full-blown war on "thin" and "shallow" content about a year ago, you, like me, may still be confused about what constitutes the low-quality type of content that Google does not like.
Brian Ussery has recently published a good guide focusing specifically on this very aspect: understanding exactly which low-quality content signals your web site or blog may still be sending out.
Robin Good: Now that Google pays so much more attention to "user experience" and satisfaction on our site, it becomes a must to start looking at all the ways in which you can improve the key metrics Google increasingly looks at.
Among these, "bounce rate" is one of the most important: it represents the percentage of visitors who abandon your site after viewing only one page.
The bounce rate per se is not, in absolute terms, a negative element; but when paired with very short times on the page and frequent use of the browser "Back" button by the visitors abandoning your page, it becomes a major red flag telling Google your site may not be worth much.
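As a quick illustration of the metric itself, bounce rate is simply the share of sessions that end after a single page view. A minimal sketch with made-up numbers:

```python
# Bounce rate: percentage of sessions that viewed exactly one page.
# The figures below are illustrative, not from any real analytics data.

def bounce_rate(single_page_sessions, total_sessions):
    """Return the bounce rate as a percentage of all sessions."""
    if total_sessions == 0:
        return 0.0
    return 100.0 * single_page_sessions / total_sessions

# Example: 420 of 1,000 sessions left after one page -> 42% bounce rate.
print(bounce_rate(420, 1000))  # 42.0
```

Analytics tools compute this for you, but seeing the arithmetic makes clear why the raw number means little without the accompanying time-on-page signal.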
To improve this situation, Kelly Shaver has written a valuable short guide for the WebTrends-sponsored Customer Experience Series on Mashable, covering key strategies you should invest time in to improve bounce rates and the overall user experience on your site.
The seven key strategies include (my short explanation below each one):
1. Be Mindful of Ad Placement
Be careful of where you put your ads and how many you use.
2. Lazy-Load Third-Party Content
Load third-party widgets and other non-essential content after the main content.
3. Contrast Is Key
Make your pages as readable as they can be.
4. Have Clean, Accessible Navigation
No more words needed.
5. Your Message Should Be Immediately Obvious
State clearly and immediately what you are about.
6. No Distractions, Please
Quiet down your site's distractions.
7. Have a Responsive Layout
Use the latest technologies to code your site so that it displays well across all kinds of devices.
To get all of the details for each one of these, please go and check the full article.
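The last point, a responsive layout, boils down to adapting the page to the viewport with CSS media queries. A minimal sketch (the class names and breakpoint are hypothetical, not from the article):

```css
/* Two-column layout on wide screens, single column on small devices. */
.main-content { width: 70%; float: left; }
.sidebar      { width: 28%; float: right; }

@media (max-width: 600px) {
  /* Stack the columns so the page stays readable on phones. */
  .main-content, .sidebar { width: 100%; float: none; }
}
```

The same markup serves every device; only the stylesheet decides how it flows.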