“You ain’t gonna work on Maggie’s spam farm no more.”
Fed up with sites that produce “shallow or low-quality content,” Google has finally released a new spam detection classifier to help prevent on-page spam from ranking in Google’s search results pages. And although the news was met with both cheers and tears (given some collateral damage), we all knew this was coming.
Matt Cutts, head of the Webspam team at Google, hinted at this spam-catching action last November during his talk at the PubCon search and social media conference in Las Vegas. A long-time opponent of spam (in all its unsavory forms), Cutts vilified sites that scrape and copy original content from other pages, as well as sites that offer negligible levels of original content. In his talk, Cutts promised that 2011 would be the year Google invested in fresh spam-catching efforts, and that they would also take a hard look at identifying hacked sites that routinely push poor content into the SERPs.
So, on January 21st, the Webspam team at Google made good on their earlier threat to enact new techniques to fight content spam, altering the algorithm to drive down spam levels with the new classifier.
So, does it work? Maybe.
Some early detractors believe that this is a job too detailed to hand over to an automated algorithm. Others think that Google is still having a difficult time with even basic link schemes. In the end, Google has the last word. Eliminating content spam sounds like a great idea; only time will tell if it’s truly effective and beneficial to Google’s searchers. Clearly, Cutts thinks it’s time for a change. And his views are often driven by feedback he receives from SEOs and search users alike.
“People are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content,” said Cutts on his recent blog post. “We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception. However, we can and should do better.”
When it comes to sorting out webspam or content spam, Cutts explained that the new classifier is more adept at identifying spammy content and spammy words on derivative, self-promoting web pages and blog sites. Google also improved its ability to root out hacked sites that are pushing spammy content.
Cutts himself defined content farms as sites with “shallow or low-quality” content and has promised to evaluate forthcoming changes that may help drive spam levels even lower.
For those of us in the content and search marketing industry, it’s common knowledge that original, relevant information will always increase interest, traffic and profits. The new spam detection classifier may be the first of numerous changes from Google that will benefit us all. But, as stated above, only time will tell.
Yes, our opening line was a tip of the hat to “Maggie’s Farm” by Bob Dylan.
Google Webmaster Tools has a new HTML message that talks specifically about monitoring for duplicate content.
Update: this topic was well covered at SMX West 2011.