Google Gets Tough on Black-Hat SEO
I am IBM’s representative to the Google Tech Council, a consortium of B2B tech companies that gets together quarterly to share best practices and learn how to do a better job with search engine marketing (SEM) on Google.
In our most recent meeting, one of my counterparts raised this issue: his biggest problem is dealing with content farms. When he buys advertising around keywords his company cares about, he doesn’t want the pages on which those ads appear to be full of links to content farms, which everybody will admit are junk sites. He said his boss gets very angry when he pays to have his company’s links associated with junk sites.
Content farms are sites that aggregate links and other content from all over the Web related to some set of keywords. Owners of these sites make money by getting traffic from Google and selling Google AdSense advertising on them. Because of the way these sites are optimized, often using black-hat SEO tactics (which I explained in a previous column called Search Is Not Just a Tactic), they tend to rank well in Google despite being devoid of any original content.
Users hate content farms. They are just collections of links, often only loosely relevant to the keyword. As such, the search engine results page (SERP) in Google is more useful than any content farm page, and it becomes less useful the more content farms it returns. Like you, I have often clicked into one of these content farms and, when I do, I can’t bounce out of it fast enough. Google has a kind of machine learning in its algorithm that tends to punish sites with high bounce rates, so no single content farm will rank well in Google for very long. But there are so many of them that, for the past few years, they have managed to maintain a presence on Google’s top results page for thousands of keywords.
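To make the bounce-rate idea concrete, here is a toy Python sketch. Google does not publish how (or whether) it folds bounce behavior into rankings, so the scoring rule, the 10-second threshold, and the session data below are all invented for illustration.

```python
# Hypothetical session logs: (page_url, seconds_on_page, clicked_deeper)
sessions = [
    ("contentfarm.example/widgets", 3, False),         # user bailed immediately
    ("contentfarm.example/widgets", 5, False),
    ("original-research.example/widgets", 180, True),
    ("original-research.example/widgets", 240, True),
    ("contentfarm.example/widgets", 4, False),
]

def bounce_rate(url: str) -> float:
    """Share of visits to url that ended in a quick exit with no further clicks."""
    visits = [s for s in sessions if s[0] == url]
    bounces = [s for s in visits if not s[2] and s[1] < 10]  # arbitrary cutoff
    return len(bounces) / len(visits)

for url in sorted({s[0] for s in sessions}):
    print(f"{url}: {bounce_rate(url):.0%} bounce rate")
```

A ranking system that demotes pages like the first one would behave roughly as described above: each individual farm page gets punished quickly, even as new ones keep appearing.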
All that changed recently when Google announced sweeping changes to its algorithm to deal with content farms. In a blog post titled Finding more high-quality sites in search, Google explained:
Many of the changes we make are so subtle that very few people notice them. But in the last day or so we launched a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of our queries—and we wanted to let people know what’s going on. This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.
The key phrase in the quote is how Google defines high-quality sites: “Sites with original content and information such as research, in-depth reports...” (emphasis mine). This is music to my ears. Our entire content strategy at IBM centers on creating high-quality original content geared towards an audience that primarily looks for content through search. Yet we often struggle to get this content ranked on the first page in Google, in part because content farms and other sites deploying black-hat techniques cheat the algorithm and we don’t.
Content farms in one form or another have been a problem for search engines for years. One way Google beat the competition was in being rather ruthless about punishing the black-hat SEO implicit in content farms. But like virus writers, black-hat SEOs get ever more sophisticated in how they cheat the algorithm and avoid the scrutiny of Google’s large squad of spam police. I don’t think our feedback to the Google Tech Council had that big of an impact on these algorithm changes. Rather, several high-profile cases in recent months led to Google taking decisive action.
The Curious Case of J.C. Penney
Recently, the New York Times published an exposé called The Dirty Little Secrets of Search about the black-hat SEO practices of J.C. Penney, or rather, of its former SEO consulting firm, which J.C. Penney fired after Google discovered the link farms. When Google found them, it stripped all of J.C. Penney’s pages of any link juice, resulting in a fall from the top position for hundreds of keywords to somewhere in the 70s, for all intents and purposes out of the organic rankings.
In the story, it became clear what transpired to earn Google’s wrath: links were bought, millions of them, from utterly irrelevant, low-rent websites pointing into key pages on JCPenney.com. This made it appear that lots and lots of sites found JCPenney.com particularly relevant for the keywords in question, and JCPenney.com shot to number one in the rankings for hundreds of keywords.
Link swapping and link purchasing are not new tactics, but they became more prevalent after Google changed its algorithm to give more weight to external links into a page. When Matt Cutts’ webspam team at Google started looking for reciprocal link relationships as a way to sniff out link-swapping schemes, link purchasing got more sophisticated: practitioners sought out sites that were off Google’s radar and offered to buy links on their pages.
The scheme worked particularly well because the link anchor text on the sites where links were bought could be tuned to correspond to the keywords on the JCPenney.com pages, and anchor text is the single most important aspect of the Google link algorithm. So even though the sites were not well ranked by Google, the links earned as much credit as they could. The other factor was volume: millions of cups of link juice add up to more than hundreds of gallons of the stuff. Higher-profile sites pass link juice by the gallon, but they are also on the radar of Cutts’ team. Lower-profile sites pass it by the cup, but they stayed off that radar.
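A deliberately crude Python caricature shows why this arithmetic works. This is not Google’s algorithm; the scoring rule, the authority numbers, the anchor-match weights, and the example keyword are all made up to illustrate how exact-match anchor text plus sheer volume could beat a few high-quality links.

```python
# Caricature of link-based ranking: each inbound link contributes
# (site_authority x anchor_match), where anchor_match is high when the
# anchor text matches the target keyword. All weights are invented.

def link_score(links, keyword):
    score = 0.0
    for authority, anchor_text in links:
        match = 1.0 if keyword in anchor_text.lower() else 0.1
        score += authority * match
    return score

keyword = "carry on luggage"

# A handful of high-authority links with generic anchors (the gallons)...
editorial_links = [(50.0, "a retailer we like")] * 5

# ...versus thousands of bought links with tuned anchors (the cups).
bought_links = [(0.05, "Samsonite carry on luggage")] * 10_000

print("editorial:", link_score(editorial_links, keyword))  # 5 x 50 x 0.1 = 25
print("bought:   ", link_score(bought_links, keyword))     # 10000 x 0.05 = 500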
That all changed after J.C. Penney appeared to profit significantly from link purchases on such a massive scale. Google won’t tell us what the poison pill is, but Cutts’ team developed one after this incident, and this kind of link purchase is now self-defeating. Sites discovered selling links, such as Forbes.com, also lost much of their link credibility.
The message for white-hat SEOs: Any kind of link-swapping scheme, however genuine, is subject to the same poison pill. Your linking efforts must happen organically. At IBM, we forbid our content people from directly reaching out to other relevant sites for the sole purpose of link building, unless we have a business relationship with the owners of partner sites. Outside of partners, we don’t build links at all. The links we get happen naturally through content best practices. These include:
- Ensuring press releases have properly coded and annotated links in them
- Ensuring that all related assets such as videos and podcasts are coded with the appropriate links and link anchor text
- Ensuring that partner sites use the appropriate anchor text when linking to relevant IBM pages (a small audit sketch follows this list)
- Helping IBM subject matter experts (SMEs) gain credibility and develop external connections by surfacing them on ibm.com
- Enabling IBM SMEs to be more effective with the links they build into their social media activities
- Ensuring that content owners connect with SMEs to facilitate linking best practices
- Building high-quality Wikipedia pages about our original research contributions to the industry
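To make “properly coded” and “appropriate anchor text” concrete, here is a minimal Python sketch of the kind of link audit these practices imply. The generic-anchor list, the HTML fragment, and the URLs are hypothetical; only the nofollow behavior reflects how search engines actually treat that attribute.

```python
# Minimal link audit: flag generic anchor text and note rel="nofollow"
# links, which pass no link equity. Uses only the standard library.
from html.parser import HTMLParser

GENERIC_ANCHORS = {"click here", "read more", "here", "link"}

class LinkAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self._open_link = None  # (href, rel) of the <a> tag being read
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            self._open_link = (attrs.get("href", ""), attrs.get("rel") or "")

    def handle_data(self, data):
        if self._open_link:
            href, rel = self._open_link
            if data.strip().lower() in GENERIC_ANCHORS:
                self.findings.append(f"generic anchor {data.strip()!r} -> {href}")
            if "nofollow" in rel:
                self.findings.append(f"nofollow (no link juice) -> {href}")
            self._open_link = None

# Hypothetical page fragment to audit
html = """
<a href="https://www.ibm.com/software">IBM middleware software</a>
<a href="https://www.ibm.com/software">click here</a>
<a rel="nofollow" href="https://www.ibm.com/research">IBM Research</a>
"""

auditor = LinkAuditor()
auditor.feed(html)
print("\n".join(auditor.findings))
```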
These practices will tend to yield better link equity as time goes on. Not all of them result in direct link juice, because social sites and Wikipedia apply the nofollow attribute to outbound links. But marketing your original content in these places will ultimately result in high-quality links, as third-party sites pick up on the buzz you generate and link to your pages. As Google learns to weed out the black-hat link-building sites, IBM’s natural position on Google SERPs will improve.