So in a blog post today, Matt Cutts, Principal Engineer at Google, says Google is cracking down on content farms whose sole purpose is ranking high in search results, thereby generating traffic and, ultimately, clicks on ads.
“We hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect.”
Makes sense to me. Google’s only asset is the quality of its search results. As long as Google delivers me search results that answer my questions, I’ll come back. Why would I bother trying some other search engine?
Google has always seemed to be working hard to anticipate what I’m looking for, versus what someone’s trying to shove in front of me.
For instance, I just now Googled “restaurant” and yeah, OK, there’s restaurant.com, and the Wikipedia entry for “restaurant,” but most of the page displays restaurants around my current location. Google understands that when I search “restaurant,” I’m probably not asking how to start a restaurant, or what a restaurant is, or who makes the best restaurant equipment.
Point being, Google’s priority is delivering results that the user values – that’s basically Google’s definition of “quality.” If anything starts diluting that quality, Google will push back. And “webspam,” by Google’s own account, is diluting the quality of search results. Cutts writes:
“Just as a reminder, webspam is junk you see in search results when websites try to cheat their way into higher positions in search results…”
The obvious implication is that these content farms could have a brief lifespan. They exist only because of Google’s search algorithms, and Google can change those algorithms anytime.
The second implication is that Google is paying attention to tactics that drive results higher than their “natural” ranking. That’s called “search engine optimization” – but it’s a very broad topic. At one extreme, it simply means avoiding mistakes that will keep your site unnaturally low in Google rankings: failing to assign a unique URL to every page, failing to give each page a unique title, or leaving relevant terms out of titles and headlines.
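Those basics lend themselves to an automated check. Here’s a minimal sketch – my own illustration, not anything from Google or the post – that scans a list of pages (hypothetical URL and title data) for the hygiene problems just described: duplicate URLs, duplicate titles, and missing titles.

```python
def find_seo_basics_issues(pages):
    """Flag pages that share a URL or a title, or that have no title at all.

    `pages` is a list of dicts with "url" and "title" keys.
    Returns a list of (url, problem) tuples.
    """
    issues = []
    seen_urls = set()
    seen_titles = set()
    for page in pages:
        url = page.get("url")
        title = (page.get("title") or "").strip()
        # Every page should live at its own URL.
        if url in seen_urls:
            issues.append((url, "duplicate URL"))
        else:
            seen_urls.add(url)
        # Every page should carry its own descriptive title.
        if not title:
            issues.append((url, "missing title"))
        elif title in seen_titles:
            issues.append((url, "duplicate title"))
        else:
            seen_titles.add(title)
    return issues

# Hypothetical example pages:
pages = [
    {"url": "/listings/portland", "title": "Portland Homes for Sale"},
    {"url": "/listings/bangor", "title": "Portland Homes for Sale"},  # duplicate title
    {"url": "/listings/augusta", "title": ""},                        # missing title
]
print(find_seo_basics_issues(pages))
```

Nothing fancy – the point is that this end of SEO is just housekeeping a script can verify, not secret sauce.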
At the other extreme is what I consider aggressive SEO: figuring out some secret sauce that gets your site into a high rank that makes no sense to the casual observer. Try this. Google “Maine real estate.” I offer this example because I’m familiar with the architecture of the sites under the MaineToday brand (where I worked until 2008). You’ll see those sites in the first page of Google results (numbers 3 and 4 – sweet!). They have home listings from all over Maine, so logically, they should be ranked high on a search for “Maine real estate.” But check out the other sites on your first page of Google results. On this day, I’m seeing three sites that have only a few listings for a couple of very small areas of Maine, and another that’s nothing but a set of links to other sites. So out of 10 search results on the first page, 40 percent are not very useful for someone generally looking to buy a home in Maine.
Under Google’s definition of “webspam,” and Google’s goal of “perfect” search results, that’s a 40 percent error rate.
So the second big implication of Google’s crackdown on “webspam” is this: Does aggressive SEO have a future?