The NoFollow Fix-All

August 29th, 2008 · 2 Comments

When all you have is a hammer, every problem looks like a nail. Or something like that…

We’ve been hunting for bad neighborhoods. In fact we’ve been hunting desperately for them. As I’ve mentioned before, we’ve clearly been “punished” by Google and we’re trying to find the cause. At about the time of our last spider, our inbound Google traffic basically dropped to nothing and the only thing we could come up with was that we were linking to a “bad neighborhood.”

Walt Disney World for Grownups links to quite a few sites. We have painstakingly culled a terrifying amount of Disney World information down to the “best of the best.” We fear, however, that somewhere in the process we’ve linked to a bad neighborhood. So, in a completely twisted maneuver, we’re modifying our content in the pursuit of PageRank.

Our status with MSN, Live, and Yahoo seems to be improving consistently, but Google is now all but invisible, and we’re fairly desperate to fix this problem. Unfortunately, trying to figure out which of our many outbound links (we are a link directory, after all) is the troublemaker is time-consuming, to say the least. Do we ditch content we think is worthwhile in the pursuit of PageRank, or do we take a stand against the entity that is Google and keep our content?

Enter “nofollow”

Google’s information on rel=”nofollow” is roughly in line with what I’ve been led to believe by the rest of the ‘net. The implication is that by adding this attribute to my links, I am no longer responsible for the content at the destination. So our obvious solution is to take most of our outbound links and make them nofollow.
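To make the mechanics concrete, here is a minimal sketch of what “making a link nofollow” amounts to — just adding the rel=”nofollow” attribute to the anchor tag. The helper function and example URL are illustrative, not part of our actual site code:

```python
def nofollow_link(href, text):
    # Hypothetical helper: wrap an outbound link with rel="nofollow",
    # which tells crawlers we do not vouch for (or pass PageRank to)
    # the destination page.
    return '<a href="%s" rel="nofollow">%s</a>' % (href, text)

print(nofollow_link("http://example.com/", "Example Site"))
# -> <a href="http://example.com/" rel="nofollow">Example Site</a>
```

The same attribute can be added by hand in a template, of course; a helper like this just makes it easy to apply the change across an entire link directory at once.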

Of course one wonders: if Google is so concerned with identifying pages that link to “bad neighborhoods,” why would they allow us to wash our hands of all our links? If they’re trying to identify and penalize spam sites, are they really ignoring these links? What’s their incentive to do so? Obviously there is an incentive to allow sites like Wikipedia not to confer their PageRank on contributors (more on this in another blog), but why would they deny themselves access to information that is useful to them and already at their disposal?

Google seems to be pretty explicit on the matter:

Untrusted content: If you can’t or don’t want to vouch for the content of pages you link to from your site — for example, untrusted user comments or guestbook entries — you should nofollow those links. This can discourage spammers from targeting your site, and will help keep your site from inadvertently passing PageRank to bad neighborhoods on the web. In particular, comment spammers may decide not to target a specific content management system or blog service if they can see that untrusted links in that service are nofollowed. If you want to recognize and reward trustworthy contributors, you could decide to automatically or manually remove the nofollow attribute on links posted by members or users who have consistently made high-quality contributions over time.
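Google’s suggestion of lifting nofollow for trustworthy contributors can be sketched as a one-line conditional when rendering user-submitted links. The function name and the `trusted` flag are illustrative assumptions, not anything from Google’s documentation:

```python
def render_user_link(href, text, trusted=False):
    # Sketch of the guidance quoted above: nofollow links from untrusted
    # users, and drop the attribute once a contributor has consistently
    # made high-quality contributions. The `trusted` flag stands in for
    # whatever reputation check a real site would perform.
    rel = "" if trusted else ' rel="nofollow"'
    return '<a href="%s"%s>%s</a>' % (href, rel, text)

print(render_user_link("http://example.com/", "guest link"))         # nofollowed
print(render_user_link("http://example.com/", "member link", True))  # followed
```

In practice the trust check would be driven by the comment system’s user records rather than a bare boolean, but the rendering decision is exactly this simple.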

However I can’t bring myself to believe they would deny themselves a tool for identifying spam. As such, for the time being we are removing the links we fear might have contributed to our penalty and nofollow-ing most of the rest. Hopefully we can add them back later, but we’d rather get some traffic to an inferior product than none to a superior one.

Categories: Main blog narrative

2 responses so far

  • debra mastaler | Aug 30, 2008 at 12:22 am

    Try using this tool; it identifies bad neighborhoods!

  • Brad | Aug 30, 2008 at 8:04 am

    We did actually use that tool; however, its results are always equivocal. It was very useful, though, in getting an idea of what might qualify as a bad neighborhood, and it left me with the feeling that bad outbound links weren’t really the problem.

    Great advice, thanks!
