Posted by pedrochristopher on January 16, 2009 at 5:19am
One feature I would like to have (I think!) is for the URLs of any internal pages that are disallowed in robots.txt to be nofollowed.
I'm not sure what the best way would be to do this, but I would like to know!
Peter
Comments
l() function
If all links were consistently generated using l(), it may be possible to include a bunch of exceptions based on rules parsed from robots.txt (or RobotsTxt module's data). I can't think of a particularly efficient way to do that, though, and it would require hacking core.
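To illustrate the idea (in Python rather than Drupal's PHP, purely as a sketch): the core of what a hacked l() would need is to parse the Disallow rules from robots.txt and decide, per path, whether the link should get rel="nofollow". The rules and paths below are made-up examples.

```python
# Sketch: decide rel="nofollow" from robots.txt Disallow rules.
# The robots.txt content and paths here are illustrative, not from any real site.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /reply/
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def rel_attribute(path):
    """Return 'nofollow' for paths a robots.txt-obeying crawler may not fetch."""
    return "nofollow" if not parser.can_fetch("*", path) else ""

for path in ["/reply/123", "/node/42", "/admin/settings"]:
    print(path, rel_attribute(path) or "(follow)")
```

As the comment above notes, doing this check on every link render is the inefficient part; the rule set would need to be parsed once and cached.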
Todd Ross Nienkerk
Co-founder, Four Kitchens
first hack proposal
I'm thinking that, as a first hack at this, I'll just copy and modify the nofollowlist module -- maybe, for starters, nofollow any URL that matches [domain].*/reply/. Sound reasonable?
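That first hack amounts to a simple pattern test on each href. A minimal sketch (Python, with a placeholder pattern standing in for the [domain].*/reply/ idea -- the actual nofollowlist module works differently):

```python
import re

# Hypothetical first-hack check: nofollow any link whose URL contains /reply/.
# The pattern and example URLs are placeholders, not real site paths.
REPLY_PATTERN = re.compile(r"/reply/")

def needs_nofollow(href):
    """True if this href should be rendered with rel='nofollow'."""
    return bool(REPLY_PATTERN.search(href))

print(needs_nofollow("http://example.com/comment/reply/12"))
print(needs_nofollow("http://example.com/node/12"))
```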
why copy and hack?
Why copy and hack when you can patch and contribute?
--
Growing Venture Solutions | Drupal Dashboard | Learn more about Drupal - buy a Drupal Book
knaddison blog | Morris Animal Foundation