how to avoid links like ?page=10

eule's picture

I have a Panels page as my front page. When I look at my site in Google, it shows me URLs like ?page=10. I don't know what is wrong, but Google can access and crawl these URLs. That is duplicate content and very bad for my SEO. I don't think adding URLs like this to robots.txt is useful; Google will index them anyway.
Any help?


friendly/clean urls

josebrito's picture

You should configure friendly/clean URLs.
I suggest you use the Path and Pathauto modules, and set the site front page (under site configuration) to the panel's path.

i use clean urls...but

eule's picture

I use clean URLs, but that doesn't help with pagination: the known node/?page=1 URLs aren't fixed by the Path or Pathauto module.

The panel path is /home and is rewritten to / via the Global Redirect module.

How can I configure Panels to return a 404 to the bot on crawls like this? Is there any how-to for D7 around?

Greets from Germany
Prepaid Tarifvergleich


silkogelman's picture

Google's information about this

rel="canonical" for nodes can be accomplished with the Metatag module.
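If you would rather set the canonical in code than through a module's UI, here is a minimal Drupal 7 sketch (the module name `mymodule` is a placeholder, and this is an illustration, not a tested implementation):

```php
/**
 * Implements hook_preprocess_html().
 *
 * Emits <link rel="canonical"> pointing at the clean absolute URL of the
 * current page, without the ?page= query string. Sketch only; "mymodule"
 * stands for your own custom module's machine name.
 */
function mymodule_preprocess_html(&$variables) {
  drupal_add_html_head_link(array(
    'rel' => 'canonical',
    'href' => url(current_path(), array('absolute' => TRUE)),
  ));
}
```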

And if you go for the 'my-view?page=2' to 'my-view/2' option,
you may want to have a look at this module.
In that case you may want to consider adding rel="prev" / rel="next" tags too.
Code way (could not find an easy module integration for this at the moment):
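For the code way, a hedged Drupal 7 sketch of what that could look like in a theme's template.php (THEMENAME stands for your theme's machine name, and the check for whether a next page actually exists is left out):

```php
/**
 * Implements template_preprocess_html() for a theme (THEMENAME is a
 * placeholder). Adds rel="prev"/rel="next" links on paginated pages.
 * Sketch only: in real use, emit rel="next" only when a further page
 * really exists in your pager.
 */
function THEMENAME_preprocess_html(&$variables) {
  if (!isset($_GET['page'])) {
    return;
  }
  $page = (int) $_GET['page'];
  $path = current_path();
  if ($page > 0) {
    // Page 1's "prev" is the clean URL without any query string.
    $query = $page > 1 ? array('page' => $page - 1) : array();
    drupal_add_html_head_link(array(
      'rel' => 'prev',
      'href' => url($path, array('query' => $query, 'absolute' => TRUE)),
    ));
  }
  drupal_add_html_head_link(array(
    'rel' => 'next',
    'href' => url($path, array('query' => array('page' => $page + 1), 'absolute' => TRUE)),
  ));
}
```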

first thanks for your reply

eule's picture

First, thanks for your reply s1l.

  1. I use Metatag's rel="canonical", but I think Googlebot doesn't handle it very well. Looking at my log files, Googlebot keeps requesting those URLs and doesn't seem to read the canonical URL; if it did, it wouldn't come back to them, but it does.

  2. Good point, I will try it; I didn't know that module yet. I have tried a module before, but it doesn't work well with the URL rewrite.

  3. I tried it and put the code into the template.php of my Omega subtheme.
    This gives me this error:
    Fatal error: Cannot redeclare theme_pager_link() (previously declared in ./includes/ in ./sites/all/themes/spar/template.php on line 80
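The usual cause of that fatal error is that the function was copied into template.php under its original name. In a Drupal 7 theme, an override must be renamed with the theme's machine name ("spar" here); roughly:

```php
// sites/all/themes/spar/template.php
// Rename the copied function from theme_pager_link() to
// spar_pager_link(); Drupal's theme registry picks it up after a cache
// clear, and the original in includes/pager.inc is no longer redeclared.
function spar_pager_link($variables) {
  // Paste the body of theme_pager_link() from includes/pager.inc here
  // and make the URL changes; the pass-through below is only a
  // placeholder for this sketch.
  return theme_pager_link($variables);
}
```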



emreque's picture

I'd also suggest you edit robots.txt so that Google skips the pagination. Those are essentially duplicate links to the same node.


satter9's picture

By any chance do you have an example of the syntax to use in the robots.txt file to do this?


emreque's picture

This should do the job (Googlebot supports the * wildcard in robots.txt). It blocks every URL that carries a page query variable, e.g. /?page=10 or /my-view?page=2:

User-agent: *
Disallow: /*?page=
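As a sanity check on the wildcard semantics (in Google's robots.txt extension, a * in a Disallow pattern matches any run of characters, and the pattern is matched against the start of the URL path plus query), here is a small standalone PHP sketch of that matching logic (`robots_rule_matches` is a made-up helper for illustration, not a real API):

```php
<?php
// Google-style robots.txt rule matching, illustration only: a rule
// matches if the URL path (including the query string) starts with the
// pattern, with "*" matching any sequence of characters.
function robots_rule_matches($rule, $path) {
  $regex = '#^' . str_replace('\*', '.*', preg_quote($rule, '#')) . '#';
  return (bool) preg_match($regex, $path);
}

var_dump(robots_rule_matches('/*?page=', '/?page=2'));        // bool(true)
var_dump(robots_rule_matches('/*?page=', '/my-view?page=2')); // bool(true)
var_dump(robots_rule_matches('/*?page=', '/node/5'));         // bool(false)
```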