I run a number of sites through a Drupal multisite setup, using different subfolders, different subdomains, and different domain names off the same code base. In some cases I own the domain names, and in other cases I'm hosting a subdomain for a domain that my customer owns.
I'd like to have the sites crawled by Google, Yahoo, Bing, etc., but I'm not sure of the best way to set up XML Sitemaps and/or robots.txt files for the different properties. Since the sites share one code base, I think I can only have a single robots.txt file, and in that case it would have to reference all of the properties, per the robots.txt standard. Is that correct?
As to XML Sitemaps:
-- First, I believe I can have a separate XML Sitemap for each of the sites in my multisite implementation. Is that correct?
-- Where do XML Sitemaps get stored for multisite sites?
-- Is it possible or advisable to reference each of the unique XML Sitemaps in my Robots.txt file?
-- Should I do things differently for subfolder sites vs. subdomain sites vs. separate-domain sites?
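For context, here's roughly what I had in mind for the last two points: a single robots.txt at the shared document root whose Sitemap lines point at each property's own sitemap (the URLs below are made up for illustration):

```
# Hypothetical robots.txt at the shared document root
User-agent: *
Disallow: /admin/

# Sitemap entries take absolute URLs, so one file could
# reference sitemaps on several hosts/paths
Sitemap: https://example.com/sitemap.xml
Sitemap: https://sub.example.com/sitemap.xml
Sitemap: https://example.com/subfolder-site/sitemap.xml
```

Is that the right approach, or is it a problem that some of those sitemaps live on different hosts than the robots.txt that references them?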
Thanks in advance for any recommendations and help!