17 May 2010
15 Feb 2011
The SEO team upstairs is using an SEO tool to audit our site, and the software is flagging issues on pages that don't appear in the sitemap.xml file.
My guess is that there is a directory in Sitefinity that references these pages and isn't being disallowed in the robots.txt file. Is that the case, and if so, which directory is it?
If not, what else could be causing this?
Note: The robots.txt isn't disallowing anything at the moment; it only references the sitemap.xml.
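For reference, a robots.txt in the state described above would look roughly like this (the sitemap URL is a placeholder, and the Disallow line for Sitefinity's backend is an assumption for illustration — the actual backend path can vary by install and version):

```text
# Current state: no crawling restrictions, only a sitemap reference
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml

# If a backend/admin directory turns out to be the source of the
# extra pages, it could be excluded with something like:
# Disallow: /Sitefinity/
```

Note that robots.txt only controls crawling; pages outside the sitemap can still be discovered by the SEO tool through in-page links, which may be what's happening here.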