Question and Answer Item 10999
Questions & Answers
Q: The problem I am encountering is that I can't seem to get it to work. I have a fairly large site with about 20,000 pages. The original settings of depth limit 5 and link limit 10,000 returned a site map with only a fraction of the pages indexed. I then tried changing the limits, setting the depth to 10 and increasing the link limit in steps of 1,000 from 1,000 upwards, but I always got exactly the same result. When the limits were depth 10 and links 10,000, the script stopped working altogether and the output section stayed empty. Any suggestions on how to get it to work?
A: SiteMap XML is best suited to medium-sized sites of up to 5,000 pages. It sounds like when you use a limit of 10,000 the crawl takes too long and the server times out (which is not unexpected). For the missing pages, make sure they match the Include URLs pattern, do not match the Exclude URLs pattern, are on the same domain, and are valid href links in a crawled page that is reached within the depth set. If all of these conditions are met, the pages should be included. If not, your site may be too large for the generator to crawl within the server timeout setting. You should only need to set the depth to the number of levels of your site, i.e. if all pages have href links on your Start URL page then the depth should be set to 0; the deeper you go, the more time the crawl will take. Hope that helps.
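For illustration only, the inclusion rules described above can be sketched roughly as the following Python check. The names (should_index, include_pattern, exclude_pattern, start_domain, max_depth) are hypothetical and not part of the generator's actual code.

    # Minimal sketch of the inclusion rules, under assumed names.
    import re
    from urllib.parse import urlparse

    def should_index(url, depth, include_pattern, exclude_pattern,
                     start_domain, max_depth):
        """Return True if a crawled href would appear in the site map."""
        if depth > max_depth:                      # reached beyond the depth limit
            return False
        if urlparse(url).netloc != start_domain:   # must stay on the same domain
            return False
        if not re.search(include_pattern, url):    # must match the Include URLs pattern
            return False
        if re.search(exclude_pattern, url):        # must not match the Exclude URLs pattern
            return False
        return True

    # Example: a page two levels below the Start URL on the same domain
    print(should_index("https://www.example.com/products/item.html", 2,
                       r".*", r"/private/", "www.example.com", 5))   # True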