Items selected where Item is 2058100
Questions & Answers
Q: Thanks for your reply yesterday. I'm still having major problems with the generator. I did what you told me, and putting the "/" in to allow URLs works a treat if I set the limit to 100, but that is no good for me as I have far more URLs than that. As soon as I set the limit to anything above 100, it takes forever to get started, and then it seems to just go through the same URLs over and over again, followed by a "Warning: unresponsive script" message. I have tried clicking Continue, but the message reappears no matter how many times I click it. Do you know what the problem is?
A: First, make sure you are not crawling a link to a dynamic sitemap from anywhere, or make sure you exclude it from being crawled using the Exclude URLs setting. With a dynamic sitemap such as https://bizpep.com/sitemapxml/ you end up with a loop, i.e. it calls itself when run.

If this is not the issue, it may be that your site structure and size are not suitable for the generator. It basically crawls every link on every page to check for links, so if you have repetitive menu structures/links it will crawl the same pages each time it finds a link to them. It is not designed for large sites. However, one option for large sites is to break the sitemap up into sections (i.e. using different start URLs and settings) and submit multiple sitemaps to Google; see http://www.google.com/support/webmasters/bin/answer.py?answer=75712 for details. The suitability/ease of this will depend on your site structure.

When you are testing, use static mode to fine-tune your settings, then apply those settings to a dynamic sitemap if required. They run exactly the same, so a static sitemap will generate the same output as a dynamic sitemap with the same settings. Using static mode (and not having a dynamic sitemap) also ensures you do not have a loop.

FYI, when I run the generator against your site it returns: Completed in 35.75737 seconds. 3228 Links Extracted. 283 Unique Links Used. Depth 1. This is constrained by the Limit, which is 1000; it would need to be greater than 3228 to crawl further, in which case it times out on the server.
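The behaviour described above (every link on every page is extracted, repeated menu links are deduplicated, an excluded self-referencing sitemap URL prevents a loop, and a limit caps the work) can be sketched roughly as follows. This is a minimal illustration, not the generator's actual code; the `crawl` function, the in-memory page graph, and all URLs in it are hypothetical:

```python
from collections import deque

def crawl(start, links_on, limit, exclude=()):
    """Breadth-first crawl over a site's link graph.

    links_on: dict mapping a URL to the list of links found on that page
    limit:    maximum number of links to extract before stopping
    exclude:  URL prefixes that are never crawled (e.g. a dynamic
              sitemap that would call itself and loop forever)
    Returns (links_extracted, sorted list of unique URLs used).
    """
    queue = deque([start])
    seen = {start}              # dedupe: repetitive menus yield the same URLs
    extracted = 0
    while queue and extracted < limit:
        page = queue.popleft()
        for url in links_on.get(page, []):
            extracted += 1      # every link on every page is checked
            if url in seen or any(url.startswith(p) for p in exclude):
                continue        # already crawled, or explicitly excluded
            seen.add(url)
            queue.append(url)
    return extracted, sorted(seen)

# Hypothetical site: every page links back to the same menu pages,
# so far more links are extracted than unique URLs are used --
# the same shape as the 3228-extracted / 283-used result above.
site = {
    "/": ["/a", "/b", "/sitemap.php"],
    "/a": ["/", "/b"],
    "/b": ["/", "/a"],
    "/sitemap.php": ["/sitemap.php"],   # dynamic sitemap calls itself
}
count, unique = crawl("/", site, limit=100, exclude=("/sitemap.php",))
print(count, unique)   # 7 links extracted, 3 unique URLs used
```

Without the `exclude` entry, the self-referencing sitemap page would be visited, but the `seen` set still stops it from being queued twice; a real crawler that regenerates the dynamic sitemap on every request is where the endless loop comes from, which is why excluding it (or using static mode) is the safer fix.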