Items selected where the Question or Answer includes "site map" or "sitemap"
Questions & Answers
Q: I've been using SiteMap XML Software for a couple of years and have been extremely pleased with its results. But a couple of weeks ago, and every time I've attempted to build an XML site map since, I receive a prompt indicating I need to use a 5th Generation browser with JavaScript enabled. The fact is, I am using exactly that. I have even attempted to build maps with a different browser (e.g. IE 7.0.7), thinking perhaps I've got a glitch in my regular Firefox 3.68 browser. Is there some advice you can offer to help get my SiteMap XML Software working properly again? I really love the product.
A: It does sound as though JavaScript is not enabled or the JavaScript file included in the software package is missing. Try running the online version at https://bizpep.com/sitemapxml/. If this works fine, then you have an issue with the software on your computer and you can re-download and reinstall it. If the online version also indicates JavaScript is disabled, then since it is occurring in both browsers you may have some Windows security or antivirus software automatically disabling JavaScript.
Question and Answer Item 2058150
Q: I created a static site map and uploaded it to my server. It seems to be fine, except the following appears at the top of the site map: "This XML file does not appear to have any style information associated with it. The document tree is shown below." Shall I just live with that, or does Google require style information to be associated?
A: This is only a browser message; it simply indicates that the XML document is displayed without formatting. This is the way Google requires it.
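For reference, here is a minimal Python sketch of writing such a bare sitemap file. The URLs and output file name are placeholders rather than anything from the question above; note that no xml-stylesheet line is written, which is precisely why browsers show that message.

    # Minimal sketch: writing a bare XML sitemap of the kind Google expects.
    # The URLs and output file name are placeholders.
    from xml.sax.saxutils import escape

    urls = ["https://www.example.com/", "https://www.example.com/about.html"]

    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n  </url>" for u in urls
    )
    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

    # No <?xml-stylesheet ?> line is added: the file is meant for crawlers,
    # so the browser's "no style information" notice is expected.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(sitemap)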
Question and Answer Item 2058151
Q: The problem I am encountering is that I can't seem to get it to work. I have a fairly large site with about 20,000 pages. The original settings of depth limit 5 and link limit 10,000 returned a site map with only a fraction of the pages indexed. I then played around with changing the limits, setting the depth to 10 and increasing the link limit in steps of 1,000 from 1,000 upwards, but I always got exactly the same result. When the limits were depth 10 and links 10,000 the script stopped working altogether and the output section stayed empty. Any suggestions on how to get it to work?
A: SiteMap XML is best suited to medium-size sites of up to 5,000 pages. It sounds like when you use a limit of 10,000 the crawl takes too long and the server times out (which is not unexpected). For the missing pages, make sure they match the Include URLs pattern, do not match the Exclude URLs pattern, are on the same domain, and are valid href links in a crawled page which is accessed within the depth set. If all these conditions are met then the pages should be included. If not, then your site may be too large for the generator to crawl within the server timeout setting. You should only need to set the depth to the number of levels of your site, i.e. if all pages have href links on your Start URL page then the depth should be set to 0; the deeper you go, the more time it will take to crawl. Hope that helps.
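To make the depth and link limits concrete, here is a rough Python sketch of the kind of breadth-first href crawl described above. This is not the product's actual code; the Start URL, limits, and include/exclude patterns are illustrative assumptions.

    # Rough sketch of a depth- and link-limited href crawl.
    # Not SiteMap XML's actual code; the settings below are illustrative.
    import re
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    START_URL = "https://www.example.com/"  # placeholder Start URL
    DEPTH_LIMIT = 2   # 0 = crawl only the Start URL page; its links are still listed
    LINK_LIMIT = 5000
    INCLUDE = re.compile(r"^https://www\.example\.com/")  # Include URLs pattern
    EXCLUDE = re.compile(r"/cart|/admin")                 # Exclude URLs pattern

    class HrefParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.hrefs = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.hrefs.append(value)

    def crawl():
        seen, queue = {START_URL}, deque([(START_URL, 0)])
        while queue:
            url, depth = queue.popleft()
            if depth > DEPTH_LIMIT:
                continue  # page is listed but not crawled beyond the depth set
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except OSError:
                continue  # unreachable pages are simply skipped
            parser = HrefParser()
            parser.feed(html)
            for href in parser.hrefs:
                link = urljoin(url, href).split("#")[0]
                # Only same-domain links matching Include but not Exclude survive.
                if link in seen or not INCLUDE.match(link) or EXCLUDE.search(link):
                    continue
                if len(seen) >= LINK_LIMIT:
                    return sorted(seen)  # link limit reached; stop early
                seen.add(link)
                queue.append((link, depth + 1))
        return sorted(seen)

    if __name__ == "__main__":
        for page in crawl():
            print(page)

A crawl like this grows with both depth and page count, which is why a 20,000-page site can exceed a server timeout while a 5,000-page site does not.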
Question and Answer Item 10999
Q: I am evaluating your sitemap software. It does not seem to honor robots.txt. Is there a way we can configure it to honor this (or some variation)? We run an eCommerce website and do not want the software to walk the entire site. I also noticed during my tests that it does not accurately extract dynamic content, which is of course a bigger issue.
A: The SiteMap XML generator is not a robot crawler and does not use the robots.txt file. It crawls your site structure using the href links in your pages (starting at the Start URL). Links that are formatted as href links in the web page code will be extracted; this applies to dynamic links and static links alike. If an extracted link is not correct, it will be an issue with the href link structure. So for a page to be crawled it must be part of the site structure and have a link from one of the crawled pages, or be set as the Start URL. The URLs included in the site map are determined by the settings used: only URLs that match the Include URLs with pattern will be included. URLs can be excluded using the Exclude URLs with pattern. You can use this to exclude some links, but if a search engine can crawl them then they will most likely still be indexed even if they are not in the sitemap. Hope this helps.
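As a workaround for the robots.txt point, you can post-filter the generated URL list yourself before building the sitemap. Here is a hedged sketch using Python's standard urllib.robotparser; the robots.txt location and the sample URLs are assumptions.

    # Workaround sketch: the generator ignores robots.txt, but you can
    # post-filter its URL list with Python's standard robotparser.
    # The robots.txt location and sample URLs below are assumptions.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://www.example.com/robots.txt")
    rp.read()  # fetches and parses the live robots.txt

    crawled_urls = [
        "https://www.example.com/",
        "https://www.example.com/checkout",  # suppose this path is disallowed
    ]

    # Keep only URLs a generic crawler ("*") would be allowed to fetch.
    allowed = [u for u in crawled_urls if rp.can_fetch("*", u)]
    print(allowed)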
Question and Answer Item 10997
Q: I'm interested in your product. On this computer I can't take a look at a sample of your site map of our site as our customer would see it. Can you show me a link to a sitemap, on a live site, that is 2 or 3 pages deep, so I can see if it is a tiered output? My current site map is linear and doesn't show the tiers of the site. Also, can I program, with your software, automatic submission to Google, Yahoo, and Microsoft weekly? I only add a page once a week or so.
A: A range of example sitemaps is provided at https://bizpep.com/sitemapxml/ ; these will display in your browser. The sitemap generator does not provide tiered (indented) output; all pages are listed in the same format regardless of depth. Using a Dynamic sitemap, every time the sitemap is called the content is rebuilt to reflect the current site structure and pages. You do not program it to submit to an index (i.e. Google etc.); you submit once initially, and then the search engine will recheck it on a regular basis during its normal crawls. For more information on how search engines use site maps see the Google Webmaster Help at http://www.google.com/support/webmasters . It provides a lot of information regarding indexing and sitemaps.
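To illustrate what "dynamic" means here, below is a minimal Python sketch of a sitemap endpoint that rebuilds its XML on every request. The port and the list_current_pages() stand-in are illustrative assumptions, not the product's implementation.

    # Minimal sketch of a "dynamic" sitemap: the XML is rebuilt on every
    # request, so it always reflects the current pages. The port and the
    # list_current_pages() stand-in are illustrative assumptions.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def list_current_pages():
        # Stand-in for a real crawl or database query of the live site.
        return ["https://www.example.com/", "https://www.example.com/new-page"]

    def build_sitemap(urls):
        entries = "".join(f"<url><loc>{u}</loc></url>" for u in urls)
        return ('<?xml version="1.0" encoding="UTF-8"?>'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
                f"{entries}</urlset>")

    class SitemapHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/sitemap.xml":
                body = build_sitemap(list_current_pages()).encode("utf-8")
                self.send_response(200)
                self.send_header("Content-Type", "application/xml")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404)

    if __name__ == "__main__":
        HTTPServer(("", 8000), SitemapHandler).serve_forever()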
Question and Answer Item 10992