We want to implement a sitemap.xml feature in our CMS. There's some debate on our team about whether this feature will hurt performance, because every time a change is made to the content, a complete list of the site's links has to be generated and written into the sitemap.xml.

The idea is that every time a publicly viewable page is edited or added, it is immediately written into the sitemap.xml, keeping it current with the site.

When you respond, and if you have the time: do other CMS systems, open source or otherwise, have built-in sitemap generation?

Thanks,

Regenerating the sitemap each time you update your CMS will certainly create performance issues, because sitemaps are usually large and expensive to create (from a CPU and disk I/O perspective).

What I would do is:
1. plan out the structure of the site
2. decide which areas you need to link to in the sitemap
3. add the name of the sitemap index to your robots.txt file
4. write a script that reads from the database and generates static XML sitemap files (a sketch follows below)
5. set up a cron job that re-runs this script regularly
6. submit your sitemap link to the search engines
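As an illustration of steps 3 and 4, here is a minimal sketch of such a generator. It assumes a Python backend, a SQLite database with a `pages` table holding `slug`, `updated_at`, and `is_public` columns, and a site rooted at https://example.com; all of those names are hypothetical and would need to be adapted to your own schema.

```python
#!/usr/bin/env python3
# sitemap_build.py -- sketch of step 4: read public pages from the database
# and write a static sitemap.xml. The table/column names (pages, slug,
# updated_at, is_public), the base URL, and the output path are assumptions.
import sqlite3
from xml.sax.saxutils import escape

BASE_URL = "https://example.com"        # assumed site root
DB_PATH = "cms.db"                      # assumed CMS database file
OUTPUT = "/var/www/html/sitemap.xml"    # assumed location in the web root

def build_sitemap():
    # Pull every publicly viewable page and its last-modified timestamp.
    conn = sqlite3.connect(DB_PATH)
    rows = conn.execute(
        "SELECT slug, updated_at FROM pages WHERE is_public = 1"
    ).fetchall()
    conn.close()

    # Build one <url> entry per page.
    entries = []
    for slug, updated_at in rows:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(BASE_URL + '/' + slug)}</loc>\n"
            f"    <lastmod>{updated_at}</lastmod>\n"
            "  </url>"
        )

    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

    # Write the finished sitemap as a static file the web server can serve.
    with open(OUTPUT, "w", encoding="utf-8") as f:
        f.write(xml)

if __name__ == "__main__":
    build_sitemap()
```

The matching robots.txt entry from step 3 is just a `Sitemap:` line pointing at wherever the script writes its output, e.g. `Sitemap: https://example.com/sitemap.xml`.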

Considering Google does not read your sitemap that often, it's safe to regenerate it from a cron job once a day; if you schedule the rebuild each evening during the quiet hours, Google will pick the changes up the next time it polls.
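For step 5, the nightly rebuild can be a single crontab entry; the script path here is of course an assumption matching the sketch above.

```
# Rebuild the static sitemap at 02:30 every night, outside peak traffic
30 2 * * * /usr/bin/python3 /opt/scripts/sitemap_build.py
```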

For the CMS-powered sites I work on, with 70,000 to 350,000 pages/folders each, we typically regenerate the sitemap XML once every 24 hours. We've never had any issues with that. Unless your site is as popular as Stack Overflow - and Google knows it gets updated as often as SO does - it won't re-crawl your site frequently enough to warrant having a completely up-to-date sitemap file.