I have to create a dynamic sitemap for a huge classified-ads website, and I'm not going to do this job manually and generate day-to-day sitemaps for all those categories. So I thought of creating a parent index sitemap.php page (which outputs sitemap-index XML) that is split into links to other sitemap.php pages (each outputting sitemap XML), based on categories. Basically, whenever a category has more than 50,000 rows per sitemap, the script goes lower in the category tree, down to the subcategory level. This means I will end up with a large number of child sitemaps, some of them containing a single record.

From my own research, oodle.com uses this tactic and amazon.com does too, although not with dynamic .php files; theirs are .xml.

Is there a limit from Google or any other search engine on how many sitemap links you can submit inside an index sitemap file?

Example :


  <sub href="sitemap-1-auto.php"/>
  <sub href="sitemap-2-real-estate.php"/>
  <sub href="sitemap-3-jobs.php"/>
  ...
  <sub href="sitemap-112-software.php"/>


And a final question: how do I submit the index sitemap.php to all the major search engines?

I need your professional opinion on this.
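To make the question concrete, here is a rough sketch of the index-generation part of the scheme described above. The `buildSitemapIndex()` helper, the domain, and the category URLs are made up for illustration, not an existing API:

```php
<?php
// Sketch: build a sitemap index document from a list of child sitemap URLs.
// The URLs are illustrative placeholders.

function buildSitemapIndex(array $sitemapUrls): string
{
    $xml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $xml .= "<sitemapindex xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    foreach ($sitemapUrls as $url) {
        $xml .= "  <sitemap><loc>" . htmlspecialchars($url, ENT_XML1) . "</loc></sitemap>\n";
    }
    $xml .= "</sitemapindex>\n";
    return $xml;
}

// One child sitemap per top-level category; a category with more than
// 50,000 rows would instead contribute one entry per subcategory.
echo buildSitemapIndex([
    'https://example.com/sitemap-1-auto.php',
    'https://example.com/sitemap-2-real-estate.php',
]);
```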


Google has a maximum policy of 50,000 URLs per sitemap, and to ping the search engines you can use:

function ping($sitemap_url)
{
    // $sitemap_url should already be urlencode()d
    @file_get_contents("http://www.google.com/webmasters/sitemaps/ping?sitemap=" . $sitemap_url);
    @file_get_contents("http://search.yahooapis.com/SiteExplorerService/V1/updateNotification?appid=YahooDemo&url=" . $sitemap_url);
    @file_get_contents("http://submissions.ask.com/ping?sitemap=" . $sitemap_url);
    @file_get_contents("http://www.bing.com/webmaster/ping.aspx?siteMap=" . $sitemap_url);
}
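Besides pinging, the sitemaps protocol also lets you declare your index file in robots.txt, which every major engine reads. The URL below is a placeholder for your own index sitemap:

```
Sitemap: https://www.example.com/sitemap.php
```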

Well, as long as the MIME type is XML, you can most likely use a .php extension. You just need the "index" file to follow the sitemaps protocol for XML sitemap index files, and each of those individual XML sitemap files must also output content that fits the protocol.
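A minimal sketch of such a .php child sitemap, assuming the MIME type is set explicitly. `renderUrlset()` and the hard-coded URLs are illustrative; in practice the list would come from your database:

```php
<?php
// Sketch of a sitemap.php that serves protocol-valid XML with a .php extension.

function renderUrlset(array $urls): string
{
    $xml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $xml .= "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    foreach ($urls as $u) {
        $xml .= "  <url><loc>" . htmlspecialchars($u, ENT_XML1) . "</loc></url>\n";
    }
    $xml .= "</urlset>\n";
    return $xml;
}

// The MIME type is what matters to the crawler, not the file extension.
header('Content-Type: application/xml; charset=utf-8');
echo renderUrlset(['https://example.com/ad/1', 'https://example.com/ad/2']);
```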

You could also consider trying something like A1 Sitemap Generator, but if your website is a pure DB-driven site that's easy to code against from PHP, then rolling your own is obviously a good choice too.

There's no point in dynamic sitemaps. Search engines don't update their search index that frequently; there's simply not enough bandwidth. A search engine reading your sitemaps and adding the content to its index are two completely different things. You should create static XML files and update them monthly. Google is not going to add all of your URLs to its index in one day.

You can have up to 50,000 URLs or 10 MB in one sitemap file, and you can have a sitemap index file that links to up to 50,000 other sitemap files. I run a site with more than 7 million URLs in its sitemap files; this is how we do it and it works out well, apart from Google taking over a month to add everything to its search index.
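The static-file approach above can be sketched as a cron-run script that splits the URL list into chunks of at most 50,000. The URL data here is synthetic; a real run would stream rows from the database and write each chunk to disk (e.g. gzipped):

```php
<?php
// Sketch: split a large URL list into sitemap files of at most 50,000
// URLs each, as a monthly batch job. chunkUrls() is the essential part;
// the file writing is only hinted at in a comment.

function chunkUrls(array $urls, int $perFile = 50000): array
{
    return array_chunk($urls, $perFile);
}

$allUrls = array_map(fn ($n) => "https://example.com/ad/$n", range(1, 120000));
foreach (chunkUrls($allUrls) as $i => $chunk) {
    // e.g. file_put_contents("sitemap-" . ($i + 1) . ".xml.gz", gzencode(renderChunk($chunk)));
    printf("sitemap-%d.xml: %d URLs\n", $i + 1, count($chunk));
}
```

With 120,000 synthetic URLs this produces three files (50,000 + 50,000 + 20,000 URLs), which is the same shape a 7-million-URL site would have at 141 files.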