Hi Mudassar,
I see you have, great! It’s working as intended.
It does put extra strain on the server; that’s why it was always capped at 1,200: smaller hosting packages would struggle under a heavier load.
However, with the transient caching option (found under General -> Performance), this strain is negligible. Only the initial generation is demanding and might fail on a very constrained server; thereafter, it’s just a string fetch from the database.
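If you’re curious what that pattern looks like under the hood, here’s a rough, purely illustrative sketch using WordPress transients. The `msr_` names and the builder function are placeholders, not the plugin’s actual code:

```php
<?php
// Illustrative sketch only; not the plugin's actual implementation.
// Serve the sitemap from a transient when available; otherwise,
// build it once and cache the resulting XML string.
function msr_get_cached_sitemap() {
	$xml = get_transient( 'msr_sitemap_cache' ); // hypothetical cache key

	if ( false === $xml ) {
		// The expensive part: querying posts/terms and assembling the XML.
		$xml = msr_build_sitemap_xml(); // hypothetical builder function

		// Keep it for a week; until then, serving it is a single database read.
		set_transient( 'msr_sitemap_cache', $xml, WEEK_IN_SECONDS );
	}

	return $xml;
}
```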
The sitemap is updated whenever you update a page, a term, the permalink settings, the site settings, or the SEO settings, or after 7 days have passed. Search engines crawl the sitemap occasionally and will then process all the URLs in it.
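The flush side of that same illustrative sketch ties into those update events. The hook list here is abbreviated and the cache key is still a placeholder; the plugin’s real triggers are broader:

```php
<?php
// Illustrative sketch only; hook list abbreviated.
// Drop the cached sitemap whenever content changes, so the next
// request rebuilds it; the 7-day transient expiry covers the rest.
function msr_flush_sitemap_cache() {
	delete_transient( 'msr_sitemap_cache' ); // hypothetical cache key
}
add_action( 'save_post', 'msr_flush_sitemap_cache' );                   // a page/post was updated
add_action( 'edited_term', 'msr_flush_sitemap_cache' );                 // a term was updated
add_action( 'permalink_structure_changed', 'msr_flush_sitemap_cache' ); // permalink settings changed
```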
Note that a sitemap isn’t needed for discovery anymore: search engines now have intelligent spiders that crawl your pages and follow the links on them automatically.
Since you’re sharing the search engines’ services with billions of other websites, you’ll have to wait in their queue. When you’re next in line, they’ll crawl and process your sitemap, adding any newly found or updated URLs to their crawl queue. This process repeats indefinitely.
Note that sites have a “crawl budget”. This budget is determined by your website’s popularity and classification: small local-business sites tend to get crawled more slowly and less often than large news corporations’ sites.