You should tell Rank Math not to include them in the sitemap, to avoid wasting crawl budget and filling up Search Console with unnecessary errors.
Listing the pages in the sitemap doesn't mean they will be indexed; it just means Googlebot will try to visit and crawl them to decide whether they are worth indexing. But because these pages can only be accessed after logging in (which the bot obviously can't do), it will report them as crawl errors in Search Console.
It's never a good idea to feed Googlebot sitemap URLs it is blocked from crawling – this is sometimes called "dirt" in the sitemap, and if there's too much of it, Google may start to trust the sitemap less.
Rank Math by default offers all post types for sitemap inclusion (each one has an "Include in Sitemap" toggle under the plugin's Sitemap Settings) – it's up to the site owner/SEO to decide which ones actually should be included.
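As a complementary measure, you can also keep crawlers out of the login-gated section entirely via robots.txt. A minimal sketch, assuming the members-only content lives under a hypothetical /members/ path (substitute your site's actual URL structure):

```
# robots.txt – ask all crawlers not to crawl the login-gated area
# "/members/" is a placeholder path, not something Rank Math creates
User-agent: *
Disallow: /members/
```

Keep in mind that robots.txt only discourages crawling; it does not remove already-known URLs from the index, so keeping these pages out of the sitemap remains the primary fix.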