Google quietly introduced a sitemap for static pages on the Blogger platform, and I am sure many bloggers are not aware of it. Before now, you could only submit your blog's sitemap.xml file, which contains a list of all "published posts" on your blog, but now you can also submit your static-pages sitemap to search engines in order for those pages to be crawled and indexed.
It is also important to note that pages don't appear in the feeds generated at atom.xml or rss.xml, and before this recent update there was no way to publish a sitemap for them on Blogger. A sitemap is very important to a site, as it helps your SEO ranking and also provides a simplified rundown of the articles on your website.
On the Blogger platform, a sitemap for static pages is now auto-generated as a separate file, and it lists all the published static pages on a blog.
The sitemap for blogspot static pages is located at /sitemap-pages.xml.
How To Find Or Locate Your Site Static Pages Sitemap
Just go to your web browser address bar and type the following:
For Default Blogspot Domains: https://Your-Blog.blogspot.com/sitemap-pages.xml
For Custom Domains: https://Your-Domain.com/sitemap-pages.xml
Just replace Your-Blog or Your-Domain with your own blog or domain name.
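The address-bar step above can be sketched in a few lines of Python. This is just an illustrative helper (the function name and the example domain are my own, not anything Blogger provides): it builds the static-pages sitemap URL for a given domain, exactly as described above.

```python
def pages_sitemap_url(domain):
    # Blogger serves the static-pages sitemap at a fixed path,
    # /sitemap-pages.xml, on every blog domain.
    return "https://" + domain + "/sitemap-pages.xml"

# Works the same for a default Blogspot domain or a custom domain.
print(pages_sitemap_url("your-blog.blogspot.com"))
print(pages_sitemap_url("your-domain.com"))
```

Opening the printed URL in a browser should show an XML listing of your published static pages.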
How To Submit Static Pages Sitemap To Google Search Console (Webmaster Tool)
You can submit your static pages to Google Search Console to instruct search bots to crawl and index them in the SERPs. Use the steps below:
1. Log in to Google Search Console (also known as Google Webmaster Tools). Sign up if you haven't; otherwise, just log in and continue
2. Select your blog's name from the list provided
3. Select Crawl > Sitemaps from the left sidebar menu
4. Now click the "ADD/TEST SITEMAP" button towards the top-right side
Insert sitemap-pages.xml inside the text field.
5. Finally, just hit the "Submit" button and you are good to go
The same method applies for other search engines like Bing/Yahoo and Yandex.
Google will then visit your blog's static pages and index them, just as it does with the normal sitemap.xml file.
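What the crawler receives is a standard sitemap XML file. As a rough sketch of what happens after submission, the snippet below parses a sample of that format with Python's standard library and lists the page URLs in it. The sample XML and the `page_urls` function are illustrative assumptions, not output copied from a real blog; only the namespace is the standard one from the sitemap protocol.

```python
import xml.etree.ElementTree as ET

# Illustrative sample of what /sitemap-pages.xml looks like
# (the example.blogspot.com URLs are made up).
SAMPLE_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.blogspot.com/p/about.html</loc></url>
  <url><loc>https://example.blogspot.com/p/contact.html</loc></url>
</urlset>"""

# The sitemap protocol's XML namespace, needed for element lookups.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def page_urls(xml_text):
    # Return the <loc> value of every <url> entry in the sitemap.
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

print(page_urls(SAMPLE_SITEMAP))
```

Each URL listed this way is a static page that Google now knows about and can queue for crawling.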
Why Some Pages Are Not Indexed
At times, after submitting the sitemap, I noticed that some pages weren't indexed while others were. What might be the cause of this? Well, I understand that Google chooses to index only pages that are not tagged "noindex", whether through a meta tag or custom robots tags. Sometimes bloggers put themselves in search-engine trouble by playing with their site's robots.txt file. This file is very sensitive, as it can make or mar your blog's visibility in search engines, so I advise you to leave it at its default if you are not sure of what you are doing.
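To see why a careless robots.txt edit blocks indexing, you can replay its rules with Python's standard `urllib.robotparser`. The rules below are a made-up example of a mistake (disallowing the /p/ path, where Blogger serves static pages), not anything from a real blog:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Hypothetical robots.txt that accidentally blocks all static pages,
# since Blogger serves them under the /p/ path.
rp.parse([
    "User-agent: *",
    "Disallow: /p/",
])

# Static pages are blocked, so they appear in the sitemap
# but crawlers are told not to fetch them.
print(rp.can_fetch("Googlebot", "https://example.blogspot.com/p/about.html"))
# Ordinary posts are still crawlable.
print(rp.can_fetch("Googlebot", "https://example.blogspot.com/2024/01/post.html"))
```

If a page from your sitemap comes back blocked in a check like this, the robots.txt file is the reason it stays out of the index.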
So, if you have chosen the "noindex" option from the page settings while publishing a page in the Blogger editor, that page will still be submitted to Google by the sitemap, but it will not be indexed until you uncheck the "noindex" box. Once the pages sitemap has been successfully added, it will be included in the list of your blog's sitemaps.
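The "noindex" option above ends up as a robots meta tag in the page's HTML, which is what crawlers actually read. As a sketch of how you might spot it yourself, the snippet below scans a sample page with Python's standard `html.parser`; the `NoindexDetector` class and the sample HTML are my own illustration, not Blogger code.

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    # Flags True when a <meta name="robots" content="...noindex...">
    # tag is found anywhere in the document.
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

# Made-up page published with the "noindex" box checked.
SAMPLE_PAGE = (
    '<html><head><meta name="robots" content="noindex">'
    '<title>About</title></head><body>...</body></html>'
)

detector = NoindexDetector()
detector.feed(SAMPLE_PAGE)
print(detector.noindex)
```

If this check comes back True for one of your pages, that is why it sits in the sitemap unindexed: uncheck the "noindex" box and republish the page.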