Alan Bleiweiss shared a screenshot showing how the Sitemaps report in Google Search Console flags over 11,000 URLs blocked by robots.txt as an issue and warning. Alan asked why the new Index coverage report in the new Google Search Console is not reporting these errors.
John Mueller said on Twitter that the new report won't show errors for sample URLs tested at the sitemap submission level:
Those are sample URLs tested before being submitted to indexing – this is done at the sitemap submission, so it wouldn’t be in the indexing report in the new SC.
— John ☆.o(≧▽≦)o.☆ (@JohnMu) July 3, 2018
Here is Alan's screenshot:
I should note that on this site I block only one URL via the robots.txt file, and it shows as an error both in my sitemap submission and in the Index coverage report.
Sitemap report:
New Index coverage report:
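As an aside, you can check locally whether a given URL is blocked by a robots.txt rule using Python's standard-library parser. This is a minimal sketch with a hypothetical rule set and example.com URLs, not the actual configuration of any site mentioned above:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a single path, similar to the
# one-URL block described above.
rules = """
User-agent: *
Disallow: /private-page/
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot falls under the wildcard "*" group here.
print(parser.can_fetch("Googlebot", "https://example.com/private-page/"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public-page/"))   # True
```

Google's own testing works against the live robots.txt fetched from the site root; this sketch only mirrors that logic against rules supplied as text.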
Forum discussion at Twitter.
This marketing news is not the copyright of Scott.Services – see the original source of this article at the link below. Author: barry@rustybrick.com (Barry Schwartz)
source https://news.scott.services/new-google-index-report-doesnt-show-blocked-urls-from-sitemaps/