I’m only familiar with the fundamentals of optimizing a site to rank well in Bing, but it may be useful to know that Duane Forrester was interviewed back in 2011 (a very long time ago now) about best-practice optimization techniques for appearing in the Bing search results. When asked about sitemaps, he said:
“Your Sitemaps need to be clean. We have a 1% allowance for dirt in a Sitemap. Examples of dirt are if we click on a URL and we see a redirect, a 404 or a 500 code. If we see more than a 1% level of dirt, we begin losing trust in the Sitemap.”
If the same rules apply in 2017, Bing has a 1% tolerance for dirty URLs in a sitemap file. That includes broken links (404), redirects (301 or 302) and even server errors (500). So it’s important to update your sitemap files frequently and keep their links healthy.
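To put that 1% figure into practice, here is a minimal sketch (using only the Python standard library) of a sitemap audit that counts redirects and error codes as “dirt” and reports the dirt ratio. The function names and the injectable `fetch` parameter are my own illustrative choices, not anything Bing or Google publishes:

```python
# Sketch: audit a sitemap for "dirt" (redirects, 404s, 5xx responses),
# per Duane Forrester's stated 1% tolerance. Names here are illustrative.
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET


def status_of(url):
    """Return the HTTP status of a URL WITHOUT following redirects,
    so a 301/302 is reported as dirt rather than silently resolved."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # refuse to follow; the 3xx surfaces as HTTPError

    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code  # 3xx, 4xx and 5xx all land here


def dirt_ratio(statuses):
    """Fraction of URLs whose status is a redirect (3xx) or error (4xx/5xx)."""
    if not statuses:
        return 0.0
    dirty = sum(1 for s in statuses if s >= 300)
    return dirty / len(statuses)


def audit_sitemap(xml_text, fetch=status_of):
    """Parse a standard sitemap and return its dirt ratio.
    `fetch` is injectable so the audit can be tested without the network."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    urls = [loc.text for loc in root.findall(".//sm:loc", ns)]
    return dirt_ratio([fetch(u) for u in urls])
```

If `audit_sitemap` came back above 0.01 for your file, you would (by Forrester’s 2011 rule) be over Bing’s tolerance.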
Does the same apply over at Google? That’s the question, and if it does, a ton of webmasters would be in trouble, because I’m always seeing broken links in sitemaps. Luckily for us, John Mueller has helped us webmasters out by answering a similar question on Twitter:
@agking No. We evaluate on a per-URL basis — if we can use a URL from a sitemap file, we’ll do that. If not, we’ll move on to the next URL.
— John ☆.o(≧▽≦)o.☆ (@JohnMu) January 5, 2017
Interesting stuff: according to John, Google evaluates each and every URL individually. If one URL doesn’t work, Google simply skips that individual link rather than distrusting the whole sitemap. By contrast, if 5 of the links in a 100-link sitemap file didn’t work, Bing (at 5% dirt) would lose trust in all 100 links. Thanks, John.