Google Sitemap and Robots.txt Issue
We have a sitemap on our site at http://www.gamezebo.com/sitemap.xml.
Webmaster Central reports some of the URLs in that sitemap as being blocked by our robots.txt (see gamezebo.com/robots.txt), even though those URLs are not disallowed there. There are other such URLs as well; for example, gamezebo.com/gamelinks is present in our sitemap but is reported as "URL restricted by robots.txt".
I also have a parse result in Webmaster Central that says, "Line 21: Crawl-delay: 10 Rule ignored by Googlebot". What does that mean?
I appreciate your help,
Thanks.
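One way to double-check which paths a robots.txt actually blocks is to run it through Python's standard-library parser. This is a minimal sketch: the rules below are hypothetical stand-ins, not the real contents of gamezebo.com/robots.txt, so substitute the live file to reproduce the diagnosis.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration; replace with the
# actual contents of gamezebo.com/robots.txt.
rules = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Ask whether a given user-agent may fetch specific URLs.
allowed = rp.can_fetch("Googlebot", "http://www.gamezebo.com/gamelinks")
blocked = rp.can_fetch("Googlebot", "http://www.gamezebo.com/private/page")
delay = rp.crawl_delay("Googlebot")

print(allowed, blocked, delay)
```

If `can_fetch` returns True for a URL that Webmaster Central still reports as restricted, the mismatch usually points to a stale cached copy of robots.txt on Google's side or a different file being served to Googlebot than to browsers.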
Comments (1)
Crawl-delay is not part of the actual robots.txt specification, so Googlebot ignores that line. If you want to control how fast Google crawls your site, you can set a custom crawl rate in Google Webmaster Tools under Settings > Crawl rate.
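For illustration, this is roughly what the flagged line looks like in context (a sketch, not the site's actual file; the line number in the report refers to the live robots.txt):

```
User-agent: *
Disallow: /private/
Crawl-delay: 10   # ignored by Googlebot; set Google's crawl rate in Webmaster Tools instead
```

The directive is harmless to leave in place: crawlers that don't recognize it simply skip the line, while some other crawlers honor it as a minimum delay in seconds between requests.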