Prevent direct access to robots.txt via .htaccess
I want to prevent users from accessing my robots.txt file, but I still want search engines to be able to read it. Is it possible? If yes, how do I do it? I believe that if I write the following in .htaccess it will work, but I am afraid it will also block search engines from accessing it.

Order Deny,Allow
Deny from all

Thanks
Comments (2)
Since the standard robots.txt is served from the root of your domain, unless you can somehow reliably distinguish search engines from users, I don't think what you are asking is possible. You could try filtering by user agent or possibly by IP range.

Is there a reason why you don't want your users to see what is in your robots.txt file? After all, everything in that file is public.
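To illustrate the user-agent filtering mentioned above, here is a minimal .htaccess sketch using Apache 2.2-style access control (mod_setenvif plus Order/Deny/Allow). The crawler names listed are just examples; note that user-agent strings are trivially spoofed, so this does not reliably keep humans out:

```apache
# Sketch only: allow robots.txt for requests whose User-Agent
# claims to be a known crawler, deny everyone else.
# Requires mod_setenvif; uses Apache 2.2 access-control directives.
<Files "robots.txt">
  SetEnvIfNoCase User-Agent "Googlebot|Bingbot" allowed_bot
  Order Deny,Allow
  Deny from all
  Allow from env=allowed_bot
</Files>
```

On Apache 2.4 the equivalent would use `Require env allowed_bot` inside a `<RequireAny>` block instead of Order/Deny/Allow.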
You can use the X-Robots-Tag response header or robots meta tags instead of robots.txt to reduce your reliance on that file. For example, add the following directive to your .htaccess file.
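A minimal sketch of the X-Robots-Tag approach in .htaccess, assuming mod_headers is enabled; the file pattern is just an example of content you might want crawlers to skip:

```apache
# Sketch only: tell crawlers not to index or archive matching files.
# Requires mod_headers.
<FilesMatch "\.(pdf|doc|docx)$">
  Header set X-Robots-Tag "noindex, noarchive"
</FilesMatch>
```

Unlike robots.txt, this travels with each response, so there is no single public file listing what you want excluded; the trade-off is that the crawler must still fetch the resource to see the header.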