I don’t know whether crawlers actually request robots.txt separately per protocol, but you could deliver a different robots.txt for requests over HTTPS.
So when http://example.com/robots.txt is requested, you deliver the normal robots.txt. And when https://example.com/robots.txt is requested, you deliver the robots.txt that disallows everything.
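If you’re on Apache with mod_rewrite enabled, one way to sketch this is a rewrite rule that serves an alternative file for HTTPS requests. The file name robots-https.txt is just an illustration; any name works as long as the rule points at it:

```apache
# Illustrative sketch, assuming Apache with mod_rewrite enabled.
RewriteEngine On
# Only rewrite when the request came in over HTTPS
RewriteCond %{HTTPS} on
# Serve robots-https.txt (a file you create) instead of robots.txt
RewriteRule ^robots\.txt$ /robots-https.txt [L]
```

The robots-https.txt file would then contain the blanket disallow:

```
User-agent: *
Disallow: /
```

Plain HTTP requests fall through to the normal robots.txt, since the RewriteCond only matches when the connection is encrypted.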