Setting up a website on a server — how do I keep search engines from finding it?
I am building a website and have heard that if it gets listed on search engines before it's finished, it might get a bad reputation, and it's very difficult to lift its ranking on those search engines afterwards.
I would like to know how I can keep my website from being listed on search engines until I finish it.
Any help would be appreciated.
Thanks.
3 Answers
Most obvious: don't link it anywhere.
Second most obvious: put in an authentication requirement (basic, digest).
Also: http://www.robotstxt.org/, specifically http://www.robotstxt.org/faq/prevent.html
As they say, note that robots.txt is only a suggestion. Google will heed it, but spambots will not.
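For the authentication route, one common setup (a sketch only, assuming an Apache server with `mod_auth_basic` enabled; the realm name and file paths here are placeholders) is an `.htaccess` file at the site root:

```
# .htaccess — require a login for the whole site while it is under development
AuthType Basic
AuthName "Site under construction"
# Password file created beforehand with: htpasswd -c /path/to/.htpasswd someuser
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Crawlers cannot get past the login prompt, so nothing behind it gets indexed; this is the only approach of the three that actually blocks access rather than politely asking.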
Google for robots.txt. It's a standard supported by all major search engines that allows you to kindly request a web crawler to ignore parts or all of your website.
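A minimal robots.txt that asks every crawler to skip the entire site looks like this (the filename and its location at the web root, e.g. `http://example.com/robots.txt`, are fixed by the convention):

```
User-agent: *
Disallow: /
```

Compliant crawlers that fetch this file will not index any page on the site; remember to remove or relax it once the site goes live, or it will stay out of search results.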
Do you need to make it available on the internet before it's done? Why not keep the development local?