Remove an entire website from Google's index

Posted on 2024-11-07 09:57:02


My URL is: http://LawMirror.com - (Online Legal Resources)

I want to remove my site's content from the Google index. Google had indexed a huge number of pages from my website, about 5,000,000 earlier; 3,025,000 pages remain now.

I have done the following, but pages are being removed at a very slow pace.

Robots.txt

User-agent: *
Disallow: /
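As an aside, the effect of this blanket Disallow can be checked with Python's standard robotparser; the URL path below is purely illustrative:

```python
from urllib import robotparser

# Feed the question's robots.txt rules directly into the stdlib parser.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Every path is disallowed for every crawler, Googlebot included.
print(rp.can_fetch("Googlebot", "http://LawMirror.com/any-page"))
```

With every path disallowed, crawlers are told not to request any URL at all.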

.htaccess

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^.*Googlebot/2\.1.*$
RewriteRule .* - [G]
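The rewrite rule's behavior can be sketched in Python for clarity; the regex mirrors the RewriteCond above, and status_for is an illustrative helper, not anything Apache provides:

```python
import re

# Same pattern as the RewriteCond: any user agent containing "Googlebot/2.1".
GOOGLEBOT = re.compile(r"^.*Googlebot/2\.1.*$")

def status_for(user_agent: str) -> int:
    # The [G] flag makes Apache answer 410 Gone; other clients are served normally.
    return 410 if GOOGLEBOT.match(user_agent) else 200
```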

This is the response returned to Googlebot when it tries to crawl the content:

HTTP/1.1 410 Gone
Date: Sat, 05 Jan 2013 12:39:23 GMT
Server: Apache/2.2.23 (Unix) mod_ssl/2.2.23 OpenSSL/0.9.8e-fips-rhel5 mod_fastcgi/2.4.6 mod_jk/1.2.37 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 PHP/5.3.19
Content-Length: 661
Connection: close
Content-Type: text/html; charset=iso-8859-1

I had also used the HTML meta tag noindex, nofollow, with no effect:




<meta name="googlebot" content="noindex,nofollow">

I had also submitted a site removal request, but content is being removed very slowly: in the last 35 days only a few pages were removed. My website no longer appears in Google search results, but Google Webmaster Tools - Health -> Index Status still shows 3,025,000 pages, and if I re-submit the site it shows the already-indexed pages. How can I speed up the removal of pages?


1 Answer

假面具 2024-11-14 09:57:02


One solution is to return a 410 each time a page is requested; it will accelerate deindexing. Make sure no page is blocked in your robots.txt: with Disallow: / in place, Googlebot is not allowed to recrawl your URLs, so it never actually sees the 410 responses or the noindex meta tag.
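A minimal sketch of such a blanket 410 responder, using Python's standard http.server (illustrative only; on a real site you would do this in the web server itself, as the .htaccess rule above does):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class GoneHandler(BaseHTTPRequestHandler):
    """Answer every request with 410 Gone so crawlers drop the URL for good."""

    def do_GET(self):
        body = b"<html><body><h1>410 Gone</h1></body></html>"
        self.send_response(410)
        self.send_header("Content-Type", "text/html; charset=iso-8859-1")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the sketch quiet

# To serve: HTTPServer(("", 8000), GoneHandler).serve_forever()
```

Unlike a 404, a 410 tells Google the removal is deliberate and permanent, which is why it tends to drop such URLs faster.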
