Use of Sitemaps

Published 2024-08-17 02:27:01

I've recently been involved in the redevelopment of a website (a search engine for health professionals: http://www.tripdatabase.com), and one of the goals was to make it more search engine "friendly", not through any black magic, but through better xhtml compliance, more keyword-rich urls, and a comprehensive sitemap (>500k documents).

Unfortunately, shortly after launching the new version of the site in October 2009, we saw site visits (primarily via organic searches from Google) drop substantially to 30% of their former glory, which wasn't the intention :)

We've brought in a number of SEO experts to help, but none have been able to satisfactorily explain the immediate drop in traffic, and we've heard conflicting advice on various aspects, which I'm hoping someone can help us with.

My questions are thus:

  1. do pages present in sitemaps also need to be spiderable from other pages? We had thought the point of a sitemap was specifically to help spiders get to content not already "visible". But now we're getting the advice to make sure every page is also linked to from another page. Which prompts the question... why bother with sitemaps? (A rough way to check this first point is sketched just after this list.)
  2. some months on, and only 1% of the sitemap (well-formatted, according to webmaster tools) seems to have been spidered - is this usual?
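
For reference, a rough, stdlib-only Python sketch of one way to check the first point, i.e. whether every URL listed in the sitemap is also reachable by following internal links from the homepage. The www.example.com addresses and the sitemap path are placeholders, not our real URLs, and a real check would need to crawl far more than a few hundred pages:

```python
# Minimal sketch: compare sitemap <loc> entries with internally linked URLs.
# SITE_ROOT and SITEMAP_URL are placeholders, not real addresses.
import urllib.request
import xml.etree.ElementTree as ET
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

SITE_ROOT = "https://www.example.com/"               # placeholder homepage
SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder sitemap (plain urlset)
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def fetch(url):
    """Download a URL and return the raw response body."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read()

def sitemap_urls(xml_bytes):
    """Return the set of <loc> values from a <urlset> document."""
    root = ET.fromstring(xml_bytes)
    return {loc.text.strip() for loc in root.iter(NS + "loc") if loc.text}

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags on one page."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def crawl(start, limit=500):
    """Breadth-first crawl of same-host pages, capped at `limit` pages."""
    host = urlparse(start).netloc
    seen, queue = set(), [start]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            collector = LinkCollector()
            collector.feed(fetch(url).decode("utf-8", errors="replace"))
        except Exception:
            continue  # skip pages that fail to download or parse
        for href in collector.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == host and absolute not in seen:
                queue.append(absolute)
    return seen

if __name__ == "__main__":
    listed = sitemap_urls(fetch(SITEMAP_URL))
    reachable = crawl(SITE_ROOT)
    orphans = listed - reachable
    print(f"{len(orphans)} of {len(listed)} sitemap URLs not reached by the sample crawl")
```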

Thanks in advance,

Phil Murphy

Comments (3)

荒人说梦 2024-08-24 02:27:01

The XML sitemap helps search engine spiders index all of the pages on your site.
A sitemap is very useful if you publish many pages frequently, but it does not replace a proper internal linking structure: every document should also be linked from another related page.

Your site is very large, so you must pay attention to the number of URLs published in the sitemap, because there is a limit of 50,000 URLs per XML file.

The full documentation is available at Sitemaps.org
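
To make that 50,000-URL limit concrete, here is a rough Python sketch of how a set of more than 50,000 URLs can be split into several sitemap files referenced from a single sitemap index. The file names and the example.com base URL are illustrative placeholders, not anything from the site in question:

```python
# Minimal sketch: split a large URL list into <urlset> files of at most
# 50,000 URLs each, and reference them from a single <sitemapindex>.
# File names and the example.com base URL are placeholders.
from xml.sax.saxutils import escape

MAX_URLS_PER_FILE = 50_000  # sitemaps.org limit per sitemap file

def write_sitemaps(all_urls, base="https://www.example.com"):
    chunks = [all_urls[i:i + MAX_URLS_PER_FILE]
              for i in range(0, len(all_urls), MAX_URLS_PER_FILE)]

    # One <urlset> file per chunk of up to 50,000 URLs.
    for n, chunk in enumerate(chunks, start=1):
        with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write('</urlset>\n')

    # A single <sitemapindex> pointing at every sitemap file.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>\n")
        f.write('</sitemapindex>\n')

if __name__ == "__main__":
    # 120,000 placeholder URLs -> sitemap-1.xml .. sitemap-3.xml plus the index.
    demo = [f"https://www.example.com/doc/{i}" for i in range(120_000)]
    write_sitemaps(demo)
```

Typically it is then the index file, rather than each individual sitemap file, that gets submitted in Webmaster Tools.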

浅笑轻吟梦一曲 2024-08-24 02:27:01

re: do pages present in sitemaps also need to be spiderable from other pages?
Yes, and in fact this should be one of the first things you do. Make your website more usable for users before the search engines, and the search engines will love you for it. Heavy internal linking between pages is a must-do first step. Most of the time you can do this with internal sitemap pages, category pages, etc.

re: why bother with sitemaps?
Yes! A sitemap helps you set priorities for certain content on your site (like the homepage) and tells the search engines what to look at more often. NOTE: do not set all your pages to the highest priority; it confuses Google and doesn't help you.

re: some months on, and only 1% of the sitemap seems to have been spidered - is this usual?
Yes! I have a site with 100k+ pages. Google has never indexed them all in a single month; it takes small chunks of about 20k at a time each month. If you use the priority attribute, you can tell the spider which pages it should re-index on each visit.
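
To illustrate the priority point, here is a rough Python sketch of a small urlset that ranks the homepage above ordinary document pages. The URLs, priority values, and change frequencies are made up for the example, not recommendations from this answer:

```python
# Minimal sketch: a <urlset> where the homepage gets the highest priority
# and ordinary document pages get a lower one. All values are illustrative.
from xml.sax.saxutils import escape

PAGES = [
    ("https://www.example.com/",          "1.0", "daily"),    # homepage
    ("https://www.example.com/search",    "0.8", "daily"),    # key landing page
    ("https://www.example.com/doc/12345", "0.5", "monthly"),  # ordinary document
]

def build_urlset(pages):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, priority, changefreq in pages:
        lines += ["  <url>",
                  f"    <loc>{escape(loc)}</loc>",
                  f"    <priority>{priority}</priority>",
                  f"    <changefreq>{changefreq}</changefreq>",
                  "  </url>"]
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    print(build_urlset(PAGES))
```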

As Rinzi mentioned, more documentation is available at Sitemaps.org

初见你 2024-08-24 02:27:01

Try building more backlinks and "trust" (links from quality sources).

That may help speed up indexing further :)
