What SEO practices could cause these questions to appear so quickly in Google search?

Published 2024-09-02 20:12:04


Does anyone have any idea why questions posted here on SO show up so quickly on Google?

Sometimes a submitted question appears among the first 10 entries or so - on the first page - within 30 minutes of being posted. Pray tell, what sort of magic is being wielded here?

Anybody have some ideas or suggestions? My first thought is that they have info in their sitemap that tells Google's robots to trawl every N minutes or so - is that what's going on?

BTW, I am aware that simply instructing Googlebot to scan your site every N minutes will not work if you don't have quality information that is constantly being updated on your site.

I'd just like to know if there is something else that SO may be doing right (apart from the marvelous content, of course).
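The sitemap hypothesis raised above can be made concrete. A minimal sketch, assuming a hypothetical question site at `example.com`: a sitemap's `<lastmod>` stamps tell crawlers which URLs recently changed, which is the closest a site owner can get to "please recrawl this soon". Note that `<changefreq>` is only a hint; Google largely ignores it in favor of observed change rates.

```python
from datetime import datetime, timezone

def build_sitemap(urls):
    """Build a minimal sitemaps.org-style sitemap.

    urls: iterable of (location, last_modified_datetime) pairs.
    The <lastmod> stamp is the strongest freshness signal a sitemap
    can carry; <changefreq> is advisory only.
    """
    entries = []
    for loc, last_modified in urls:
        entries.append(
            "  <url>\n"
            f"    <loc>{loc}</loc>\n"
            f"    <lastmod>{last_modified.strftime('%Y-%m-%dT%H:%M:%S+00:00')}</lastmod>\n"
            "    <changefreq>hourly</changefreq>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

# Hypothetical URL for illustration; not an actual SO sitemap entry.
now = datetime.now(timezone.utc)
print(build_sitemap([("https://example.com/questions/123", now)]))
```

Even with such a sitemap, the answers below suggest crawl frequency is earned by content and traffic rather than requested.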


Comments (4)

摘星┃星的人 2024-09-09 20:12:04


To put it simply, more popular websites with more quality content and more frequent changes are ranked higher with Google's algorithm, and are indexed and cached more frequently than sites that are less popular or change less frequently.

谈情不如逗狗 2024-09-09 20:12:04


Broadly speaking, it's only content that does it. The size and quality of SO's content has reached Google's threshold for "spider as fast as the site will permit". SO actually has to throttle the Googlebot; Jeff has said on Coding Horror that they were getting more than 50,000 requests per day from Google, and that was over a year ago.

If you scan through the non-news sites in the Alexa top 500, you will find that virtually all of them have results in Google that are just minutes old. (e.g. type site:archive.org into Google and choose "Latest" in the menu on the left.)

So there's nothing practical you can do to your own site to speed up spidering, except to increase the amount of traffic to your site...
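The throttling mentioned above usually goes the other way for smaller sites, but for completeness, here is a hypothetical robots.txt sketch of how crawl-rate limiting is expressed. Note that Googlebot ignores the `Crawl-delay` directive (Google's crawl rate was historically adjusted through Search Console instead); the directive is honored by some other crawlers such as Bing's.

```
# Hypothetical robots.txt illustrating crawl throttling.
# Googlebot does NOT honor Crawl-delay; this applies to bots that do.
User-agent: bingbot
Crawl-delay: 10

User-agent: *
Disallow: /private/
```

In practice, a popular site like SO is trying to slow crawlers down, not speed them up - which underlines the answer's point that crawl frequency follows content and traffic.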

拥抱没勇气 2024-09-09 20:12:04


It is really simple.

SO is a PageRank 6 site that gives the world new information.

Google has a strong bias toward new information. It will crawl the site many times a day and immediately add the pages to its index. It will favor a page (top-10 placement) for a specific query for a short period of time (a few days), and then it will stop favoring that page and rank it normally.

This is standard G procedure and it happens with many many sites.

As you might guess, gray-hat/black-hat SEO exploits this fact in many ways.

赠佳期 2024-09-09 20:12:04


It also helps that SO provides an RSS feed; I think Google favors feeds from reliable sources.
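For reference, a feed of this kind is just an RSS 2.0 document listing the newest items with publication timestamps, which gives crawlers a cheap way to discover fresh URLs. A minimal sketch, using hypothetical `example.com` URLs rather than SO's actual feed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Newest Questions</title>
    <link>https://example.com/questions</link>
    <description>Hypothetical feed of recently posted questions</description>
    <item>
      <title>Why does Google index this site so fast?</title>
      <link>https://example.com/questions/123</link>
      <pubDate>Mon, 02 Sep 2024 20:12:04 GMT</pubDate>
    </item>
  </channel>
</rss>
```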
