What is the most performance-efficient way to create a sitemap.xml for any CMS system?

Posted on 2024-08-03 23:50:59


We want to implement a sitemap.xml feature in our CMS. Some of our developers argue that this feature will hurt performance, because every time content changes, a full list of the site's links has to be generated and written to sitemap.xml.

The idea is that each time a publicly viewable page is edited or added, it is immediately added to sitemap.xml, keeping the sitemap in sync with the site.

While you are answering, and if you have time: which other CMS systems, open source or otherwise, have built-in sitemap generation?

Thanks,


Comments (4)

后eg是否自 2024-08-10 23:50:59


Updating the sitemap every time you update your CMS will definitely create performance issues, because sitemaps tend to be large and costly to generate (from a CPU and disk I/O perspective).

What I would do is:
1. Map out the structure of your site.
2. Determine which areas you need to link to in the sitemap.
3. Add the name of the sitemap index to your robots.txt file.
4. Write a script that reads from the database and generates static XML sitemap files (see the sketch after this list).
5. Create a cron job that re-runs this script on a regular basis.
6. Submit your sitemap URL to the search engines.
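
A minimal sketch of steps 4 and 5 in TypeScript (Node.js) follows. The `fetchPages()` helper, the example.com URLs, and the `public` output directory are placeholders standing in for your real database query and static file layout, not anything from this thread:

```typescript
// Generate static sitemap files plus a sitemap index (steps 4 and 5).
import { writeFileSync, mkdirSync } from "node:fs";
import { join } from "node:path";

interface Page {
  url: string;      // absolute URL of a public page
  modified: string; // last-modified date, YYYY-MM-DD
}

// Placeholder for the real database read; replace with your own query.
function fetchPages(): Page[] {
  return [
    { url: "https://www.example.com/", modified: "2024-08-01" },
    { url: "https://www.example.com/about", modified: "2024-07-15" },
  ];
}

const MAX_URLS_PER_FILE = 50_000; // sitemap protocol limit per file
const SITE = "https://www.example.com";
const OUT_DIR = "public"; // assumed to be served as static files

const escapeXml = (s: string) =>
  s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");

function writeSitemaps(): void {
  mkdirSync(OUT_DIR, { recursive: true });
  const pages = fetchPages();
  const files: string[] = [];

  // Chunk the URL list so each file stays under the 50,000-URL limit.
  for (let i = 0; i < pages.length; i += MAX_URLS_PER_FILE) {
    const chunk = pages.slice(i, i + MAX_URLS_PER_FILE);
    const name = `sitemap-${files.length + 1}.xml`;
    const body = chunk
      .map(p => `  <url><loc>${escapeXml(p.url)}</loc><lastmod>${p.modified}</lastmod></url>`)
      .join("\n");
    writeFileSync(join(OUT_DIR, name),
      `<?xml version="1.0" encoding="UTF-8"?>\n` +
      `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${body}\n</urlset>\n`);
    files.push(name);
  }

  // Sitemap index: the single file you would name in robots.txt (step 3).
  const index = files
    .map(f => `  <sitemap><loc>${SITE}/${f}</loc></sitemap>`)
    .join("\n");
  writeFileSync(join(OUT_DIR, "sitemap-index.xml"),
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${index}\n</sitemapindex>\n`);
}

writeSitemaps();
```

The generated sitemap-index.xml is what step 3's robots.txt line would name (`Sitemap: https://www.example.com/sitemap-index.xml`), and the chunking keeps each file under the sitemap protocol's limit of 50,000 URLs per file.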

半仙 2024-08-10 23:50:59


For the CMS-powered sites I work on, with 70,000 to 350,000 pages/folders each, we typically regenerate the sitemap XML once every 24 hours, and we've never had any problems with that. Unless your site is as popular as Stack Overflow - and Google recognizes that it gets updated as frequently as SO - Google won't re-crawl your site often enough to justify keeping a fully up-to-date sitemap file.

冷清清 2024-08-10 23:50:59


Bearing in mind that Google doesn't read your sitemap that often, it's safe to regenerate it with a daily cron job. If you schedule a rebuild each evening during the quiet hours, Google will pick up the changes the next time it polls the sitemap.
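
As a concrete illustration of the nightly rebuild, a crontab entry might look like this; the script path and the `tsx` runner are assumptions, not details from the answer:

```
# Rebuild the sitemap files at 03:00 each night, in the quiet hours (hypothetical paths)
0 3 * * * cd /opt/cms && npx tsx generate-sitemap.ts >> /var/log/sitemap.log 2>&1
```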

一梦等七年七年为一梦 2024-08-10 23:50:59


We work with thousands of relevant URLs at our startup (Epiloge). As people have said here and elsewhere, you want a static sitemap.xml that gets regenerated regularly (every day or every few days) and thus includes all new relevant URLs.

If you are working in a JavaScript environment and don't want to use any libraries or frameworks, the article below describes a plain JavaScript solution that builds a sitemap by querying a database for all the URLs, assembling the XML, and producing the file, which you can then plug into your site structure.

https://www.epiloge.com/how-to-generate-a-sitemapxml-with-javascript-for-dynamic-page-urls-296c42
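
Whatever language you generate it in, the target output is the standard format defined by sitemaps.org. For reference, a minimal sitemap.xml with placeholder URLs looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-08-01</lastmod>
  </url>
</urlset>
```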
