Automatically prevent wiki-rot in Trac?
Hi guys: Is there a way to improve trac wiki quality using a plugin that deals with artifacts like obsolete pages, pages that refer to code which doesn't exist anymore, pages that are unlinked, or pages which have a low update rate? I think there might be several heuristics which could be used to prevent wiki-rot:
- Number of recent edits
- Number of recent views
- Whether or not a page links to a source file
- Whether a wiki page's last update is older or newer than the source files it links to
- Whether entire directories in the wiki have been used/edited/ignored over the last "n" days
etc. etc. etc.
If nothing else, just these metrics alone would be useful for each page and each directory from an administrative standpoint.
2 Answers
I don't know of an existing plugin that does this, but everything you mentioned certainly sounds do-able in one way or another.
You can use the trac-admin CLI command to get a list of wiki pages and to dump the contents of a particular wiki page (as plain text) to a file or stdout. Using this, you can write a script that reads in all of the wiki pages, parses the content for links, and generates a graph of which pages link to what. This should pinpoint "orphans" (pages that aren't linked to), pages that link to source files, and pages that link to external resources. Running external links through something like wget can help you identify broken links. To access last-edited dates, you'll want to query Trac's database. The query you'll need will depend on the particular database type that you're using. For playing with the database in a (relatively) safe and easy manner, I find the WikiTableMacro and TracSql plugins quite useful.
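The trac-admin part can be scripted fairly directly. A minimal Python sketch using the `wiki list` and `wiki export` admin commands; the environment path is a placeholder, the `wiki list` parsing is naive (output columns vary between Trac versions), and the link pattern is simplified, so expect to adapt all three:

```python
import re
import subprocess

TRAC_ENV = "/path/to/trac/env"  # placeholder: your Trac environment path

def trac(*args):
    """Run a trac-admin subcommand and return its stdout."""
    return subprocess.check_output(["trac-admin", TRAC_ENV, *args], text=True)

def wiki_pages():
    # Naive parse of `wiki list` output: take the first token of each
    # non-empty line as a page name (adjust for your Trac version).
    return [line.split()[0] for line in trac("wiki", "list").splitlines()
            if line.strip()]

# Simplified link pattern: explicit [wiki:Page] links plus bare CamelCase.
LINK_RE = re.compile(r"\[wiki:([\w/]+)|(?<![\w/])([A-Z][a-z]+(?:[A-Z][a-z]+)+)")

def link_graph(pages):
    """Map each page name to the set of existing pages it links to."""
    existing = set(pages)
    graph = {}
    for page in pages:
        text = trac("wiki", "export", page)
        graph[page] = {a or b for a, b in LINK_RE.findall(text)} & existing
    return graph

if __name__ == "__main__":
    graph = link_graph(wiki_pages())
    linked_to = set().union(*graph.values()) if graph else set()
    for orphan in sorted(set(graph) - linked_to):
        print("orphan:", orphan)
```

The same pass could just as well collect the links that *don't* resolve to existing pages, which is where the external-link and source-link checks would hang off.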
The hardest feature in your question to implement would be the one regarding page views. I don't think that Trac keeps track of page views; you'll probably have to parse your web server's logs for that sort of information.
How about these:
BadLinksPlugin: This plugin logs bad local links found in wiki content.
It's a quite new one and just deals with dangling links, although from what I see in the source code it catches any bad local link. This is at least one building block for the solution you're asking for.
VisitCounterMacro: This macro displays how many times a wiki page has been displayed.
This is a rather old one. You only get the statistic per page, and an administrative overview is missing, but that could be built rather easily, e.g. like a custom PageIndex.