Drupal: converting nid/vid from int to bigint
I am busy with a project where the nid and vid values may reach their limit. I need a mechanism to change current and future nid and vid data types from int to bigint.
I figured maybe there was a schema alter hook, or something similar. I see there is a hook called hook_schema_alter.
How reliable would it be to build a module that simply checks for nid and vid in the schema and changes them to bigint? Would this be a practical way of solving the problem? Would it work with all content types, both module-defined ones and CCK?
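For illustration, a minimal Drupal 6 sketch of that idea (hypothetical module name mymodule; note that hook_schema_alter() only rewrites the schema definition that drupal_get_schema() builds in memory, so columns in tables that already exist would still need to be altered separately):

/**
 * Implements hook_schema_alter().
 *
 * Sketch: widen every nid/vid column in the collected schema to a big
 * integer, covering both plain int columns and serial (auto-increment)
 * ones. This only changes what Drupal reports as the schema; it does
 * not ALTER existing database tables.
 */
function mymodule_schema_alter(&$schema) {
  foreach ($schema as $table => $definition) {
    foreach (array('nid', 'vid') as $column) {
      if (isset($definition['fields'][$column]) && in_array($definition['fields'][$column]['type'], array('int', 'serial'))) {
        $schema[$table]['fields'][$column]['size'] = 'big';
      }
    }
  }
}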
G.
Comments (2)
As hook_schema_alter() will only be fired on module install, rather than building a complex module that manages this automatically, you should pick the subset of modules that you know you will be using, install them, and manually update the schema.
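A manual update of the core tables could look like this (a sketch only, assuming Drupal 6 on MySQL and a hypothetical mymodule.install; CCK field tables and every other table that stores nid/vid need the same treatment):

/**
 * Sketch of a manual schema update: widen nid/vid on the core node
 * tables to unsigned bigint. node.nid and node_revisions.vid are the
 * auto-increment (serial) columns; their counterparts are plain ints.
 */
function mymodule_update_6001() {
  $ret = array();
  $big_serial = array('type' => 'serial', 'size' => 'big', 'unsigned' => TRUE, 'not null' => TRUE);
  $big_int = array('type' => 'int', 'size' => 'big', 'unsigned' => TRUE, 'not null' => TRUE, 'default' => 0);
  db_change_field($ret, 'node', 'nid', 'nid', $big_serial);
  db_change_field($ret, 'node', 'vid', 'vid', $big_int);
  db_change_field($ret, 'node_revisions', 'vid', 'vid', $big_serial);
  db_change_field($ret, 'node_revisions', 'nid', 'nid', $big_int);
  return $ret;
}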
If you are going to have 4 billion nodes (the other poster said 2 billion, but nid is unsigned, which doubles the available range), you really should not be turning modules on and off at random. Your architecture should be rock solid and planned out well in advance.
Also, what's your use case for wanting that many nodes in Drupal? Any kind of database operation with that many rows is going to be very, very intensive, even when fully optimized and without the weight of the Drupal stack (and its love of expensive JOIN queries) on top of it.
Drupal will be fine for prototyping whatever you're building, but by the time you hit xxx,000 nodes you'll already be spending the majority of your time hand-tuning everything for performance. You may get to x,000,000 nodes if you have serious world-class expertise and funding. For anything more, you will probably want to start looking at offloading that data into a database system that is specifically optimized for huge datasets, and then accessing it from Drupal as a service.
Take a look at Hadoop and Cassandra for examples of DBMSes that can scale to billions of items (Google, Facebook, Twitter, etc. use them).
If your nid/vids are going to get past 4 billion, you might have some other issues to deal with before you care about this :) Also, since you are on D6: if this isn't, say, 200,000,000 pieces of content with 20 revisions each, but rather something else like stock price change information, I would store it in its own table.
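For the separate-table route, the D6 Schema API lets you declare a wide auto-increment key directly; here is a sketch with a hypothetical table for stock price changes:

/**
 * Implements hook_schema().
 *
 * Sketch: high-volume rows kept out of the node system entirely, with
 * a bigint serial key so the 4 billion ceiling never comes into play.
 */
function mymodule_schema() {
  $schema['mymodule_price_change'] = array(
    'description' => 'One row per stock price change.',
    'fields' => array(
      'id' => array('type' => 'serial', 'size' => 'big', 'unsigned' => TRUE, 'not null' => TRUE),
      'symbol' => array('type' => 'varchar', 'length' => 16, 'not null' => TRUE, 'default' => ''),
      'price' => array('type' => 'numeric', 'precision' => 10, 'scale' => 2, 'not null' => TRUE),
      'changed' => array('type' => 'int', 'unsigned' => TRUE, 'not null' => TRUE, 'default' => 0),
    ),
    'primary key' => array('id'),
    'indexes' => array(
      'symbol_changed' => array('symbol', 'changed'),
    ),
  );
  return $schema;
}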