NHibernate Schema Generation
Could you please share your experience with NHibernate schema generation? How scalable is it in terms of the complexity and size of the data model? Does it have any major performance implications compared to a hand-crafted data model?
Comments (4)
I've found it immensely useful for development, where you can use it with a bit of code to rebuild and repopulate test databases at will. Michael's point about migrations matches our experience: once you've made the initial release, you'll need to decide on another method for altering production databases.
FWIW, we've used NH schema generation with about 30 models of the usual kinds (including a table-per-subclass arrangement), and the definitions it generates are correct, so there's no obvious limit to the size of schema it can handle.
I now tend to think that an automatically generated schema is almost always a better starting point than a hand-crafted one, because the software will give you something that is totally consistent and exactly what you specified. The kinds of optimizations that a skilled DBA can do aren't likely to be necessary or useful until after you have a large, specific workload to tune for.
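To make the rebuild-and-repopulate workflow described above concrete, here is a minimal sketch of the kind of helper code involved. The assembly name is a placeholder, but `Configuration` and `SchemaExport` are NHibernate's standard schema-generation entry points (in `NHibernate.Cfg` and `NHibernate.Tool.hbm2ddl`):

```csharp
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;

// Build the configuration from hibernate.cfg.xml plus the mapped
// assembly ("MyApp.Domain" is a placeholder for your own assembly).
var cfg = new Configuration();
cfg.Configure();
cfg.AddAssembly("MyApp.Domain");

// Drop and re-create the whole schema in the configured database;
// afterwards, repopulate it with whatever test data you need.
var export = new SchemaExport(cfg);
export.Execute(
    false,  // don't echo the DDL to the console
    true,   // execute against the database
    false); // create the schema, not just drop it
```

Running this at the start of a test fixture gives every test run a clean, consistent database without any hand-written DDL.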
If you need to export your schema and populate your database, you might want to look at the Fluent NHibernate Schema Tool. It can read your assemblies, hibernate.cfg.xml, *.hbm.xml files, and Fluent mappings. You can generate/execute the DDL for your database (create/update/drop tables), and it accepts a CSV-like input file for populating the created/updated database (the dataset file accepts small queries written in HQL). This tool is very useful for unit testing and for web applications that use NHibernate.
See more: https://bitbucket.org/guibv/fnst/wiki/Home.
You are comparing apples and pears. A hand-crafted model will always (well, should) outperform any ORM technology.
Personally, I think NHibernate performs well and will map virtually any OO model to a relational model; that's the beauty of it. There are a few gotchas, such as being aware of application start-up time and making sure you are using session management correctly.
I would recommend NHibernate; I've been using it for 18 months now on schemas holding around 80 tables or so and have not yet seen any major issues.
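The session-management gotcha mentioned above usually comes down to scoping sessions and transactions correctly. A common sketch looks like the following; the entity and id are hypothetical, but `OpenSession`, `BeginTransaction`, and `Get` are NHibernate's standard session API:

```csharp
// The ISessionFactory is expensive to build - this is the application
// start-up cost the answer mentions - so create it once and reuse it.
// Sessions themselves are cheap and should be short-lived:
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    var customer = session.Get<Customer>(customerId);
    customer.Name = "Updated name";
    tx.Commit();
} // session and transaction are disposed here, releasing the connection
```

Keeping one session per unit of work (e.g. per web request) avoids most of the lifetime and lazy-loading problems people run into.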
I would say that there aren't any performance implications. In fact, there aren't many options for how to create tables that fit the mapping files. There are some additional features just for schema creation, such as the ability to specify database data types, create constraints and indexes, and run arbitrary SQL when creating the schema.
Performance tuning can usually be done after automatically creating the schema. For instance, you let NH create the tables and then run some ALTER TABLE statements to set performance-relevant settings. It is also very easy to create (or replace) indexes afterwards. All of this could even be written into the mapping files. The hard work is still done by NH: creating all the tables and columns according to the information that is already there, in the mapping files.