Saving data through a web service using NHibernate?


We currently have an application that retrieves data from the server through a web service and populates a DataSet. The users of the API then manipulate it through objects, which in turn change the DataSet. The changes are then serialized, compressed, and sent back to the server to be applied.

However, I have begun using NHibernate in my projects, and I really like the disconnected nature of POCO objects. The problem we have now is that our objects are so tied to the internal DataSet that they cannot be used in many situations, and we end up creating duplicate POCO objects to pass back and forth.

Batch.GetBatch() -> calls to web server and populates an internal dataset
Batch.SaveBatch() -> send changes to web server from dataset 

Is there a way to achieve a model similar to the one we are using, where all database access occurs through a web service, but with NHibernate?

Edit 1

I have a partial solution that works and persists through a web service, but it has two problems.

  1. I have to serialize and send my whole collection, not just the changed items.
  2. If I try to repopulate the collection when my objects are returned, any references I had are lost.

Here is my example solution.

Client Side

public IList<Job> GetAll()
{
    return coreWebService
      .GetJobs()
      .BinaryDeserialize<IList<Job>>();
}

public IList<Job> Save(IList<Job> Jobs)
{
    return coreWebService
             .Save(Jobs.BinarySerialize())
             .BinaryDeserialize<IList<Job>>();
}
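
For reference, BinarySerialize / BinaryDeserialize are just thin extension-method wrappers; they look roughly like this (a BinaryFormatter-based sketch, the real helpers may differ):

using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static class BinarySerializationExtensions
{
    // Serialize any [Serializable] object graph to a byte array.
    public static byte[] BinarySerialize<T>(this T graph)
    {
        using (var stream = new MemoryStream())
        {
            new BinaryFormatter().Serialize(stream, graph);
            return stream.ToArray();
        }
    }

    // Deserialize a byte array back into the requested type.
    public static T BinaryDeserialize<T>(this byte[] bytes)
    {
        using (var stream = new MemoryStream(bytes))
        {
            return (T)new BinaryFormatter().Deserialize(stream);
        }
    }
}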

Server Side

[WebMethod]
public byte[] GetJobs()
{
    using (ISession session = NHibernateHelper.OpenSession())
    {
        return (from j in session.Linq<Job>()
                select j).ToList().BinarySerialize();
    }
}

[WebMethod]
public byte[] Save(byte[] JobBytes)
{
    var Jobs = JobBytes.BinaryDeserialize<IList<Job>>();

    using (ISession session = NHibernateHelper.OpenSession())
    using (ITransaction transaction = session.BeginTransaction())
    {
        // Attach each deserialized job; NHibernate inserts or updates based on its identifier.
        foreach (var job in Jobs)
        {
            session.SaveOrUpdate(job);
        }
        transaction.Commit();
    }

    // Returns the same deserialized list, not server-merged instances.
    return Jobs.BinarySerialize();
}

As you can see, I am sending the whole collection to the server each time and then returning the whole collection. But I'm getting a replaced collection back instead of a merged/updated one, and it also seems highly inefficient to send all the data back and forth when only part of it may have changed.
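
One possible client-side workaround for the replaced-collection / lost-references problem would be to merge the returned state into the instances I already hold instead of swapping in the deserialized list. A sketch only, assuming Job exposes an Id and a hypothetical CopyFrom helper that copies its persistent fields:

using System.Collections.Generic;
using System.Linq;

public static class JobCollectionMerger
{
    // Copy the server's returned state onto the Job instances the client already
    // references, keyed by Id, so existing references stay valid.
    public static void MergeInto(IList<Job> existing, IList<Job> returned)
    {
        var byId = existing.ToDictionary(j => j.Id);
        foreach (var returnedJob in returned)
        {
            Job current;
            if (byId.TryGetValue(returnedJob.Id, out current))
            {
                current.CopyFrom(returnedJob);   // hypothetical field-by-field copy
            }
            else
            {
                existing.Add(returnedJob);       // job created on the server side
            }
        }
    }
}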

Edit 2

I have seen several references on the web to almost-transparent persistence mechanisms. I'm not sure whether they would work, and most of them look highly experimental.

I'm having a hard time finding a replacement for the DataSet model we are using today. The reason I want to get away from that model is that it takes a lot of work to tie every property of every class to a row/cell of a DataSet, and it also tightly couples all of my classes together.
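
To illustrate the plumbing I mean, every property currently has to be wired to a row/cell by hand, roughly like this (hypothetical names, not our actual code):

using System.Data;

// Hypothetical example of the DataSet-bound pattern described above.
public class DataSetBackedJob
{
    private readonly DataRow _row;

    public DataSetBackedJob(DataRow row)
    {
        _row = row;
    }

    // Every property reads from and writes to its cell by hand.
    public string Name
    {
        get { return (string)_row["Name"]; }
        set { _row["Name"] = value; }
    }
}

// The equivalent property on an NHibernate-mapped POCO is just:
// public virtual string Name { get; set; }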

Comments (2)

迷迭香的记忆 2024-08-16 04:17:18


I've only taken a cursory look at your question, so forgive me if my response is shortsighted, but here goes:

I don't think you can logically get away from doing a mapping from domain object to DTO.

By using the domain objects over the wire you are tightly coupling your client and service; part of the reason to have a service in the first place is to promote loose coupling, so that's an immediate issue.

On top of that you're going to end up with a brittle domain logic interface where you can't make changes on the service side without breaking your client.

I suspect your best bet would be to implement a loosely coupled service that exposes REST or some other loosely coupled interface. You could use a library such as AutoMapper to make the conversions simpler and easier, and also to flatten data structures as necessary.
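
For example, a minimal AutoMapper sketch along those lines; JobDto and its properties are placeholders, and this uses the MapperConfiguration-based API (older AutoMapper versions used the static Mapper class instead):

using System.Collections.Generic;
using AutoMapper;

// Placeholder DTO used only on the service boundary.
public class JobDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class JobMapping
{
    // One-time configuration; properties with matching names map by convention.
    private static readonly IMapper Mapper = new MapperConfiguration(cfg =>
    {
        cfg.CreateMap<Job, JobDto>();   // domain -> DTO (outgoing)
        cfg.CreateMap<JobDto, Job>();   // DTO -> domain (incoming changes)
    }).CreateMapper();

    public static IList<JobDto> ToDtos(IList<Job> jobs)
    {
        return Mapper.Map<IList<JobDto>>(jobs);
    }

    public static IList<Job> ToJobs(IList<JobDto> dtos)
    {
        return Mapper.Map<IList<Job>>(dtos);
    }
}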

At this point I don't know of any way to really cut down the verbosity involved in building the interface layers, but having worked on large projects that didn't make the effort, I can honestly tell you the savings weren't worth it.

极度宠爱 2024-08-16 04:17:18


I think your question revolves around this issue:

http://thatextramile.be/blog/2010/05/why-you-shouldnt-expose-your-entities-through-your-services/

Are you or are you not going to send ORM-Entities over the wire?

Since you have a service-oriented architecture, I (like the author) do not recommend this practice.

I use NHibernate. I call those ORM-Entities. They are THE POCO model. But they have "virtual" properties that allow for lazy-loading.

However, I also have some DTO-Objects. These are also POCOs. These do not have lazy-loading-friendly properties.
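
Roughly what that split looks like (the names here are placeholders, not my real model):

using System;
using System.Collections.Generic;

// Hypothetical ORM-Entity: members are virtual so NHibernate can proxy them
// and lazy-load the Children collection.
public class Parent
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
    public virtual IList<Child> Children { get; set; }   // may come back as a lazy proxy
}

public class Child
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

// Hypothetical wire-level DTOs: plain properties, no proxies, no lazy loading.
[Serializable]
public class ParentDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<ChildDto> Children { get; set; }   // hydrated up front, "just right"
}

[Serializable]
public class ChildDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}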

So I do a lot of "converting". I hydrate ORM-Entities (with NHibernate)... and then I end up converting them to Domain-DTO-Objects. Yes, it stinks in the beginning.

The server sends out the Domain-DTO-Objects. There is NO lazy loading. I have to populate them with the "Goldilocks" "just right" model. That is, if I need Parent(s) with one level of children, I have to know that up front and send the Domain-DTO-Objects over that way, with just the right amount of hydration.
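
With NHibernate, that "just right" hydration is typically an eager fetch in the query, for example (reusing the hypothetical Parent/Child entities sketched above, and assuming NHibernate 3's built-in LINQ provider rather than the older Linq<T>() contrib used in the question):

using System.Linq;
using NHibernate;
using NHibernate.Linq;   // Query<T>() and FetchMany extension methods

public static class ParentFetching
{
    public static void Example(ISessionFactory sessionFactory)
    {
        using (ISession session = sessionFactory.OpenSession())
        {
            // Parents and their Children are loaded in one round trip, so the
            // DTOs can be filled completely and nothing lazy-loads later.
            var parents = session.Query<Parent>()
                                 .FetchMany(p => p.Children)
                                 .ToList();
        }
    }
}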

When I send Domain-DTO-Objects back (from the client to the server), I have to reverse the process. I convert the Domain-DTO-Objects into ORM-Entities and allow NHibernate to work with the ORM-Entities.

Because the architecture is "disconnected", I do a lot of (NHibernate) ".Merge()" calls.

        // ormItem is any NHibernate POCO coming back from the client
        using (ISession session = ISessionCreator.OpenSession())
        {
            using (ITransaction transaction = session.BeginTransaction())
            {
                // Merge copies the detached object's state onto the persistent
                // instance (loading it if necessary) and returns that instance.
                ParkingAreaNHEntity mergedItem = session.Merge(ormItem);
                transaction.Commit();
            }
        }

.Merge is a wonderful thing. Entity Framework does not have it. Boo.

Is this a lot of setup? Yes.
Do I think it is perfect? No.

However, because I send very basic DTOs (POCOs) that are not "flavored" to the ORM, I have the ability to switch ORMs without killing my contracts to the outside world.

My data layer can be ADO.NET, EF, NHibernate, or anything. If I switch, I have to write the "converters" and the ORM code, but everything else is isolated.

Many people argue with me. They say I'm doing too much and that the ORM-Entities are fine.

Again, I like the "do not allow any lazy loading" appearance, and I prefer to have my data layer isolated. My clients should not know or care about my data layer/ORM of choice.

There are just enough subtle differences (or some not-so-subtle ones) between EF and NHibernate to screwball the game plan.

Do my Domain-DTO-Objects look 95% like my ORM-Entities? Yep. But it's that 5% that can screwball you.

Moving from DataSets isn't trivial, especially if they are populated from stored procedures with a lot of business logic in the T-SQL. But now that I work with an object model and never write a stored procedure that isn't a simple CRUD operation, I'd never go back.

And I hate maintenance projects with voodoo TSQL in the stored procedures. It ain't 1999 anymore. Well, most places.

Good luck.

PS: Without .Merge (in EF), here is what you have to do in a disconnected world (boo, Microsoft):

http://www.entityframeworktutorial.net/EntityFramework4.3/update-many-to-many-entity-using-dbcontext.aspx
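
In other words, roughly this: a sketch of the DbContext attach-and-mark-modified pattern that link describes. ExampleContext and its Jobs set are made up for illustration, and the namespaces assume an EF 6-style DbContext:

using System.Data.Entity;   // EF DbContext API

// Made-up context for illustration only.
public class ExampleContext : DbContext
{
    public DbSet<Job> Jobs { get; set; }
}

public static class EfDisconnectedSave
{
    public static void Save(Job detachedJob)
    {
        using (var context = new ExampleContext())
        {
            // Attach the detached object, then tell EF it has changed; related
            // collections still have to be reconciled by hand, which is what
            // the linked article walks through.
            context.Jobs.Attach(detachedJob);
            context.Entry(detachedJob).State = EntityState.Modified;
            context.SaveChanges();
        }
    }
}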
