Design pattern suitable for real-time client/server physics simulation?
Is there a design pattern that can help in designing a client/server real-time physics simulation? Because of network constraints, the design has some specific needs (logic/code decoupling), such as:
drawing on the client, keeping the main simulation logic (the engine) on the server, and not sending much data over the wire
keeping half of an object's code on the server and half on the client (the changing part, which is what gets drawn)
sending only the parts of objects (attributes) that change constantly
Any suggestion as far as an implementation with Java NIO is concerned would be really appreciated (a sketch of the delta-sending idea follows below).
Thanks,
jibbylala
P.S.: I saw a lot of articles, discussions, and pseudocode on how decoupling can be achieved, but didn't find any particularly simple, compact implementation.
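A minimal sketch of the "send only what changed" idea, assuming a toy entity with a few float fields and a hand-rolled wire format. The `BallState` name, the field bitmask, and the byte layout are illustrative assumptions, not an established protocol; the resulting `ByteBuffer` would then be written to a non-blocking `SocketChannel` inside an ordinary Java NIO selector loop.

```java
import java.nio.ByteBuffer;

// Hypothetical entity: only fields flagged dirty since the last send are serialized.
public class BallState {
    private static final byte FIELD_X  = 1;       // bit 0
    private static final byte FIELD_Y  = 1 << 1;  // bit 1
    private static final byte FIELD_VX = 1 << 2;  // bit 2
    private static final byte FIELD_VY = 1 << 3;  // bit 3

    private float x, y, vx, vy;
    private byte dirtyMask;                       // which fields changed since the last snapshot

    public void setX(float x)   { this.x = x;   dirtyMask |= FIELD_X; }
    public void setY(float y)   { this.y = y;   dirtyMask |= FIELD_Y; }
    public void setVx(float vx) { this.vx = vx; dirtyMask |= FIELD_VX; }
    public void setVy(float vy) { this.vy = vy; dirtyMask |= FIELD_VY; }

    /** Server side: writes only the dirty fields as [mask][changed floats...]. */
    public void writeDelta(ByteBuffer out) {
        out.put(dirtyMask);
        if ((dirtyMask & FIELD_X) != 0)  out.putFloat(x);
        if ((dirtyMask & FIELD_Y) != 0)  out.putFloat(y);
        if ((dirtyMask & FIELD_VX) != 0) out.putFloat(vx);
        if ((dirtyMask & FIELD_VY) != 0) out.putFloat(vy);
        dirtyMask = 0;                            // everything written is now clean
    }

    /** Client side: applies whatever fields the server included in the packet. */
    public void readDelta(ByteBuffer in) {
        byte mask = in.get();
        if ((mask & FIELD_X) != 0)  x = in.getFloat();
        if ((mask & FIELD_Y) != 0)  y = in.getFloat();
        if ((mask & FIELD_VX) != 0) vx = in.getFloat();
        if ((mask & FIELD_VY) != 0) vy = in.getFloat();
    }
}
```

This keeps the simulation code (which mutates the fields) decoupled from the networking code (which only sees `writeDelta`/`readDelta`), and the client's drawing code only ever reads the state object it receives.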
Comments (1)
I highly recommend reading Glenn Fiedler's articles on this subject.
Real-time server-client networked physics is a challenging problem due to the constraints of bandwidth and latency. To alleviate these problems, most networked simulations rely on clients sending only their input to the server, with the server sending back only the differences between frame states. On top of this, the client can predict the server's world state using client-side prediction. Unfortunately, this model is imperfect, because interactions between multiple clients' actions can create discrepancies in the client-side prediction. Dealing with those discrepancies is another part of the problem you have to solve...
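To make the "clients send input, server sends authoritative state, client predicts ahead" loop concrete, here is a rough Java sketch of client-side prediction with server reconciliation. The names (`PredictedClient`, `sampleAndPredict`, `onServerState`), the fixed 60 Hz step, and the trivial physics are assumptions for illustration only; a real game would share the actual integration code between client and server so both sides step the world identically.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative sketch of client-side prediction with server reconciliation (Java 16+ for records).
public class PredictedClient {

    /** One frame of player input, tagged with a sequence number. */
    record Input(int sequence, float moveX, float moveY) {}

    /** Minimal mutable player state advanced by the same step function on client and server. */
    static class PlayerState {
        float x, y;
        PlayerState(float x, float y) { this.x = x; this.y = y; }
        void apply(Input in, float dt) { x += in.moveX() * dt; y += in.moveY() * dt; }
    }

    private static final float DT = 1f / 60f;     // fixed simulation step

    private PlayerState predicted = new PlayerState(0, 0);
    private final Deque<Input> pendingInputs = new ArrayDeque<>(); // sent but not yet acknowledged
    private int nextSequence = 0;

    /** Called every client frame: predict locally and queue the input for the server. */
    public Input sampleAndPredict(float moveX, float moveY) {
        Input input = new Input(nextSequence++, moveX, moveY);
        predicted.apply(input, DT);               // client-side prediction
        pendingInputs.addLast(input);             // remember it until the server confirms it
        return input;                             // caller serializes and sends this to the server
    }

    /** Called when an authoritative snapshot arrives: rewind and replay unacknowledged inputs. */
    public void onServerState(float serverX, float serverY, int lastProcessedSequence) {
        predicted = new PlayerState(serverX, serverY);                        // accept the server's truth
        pendingInputs.removeIf(in -> in.sequence() <= lastProcessedSequence); // drop confirmed inputs
        for (Input in : pendingInputs) {
            predicted.apply(in, DT);              // re-simulate inputs the server has not seen yet
        }
    }

    public float x() { return predicted.x; }
    public float y() { return predicted.y; }
}
```

The key design choice is that the client never trusts its own extrapolation: whenever an authoritative snapshot arrives it rewinds to the server's state and replays only the inputs the server has not yet processed, which keeps prediction errors from accumulating.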