Combining Spring, Quartz scheduling and Hazelcast
I am trying to work out how to develop a reasonably scalable batch-processing framework for a webapp I am writing.
I am using Spring MVC for the webapp, with a custom DAO layer (to access the database, a UnitOfWork instance must be constructed from a UnitOfWorkFactory that is marked @Autowired and injected at runtime by Spring).
I am using the Spring Scheduler annotations (@Scheduled) to schedule tasks; however, I want these tasks to run on different machines in my cluster. Each batch job should be picked up by one of the cluster machines and then executed.
Hazelcast seemed like a natural fit for this as the Distributed Execution design appeared really simple and elegant for this purpose.
I am running into a problem that doesn't seem to be covered by the documentation. I have read the documentation about Spring integration, however it seems focused on how to configure Hazelcast using Spring (which I have done already).
When the scheduler indicates that the task is to start, I want to create a new instance of the task (a Callable instance) and submit it to the DistributedExecutor. When a cluster machine receives the task to run, I need the Spring container on that machine to inject the UnitOfWorkFactory instance into the batch task before the task executes. Each of the cluster machines starts up with Spring and will already have the UnitOfWorkFactory instantiated with the correct details; the problem is getting that UnitOfWorkFactory instance injected into my task.
Does anybody know how I can configure my application so that Hazelcast can have the UnitOfWorkFactory injected automatically when a Callable is started? I have tried marking the Callable as Serializable and ApplicationContextAware but I still get NullPointerException when trying to run the task.
I could access the ApplicationContext directly, however I would rather not, as it would restrict the testability of my tasks and introduce a hard dependency on Spring for my batch jobs.
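For reference, the scheduling side described above might be sketched roughly like this. This is a minimal sketch, assuming a Hazelcast 2.x-era API; the class name `BatchJobScheduler`, the executor name `"batch"`, the cron expression, and the placeholder `BatchTask` are all illustrative, not from the original post:

```java
import java.io.Serializable;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Future;

import org.springframework.scheduling.annotation.Scheduled;

import com.hazelcast.core.HazelcastInstance;

public class BatchJobScheduler {

    private final HazelcastInstance hazelcast;

    public BatchJobScheduler(HazelcastInstance hazelcast) {
        this.hazelcast = hazelcast;
    }

    // Example schedule only; the real cron expression would come from the app.
    @Scheduled(cron = "0 0 2 * * *")
    public void runNightlyBatch() throws Exception {
        // A distributed executor: Hazelcast picks one cluster member
        // to deserialize and run the submitted task.
        ExecutorService executor = hazelcast.getExecutorService("batch");
        Future<Void> result = executor.submit(new BatchTask());
        result.get(); // optionally block until the job finishes
    }

    // Placeholder for the batch Callable described in the question; it must
    // be Serializable so Hazelcast can ship it to another member. On that
    // member, the @Autowired UnitOfWorkFactory would still be null -- which
    // is exactly the problem the question is about.
    public static class BatchTask implements Callable<Void>, Serializable {
        private static final long serialVersionUID = 1L;

        @Override
        public Void call() {
            // ... the actual batch work would go here ...
            return null;
        }
    }
}
```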
1 Answer
As of version 2.1, Hazelcast can inject the Spring context and/or Spring beans into Hazelcast managed objects. If you configure Hazelcast using the Hazelcast Spring configuration and annotate a bean with @SpringAware, Hazelcast will ask Spring to inject that bean's dependencies. Versions of Hazelcast before 2.1 are not Spring-aware, so it is not possible to inject the Spring context or any Spring bean into a Hazelcast managed object on pre-2.1 versions.
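Applied to the question, the @SpringAware approach might look like the sketch below. It assumes Hazelcast 2.1+ configured through the hazelcast-spring namespace (which sets up a SpringManagedContext, the hook that autowires @SpringAware objects after deserialization); UnitOfWorkFactory and UnitOfWork are the questioner's own types, and the create() factory method is an assumption for illustration:

```java
import java.io.Serializable;
import java.util.concurrent.Callable;

import org.springframework.beans.factory.annotation.Autowired;

import com.hazelcast.spring.context.SpringAware;

// When this task is deserialized on a cluster member, Hazelcast's
// SpringManagedContext asks that member's local Spring container to
// autowire it before call() runs.
@SpringAware
public class BatchTask implements Callable<Void>, Serializable {

    private static final long serialVersionUID = 1L;

    // transient: the factory is never serialized with the task; each member
    // injects its own locally configured instance instead.
    @Autowired
    private transient UnitOfWorkFactory unitOfWorkFactory;

    @Override
    public Void call() throws Exception {
        // create() is an assumed factory method on the questioner's API.
        UnitOfWork unitOfWork = unitOfWorkFactory.create();
        // ... run the batch job through the unit of work ...
        return null;
    }
}
```

If Hazelcast is not created through the Spring namespace, the managed context has to be wired manually (roughly `config.setManagedContext(springManagedContext)` on the Hazelcast Config, with SpringManagedContext registered as a Spring bean); check this against the docs for your Hazelcast version.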
There is a post asking about this feature on the Hazelcast group.
Hazelcast / Dependency injection for Callable
As you may already know, and as was suggested on the Hazelcast group, you can access the Spring ApplicationContext using:
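The snippet the answer referred to was not preserved here. A common version of that workaround (an assumption on my part, not the answer's original code) is a static ApplicationContextAware holder; the class and method names below are illustrative:

```java
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;

// Registered as a Spring bean on every cluster member, this holder captures
// the local ApplicationContext so that deserialized tasks can look beans
// up manually.
public class ApplicationContextHolder implements ApplicationContextAware {

    private static volatile ApplicationContext context;

    @Override
    public void setApplicationContext(ApplicationContext applicationContext)
            throws BeansException {
        context = applicationContext;
    }

    public static ApplicationContext getContext() {
        return context;
    }
}
```

A task's call() method could then do `ApplicationContextHolder.getContext().getBean(UnitOfWorkFactory.class)` -- but note this is exactly the hard Spring dependency and testability cost the questioner wanted to avoid, which is why the @SpringAware route above is the cleaner fit.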