Spring Cloud Data Flow datasource overrides the Spring Batch application datasource

Published on 2025-01-14 14:15:04

I'm setting up an instance of Spring Cloud Data Flow. I've run the following command:

java -jar spring-cloud-dataflow-server-2.9.2.jar \
--spring.cloud.dataflow.features.streams-enabled=false \
--spring.cloud.dataflow.features.schedules-enabled=true \
--spring.datasource.url=jdbc:postgresql://localhost:5432/batch \
--spring.datasource.username=postgres \
--spring.datasource.password=postgres \
--spring.datasource.driver-class-name=org.postgresql.Driver \
--spring.datasource.initialization_mode=always

I've developed a batch job using Spring Batch and deployed it to this platform. The job uses two data sources: batch for the Spring Batch and Spring Cloud Task metadata, and app_db for my business logic. When I run the app locally, it persists metadata in batch and my business data in app_db, as expected. The problem arises when I execute the job inside Spring Cloud Data Flow: the platform overrides my configured business-logic database and uses only the batch database, which is supposed to store metadata only.

application.yaml

spring:
  batch:
    datasource:
      url: jdbc:postgresql://localhost:5432/batch
      username: postgres
      password: postgres

  datasource:
    url: jdbc:postgresql://localhost:5432/app_db
    username: postgres
    password: postgres

DatasourceConfiguration

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class DatasourceConfiguration {

    // Primary datasource: business data (app_db), bound to spring.datasource
    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource")
    public DataSourceProperties dataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    public DataSource dataSource(DataSourceProperties dataSourceProperties) {
        return dataSourceProperties.initializeDataSourceBuilder().build();
    }

    // Secondary datasource: Spring Batch/Task metadata, bound to spring.batch.datasource
    @Bean(name = "batchDataSourceProperties")
    @ConfigurationProperties("spring.batch.datasource")
    public DataSourceProperties batchDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "batchDataSource")
    public DataSource batchDataSource(
            @Qualifier("batchDataSourceProperties") DataSourceProperties batchDataSourceProperties) {
        return batchDataSourceProperties.initializeDataSourceBuilder().build();
    }
}
import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.DefaultTaskConfigurer;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.cloud.task.configuration.TaskConfigurer;

@SpringBootApplication
@EnableTask
@EnableBatchProcessing
public class BatchApplication {

    // Point Spring Cloud Task metadata at the batch database
    @Bean
    public TaskConfigurer taskConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultTaskConfigurer(dataSource);
    }

    // Point Spring Batch metadata at the batch database
    @Bean
    public BatchConfigurer batchConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    public static void main(String[] args) {
        SpringApplication.run(BatchApplication.class, args);
    }
}

Job

@Bean
public Job startJob(JobBuilderFactory jobBuilderFactory, StepBuilderFactory stepBuilderFactory, DataSource dataSource) {
    try {
        // Diagnostic: print which database the injected (primary) DataSource points to
        System.out.println(dataSource.getConnection().getMetaData().getURL());
    } catch (Exception e) {
        // TODO: handle exception
    }
    // Minimal no-op step so the method returns a valid Job
    return jobBuilderFactory.get("startJob")
            .start(stepBuilderFactory.get("step")
                    .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
                    .build())
            .build();
}

When I look at the data source, jdbc:postgresql://localhost:5432/app_db is printed when the batch job is executed locally, and jdbc:postgresql://localhost:5432/batch is printed when the batch job (task) is executed from SCDF.

I want to know how Data Flow is overriding the application's spring.datasource even though I am not passing any arguments when executing the task. Please suggest a solution to avoid the datasource being overridden.
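From what I can tell, the Data Flow server passes its own datasource connection settings to every task application it launches, so that task and job executions get recorded in the server's database. If that is right, the launched app effectively receives arguments like the following (illustrative, based on my server configuration above; the jar name is just a placeholder):

java -jar batch-application.jar \
--spring.datasource.url=jdbc:postgresql://localhost:5432/batch \
--spring.datasource.username=postgres \
--spring.datasource.password=postgres \
--spring.datasource.driver-class-name=org.postgresql.Driver

Since command-line arguments take precedence over application.yaml, that would explain why the primary datasource ends up pointing at the batch database.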
One solution I am considering is creating an AppDatasourceConfiguration bound to a custom app.datasource prefix and using that instead. But is there a way to keep using spring.datasource without it being overridden by SCDF?
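For reference, this is roughly what I have in mind (a minimal sketch; app.datasource is a custom prefix of my own, so the spring.datasource values passed in by SCDF would no longer bind to the business datasource):

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class AppDatasourceConfiguration {

    // Business datasource bound to the custom app.datasource prefix,
    // which SCDF's launch-time spring.datasource.* arguments would not touch
    @Bean
    @Primary
    @ConfigurationProperties("app.datasource")
    public DataSourceProperties appDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    public DataSource appDataSource(
            @Qualifier("appDataSourceProperties") DataSourceProperties properties) {
        return properties.initializeDataSourceBuilder().build();
    }
}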
