Multi-step incremental load and processing using Azure Data Factory
I want to achieve an incremental load/processing flow with Azure Data Factory, storing the data in different places after each step, e.g.:
External data source (data is structured) -> ADLS (Raw) -> ADLS (Processed) -> SQL DB
Hence, I will need to extract a sample of the raw data from the source based on the current date, store it in an ADLS container, then process that same sample data, store the output in another ADLS container, and finally append the processed result to a SQL DB.
ADLS raw:
2022-03-01.txt
2022-03-02.txt
ADLS processed:
2022-03-01-processed.txt
2022-03-02-processed.txt
SQL DB:
All the .txt files in the ADLS processed container will be appended and stored in the SQL DB.
Hence, I would like to check: what would be the best way to achieve this in a single pipeline that runs in batches?
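To make the naming concrete, I am thinking of deriving the Raw file name from the current date via a sink-dataset parameter along these lines (the RawFileDataset name and fileName parameter are just placeholders):

```json
{
  "referenceName": "RawFileDataset",
  "type": "DatasetReference",
  "parameters": {
    "fileName": {
      "value": "@concat(formatDateTime(utcNow(), 'yyyy-MM-dd'), '.txt')",
      "type": "Expression"
    }
  }
}
```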
Comments (2)
You can achieve this using a dynamic pipeline, as follows:
Create a Config / Metadata table in the SQL DB holding details such as the source table name, source name, etc.
Create a pipeline as follows:
a) Add a Lookup activity whose query reads from your Config table
https://learn.microsoft.com/en-us/azure/data-factory/control-flow-lookup-activity
b) Add a ForEach activity and use the Lookup output as the input to the ForEach
https://learn.microsoft.com/en-us/azure/data-factory/control-flow-for-each-activity
c) Inside the ForEach, add a Switch activity where each Switch case distinguishes a table or source
d) In each case, add a Copy activity (or whatever other activities you need) to create the file in the Raw layer
e) Add another ForEach to the pipeline for the Processed layer, with inner activities similar to those for the Raw layer; this is where you add your processing logic
This way you can create a single pipeline, and a dynamic one at that, which performs the necessary operations for all sources (a skeleton of steps a) to c) is sketched below).
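For illustration, a minimal JSON skeleton of steps a) to c) could look like this; ConfigDataset, the dbo.PipelineConfig table, the SourceName column, and the SourceA datasets are placeholder names you would adapt, and the second ForEach for the Processed layer is omitted for brevity:

```json
{
  "name": "IncrementalLoadPipeline",
  "properties": {
    "activities": [
      {
        "name": "LookupConfig",
        "type": "Lookup",
        "description": "Read all source rows from the Config table (firstRowOnly=false returns every row)",
        "typeProperties": {
          "source": {
            "type": "AzureSqlSource",
            "sqlReaderQuery": "SELECT SourceName, SourceTable FROM dbo.PipelineConfig"
          },
          "dataset": { "referenceName": "ConfigDataset", "type": "DatasetReference" },
          "firstRowOnly": false
        }
      },
      {
        "name": "ForEachSource",
        "type": "ForEach",
        "description": "Iterate over each config row returned by the Lookup",
        "dependsOn": [ { "activity": "LookupConfig", "dependencyConditions": [ "Succeeded" ] } ],
        "typeProperties": {
          "items": { "value": "@activity('LookupConfig').output.value", "type": "Expression" },
          "activities": [
            {
              "name": "SwitchOnSource",
              "type": "Switch",
              "description": "Branch per source; add one case per source or table",
              "typeProperties": {
                "on": { "value": "@item().SourceName", "type": "Expression" },
                "cases": [
                  {
                    "value": "SourceA",
                    "activities": [
                      {
                        "name": "CopySourceAToRaw",
                        "type": "Copy",
                        "description": "Land the raw extract for SourceA in the Raw container",
                        "inputs": [ { "referenceName": "SourceA_Dataset", "type": "DatasetReference" } ],
                        "outputs": [ { "referenceName": "RawFileDataset", "type": "DatasetReference" } ],
                        "typeProperties": {
                          "source": { "type": "SqlServerSource" },
                          "sink": { "type": "DelimitedTextSink" }
                        }
                      }
                    ]
                  }
                ],
                "defaultActivities": []
              }
            }
          ]
        }
      }
    ]
  }
}
```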
You can't rename multiple files at once, so you have to copy the files one after the other.
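As a sketch of that copy-as-rename approach, the sink dataset can take the target file name as a parameter (ProcessedFileDataset and AdlsGen2LinkedService are placeholder names):

```json
{
  "name": "ProcessedFileDataset",
  "properties": {
    "description": "Sink dataset; the target file name is passed in per copy",
    "type": "DelimitedText",
    "linkedServiceName": { "referenceName": "AdlsGen2LinkedService", "type": "LinkedServiceReference" },
    "parameters": { "fileName": { "type": "string" } },
    "typeProperties": {
      "location": {
        "type": "AzureBlobFSLocation",
        "fileSystem": "processed",
        "fileName": { "value": "@dataset().fileName", "type": "Expression" }
      }
    }
  }
}
```

Inside a ForEach over the Raw files (for example, driven by a Get Metadata activity's childItems), the Copy activity would then pass @replace(item().name, '.txt', '-processed.txt') for fileName, producing the -processed copy of each file one by one.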