Upload content to Azure Data Lake Storage
The common process to upload a file to the storage is to:
- create a new file
- append content
- flush data
The problem I have is that the storage emits the create-file event, which is used by Databricks, and the files are not "consumed" after the data flush.
Is it possible to create/upload a file together with its content, like the upload-file functionality in the Azure Portal?
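For reference, a minimal sketch of that three-step sequence using the Azure.Storage.Files.DataLake .NET SDK; the connection string, file system, and path are placeholders, not values from the question:

```csharp
// Sketch of the create -> append -> flush sequence described in the question.
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Files.DataLake;

class ThreeStepUpload
{
    static async Task Main()
    {
        // Placeholder connection details.
        var file = new DataLakeFileClient(
            "<connection-string>", "<filesystem>", "folder/sample.txt");

        byte[] payload = Encoding.UTF8.GetBytes("some content");

        await file.CreateAsync();                        // 1. create new file
        using var stream = new MemoryStream(payload);
        await file.AppendAsync(stream, offset: 0);       // 2. append content
        await file.FlushAsync(position: payload.Length); // 3. flush data
    }
}
```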
2 Answers
You can achieve your requirement using Azure Logic Apps with the When a blob is added or modified (properties only) (V2) trigger. Below is the flow of my logic app.

RESULT:

I tried uploading a file to Azure Storage from Postman using the PUT method.

REFERENCES:
Uploading files to Azure Blob Storage
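For comparison, here is a hedged sketch of the same Put Blob REST call issued from C# instead of Postman; the account, container, and SAS token are placeholders:

```csharp
// Minimal sketch of the Put Blob REST call tried from Postman above.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class PutBlobSample
{
    static async Task Main()
    {
        // Placeholder SAS URL for the target blob.
        var sasUrl = "https://<account>.blob.core.windows.net/<container>/sample.txt?<sas-token>";

        using var http = new HttpClient();
        using var request = new HttpRequestMessage(HttpMethod.Put, sasUrl)
        {
            Content = new StringContent("hello from Put Blob", Encoding.UTF8)
        };
        // Put Blob requires the blob-type header; BlockBlob uploads the body in one request.
        request.Headers.Add("x-ms-blob-type", "BlockBlob");

        var response = await http.SendAsync(request);
        Console.WriteLine(response.StatusCode); // 201 Created on success
    }
}
```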
It turns out that the Flush operation has a close parameter, which is set to false by default. When all data has been appended to the file, Flush should be performed with close set to true. That marks the file as fully uploaded and triggers the Storage account event.

More info: https://learn.microsoft.com/en-us/dotnet/api/azure.storage.files.datalake.datalakefileclient.flushasync?view=azure-dotnet#parameters
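Applied to the create/append/flush sketch from the question, the fix is a single extra argument on the final call (a sketch based on the FlushAsync overload documented at the link above):

```csharp
// Commit the appended bytes AND close the file, so the file counts as
// fully uploaded and the Storage account event is raised (per the answer).
// `payload.Length` is the total number of bytes appended so far.
await file.FlushAsync(position: payload.Length, close: true);
```

The SDK also offers DataLakeFileClient.UploadAsync(stream), a one-call helper covering the create/append/flush sequence, which is the closest SDK equivalent to the "upload together with content" behavior the question asks about.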