Databricks DataLakeFileClient returns an error

Posted 2025-01-28 16:30:15


I have a Databricks notebook running every 5 minutes; part of its functionality is to connect to a file in Azure Data Lake Storage Gen2 (ADLS Gen2).

I get the following error in the code, but it seems to have "come out of nowhere", as the process was previously working fine. The "file = " part is written by me; all the parameters are as expected, match the correct file names/containers, and do exist in the data lake.

---> 92     file = DataLakeFileClient.from_connection_string("DefaultEndpointsProtocol=https;AccountName="+storage_account_name+";AccountKey=" + storage_account_access_key, 
     93                                                    file_system_name=azure_container, file_path=location_to_write)
     94 

/databricks/python/lib/python3.8/site-packages/azure/storage/filedatalake/_data_lake_file_client.py in from_connection_string(cls, conn_str, file_system_name, file_path, credential, **kwargs)
    116         :rtype ~azure.storage.filedatalake.DataLakeFileClient
    117         """
--> 118         account_url, _, credential = parse_connection_str(conn_str, credential, 'dfs')
    119         return cls(
    120             account_url, file_system_name=file_system_name, file_path=file_path,

/databricks/python/lib/python3.8/site-packages/azure/storage/filedatalake/_shared/base_client.py in parse_connection_str(conn_str, credential, service)
    402     if service == "dfs":
    403         primary = primary.replace(".blob.", ".dfs.")
--> 404         secondary = secondary.replace(".blob.", ".dfs.")
    405     return primary, secondary, credential

Any thoughts/help? The actual error is in the base_client.py code, but I don't even know what "secondary" is supposed to be, or why there would be an error there.
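Based on the traceback alone, one plausible reading (an assumption, not confirmed by SDK documentation) is that `parse_connection_str` returned `None` for the secondary endpoint, so calling `.replace(...)` on it raised an `AttributeError`. A minimal pure-Python illustration of that failure mode, mimicking the `.blob.` → `.dfs.` swap shown in the traceback:

```python
def to_dfs_endpoints(primary, secondary):
    # Mimics the endpoint rewrite seen in base_client.py's parse_connection_str:
    # swap the blob endpoint for the dfs (Data Lake) endpoint.
    primary = primary.replace(".blob.", ".dfs.")
    secondary = secondary.replace(".blob.", ".dfs.")  # fails if secondary is None
    return primary, secondary

# With both endpoints present, this works as expected:
primary, secondary = to_dfs_endpoints(
    "https://acct.blob.core.windows.net",
    "https://acct-secondary.blob.core.windows.net",
)
print(primary)  # https://acct.dfs.core.windows.net

# If the connection string did not yield a secondary endpoint,
# secondary may be None, reproducing the kind of error in the traceback:
try:
    to_dfs_endpoints("https://acct.blob.core.windows.net", None)
except AttributeError as exc:
    print(exc)
```

This is only an illustration of how the line in the traceback could fail; the actual parsing logic inside the SDK may differ.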


Comments (1)

客…行舟 2025-02-04 16:30:15


For some reason, something changed after restarting the cluster, and the following "endpoint suffix" was required for this to continue working. I couldn't find any docs on why it worked without this before, but until a few days ago it had always worked:

"DefaultEndpointsProtocol=https;AccountName="+storage_account_name+";AccountKey="+storage_account_access_key+";EndpointSuffix=core.windows.net"