Getting SharePoint list data into a Python DataFrame
I have created a list in SharePoint -> My Lists. The following is the URL:
from shareplum import Site
from requests_ntlm import HttpNtlmAuth

url = 'https://xxxxx-my.sharepoint.com/personal/account@******.com/Lists/MySamplelist/AllItems.aspx?env=WebViewList'
username = 'account@*****.com'
password = 'pwd'
cred = HttpNtlmAuth(username, password)
site = Site(url, auth=cred, verify_ssl=False)
While trying to load data from SharePoint using the above URL through Site(), I am getting the error below:
ShareplumRequestError: Shareplum HTTP Post Failed : 403 Client Error: Forbidden for url:
'https://xxxxx-my.sharepoint.com/personal/account@******.com/Lists/MySamplelist/AllItems.aspx?env=WebViewList'
Please let me know what I can do to get rid of the error and load the SharePoint list data.
Currently, SharePoint is not supported in Azure Databricks.
Yes, there are a couple of alternative options to read SharePoint files in Azure Databricks.
Option 1: Transfer files from SharePoint to Blob storage with Azure Logic Apps, then mount the storage account to Azure Databricks and read the data from the storage account.
Option 2: Copy files from SharePoint into Azure Blob storage using Azure Data Factory.
You can refer to this document to learn more about mounting an Azure Blob storage container to Azure Databricks.
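Once the exported list lands in Blob storage via either option, the mounted container can be read like an ordinary file and turned into a pandas DataFrame. A minimal sketch, assuming the list was exported as CSV and mounted at a hypothetical path `/mnt/sharepoint-export` (not from the answer above); an in-memory sample stands in for the real file so the snippet runs outside Databricks:

```python
import io

import pandas as pd

# In Databricks, after dbutils.fs.mount(...), the container is visible under
# /dbfs/mnt/<mount-name>. This path is a hypothetical example:
# csv_path = "/dbfs/mnt/sharepoint-export/MySamplelist.csv"
# df = pd.read_csv(csv_path)

# Stand-in for the exported SharePoint list so the sketch runs anywhere:
sample = io.StringIO("Title,Status\nItem 1,Open\nItem 2,Closed\n")
df = pd.read_csv(sample)

print(df.shape)  # (2, 2)
```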