Pickle file
I am using Kedro in conjunction with databricks-connect to run my machine learning model. I trained and tested the model in a Databricks notebook and saved the model's pickle file to Azure Blob Storage. To test the pipeline locally, I downloaded the pickle file and stored it on disk. Kedro reads the file fine when it is stored locally; the problem is when I try to read the file into Kedro directly from Azure, I get the following error: "invalid load key, '?'".

Unfortunately I cannot post any screenshots since I am doing this for my job. My thinking is that the pickle file is stored differently while on Azure, and that when downloaded locally it is stored correctly.

For a bit more clarity: I am attempting to store a logistic regression model trained using statsmodels (sm.Logit).
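An "invalid load key" error usually means the bytes handed to `pickle.load` are not a pickle stream at all, for example because the file was transferred or opened in text mode, or an error response was saved instead of the blob. A minimal sketch of a sanity check before unpickling (the `load_model` helper and the local-path assumption are mine, not from the question):

```python
import pickle


def load_model(path):
    """Unpickle a file, failing early if the bytes look mangled."""
    # Pickle streams written with protocol 2+ begin with the byte 0x80;
    # anything else suggests the download corrupted the file.
    with open(path, "rb") as f:  # binary mode is essential
        head = f.read(1)
        f.seek(0)
        if head != b"\x80":
            raise ValueError(f"not a pickle stream (starts with {head!r})")
        return pickle.load(f)
```

If the first byte is not `0x80`, the problem is in how the file reached you (text-mode transfer, wrong blob, HTML error page), not in pickle itself.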
Comments (1)
Pickles unfortunately aren't guaranteed to work across environments; at the very least the Python version (and possibly the OS) needs to be identical. Is there a mismatch here?
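One way to make such mismatches easy to diagnose is to record the environment alongside the pickled object. A small sketch (the `dump_with_env`/`load_with_env` helpers are illustrative, not part of Kedro or statsmodels):

```python
import pickle
import platform
import sys


def dump_with_env(obj, path):
    """Pickle an object together with interpreter/OS info."""
    payload = {
        "python": sys.version,
        "platform": platform.platform(),
        "object": obj,
    }
    with open(path, "wb") as f:
        pickle.dump(payload, f)


def load_with_env(path):
    """Unpickle and report the environment the object was saved under."""
    with open(path, "rb") as f:
        payload = pickle.load(f)
    print("pickled under:", payload["python"], "on", payload["platform"])
    return payload["object"]
```

When a later load fails, the recorded versions make it immediately clear whether the Databricks cluster and the local machine were running the same Python.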