Trying to manually load the MNIST PyTorch dataset in Databricks
I've attempted a couple of different iterations now to get the dataset manually loaded into Databricks's DBFS so that PyTorch can load it. However, the MNIST dataset seems to just be some binary files; am I expected to unzip them first, or can I just point to the gzipped archives? So far all my trials have produced this error:
train_dataset = datasets.MNIST(
    'dbfs:/FileStore/tarballs/train_images_idx3_ubyte.gz',
    train=True,

RuntimeError: Dataset not found. You can use download=True to download it
I am aware I can set download=True, however due to the firewalls this is not an option, and I want to just upload the files and wire them in myself via DBFS... has anyone done this as well?
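One thing worth checking first: torchvision opens the dataset files with plain Python file I/O, which does not understand the "dbfs:/" URI scheme. On Databricks, the DBFS FUSE mount at /dbfs exposes the same files as ordinary POSIX paths, so a local-style path is what the loader needs. A minimal sketch of that mapping (the helper name is my own, not a Databricks API):

```python
# Hypothetical helper: map a "dbfs:/" URI to its "/dbfs" FUSE-mount path,
# since torchvision reads files with ordinary open() calls that cannot
# resolve the dbfs:/ scheme.
def dbfs_to_local(uri: str) -> str:
    """Return the /dbfs FUSE-mount path for a dbfs:/ URI."""
    if uri.startswith("dbfs:/"):
        return "/dbfs/" + uri[len("dbfs:/"):]
    return uri  # already a plain path; leave it alone

print(dbfs_to_local("dbfs:/FileStore/tarballs"))  # → /dbfs/FileStore/tarballs
```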
EDIT: @alexey suggested I need to add the extra path MNIST/raw, and then change the input to
import torch
from torchvision import datasets, transforms

train_dataset = datasets.MNIST(
    '/dbfs/FileStore/tarballs',
    train=True,
    download=False,
    transform=transforms.Compose([
        transforms.ToTensor(),
        transforms.Normalize((0.1307,), (0.3081,)),
    ]))
data_loader = torch.utils.data.DataLoader(train_dataset, batch_size=batch_size, shuffle=True)
But I get the same error.
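For what it's worth, torchvision resolves the files at <root>/MNIST/raw/ and looks for the canonical hyphenated filenames (e.g. train-images-idx3-ubyte.gz, not train_images_idx3_ubyte.gz), so a quick sanity check of the directory layout and names can rule that out. A minimal sketch, assuming the /dbfs root from above (recent torchvision versions read the .gz files directly from MNIST/raw; very old ones instead expected preprocessed .pt files):

```python
import os

# Root passed to datasets.MNIST in the question; torchvision appends
# MNIST/raw to it, so the archives must live in that subdirectory,
# not directly under the root.
root = "/dbfs/FileStore/tarballs"

# The four archives torchvision's MNIST loader looks for, with the
# canonical hyphenated names.
expected = [
    "train-images-idx3-ubyte.gz",
    "train-labels-idx1-ubyte.gz",
    "t10k-images-idx3-ubyte.gz",
    "t10k-labels-idx1-ubyte.gz",
]

raw_dir = os.path.join(root, "MNIST", "raw")
missing = [f for f in expected if not os.path.exists(os.path.join(raw_dir, f))]
print("raw dir:", raw_dir)
print("missing:", missing or "none")
```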
My code and dir: