HDF5 warnings when accessing an xarray Dataset
I'd like to understand what is causing the warning messages that I'm getting in the following scenario:
In an earlier operation I created some NetCDF files and saved them to disk using `xarray.to_netcdf()`.
Lazy evaluation of these datasets is perfectly fine in a Jupyter notebook and I receive no warnings/errors when:
- opening these `.nc` files via `ds = xarray.open_mfdataset('/path/to/files/*.nc')`
- loading dimension data into memory via `ds.time.values`
- lazy selection via `ds.sel(time=starttime)`
I seem to be able to do everything I want when making calculations on memory-loaded data. However, I often receive the same set of errors when:
- loading data to plot via `ds.sel(time=starttime).SCALAR_DATA.plot()`
- extracting/loading data via `ts = pd.Series(ds.SCALAR_DATA.loc[:, y, x], index=other_data.index)`
Note that despite these warnings the operations I perform do result in the desired outcomes (plots, timeseries structures, etc.).
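The steps above can be reproduced end to end with a small synthetic file in place of the real data (the file name, coordinates, and values below are made up for illustration; only the final `.values` access actually reads from disk, which is the point where the HDF5-DIAG messages appear):

```python
# Sketch of the question's workflow against a tiny synthetic NetCDF file.
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range("2020-01-01", periods=4)
ds = xr.Dataset(
    {"SCALAR_DATA": (("time", "y", "x"), np.arange(36.0).reshape(4, 3, 3))},
    coords={"time": times, "y": np.arange(3), "x": np.arange(3)},
)
ds.to_netcdf("scalar_demo.nc")

opened = xr.open_mfdataset("scalar_demo.nc")  # lazy open: nothing read yet
subset = opened.sel(time=times[0])            # lazy selection: still no read
loaded = subset.SCALAR_DATA.values            # forces the actual read from disk
opened.close()
```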
The common denominator in generating the following errors seems to be loading data from the opened dataset. EDIT: After some further experimentation, it seems that the package versions in my working environment may be causing conflicts among the packages that depend on HDF5.
The following errors repeat some number of times.
HDF5-DIAG: Error detected in HDF5 (1.12.2) thread 1:
#000: H5A.c line 528 in H5Aopen_by_name(): can't open attribute
major: Attribute
minor: Can't open object
#001: H5VLcallback.c line 1091 in H5VL_attr_open(): attribute open failed
major: Virtual Object Layer
minor: Can't open object
#002: H5VLcallback.c line 1058 in H5VL__attr_open(): attribute open failed
major: Virtual Object Layer
minor: Can't open object
#003: H5VLnative_attr.c line 130 in H5VL__native_attr_open(): can't open attribute
major: Attribute
minor: Can't open object
#004: H5Aint.c line 545 in H5A__open_by_name(): unable to load attribute info from object header
major: Attribute
minor: Unable to initialize object
#005: H5Oattribute.c line 494 in H5O__attr_open_by_name(): can't locate attribute: '_QuantizeBitGroomNumberOfSignificantDigits'
major: Attribute
minor: Object not found
...
HDF5-DIAG: Error detected in HDF5 (1.12.2) thread 2:
#000: H5A.c line 528 in H5Aopen_by_name(): can't open attribute
major: Attribute
minor: Can't open object
#001: H5VLcallback.c line 1091 in H5VL_attr_open(): attribute open failed
major: Virtual Object Layer
minor: Can't open object
#002: H5VLcallback.c line 1058 in H5VL__attr_open(): attribute open failed
major: Virtual Object Layer
minor: Can't open object
#003: H5VLnative_attr.c line 130 in H5VL__native_attr_open(): can't open attribute
major: Attribute
minor: Can't open object
#004: H5Aint.c line 545 in H5A__open_by_name(): unable to load attribute info from object header
major: Attribute
minor: Unable to initialize object
#005: H5Oattribute.c line 476 in H5O__attr_open_by_name(): can't open attribute
major: Attribute
minor: Can't open object
#006: H5Adense.c line 394 in H5A__dense_open(): can't locate attribute in name index
major: Attribute
minor: Object not found
Any suggestions on what might be causing these would be greatly appreciated.
3 Answers
These warnings could be caused by `netcdf4` version 1.6.x. Downgrading to `netcdf4=1.5.8` fixed the issue in my case. See also https://github.com/SciTools/iris/issues/5187
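If you manage the environment with conda, the pin suggested above can be applied directly (a sketch; the channel and exact version you need may differ in your setup):

```shell
# Pin netcdf4 below the 1.6.x series, as suggested in this answer
conda install -c conda-forge "netcdf4=1.5.8"
```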
I was struggling with very similar errors the past few days and eventually discovered that restricting my dask client to use 1 thread per worker solved the problem.
Worth a shot if jpolly's solution doesn't work for you (in my case, I'm not using conda...)
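One way to set up such a client (a sketch, not this answer's original snippet; worker counts are illustrative) is via `dask.distributed.Client`, which serializes HDF5 access within each worker since many HDF5 builds are not thread-safe:

```python
# Restrict each dask worker to a single thread before opening datasets.
from dask.distributed import Client

client = Client(n_workers=2, threads_per_worker=1)
# ...open files with xarray.open_mfdataset(...) as usual; computations
# scheduled on this client now use one thread per worker process.
client.close()
```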
Letting `conda` solve the dependencies between the various packages really ended up being the solution that got rid of these warnings for me. When I'd manually installed all the various packages on top of one another, without carefully specifying versions or letting `conda` solve the dependencies, the warnings persisted. EDIT: There is a nice explanation of this in this answer.
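In practice this means requesting everything in one solve rather than installing incrementally; a fresh environment along these lines (the environment name and package list are illustrative) might look like:

```shell
# Let conda resolve all HDF5-dependent packages together in a new environment
conda create -n xarray-env -c conda-forge xarray dask netcdf4 matplotlib
conda activate xarray-env
```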