MemoryError when computing SVD with SciPy
I am trying to use LSI to generate vectors that represent documents, using the svd routine from the SciPy library, but the program throws a MemoryError. The size of my matrix is 100 * 13057. Is that too big for my 8 GB of RAM?
I searched for this problem on Stack Overflow. Some answers say I just have to install 64-bit Python on my 64-bit OS (right now I have 32-bit Python on a 64-bit OS), but re-installing all of my libraries is too much trouble. Another suggestion was to convert the matrix to a sparse matrix.
Does anyone have an idea about this problem? Thanks!
raw_matrix = []
for text in forest_lsi:
    raw_matrix.append( text.get_vector() )
from svd import compute_svd
print("The size of raw matrix: "+str(len(raw_matrix))+" * "+str(len(raw_matrix[0])))
matrix = compute_svd( raw_matrix )
The message in the console is as follows:
The size of raw matrix: 100 * 13057
Original matrix:
[[1 1 2 ..., 0 0 0]
[0 3 0 ..., 0 0 0]
[0 0 0 ..., 0 0 0]
...,
[0 0 0 ..., 0 0 0]
[0 0 1 ..., 0 0 0]
[0 0 2 ..., 1 1 3]]
Traceback (most recent call last):
File "D:\workspace\PyQuEST\src\Practice\baseline_lsi.py", line 93, in <module>
matrix = compute_svd( raw_matrix )
File "D:\workspace\PyQuEST\src\Practice\svd.py", line 12, in compute_svd
U, s, V = linalg.svd( matrix )
File "D:\Program\Python26\lib\site-packages\scipy\linalg\decomp_svd.py", line 79, in svd
full_matrices=full_matrices, overwrite_a = overwrite_a)
MemoryError
1 Answer
Your V matrix will take 13057*13057*8 bytes of memory if you're using the default dtype=np.float, which is approximately 1.4 GB. My hunch is that that's too large for your 32-bit Python. Try using 32-bit floating-point numbers, i.e. dtype=np.float32, to cut memory use in half, or start using scipy.sparse (almost always a good idea for information retrieval problems).
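A minimal sketch of the two suggestions above (float32 and scipy.sparse), plus scipy.linalg.svd's full_matrices=False option, which by itself avoids allocating the 13057 * 13057 V matrix. The random raw_matrix below is only a stand-in for the asker's data, since forest_lsi and compute_svd are not shown in full.
import numpy as np
from scipy import linalg
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import svds

# Stand-in for the asker's 100 * 13057 document-term count matrix.
raw_matrix = np.random.randint(0, 4, size=(100, 13057))

# Option 1: dense SVD in float32 with the thin (economy) decomposition.
# full_matrices=False keeps V at shape (100, 13057) instead of
# (13057, 13057), and float32 halves the remaining memory use.
dense = np.asarray(raw_matrix, dtype=np.float32)
U, s, Vt = linalg.svd(dense, full_matrices=False)
print(U.shape, s.shape, Vt.shape)   # (100, 100) (100,) (100, 13057)

# Option 2: sparse matrix plus truncated SVD, keeping only the top k
# singular triplets (the usual setup for LSI).
k = 50
sparse = csr_matrix(dense)
Uk, sk, Vtk = svds(sparse, k=k)
doc_vectors = Uk * sk               # 100 * 50 LSI document vectors
print(doc_vectors.shape)
Either route stays well within 8 GB of RAM even on 32-bit Python, and the truncated sparse SVD is closer to what LSI actually needs, since only the leading singular vectors are kept.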