Cannot allocate a large C++ std::vector that is smaller than std::vector::max_size()
I am trying to allocate a vector<bool> in C++ for 50,000,000,000 entries; however, the program errors out with:

terminate called after throwing an instance of 'std::bad_alloc'
  what(): std::bad_alloc

(or in the online compiler it just ends). I initially thought this was due to too large a size; however, v1.max_size() is greater than 50,000,000,000 for me. What's confusing, though, is that when I reduce the number of entries it works fine.

Question: What could be the root cause of this, considering that the number of entries is less than the max_size() of the vector?

Other questions/answers have suggested that similar issues are due to being on a 32-bit CPU; however, I have a 64-bit one.
#include <iostream>
#include <vector>

int main()
{
    // Note: long is 64-bit on LP64 platforms (Linux/macOS). On Windows
    // it is only 32-bit and this constant would overflow there;
    // long long or std::int64_t would be portable.
    long size = 50000000000;
    std::vector<bool> v1;
    std::cout << "max_size: " << (v1.max_size() > 50000000000) << " vs " << size << "\n";
    v1 = std::vector<bool>(size, false);
    std::cout << "vector initialised\n";
    std::cout << v1.size() << std::endl;
}
Note: I am essentially trying to create a memory-efficient bitmap to track whether certain addresses in a different data structure have been initialized. I can't use a std::bitset, as mentioned in this post, since the size isn't known at compile time.
From the documentation for std::vector::max_size: the value returned only reflects the theoretical limit imposed by the implementation. This means that std::vector::max_size is not a good indication of the actual maximum size you can allocate, because of hardware limitations. In practice, the actual maximum size will almost always be smaller, depending on the RAM available at runtime.

On current 64-bit systems this will always be the case (at least with currently available hardware), because the theoretical size of a 64-bit address space is far larger than available RAM sizes.