OutOfMemoryError when indexing files with SolrJ
I wrote a simple program with SolrJ that indexes files, but after about a minute it crashes with java.lang.OutOfMemoryError: Java heap space.
I use Eclipse and my machine has about 2 GB of memory. I set -Xms1024M -Xmx2048M in the VM arguments for both Tomcat and my application's debug configuration, uncommented maxBufferedDocs in solrconfig.xml and set it to 100, then ran the application again, but it still crashes quickly once it reaches files larger than 500 MB.
Is there any configuration that lets SolrJ index large files? The relevant part of my SolrJ code is below:
import java.io.File;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.request.AbstractUpdateRequest.ACTION;
import org.apache.solr.client.solrj.request.ContentStreamUpdateRequest;

String urlString = "http://localhost:8983/solr/file";
CommonsHttpSolrServer solr = new CommonsHttpSolrServer(urlString);
ContentStreamUpdateRequest req = new ContentStreamUpdateRequest("/update/extract");
req.addFile(file);
req.setParam("literal.id", file.getAbsolutePath());
req.setParam("literal.name", file.getName());
req.setAction(ACTION.COMMIT, true, true);
solr.request(req);
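One thing worth knowing about this request: req.addFile(file) makes the client read and upload the file's content over HTTP. For very large files, an alternative is Solr's remote streaming, where only the path is sent and Solr opens the file itself. This is a hedged sketch, not the asker's code: it assumes enableRemoteStreaming="true" is set on the requestParsers element in solrconfig.xml, that Solr runs on the same host as the files, and it cannot be verified without a running Solr instance (the class name is illustrative):

```java
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.request.AbstractUpdateRequest.ACTION;
import org.apache.solr.client.solrj.request.ContentStreamUpdateRequest;

public class RemoteStreamIndex {
    public static void main(String[] args) throws Exception {
        CommonsHttpSolrServer solr =
                new CommonsHttpSolrServer("http://localhost:8983/solr/file");
        ContentStreamUpdateRequest req =
                new ContentStreamUpdateRequest("/update/extract");
        // Pass only the path; Solr opens the file itself instead of
        // receiving the content over HTTP from the client JVM.
        // Requires enableRemoteStreaming="true" in solrconfig.xml and
        // a path readable on the Solr host.
        String path = args[0];
        req.setParam("stream.file", path);
        req.setParam("literal.id", path);
        req.setAction(ACTION.COMMIT, true, true);
        solr.request(req);
    }
}
```

Note that extraction still happens server-side, so the Solr/Tomcat JVM still needs enough heap for Tika to process the file.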
Comments (2)
Are you also setting the heap size parameters when running the Java class in Eclipse?
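A quick way to check this is to print the maximum heap the JVM actually received, which confirms whether the -Xmx from the Eclipse run configuration took effect (the class name is illustrative):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() reports the heap ceiling (-Xmx) the JVM is running with
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");
    }
}
```

If this prints far less than 2048 MB, the VM arguments were set on a different launch configuration than the one actually being run.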
Is Solr also running on the same machine as SolrJ? There might be memory constraints on the machine where you are running Solr. How much free memory do you have once Solr has started? You will probably need more memory available on that box.
Try committing after every document and see if that works around the problem temporarily.
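The commit-per-document suggestion can be sketched against the same CommonsHttpSolrServer API used in the question. This is a hedged sketch only: the class name and the directory argument are illustrative, and it requires SolrJ on the classpath plus a running Solr instance:

```java
import java.io.File;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.request.AbstractUpdateRequest.ACTION;
import org.apache.solr.client.solrj.request.ContentStreamUpdateRequest;

public class IndexOneByOne {
    public static void main(String[] args) throws Exception {
        CommonsHttpSolrServer solr =
                new CommonsHttpSolrServer("http://localhost:8983/solr/file");
        for (File file : new File(args[0]).listFiles()) {
            ContentStreamUpdateRequest req =
                    new ContentStreamUpdateRequest("/update/extract");
            req.addFile(file);
            req.setParam("literal.id", file.getAbsolutePath());
            req.setParam("literal.name", file.getName());
            // COMMIT with waitFlush/waitSearcher true: each document is
            // fully flushed before the next (possibly large) file starts.
            req.setAction(ACTION.COMMIT, true, true);
            solr.request(req);
        }
    }
}
```

This trades indexing throughput for a smaller amount of buffered state held between documents.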