Java heap space out of memory error

Posted on 2024-10-12 06:13:45


I am trying to run the CoreNLP package with the following program:

package corenlp;
import edu.stanford.nlp.pipeline.*;
import java.io.IOException;
/**
 *
 * @author Karthi
 */
public class Main {

    /**
     * @param args the command line arguments
     */
    public static void main(String[] args) throws IOException, ClassNotFoundException {
        // TODO code application logic here
        String str="-cp stanford-corenlp-2010-11-12.jar:stanford-corenlp-models-2010-11-06.jar:xom-1.2.6.jar:jgrapht-0.7.3.jar -Xms3g edu.stanford.nlp.pipeline.StanfordCoreNLP [ -props <Main> ] -file <input.txt>";
        args=str.split(" ");
        StanfordCoreNLP scn=new StanfordCoreNLP();
        scn.main(args);
    }

}

I am not sure if the code itself is correct, but I am getting the following error:

    Searching for resource: StanfordCoreNLP.properties
Searching for resource: edu/stanford/nlp/pipeline/StanfordCoreNLP.properties
Loading POS Model [edu/stanford/nlp/models/pos-tagger/wsj3t0-18-left3words/left3words-distsim-wsj-0-18.tagger] ... Loading default properties from trained tagger edu/stanford/nlp/models/pos-tagger/wsj3t0-18-left3words/left3words-distsim-wsj-0-18.tagger
Reading POS tagger model from edu/stanford/nlp/models/pos-tagger/wsj3t0-18-left3words/left3words-distsim-wsj-0-18.tagger ... Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
        at edu.stanford.nlp.tagger.maxent.MaxentTagger.readModelAndInit(MaxentTagger.java:704)
        at edu.stanford.nlp.tagger.maxent.MaxentTagger.readModelAndInit(MaxentTagger.java:649)
        at edu.stanford.nlp.tagger.maxent.MaxentTagger.<init>(MaxentTagger.java:268)
        at edu.stanford.nlp.tagger.maxent.MaxentTagger.<init>(MaxentTagger.java:228)
        at edu.stanford.nlp.pipeline.POSTaggerAnnotator.loadModel(POSTaggerAnnotator.java:57)
        at edu.stanford.nlp.pipeline.POSTaggerAnnotator.<init>(POSTaggerAnnotator.java:44)
        at edu.stanford.nlp.pipeline.StanfordCoreNLP$4.create(StanfordCoreNLP.java:441)
        at edu.stanford.nlp.pipeline.StanfordCoreNLP$4.create(StanfordCoreNLP.java:434)
        at edu.stanford.nlp.pipeline.AnnotatorPool.get(AnnotatorPool.java:62)
        at edu.stanford.nlp.pipeline.StanfordCoreNLP.construct(StanfordCoreNLP.java:309)
        at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:347)
        at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:337)
        at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:329)
        at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:319)
        at corenlp.Main.main(Main.java:22)
Java Result: 1

I tried setting these values in the VM options in NetBeans, but each value produces an error:

-Xms3g

run:
Error occurred during initialization of VM
Incompatible initial and maximum heap sizes specified
Java Result: 1
BUILD SUCCESSFUL (total time: 0 seconds)

-Xmx3g

run:
Error occurred during initialization of VM
Could not create the Java virtual machine.
Could not reserve enough space for object heap
Java Result: 1
BUILD SUCCESSFUL (total time: 0 seconds)

-Xms3g -Xmx4g

run:
Could not create the Java virtual machine.
Invalid maximum heap size: -Xmx4g
The specified size exceeds the maximum representable size.
Java Result: 1
BUILD SUCCESSFUL (total time: 0 seconds)

2 Answers

怀念你的温柔 2024-10-19 06:13:46

Which OS are you running this on? Is it a 64 bit system? If not, then you are pretty much restricted when it comes to how much heap you can allocate to a single Java process. Try running with -Xms1024M -Xmx1024M and see if it solves your issue.
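
To confirm the 32-bit suspicion before fighting with heap flags, a quick diagnostic can help. The sketch below is my own addition (not part of the original answer): it prints the maximum heap the JVM actually received and the JVM's data model, using Runtime.maxMemory() and the sun.arch.data.model / os.arch system properties that Sun/Oracle HotSpot JVMs of that era expose.

package corenlp;

/**
 * Diagnostic sketch: run it with the same VM options as the CoreNLP project
 * to see the effective maximum heap and whether the JVM is 32-bit or 64-bit.
 */
public class HeapCheck {

    public static void main(String[] args) {
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Effective max heap (MB): " + maxHeapMb);
        // "32" or "64" on Sun/Oracle HotSpot; may be null on other vendors' JVMs.
        System.out.println("JVM data model: " + System.getProperty("sun.arch.data.model"));
        System.out.println("os.arch: " + System.getProperty("os.arch"));
    }
}

If the data model is 32, heap requests of 3 g to 4 g will fail exactly as shown in the question, no matter how much RAM the machine has.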

陌上芳菲 2024-10-19 06:13:46

Try with the runtime parameters:

java -cp -XX:+AggressiveHeap -jar jarfile

or

java -cp... -XX:MaxHeapFreeRatio=70 -XX:+UseLargePages -jar jarfile
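
Independently of the GC flags above, note that the -cp and -Xms3g strings assembled in the question's Main are options for the java launcher, not arguments that StanfordCoreNLP.main understands, so splitting them into args does nothing for the heap. Below is a minimal sketch of the driver under that assumption: the heap is supplied as a VM option (NetBeans: Project Properties > Run > VM Options, e.g. -Xmx1024m, matching the first answer), and input.txt is a placeholder for the real input file.

package corenlp;

import edu.stanford.nlp.pipeline.StanfordCoreNLP;

import java.io.IOException;

public class Main {

    public static void main(String[] args) throws IOException, ClassNotFoundException {
        // -Xmx/-Xms and -cp belong on the java command line or in the IDE's VM options,
        // not in this array; only CoreNLP's own flags (-props, -file, ...) go here.
        String[] coreNlpArgs = { "-file", "input.txt" };  // placeholder input file
        StanfordCoreNLP.main(coreNlpArgs);
    }
}

With the heap set that way, the -Xms1024M -Xmx1024M suggestion from the first answer (or a larger value on a 64-bit JVM) actually reaches the JVM and should be in effect when the POS tagger model is loaded.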
