7

I am trying to get my head around using Stanford CoreNLP. I used some code from the web to understand what is going on with the coreference tool. I tried running the project in Eclipse but keep encountering an out-of-memory exception. I tried increasing the heap size, but it makes no difference. Any ideas on why this keeps happening? Is this a code-specific problem? Any directions on using CoreNLP would be awesome.

EDIT - Code Added

import edu.stanford.nlp.dcoref.CorefChain;
import edu.stanford.nlp.dcoref.CorefCoreAnnotations;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;


import java.util.Iterator;
import java.util.Map;
import java.util.Properties;


public class testmain {

    public static void main(String[] args) {

        String text = "Viki is a smart boy. He knows a lot of things.";
        Annotation document = new Annotation(text);
        // Build the pipeline; the dcoref annotator also needs lemma and ner upstream.
        Properties props = new Properties();
        props.put("annotators", "tokenize, ssplit, pos, lemma, ner, parse, dcoref");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
        pipeline.annotate(document);


        Map<Integer, CorefChain> graph = document.get(CorefCoreAnnotations.CorefChainAnnotation.class);



        // The chain map is keyed by Integer; looking entries up with a String key
        // would make graph.get() return null and throw a NullPointerException.
        Iterator<Integer> itr = graph.keySet().iterator();

        while (itr.hasNext()) {

             Integer key = itr.next();

             String value = graph.get(key).toString();

             System.out.println(key + " " + value);
        }

   }
}
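
EDIT 2 - To check whether the larger heap is actually reaching the program (rather than just Eclipse itself), I can run a tiny check like the sketch below. HeapCheck is just an illustrative name; the printed value should roughly match whatever -Xmx is being passed to the run, so if it still shows the small default, the setting isn't taking effect.

public class HeapCheck {

    public static void main(String[] args) {
        // Report the maximum heap this JVM was started with. If the -Xmx value
        // from the run configuration is not being applied, this still shows the
        // small default rather than the value that was set.
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap available: " + maxMb + " MB");
    }
}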
viki.omega9

3 Answers

4

I ran into a similar problem when building a small application with Stanford CoreNLP in Eclipse.
Increasing Eclipse's own heap size will not solve your problem.
After some searching, it appears to be the Ant build tool's heap size that should be increased, but I have no idea how to do that.
So I gave up on Eclipse and used NetBeans instead.

PS: You will eventually get an out-of-memory exception with the default settings in NetBeans as well, but that is easily solved by adjusting the -Xmx setting on a per-application basis.

Khairul
  • Thank you! That's a life saver! This memory issue was driving me nuts! How did you figure out that the problem was with Ant? – viki.omega9 Jan 23 '12 at 12:17
  • I did some googling. Let me know if you find a way to set the Ant heap size. – Khairul Jan 23 '12 at 12:24
  • [This](http://soenkerohde.com/2008/06/change-eclipse-ant-settings-when-you-run-out-of-memory/) website has an answer, but I'm not sure why it still isn't working. Also, if you are online, I'd like to have a chat with you! – viki.omega9 Jan 23 '12 at 12:25
  • I can't at the moment; I'm about to head out. You could send me an email at [kyu dot helf at gmail dot com] – Khairul Jan 23 '12 at 12:42
  • Is changing to NetBeans the only way to avoid the OutOfMemory exception? – Ishank May 24 '12 at 06:37
  • As far as solutions go, switching the IDE is probably the worst. You can easily set the heap size on a per-configuration basis. Open the run/debug configurations and select the "Arguments" tab. http://img.viralpatel.net/2009/10/tomcat-jvm-config.png – Jonny Nov 21 '13 at 16:53
3

Fix for Eclipse: you can configure this in the Eclipse preferences as follows:

  1. Window -> Preferences (on Mac it's: Eclipse -> Preferences)
  2. Java -> Installed JREs
  3. Select the JRE and click Edit
  4. In the Default VM arguments field, type "-Xmx1024M" (or your memory preference; for 1 GB of heap it's 1024)
  5. Click Finish or OK.
eddie-ryan
user1374131
2

I think you can define the heap size under the VM arguments in right-click -> Run -> Run Configurations. I have tested it on a Mac and it works.
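
For context, most of the heap typically goes to loading the pipeline's models (tagger, parser, NER, coreference) rather than to the short input sentence, which is why the VM arguments on the run configuration are what matter. Below is a rough sketch of how one could see this, assuming CoreNLP and its models are on the classpath; MemoryFootprint is just an illustrative name, and the program itself needs the larger -Xmx to finish.

import edu.stanford.nlp.pipeline.StanfordCoreNLP;

import java.util.Properties;

public class MemoryFootprint {

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long before = rt.totalMemory() - rt.freeMemory();

        // Constructing the pipeline loads the tagger, parser, NER and coreference
        // models; for a short input this is where most of the heap goes, not annotate().
        Properties props = new Properties();
        props.put("annotators", "tokenize, ssplit, pos, lemma, ner, parse, dcoref");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

        long after = rt.totalMemory() - rt.freeMemory();
        System.out.println("Approximate heap used by the loaded pipeline: "
                + (after - before) / (1024 * 1024) + " MB");
    }
}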

Akash