
I want to parse a large number of examples/sentences using a SyntaxNet model. Currently I am using the Python subprocess module to run demo.sh for each example, which takes a lot of time because it appears to reload all the trained models on every call. Is there an alternative to this?

1 Answer


The SyntaxNet README has instructions on how to process whole files instead of single sentences:

To change the pipeline to read and write to specific files (as opposed to piping through stdin and stdout), we have to modify the demo.sh to point to the files we want.

https://github.com/tensorflow/models/tree/master/syntaxnet#annotating-a-corpus
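Alternatively, since demo.sh reads from stdin by default, you can keep the script unmodified and feed all your sentences through a single subprocess call, so the models are loaded only once. A minimal sketch (the `parse_batch` helper and the `syntaxnet/demo.sh` path in the comment are illustrative assumptions):

```python
import subprocess

def parse_batch(cmd, sentences):
    """Run the parser command once over all sentences (one per line
    on stdin), so the trained models are loaded a single time."""
    result = subprocess.run(
        cmd,
        input="\n".join(sentences) + "\n",
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

# Hypothetical usage, assuming you run it from the syntaxnet directory:
# output = parse_batch(["bash", "syntaxnet/demo.sh"],
#                      ["The dog ran home.", "She reads books."])
```

This amortizes the model-loading cost over the whole batch rather than paying it once per sentence.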

m33lky