
Is there a way to bulk load the data below into Elasticsearch without modifying the original content? I want each object to become a single document. At the moment I'm using Python to parse the individual objects and POST them one at a time.

{
   {"name": "A"},
   {"name": "B"},
   {"name": "C"},
   {"name": "D"},
}

Doing this kind of per-document processing in production, from our REST servers into Elasticsearch, is taking a lot of time.

Is there a single POST/curl command that can upload the file above in one request, so that Elasticsearch parses it and turns each object into its own document?

We're using Elasticsearch 1.3.2.


1 Answer


Yes, you can use the bulk API via curl through the _bulk endpoint, but Elasticsearch won't do custom parsing of the file in its current form. Whatever process creates the file would need to format it to the bulk API specification, if that is an option. See here:

http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/docs-bulk.html
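
As a minimal sketch (assuming a local node at localhost:9200; the index name myindex and type mytype are placeholders), the reformatted file needs an action line before each document, as newline-delimited JSON ending with a final newline:

    { "index": { "_index": "myindex", "_type": "mytype" } }
    { "name": "A" }
    { "index": { "_index": "myindex", "_type": "mytype" } }
    { "name": "B" }

Then a single request can upload it:

    curl -XPOST 'http://localhost:9200/_bulk' --data-binary @bulk.json

Note that --data-binary (rather than -d) matters here, so curl preserves the newlines the bulk format requires.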

There is also bulk support in Python via the helpers module. See here: http://elasticsearch-py.readthedocs.org/en/master/helpers.html
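
A rough sketch of that approach (again, the host, index, and type names are assumptions, not from the question):

    from elasticsearch import Elasticsearch
    from elasticsearch.helpers import bulk

    # Assumes a node on localhost:9200; index/type names are placeholders.
    es = Elasticsearch()

    docs = [{"name": "A"}, {"name": "B"}, {"name": "C"}, {"name": "D"}]

    # One action per object; helpers.bulk batches them into _bulk requests.
    actions = [{"_index": "myindex", "_type": "mytype", "_source": doc}
               for doc in docs]

    success, errors = bulk(es, actions)

This avoids one HTTP round trip per document, which is usually where the time goes.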
