I have a CSV file with 1000 rows and 3 columns, as follows:
field1,field2,field3
ABC,A65,ZZZ
...
I want to load its content into the myrecords mapping of the index myindex
(the index has other mappings as well):
PUT /myindex
{
  "mappings": {
    "myrecords": {
      "_all": {
        "enabled": false
      },
      "properties": {
        "field1": { "type": "keyword" },
        "field2": { "type": "keyword" },
        "field3": { "type": "keyword" }
      }
    }
  }
}
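For clarity, this is roughly the shape I expect each indexed document to take, written here as a bulk request sketch (the _id values are just placeholders, I don't actually need specific IDs):
POST /_bulk
{ "index": { "_index": "myindex", "_type": "myrecords", "_id": "1" } }
{ "field1": "ABC", "field2": "A65", "field3": "ZZZ" }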
Is there any easy way to do it?
UPDATE:
I ran the Logstash config below, but even though the CSV is small (1000 rows), the process keeps running indefinitely. When I execute GET /myindex/myrecords/_search (the full request is shown after the config), I only ever see 1 record.
input {
  file {
    path => ["/usr/develop/data.csv"]
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["field1","field2","field3"]
    separator => ","
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    action => "index"
    hosts => ["127.0.0.1:9200"]
    index => "myindex"
    document_type => "myrecords"
    document_id => "%{Id}" # Here I also tried "%{field1}"
    workers => 1
  }
}
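And here is the full search request I use to check the result; the size and match_all are only there to make sure the default page size of 10 isn't hiding anything:
GET /myindex/myrecords/_search
{
  "size": 100,
  "query": { "match_all": {} }
}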