
I need to index an existing Postgres database in Elasticsearch. For this purpose I have set up Elasticsearch 7.17.4, Kibana 7.17.4, and Logstash 7.17.4 on my local machine. I have downloaded a CSV file of the posts table from the database, which contains almost 62k rows. I have set up the Logstash config file as per the requirements in the Logstash docs, i.e.

input {
  file {
    path => "/Users/manzoorfaisal/Desktop/Laptop-Migration-2/logstash-7.17.4/source-posts.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    type => "doc"
  }
}

filter {
  csv {
    separator => ","
    columns => ["id","createat","updateat","deleteat","userid","channelid","rootid","originalid","message","type","props","hashtags","filenames","fileids","hasreactions","editat","ispinned","remoteid"]
    skip_header => true
  }

  mutate {
    remove_field => ["updateat","deleteat","rootid","originalid","props","filenames","fileids","hasreactions","editat","ispinned","remoteid","@timestamp","@version","host","path"]
    add_field => {
      "teamid" => "new_value"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "post_index-7"
    document_id => "%{id}"
  }

  stdout { codec => rubydebug }
}
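
I run the pipeline like this (the config file name posts-csv.conf is just what I called it locally):

# run from the Logstash installation directory
bin/logstash -f posts-csv.conf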

It indexes almost 10k rows and after that throws an error in the Logstash terminal (error screenshot omitted here).

I need to create an index of the existing database in Elasticsearch using Logstash or any other method; with Logstash I am facing the issue above. Can anyone help me get past this issue, or suggest another reliable method for indexing existing Postgres tables into an Elasticsearch index?
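
For context, one alternative I am considering is Logstash's jdbc input plugin, which reads rows directly from Postgres instead of going through a CSV export. A minimal sketch of what that config might look like (the connection string, credentials, driver path, and SELECT statement below are placeholders, not my actual setup):

input {
  jdbc {
    # Placeholder connection details; adjust for the real database
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "postgres"
    jdbc_password => "password"
    # Path to the PostgreSQL JDBC driver jar, downloaded separately
    jdbc_driver_library => "/path/to/postgresql-42.x.x.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    # Select only the columns I actually want indexed
    statement => "SELECT id, createat, userid, channelid, message, type, hashtags FROM posts"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "post_index-7"
    document_id => "%{id}"
  }
}

Would this be a more robust approach than the CSV route, or is there a better way?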
