
I'm trying to feed Logstash a CSV for indexing into Elasticsearch and I'm running into a mapping error.

I'm using autodetect_column_names so I don't have to list the column names myself. I also haven't created any index or mapping for the data from the Dev Console; I'm expecting Logstash to create the index and a dynamic mapping at run time. The conf file is:

input {
  file {
    path => "/Users/amansingh/SELECT_______orca_OpID_as_op_id________t.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    autodetect_column_names => true
    convert => {
      "is_cancelled_2"  => "boolean"
      "is_cancelled_14" => "boolean"
      "is_cancelled_7"  => "boolean"
      "is_cancelled_30" => "boolean"
      "is_cancelled"    => "boolean"
      "is_dispute"      => "boolean"
      "is_return"       => "boolean"
      "is_large_parcel" => "boolean"
      "is_managed"      => "boolean"
    }
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "bit_prices"
    document_type => "doc"
  }
  stdout {}
}
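
For reference, after a run I check what dynamic mapping actually got created, since I'm relying on Logstash/Elasticsearch to build it. This is just the standard Elasticsearch _mapping endpoint against the bit_prices index from the conf above:

curl -s 'http://localhost:9200/bit_prices/_mapping?pretty'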

The error I get is:

[2018-07-27T10:05:25,172][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bit_prices", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x5568f8e6>], :response=>{"index"=>{"_index"=>"bit_prices", "_type"=>"doc", "_id"=>"C1wO3GQByymnO3qY9KTy", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [0] of different type, current_type [text], merged_type [ObjectMapper]"}}}}
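
One thing I still need to rule out: the csv filter docs say autodetect_column_names only works with a single pipeline worker, since with several workers a data row can be consumed as the header (which might explain a field literally named "0"). A sketch of the rerun I have in mind; the -w flag is standard Logstash, deleting the half-mapped index first is my assumption, and logstash.conf stands in for my actual conf path:

curl -X DELETE 'http://localhost:9200/bit_prices'
bin/logstash -w 1 -f logstash.conf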

The CSV file looks like:

op_id,is_cancelled_2,is_cancelled_14,is_cancelled_7,is_cancelled_30,revenue_adjustment_2,revenue_adjustment_7,revenue_adjustment_14,revenue_adjustment_30,cost_adjustment_2,cost_adjustment_7,cost_adjustment_14,cost_adjustment_30,order_date,is_cancelled,update_date,is_dispute,orcompletionstatus,is_return,is_large_parcel,is_managed
1627151503,0,0,0,0,0.0000,0.0000,0.0000,0.0000,0.0000,17.5100,17.5100,17.5100,2018-02-10 13:19:19.000,0,2018-02-14 02:00:41.003,0,3,0,0,0
1627151503,0,0,0,0,0.0000,0.0000,0.0000,0.0000,0.0000,17.5100,17.5100,17.5100,2018-02-10 13:19:19.000,0,2018-02-14 02:00:41.003,0,3,0,0,0
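
If it helps isolate things, I can also index one hand-built row directly to check that the data itself maps cleanly. This is a trimmed subset of the fields from the header above, with the booleans already converted the way the convert block should produce them (plain POST to the index and doc type from the conf, which is standard in Elasticsearch 6.x):

curl -X POST 'http://localhost:9200/bit_prices/doc' \
  -H 'Content-Type: application/json' \
  -d '{"op_id": "1627151503", "is_cancelled": false, "is_return": false, "cost_adjustment_7": "17.5100", "order_date": "2018-02-10 13:19:19.000", "orcompletionstatus": "3"}'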