
I keep getting this error while trying to parse a CSV file. I am wondering if I am missing a library or something.

I am running this with the logstash.bat -f logstash.conf command in the Windows command line and getting the output below.

I am trying to output using the rubydebug codec.

21:19:03.781 [main] INFO  logstash.setting.writabledirectory - Creating directory {:setting=>"path.queue", :path=>"C:/Users/Public/logstash-5.2.1/data/queue"}

21:19:03.787 [LogStash::Runner] INFO  logstash.agent - No persistent UUID file found. Generating new UUID {:uuid=>"0546332b-dc4d-4916-b5c6-7900d1fdd8a4", :path=>"C:/Users/Public/logstash-5.2.1/data/uuid"}

21:19:04.138 [[main]-pipeline-manager] ERROR logstash.agent - Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["C:/Users/Public/logstash-5.2.1/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:178:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "C:/Users/Public/logstash-5.2.1/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:172:in `register'", "C:/Users/Public/logstash-5.2.1/logstash-core/lib/logstash/pipeline.rb:235:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "C:/Users/Public/logstash-5.2.1/logstash-core/lib/logstash/pipeline.rb:235:in `start_workers'", "C:/Users/Public/logstash-5.2.1/logstash-core/lib/logstash/pipeline.rb:188:in `run'", "C:/Users/Public/logstash-5.2.1/logstash-core/lib/logstash/agent.rb:302:in `start_pipeline'"]}

A single line from the log:

80,17-02-2017 18:28:31,56.000,45.000,0.000,2.000,0.000,44.000,55.000,57.000,50.000

A few lines from the log:

80,17-02-2017 18:28:31,56.000,45.000,0.000,2.000,0.000,44.000,55.000,57.000,50.000

80,17-02-2017 18:28:32,53.000,45.000,0.000,3.000,0.000,54.000,43.000,54.000,43.000

80,17-02-2017 18:28:33,56.000,45.000,0.000,2.000,0.000,45.000,51.000,43.000,50.000

80,17-02-2017 18:28:34,53.000,45.000,0.000,1.000,0.000,42.000,47.000,48.000,48.000

80,17-02-2017 18:28:35,59.000,45.000,0.000,2.000,0.000,45.000,59.000,39.000,48.000

80,17-02-2017 18:28:36,56.000,45.000,0.000,3.000,0.000,44.000,49.000,50.000,50.000

My filter:

filter {
     csv {
         columns => ["port", "timestamp", "tempcpuavg", "gputemp", "fanspeed", "gpuusage", "framerate", "tempcpu1", "tempcpu2", "tempcpu3", "tempcpu4"]
                     #80,   17-02-2017 18:28:31,56.000,    45.000,     0.000,      2.000,        0.000,     44.000,      55.000,   57.000,       50.000
         separator => ","
         skip_empty_columns => "true"
         remove_field => ["message"]
     }
     mutate {
         convert => ["port", "integer"]
         convert => ["tempcpuavg", "double"]
         convert => ["gputemp", "double"]
         convert => ["fanspeed", "double"]
         convert => ["gpuusage", "double"]
         convert => ["framerate", "double"]
         convert => ["tempcpu1", "double"]
         convert => ["tempcpu2", "double"]
         convert => ["tempcpu3", "double"]
         convert => ["tempcpu4", "double"]
     }

     date {
         match => ["@timestamp", "MM-dd-YYYY HH:mm:ss"]
     }
 }
Val
ScipioAfricanus

1 Answer


Your mutate/convert filter uses a data type that is not supported, namely double. The documentation of the mutate filter states:

Valid conversion targets are: integer, float, string, and boolean.

So you simply need to change every double to float:

 mutate {
     convert => ["port", "integer"]
     convert => ["tempcpuavg", "float"]
     convert => ["gputemp", "float"]
     convert => ["fanspeed", "float"]
     convert => ["gpuusage", "float"]
     convert => ["framerate", "float"]
     convert => ["tempcpu1", "float"]
     convert => ["tempcpu2", "float"]
     convert => ["tempcpu3", "float"]
     convert => ["tempcpu4", "float"]
 }
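As a side note, the date filter in your question looks suspect too: it has an unbalanced quote around @timestamp, and the sample timestamps (17-02-2017 18:28:31) are day-first while the pattern MM-dd-YYYY is month-first. Assuming you want to parse the timestamp column defined in your csv filter, a sketch of what it might look like instead:

     date {
         # "timestamp" is the csv column holding e.g. 17-02-2017 18:28:31,
         # so the pattern must be day-first
         match => ["timestamp", "dd-MM-yyyy HH:mm:ss"]
     }

The date filter writes the parsed value into @timestamp by default, so there is no need to reference @timestamp inside match.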
Val
  • Hey, still getting the same error. I have not installed Logstash on my Windows PC; I am only running it in the console using logstash -f logstash.yml. I am wondering if I need to install anything... – ScipioAfricanus Feb 18 '17 at 06:35
  • Hey, I found my mistake... there were 15 columns in my data and I was trying to ingest 16. Sorry to waste your time. But after fixing that, I am still not seeing any output... – ScipioAfricanus Feb 20 '17 at 00:24
  • Hey, I created a new post. http://stackoverflow.com/questions/42336205/logstash-csvparsefailure-and-dateparsefailure – ScipioAfricanus Feb 20 '17 at 05:03
  • Before proceeding, shall we close this post then? – Val Feb 20 '17 at 05:04
  • Yes, I tried to delete this, but it didn't let me as it had a correct answer attached to it. Thanks. – ScipioAfricanus Feb 20 '17 at 05:12