
I have the following .conf file for Logstash:

input {
    file {
        path => "C:/elastic/logstash-8.3.2/config/*.csv"
        start_position => "beginning"
        sincedb_path => "NULL"
    }
}
filter {
    csv {
        separator => ";"
        columns => ["name","deposit","month"]
    }
    mutate {
        convert => {
            "deposit" => "integer"
        }
    }
}
output {
    elasticsearch {
        hosts => "http://localhost:9200"
        index => "payment_test"
    }
    stdout {}
}

I get inputs from 10 .csv files, which have names like in-0.csv, in-1.csv and so on. I want the index names in ElasticSearch to be payment_test-0, payment_test-1 and so on for the corresponding .csv input files (the data in in-0.csv would be in index payment_test-0 and so on). How can I achieve this?
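For context: Logstash output options support sprintf-style field references like `%{fieldname}`, so the usual approach is to derive a field from the file path in a filter and reference it in the index name. A sketch of just the output side, assuming a filter has already added a field named `file_no` (a hypothetical name) to each event:

```
output {
    elasticsearch {
        hosts => "http://localhost:9200"
        # %{file_no} is substituted per event, so in-0.csv events
        # would go to payment_test-0, in-1.csv to payment_test-1, etc.
        index => "payment_test-%{file_no}"
    }
}
```

The answers below show two ways to populate such a field from the file path.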

  • Does this answer your question? [Logstash output to custom index if file path is "/file/path"](https://stackoverflow.com/questions/70112244/logstash-output-to-custom-index-if-file-path-is-file-path) – Sagar Patel Jul 29 '22 at 09:35
  • @SagarPatel Unfortunately not; I don't use Beats, and I want the index names to increment like `payment-test-0`, `payment-test-1`, `payment-test-2`, etc. according to their input file names. Thanks for the suggestion, though. – Yağız Can Aslan Jul 29 '22 at 09:51
  • The same approach applies to Logstash as well. You can add an if condition in the output section that checks the file name and sets the index name based on it. – Sagar Patel Jul 29 '22 at 10:04
  • @SagarPatel If I add if conditions for each file, wouldn't I need 10 if statements for 10 files? Can't I just get the last part of the filename, like `filename.substr(3)` in JavaScript, and use the result in the index name as a variable like `%{customName}`? – Yağız Can Aslan Jul 29 '22 at 10:15
  • Yes, you can do that. In your filter section, add a mutate plugin that creates a new field, then reference that field in the index name inside the output plugin. – Sagar Patel Jul 29 '22 at 10:16
  • @SagarPatel I am very new to the ELK stack and Logstash, can you give an example please? – Yağız Can Aslan Jul 29 '22 at 10:17
  • Please check my answer. I hope it helps. – Sagar Patel Jul 29 '22 at 10:38

2 Answers


I would simply do it like this with the dissect filter instead of grok:

filter {
    ... your other filters

    dissect {
      mapping => {
        "[log][file][path]" => "%{?ignore_path}/in-%{file_no}.csv"
      }
    }
}
output {
    elasticsearch {
        hosts => "http://localhost:9200"
        index => "payment_test-%{file_no}"
    }
    stdout {}
}
Val
  • When I run with these settings, I get the following error and no new indices: `Dissector mapping, field not found in event {"field"=>"path", "event"=>{"log"=>{"file"=>{"path"=>"C:/elastic/logstash-8.3.2/config/in-4.csv"}}, "name"=>"Name", "event"=>{"original"=>"Name;35215;January\r"}, "message"=>"Name;35215;January\r", "deposit"=>35215, "month"=>"January", "@timestamp"=>2022-07-29T12:49:55.888897300Z, "@version"=>"1", "host"=>{"name"=>"PC"}}}` – Yağız Can Aslan Jul 29 '22 at 12:51
  • Good point, it's not `path` but in ECS it's called `[log][file][path]`, fixed! – Val Jul 29 '22 at 12:52
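As a sanity check on the mapping above: the dissect pattern `%{?ignore_path}/in-%{file_no}.csv` discards everything up to the last `/`, then captures the piece between `in-` and `.csv`. A rough Python analogue of that extraction, for illustration only (this is not how Logstash implements dissect):

```python
def extract_file_no(path: str) -> str:
    """Mimic the dissect mapping "%{?ignore_path}/in-%{file_no}.csv"."""
    # Take the part after the last "/", then strip the
    # "in-" prefix and ".csv" suffix to get the number.
    filename = path.rsplit("/", 1)[-1]
    return filename.removeprefix("in-").removesuffix(".csv")

print(extract_file_no("C:/elastic/logstash-8.3.2/config/in-4.csv"))  # 4
```

With `file_no` set this way, the `payment_test-%{file_no}` index reference in the output resolves to `payment_test-4` for events from `in-4.csv`.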
0

You can create a new field as shown below and use it in the index name:

input {
    file {
        path => "C:/elastic/logstash-8.3.2/config/*.csv"
        start_position => "beginning"
        sincedb_path => "NULL"
    }
}
filter {
    csv {
        separator => ";"
        columns => ["name","deposit","month"]
    }
    mutate {
        convert => {
            "deposit" => "integer"
        }
    }
    grok {
        match => { "[log][file][path]" => "%{GREEDYDATA}/%{GREEDYDATA:file_name}\.csv" }
    }
    grok {
        match => { "file_name" => "^.{3}(?<file_no>.)" }
    }
}
output {
    elasticsearch {
        hosts => "http://localhost:9200"
        index => "payment-test-%{file_no}"
    }
    stdout {}
}

I have used `file_name` as the field name for the file name, but you can use your original field that contains the file name.
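The two grok expressions above can be approximated with ordinary regular expressions (`GREEDYDATA` corresponds to `.*`). A small Python sketch of the same two-step extraction, for illustration only:

```python
import re

path = "C:/elastic/logstash-8.3.2/config/in-4.csv"

# Step 1: mirror of %{GREEDYDATA}/%{GREEDYDATA:file_name}\.csv
# The first greedy group consumes up to the last "/", so the
# second group captures the bare file name without extension.
m1 = re.match(r"(.*)/(.*)\.csv", path)
file_name = m1.group(2)  # "in-4"

# Step 2: mirror of ^.{3}(?<file_no>.) — skip the three
# characters "in-" and capture the next character.
m2 = re.match(r"^.{3}(?P<file_no>.)", file_name)
file_no = m2.group("file_no")  # "4"
```

Note that `^.{3}(?<file_no>.)` captures only a single character after `in-`, which is fine for the ten files `in-0.csv` through `in-9.csv` in the question but would truncate `in-10.csv`.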

Sagar Patel
  • Seems to work, but I can't see any new indices in Postman, and the output in the console has `"tags" => [ [0] "_grokparsefailure"],`. Do I need to create the indices first and then run Logstash? – Yağız Can Aslan Jul 29 '22 at 10:44
  • If you have the index as a template, shouldn't they be created automatically? – Ramachandran.A.G Jul 29 '22 at 10:54
  • @Ramachandran.A.G I use the exact .conf posted, so I don't think I have the index as a template. How do I create the index as a template? – Yağız Can Aslan Jul 29 '22 at 11:18
  • Is your field name correct? – Sagar Patel Jul 29 '22 at 12:01
  • @SagarPatel In the grok config, `file_name` should be `path` instead, hence the `_grokparsefailure` – Val Jul 29 '22 at 12:03
  • @Val Yes, that's right, and I have added the same note in the last line of my answer, because I used `file_name` just as an example. – Sagar Patel Jul 29 '22 at 12:04
  • Yes, though the file name is always stored in the `path` variable by the `file` input plugin, so it's known – Val Jul 29 '22 at 12:07
  • @YağızCanAslan Please check my updated answer, where I have added a grok filter that extracts `file_name` from the path, which is then used in the next grok to get the number. I think this should work for you. – Sagar Patel Jul 29 '22 at 12:14