1

Is there a way to use the filename of the file being read by Logstash as the index name for the output into Elasticsearch?

I am using the following config for Logstash:

input {
    file {
        path => "/logstashInput/*"
    }
}
output {
    elasticsearch {
        index => "FromfileX"
    }
}

I would like to be able to drop in a file, e.g. log-from-20.10.2016.log, and have it indexed into the index log-from-20.10.2016. Does the Logstash input plugin "file" produce any variables for use in the filter or output?

Topher Brink

2 Answers

0

Yes, you can use the path field for that and grok it to extract the filename into an index field:

  input {
      file {
          path => "/logstashInput/*"
      }
  }
  filter {
      grok {
          match => ["path", "(?<index>log-from-\d{2}\.\d{2}\.\d{4})\.log$"]
      }
  }
  output {
      elasticsearch {
          index => "%{index}"
      }
  }
Val
  • with version 7.9.1, it does not work, it shows an error that this is an invalid syntax – max Sep 08 '20 at 21:26
  • @max I'll check and get back to you. In the meantime, if you can share the error you get, that'll help. Maybe you'll be able to remove your downvote ;-) – Val Sep 09 '20 at 04:49
  • If I put anything else inside grok match, I get the error: Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError and if I remove other things, I do not get any errors but %{index} is not getting substituted. I tried using a string for path instead of a list, still did not work. – max Sep 09 '20 at 21:41
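A plausible explanation for the 7.9.1 reports above (my assumption, not confirmed in this thread): when ECS compatibility is enabled, the file input stores the source path under [log][file][path] instead of path, so the grok match never captures anything and %{index} is left unsubstituted. A minimal sketch of the same filter pointed at that field:

  filter {
      grok {
          # assumes ECS mode, where the file input records the path in [log][file][path]
          match => { "[log][file][path]" => "(?<index>log-from-\d{2}\.\d{2}\.\d{4})\.log$" }
      }
  }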
-1
input {
    file {
        path => "/home/ubuntu/data/gunicorn.log"
        start_position => "beginning"
    }
}

filter {
    grok {
        match => {
            "message" => "%{USERNAME:u1} %{USERNAME:u2} \[%{HTTPDATE:http_date}\] \"%{DATA:http_verb} %{URIPATHPARAM:api} %{DATA:http_version}\" %{NUMBER:status_code} %{NUMBER:byte} \"%{DATA:external_api}\" \"%{GREEDYDATA:android_client}\""
        }
        # remove_field is a grok option, so it belongs outside the match hash
        remove_field => ["message"]
    }

    date {
        # HTTPDATE timestamps end with a timezone offset, matched by Z
        match => ["http_date", "dd/MMM/yyyy:HH:mm:ss Z"]
    }

    # derive the index name from the file name: strip the directories and the .log suffix
    ruby {
        code => "event.set('index_name', event.get('path').split('/')[-1].gsub('.log',''))"
    }
}

output {
    elasticsearch {
        hosts => ["0.0.0.0:9200"]
        index => "%{index_name}-%{+yyyy-MM-dd}"
        user => "*********************"
        password => "*****************"
    }

    stdout { codec => rubydebug }
}
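With the sample path above, index_name resolves to gunicorn, so events end up in daily indices like gunicorn-2020-09-08 (the date part comes from the event's @timestamp via %{+yyyy-MM-dd}).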
Siddharth Kumar
  • with version 7.9.1, it causes log stash to freeze and you can no longer restart the service – max Sep 08 '20 at 21:27