
I am attempting to get CloudTrail logs from multiple AWS accounts out of S3 and into Elasticsearch. Things appeared to work on and off until now, when everything ground to a halt. The error shown is below:

```
[2018-10-16T21:33:42,096][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch, but no there are no living connections in the connection pool. Perhaps Elasticsearch is unreachable or down? {:error_message=>"No Available connections", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError", :will_retry_in_seconds=>8}
[2018-10-16T21:33:44,406][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://vpc-sec-dummytext.eu-west-1.es.amazonaws.com:443/, :path=>"/"}
[2018-10-16T21:33:44,430][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://vpc-sec-dummytext.eu-west-1.es.amazonaws.com:443/"}
[2018-10-16T21:33:51,426][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff  {:code=>413, :url=>"https://vpc-sec-dummytext.eu-west-1.es.amazonaws.com:443/_bulk"}
```
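
For context, 413 is "Request Entity Too Large", so I suspect the `_bulk` payload exceeds what the domain accepts (as far as I know, AWS Elasticsearch caps the HTTP request body per instance type, around 10 MiB on the smaller instance types). A smaller pipeline batch should shrink each bulk request, so this is what I have been testing with; the config path is specific to my install, so treat it as an assumption:

```
# Start Logstash with a smaller pipeline batch size (-b, default 125) so each
# _bulk request stays under the domain's HTTP payload limit.
# The config path below is from my setup.
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/cloudtrail.conf -b 50
```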

I am using Logstash for ingestion; here is my Logstash config:

```
input {
  s3 {
    bucket          => "dummy-s3"
    region          => "eu-west-1"
    type            => "cloudtrail"
    sincedb_path    => "/tmp/logstash/cloudtrail"
    exclude_pattern => "/CloudTrail-Digest/"
    interval        => 120
    codec           => "json"
  }
}

filter {
  if [type] == "cloudtrail" {
    json {
      source => "message"
    }
    # Each CloudTrail file holds an array of events under "Records";
    # split it so every record becomes its own event.
    split {
      field   => "Records"
      add_tag => "splitted"
    }
    if "splitted" in [tags] {
      date {
        match        => ["eventTime", "ISO8601"]
        remove_tag   => ["splitted"]
        remove_field => ["timestamp"]
      }
    }
    geoip {
      source  => "[Records][sourceIPAddress]"
      target  => "geoip"
      add_tag => ["cloudtrail-geoip"]
    }
    mutate {
      gsub => [
        "eventSource", "\.amazonaws\.com$", "",
        "apiVersion", "_", "-"
      ]
    }
  }
}

output {
  elasticsearch {
    hosts              => ["vpc-sec-dummytext.eu-west-1.es.amazonaws.com:443"]
    ssl                => true
    index              => "cloudtrail-%{+YYYY.MM.dd}"
    doc_as_upsert      => true
    template_overwrite => true
  }
  stdout {
    codec => rubydebug
  }
}
```

When Logstash is started or restarted on the Ubuntu EC2 instance, logs are ingested for a few minutes and then ingestion stops.
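
For reference, this is how I restart it and watch the output on the instance (the service name is the default from the Logstash deb package, so an assumption on my part):

```
# Restart the Logstash service and follow its log to see when ingestion stalls.
sudo systemctl restart logstash
sudo journalctl -u logstash -f
```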

Any help would be really appreciated.
