Can someone assist, please? I need to fix this error so that CloudTrail logs stored in S3 can be shipped through Logstash to Elasticsearch and viewed in Kibana. I can't figure out how to increase the field limit. My configuration looks like this:
input {
  s3 {
    bucket => "sample-s3bucket"
    region => "eu-west-1"
    type => "cloudtrail"
    codec => cloudtrail {}
    sincedb_path => "/tmp/logstash/cloudtrail"
    exclude_pattern => "/CloudTrail-Digest/"
    interval => 300
  }
}

filter {
  if [type] == "cloudtrail" {
    json {
      source => "message"
    }
    geoip {
      source => "sourceIPAddress"
      target => "geoip"
      add_tag => ["cloudtrail-geoip"]
    }
  }
}

output {
  elasticsearch {
    hosts => "coordinate_node:9200"
    index => "cloudtrail-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}
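From reading the logstash-output-elasticsearch docs, I think the output can install a custom index template in which the field limit could be raised, so I sketched a variant of my output like the one below, but I haven't tested it. The template path and template name are placeholders I made up:

output {
  elasticsearch {
    hosts => "coordinate_node:9200"
    index => "cloudtrail-%{+YYYY.MM.dd}"
    # point the output at a local template file and let it overwrite
    # any existing template of the same name when Logstash starts
    template => "/etc/logstash/templates/cloudtrail.json"
    template_name => "cloudtrail"
    template_overwrite => true
  }
}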
Here is what I am seeing on my Logstash machine about the limit:
[2018-10-04T17:49:49,883][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"cloudtrail-2018.09.27", :_type=>"doc", :_routing=>nil}, #], :response=>{"index"=>{"_index"=>"cloudtrail-2018.09.27", "_type"=>"doc", "_id"=>"lrMzQGYBOny1_iySNW6G", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] in index [cloudtrail-2018.09.27] has been exceeded"}}}
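If the template approach sketched above is right, I guess the file /etc/logstash/templates/cloudtrail.json would need to raise index.mapping.total_fields.limit for the cloudtrail-* indices, something like this (I'm assuming Elasticsearch 6.x, and 5000 is just an arbitrary value I picked):

{
  "index_patterns": ["cloudtrail-*"],
  "settings": {
    "index.mapping.total_fields.limit": 5000
  }
}

And I suppose the daily index that is already rejecting events would need a one-off settings update, since the template only applies to newly created indices, maybe something like:

curl -XPUT 'http://coordinate_node:9200/cloudtrail-2018.09.27/_settings' \
  -H 'Content-Type: application/json' \
  -d '{"index.mapping.total_fields.limit": 5000}'

I'm not sure whether this belongs in a template, on the index itself, or somewhere in the Logstash config, so any guidance would be appreciated.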
Thanks in advance