
Using Filebeat 7.5.2:

I'm using a Filebeat configuration with close_eof enabled, and I run Filebeat with the --once flag. I can see the harvester reach EOF, but Filebeat keeps running.

Filebeat config:

filebeat.inputs:
- type: log
  close_eof: true
  enabled: true
  paths:
    - "${LOGS_PATH}"
  scan_frequency: 1s
  fields:
    machine: "${HOST}"


output.logstash:
  hosts: ["192.168.41.6:5044"]
  bulk_max_size: 1024
  timeout: 30s
  pipelining: 1
  workers: 1

And I run it using:

filebeat run --once -v -c "PATH TO CONF..."
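For context, the whole setup can be reproduced with something along these lines (a hypothetical sketch: the temp dir, the sample log name, and the LOGS_PATH/HOST variables come from the config and logs above; the rest is illustrative):

    # Hypothetical reproduction of the setup described above
    WORKDIR=$(mktemp -d)                        # e.g. /tmp/tmp.BXJtfiaEzb
    cp dcbgw-20200124080032_darkblue.log "$WORKDIR"

    export LOGS_PATH="$WORKDIR/*.log"           # quoted so Filebeat, not the shell, expands the glob
    export HOST=$(hostname)

    # --path.data keeps the registry inside the temp dir, matching the registry paths in the logs
    filebeat run --once -v -c filebeat.yml --path.data "$WORKDIR/data"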

And here are some logs from the Filebeat instance:

...
    2020-02-04T18:30:16.950Z        INFO    instance/beat.go:297    Setup Beat: filebeat; Version: 7.5.2
    2020-02-04T18:30:17.059Z        INFO    [publisher]     pipeline/module.go:97   Beat name: logstash
    2020-02-04T18:30:17.167Z        WARN    beater/filebeat.go:152  Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
    2020-02-04T18:30:17.168Z        INFO    instance/beat.go:429    filebeat start running.
    2020-02-04T18:30:17.168Z        INFO    [monitoring]    log/log.go:118  Starting metrics logging every 30s
    2020-02-04T18:30:17.168Z        INFO    registrar/migrate.go:104        No registry home found. Create: /tmp/tmp.BXJtfiaEzb/data/registry/filebeat
    2020-02-04T18:30:17.179Z        INFO    registrar/migrate.go:112        Initialize registry meta file
    2020-02-04T18:30:17.192Z        INFO    registrar/registrar.go:108      No registry file found under: /tmp/tmp.BXJtfiaEzb/data/registry/filebeat/data.json. Creating a new registry file.
    2020-02-04T18:30:17.193Z        INFO    registrar/registrar.go:145      Loading registrar data from /tmp/tmp.BXJtfiaEzb/data/registry/filebeat/data.json
    2020-02-04T18:30:17.193Z        INFO    registrar/registrar.go:152      States Loaded from registrar: 0
    2020-02-04T18:30:17.193Z        WARN    beater/filebeat.go:368  Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
    2020-02-04T18:30:17.193Z        INFO    crawler/crawler.go:72   Loading Inputs: 1
    2020-02-04T18:30:17.194Z        INFO    log/input.go:152        Configured paths: [/tmp/tmp.BXJtfiaEzb/*.log]
    2020-02-04T18:30:17.206Z        INFO    input/input.go:114      Starting input of type: log; ID: 13918413832820009056 
    2020-02-04T18:30:17.225Z        INFO    input/input.go:167      Stopping Input: 13918413832820009056
    2020-02-04T18:30:17.225Z        INFO    crawler/crawler.go:106  Loading and starting Inputs completed. Enabled inputs: 1
    2020-02-04T18:30:17.225Z        INFO    log/harvester.go:251    Harvester started for file: /tmp/tmp.BXJtfiaEzb/dcbgw-20200124080032_darkblue.log
    2020-02-04T18:30:17.231Z        INFO    beater/filebeat.go:384  Running filebeat once. Waiting for completion ...
    2020-02-04T18:30:17.231Z        INFO    beater/filebeat.go:386  All data collection completed. Shutting down.
    2020-02-04T18:30:17.231Z        INFO    crawler/crawler.go:139  Stopping Crawler
    2020-02-04T18:30:17.231Z        INFO    crawler/crawler.go:149  Stopping 1 inputs
    2020-02-04T18:30:17.258Z        INFO    pipeline/output.go:95   Connecting to backoff(async(tcp://192.168.41.6:5044))
    2020-02-04T18:30:17.296Z        INFO    pipeline/output.go:105  Connection to backoff(async(tcp://192.168.41.6:5044)) established

    ... Only metrics here ...

    2020-02-04T18:35:55.686Z        INFO    log/harvester.go:274    End of file reached: /tmp/tmp.BXJtfiaEzb/dcbgw-20200124080032_darkblue.log. Closing because close_eof is enabled.
    2020-02-04T18:35:55.686Z        INFO    crawler/crawler.go:165  Crawler stopped
    ... MORE METRICS ...
    2020-02-04T18:36:26.609Z        ERROR   logstash/async.go:256   Failed to publish events caused by: read tcp 192.168.41.6:49662->192.168.41.6:5044: i/o timeout
    2020-02-04T18:36:26.621Z        ERROR   logstash/async.go:256   Failed to publish events caused by: client is not connected
    2020-02-04T18:36:28.520Z        ERROR   pipeline/output.go:121  Failed to publish events: client is not connected
    2020-02-04T18:36:28.520Z        INFO    pipeline/output.go:95   Connecting to backoff(async(tcp://192.168.41.6:5044))
    2020-02-04T18:36:28.521Z        INFO    pipeline/output.go:105  Connection to backoff(async(tcp://192.168.41.6:5044)) established

... MORE METRICS ...

From there, I'm outputting to Logstash 7.5.2 running in the same Ubuntu 18 VM. Running Logstash with log level trace does not show any errors.
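For reference, the receiving side is a plain Beats input; a minimal pipeline along these lines (a sketch, not necessarily the exact config in use) is enough to sanity-check delivery on port 5044:

    input {
      beats {
        port => 5044                  # matches output.logstash.hosts above
      }
    }

    output {
      stdout { codec => rubydebug }   # print received events to verify delivery
    }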
