
I am absolutely new to the Elastic Stack.

My problem space: I have a utility which runs on client machines (thousands of them), and a few logs are generated on each of these machines. So we have three data sources: CSV files, log files (generated by my application), and the Windows event log. I want to combine these three and generate some useful information out of them. I also want to build a dashboard with some graphs that will be used by managers.

I have zeroed in on the ELK stack. The idea is to install Beats on the client machines and push the data to Elasticsearch, then use Kibana for visualization. Since I might have thousands of clients pushing data to the Elasticsearch server, it might not be feasible to keep this data on the server forever, but I need up-to-date visualizations to be available at all times. So I was planning to run periodic queries against the indexed data in Elasticsearch and save the results (which are the real information I need) back into Elasticsearch in a separate index; the Kibana visualizations would be set up on top of that index, and all the original data could then be cleared. This way I extract the real information, keep it, and delete the unnecessary data. A sketch of such a roll-up job is shown below.
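To make this concrete, here is a minimal sketch of such a periodic roll-up job, written in Python against the plain Elasticsearch REST API. Everything specific in it is an assumption: the host (localhost:9200), the raw index pattern (raw-logs-*), the daily index naming, and the event.code / @timestamp fields would all need to be adapted to whatever your Beats actually ship.

```python
# Minimal sketch of the periodic roll-up described above.
# Assumptions (adjust to your setup): Elasticsearch at localhost:9200,
# raw Beats data in daily indices matching "raw-logs-YYYY.MM.DD", and
# documents carrying an "event.code" keyword field plus "@timestamp".
# The "daily-summary" index and all field names are hypothetical.
import datetime
import requests

ES = "http://localhost:9200"

def roll_up_yesterday():
    day = (datetime.date.today() - datetime.timedelta(days=1)).isoformat()

    # 1. Aggregate one day of raw data (event counts per event code).
    query = {
        "size": 0,
        "query": {"range": {"@timestamp": {"gte": day, "lt": day + "||+1d"}}},
        "aggs": {"by_code": {"terms": {"field": "event.code", "size": 100}}},
    }
    resp = requests.post(f"{ES}/raw-logs-*/_search", json=query)
    resp.raise_for_status()
    buckets = resp.json()["aggregations"]["by_code"]["buckets"]

    # 2. Save the condensed result into a separate summary index;
    #    the Kibana visualizations are built on top of this index.
    for b in buckets:
        doc = {"day": day, "event_code": b["key"], "count": b["doc_count"]}
        requests.post(f"{ES}/daily-summary/_doc", json=doc).raise_for_status()

    # 3. The raw data for that day can now be dropped. With daily
    #    indices (e.g. raw-logs-2024.01.31) this is a cheap index delete.
    requests.delete(f"{ES}/raw-logs-{day.replace('-', '.')}")

if __name__ == "__main__":
    roll_up_yesterday()  # run daily from cron / Windows Task Scheduler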

My questions to the experts are:

  1. Is my thinking/design correct (with respect to the ELK stack) given the problem statement?
  2. Is it feasible with the ELK stack, and are there any examples or utilities for achieving this?

Thanks, Gaurav

1 Answer

  1. Saving the results of your aggregations back into Elasticsearch is a perfectly valid option. You should also consider cold storage (e.g., via index lifecycle management) as an option for storing large amounts of data with long retention; a sketch follows this list.
  2. You tagged logz.io in your question, so it's worth mentioning that logz.io has a feature called 'Timeless accounts', which uses Optimizers to define query results that should be saved for longer than the retention period of the underlying logs.
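As a rough illustration of the cold-storage option from point 1, the sketch below registers an index lifecycle management (ILM) policy via the Elasticsearch REST API: raw indices roll over daily, move to cold-tier nodes after a week, and are deleted after a month. The policy name, node attribute, and timings are all hypothetical, and the allocation step assumes you have nodes tagged with a "data: cold" attribute.

```python
# Sketch of an ILM policy for cheap long retention of raw indices,
# as an alternative (or complement) to the roll-up approach above.
import requests

ES = "http://localhost:9200"

policy = {
    "policy": {
        "phases": {
            # Keep new data on hot nodes, rolling to a fresh index daily.
            "hot": {"actions": {"rollover": {"max_age": "1d"}}},
            # After a week, relocate to nodes tagged as cold storage.
            "cold": {
                "min_age": "7d",
                "actions": {"allocate": {"require": {"data": "cold"}}},
            },
            # After a month, delete the raw index entirely.
            "delete": {"min_age": "30d", "actions": {"delete": {}}},
        }
    }
}

# Register the policy under a (hypothetical) name.
requests.put(f"{ES}/_ilm/policy/raw-logs-policy", json=policy).raise_for_status()
```

The policy is attached to new indices through an index template (the index.lifecycle.name and index.lifecycle.rollover_alias settings), so retention is then handled by the cluster itself rather than by an external job.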

For the record, I work at logz.io

Barak