8

I have a Web App hosted on the Azure platform and an ELK stack hosted on a virtual machine, also in Azure (same subscription), and I am struggling to find a way to ship the logs from the app to Logstash.

A Web App stores all its files on storage that is only accessible via FTP, which Logstash does not have an input plugin for.

What do people use to ship logs to ELK from Web Apps? If it were running on a VM I would use NXLog, but that is not possible for a Web App.

I also use log4net and tried a UDP forwarder, which worked with my local ELK stack but not the Azure-hosted one, despite my adding the public UDP endpoint.

Sheff
  • 3,474
  • 3
  • 33
  • 35
  • 1
    *"A web app stores all its files on a storage only accessible via FTP "* - This isn't true. Web apps are not limited to using their local storage. They are perfectly capable of working with Azure Storage (e.g. blobs), Azure File Service, databases, etc. – David Makogon Nov 17 '15 at 12:06
  • @DavidMakogon - That's true. I did look at blob storage as an option, but it isn't obvious how to get server logs and log4net files saved to it. I am wondering if anyone has done it and configured Logstash to use blob storage as an input feed? – Sheff Nov 18 '15 at 10:01

2 Answers

7

Currently I am using Serilog to push my application log messages (in batch) to a Redis queue, which in turn is read by Logstash to enrich them and push them into Elasticsearch. This results in a reliable distributed setup that does not lose any application logs unless the Redis max queue length is exceeded. Added bonus: Serilog emits JSON, so your Logstash config can stay pretty simple. Example code can be found here: https://gist.github.com/crunchie84/bcd6f7a8168b345a53ff
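
For reference, here is a minimal sketch of the Logstash side of that pipeline: it reads the JSON events off a Redis list and forwards them to Elasticsearch. The Redis host, list key and index name below are placeholders, not values from my actual setup:

    input {
      redis {
        host      => "my-redis.redis.cache.windows.net"  # placeholder Redis host
        port      => 6379
        data_type => "list"      # consume events pushed onto a Redis list
        key       => "applogs"   # placeholder list key the application pushes to
        codec     => "json"      # Serilog already emits JSON
      }
    }
    filter {
      # enrichment goes here
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "applogs-%{+YYYY.MM.dd}"
      }
    }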

Mark van Straten
  • 9,287
  • 3
  • 38
  • 57
  • Hi Mark, why don't you pass Serilog logs directly to Elasticsearch? Going from Serilog to Redis to Logstash to ES looks complicated. Check this repo: https://github.com/serilog/serilog-sinks-elasticsearch – Chan Sep 15 '17 at 07:58
  • 1
    It was a matter of decoupling; Azure had hosted Redis, which is very fast at ingesting data. Elasticsearch was not a hosted solution and, since it needs to process the submitted JSON, it could end up being a bottleneck, potentially affecting the log-producing app – Mark van Straten Sep 15 '17 at 18:16
3

Azure now has a project on GitHub called azure-diagnostics-tools that contains Logstash plugins for reading from blob and table storage, among other things.

Presumably, you can just enable diagnostics logging to blob/table storage for your Web App and use the Logstash plugin to get the logs into Elasticsearch.
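
As a rough sketch, assuming the repo's logstash-input-azureblob plugin is installed, the input side could look something like the following. The option names are taken from that project's documentation and may differ between versions; the account name, access key and container are placeholders:

    input {
      azureblob {
        storage_account_name => "mystorageaccount"   # placeholder storage account
        storage_access_key   => "base64-access-key"  # placeholder access key
        container            => "wad-iis-logfiles"   # placeholder container name
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
      }
    }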

Regarding how to get log4net output into blob storage: you can use log4net.Appender.TraceAppender to write to the Trace system, which will cause the entries to be collected into blob/table storage when that option is enabled for the Web App.
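
A minimal log4net configuration for that looks roughly like this (standard log4net XML; only the choice of appender is specific to this scenario):

    <log4net>
      <appender name="TraceAppender" type="log4net.Appender.TraceAppender">
        <layout type="log4net.Layout.PatternLayout">
          <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
        </layout>
      </appender>
      <root>
        <level value="INFO" />
        <!-- Route everything through System.Diagnostics.Trace so App Service
             diagnostics can collect it into blob/table storage -->
        <appender-ref ref="TraceAppender" />
      </root>
    </log4net>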

There is also an alternative to FTP for local files: you can use the Kudu REST VFS API to access (and modify) files over HTTP. I found this to be significantly faster and more reliable than FTP.
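
For example, to list the LogFiles folder of a site (the site name and deployment credentials below are placeholders):

    # List files under LogFiles/ via the Kudu VFS API, authenticating with the
    # site's deployment credentials
    curl -u '$mysite:deployment-password' \
      "https://mysite.scm.azurewebsites.net/api/vfs/LogFiles/"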

makhdumi
  • 1,308
  • 11
  • 35
  • I ended up going with Mark's approach, but used Serilog to export directly to Elasticsearch. The Azure Ruby libraries are immature, and reading from Azure Storage does not scale well. – makhdumi Apr 06 '17 at 19:02