
I have a web application with a back end in NodeJS, and Logstash/Elasticsearch/Kibana to handle system logs (access_error.log, messages.log, etc.).

Right now I also need to record all client-side JavaScript errors in Kibana. What is the best way to do this?

EDIT: I have to add additional information to this question, as @Jackie Xu provided a partial solution to my problem and as follows from my comment:

I'm most interested in server-side error handling. I don't think it's efficient to write each error to a file; I'm looking for best practices to make it more performant.

I need to handle JS error records on the server side more efficiently than just writing them to a file. Could you provide some scenarios for how I could increase server-side logging performance?

Erik
  • Hey guys, help me. Does my question require any explanation? – Erik Jul 05 '14 at 11:18
  • what version of node? What type of app (Express? Other MVC?) What are you trying to accomplish on the server side, with these client-side errors? – methai Jul 06 '14 at 00:25
  • I use the latest version of Node.js with Express 4.0. I just need to save JS errors in my Elasticsearch instance and see them in my Kibana dashboard. – Erik Jul 06 '14 at 09:36

4 Answers


When you say client, I'm assuming here that you mean a logging client and not a web client.

First, make it a habit to log your errors in a common format. Logstash likes consistency, so if you're putting text and JSON in the same output log, you will run into issues. Hint: log in JSON. It's awesome and incredibly flexible.

The overall process will go like this:

  1. Error occurs in your app
  2. Log the error to file, socket, or over a network
  3. Tell logstash how to get (input) that error (i.e. from file, listen over network, etc)
  4. Tell logstash to send (output) the error to Elasticsearch (which can be running on the same machine)

In your app, try using the bunyan logger for node. https://github.com/trentm/node-bunyan

Node app (index.js):

var bunyan = require('bunyan');
var log = bunyan.createLogger({
  name: 'myapp',
  streams: [{
    level: 'info',
    stream: process.stdout // log INFO and above to stdout
  }, {
    level: 'error',
    path: '/var/log/myapp-error.log' // log ERROR and above to a file
  }]
});

// Log stuff like this
log.info({status: 'started'}, 'foo bar message');

// Also, in express you can catch all errors like this
app.use(function(err, req, res, next) {
   log.error(err);
   res.status(500).send('An error occurred'); // res.send(status, body) is deprecated in Express 4
});

Then you need to configure logstash to read those JSON log files and send to Elasticsearch/Kibana. Make a file called myapp.conf and try the following:

Logstash config (myapp.conf):

# Input can read from many places, but here we're just reading the app error log
input {
    file {
        type => "my-app"
        path => [ "/var/log/myapp/*.log" ]
        codec => "json"
    }   
}

# Output can go many places, here we send to elasticsearch (pick one below)
output {

  elasticsearch {
    # Do this if elasticsearch is running somewhere else
    host => "your.elasticsearch.hostname"
    # Do this if elasticsearch is running on the same machine
    host => "localhost"
    # Do this if you want to run an embedded elastic search in logstash
    embedded => true   
  }

}

Then start/restart logstash as such: bin/logstash agent -f myapp.conf web

Then browse to http://your-elasticsearch-host:9292 (the Kibana web UI started by the `web` argument above) to see the logs coming in.

methai
  • Thanks for the reply. I'll try applying your approach to my code. Is it possible to handle errors using nginx only, without touching Node.js? – Erik Jul 06 '14 at 16:03
  • Your logstash conf file can have as many file inputs as you'd like. So you would just add an input section for the nginx error logs files as well e.g. ```path => [ "/var/log/nginx/nginx-error.log" ]```. Now, if you want web client errors via XHR you'd be better off logging those in your node app since you can format the error how you want it. It may be tricky telling nginx to read the payload of a client error and dynamically log those into a file. Also, why? Program that kind of stuff in your app instead of the webserver. – methai Jul 06 '14 at 16:17
  • One last question. If I save JS errors in `/var/log/myapp-error.log` and send them into Elasticsearch, then I don't need those records in `/var/log/myapp-error.log` anymore. Is it possible to auto-remove them from that file? – Erik Jul 06 '14 at 19:21
  • Or is it possible to send logs to Elasticsearch directly from Node.js, without touching a file? – Erik Jul 07 '14 at 03:18

If I understand correctly, the problem you have is not about sending your logs back to the server (or if it is, @Jackie-xu provided some hints), but rather about how to send them to Elasticsearch most efficiently.

Actually, the vast majority of users of the classic Logstash/Elasticsearch/Kibana stack are used to having an application that logs into a file, then using Logstash's file input to parse that file and send the result to Elasticsearch. Since @methai gave a good explanation of it, I won't go any further down that path.

But what I would like to point out is this:

You are not forced to use Logstash.
Actually, Logstash's main role is to collect the logs, parse them to identify their structure and recurring fields, and finally output them in a JSON format so that they can be sent to Elasticsearch. But since you are already manipulating JavaScript on the client side, one can easily imagine talking directly to the Elasticsearch server. For example, once you have caught a JavaScript exception, you could do the following:

var xhr = new XMLHttpRequest();
// POST to an index/type path ('client-errors'/'error' are example names) so
// Elasticsearch assigns a document id; its REST API listens on port 9200 by default
xhr.open("POST", "http://your-elasticsearch-host:9200/client-errors/error", true);
xhr.setRequestHeader("Content-Type", "application/json");
var data = {
    lineNumber: lineNumber,
    message: message,
    url: url
};
xhr.send(JSON.stringify(data));

By doing this, you are talking directly from the client to the Elasticsearch server. I can't imagine a simpler or faster way to do that (but note that this is just theory: I have never tried it myself, so reality could be more complex, especially if you want special fields like timestamps to be generated ;)). In a production context you will probably have security issues, and probably a proxy server between the client and the ES server, but the principle is there.

If you absolutely want to use Logstash, you are not forced to use a file input.
If, for the purpose of harmonizing, doing the same as everyone else, or using advanced Logstash parsing configuration, you want to stick with Logstash, you should take a look at all the alternatives to the basic file input. For example, I used to use a pipe myself, with a process in charge of collecting the logs and writing them to standard output. It is also possible to read on an open TCP socket, and a lot more; you can even add your own input plugin.
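As a sketch of that last option, a tcp input reading one JSON document per line could replace the file input shown in the accepted answer (the port number and type are arbitrary choices for this example, not required values):

```
# Sketch: read JSON log events over a TCP socket instead of from a file
input {
    tcp {
        type => "my-app"
        port => 5000              # arbitrary port for this example
        codec => "json_lines"     # one JSON document per line
    }
}

output {
    elasticsearch {
        host => "localhost"
    }
}
```

The application then only needs to keep a TCP connection open and write one JSON line per error, with no file or pipe in between.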

Aldian

You would have to catch all client side errors first (and send these to your server):

window.onerror = function (message, url, lineNumber) {

    // Send error to server for storage
    yourAjaxImplementation('http://domain.com/error-logger/', {
        lineNumber: lineNumber,
        message: message,
        url: url
    })

    // Allow default error handling, set to true to disable
    return false

}

Afterwards you can use NodeJS to write these error messages to a log file. Logstash can collect them from there, and you can then use Kibana to visualise them.

Note that, according to Mozilla, window.onerror doesn't appear to fire for every kind of error. You might want to switch to something like Sentry (if you don't want to pay, you can get the source directly from GitHub).

Aeveus
  • Thanks for the reply, but I'm most interested in server-side error handling. I don't think writing each error to a file is efficient; I'm looking for best practices to make it more performant. – Erik Jul 05 '14 at 14:52
  • 2
    In that case, you should rephrase your question. Your current question is about displaying client-side errors in Kibana. As for file writing performance in NodeJS, I don't have the specifics. I suppose you could use NodeJS's fs.WriteStream class, which seems to be its fastest (and least memory consuming) way of writing data to a file. You might be able to skip this bottleneck by using child.stdin and writing to a process written in a language with faster disk IO, but I could be wrong in that, as I haven't tried such a thing. You'll get better answers with a new (relevant) question, though! – Aeveus Jul 05 '14 at 15:04

Logging errors through the default built-in file logging allows your errors to be preserved, and it also lets the kernel optimize the writes for you.

If you really think that it is not fast enough (do you really get that many errors?), you could just put them into Redis.

Logstash has a Redis pub/sub input, so you can store the errors in Redis and Logstash will pull them out and, in your case, store them in Elasticsearch.
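A sketch of the Logstash side of that setup, assuming the Node app pushes each error as a JSON string onto a Redis list called client-errors (the key name and hosts here are assumptions):

```
# Sketch: pull errors that the app pushed onto a Redis list
input {
    redis {
        host => "localhost"
        data_type => "list"        # Logstash also supports "channel" for pub/sub
        key => "client-errors"     # assumed list name
        codec => "json"
    }
}

output {
    elasticsearch {
        host => "localhost"
    }
}
```

On the Node side a single RPUSH per error is enough; Logstash drains the list and indexes the documents.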

I'm presuming Logstash/ES are on another server; otherwise there really is no point in doing this, since ES has to store the data on disk as well, and that is not nearly as efficient as writing a logfile.

Whatever solution you go with, you'll want to store the data, i.e. write it to disk. Appending to a single (log) file is highly efficient, and when preserving data the only way to handle more load is to shard it across multiple disks/nodes.

Paul Scheltema
  • Thanks for the reply. Could you provide an example of how to handle errors via files, or point me to a relevant tutorial? – Erik Jul 06 '14 at 11:08