
Goal: I want to create a dashboard which shows user requests made to my website. For this, I created a filter in my Java web app that captures user requests and stores them in an ES index. Each document has the form:

{
  "user": "user1",
  "url": "domain.com/page1",
  "hitcount": 12
}

So I now have an index recording how many times each user requested each URL. Now I want to create visualizations to show usage trends per user.
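For reference, a per-user, per-URL counter document like the one above can be maintained with a scripted upsert, so the increment happens inside ES instead of via a read-modify-write from the client. Index name (`user-hits`) and document id are illustrative:

```json
POST /user-hits/_update/user1-page1
{
  "script": {
    "source": "ctx._source.hitcount += params.inc",
    "params": { "inc": 1 }
  },
  "upsert": {
    "user": "user1",
    "url": "domain.com/page1",
    "hitcount": 1
  }
}
```

The `upsert` body is used only when the document does not exist yet; otherwise the Painless script increments the stored count.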

Question:

  1. Which visualizations should be used for this use-case?

  2. If I need to show the change in user trends over time, how should I save the data? For example, is there a visualization that could show that a user has stopped or reduced requesting one page and now accesses a different page more frequently? Any direction would be helpful. Note: I understand this could be done with Grafana + Prometheus, but I wish to do this with the Elastic Stack.
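Assuming each request were stored as its own timestamped document (rather than a running counter), the kind of trend asked about in question 2 maps onto a `date_histogram` aggregation split by URL. A sketch of such a query, with illustrative index and field names (older ES versions use `interval` instead of `calendar_interval`):

```json
GET /user-requests/_search
{
  "size": 0,
  "query": { "term": { "user": "user1" } },
  "aggs": {
    "over_time": {
      "date_histogram": { "field": "@timestamp", "calendar_interval": "day" },
      "aggs": {
        "per_url": { "terms": { "field": "url" } }
      }
    }
  }
}
```

A Kibana line or area chart built on the same buckets (date histogram on the X axis, split series by `url`) would show one page's curve falling while another rises.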

YetAnotherBot
  • How often are you indexing documents like the one you showed above? `hitcount: 12` over how long a duration? I think it would be better to index un-aggregated user request data into ES and then use Kibana visualizations to aggregate it and show usage trends. – ben5556 Nov 01 '18 at 05:16
  • Right now we are updating the same document with a fresh `count`. But, yeah, your point makes sense. – YetAnotherBot Nov 01 '18 at 06:02
  • It's just that inserting each URL hit individually would fill up the memory pretty fast. Then we would have to add a strategy to create snapshots and all. – YetAnotherBot Nov 01 '18 at 06:03
  • Do you have access logs which log user access requests? Then you can simply use Filebeat to read those log entries and index them into ES. Create visualizations in Kibana aggregating on your log data. You can also split your logs into individual fields to help with visualizations, since you need to aggregate on fields. – ben5556 Nov 01 '18 at 06:13
  • We are pushing data to ES for each URL request made. – YetAnotherBot Nov 01 '18 at 06:20
  • That is not so efficient. It is better to output user requests to a log file and use Filebeat to index them into ES. – ben5556 Nov 01 '18 at 06:30
  • Won't this be an overhead? Since filebeat would index all the records from a file when we can directly index them as the requests come to the web app. – YetAnotherBot Nov 01 '18 at 06:50
  • How are you indexing currently? Using Filebeat won't be an overhead: Filebeat is a lightweight agent that won't use many resources but is extremely good at reading log events from files and forwarding them to ES. – ben5556 Nov 01 '18 at 06:55
  • So, for indexing, I first query ES with a where clause on `user` and `url`. The result doc contains `count`; I increment it and push an update request for the same document. – YetAnotherBot Nov 01 '18 at 07:01
  • Hmm, so you are running a script to query ES and update the same doc for every user request to your website? This approach will result in a lot of individual index requests to ES if your webapp sees heavy traffic. What will happen if your website has 1000 concurrent user requests? Instead, if you write to a log file, you can configure Filebeat to harvest the file every few seconds and bulk-index into ES. – ben5556 Nov 01 '18 at 07:15
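The setup ben5556 describes in the comments — one JSON log line per request, harvested by Filebeat and bulk-indexed into ES — could look roughly like this in `filebeat.yml` (the log path and ES host are placeholders):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/requests.log   # one JSON object per line
    json.keys_under_root: true        # lift JSON fields to the top level of the event
    json.add_error_key: true          # flag lines that fail to parse

output.elasticsearch:
  hosts: ["localhost:9200"]
```

Filebeat batches events and uses the bulk API, so the webapp never talks to ES on the request path.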

1 Answer


I'd recommend logging user requests to a log file and having Filebeat read and index them into ES. It is better to send non-aggregated data into ES and let ES aggregate it to create the required visualizations.
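As a sketch of the producing side (class and method names are hypothetical, not from the question): the servlet filter could append one JSON line per request to the file that Filebeat harvests. The JSON-building part, stripped of the servlet plumbing:

```java
import java.time.Instant;

// Hypothetical helper: formats one access event as a single JSON log line,
// ready to be appended to a file that Filebeat ships to ES.
public class RequestLogLine {

    // Minimal JSON string escaping for the fields used here.
    static String escape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }

    // One event per hit; counts are derived later by ES aggregations,
    // so no read-modify-write against ES is needed per request.
    static String toJson(String user, String url, Instant ts) {
        return "{\"user\":\"" + escape(user) + "\","
             + "\"url\":\"" + escape(url) + "\","
             + "\"@timestamp\":\"" + ts + "\"}";
    }

    public static void main(String[] args) {
        System.out.println(toJson("user1", "domain.com/page1",
                Instant.parse("2018-11-01T05:16:00Z")));
    }
}
```

Writing a line is append-only and local, so concurrent requests never race on a shared counter the way the query-increment-update approach does.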

ben5556