Log analysis is the process of parsing log files produced by a logging service or application.
Questions tagged [log-analysis]
126 questions
1 vote · 1 answer
How to design a simple log analyser in Java
I want to design a log analyser where I will provide an application.log file to a Java program; it will
parse the log file and try to capture a few fields like time, IP address, status code (200/401/500 etc.), request type (GET/POST/PUT etc.). I want to…

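The field extraction this question describes can be sketched with a named-group regular expression (shown here in Python rather than Java; the line layout below is an assumption, since the actual application.log format is not shown):

```python
import re

# Assumed layout: "<date> <time> <ip> <method> <path> <status>" -- adjust
# the pattern to whatever application.log actually contains.
LINE_RE = re.compile(
    r'(?P<time>\S+ \S+) (?P<ip>\d+\.\d+\.\d+\.\d+) '
    r'(?P<method>GET|POST|PUT|DELETE) (?P<path>\S+) (?P<status>\d{3})'
)

def parse_line(line):
    """Return a dict of the captured fields, or None if the line does not match."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

fields = parse_line('2019-10-23 12:00:01 10.0.0.7 GET /index.html 200')
```

The same named-group idea translates directly to Java's `java.util.regex.Pattern`.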
Mohit Singh · 401
1 vote · 1 answer
Access JSON body parameters in a custom formatter of the lnav log file navigator
I'm using lnav to filter and query the custom log file we have created.
As this is a custom log file, I need to create a custom format file and install it using the command below to define the structure of the log entries.
lnav -i…

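A hedged sketch of what such a format file can look like for a JSON log (the format name and field names `ts`, `msg`, `user_id` are hypothetical; the real keys come from the custom log):

```
{
  "$schema": "https://lnav.org/schemas/format-v1.schema.json",
  "my_custom_log": {
    "json": true,
    "timestamp-field": "ts",
    "body-field": "msg",
    "value": {
      "user_id": { "kind": "string" }
    },
    "line-format": [
      { "field": "ts" },
      " ",
      { "field": "msg" }
    ]
  }
}
```

Saved as e.g. `my_custom_log.json`, it would be installed with `lnav -i my_custom_log.json`, after which the declared values become queryable fields.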
Baranidharan S · 105
1 vote · 1 answer
Splunk: How to apply conditionals for multiple rows with same column value?
I have a table with columns in the following format, where host_names are repeated and a single host can have both Compliant and Non-Compliant values against it. How can I write a query which checks each host_name and marks it as…

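One way to sketch this in SPL is to collapse the rows per host with `stats` and then apply the conditional to the resulting multivalue field (the field names `compliance` and `host_name` are assumptions based on the question):

```
... | stats values(compliance) AS compliance BY host_name
    | eval status = if(mvcount(compliance) > 1 OR compliance == "Non-Compliant",
                       "Non-Compliant", "Compliant")
```

A host with both values gets a multivalue `compliance` field, so `mvcount` catches the mixed case.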
Ankit Vashistha · 325
1 vote · 4 answers
Looping through 100 text files in Python
My Python code is as follows:
# Loading libraries
import re
import pandas as pd
import numpy as np
import datetime

# Creating an empty dataframe
columns = ['A']
df_ = pd.DataFrame(columns=columns)
df_ = df_.fillna(0)

# Reading the data line by…

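Looping over many text files usually comes down to `glob` plus an inner per-line loop; a minimal self-contained sketch (the sample files and the `ERROR` pattern are stand-ins for the asker's 100 files):

```python
import glob
import os
import re
import tempfile

# Create two sample files so the sketch is runnable end to end.
tmpdir = tempfile.mkdtemp()
for i, text in enumerate(['ok line\n', 'ERROR bad line\n']):
    with open(os.path.join(tmpdir, f'file{i}.txt'), 'w') as f:
        f.write(text)

# Loop over every *.txt file and collect matching lines into one list.
rows = []
for path in sorted(glob.glob(os.path.join(tmpdir, '*.txt'))):
    with open(path) as f:
        for line in f:
            if re.search(r'ERROR', line):
                rows.append((os.path.basename(path), line.strip()))
```

The collected `rows` list can then be fed to `pd.DataFrame(rows, columns=[...])` in one call instead of appending to the dataframe per line.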
Riane Rose Kinuthia · 33
1 vote · 1 answer
Extracting the StatusDescription from a text file using Python
I have a sample text file. I want to extract the StatusDescription for each line, and in case it's not available, I want it to return a null, i.e.
Line1 StatusDescription=Null
Line2 StatusDescription=Success
The sample text file:
[23-Oct-2019]…

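A regex with a fallback covers both cases described above; a minimal sketch (the sample lines are invented, since the real file is truncated):

```python
import re

def status_description(line):
    """Return the StatusDescription value from a log line, or 'Null' if absent."""
    m = re.search(r'StatusDescription=(\S+)', line)
    return m.group(1) if m else 'Null'

lines = [
    '[23-Oct-2019] request received',
    '[23-Oct-2019] request done StatusDescription=Success code=200',
]
results = [status_description(l) for l in lines]
```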
Riane Rose Kinuthia · 33
1 vote · 0 answers
Anomaly detection on Azure Databricks Diagnostic audit logs
I have a lot of audit logs coming from the Azure Databricks clusters I am managing. The logs are simple application audit logs in JSON format. You have information about jobs, clusters, notebooks, etc., and you can see a sample of one record…

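One simple baseline for this kind of audit data is volume-based outlier detection per principal; a toy sketch (the user names and the 1-sigma threshold are invented, and real Databricks audit records carry the acting user under service-specific JSON fields not shown here):

```python
from collections import Counter
from statistics import mean, stdev

# One acting user per audit record (toy data).
events = ['alice'] * 5 + ['bob'] * 6 + ['carol'] * 60

counts = Counter(events)
mu, sigma = mean(counts.values()), stdev(counts.values())
# Flag users whose event volume deviates from the mean; the 1-sigma
# threshold is arbitrary, chosen so the toy data produces a flag.
outliers = [user for user, c in counts.items() if sigma and (c - mu) / sigma > 1]
```

Real deployments would bucket counts by time window before scoring, but the shape of the computation is the same.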
dadadima · 938
1 vote · 1 answer
Filebeat and Logstash: log data not passing
Hi, I have two servers: one has Logstash 7.2.0 and the other has Filebeat 7.2.0.
Below is my Logstash conf file:
input {
  beats {
    port => 5044
    ssl => false
  }
}
output {
  elasticsearch {
    hosts…

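For reference, a complete pipeline along these lines might look as follows (the `hosts` value and index name are placeholders, and the grok filter is optional; the original conf is truncated, so this is a sketch, not the asker's file):

```
input {
  beats {
    port => 5044
    ssl  => false
  }
}
filter {
  grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "filebeat-%{+YYYY.MM.dd}"
  }
}
```

The Filebeat side must also point at this pipeline, i.e. `output.logstash.hosts: ["<logstash-host>:5044"]` in filebeat.yml, with its Elasticsearch output disabled.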
AlisonGrey · 497
1 vote · 2 answers
Simple way to analyse a log file and display the result
I have a log file. I want to upload the log file, run some queries on it, and then display the result. What is the simplest way to do this? Is it possible with only Elasticsearch and Kibana, without using…

LearningCoding · 47
1 vote · 1 answer
Exporting logs out of Log Analysis on IBM Cloud
Is there a way to export logs out of IBM Cloud? Mainly activity logs, which come from Activity Tracker. Also, does anyone know where these logs are stored? I can only view them inside Kibana but don't see any storage associated with it.
I tried…

NoviceMe · 3,126
1 vote · 1 answer
How to analyze log files with a custom logback pattern configuration
I would like to analyze a set of log files (looking for errors and creating a report).
These log files have records in a custom logback pattern.

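Since the logback pattern defines the line layout, the usual approach is to translate it into a regex once and reuse it; a sketch assuming the common pattern `%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger - %msg` (the actual pattern in the question is not shown, so adjust accordingly):

```python
import re

# Regex mirroring the assumed logback pattern above, one named group per field.
LOG_RE = re.compile(
    r'(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) '
    r'\[(?P<thread>[^\]]+)\] (?P<level>\w+)\s+(?P<logger>\S+) - (?P<msg>.*)'
)

def count_errors(lines):
    """Count lines whose level field is ERROR."""
    total = 0
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group('level') == 'ERROR':
            total += 1
    return total

sample = [
    '2019-10-23 10:00:00 [main] INFO  com.app.Main - started',
    '2019-10-23 10:00:01 [main] ERROR com.app.Main - boom',
]
```

The same named groups can feed a per-logger or per-hour report instead of a bare count.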
VJohn · 493
1 vote · 1 answer
Log format for GoAccess log analysis
Installed GoAccess, and trying to parse/analyse one log file. Facing issues with the log format. Does anyone know the format we need to use for the kind of log below? [updated the log sample]
::1 - - [24/Jun/2013:17:10:39 -0500] "GET /favicon.ico HTTP/1.1"…

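The sample above looks like the Apache common log format, so GoAccess's predefined COMMON format may be all that's needed; a hedged config-file sketch (the sample line is truncated, so verify against a full line):

```
# goaccess.conf fragment -- assumes the Apache common log layout
log-format COMMON
date-format %d/%b/%Y
time-format %H:%M:%S
```

If the real lines deviate from the common format, a custom `log-format` string built from GoAccess's `%h`/`%d`/`%t`/`%r` specifiers would be needed instead.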
jikku · 641
1 vote · 2 answers
How to get data from a txt file in Python log analysis?
I am a beginner in Python, trying to do log analysis, but I do not know how to read the txt file.
This is the code for outputting dates, but these dates must be taken from the txt file:
import sys
import re
file = open('desktop/trail.txt')
for…

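Reading the file line by line and pulling out date-shaped tokens can be sketched as follows (the sample lines and the ISO date pattern are assumptions, since trail.txt is not shown):

```python
import re

# Matches ISO-style dates such as 2019-01-02; adjust for the file's real format.
DATE_RE = re.compile(r'\d{4}-\d{2}-\d{2}')

def extract_dates(lines):
    """Return every date-looking token found in the given lines."""
    dates = []
    for line in lines:
        dates.extend(DATE_RE.findall(line))
    return dates

sample = ['2019-01-02 GET /a 200', 'no date here', '2019-01-03 POST /b 500']
```

With a real file, `with open('desktop/trail.txt') as f: extract_dates(f)` works the same way, since file objects iterate by line.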
warezers · 174
1 vote · 1 answer
How to merge last two events into one row
I have a table like the one below.
date        id   event     keyword  val     pattern_id
2017-08-01  001  triggerX  abc      (null)  1
2017-08-01  001  triggerY  (null)   3       1
2017-08-01  009  triggerX  cde      (null)  2
2017-08-01  …

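The merge amounts to grouping the triggerX/triggerY pair by pattern_id and keeping the first non-null value per column; a pure-Python sketch over the rows shown above:

```python
# Rows reconstructed from the table in the question (None stands in for (null)).
rows = [
    {'date': '2017-08-01', 'id': '001', 'event': 'triggerX',
     'keyword': 'abc', 'val': None, 'pattern_id': 1},
    {'date': '2017-08-01', 'id': '001', 'event': 'triggerY',
     'keyword': None, 'val': 3, 'pattern_id': 1},
    {'date': '2017-08-01', 'id': '009', 'event': 'triggerX',
     'keyword': 'cde', 'val': None, 'pattern_id': 2},
]

# Merge rows sharing a pattern_id, keeping the first non-null value per column.
merged = {}
for row in rows:
    out = merged.setdefault(row['pattern_id'], {})
    for col, value in row.items():
        if out.get(col) is None:
            out[col] = value
```

In SQL the equivalent would be a GROUP BY pattern_id with MAX() (or a first-non-null aggregate) over keyword and val.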
K.K. · 415
1 vote · 1 answer
Find Missing Logs with ELK
I've just deployed ELK in an attempt to see if it can be used for monitoring logs and alerting about issues.
What I need to be able to detect is mostly missing records:
Say a log record was received, saying a user is about to make some sort of…

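Outside of ELK itself, the "missing record" check reduces to pairing start events with end events by some transaction id and reporting ids that never completed; a minimal sketch (the ids are invented):

```python
# Ids seen in "about to ..." records vs. ids seen in completion records.
starts = {'t1', 't2', 't3'}
ends = {'t1', 't3'}

# Transactions that started but never completed.
missing = sorted(starts - ends)
```

In the ELK stack the same idea is typically expressed as an aggregation on the transaction-id field where the document count is 1 instead of 2, evaluated by an alerting layer on a schedule.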
SivanBH · 392
1 vote · 1 answer
Flume / Elasticsearch creating a new index and ignoring the index explicitly created
We created an index in Elasticsearch as follows. The index name is apachelog, dynamic mapping is set to "strict", and we set the httpresponse field to type integer:
curl -X PUT 'http://localhost:9200/apachelog' -d \
'{
  "log": {
    "dynamic":…

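For context, the intent of such a mapping is that with `"dynamic": "strict"` any document containing an unmapped field is rejected rather than silently indexed; a sketch of the full request body in the pre-7.x, typed style the question uses (the field list is assumed, since the original is truncated):

```
{
  "mappings": {
    "log": {
      "dynamic": "strict",
      "properties": {
        "httpresponse": { "type": "integer" }
      }
    }
  }
}
```

Note that if the sink writes to a different index name than the one created here (for example an index name with a date suffix appended), Elasticsearch will auto-create that index with default dynamic mappings, which would match the symptom described.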
Thangarajan Pannerselvam · 146