
I have logs pushed to Sumo Logic once a day, but other co-workers can force an extra push to update statistics. This causes an issue where some Sumo Logic searches find more than one message within the allotted time range and therefore return double (or more) the expected counts.

I am wondering if there is some way I can use timeslice so that I only look at the last set of results within a 24-hour period?

My search that works when there is only one log in 24h:

| json field=_raw "Policy"
| count by policy
| sort by _count

What I am trying to achieve:

| json field=_raw "Policy"
| timeslice 1m
| where last(_timeslice)
| count by policy
| sort by _count
KingRogue

2 Answers


Found a solution; not sure if it's optimal.

| json field=_raw "Policy"
| timeslice 1m
| count by policy, _timeslice
| filter _timeslice in (sort by _timeslice desc | limit 1)
| sort by _count
| fields policy, _count
KingRogue

If I'm understanding your question right, I think you could try something with the accum operator:

*
| json field=_raw "Policy"
| timeslice 1m
| count by _timeslice, policy
| 1 as rank
| accum rank by _timeslice
| where _accum = 1

This would be similar to doing a window partition in SQL to get rid of duplicates.
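
For comparison, here is roughly what that window-partition approach looks like in SQL; a minimal sketch, assuming a hypothetical policy_logs table with time_slice and policy columns:

-- Hypothetical policy_logs table. Grouping yields one row per
-- (time_slice, policy); ROW_NUMBER() numbers the rows within each
-- timeslice, mirroring the "1 as rank | accum rank by _timeslice" trick,
-- and keeping row 1 drops the duplicates.
SELECT time_slice, policy, cnt
FROM (
  SELECT time_slice,
         policy,
         COUNT(*) AS cnt,
         ROW_NUMBER() OVER (PARTITION BY time_slice ORDER BY policy) AS rn
  FROM policy_logs
  GROUP BY time_slice, policy
) ranked
WHERE rn = 1;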

the-nick-wilson
    Not quite what I was looking for, but heading in the right direction. Made a few changes: | json field=_raw "Policy" | timeslice 1m | count by _timeslice, policy | 1 as rank | sort by _timeslice | accum rank by policy | where _accum = 1 | sort by _count – KingRogue Jan 09 '18 at 23:54
  • Nice! That makes sense now. As for performance, I'd just recommend making sure to use some metadata tags (like _sourceCategory) in your scope. Also, if you need to parse that JSON often, move it to a field extraction rule. https://help.sumologic.com/Search/Get-Started-with-Search/How-to-Build-a-Search/Best-Practices%3A-7-Search-Rules-to-Live-By – the-nick-wilson Jan 11 '18 at 20:32
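
Putting the thread together, the final query from the comment above with a metadata-scoped search as recommended; the _sourceCategory value here is a placeholder for your own source:

_sourceCategory=your/source/category
| json field=_raw "Policy"
| timeslice 1m
| count by _timeslice, policy
| 1 as rank
| sort by _timeslice
| accum rank by policy
| where _accum = 1
| sort by _count

Since Sumo Logic's sort is descending by default, sort by _timeslice puts the newest slice first, so where _accum = 1 keeps only the most recent timeslice for each policy.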