
I am trying to find duplicated log entries, for example:

"[entry] shipmentOrder 1234" 
"[entry] shipmentOrder 1234"
"[entry] shipmentOrder 1235"
"[entry] shipmentOrder 1236"

In that case I would like to find all the log entries that appear twice (1234 in this example). Since 1234 is a random number, I did not find any way to achieve that. Has anyone had a similar need?

Thanks for your help!

Jonathan.

  • Your question lacks specificity and is necessarily (we don't have your logs) challenging to reproduce. However, Google's [Logging Query Language](https://cloud.google.com/logging/docs/view/logging-query-language) is well-documented and supports [regular expressions](https://cloud.google.com/logging/docs/view/logging-query-language#regular-expressions), which you'll need to filter the results. The unit of the Logging service is a log entry, so you can't use the query language to find duplicate log entries, but you could post-process filtered results (using your preferred tools) to achieve it. – DazWilkin Nov 02 '22 at 17:25
  • Thanks a lot for the answer @DazWilkin! I just updated the question; I hope that makes it clearer. After taking a look at your link, I did not find anything related to my case. – Jonathan Chevalier Nov 03 '22 at 12:39
  • You don't explain the structure of the logs, so it's not possible to provide much of an example. You'll want to use a regular expression that matches the strings that you show, i.e. `\[entry\]\sshipmentOrder\s[0-9]{4}`. You'll then need to use another tool to sort and extract unique values. An **unnecessarily** lax example is `gcloud logging read "logName=\"${LOG_NAME}\" ${FIELD}=~\"\[entry\]\sshipmentOrder\s[0-9]{4}\"" --project=${PROJECT} --limit=500 --format="value(${FIELD})" | sort | uniq -c` (expanded into a runnable sketch below). Good luck! – DazWilkin Nov 03 '22 at 17:36
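
Building on the pipeline in DazWilkin's last comment, here is a minimal shell sketch that keeps only the values occurring more than once. The `PROJECT`, `LOG_NAME`, and `FIELD` values are placeholder assumptions, not taken from the question; adjust them to your environment:

```bash
#!/usr/bin/env bash
PROJECT="my-project"                         # assumption: your GCP project ID
LOG_NAME="projects/${PROJECT}/logs/my-log"   # assumption: the log containing the entries
FIELD="textPayload"                          # assumption: the field holding the message text

# Read matching entries, count identical values, and keep only duplicates.
# uniq -c only counts adjacent lines, so sort first; awk then keeps lines
# whose occurrence count (the first column printed by uniq -c) exceeds 1.
gcloud logging read \
  "logName=\"${LOG_NAME}\" ${FIELD}=~\"\[entry\]\sshipmentOrder\s[0-9]{4}\"" \
  --project="${PROJECT}" \
  --limit=500 \
  --format="value(${FIELD})" \
  | sort \
  | uniq -c \
  | awk '$1 > 1'
```

For the sample in the question, this should print something like `2 [entry] shipmentOrder 1234`, omitting 1235 and 1236, which occur only once.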

0 Answers