
I have a table in an Oracle 11gR2 database with the following structure: transactionId, timestamp, logpoint, server, client. For example:


transactionId   timestamp           logpoint            server      client
tran-1          22/07/2019 13:21:52 Final.Response      Myserver1   
tran-1          22/07/2019 13:21:52 BackEnd.Response    Myserver1   
tran-1          22/07/2019 13:21:46 BackEnd.Request     Myserver1   
tran-1          22/07/2019 13:21:46 Initial.Request     Myserver1   clientB
tran-2          22/07/2019 13:21:57 Final.Response      Myserver2   
tran-2          22/07/2019 13:21:57 BackEnd.Response    Myserver2   
tran-2          22/07/2019 13:21:53 BackEnd.Request     Myserver2   
tran-2          22/07/2019 13:21:53 Initial.Request     Myserver2   clientB
tran-3          22/07/2019 13:22:14 Final.Response      Myserver1   
tran-3          22/07/2019 13:22:14 BackEnd.Response    Myserver1   
tran-3          22/07/2019 13:22:00 BackEnd.Request     Myserver1   
tran-3          22/07/2019 13:22:00 Initial.Request     Myserver1   clientA

The table has millions of rows and I want to run the following stats on it: max processing time, min processing time, median processing time, average processing time, 90th percentile processing time and 99th percentile processing time. I want to do this for both the end-to-end and the backend component, i.e. (Final.Response timestamp - Initial.Request timestamp) and (BackEnd.Response timestamp - BackEnd.Request timestamp).
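
To make the calculation concrete, this is roughly what I am after, expressed as a SQL sketch. The table name transaction_log is an assumption, the column names are taken from the sample above (the timestamp column may need renaming or quoting), and I am assuming timestamp is a DATE column so that subtracting two values gives a number of days:

-- Collapse each transactionId to one row holding its four timestamps,
-- derive the two durations in seconds, then aggregate over all transactions.
WITH pivoted AS (
  SELECT transactionId,
         MAX(CASE WHEN logpoint = 'Initial.Request'  THEN timestamp END) AS initial_request,
         MAX(CASE WHEN logpoint = 'Final.Response'   THEN timestamp END) AS final_response,
         MAX(CASE WHEN logpoint = 'BackEnd.Request'  THEN timestamp END) AS backend_request,
         MAX(CASE WHEN logpoint = 'BackEnd.Response' THEN timestamp END) AS backend_response
  FROM   transaction_log           -- assumed table name
  GROUP BY transactionId
),
durations AS (
  SELECT transactionId,
         (final_response  - initial_request) * 86400 AS end_to_end_secs,  -- DATE subtraction gives days
         (backend_response - backend_request) * 86400 AS backend_secs
  FROM   pivoted
)
SELECT MIN(end_to_end_secs)                                              AS min_e2e,
       MAX(end_to_end_secs)                                              AS max_e2e,
       AVG(end_to_end_secs)                                              AS avg_e2e,
       MEDIAN(end_to_end_secs)                                           AS median_e2e,
       PERCENTILE_CONT(0.90) WITHIN GROUP (ORDER BY end_to_end_secs)     AS p90_e2e,
       PERCENTILE_CONT(0.99) WITHIN GROUP (ORDER BY end_to_end_secs)     AS p99_e2e,
       MIN(backend_secs)                                                 AS min_backend,
       MAX(backend_secs)                                                 AS max_backend,
       AVG(backend_secs)                                                 AS avg_backend,
       MEDIAN(backend_secs)                                              AS median_backend,
       PERCENTILE_CONT(0.90) WITHIN GROUP (ORDER BY backend_secs)        AS p90_backend,
       PERCENTILE_CONT(0.99) WITHIN GROUP (ORDER BY backend_secs)        AS p99_backend
FROM   durations;

The idea is that each transaction becomes a single row with its four timestamps, the two durations are computed in seconds, and the stats are taken over all transactions in one pass.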

The only way I have found to do this is to export to CSV, split the files into roughly one million rows per file (the Excel limit of rows per sheet) and import them into Excel across multiple sheets until the Excel file gets too large, then start a new Excel file with several more sheets. I then keep a separate spreadsheet with a summary of each sheet in each Excel file and take an average of the averages or a median of the medians, which is less than ideal, and the whole process takes about a day.
