
I've defined the input stream below. The DateTime string looks like 2010-09-01 06:59:00.000, the Result is a double like 157,382, and the UnixDateTime is a long like 1283324340111.

define stream HStream(ID int, DateTime String, Result double, UnixDateTime long);

I want to make length batches of 100 events that compute an average for the Result column, and I want to compare these batches with each other. I want to do this sliding comparison against the next 5 batches (each containing 100 events). So I want to compare the first batch (events 0-100) to the second batch (101-200), and so on up to the sixth batch (501-600). The second batch should then be compared up to the 7th batch, and so on. What I want to achieve with the comparison: when 4 or more of those 5 batches all have an average result that is more than 1 bigger, or all more than 1 smaller, than the average result of the original batch, then I want to log the info about the original batch.

My code is below. The problem is that I don't know the exact syntax. I've looked at the tutorials and documentation for WSO2 and Siddhi, but I cannot solve the problem.

@info(name = 'MovingAverageQuery')
from every e1=HStream, e2=HStream[e1.avg(Result) <= avg(Result))+, e2=HStream[e2[last].avg(Result) <= avg(Result)]
select ID, DateTime, Result, 
avg(Result), UnixDateTime
output last every 100 events
insert into OutputStream;

@sink(type='log', prefix='LOGGER')
define stream OutputStream(Nr ID, DateTime String, Result double, Avg double, UnixDateTime long);

1 Answer


You have to use two queries for this requirement: one to calculate the averages (Average100Query) and the other to compare the averages (IdentifyIncreaseingTrend).

@App:name("AverageSequence")
@App:description("Identify the average increase trend")

define stream HStream(ID int, DateTime String, Result double, UnixDateTime long);

@sink(type='log', prefix='LOGGER')
define stream OutputStream(ID int, DateTime String, avgResult double, UnixDateTime long);

@info(name = 'Average100Query')
from HStream#window.lengthBatch(100)
select ID, DateTime, avg(Result) as avgResult, UnixDateTime 
insert into AverageStream;

@info(name='IdentifyIncreaseingTrend')
from every e1=AverageStream,
     e2=AverageStream[e2.avgResult >= (e1.avgResult + 1)],
     e3=AverageStream[e3.avgResult >= (e2.avgResult + 1)],
     e4=AverageStream[e4.avgResult >= (e3.avgResult + 1)],
     e5=AverageStream[e5.avgResult >= (e4.avgResult + 1)]
select e1.ID, e1.DateTime, e1.avgResult, e1.UnixDateTime 
insert into OutputStream;

Some of the syntax issues I have noticed: after a calculation such as sum(Result), you have to use the as keyword to name that attribute, e.g. sum(Result) as totalResult. In sequences you can't use the average function, since it needs to be computed over multiple events, but you can use the renamed attribute, e.g. totalResult.
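To illustrate the aliasing point, here is a minimal sketch (TotalQuery and TotalStream are hypothetical names, not part of the answer above) showing an aggregation named with as so that downstream queries can reference it:

```
-- Without 'as', the aggregated value has no attribute name and
-- cannot be referenced by later queries. Naming it fixes that:
@info(name = 'TotalQuery')
from HStream#window.lengthBatch(100)
select ID, sum(Result) as totalResult
insert into TotalStream;
```

A later sequence query can then filter on totalResult directly, just as IdentifyIncreaseingTrend filters on avgResult.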

  • Thank you for your answer. I have implemented your code, but when I run the app I get this error: `AverageSequence.siddhi - Siddhi App AverageSequence is in faulty state.` – user7432713 Oct 17 '19 at 09:08
  • I tested it in Siddhi Tooling 5.1.0, https://siddhi.io/en/v5.1/docs/quick-start/, and it is running – Niveathika Oct 17 '19 at 10:13
  • The app may not have been deployed before it was started. Wait until you see the log `Siddhi App AverageSequence successfully deployed.` before starting the Siddhi app – Niveathika Oct 17 '19 at 10:14
  • I did something wrong; it was not an error in Siddhi. Thank you for the help. I have another question about the query you wrote. It only checks whether avgResult is 1 bigger; can I also make the same query check for 1 smaller, or do I have to write a separate query for that? – user7432713 Oct 17 '19 at 10:33
  • You have to add a separate query for that, since you need that check for all 5 consecutive events – Niveathika Oct 17 '19 at 11:31
  • Please feel free to reach the Siddhi team on Slack; it is much faster than Stack Overflow: https://siddhi.io/community/ – Niveathika Oct 17 '19 at 11:32
  • Ok, I will do that! – user7432713 Oct 17 '19 at 11:53
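Following up on the comments above, the separate query for the decreasing direction could be a mirror image of IdentifyIncreaseingTrend. This is a sketch, not from the answer itself: the query name IdentifyDecreasingTrend is hypothetical, and it assumes the same AverageStream and OutputStream definitions as the answer.

```
-- Sketch: flags a run where each of the next five batch averages is
-- at least 1 SMALLER than the previous one (mirror of the increasing case).
@info(name = 'IdentifyDecreasingTrend')
from every e1=AverageStream,
     e2=AverageStream[e2.avgResult <= (e1.avgResult - 1)],
     e3=AverageStream[e3.avgResult <= (e2.avgResult - 1)],
     e4=AverageStream[e4.avgResult <= (e3.avgResult - 1)],
     e5=AverageStream[e5.avgResult <= (e4.avgResult - 1)]
select e1.ID, e1.DateTime, e1.avgResult, e1.UnixDateTime
insert into OutputStream;
```

Both queries can insert into the same OutputStream, so one log sink covers both trends.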