
I have the following problem in StreamInsight. I have a query where new tasks from an order come in and trigger an output adapter to make a prediction. The output adapter writes the predicted task cycle time to a table (in Windows Azure). The prediction is based on neural networks and is plugged into the output adapter. After the prediction is written to the table, I want to do something else with all the predicted times. So in a second query I want to count the number of written tasks in a time window of 5 minutes. When the number of predicted values saved in the table equals the number of tasks in the order, I want to get all the predicted values from the table and make a prediction of the order cycle time.
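Roughly, the second query I have in mind would look something like this (just a sketch to illustrate the idea; predictionStream, OrderId and the payload are placeholder names, and the exact TumblingWindow overload depends on which StreamInsight API/version is used):

```csharp
// Count, per order, how many predicted task times arrived in a 5-minute window.
// predictionStream is assumed to be a stream of events raised after a predicted
// task cycle time has been written to the Azure table.
var predictedPerOrder =
    from p in predictionStream
    group p by p.OrderId into perOrder
    from win in perOrder.TumblingWindow(TimeSpan.FromMinutes(5))
    select new { OrderId = perOrder.Key, PredictedTasks = win.Count() };
```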

For this idea I need to generate a new event in my output adapter so I know the predicted time has been written to the table. But I don't think it's possible to enqueue new events into the StreamInsight server from an output adapter.

Maybe this figure makes the problem clear: http://i40.tinypic.com/4h4850.jpg

Hope someone can help me. Thanks, Carlo

Carlo
  • I just want to make sure I understand you correctly. You have a task input adapter that feeds a query whose output is dumped into SQL. A neural network prediction engine loads that data from SQL, performs some operation on it, and then writes it back to SQL. You then want to get those results from SQL and run them through another query? – TXPower275 May 14 '13 at 20:26
  • Almost correct: the query feeds the output adapter => the output adapter calls the neural network to predict the cycle time => gets the result back => then the output adapter writes it to Azure storage (not SQL, but comparable), and then I want to generate a new event to trigger the second query to read all predicted times from the table – Carlo May 14 '13 at 20:52

1 Answer


First off, I'm assuming you are using pre-2.1 StreamInsight based on your use of the term "output adapter".

From what you've posted, I would strongly recommend that your adapters do either input or output, but not both. This cuts down on complexity, makes the implementation easier, and, depending on how you wrote the adapter, gives you a reusable piece of code in your solution.

If you want to send data from StreamInsight to your neural network prediction engine, you will need to write an output adapter to do that. Then I would create an input adapter that gets the results from the neural network prediction engine and enqueues the data into StreamInsight. After creating your stream from the neural network prediction engine input adapter, you can use dynamic query composition to share the stream between a Windows Azure storage output adapter and your next query.

If your neural network prediction engine can "push" data to your input adapter, that would be the way to go. If not, you'll have to poll for results.
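To make the "push" idea concrete, here is a minimal sketch using plain Rx (nothing StreamInsight-specific yet; TaskPrediction and PredictionPublisher are names I'm making up for illustration) of how the prediction engine could push each result to whatever is feeding StreamInsight, instead of being polled:

```csharp
using System;
using System.Reactive.Subjects; // Rx (System.Reactive)

// Hypothetical payload describing one predicted task cycle time.
public class TaskPrediction
{
    public string OrderId { get; set; }
    public string TaskId { get; set; }
    public double PredictedCycleTime { get; set; }
}

// Thin wrapper around the prediction engine that pushes each result
// as soon as it is available, instead of being polled for it.
public class PredictionPublisher
{
    private readonly Subject<TaskPrediction> _subject = new Subject<TaskPrediction>();

    // Consumers (e.g. the StreamInsight input side) subscribe to this.
    public IObservable<TaskPrediction> Predictions
    {
        get { return _subject; }
    }

    // Call this from the neural network code whenever a prediction is finished
    // and has been written to the table.
    public void Publish(TaskPrediction prediction)
    {
        _subject.OnNext(prediction);
    }
}
```

Anything that subscribes to Predictions gets each new prediction the moment Publish is called; no polling loop is involved.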

There is a lot more to this, but it's difficult to drill into more specifics without more details.

Hope this helps.

TXPower275
  • Thanks for your answer, but I forgot to mention that there is also another query that updates the prediction table; this query measures the real cycle time of a task. The actual order prediction depends on tasks in the new order and pending tasks from previous orders that are still running. But I think the idea of letting my neural network push new data to an input adapter is the same as the output adapter pushing this data. I think this is a solution. How could I do this, because polling is not efficient? Do I need to work with an observable neural network class and an observer input adapter? – Carlo May 15 '13 at 15:34
  • And yes, I'm using StreamInsight version 2.1 – Carlo May 15 '13 at 15:37
  • If you are using StreamInsight 2.1 then I would suggest using the newer Rx sources and sinks model. It's easier to develop for. Remember that an Observable source is a "push" and an Enumerable source is a "pull". How you implement your code to communicate with your neural network prediction engine depends on the way it works and/or the APIs it provides to work with it. – TXPower275 May 15 '13 at 15:42
  • I would look at creating an Observable sink that allows StreamInsight to push data to your neural network prediction engine, and then an Observable source that the neural network prediction engine pushes to StreamInsight (a rough sketch of that wiring follows below the comments). – TXPower275 May 15 '13 at 15:44
  • If this response answers your question, please mark it as the answer. – TXPower275 May 15 '13 at 19:33
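A rough sketch of what that source/sink wiring could look like on StreamInsight 2.1 with an embedded server. It reuses the hypothetical PredictionPublisher / TaskPrediction types sketched above; the API names (DefineObservable, ToPointStreamable, DefineObserver, Bind/Run) are from the 2.1 Rx surface, but exact overloads and time-settings may differ in your setup, so treat this as a starting point rather than working code:

```csharp
using System;
using System.Reactive;                       // Observer.Create
using Microsoft.ComplexEventProcessing;
using Microsoft.ComplexEventProcessing.Linq;

// Result payload for the windowed count (hypothetical name).
public class OrderCount
{
    public string OrderId { get; set; }
    public long PredictedTasks { get; set; }
}

class Program
{
    static void Main()
    {
        using (Server server = Server.Create("Default"))
        {
            Application app = server.CreateApplication("PredictionApp");

            // The neural network prediction engine wrapped as an observable (push).
            var engine = new PredictionPublisher();

            // Observable source: each pushed prediction becomes a point event.
            var source = app
                .DefineObservable(() => engine.Predictions)
                .ToPointStreamable(
                    p => PointEvent.CreateInsert(DateTimeOffset.UtcNow, p),
                    AdvanceTimeSettings.IncreasingStartTime);

            // Query: count predictions per order over 5-minute tumbling windows.
            var counts =
                from p in source
                group p by p.OrderId into perOrder
                from win in perOrder.TumblingWindow(TimeSpan.FromMinutes(5))
                select new OrderCount
                {
                    OrderId = perOrder.Key,
                    PredictedTasks = win.Count()
                };

            // Observer sink: push results out of StreamInsight, e.g. to trigger
            // the order-level prediction once all task predictions are in.
            var sink = app.DefineObserver(() =>
                Observer.Create<OrderCount>(c =>
                    Console.WriteLine("{0}: {1} predicted tasks", c.OrderId, c.PredictedTasks)));

            using (counts.Bind(sink).Run("OrderPredictionProcess"))
            {
                Console.WriteLine("Running; press Enter to stop.");
                Console.ReadLine();
            }
        }
    }
}
```

The key point is the direction of the data flow: the engine pushes into the Observable source, StreamInsight runs the standing query, and the Observer sink pushes the results back out, so nothing has to poll the table.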