I am using a PredictionIO engine to train, build, and deploy a model as a service on a sample dataset with default parameters.
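For reference, this is roughly the workflow I run today, shown here as a small Python wrapper around the standard `pio` CLI commands (in practice I run them from the shell; the sketch assumes `pio` is on the PATH and is executed from the engine template directory):

```python
# Sketch of my current build/train/deploy cycle via the pio CLI.
import subprocess

def build_train_deploy():
    # Compile the engine template.
    subprocess.run(["pio", "build"], check=True)
    # Read all events currently in the event server and train a model.
    subprocess.run(["pio", "train"], check=True)
    # Serve the trained model over REST (default port 8000).
    # This call keeps running for as long as the engine stays deployed.
    subprocess.run(["pio", "deploy"], check=True)

if __name__ == "__main__":
    build_train_deploy()
```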
Now I have a scenario where the dataset grows larger day by day (incrementally), and I want to understand how PredictionIO detects the new data, retrains on it, and redeploys the model as a service accordingly.
Since PredictionIO stores data in its event server and can process it in a distributed fashion using Apache Spark MLlib, how does it detect that new data is available in the dataset?
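To make the scenario concrete, new records keep arriving at the event server one event at a time through its REST API. Below is a minimal sketch of that ingestion, assuming the PredictionIO Python SDK (`pip install predictionio`); the access key, entity types, and properties are placeholders for my real data:

```python
# Sketch of how new records are appended to the event server over time.
import predictionio

client = predictionio.EventClient(
    access_key="YOUR_APP_ACCESS_KEY",   # placeholder for my app's access key
    url="http://localhost:7070",        # default event server URL
)

# Each new record is appended as an event; the event server just stores it.
client.create_event(
    event="rate",
    entity_type="user",
    entity_id="u123",
    target_entity_type="item",
    target_entity_id="i456",
    properties={"rating": 4.0},
)
```

My question is whether PredictionIO itself notices these newly stored events and retrains/redeploys, or whether that only happens when the train and deploy steps are run again.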
Thanks in advance for your help.