We have an image processing workflow product. A typical job runs 10,000 to 100,000 images through our processing pipeline, and more than one job may be pending at a time.
Currently, all the image processing is performed by our home-grown imaging library, a managed C++ library that is .NET compatible. It runs in the user's application space: if you log on as "PeteSmith", the images are processed under Pete Smith's account.
Today we allow only one instance of this image processing at a time. Customers are asking for a new version that can run more than one instance concurrently, so we are now examining how to do that.
The idea of getting processing off the user's account and doing it in the background under a system account is appealing, because Windows services are naturally managed by OS events such as logon, logoff, and other system resource notifications.
It appears to me that all we would need to do is handle a small number of well-defined events, which Microsoft documents well (see the sketch below). That is all well and good. But what I need to understand is what moving our image processing code to a service implementation means for performance, from our customers' point of view. In their view, they need more images processed, faster.
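For concreteness, here is a minimal sketch of that "small number of well-defined events" in a plain Win32 service written in C++ (our library's language). The service name "ImageJobService" and the ProcessNextImage call are hypothetical placeholders, not our actual implementation; the real work loop would call into our imaging library.

```cpp
#include <windows.h>
// Link with Advapi32.lib.

static SERVICE_STATUS_HANDLE g_statusHandle = nullptr;
static SERVICE_STATUS        g_status       = {};
static HANDLE                g_stopEvent    = nullptr;

static void ReportStatus(DWORD state)
{
    g_status.dwServiceType      = SERVICE_WIN32_OWN_PROCESS;
    g_status.dwCurrentState     = state;
    g_status.dwControlsAccepted =
        (state == SERVICE_RUNNING) ? (SERVICE_ACCEPT_STOP | SERVICE_ACCEPT_SHUTDOWN) : 0;
    SetServiceStatus(g_statusHandle, &g_status);
}

// The well-defined events: the Service Control Manager sends these controls.
static DWORD WINAPI CtrlHandler(DWORD control, DWORD, LPVOID, LPVOID)
{
    switch (control)
    {
    case SERVICE_CONTROL_STOP:
    case SERVICE_CONTROL_SHUTDOWN:
        ReportStatus(SERVICE_STOP_PENDING);
        SetEvent(g_stopEvent);            // ask the processing loop to wind down
        return NO_ERROR;
    case SERVICE_CONTROL_INTERROGATE:
        return NO_ERROR;
    default:
        return ERROR_CALL_NOT_IMPLEMENTED;
    }
}

static void WINAPI ServiceMain(DWORD, LPWSTR*)
{
    g_statusHandle = RegisterServiceCtrlHandlerExW(L"ImageJobService", CtrlHandler, nullptr);
    g_stopEvent    = CreateEventW(nullptr, TRUE, FALSE, nullptr);
    ReportStatus(SERVICE_RUNNING);

    // Placeholder for the real work: pull images from the pending job and
    // process them until the SCM asks us to stop.
    while (WaitForSingleObject(g_stopEvent, 0) == WAIT_TIMEOUT)
    {
        // ProcessNextImage();  // hypothetical call into the imaging library
        Sleep(100);
    }
    ReportStatus(SERVICE_STOPPED);
}

int wmain()
{
    SERVICE_TABLE_ENTRYW table[] = {
        { const_cast<LPWSTR>(L"ImageJobService"), ServiceMain },
        { nullptr, nullptr }
    };
    // Hands this process over to the Service Control Manager.
    StartServiceCtrlDispatcherW(table);
    return 0;
}
```

The service surface really is that small: the SCM drives everything through ServiceMain and the handful of controls the handler accepts, which is what makes the service model attractive for long-running background jobs.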
QUESTION: How should I think about the trade-offs between:
1) Using a service to run a job, versus running N different "instances" of the software under Pete Smith's (the user's) account?
2) Allowing N services to run N different jobs (no cross-talk needed), versus running N different "instances" of the software under Pete Smith's (the user's) account? (A sketch of that per-user, multi-process baseline follows this list.)
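To make the baseline in both questions concrete, here is a minimal sketch of fanning a job out to N worker processes under the interactive user's account. The executable name "ImageProcessor.exe" and the --shard argument are made up for illustration; a service-based design would perform essentially the same fan-out, just under a system account and decoupled from logon/logoff.

```cpp
#include <windows.h>
#include <string>
#include <vector>

int wmain()
{
    const int N = 4;  // e.g., one worker per core
    std::vector<HANDLE> workers;

    for (int i = 0; i < N; ++i)
    {
        // Each worker takes a slice ("shard") of the pending job.
        std::wstring cmd = L"ImageProcessor.exe --shard " + std::to_wstring(i);
        STARTUPINFOW si = { sizeof(si) };
        PROCESS_INFORMATION pi = {};

        // Each child inherits the caller's token: same credentials and
        // per-user resources as the interactive application has today.
        if (CreateProcessW(nullptr, &cmd[0], nullptr, nullptr, FALSE,
                           0, nullptr, nullptr, &si, &pi))
        {
            workers.push_back(pi.hProcess);
            CloseHandle(pi.hThread);
        }
    }

    // Wait for every worker to finish its shard of the job.
    WaitForMultipleObjects(static_cast<DWORD>(workers.size()),
                           workers.data(), TRUE, INFINITE);
    for (HANDLE h : workers) CloseHandle(h);
    return 0;
}
```

Note that in this per-user form, the workers die with the session (logoff kills them), which is exactly the coupling a service implementation would remove.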