
In a proposed project, the plan is to divide large data sets into chunks and expose them over SFTP to a LAN cluster of worker CPUs, which would process historical numerical and financial data for simulations. This is time-consuming.

What about using something like InfluxDB? I know a bit about sharding and scaling. Could we take advantage of how it distributes data across nodes and perform the processing on those nodes?
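For context, the chunking step described above can be sketched in a few lines. This is a minimal illustration, not a recommendation over InfluxDB or a job scheduler; the function name `split_csv` and the fixed rows-per-chunk scheme are assumptions, and the resulting files would still need to be served to the workers (e.g. over SFTP, as proposed):

```python
import csv
import itertools
from pathlib import Path

def split_csv(src_path, rows_per_chunk, out_dir):
    """Split a large CSV into fixed-size chunks, one file per worker task.

    Each chunk repeats the header row so workers can parse it independently.
    Returns the list of chunk file paths.
    """
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    chunk_paths = []
    with open(src_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        for i in itertools.count():
            # Pull the next batch of rows without loading the whole file.
            rows = list(itertools.islice(reader, rows_per_chunk))
            if not rows:
                break
            chunk_path = out_dir / f"chunk_{i:04d}.csv"
            with open(chunk_path, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)
                writer.writerows(rows)
            chunk_paths.append(chunk_path)
    return chunk_paths
```

The per-chunk header duplication is a deliberate trade-off: each file is self-describing, so a node can process its chunk without any shared state.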

ElHaix

1 Answer


Do you have any other requirements? Multiple tools can achieve this; there are several job schedulers that can distribute a workload (some are open source).

For instance:

  • What is your infrastructure size?
  • Do you have any security requirements?
  • Do you have any need for tight integration with other solutions?
  • etc.
XYZ123