I have a Multiobjective Particle Swarm Optimization (MOPSO) algorithm for a complex problem. It uses a large population (4,000 particles), and a single simulation is time-consuming (4-6 hours of execution).
The algorithm keeps an archive: a repository of the best solutions found so far. To analyze the algorithm's convergence and behavior, I need to save some data from this repository, and sometimes from the entire population, at each iteration.
Currently, at each iteration I copy some attributes from the particle objects (in the repository and/or the population) and format them into a StringBuffer in a method that runs on a separate thread from the simulation (this is Java); only at the end of the program execution do I save the buffer to a text file.
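To make the current approach concrete, here is a minimal sketch of what I do now (class, field, and file names are hypothetical; the real particle has more attributes):

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Hypothetical particle holding the attributes I log
class Particle {
    double[] objectives;
    Particle(double... objectives) { this.objectives = objectives; }
}

public class InMemoryLogger {
    // All iterations accumulate here until the run ends
    private final StringBuffer log = new StringBuffer();
    private final ExecutorService writer = Executors.newSingleThreadExecutor();

    // Called once per iteration; formatting happens off the simulation thread
    void logIteration(int iter, List<Particle> repository) {
        writer.submit(() -> {
            for (Particle p : repository) {
                log.append(iter);
                for (double obj : p.objectives) log.append('\t').append(obj);
                log.append('\n');
            }
        });
    }

    // Only at the very end of the run is anything flushed to disk
    void saveAndShutdown(String path) throws IOException, InterruptedException {
        writer.shutdown();
        writer.awaitTermination(1, TimeUnit.HOURS);
        try (PrintWriter out = new PrintWriter(new FileWriter(path))) {
            out.print(log);
        }
    }
}
```

So the whole run's worth of formatted text sits in one StringBuffer until the program ends, which is what makes me worry about memory.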
I think this approach is consuming memory badly. But I'm also concerned about performance, and I don't know the best way to persist all this data: should I keep the same logic but write to a .txt file at each iteration instead of at the end? Should I save to a database instead, and if so, at each iteration, at the end, or at some other point? Or is there a better approach altogether?
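For reference, the per-iteration alternative I have in mind would look roughly like this (again, names are hypothetical; this is just a sketch of appending to an open BufferedWriter instead of accumulating in memory):

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.util.List;

public class IncrementalLogger implements AutoCloseable {
    private final BufferedWriter out;

    IncrementalLogger(String path) throws IOException {
        this.out = new BufferedWriter(new FileWriter(path));
    }

    // Append this iteration's rows and flush, so memory use stays flat
    void logIteration(int iter, List<double[]> rows) throws IOException {
        for (double[] row : rows) {
            out.write(Integer.toString(iter));
            for (double v : row) {
                out.write('\t');
                out.write(Double.toString(v));
            }
            out.newLine();
        }
        out.flush(); // data survives even if the 4-6 hour run crashes
    }

    @Override public void close() throws IOException { out.close(); }
}
```

One side benefit of this variant is crash safety: if the simulation dies after three hours, the iterations written so far are already on disk.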
Edit: The repository data is usually in the 5-10 MB range, while the population data occupies 100-200 MB of memory. Each time I run the program, I need about 20 simulations to analyze average convergence.