I'm trying to build a model framework in Python for an engineering application. The logic is as follows:
Read inputs from user (Operating conditions, operating fluid)
Enter guesses for fluid flow rate, pressure, and two more variables (X1, X2, X3, X4)
Compute other properties of the inlet fluid (Y1) using the guessed pressure
Use Y1 and X2 to compute another required variable (Y2). Calculate the error between Y2 and the user-provided value
Use the estimated Y2 and X3 to calculate Y3. Check whether Y3 = 0; if not, calculate the error
Use the calculated Y3 to run further calculations. Use Y3 and X4 to calculate Y4. Calculate the error between Y4 and the user-provided value
Calculate the root-mean-square (RMS) of the four errors calculated above.
Use a solver to minimize the RMS, with X1, X2, X3, X4 as the decision variables and the appropriate constraints.
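Here's a stripped-down sketch of the structure. The property functions are placeholders I've put in for illustration (the real ones call my fluid-property routines), the targets are toy numbers, and I've used a consistency check between the guessed and recomputed pressure as a stand-in for the fourth error:

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder physics -- stand-ins for the real property calculations.
def compute_y1(x1):            # inlet properties from guessed pressure
    return 2.0 * x1

def compute_y2(y1, x2):
    return y1 + 0.5 * x2

def compute_y3(y2, x3):
    return y2 - x3

def compute_y4(y3, x4):
    return y3 + x4

Y2_TARGET = 10.0               # user-provided value (toy number)
Y4_TARGET = 5.0                # user-provided value (toy number)

def rms_error(x):
    x1, x2, x3, x4 = x
    y1 = compute_y1(x1)
    y2 = compute_y2(y1, x2)
    y3 = compute_y3(y2, x3)
    y4 = compute_y4(y3, x4)
    errors = np.array([
        y1 - x1,               # stand-in for the 4th error (consistency check)
        y2 - Y2_TARGET,        # Y2 vs user-provided value
        y3,                    # Y3 should be zero
        y4 - Y4_TARGET,        # Y4 vs user-provided value
    ])
    return np.sqrt(np.mean(errors ** 2))

x0 = np.array([1.0, 1.0, 1.0, 1.0])   # initial guesses for X1..X4
bounds = [(0.0, 100.0)] * 4           # example bounds
result = minimize(rms_error, x0, method="SLSQP", bounds=bounds)
print(result.fun, result.x)
```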
I set this model up in Excel using VBA and it works very well. It gives me the expected results, but the GRG solver is painfully slow.
I have the exact same routine in Python (I stepped through and checked the values at each step in both VBA and Python — the values are pretty close). When I try to minimize the function using the global optimization routines in scipy, or the SLSQP method in minimize, the program doesn't give me a converged solution (the final RMS is very high).
scipy.optimize.root seems to work, but it's a bit patchy: it's very dependent on the initial guess, and there's no way to impose bounds on the variables, which makes me uncomfortable. Is there a good way to troubleshoot this issue?
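For reference, the root-based version I tried looks roughly like this (same placeholder physics and toy targets as in the sketch of the model, not my real property calls):

```python
import numpy as np
from scipy.optimize import root

# Placeholder physics -- the real model calls my property routines.
def residuals(x):
    x1, x2, x3, x4 = x
    y1 = 2.0 * x1           # inlet properties from guessed pressure
    y2 = y1 + 0.5 * x2
    y3 = y2 - x3
    y4 = y3 + x4
    return [y1 - x1,        # stand-in consistency check
            y2 - 10.0,      # Y2 vs user-provided value
            y3,             # Y3 should be zero
            y4 - 5.0]       # Y4 vs user-provided value

# hybr takes no bounds argument, hence my discomfort.
sol = root(residuals, [1.0, 1.0, 1.0, 1.0], method="hybr")
print(sol.success, sol.x)
```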
I've tried calculating the errors two ways: as a squared difference, (Y - target)**2, and as a squared dimensionless quantity, (Y/target - 1)**2. Neither approach helps.
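Concretely, with toy numbers, the two formulations are:

```python
# Toy values standing in for one calculated variable and its target.
y2_calc, y2_target = 9.8, 10.0

err_abs = (y2_calc - y2_target) ** 2        # squared absolute difference
err_rel = (y2_calc / y2_target - 1) ** 2    # squared relative (dimensionless) difference

# The relative form keeps the residuals on comparable scales when the
# targets differ by orders of magnitude; the absolute form does not.
print(err_abs, err_rel)
```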