My senior developer and I observed the SSIS Lookup Transformation sending a seemingly random number of rows. For example: 150 rows are read from the CSV, but only 90 (or sometimes 135) rows reach the database.
For testing purposes we are only dealing with 150 rows, but in production we estimate between 1,000 and 10,000 rows.
We experimented with the Lookup Transform's cache mode setting: with No Cache and Partial Cache the row count came up short, but with Full Cache all 150 rows were transferred to the database, matching the 150 rows sent as input to the Lookup Transform (the expected result). We also noticed that computer B, which has higher specs than computer A (the machine showing the problem), produced the expected results consistently in every cache mode.
Can anyone advise on this issue?
Recently we noticed that this issue only occurs with the originally generated CSV; after opening the file in Excel and re-saving it, the results were fine.
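We have not confirmed the root cause, but since re-saving in Excel fixed the file, one guess is that the original CSV contains hidden artifacts (a byte-order mark, trailing whitespace in key fields, or mixed line endings) that make lookup keys fail to match exactly. Below is a minimal sketch of how such artifacts could be detected; the `find_hidden_characters` helper and the sample bytes are hypothetical, not part of our actual pipeline.

```python
def find_hidden_characters(raw_bytes):
    """Report artifacts in raw CSV bytes that can make lookup keys
    fail exact-match comparisons: a UTF-8 BOM, leading/trailing
    whitespace in fields, and mixed line endings."""
    issues = []
    if raw_bytes.startswith(b"\xef\xbb\xbf"):
        issues.append("UTF-8 BOM at start of file")
    # If LF characters remain after removing every CRLF pair,
    # the file mixes Windows and Unix line endings.
    if b"\r\n" in raw_bytes and b"\n" in raw_bytes.replace(b"\r\n", b""):
        issues.append("mixed line endings (CRLF and LF)")
    text = raw_bytes.decode("utf-8-sig")  # decode, dropping any BOM
    for lineno, line in enumerate(text.splitlines(), start=1):
        for col, field in enumerate(line.split(","), start=1):
            if field != field.strip():
                issues.append(
                    f"line {lineno}, column {col}: "
                    f"field has leading/trailing whitespace"
                )
    return issues

# Hypothetical sample: a BOM plus a trailing space after "ID2" --
# the kind of artifact Excel silently strips when re-saving.
sample = b"\xef\xbb\xbfkey,value\r\nID1,10\r\nID2 ,20\r\n"
for issue in find_hidden_characters(sample):
    print(issue)
```

If a check like this flags the originally generated CSV but not the Excel re-save, that would point to the file generator rather than the Lookup Transform itself.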