
I am trying to create an "instant cloud flow" on Power Automate.

Context:

  • Excel file "A" contains a table "X" which gets updated regularly with new rows at the bottom containing new information.

  • Excel file "B" contains a table "Y" with the same characteristics (number of columns, headers, etc.), except for the number of rows, since table "X" is updated regularly with new rows of data.

  • Both files are stored on OneDrive and may possibly move into SharePoint file storage, so they will be in the cloud, not stored locally on any device.

Challenge/need:

  • I need table "Y", within file "B", to mirror the changes happening in table "X" of file "A", specifically the new rows of data being added to table "X":

Internet/world > new rows of data arrive at the bottom of table "X" in file "A" > these same new rows get copied to the bottom of table "Y" in file "B". Basically, both tables "X" and "Y" need to stay exactly the same, with a maximum lag of 3 minutes.

Solution tried:

  • I tried a flow that gets triggered every minute. In this flow, I tried creating an array containing the new rows of data added to table "X". Then, using the Apply to each control with the values from this new array, I ran the actions Add a row into a table, followed by Update a row, for each item inside this array, keeping table "Y" updated as per table "X" this way. This part works; rows are added and updated in table "Y".
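In outline, the flow looks roughly like this (Excel Online connector action names; the step that builds the array of new rows is the part that misbehaves, described under "My problem" below):

```
Recurrence                       – every 1 minute
List rows present in a table     – file "A", table "X"
(build an array of the rows considered new)
Apply to each                    – over that array
    Add a row into a table       – file "B", table "Y"
    Update a row                 – file "B", table "Y"
```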

My problem:

  • The Condition that compares the data from the two tables decides that all rows from table "X" are new data, even though some are already present in table "Y". This is a problem because too many rows are added to table "Y" and the tables become out of sync due to the difference in the number of rows/body length. In my understanding, this happens because List rows present in a table generates an item/object property called ItemInternalId.

This ItemInternalId generates different id values for the same rows already updated previously, and because of this, the condition identifies all rows in table "X" as new data to be updated in table "Y".
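To illustrate, a single row returned by List rows present in a table looks roughly like this (the column names Date and Amount are just placeholders for my real headers). The GUID in ItemInternalId is different on every run, even when the row itself has not changed, so a straight comparison of whole row objects never matches:

```json
{
  "@odata.etag": "",
  "ItemInternalId": "3f2b6c1a-9d7e-4c55-8a41-000000000000",
  "Date": "2022-10-10",
  "Amount": "150"
}
```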


Questions:

  • Could someone confirm that this ItemInternalId is the problem here? I am in doubt because I tried removing it by creating another array using the Select action and then proceeded using just the columns/headers I need, excluding ItemInternalId that way. The problem is that the "headers" are excluded as well (which I need), leaving only the values, and the condition still identifies all rows in "X" as new data again anyway...

  • Maybe the problem is that I am doing it wrong and there is another, simpler or better way to get an array with the new items from table "X" (sketched after the screenshot below)? Here is the condition that I use to try to feed a new array with the new rows from table "X":

[screenshot of the condition]
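To make the second question concrete, here is a sketch of the kind of approach I mean (not something I have working), assuming each row has a column that uniquely identifies it; the action names Select_X, Select_Y_keys, Filter_new_rows and the columns ID, Date, Amount are just placeholders. The idea is to use the Select action in key/value mode (so the headers are kept and ItemInternalId is simply not mapped), and then compare on the key column with a Filter array instead of comparing whole row objects:

```
Select_X            – key/value mode: keeps the headers, leaves out ItemInternalId
  From: @body('List_rows_present_in_a_table_X')?['value']
  Map:
    ID:     @item()?['ID']
    Date:   @item()?['Date']
    Amount: @item()?['Amount']

Select_Y_keys       – text mode: a plain array of the key values already in "Y"
  From: @body('List_rows_present_in_a_table_Y')?['value']
  Map:  @item()?['ID']

Filter_new_rows     – Filter array: keeps only rows of "X" whose key is not yet in "Y"
  From: @body('Select_X')
  Condition (advanced mode):
    @not(contains(body('Select_Y_keys'), item()?['ID']))
```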

Thank you

Verminous
  • Out of interest: is there a reason why you are not simply using the Copy file action and making a copy of the file every three minutes? – Expiscornovus Oct 10 '22 at 09:10
  • It's an access permissions issue. File "A" will be accessed by a large number of users when they report progress. File "B" is accessed only by team controllers. File "B" contains sensitive data on other tabs that cannot be accessed by other users. – Verminous Oct 10 '22 at 11:34

1 Answer


I found a workaround. I will not accept this as the right answer because it is just a workaround, not the definitive solution to the problem.

Basically, file "A" needs to have table "X" with just 1 blank row. The Power Automate flow will add new rows with the information to this table.

Then, in file "B", table "Y" will need to be created with a certain number of rows, depending on how much data comes in per day, but it can be something like 100. Then create a Power Automate flow that updates the table; this will add the information from table "X" to table "Y".

Please be aware that you will need a Key Column on both tables so that Power Automate knows which rows to update. You can just use basic numerical order for each row in the Key Column.
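For example, with a numeric ID key column on both tables, the update part of the flow looks roughly like this (the loop, action and column names are just placeholders for your own):

```
Apply to each        – over @body('List_rows_present_in_a_table_X')?['value']
    Update a row     – file "B", table "Y"
      Key Column: ID
      Key Value:  @items('Apply_to_each')?['ID']
      Date:       @items('Apply_to_each')?['Date']
      Amount:     @items('Apply_to_each')?['Amount']
```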

Verminous