Background
I have an extremely large Data Table that takes up to 12 hours to run for around 1 million input scenarios on a high-end 64-bit machine. The scenarios are based on a number of discrete Excel models that are then fed into a financial model for detailed calculations.
To improve the process, I am looking to test and compare the speeds of:
1. The current manual process
2. Using VBA to refresh the Data Table, with `Calculation`, `ScreenUpdating` etc. switched off (a sketch follows this list)
3. Running a VBS script to refresh the Data Table in an invisible Excel instance (also sketched below)
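For reference, here is roughly what I am testing for (2). It is only a sketch: `Scenarios` is a placeholder sheet name, and it assumes a sheet-level `Calculate` call re-runs the Data Table on that sheet.

```vba
' Sketch of approach (2): recalculate the sheet hosting the Data Table
' with screen updating off and calculation set to manual.
' "Scenarios" is a placeholder sheet name.
Sub RefreshDataTableViaVBA()
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual

    ' Assumption: recalculating the host sheet re-runs its Data Table
    ThisWorkbook.Worksheets("Scenarios").Calculate

    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub
```

Approach (3) is the same calculation driven from a second, invisible Excel instance. I have sketched it in VBA with late binding for consistency; the VBS version is essentially the same code with the declarations removed. The workbook path and sheet name are placeholders.

```vba
' Sketch of approach (3): open the workbook in a separate, invisible
' Excel instance, recalculate the Data Table sheet there, then save.
' The path and sheet name are placeholders.
Sub RefreshDataTableViaSecondInstance()
    Dim xlApp As Object
    Dim wb As Object

    Set xlApp = CreateObject("Excel.Application")
    xlApp.Visible = False

    Set wb = xlApp.Workbooks.Open("C:\Models\ScenarioModel.xlsx")
    wb.Worksheets("Scenarios").Calculate

    wb.Close SaveChanges:=True
    xlApp.Quit

    Set wb = Nothing
    Set xlApp = Nothing
End Sub
```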
So, I am looking for the best approach to programmatically manage a Data Table.

Update: on testing a simple example (a workbook containing a single large Data Table), the code for (2) and (3) did not provide a measurable benefit.

Rather surprisingly, there seems to be very little - possibly no - direct support in VBA for Data Tables.
My current knowledge and literature search
- `QueryTable` `BeforeRefresh` and `AfterRefresh` events can be added with this class module code (a sketch of that pattern follows this list). Intellisense doesn't offer these events for Data Tables.
- Individual PivotTables and QueryTables can be accessed directly, e.g. `ActiveWorkbook.Sheets(1).QueryTables(1)`. Not so Data Tables.
- Eliminating all other Data Tables and then running a `RefreshAll` was suggested in this MrExcel thread as a workaround.
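For completeness, the class module pattern for `QueryTable` events referred to in the first bullet looks roughly like this (the class and variable names are mine); the point is that no equivalent `WithEvents` event source exists for a Data Table.

```vba
' Class module "clsQTEvents" (name is arbitrary): hooks the
' BeforeRefresh / AfterRefresh events of a QueryTable.
Public WithEvents qt As QueryTable

Private Sub qt_BeforeRefresh(Cancel As Boolean)
    Debug.Print "Refresh starting for " & qt.Name
End Sub

Private Sub qt_AfterRefresh(ByVal Success As Boolean)
    Debug.Print "Refresh finished. Success: " & Success
End Sub
```

```vba
' Standard module: wire the class up to the first QueryTable on Sheet 1.
Dim qtEvents As clsQTEvents

Sub HookQueryTableEvents()
    Set qtEvents = New clsQTEvents
    Set qtEvents.qt = ActiveWorkbook.Sheets(1).QueryTables(1)
End Sub
```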
The `RefreshAll` workaround is certainly doable as I only have a single Data Table, but I'd prefer a direct approach if one exists.
Yes, I'm sticking to Excel :)
Please do not suggest other tools for this approach. Both the input models and the overarching model that uses the Data Table:
- are part of a well-established, ongoing process that will stay Excel based,
- have been professionally audited,
- have been streamlined and optimised by some experienced Excel designers.
I was simply curious whether there is a way to tweak the process by refreshing a specific Data Table with code; my initial test results above suggest the answer is no.