I use the following rules of thumb:
- Calculate all dimensional attributes in SQL; never implement them as DAX calculated columns.
- Calculate the most complex and expensive metrics in SQL. These are usually very few, maybe 2-5% of all calculations. Leave the rest to DAX. (Both rules are sketched in SQL right after this list.)
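For example, here is a minimal T-SQL sketch of both rules (all table, view, and column names are hypothetical): a derived AgeBand attribute lives in the dimension view, and one expensive Margin metric is pre-computed in the fact view, so DAX never has to materialize either.

```sql
-- Rule 1: derive dimensional attributes at load time,
-- not as DAX calculated columns.
CREATE VIEW dbo.DimCustomer AS
SELECT
    c.CustomerKey,
    c.BirthDate,
    CASE
        WHEN DATEDIFF(YEAR, c.BirthDate, GETDATE()) < 30 THEN 'Under 30'
        WHEN DATEDIFF(YEAR, c.BirthDate, GETDATE()) < 50 THEN '30-49'
        ELSE '50+'
    END AS AgeBand   -- attribute computed once, in SQL
FROM dbo.Customer AS c;

-- Rule 2: pre-compute the few expensive metrics that need
-- joins or lookups across tables.
CREATE VIEW dbo.FactSales AS
SELECT
    s.SalesKey,
    s.CustomerKey,
    s.Amount,
    s.Amount - s.Quantity * p.StandardCost AS Margin   -- costly join resolved at load time
FROM dbo.Sales AS s
JOIN dbo.Product AS p
    ON p.ProductKey = s.ProductKey;
```

Everything else (sums, ratios, time intelligence) stays in DAX measures.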
In the background, DAX is translated into engine-level queries: xmSQL against the in-memory VertiPaq store in Import mode, or actual SQL against the source in DirectQuery mode. You can inspect both, along with their execution plans, using DAX Studio (https://www.sqlbi.com/tools/dax-studio/). If the DAX is written correctly, the execution plans will be as efficient as the engine can make them. Of course, if your DAX is wrong, it may translate into very inefficient queries, but that should be expected.
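To make "wrong DAX" concrete, here is a sketch you can paste into DAX Studio (assuming a hypothetical Sales table with Amount and Quantity columns). Both measures return the same number, but the first forces the formula engine to iterate the table row by row, while the second pushes a simple column predicate down to the storage engine:

```dax
DEFINE
    -- Anti-pattern: FILTER over the whole table is evaluated row by row.
    MEASURE Sales[Big Sales Slow] =
        CALCULATE (
            SUM ( Sales[Amount] ),
            FILTER ( Sales, Sales[Quantity] > 10 )
        )
    -- Same result: the column predicate becomes an efficient
    -- storage-engine filter.
    MEASURE Sales[Big Sales Fast] =
        CALCULATE (
            SUM ( Sales[Amount] ),
            Sales[Quantity] > 10
        )
EVALUATE
    ROW ( "Slow", [Big Sales Slow], "Fast", [Big Sales Fast] )
```

Comparing the two in DAX Studio's Server Timings pane shows the difference in the generated queries.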
Finally, just to set expectations correctly: a Tabular model is considered "large" if it exceeds 1 billion records. Models between 100M and 1B records are medium-sized and might need some optimization (basically, you must know what you are doing). Anything under 100M records is a small model; if you have performance issues there, either your data model is wrong or your DAX is incorrect/inefficient.