
I have a single instance of SQL Server 2008 Analysis Services running on Windows Server 2008 x64, with 8 GB of RAM on the server. All of the memory settings are left at their defaults.

When the SSAS service starts, it behaves normally. However, whenever a query is run against the instance, the SSAS service will steadily consume all of the memory on the server, to the point that the server needs to be rebooted. At present, the TotalMemoryLimit and LowMemoryLimit settings are at their defaults of 80% and 75% respectively, and HardMemoryLimit is at its default of 0.

Why is SSAS using all of the memory on the server, despite the limit settings? Does HardMemoryLimit need to be set?

Brian Knight

2 Answers


From the TechNet article SQL Server 2005 Analysis Services (SSAS) Server Properties:

SSAS has a special memory "cleaner" background thread that constantly determines whether it needs to clean up memory. The cleaner looks at the amount of memory used. The following basic processes are used by the cleaner to control the amount of physical memory used by Analysis Server:

If the memory used is above the value set in the TotalMemoryLimit property, the cleaner cleans up to this value.

If the memory used is under the value set in the LowMemoryLimit property, the cleaner does nothing.

If the memory used is between the values set in the LowMemoryLimit and TotalMemoryLimit properties, the cleaner cleans memory on an as-needed basis.

If the value specified in any of these properties is between 0 and 100, the value is treated by SSAS as a percentage of total physical memory. If the value specified is greater than 100, the value is treated by SSAS as an absolute memory value (in bytes). Note that when Analysis Server is processing, if it requires additional memory above the value specified in TotalMemoryLimit, it will try to reserve that amount, regardless of the TotalMemoryLimit value.

Note that TotalMemoryLimit does not define a hard limit on the amount of memory that Analysis Services uses. Rather, it is a soft limit that is used to identify situations where the server is experiencing memory pressure. For some operations, such as processing, if Analysis Services requires additional memory beyond the value of Memory\TotalMemoryLimit, the Analysis Services server attempts to reserve that memory regardless of the value of the property.

In short, TotalMemoryLimit can be exceeded during heavy loads.
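To make the threshold arithmetic concrete, here is a minimal Python sketch of that cleaner logic under the documented rules: values of 0-100 are read as a percentage of physical RAM, larger values as absolute bytes, and the cleaner picks one of three behaviors depending on where current usage falls. The function names are mine, for illustration only:

    def effective_limit(setting, total_physical_bytes):
        """Interpret an SSAS memory property: 0-100 is a percentage of
        total physical memory; anything larger is an absolute byte count."""
        if 0 <= setting <= 100:
            return total_physical_bytes * setting / 100
        return setting

    def cleaner_action(memory_used, low_limit, total_limit):
        """Which of the three documented cleaner behaviors applies."""
        if memory_used < low_limit:
            return "below LowMemoryLimit: cleaner does nothing"
        if memory_used > total_limit:
            return "above TotalMemoryLimit: cleaner cleans down to it"
        return "between the limits: cleaner cleans on an as-needed basis"

    # The questioner's 8 GB server with LowMemoryLimit=75, TotalMemoryLimit=80:
    ram = 8 * 1024**3
    low = effective_limit(75, ram)    # ~6.0 GB
    total = effective_limit(80, ram)  # ~6.4 GB
    print(cleaner_action(7 * 1024**3, low, total))  # 7 GB used -> cleans down

Note this models the cleaner only; as the quotes above say, processing can reserve memory past TotalMemoryLimit regardless.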

From the SSAS Performance Guide (SSASPerfGuide):

[W]hen processing multiple partitions in parallel or processing an entire cube in a single transaction, you must ensure that the total memory required does not exceed the value of the Memory\TotalMemoryLimit setting. If Analysis Services reaches the Memory\TotalMemoryLimit during processing, it does not allow the aggregation buffer to grow and may cause temporary files to be used during aggregation processing.

Also, most people run SSAS on the same box as the SQL Server relational engine. Remember to reserve 1.5-2 GB for the OS and light services, then add SQL Server's minimum memory (1 GB? 2?) plus about 0.5 GB for SQL Server overhead (the memory setting sizes the buffer cache, not all of the memory SQL Server uses). Now set the memory limit percentage based on that: 100 - (100 * (os + sql) / total).

If you have 4 GB used for SQL Server and the OS, then set TotalMemoryLimit to 50 (100 - (100 * 4 / 8)). Be aware of the temp-file issue... if SSAS grabbed all the memory on your box, then full-cube processing may really want that much working space!
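As a sanity check on that arithmetic, a short sketch (the function and variable names are mine, and the 0.5 GB overhead default just encodes the rule of thumb above):

    def ssas_total_memory_limit_pct(total_gb, os_gb, sql_gb, sql_overhead_gb=0.5):
        """Percentage of physical RAM left for SSAS after reserving memory
        for the OS and a co-located SQL Server instance."""
        reserved_gb = os_gb + sql_gb + sql_overhead_gb
        return 100 - (100 * reserved_gb / total_gb)

    # The example above: 8 GB box, ~4 GB spoken for by the OS and SQL Server.
    print(ssas_total_memory_limit_pct(total_gb=8, os_gb=2, sql_gb=1.5))  # 50.0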

Mark

I would check the perfmon counters to confirm what exactly is using the memory; it isn't clear whether you are using Task Manager to identify the usage. Analysis Services should peak out at the total memory limit. Do the queries continue to run? Why do you have to reboot? Is it to kill the ongoing queries? I would check the cleaner counters to see if the issue is there. Ultimately you may just need to add more memory for the queries you are running, or take a look at your cube designs.
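For example, you could sample the cleaner and limit counters from a script like this (a sketch only; typeperf is the built-in Windows counter tool, the object name is "MSAS 2008:Memory" for a default SQL Server 2008 instance, and named instances use "MSOLAP$<InstanceName>:Memory" instead):

    import subprocess

    # Counters that show actual SSAS memory usage against its configured
    # limits, plus how much memory the cleaner thread has under its control.
    counters = [
        r"\MSAS 2008:Memory\Memory Usage KB",
        r"\MSAS 2008:Memory\Memory Limit Low KB",
        r"\MSAS 2008:Memory\Memory Limit High KB",
        r"\MSAS 2008:Memory\Cleaner Memory KB",
    ]

    # Sample every 5 seconds, 12 times (one minute of data).
    subprocess.run(["typeperf", *counters, "-si", "5", "-sc", "12"], check=True)

If Memory Usage KB stays pinned above Memory Limit High KB while a query runs, the cleaner is not keeping up; if the growth shows up in a different process entirely, the problem isn't SSAS at all.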

CPU_BUSY