I was working with a client recently, and one of their complaints was about slow performance on some of the reports they were running in SQL Server Reporting Services. Since I was walking into the scenario somewhat blindly I ran some basic evaluation scripts against the SQL Server instance.
As an aside, there are several things in a SQL Server installation that get configured by default that are just horribly, horribly wrong (one is the autogrowth setting on the model database). Another is that the default value for max server memory is 2147483647 megabytes (MB). I'm not sure about you, but I don't generally have that much memory in my data center.
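If you want to see what an instance is currently configured for, you can query sys.configurations. A quick sketch (value is what's been set, value_in_use is what's active):

```sql
-- Check the current max server memory setting
select name, value, value_in_use
from sys.configurations
where name = 'max server memory (MB)';
```

On an instance still at the defaults, both columns will show 2147483647.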
Anyway, the common recommendation for setting max server memory on a SQL Server is to start at about 80% of the physical memory on the server. This can vary–if the server has a whole lot of memory, you can probably allocate more than that, since Windows generally only needs 6-8 GB to run.
Here’s a quick script to do this automatically:
declare @srvmemory int;
declare @sqlmemory int;

-- Physical memory on the server, converted from KB to MB
select @srvmemory = total_physical_memory_kb / 1024
from sys.dm_os_sys_memory;

-- Target max server memory: 80% of physical memory
select @sqlmemory = round(@srvmemory * .8, 0);

exec sp_configure 'show advanced options', 1;
RECONFIGURE;
exec sp_configure 'max server memory', @sqlmemory;
RECONFIGURE;
GO

This script is designed to be run against standalone instances of SQL Server–if you are stacking multiple instances of SQL Server on one host, you need to carefully evaluate how much memory to allocate to each of them.
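Since RECONFIGURE applies this setting without a restart, you can also sanity-check how much memory the instance is actually consuming afterward. A quick sketch using sys.dm_os_process_memory (available on SQL Server 2008 and later):

```sql
-- How much memory the SQL Server process is using right now, in MB
select physical_memory_in_use_kb / 1024 as sql_memory_in_use_mb
from sys.dm_os_process_memory;
```

Note that SQL Server releases memory lazily, so after lowering max server memory this number may take a while to drift down toward the new cap.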