TL;DR: turn the Optimize for Ad hoc Workloads option on (from false to true).
Whilst looking at a problem SQL Server instance, I went on a diagnostic journey to find out why the query plan cache was being completely cleared every few minutes. It turned out the server had a bad memory setup and SQL Server was suffering from memory pressure. However, along the way I learnt about a setting that I find hard to think of a reason not to use in a typical (what install is typical, Tim?) setup. This option, Optimize for Ad hoc Workloads, tells SQL Server not to cache the full query plan for a query until it has seen that query twice.
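If you want to try it, the option can be turned on with sp_configure; it is an advanced option, so advanced options have to be shown first. This is a sketch to try on a test instance before production:

```sql
-- Optimize for Ad hoc Workloads is an advanced option,
-- so expose advanced options before changing it.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Turn the option on (0 = off, 1 = on). No restart needed,
-- but it only affects plans cached from this point onwards.
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;
```

The setting is instance-wide, so it applies to every database on the server.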
This is important where the SQL query workload is very varied, particularly where ad-hoc, non-stored-procedure queries are being run, as they can bloat the plan cache. The plan cache stores the compiled execution plan required to execute a particular query. When a new query is encountered, the plan is calculated and put in the cache, saving the cost of computing the plan again if the same query is seen later. The problem is that ad-hoc queries are unlikely to be seen again, so SQL Server is using up memory unnecessarily that could be put to better use.
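To get a feel for how much cache memory single-use ad-hoc plans are taking on your own instance, a query along these lines against the plan cache DMV helps (the columns come from sys.dm_exec_cached_plans; the shape of the query is just a sketch):

```sql
-- Memory consumed by ad-hoc plans that have only ever been
-- executed once: the classic sign of plan cache bloat.
SELECT COUNT(*)                         AS single_use_plans,
       SUM(size_in_bytes) / 1024 / 1024 AS total_mb
FROM   sys.dm_exec_cached_plans
WHERE  objtype   = 'Adhoc'
  AND  usecounts = 1;
```

If total_mb is a significant slice of your plan cache, the setting is likely to help.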
I understand, but need to verify, that this is also true of ORMs such as Entity Framework: although it does create parameterised SQL to send to the server, the declared length of those parameters can vary depending on the length of the values passed in. This can create a large number of query plans for what are essentially very similar queries. (OK, technically the length of the searched text can vary the query plan, but run with this.)
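To illustrate the effect, here is a sketch of the kind of SQL an ORM might emit (the Customers table and column are hypothetical). The parameter declaration is part of the batch text that keys the plan cache, so these two calls get separate cache entries even though the query is the same:

```sql
-- Two calls that differ only in the declared parameter length.
-- Each distinct batch text gets its own plan cache entry.
EXEC sp_executesql
     N'SELECT * FROM dbo.Customers WHERE Surname = @p0',
     N'@p0 nvarchar(3)', @p0 = N'Fox';

EXEC sp_executesql
     N'SELECT * FROM dbo.Customers WHERE Surname = @p0',
     N'@p0 nvarchar(9)', @p0 = N'Whitfield';
```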
The setting will greatly reduce the number of plans, and the memory used by them, on the server, as a query is only fully cached once it has been seen twice. The first time a query is seen, a small stub is created, just enough to spot if the query is seen a second time; only then is the full query plan cached. Truly ad-hoc queries won't be seen again, so space in the cache is saved. The compile time is not really worth worrying about: having to compile a query twice is no big deal when, after the second compile, the next 100,000 executions can come from the cache, so proportionally it is an efficiency worth having for the tiny, tiny hit of compiling the query plan twice.
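Once the option is on, you can see the stubs in the cache for yourself; a sketch like this groups the cached objects by type, so the 'Compiled Plan Stub' entries (and how little memory they use compared to full plans) are visible:

```sql
-- With the option enabled, first-time ad-hoc queries appear as
-- 'Compiled Plan Stub' entries, far smaller than full compiled plans.
SELECT cacheobjtype,
       COUNT(*)                  AS plans,
       SUM(size_in_bytes) / 1024 AS total_kb
FROM   sys.dm_exec_cached_plans
GROUP  BY cacheobjtype
ORDER  BY total_kb DESC;
```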
There is a good article on how to start investigating whether you have plan cache bloat here: http://www.sqlservercentral.com/articles/SQLServerCentral/91250/
There is a similar setting, Forced Parameterization; for details see this post by Brent Ozar: https://www.brentozar.com/archive/2018/03/why-multiple-plans-for-one-query-are-bad/. Basically, this setting forces the server to look more carefully at queries and infer where parameters would exist if you were to parameterize the query yourself. This is great for reducing plans in the cache, but it increases processor work, as the server has to examine queries much more closely to work out where the parameters lie. It can also lead to bad parameter sniffing: before, the plans would have been totally bespoke; now they will be shared, and the exact parameter value can change the optimal plan.
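Unlike Optimize for Ad hoc Workloads, forced parameterization is set per database rather than instance-wide; the database name here is a placeholder:

```sql
-- Forced parameterization is a per-database setting.
ALTER DATABASE MyDatabase SET PARAMETERIZATION FORCED;

-- Revert to the default behaviour:
ALTER DATABASE MyDatabase SET PARAMETERIZATION SIMPLE;
```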
In previous blog posts we have talked about the way other applications can drill down into parts of GP using a direct approach, via a pluggable protocol handler that is installed when GP is installed. Sometimes this does not work; to check whether the protocol handler has an endpoint to talk to, follow these instructions.
Download Handle.exe, which is part of the SysInternals tools, written by Mark Russinovich, that Microsoft acquired some time back.
Extract the Zip file; if you have a 64-bit system, use the 64-bit version, otherwise use the plain version.
Use PowerShell to pipe out and decode handle.exe output
Open a PowerShell editor on your machine (e.g. open the start menu, search for PowerShell, and launch Windows PowerShell ISE).
In my case I extracted the handle executables into the C:\Users\tw\Documents\ folder; you will need to change the path to wherever you extracted them to…
Execute the following PowerShell Command:
C:\Users\tw\Documents\handle64.exe net.pipe -accepteula
This accepts the user agreement and pipes the output of the handle viewer to PowerShell for decoding.
This gives the following output.
You can see that Visual Studio and Dynamics.exe have named pipes open.
If I close Dynamics GP, then run it again…
You see that Dynamics.exe is no longer listed as having named pipes available, as it is closed.
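If the full handle listing is noisy, the output can be filtered in PowerShell. This is a sketch, assuming handle64.exe is in the same folder used above, that keeps only the lines mentioning Dynamics:

```powershell
# Run handle64.exe for net.pipe handles and keep only the lines
# that mention the Dynamics process.
& 'C:\Users\tw\Documents\handle64.exe' net.pipe -accepteula |
    Select-String 'Dynamics'
```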
Using PowerShell to list all
The PowerShell modules for GP allow a number of DevOps activities. For example you may install the Dynamics GP Lesson (sample) company TWO and install the system database via these PowerShell modules. I take full advantage of the PowerShell modules when running and building Docker images for Dynamics GP.
Today whilst doing this I encountered the following error when moving to build the image for a different version of Dynamics GP:
Macro failed to complete successfully.
The PowerShell script works by creating a macro file in the GP\data folder; the script then runs DexUtils, passing the macro file as a command parameter. The macro file tells Utilities what it needs to do (in this case, create the system database). You can see this macro in the folder below, captured after the error occurred in the Docker container. CreateSystemDatabase.mac is the macro file that PowerShell is using to create the system database.
We have no idea at this point what the error actually is, as the description “Macro failed to complete successfully” is not very helpful. However, you will also notice in that folder the “DexMacro.log” file. Let's look in that file by reading it with the command “type DexMacro.log”.
So below is the output of that command, the contents of the log.
There we can see the issue. I'm using SQL Server 2017, which I already knew is only compatible with GP 2018 onwards; however, you will notice from the folder name that I'm trying to use it with GP 2016. The error is badly worded: it says you need to upgrade to server version 11 from 14, but it actually means you need to use an older version of SQL Server with this version of Dynamics GP. Something I already knew, but my focus wasn't in this area at the time I was developing the Docker container.
Please comment if you found this helpful – motivates me to blog more!
If you are using Dynamics GP with Visual Studio 2017 or later, you will find that the project templates for Dynamics GP do not show up in the new project or new item menus.
I wrote a solution to this in the form of a Visual Studio extension, but it seems that the extension does not show up when searched for in the extension gallery, something brought to my attention via the Microsoft forums.
However, if you page through the results and then go back to page one, all of a sudden it appears in the list!
I’ll be bringing this to the attention of the VS team; in the meantime, the video shows this workaround.