Updating Dynamics GP Comment ID in Sales Order Entry

A library of pre-written comments can be defined in Dynamics GP and stored for later quick recall. Each stored comment is given a name known as the Comment ID. The lookup table that stores the library of pre-written comments is SY04200. This table stores comments for many parts of GP, so SOP comments will have the CMTSRIES field equal to 3. When a SOP document is entered and a comment is placed on it, by selecting or typing the Comment ID, that order is stamped with the pre-written comment corresponding to that Comment ID in the library. Think of this process as being like using an old rubber ink stamper to stamp previously prepared comments onto documents with an ink pad. A COPY of the comment is put against the order in the SOP10106 table under the field CMMTTEXT (and the other comment fields too, more on them later).
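As a quick way to see what is in the library, the SOP comment IDs can be listed straight from SY04200, filtering on the series value mentioned above (a sketch only, so check the column names against your own install):

```sql
-- List the comment IDs defined for the SOP series (CMTSRIES = 3)
SELECT COMMNTID
FROM SY04200
WHERE CMTSRIES = 3
ORDER BY COMMNTID
```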

SOP Entry Comment ID Window showing manual text entry

If the source comment is changed at a later point in time, any future documents will be stamped with the new text, but documents that have already been stamped will not have their comment changed, as each holds a COPY of the comment as it was when it was attached to the order. See later for ideas on “fixing” this.

It is also possible, and normal, to edit the comment text on a document, adding to or changing the text that came from the Comment ID. The Comment ID name remains in this case, so you can still see that the comment was applied at some point, and a pencil edit icon is added next to the Comment ID to show it has been manually updated.

Pencil Edit Icon on Comment ID

If the comment ID name is removed from the field then the whole comment is erased, including any manual edits.

For the developers out there, note that the comment is added to the SOP10106 table immediately on entering the Comment ID, before the order is saved. Also note that the (text type) CMMTTEXT field is broken up into chunks of 51 characters across a number of COMMENT_x fields. This is to support report outputs that cannot handle the massive text type field.

COMMENT_x fields 1-4 char(51) break up CMMTTEXT
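As an illustration of that chunking (a sketch only, assuming the usual four COMMENT_1 to COMMENT_4 columns and a hypothetical document number), the char(51) fields can be rebuilt from CMMTTEXT with SUBSTRING:

```sql
-- Hypothetical example: repopulate the char(51) chunk columns from CMMTTEXT
UPDATE SOP10106
SET COMMENT_1 = SUBSTRING(CAST(CMMTTEXT AS VARCHAR(MAX)), 1, 51),
    COMMENT_2 = SUBSTRING(CAST(CMMTTEXT AS VARCHAR(MAX)), 52, 51),
    COMMENT_3 = SUBSTRING(CAST(CMMTTEXT AS VARCHAR(MAX)), 103, 51),
    COMMENT_4 = SUBSTRING(CAST(CMMTTEXT AS VARCHAR(MAX)), 154, 51)
WHERE SOPNUMBE = 'ORDER0001' AND SOPTYPE = 2  -- hypothetical document
```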

I want to update the existing comments from comment IDs

So something has changed about your standard comment and the comment in the Comment ID has been updated. That deals with future documents, but what about the existing sales documents that have been stamped with the previous comment?

There are two obvious solutions:

  • Use a Dynamics GP mail merge macro to update the comments: for each order document of concern, open the order, remove the Comment ID, reapply it, then save the order.
  • Use SQL to update the database directly.

The first, macro-based method will work if you have no in-house SQL skills. Be aware, though, that any manual edits to the comments will also be lost, so check with the users that they are not in the habit of editing the text that has been stamped into the document.

The second method is fine if you are confident with SQL; ask your Dynamics GP partner to help if you are at all concerned.

UPDATE sc
SET CMMTTEXT = REPLACE(CAST(sc.CMMTTEXT AS VARCHAR(MAX)), 'Old comment text', 'New comment text')
FROM SOP10100 so
JOIN SOP10106 sc ON sc.SOPNUMBE = so.SOPNUMBE AND sc.SOPTYPE = so.SOPTYPE
WHERE so.COMMNTID = 'CommentIDName'

The above SQL will update only orders stamped with the Comment ID in question. It replaces the old text with the new, leaving any extra text intact, so it is low risk in terms of losing data. However, if someone has edited the text from the template, the replace will not match and so will not work. You may also need to be careful with comments that span multiple lines, as the line returns will need to be taken care of. You could always source the text from an existing record for ease, using a sub query:
UPDATE sc
SET CMMTTEXT = (SELECT CAST(CMMTTEXT AS VARCHAR(MAX))
                FROM SOP10106
                WHERE SOPNUMBE = '{sourcedoc}' AND SOPTYPE = {sourcetype})
FROM SOP10100 so
JOIN SOP10106 sc ON sc.SOPNUMBE = so.SOPNUMBE AND sc.SOPTYPE = so.SOPTYPE
WHERE so.COMMNTID = 'CommentIDName'

Replace CommentIDName with the comment ID to target, and replace ‘Old comment text’ and ‘New comment text’ as appropriate. Please check this SQL before trying it, and run it on a copy of your data in your test system first. These scripts are ideas only, not a tested solution.
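Before running either update, it is worth previewing which documents will be touched. Something along these lines (again, check it against your own data first) lists the stamped documents and their current comment text:

```sql
-- Preview the documents stamped with the comment ID before updating
SELECT so.SOPNUMBE, so.SOPTYPE, CAST(sc.CMMTTEXT AS VARCHAR(MAX)) AS CurrentComment
FROM SOP10100 so
JOIN SOP10106 sc ON sc.SOPNUMBE = so.SOPNUMBE AND sc.SOPTYPE = so.SOPTYPE
WHERE so.COMMNTID = 'CommentIDName'
```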

Hopefully this post will help someone, let me know with a comment if it did.

Dynamics GP Item Stock Enquiry Window taking too long to display data

If it takes a long time for the Item Stock Enquiry window to display data after entering an item number, this is a sign of large amounts of data in GP slowing things down.

Stock Inquiry

In my example, after entering an item number into the stock enquiry window, it takes nine minutes to display the data. That is not a typo, I timed it with my iPhone…


Why is this so slow?

The document that helps us is “The Balance in Item Stock Inquiry is incorrect in Microsoft Dynamics GP”. Under “More Information”, it explains that IV30300 (Transaction Amounts History detail) is loaded for the item and iterated through, totalling up the values to give the balance. Running the following SQL gives us the items with the most transactions in that table, and the item I’m looking at has 9253 rows.

SELECT ITEMNMBR, COUNT(*) AS TransactionCount
FROM IV30300
GROUP BY ITEMNMBR
ORDER BY TransactionCount DESC

So this is more data than expected. The “problem” is that there is too much data for the way GP works; the solution is to trim the historical transactions. Removing history from the IV module will reduce the number of records the window has to iterate through and make the performance acceptable.

Document attach feature Dynamics GP database and BusObjKey formats

I’ve been working on an archive project recently where I’ve been archiving records from GP into another archive database. This meant looking at how attachments are handled in Dynamics GP, in order to take any attachments related to records being moved into the archive.
The table CO00102 relates the attachment to the record it is attached to, and contains the BusObjKey column, which is implemented as a resource path back to the database record the attachment relates to. Using a resource path makes it flexible, allowing the widest variety of GP objects to be compatible with the attachment system.

The resource path is implemented in an inconsistent manner, varying due to the different nature of the records involved, and also through misuse by developers too (I think).

An example of the format pattern is:

0\PM\Purchase Order History\290941

Where the elements are as follows:

  • 0: Dynamics GP Product Dictionary ID (see hyperlink)
  • PM: Table Series ID
  • Purchase Order History: Table “Display Name” (see Microsoft Dynamics GP >> Resource Descriptions >> Tables)
  • 290941: Record ID, which could be a customer ID, purchase order number, or other identifier for the record

However, for attachments to notes windows, the Record ID is the NOTEINDX (the ID of the record), but encoded into a hexadecimal version! Some developer was obviously thinking we didn’t have enough hoops to jump through when they imagined that one up and fancied burning some more CPU on the SQL server!

Update 2017-01-12: If you've used the OLE notes migration tool, you will notice that the OLE notes look like they used this way of encoding the record ID too, this might be where this scheme came from.

The article Document Attach vs. Notes and how to locate whether a master record has a document attached explains how to get from the note index against an attachment record to the hex version of that note index for participating in the BusObjKey column.

CONVERT(VARCHAR(MAX), CONVERT(binary(4),cast(NOTEINDX as integer)), 2)
In the screen shot above, the note ending in number 000001CF2 is actually note index ID 7410.
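Going the other way, from the hex string in the BusObjKey back to the note index, should just be a case of reversing that conversion (a sketch, using SQL Server’s style 2 hex-string conversion):

```sql
-- Hex record ID back to the integer note index
SELECT CONVERT(INT, CONVERT(VARBINARY(4), '00001CF2', 2))  -- gives 7410
```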
Now examining the final format in the list, this is for the Advanced Credit Control module (module 1577). For this module they decided to shorten the format of the path. Note how the format is one “folder” shorter in depth than the other paths: it misses out the “Table Series” and jumps right to the table display name. Doing this breaks some SQL scripts you might encounter out there on the internet for reporting on attachments. They assume the same number of path separators when deconstructing the path, but this one is one short. Oh dear! So what I’m saying is that you must program/script defensively when working with these BusObjKey fields, as you are at the mercy of developers. Expect to use a lot of CASE statements, or to create a SQL scalar function to convert your BusObjKey values!
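As an example of that defensive approach, a scalar function along these lines (hypothetical name, not part of the product) pulls the record ID out as the last path segment, regardless of how many separators a module decided to use:

```sql
-- Hypothetical helper: return the final segment of a BusObjKey path.
-- Appending '\' to the reversed string keeps CHARINDEX safe even if
-- there is no separator at all.
CREATE FUNCTION dbo.fn_BusObjKeyRecordId (@BusObjKey VARCHAR(255))
RETURNS VARCHAR(255)
AS
BEGIN
    RETURN RIGHT(@BusObjKey,
                 CHARINDEX('\', REVERSE(@BusObjKey) + '\') - 1)
END
```

For example, SELECT dbo.fn_BusObjKeyRecordId('0\PM\Purchase Order History\290941') returns 290941, and it returns the same record ID for the shorter Advanced Credit Control style paths too.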

I would be really interested to expand this article if you have any other formats that you have encountered, please drop me a comment against this post!

SQL Server FILESTREAM for Dynamics GP file attachments in coAttachmentItems - is it possible?


In a previous post (Managing document attachments in Dynamics GP) I looked at the attachments feature in Dynamics GP 2013 R2 and above. I talked about how files attached to GP in this manner are stored in the database, specifically in the coAttachmentItems table. This table can grow to a huge size as attachment file sizes accumulate, dwarfing the other GP tables and causing knock-on issues such as the size of, and time taken for, SQL backups and restores.
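To see how big the table has grown on your own install, the standard sp_spaceused procedure reports its row count and storage:

```sql
-- Report size and row count for the attachments table
EXEC sp_spaceused 'dbo.coAttachmentItems'
```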

The table looks like this:

/****** Object:  Table [dbo].[coAttachmentItems]    Script Date: 17/12/2016 23:32:10 ******/
CREATE TABLE [dbo].[coAttachmentItems](
	[Attachment_ID] [char](37) NOT NULL,
	[BinaryBlob] [varbinary](max) NOT NULL,
	[fileName] [char](255) NOT NULL,
 PRIMARY KEY CLUSTERED
(
	[Attachment_ID] ASC
)
)

The files are stored in the BinaryBlob field, with the file name stored for reference in the fileName field. GP does this in two steps: creating the database row with an empty blob (0x0), then populating the blob data.

INSERT INTO coAttachmentItems (Attachment_ID, BinaryBlob, fileName)
VALUES (@Attachment_ID, 0x0, @fileName)

-- Now populate the blob
UPDATE coAttachmentItems
SET BinaryBlob.Write(@Bytes, @Offset, @Length)
WHERE Attachment_ID = @Attachment_ID
The Dex Script log shows this too, with the two stages of the CreateAttachment then InsertAttachment methods.
22:27:30      'CreateAttachment() of form coAttachManagement', "", "0\RM\Customer Maintenance\TEST", "C:\Users\user\Documents\itemdesc.txt", "itemdesc.txt", 0, 1
22:27:30 '[0]'
22:27:30 '[0]'
22:27:30 '[0]'
22:27:30 '[0]'
22:27:30 '[0]'
22:27:30 '[0]'
22:27:30 '[0]'
22:27:30 'InsertAttachment() of form coAttachManagement', "4bb1e4e4-fe24-41ad-b102-a1f51ba76b34", "C:\Users\user\Documents\itemdesc.txt", "itemdesc.txt"

In the previous post I talked about how it would be helpful if the attach feature supported SQL Server FILESTREAM. This would allow the attachment files to be stored in file storage, a different tier of storage that is cheaper for us to manage on the file system rather than in the database. I understand there are advantages to keeping the data together in the database, but this does not scale well for us.
Today I wondered if it would be possible to just implement FILESTREAM myself for attachments in this table. Although this would be naughty and definitely would not be supported, it was an interesting experiment to try. 
To use FILESTREAM, a column must be varbinary(MAX) and the table must have a corresponding uniqueidentifier ROWGUIDCOL column; such tables must also have a unique constraint on that column.
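For comparison, this is roughly what a FILESTREAM-capable version of the table would have to look like (a sketch only, with a hypothetical table name, and assuming a FILESTREAM filegroup already exists on the database):

```sql
-- Hypothetical FILESTREAM-enabled equivalent of coAttachmentItems
CREATE TABLE dbo.coAttachmentItemsFS (
    Attachment_ID UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE,
    BinaryBlob VARBINARY(MAX) FILESTREAM NOT NULL,
    fileName CHAR(255) NOT NULL
)
```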

Looking at the table, the data is already stored in a varbinary(MAX). However, although the “Attachment_ID” field holds a GUID in its content, sadly it is stored in a char(37) database data type, and it would need to be a uniqueidentifier ROWGUIDCOL column to allow FILESTREAM. I suspect that GP’s development language, Dexterity, does not support a GUID data type, hence why it has been put in a char(37). Annoyingly this means I can’t change the BinaryBlob to have a FILESTREAM attribute.


Thus it looks like we have hit a brick wall with trying to implement FILESTREAM this way.

Thoughts for next attempts:

  • Perhaps we can intercept the InsertAttachment method using a GP Visual Studio add-in in the GP client, so as to redirect to a custom table backed with FILESTREAM? The problem here would be pulling the data back into the form again.
  • Perhaps we could create a new table that uses FILESTREAM, remove the existing table, and replace it with a view that converts the GUID back to char(37), so that GP is updating the view. However, this goes too far in changing the native product tables; I would not be comfortable with this kind of “table hacking”. Even if it technically worked, I’d be afraid of it breaking the GP install in the future.

Let me know what your thoughts are with a comment against this post.