Inventory Year End, Remove Sold Receipts and Cost Change History

Inventory year end closing window

FIFO has to be maintained somewhere, so goods receipts are added to table IV10200. As the inventory is consumed, the QTYSOLD field is incremented against the receipt until QTYRECVD = QTYSOLD, at which point the RCPTSOLD field is set from 0 to 1 to indicate that the receipt has been totally consumed. The order in which receipts are consumed depends on the cost model, FIFO etc.
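The consumption mechanics can be sketched like this (a simplified model using the IV10200 field names; the data structures and function are illustrative, not GP's actual code):

```python
# Illustrative sketch of how GP consumes receipt layers under FIFO.
# Field names (DATERECD, QTYRECVD, QTYSOLD, RCPTSOLD) come from IV10200;
# everything else is a simplified model, not GP's implementation.

def consume(receipts, qty_to_consume):
    """Consume stock against receipt layers in received-date order (FIFO)."""
    for rcpt in sorted(receipts, key=lambda r: r["DATERECD"]):
        if qty_to_consume <= 0:
            break
        available = rcpt["QTYRECVD"] - rcpt["QTYSOLD"]
        take = min(available, qty_to_consume)
        rcpt["QTYSOLD"] += take
        qty_to_consume -= take
        if rcpt["QTYSOLD"] == rcpt["QTYRECVD"]:
            rcpt["RCPTSOLD"] = 1   # receipt totally consumed
    return qty_to_consume          # any quantity that could not be fulfilled

receipts = [
    {"DATERECD": "2009-01-01", "QTYRECVD": 10, "QTYSOLD": 0, "RCPTSOLD": 0},
    {"DATERECD": "2009-02-01", "QTYRECVD": 5,  "QTYSOLD": 0, "RCPTSOLD": 0},
]
consume(receipts, 12)
# the oldest receipt is now fully sold (RCPTSOLD = 1); the newer one
# has QTYSOLD = 2 against QTYRECVD = 5
```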

As can be seen above, the year end closing window provides the option to remove any consumed receipts before a specified date. If this option is chosen, GP looks for receipts with RCPTSOLD = 1 and DATERECD < {Date Provided} and removes them from the IV10200 table. It also removes the records in IV10201 that represent the demand against the receipt. You can see this if you run a SQL trace while it runs:

DELETE FROM IV10201 WHERE ITEMNMBR = 'myitem' AND SRCRCTSEQNM = 1 
 
I was worried that this might affect the stock levels if I reconciled the inventory after removing the receipts. If you look at my other post, Using SQL UNPIVOT operator to reconcile Dynamics GP inventory items, you will note that inventory reconcile calculates the available stock from the IV10200 table using:
SUM(QTYRECVD) - SUM(QTYSOLD)
As we are only removing rows where these two are equal (the whole receipt is sold), they are rows that were not participating in the stock quantity anyway, hence no effect on stock reconciles.
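That reasoning can be sanity-checked with a quick sketch using the same SUM(QTYRECVD) - SUM(QTYSOLD) formula (illustrative rows, not real data):

```python
# Sketch: removing fully-sold receipt rows does not change the reconciled
# balance, because each removed row contributes zero to
# SUM(QTYRECVD) - SUM(QTYSOLD).

rows = [
    {"QTYRECVD": 10, "QTYSOLD": 10, "RCPTSOLD": 1},  # fully consumed
    {"QTYRECVD": 8,  "QTYSOLD": 3,  "RCPTSOLD": 0},  # partially consumed
]

def balance(rows):
    return sum(r["QTYRECVD"] for r in rows) - sum(r["QTYSOLD"] for r in rows)

before = balance(rows)
after = balance([r for r in rows if r["RCPTSOLD"] == 0])  # trim sold receipts
# before and after are both 5: the trim is invisible to the reconcile
```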

In the above script, SRCRCTSEQNM = 1 (Source Receipt Sequence Number) comes from the RCTSEQNM (Receipt Sequence Number) of the record removed from IV10200, binding demand to source – or so it looks at first sight.
 
hey wait…
 
The key for the receipts table IV10200 allows the same item to have the same shared RCTSEQNM if the date received is different. Indeed our data has this a lot of the time; see the table screenshot for an item below, and note how the item has multiple rows with RCTSEQNM = 1. I don’t think this should happen.
 
rectable
 
I had a look at our test copy of the live company and ran the above year end routine, and indeed it removed all records from IV10201 for the sequence number, even though I think they related to records still remaining in IV10200! I’ve seen some posts regarding people finding missing data in their systems but not being able to track down why or what makes it happen; I think perhaps this may be an answer? It may also be that I am misinterpreting what I am seeing – I need more information on how the records relate.
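My reading of what happened can be simulated like this (a sketch; the delete mirrors the traced statement earlier, which keys only on ITEMNMBR and SRCRCTSEQNM, and the rows are invented for illustration):

```python
# Sketch of the suspected problem: the year-end delete on IV10201 keys only
# on ITEMNMBR and SRCRCTSEQNM. If two IV10200 receipts for the same item
# share RCTSEQNM = 1 and only one of them is removed, the demand detail for
# BOTH receipts is deleted.

iv10200 = [  # receipt layers; note the duplicated RCTSEQNM
    {"ITEMNMBR": "myitem", "RCTSEQNM": 1, "DATERECD": "1999-05-01", "RCPTSOLD": 1},
    {"ITEMNMBR": "myitem", "RCTSEQNM": 1, "DATERECD": "2009-06-01", "RCPTSOLD": 0},
]
iv10201 = [  # demand detail against those receipts
    {"ITEMNMBR": "myitem", "SRCRCTSEQNM": 1, "QTYSOLD": 10},  # vs 1999 receipt
    {"ITEMNMBR": "myitem", "SRCRCTSEQNM": 1, "QTYSOLD": 4},   # vs 2009 receipt
]

# Year end removes the fully consumed 1999 receipt...
iv10200 = [r for r in iv10200 if r["RCPTSOLD"] != 1]

# ...then deletes IV10201 detail by item and sequence number only:
iv10201 = [d for d in iv10201
           if not (d["ITEMNMBR"] == "myitem" and d["SRCRCTSEQNM"] == 1)]

# The 2009 receipt survives in IV10200 but has lost ALL of its demand detail.
```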
 
Further investigation…
I don’t think I’m wrong; I have the feeling that we have some serious corruption in our table! After running the remove sold receipts routine, reconciling one of the affected items gives us this result:
 
itemreconcile
 
GP has added a record back into IV10201 to replace the one that was taken by the year end utility, which I think is a good indication that this was not supposed to happen.
GP creates a new record in IV10201 with the ORIGINDOCID field set to “RECON” (reconcile) for the full amount sold on the receipt, as it can no longer find the transaction details – they were removed by the year end utility.
 
I’m guessing that the RCTSEQNM should auto increment for each receipt and that there should not be duplicate RCTSEQNM values for any item. The following script checks for instances where this has happened incorrectly.
 
SELECT
    ITEMNMBR,
    RCTSEQNM,
    COUNT(*),
    MAX(DATERECD)
FROM IV10200
GROUP BY ITEMNMBR, RCTSEQNM
HAVING COUNT(RCTSEQNM) > 1
ORDER BY 4
(19498 row(s) affected)
Oh crikey! There are 19,498 item/sequence number combinations where this has occurred. Looking at the first part of the result set, we see this is really bad: some of the sequence numbers have been used 735 times, and that is just at first glance – I would rather not dig deeper. With the dates added, I see that this happened between 1998 and 2009. I guess whatever the cause was went away after that year. So these days it does work how I would expect, and the records are tied between the 200 and 201 tables by the RCTSEQNM (when the data is correct).
 
corruptseq
 
 
corruptseqdate
 
 
Where do I go from here?
So I need to clean up this mess; it looks like it may not be possible to recover from this and preserve the data. As it happens, I want to trim these receipts back to seven years anyway, so if I do that the problem will solve itself, I think. An item reconcile after the trim will rebuild dummy records for sold quantities if any are missing, so we end up in a better place than we are in now.
 
 
It is not a good idea to delete a receipt and its associated detail if it was consumed recently, as processing returns becomes more difficult once the receipt has been removed.
 
If you don’t want to run year end for inventory (which updates starting quantities and summaries), you could try writing a SQL script to remove the records. This is something I've ended up doing, archiving the records in case I ever need them again.
Here is the script I used. Beware: this is only here as an example. You NEED to check this in your test environment first and be very careful about using it. I cannot guarantee this will be safe to run for your GP installation.
 
BEGIN TRANSACTION

DECLARE @DateThreshold DATETIME

SELECT @DateThreshold = '20091101'

DECLARE @ToRemove AS TABLE (
DATERECD DATETIME
,ITEMNMBR CHAR(31)
,TRXLOCTN CHAR(11)
,RCTSEQNM INT
,QTYTYPE SMALLINT
);

INSERT INTO @ToRemove (
DATERECD
,ITEMNMBR
,TRXLOCTN
,RCTSEQNM
,QTYTYPE
)
SELECT DISTINCT rcpt.DATERECD
,rcpt.ITEMNMBR
,rcpt.TRXLOCTN
,rcpt.RCTSEQNM
,rcpt.QTYTYPE
FROM IV10200 rcpt
WHERE rcpt.RCPTSOLD = 1
AND (rcpt.DATERECD < @DateThreshold)
--AND rcpt.ITEMNMBR = '01-214' -- optionally restrict to a single item when testing

DELETE IV10200
OUTPUT DELETED.[ITEMNMBR]
,DELETED.[TRXLOCTN]
,DELETED.[DATERECD]
,DELETED.[RCTSEQNM]
,DELETED.[RCPTSOLD]
,DELETED.[QTYRECVD]
,DELETED.[QTYSOLD]
,DELETED.[QTYCOMTD]
,DELETED.[QTYRESERVED]
,DELETED.[FLRPLNDT]
,DELETED.[PCHSRCTY]
,DELETED.[RCPTNMBR]
,DELETED.[VENDORID]
,DELETED.[PORDNMBR]
,DELETED.[UNITCOST]
,DELETED.[QTYTYPE]
,DELETED.[Landed_Cost]
,DELETED.[NEGQTYSOPINV]
,DELETED.[VCTNMTHD]
,DELETED.[ADJUNITCOST]
,DELETED.[QTYONHND]
INTO CANFDARCHIVE..IV10200
FROM IV10200
JOIN @ToRemove r ON IV10200.DATERECD = r.DATERECD
AND IV10200.ITEMNMBR = r.ITEMNMBR
AND IV10200.RCTSEQNM = r.RCTSEQNM
AND IV10200.TRXLOCTN = r.TRXLOCTN
AND IV10200.QTYTYPE = r.QTYTYPE
WHERE RCPTSOLD = 1
AND (IV10200.DATERECD < @DateThreshold)

--Remove orphan records
DELETE ld
OUTPUT DELETED.[ITEMNMBR]
,DELETED.[TRXLOCTN]
,DELETED.[QTYTYPE]
,DELETED.[DOCDATE]
,DELETED.[RCTSEQNM]
,DELETED.[ORIGINDOCTYPE]
,DELETED.[ORIGINDOCID]
,DELETED.[LNSEQNBR]
,DELETED.[CMPNTSEQ]
,DELETED.[QTYSOLD]
,DELETED.[UNITCOST]
,DELETED.[IVIVINDX]
,DELETED.[IVIVOFIX]
,DELETED.[SRCRCTSEQNM]
,DELETED.[TRXREFERENCE]
,DELETED.[PCHSRCTY]
INTO CANFDArchive.[dbo].[IV10201]
FROM IV10201 ld
LEFT JOIN IV10200 h
ON h.ITEMNMBR=ld.ITEMNMBR
AND h.TRXLOCTN=ld.TRXLOCTN
AND h.RCTSEQNM = ld.SRCRCTSEQNM
AND h.QTYTYPE=ld.QTYTYPE
WHERE h.DATERECD is null




COMMIT TRANSACTION

 
 
 

Updating Dynamics GP Comment ID in Sales Order Entry

A library of prewritten comments can be defined in Dynamics GP, each stored under a name known as the Comment ID. When a SOP document is entered, selecting or typing the comment ID stamps that order with the prewritten comment. Think of it like the old rubber stamps used for stamping comments onto documents with an ink pad. A COPY of the comment is put against the order in the SOP10106 table in the CMMTTEXT field (and the other comment fields too, more on them later).

SOP Entry Comment ID Window showing manual text entry

If the source comment is changed at a later point in time, any future documents will be stamped with the new text, but documents that have already been stamped will not have their comment changed, as it is a COPY of the comment as it was when it was attached to the order. See later for ideas on “fixing” this.

It is also possible, and normal, to go in and edit the comment text, adding to or changing the text that was applied from the comment ID. The comment ID name remains in this case, so you may still see that the comment was applied at some point, and a pencil edit icon is added next to the comment ID to show it has been manually updated.

Pencil Edit Icon on Comment ID

If the comment ID name is removed from the field then the whole comment is erased, including any manual edits.

For the developers out there, note that the comment is added to the SOP10106 table immediately on entering the comment ID, before the order is saved. Also note that the (text type) CMMTTEXT field is broken up into chunks of 51 characters over a number of comment fields. This is to support report outputs that cannot handle the massive text type field.

COMMENT_x fields 1-4 char(51) break up CMMTTEXT
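The chunking can be sketched as follows (an assumption of a plain left-to-right split into 51-character pieces; GP's exact wrapping rules, for instance around line breaks, may differ):

```python
# Sketch: break a long comment into the four 51-character
# COMMENT_1..COMMENT_4 fields stored alongside CMMTTEXT.
# Assumes a simple left-to-right split, which may not match
# GP's exact wrapping behaviour.

def split_comment(cmmttext, width=51, fields=4):
    chunks = [cmmttext[i:i + width] for i in range(0, len(cmmttext), width)]
    chunks = chunks[:fields]                  # anything beyond 4 fields is dropped
    chunks += [""] * (fields - len(chunks))   # pad unused fields with empty text
    return chunks

c1, c2, c3, c4 = split_comment("A" * 60)
# c1 holds the first 51 characters, c2 the remaining 9, c3 and c4 are empty
```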

I want to update the existing comments from comment IDs

So something has changed about your standard comment, and the comment in the comment ID has been updated. That deals with future documents, but what about the existing sales documents that have been stamped with the previous comment?

There are two obvious solutions:

  • Use a mail merge to build a Dynamics GP macro that, for each order document of concern, updates the comment by opening the order, removing the comment ID and replacing it again, then finally saving the order.
  • Use SQL to update the database directly.

The first method, using a macro, will work if you have no in-house SQL skills. Be aware though that any manual edits to the comments will also be lost; check with the users that they are not in the habit of editing the text that has been stamped into the document.

The second method is fine if you are confident with SQL, or ask your Dynamics GP partner to help if you are at all concerned.

UPDATE sc
SET CMMTTEXT = REPLACE(CAST(sc.CMMTTEXT AS VARCHAR(MAX)), 'Old Comment Text', 'New comment text')
FROM SOP10100 so
JOIN SOP10106 sc
    ON so.SOPTYPE = sc.SOPTYPE AND so.SOPNUMBE = sc.SOPNUMBE
WHERE CHARINDEX('Old Comment Text', CAST(sc.CMMTTEXT AS VARCHAR(MAX))) > 0
    AND so.COMMNTID = 'CommentIDName'

(Note that CHARINDEX takes the text to find first and the text to search second; with the arguments the other way round the WHERE clause would never match.)

The above SQL will update only orders stamped with the comment ID in question. It replaces the old text with the new, leaving any extra text intact, so it is low risk in terms of losing data. However, if someone has edited the text away from the template, this will not work. You may need to be careful if you have comments that span multiple lines, as the line returns will need to be taken care of; you could always source the old text from an existing record for ease, using a subquery.
 
DECLARE @oldCommentText VARCHAR(MAX)

SELECT @oldCommentText = CAST(CMMTTEXT AS VARCHAR(MAX))
FROM SOP10106
WHERE SOPNUMBE = '{sourcedoc}' AND SOPTYPE = {sourcetype}

UPDATE sc
SET CMMTTEXT = REPLACE(CAST(sc.CMMTTEXT AS VARCHAR(MAX)), @oldCommentText, 'New comment text')
FROM SOP10100 so
JOIN SOP10106 sc
    ON so.SOPTYPE = sc.SOPTYPE AND so.SOPNUMBE = sc.SOPNUMBE
WHERE CHARINDEX(@oldCommentText, CAST(sc.CMMTTEXT AS VARCHAR(MAX))) > 0
    AND so.COMMNTID = 'CommentIDName'

Replace CommentIDName with the comment ID to target, and replace ‘Old Comment Text’ and ‘New comment text’ as appropriate. Please check this SQL before trying it, and run it on a copy of your data in your test system first. These scripts are for ideas only and are not claimed to be a tested solution.

Hopefully this post will help someone, let me know with a comment if it did.

Dynamics GP Item Stock Enquiry Window taking too long to display data

If it takes a long time for the Item Stock Enquiry window to display data after entering an item number, this is a sign of large amounts of data in GP slowing things down.

Stock Inquiry

In my example, after entering an item number into the stock enquiry window, it takes nine minutes to display the data. That is not a typo – I timed it with my iPhone…

stopwatch

Why is this so slow?

The document that helps us is “The Balance in Item Stock Inquiry is incorrect in Microsoft Dynamics GP”. Under “More Information” it explains that IV30300 (Transaction Amounts History detail) is loaded for the item and iterated through, totalling up the values to give the balance. Running the following SQL gives us the items with the most transactions in that table; the item I’m looking at has 9,253 rows.

SELECT ITEMNMBR, COUNT(ITEMNMBR) 
FROM IV30300
GROUP BY ITEMNMBR
ORDER BY 2 DESC
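The KB's description of how the balance is computed can be sketched as follows (a conceptual model only; the signed transaction quantities, with receipts positive and issues negative, are my simplifying assumption):

```python
# Sketch of why Item Stock Inquiry slows down: the balance is not read
# from a summary, it is accumulated by walking every IV30300 history row
# for the item. Signed TRXQTY values are an assumption for illustration.

def running_balance(history_rows):
    balance = 0
    lines = []
    for row in history_rows:      # one pass over ALL history rows for the item
        balance += row["TRXQTY"]
        lines.append((row["DOCDATE"], row["TRXQTY"], balance))
    return lines

rows = [
    {"DOCDATE": "2016-01-01", "TRXQTY": 100},   # receipt
    {"DOCDATE": "2016-02-01", "TRXQTY": -30},   # issue
    {"DOCDATE": "2016-03-01", "TRXQTY": -20},   # issue
]
final = running_balance(rows)[-1][2]
# with 9,253 rows instead of 3, this loop (plus the UI work per line)
# is where the nine minutes go
```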

So this is more data than expected. The “problem” is that there is too much data for the way GP works. The solution is to trim the historical transactions. Removing history from the IV module will reduce the number of records the window has to iterate through and make the performance acceptable.

Document attach feature Dynamics GP database and BusObjKey formats

I’ve been working on an archive project recently where I’ve been archiving records from GP into another archive database. This meant looking at how attachments are handled in Dynamics GP, in order to take any attachments related to records being moved into the archive.
 
Table CO00102 relates an attachment to the record it is attached to, and contains the BusObjKey column, which is implemented as a resource path back to the database record that the attachment relates to. Using a resource path makes it flexible, allowing the widest variety of GP objects to be compatible with the attachment system.
 
CO00102

The resource path is implemented in an inconsistent manner, varying due to the different nature of the records involved and also, I think, through misuse by developers.

An example of the format pattern is:

0\PM\Purchase Order History\290941

Where the elements are as follows:

  • 0 – Dynamics GP Product Dictionary ID (see hyperlink)
  • PM – Table Series ID
  • Purchase Order History – Table “Display Name” (see Microsoft Dynamics GP >> Resource Descriptions >> Tables)
  • 290941 – Record ID; could be a customer ID, purchase order number or other identifier for the record
However, for attachments to notes windows, the Record ID is the NOTEINDX (the ID of the record), but encoded as a hexadecimal string! Some developer was obviously thinking we didn’t have enough hoops to jump through when they imagined that one up and fancied burning some more CPU on the SQL server!

Update 2017-01-12: If you've used the OLE notes migration tool, you will notice that the OLE notes look like they used this way of encoding the record ID too, this might be where this scheme came from.

The article Document Attach vs. Notes and how to locate whether a master record has a document attached explains how to get from the note index against an attachment record to the hex version of that note index for participating in the BusObjKey column.

CONVERT(VARCHAR(MAX), CONVERT(binary(4),cast(NOTEINDX as integer)), 2)
 
In the screen shot above, the note ending in number 000001CF2 is actually note index ID 7410.
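The same conversion is easy to reproduce outside SQL for checking values by hand; this sketch just mirrors the T-SQL CONVERT above, rendering the integer note index as a 4-byte (8 hex digit) big-endian string:

```python
# Sketch: reproduce
#   CONVERT(VARCHAR(MAX), CONVERT(binary(4), CAST(NOTEINDX AS integer)), 2)
# i.e. render the note index as 8 uppercase hex digits, and back again.

def noteindx_to_hex(noteindx):
    return format(noteindx, "08X")

def hex_to_noteindx(hexstr):
    return int(hexstr, 16)

# note index 7410 encodes as "00001CF2", matching the screenshot value
encoded = noteindx_to_hex(7410)
decoded = hex_to_noteindx("00001CF2")
```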
 
Now examining the final format in the list: this one is for the Advanced Credit Control module (module 1577). For this module they decided to shorten the format of the path. Note how it is one “folder” shorter in depth than the other paths; it misses out the “Table Series” segment and jumps right to the table display name. This breaks some SQL scripts you might encounter out there on the internet for reporting on attachments: they assume the same number of path separators when deconstructing the path, but this one is a segment short – oh dear! So what I’m saying is that you must program/script defensively when working with these BusObjKey fields, as you are at the mercy of developers; expect to use a lot of CASE statements or to create a SQL scalar function to convert your BusObjKey values!
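A defensive parse therefore has to cope with both depths. A minimal sketch (the field meanings follow the element list above; the short-form example path is made up for illustration):

```python
# Sketch: defensively parse a BusObjKey resource path, handling both the
# usual 4-part form and the shorter 3-part form (no series segment) seen
# with the Advanced Credit Control module. Interpretation is mine.

def parse_busobjkey(busobjkey):
    parts = busobjkey.split("\\")
    if len(parts) == 4:
        dict_id, series, display_name, record_id = parts
    elif len(parts) == 3:          # short form: the series segment is missing
        dict_id, display_name, record_id = parts
        series = None
    else:
        raise ValueError(f"Unexpected BusObjKey shape: {busobjkey!r}")
    return {"dictionary_id": dict_id, "series": series,
            "table_display_name": display_name, "record_id": record_id}

full = parse_busobjkey("0\\PM\\Purchase Order History\\290941")
# hypothetical example of the shorter Advanced Credit Control form:
short = parse_busobjkey("1577\\Credit Control\\CUST001")
```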

I would be really interested to expand this article if you have any other formats that you have encountered, please drop me a comment against this post!