Using SQL UNPIVOT operator to reconcile Dynamics GP inventory items

Working with the Dynamics GP inventory tables, you will encounter the field QTYTYPE a lot, usually alongside its partner LOCNCODE.

Inventory in GP can reside in different locations (for example depots in New York, London and Sydney). Stock at a location is then broken down further into five quantity types. A quantity type can be thought of as the condition/state of the item, indexed as follows:

1 = On Hand
2 = Returned
3 = In Use
4 = In Service
5 = Damaged

This gives us a way to categorise both where the inventory is and what state/status it is in.

The stock levels for each item are stored in the table IV00102, keyed by ITEMNMBR and LOCNCODE. To avoid storing a separate row for every quantity type, the table architect decided to pivot the table, giving each quantity type its own column. The field names are as follows:

QTYTYPE   Field Name   Description
1         QTYONHND     On Hand
2         QTYRTRND     Returned
3         QTYINUSE     In Use
4         QTYINSVC     In Service
5         QTYDMGED     Damaged
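For reference, the pivoted layout looks like this when queried directly (the item number below is just an example):

-- One row per item and location, with a column per quantity type
-- (the item number is an example only).
SELECT ITEMNMBR
    ,LOCNCODE
    ,QTYONHND
    ,QTYRTRND
    ,QTYINUSE
    ,QTYINSVC
    ,QTYDMGED
FROM IV00102
WHERE ITEMNMBR = '100XLG'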

 

This presents a problem, as the other inventory tables have a QTYTYPE field and hold separate rows for each quantity type, so joining them to the IV00102 table becomes troublesome. One solution (I know there are others) is to use the UNPIVOT operator in T-SQL to unpivot the IV00102 table, so that the quantity columns present themselves as extra rows.

SELECT ITEMNMBR
    ,LOCNCODE
    ,CASE valuename
        WHEN 'QTYONHND' THEN 1
        WHEN 'QTYRTRND' THEN 2
        WHEN 'QTYINUSE' THEN 3
        WHEN 'QTYINSVC' THEN 4
        WHEN 'QTYDMGED' THEN 5
     END AS QTYTYPE
    ,QtyValue
FROM IV00102
UNPIVOT (QtyValue FOR valuename IN (
    QTYONHND
    ,QTYRTRND
    ,QTYINUSE
    ,QTYINSVC
    ,QTYDMGED
    )) UnPiv
WHERE ITEMNMBR = '100XLG'

This gives rise to the following result set.

SQL results

See how each quantity type now has its own row rather than being a named column? We also used a CASE expression to map the column names back to the QTYTYPE index numbers.

Example: reconciling inventory values using SQL

In his blog post Reconciling Quantity on Hand – SQL Script, Mahmood M. Alsaadi shows a script to reconcile inventory, so let us not reinvent the wheel and start with that script. The example in his post, at the time of writing, did not take the quantity types into account. Unfortunately I ended up debugging the SQL to work this out, only to return to the comments on the original post and find someone else had done the same and pointed it out. However, the solution they proposed was to fix QTYTYPE = 1 so that only on hand quantities are reconciled. I felt I could do better than that!

I built on the original script, adding in the UNPIVOT introduced above:

SELECT TRX_BALANCE.ITEMNMBR AS ItemNumber
    ,TRXLOCTN AS Location
    ,Master_Balance.QTYTYPE AS QTYTYPE
    ,BALANCE AS TRX_Balance
    ,QtyValue AS Master_Balance
    ,ATYALLOC AS Master_AllocatedQuantity
    ,QtyAvailable
    ,BALANCE - QtyValue AS Variance
FROM (
    -- Balance per item/location/quantity type calculated from the receipt layers
    SELECT ITEMNMBR
        ,TRXLOCTN
        ,QTYTYPE
        ,SUM(QTYRECVD) - SUM(QTYSOLD) AS BALANCE
    FROM dbo.IV10200
    --WHERE IV10200.ITEMNMBR = '40-322'
    GROUP BY ITEMNMBR
        ,TRXLOCTN
        ,QTYTYPE
    ) AS TRX_BALANCE
LEFT OUTER JOIN (
    -- Quantity master unpivoted so that each quantity type gets its own row
    SELECT ITEMNMBR
        ,LOCNCODE
        ,CASE valuename
            WHEN 'QTYONHND' THEN 1
            WHEN 'QTYRTRND' THEN 2
            WHEN 'QTYINUSE' THEN 3
            WHEN 'QTYINSVC' THEN 4
            WHEN 'QTYDMGED' THEN 5
         END AS QTYTYPE
        ,QtyValue
        ,CASE valuename
            WHEN 'QTYONHND' THEN ATYALLOC
            ELSE 0
         END AS ATYALLOC
        ,CASE valuename
            WHEN 'QTYONHND' THEN QtyValue - ATYALLOC
            ELSE 0
         END AS QtyAvailable
    FROM IV00102
    UNPIVOT (QtyValue FOR valuename IN (
        QTYONHND
        ,QTYRTRND
        ,QTYINUSE
        ,QTYINSVC
        ,QTYDMGED
        )) IV00102Pivot
    ) AS Master_Balance ON TRX_BALANCE.ITEMNMBR = Master_Balance.ITEMNMBR
    AND TRX_BALANCE.TRXLOCTN = Master_Balance.LOCNCODE
    AND TRX_BALANCE.QTYTYPE = Master_Balance.QTYTYPE
WHERE BALANCE - QtyValue <> 0

This will output the item, location and quantity type of any items that have an incorrect inventory level in the IV00102 table (the table behind the Dynamics GP item enquiry form).

This can be a useful script to set up as a scheduled SQL job that notifies your GP admin when inventory needs reconciling, even listing the items concerned. It could also be used to generate a macro to do the reconcile… one for another day…
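As a sketch of how such a notification step might look, the reconcile query above could be wrapped in a view and handed to Database Mail from a SQL Agent job. This assumes Database Mail is configured; the view name, mail profile, recipient and company database name below are all placeholders to substitute for your own.

-- A minimal sketch of a scheduled notification step (not part of the original script).
-- Assumes the reconcile query above has been saved as a view; the view name,
-- Database Mail profile, recipient and company database are placeholders.
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'GPAlerts'               -- assumed Database Mail profile
    ,@recipients   = 'gpadmin@example.com'    -- assumed mailbox
    ,@subject      = 'Dynamics GP: inventory variances found'
    ,@execute_query_database = 'TWO'          -- your GP company database
    ,@query        = N'SELECT * FROM dbo.vw_IVReconcileVariances'  -- hypothetical view over the script above
    ,@attach_query_result_as_file = 1;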

Inventory Year End, Remove Sold Receipts and Cost Change History

Inventory year end closing window

FIFO has to be maintained somewhere, so goods receipts are added to the table IV10200. As inventory is consumed, the QTYSOLD field is incremented against the receipt until QTYRECVD = QTYSOLD, at which point the RCPTSOLD field is set from 0 to 1 to indicate that the receipt has been totally consumed. The order in which receipts are consumed depends on the cost model (FIFO etc.).
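To make that concrete, a query along these lines lists the receipt layers that are not yet fully consumed, oldest first (the item number is just an example):

-- Open receipt layers for an item, oldest first, showing how much of each
-- receipt is still available to consume (item number is an example only).
SELECT ITEMNMBR
    ,TRXLOCTN
    ,RCTSEQNM
    ,DATERECD
    ,QTYRECVD
    ,QTYSOLD
    ,QTYRECVD - QTYSOLD AS QtyRemaining
FROM IV10200
WHERE ITEMNMBR = '100XLG'
    AND RCPTSOLD = 0
ORDER BY DATERECD
    ,RCTSEQNM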

As can be seen above, as part of the year end close the option is provided to remove any consumed receipts received before a specified date. If this option is chosen, GP will look for receipts with RCPTSOLD = 1 and DATERECD < {Date Provided} and remove them from the IV10200 table. It will also remove the records in IV10201 that represent the demand against the receipt. You can see this if you run a SQL trace while it runs.

DELETE FROM IV10201 WHERE ITEMNMBR = 'myitem' AND SRCRCTSEQNM = 1 
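Before running the routine, a query along these lines should preview the receipt rows that meet the removal criteria (the cut-off date is only an example):

-- Preview the receipts that the "remove sold receipts" option would target:
-- fully consumed (RCPTSOLD = 1) and received before the cut-off date.
-- The date below is an example only.
SELECT ITEMNMBR
    ,TRXLOCTN
    ,RCTSEQNM
    ,DATERECD
    ,QTYRECVD
    ,QTYSOLD
FROM IV10200
WHERE RCPTSOLD = 1
    AND DATERECD < '20091101'
ORDER BY DATERECD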
 
I was worried whether this would affect the stock levels if I reconciled the inventory after removing the receipts. If you look at my other post, Using SQL UNPIVOT operator to reconcile Dynamics GP inventory items, you will note that the inventory reconcile calculates the available stock from the IV10200 table using:
SUM(QTYRECVD) - SUM(QTYSOLD)
As we are only removing rows where these two are equal (the whole receipt is sold), they are rows that were not contributing to the stock quantity anyway, so removing them has no effect on stock reconciles.
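A quick sanity check along these lines (a sketch) should show the same balance whether or not the fully consumed receipts are included, since those rows contribute QTYRECVD - QTYSOLD = 0:

-- The balance per item/location/quantity type should be identical with and
-- without the fully consumed (RCPTSOLD = 1) rows, as they net to zero.
SELECT ITEMNMBR
    ,TRXLOCTN
    ,QTYTYPE
    ,SUM(QTYRECVD - QTYSOLD) AS BalanceAllRows
    ,SUM(CASE WHEN RCPTSOLD = 0 THEN QTYRECVD - QTYSOLD ELSE 0 END) AS BalanceOpenRowsOnly
FROM IV10200
GROUP BY ITEMNMBR
    ,TRXLOCTN
    ,QTYTYPE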

In the traced DELETE statement above, SRCRCTSEQNM = 1 (Source Receipt Sequence Number) comes from the RCTSEQNM (Receipt Sequence Number) of the record removed from IV10200, binding demand to source – or so it looks at first sight.
 
hey wait…
 
The key of the receipts table IV10200 allows the same item to share the same RCTSEQNM if the date received is different. Indeed our data has this a lot of the time; see the table screenshot for an item below, where the item has multiple rows with RCTSEQNM = 1. I don’t think this should happen.
 
Screenshot: IV10200 receipt rows for one item, showing multiple rows with RCTSEQNM = 1
 
I had a look at our test copy of the live company, ran the above year end routine and indeed it removed all records from IV10201 for that sequence number, even though I think some of them related to records still remaining in IV10200! I’ve seen posts from people finding missing data in their systems without being able to track down why or what makes it happen; perhaps this may be an answer? It may also be that I am interpreting what I am seeing wrongly; I need more information on how the records relate.
 
Further investigation…
I don’t think I’m wrong; I have the feeling that we have some serious corruption in this table! Running the remove sold receipts routine and then reconciling one of the items affected gives us this result:
 
Screenshot: item reconcile report after removing sold receipts
 
GP has added a record back into IV10201 to replace the one that was taken by the year end utility; I think this is a good indication that this was not supposed to happen.
GP creates a new record in IV10201 with the ORIGINDOCID field set to “RECON” (reconcile) for the full amount sold on the receipt, as it can no longer find the transaction details that were removed by the year end utility.
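If you suspect this has happened, something like the following (a sketch) should surface the consumption detail rows that were regenerated by a reconcile rather than written by a real transaction:

-- Consumption detail rows created by an item reconcile rather than a transaction,
-- identified by ORIGINDOCID = 'RECON' as described above.
SELECT ITEMNMBR
    ,TRXLOCTN
    ,SRCRCTSEQNM
    ,DOCDATE
    ,QTYSOLD
    ,ORIGINDOCID
FROM IV10201
WHERE ORIGINDOCID = 'RECON'
ORDER BY DOCDATE DESC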
 
I’m guessing that the RCTSEQNM should auto increment for each receipt and that there should be no duplicate RCTSEQNM for any item. The following script checks for instances where duplicates have occurred.
 
SELECT ITEMNMBR
    ,RCTSEQNM
    ,COUNT(*)
    ,MAX(DATERECD)
FROM IV10200
GROUP BY ITEMNMBR
    ,RCTSEQNM
HAVING COUNT(RCTSEQNM) > 1
ORDER BY 4

(19498 row(s) affected)
Oh crikey! There are 19,498 item and sequence number combinations where this has occurred. Looking at the first part of the result set, this is really bad; some of the sequence numbers have been used 735 times, and that is just at first glance, I would rather not dig deeper. With the dates added, I can see that this happened between 1998 and 2009, so I guess whatever the cause was went away after that. These days it does work how I would expect, and the records are tied between the 200 and 201 tables by the RCTSEQNM (when the data is correct).
 
Screenshot: first part of the duplicate RCTSEQNM result set
 
 
Screenshot: duplicate RCTSEQNM result set with MAX(DATERECD) dates
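For reference, when the data is correct the receipt layer and its consumption detail tie together like this (a sketch; the item number is just an example):

-- How a receipt layer (IV10200) relates to its consumption detail (IV10201)
-- when sequence numbers are intact; the item number is an example only.
SELECT h.ITEMNMBR
    ,h.TRXLOCTN
    ,h.RCTSEQNM
    ,h.QTYRECVD
    ,h.QTYSOLD
    ,d.DOCDATE
    ,d.ORIGINDOCID
    ,d.QTYSOLD AS DetailQtySold
FROM IV10200 h
JOIN IV10201 d ON d.ITEMNMBR = h.ITEMNMBR
    AND d.TRXLOCTN = h.TRXLOCTN
    AND d.QTYTYPE = h.QTYTYPE
    AND d.SRCRCTSEQNM = h.RCTSEQNM
WHERE h.ITEMNMBR = '100XLG'
ORDER BY h.RCTSEQNM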
 
 
Where do I go from here?
So I need to clean up this mess; it looks like it may not be possible to recover from this and preserve the data. As it happens I want to trim these receipts back to seven years anyway, so if I do that the problem should solve itself, I think. An item reconcile after the trim will rebuild dummy records for sold quantities if any are missing, so we end up in a better place than we are now.
 
 
It is not a good idea to delete a receipt and its associated detail if it was consumed recently, as processing returns becomes more difficult once the receipt has been removed.
 
If you don’t want to run the inventory year end, which also updates starting quantities and summaries, you could try writing a SQL script to remove the records. This is something I ended up doing, archiving the records as I went in case I ever need them again.
Here is the script I used. Beware – it is only here as an example; you NEED to check it in your test environment first and be very careful about using it. I cannot guarantee it will be safe to run on your GP installation.
 
BEGIN TRANSACTION

DECLARE @DateThreshold DATETIME

SELECT @DateThreshold = '20091101'

DECLARE @ToRemove AS TABLE (
DATERECD DATETIME
,ITEMNMBR CHAR(31)
,TRXLOCTN CHAR(11)
,RCTSEQNM INT
,QTYTYPE SMALLINT
);

INSERT INTO @ToRemove (
DATERECD
,ITEMNMBR
,TRXLOCTN
,RCTSEQNM
,QTYTYPE
)
SELECT DISTINCT rcpt.DATERECD
,rcpt.ITEMNMBR
,rcpt.TRXLOCTN
,rcpt.RCTSEQNM
,rcpt.QTYTYPE
FROM IV10200 rcpt
WHERE rcpt.RCPTSOLD = 1
AND (rcpt.DATERECD < @DateThreshold)
--AND rcpt.ITEMNMBR='01-214'

--Archive and delete the fully consumed receipt layers older than the threshold
DELETE IV10200
OUTPUT DELETED.[ITEMNMBR]
,DELETED.[TRXLOCTN]
,DELETED.[DATERECD]
,DELETED.[RCTSEQNM]
,DELETED.[RCPTSOLD]
,DELETED.[QTYRECVD]
,DELETED.[QTYSOLD]
,DELETED.[QTYCOMTD]
,DELETED.[QTYRESERVED]
,DELETED.[FLRPLNDT]
,DELETED.[PCHSRCTY]
,DELETED.[RCPTNMBR]
,DELETED.[VENDORID]
,DELETED.[PORDNMBR]
,DELETED.[UNITCOST]
,DELETED.[QTYTYPE]
,DELETED.[Landed_Cost]
,DELETED.[NEGQTYSOPINV]
,DELETED.[VCTNMTHD]
,DELETED.[ADJUNITCOST]
,DELETED.[QTYONHND]
INTO CANFDARCHIVE..IV10200
FROM IV10200
JOIN @ToRemove r ON IV10200.DATERECD = r.DATERECD
AND IV10200.ITEMNMBR = r.ITEMNMBR
AND IV10200.RCTSEQNM = r.RCTSEQNM
AND IV10200.TRXLOCTN = r.TRXLOCTN
AND IV10200.QTYTYPE = r.QTYTYPE
WHERE RCPTSOLD = 1
AND (IV10200.DATERECD < @DateThreshold)

--Remove orphan records
DELETE ld
OUTPUT DELETED.[ITEMNMBR]
,DELETED.[TRXLOCTN]
,DELETED.[QTYTYPE]
,DELETED.[DOCDATE]
,DELETED.[RCTSEQNM]
,DELETED.[ORIGINDOCTYPE]
,DELETED.[ORIGINDOCID]
,DELETED.[LNSEQNBR]
,DELETED.[CMPNTSEQ]
,DELETED.[QTYSOLD]
,DELETED.[UNITCOST]
,DELETED.[IVIVINDX]
,DELETED.[IVIVOFIX]
,DELETED.[SRCRCTSEQNM]
,DELETED.[TRXREFERENCE]
,DELETED.[PCHSRCTY]
INTO CANFDArchive.[dbo].[IV10201]
FROM IV10201 ld
LEFT JOIN IV10200 h
ON h.ITEMNMBR=ld.ITEMNMBR
AND h.TRXLOCTN=ld.TRXLOCTN
AND h.RCTSEQNM = ld.SRCRCTSEQNM
AND h.QTYTYPE=ld.QTYTYPE
WHERE h.DATERECD is null




COMMIT TRANSACTION

 
 
 

Updating Dynamics GP Comment ID in Sales Order Entry

A library of pre-written comments can be defined in Dynamics GP and stored for quick recall later. A name, known as the Comment ID, is used to recall a stored comment. The lookup table that stores the library of pre-written comments is SY04200. This table stores comments for many parts of GP, so SOP comments have the CMTSRIES field equal to 3. When a SOP document is entered and a comment is placed on it, by selecting or typing the comment ID, the order is stamped with the pre-written comment corresponding to that comment ID in the library. Think of this process as being like the old rubber ink stampers used for stamping previously prepared comments onto documents with an ink pad. A COPY of the comment is put against the order in the SOP10106 table in the field CMMTTEXT (and the other comment fields too, more on them later).
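To see what is in the library for SOP, a query along these lines should list the stored comments (a sketch; it assumes the library text in SY04200 lives in a CMMTTEXT field, mirroring the field name used in SOP10106):

-- List the pre-written SOP comments in the comment library.
-- CMTSRIES = 3 filters to Sales Order Processing comments, as noted above;
-- the CMMTTEXT column name here is an assumption mirroring SOP10106.
SELECT COMMNTID
    ,CMMTTEXT
FROM SY04200
WHERE CMTSRIES = 3
ORDER BY COMMNTID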

SOP Entry Comment ID Window showing manual text entry

If the source comment is changed at a later point in time, any future documents will be stamped with the new text, but documents that have already been stamped will not have their comment changed, as it is a COPY of the comment as it was when it was attached to the order. See later for ideas on “fixing” this.

It is also possible, and normal, to go in and edit the comment text, adding to or changing the text that came from the comment ID. The comment ID name remains in this case, so you can still see that the comment was applied at some point, and a pencil edit icon is shown next to the comment ID to indicate it has been manually updated.

Pencil Edit Icon on Comment ID

If the comment ID name is removed from the field then the whole comment is erased, including any manual edits.

For the developers out there, note that the comment is added to the SOP10106 table immediately on entering the comment ID, before the order is saved. Also note that the (text type) CMMTTEXT field is broken up into chunks of 51 characters over a number of COMMENT_x fields. This is to support report outputs that cannot handle the large text type field.

COMMENT_x fields 1-4 char(51) break up CMMTTEXT
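That chunking is worth remembering if you update CMMTTEXT directly with SQL, as in the scripts below: reports reading the COMMENT_x fields will likely still show the old text unless those chunks are refreshed too. A sketch of refreshing them for one document (the document number and type are placeholders):

-- Refresh the 51-character COMMENT_1..COMMENT_4 chunks from CMMTTEXT after a
-- direct SQL update, so report outputs reading the chunked fields stay in step.
-- The document number and type below are placeholders.
UPDATE SOP10106
SET COMMENT_1 = SUBSTRING(CAST(CMMTTEXT AS VARCHAR(MAX)), 1, 51)
    ,COMMENT_2 = SUBSTRING(CAST(CMMTTEXT AS VARCHAR(MAX)), 52, 51)
    ,COMMENT_3 = SUBSTRING(CAST(CMMTTEXT AS VARCHAR(MAX)), 103, 51)
    ,COMMENT_4 = SUBSTRING(CAST(CMMTTEXT AS VARCHAR(MAX)), 154, 51)
WHERE SOPNUMBE = '{sourcedoc}'
    AND SOPTYPE = {sourcetype}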

I want to update the existing comments from comment IDs

So something has changed about your standard comment and the comment stored against the comment ID is updated. That deals with future documents, but what about the existing sales documents that have already been stamped with the previous comment?

There are two obvious solutions:

  • Use a mail merged Dynamics GP macro to update the comments for each order document of concern, by opening each order, removing the comment ID, re-entering it and finally saving the order.
  • Use SQL to update the database directly.

The first, macro-based method will work if you have no in-house SQL skills. Be aware, though, that any manual edits to the comments will also be lost; check with the users that they are not in the habit of editing the text that has been stamped into the document.

The second method is fine if you are confident with SQL; ask your Dynamics GP partner to help if you are at all concerned.

UPDATE sc
SET CMMTTEXT = REPLACE(CAST(CMMTTEXT AS VARCHAR(MAX)), 'Old Comment Text', 'New comment text')
FROM SOP10100 so
JOIN SOP10106 sc ON so.SOPTYPE = sc.SOPTYPE
    AND so.SOPNUMBE = sc.SOPNUMBE
WHERE CHARINDEX('Old Comment Text', CAST(CMMTTEXT AS VARCHAR(MAX))) > 0
    AND so.COMMNTID = 'CommentIDName'

The above SQL will update only orders stamped with the comment ID in question. It replaces the old text with the new, leaving any extra text intact, so it is low risk in terms of losing data. However, if someone has edited the part of the text that came from the template, the replace will not match. You may also need to be careful with comments that span multiple lines, as the line returns need to be taken care of; for ease you could always source the old text from an existing record, as below.
 
DECLARE @oldCommentText VARCHAR(MAX)

SELECT @oldCommentText = CAST(CMMTTEXT AS VARCHAR(MAX))
FROM SOP10106
WHERE SOPNUMBE = '{sourcedoc}'
    AND SOPTYPE = {sourcetype}

UPDATE sc
SET CMMTTEXT = REPLACE(CAST(CMMTTEXT AS VARCHAR(MAX)), @oldCommentText, 'New comment text')
FROM SOP10100 so
JOIN SOP10106 sc ON so.SOPTYPE = sc.SOPTYPE
    AND so.SOPNUMBE = sc.SOPNUMBE
WHERE CHARINDEX(@oldCommentText, CAST(CMMTTEXT AS VARCHAR(MAX))) > 0
    AND so.COMMNTID = 'CommentIDName'

Replace CommentIDName with the comment ID to target, and replace ‘Old Comment Text’ and ‘New comment text’ (or the source document placeholders) as appropriate. Please check this SQL before trying it and run it against a copy of your data in your test system first. These scripts are ideas only and are not claimed to be a tested solution.

Hopefully this post will help someone, let me know with a comment if it did.

Dynamics GP Item Stock Enquiry Window taking too long to display data

If it takes a long time for the Item Stock Enquiry window to display data after entering an item number, this is a sign of large amounts of data in GP slowing things down.

Stock Inquiry

In my example, after entering an item number into the stock enquiry window it takes nine minutes to display the data. That is not a typo; I timed it with my iPhone…

stopwatch

Why is this so slow?

The document that helps us is The Balance in Item Stock Inquiry is incorrect in Microsoft Dynamics GP. Under “more information” it explains that IV30300 (Transaction Amounts History detail) is loaded for the item and iterated through, totalling up the values to give the balance. Running the following SQL gives us the items with the most transactions in that table; the item I’m looking at has 9,253 rows.

SELECT ITEMNMBR, COUNT(ITEMNMBR) 
FROM IV30300
GROUP BY ITEMNMBR
ORDER BY 2 DESC

So this is more data than expected. The “problem” is that there is too much data for the way GP works. The solution is to trim the historical transactions: removing history from the IV module will reduce the number of records the window has to iterate through and bring the performance back to an acceptable level.
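To help decide on a cut-off date for removing history, a query along these lines shows how the history rows are spread by year (a sketch; it assumes DOCDATE holds the document date in IV30300):

-- Distribution of IV30300 history rows by year, to help pick a cut-off date for
-- removing inventory history. Assumes DOCDATE is the document date column.
SELECT DATEPART(year, DOCDATE) AS DocYear
    ,COUNT(*) AS TrxRows
FROM IV30300
GROUP BY DATEPART(year, DOCDATE)
ORDER BY DocYear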