Empty Matched To Shipment in Purchasing Invoice Entry of Dynamics GP

From time to time, on Euro transactions, we get a yellow triangle and an empty “Matched to Shipment” field in the Purchasing Invoice Entry window of GP.

Purchasing Invoice Entry Yellow Triangle

The problem is always the same: the following fields

  • CURNCYID
  • CURRNIDX
  • XCHGRATE
  • RATECALC

are all either default or missing for the item row in the table POP10500.

I am still looking for the breakthrough clue as to what causes this; I’m guessing network outages or other hardware failures interrupting processing somewhere. In the meantime I just fix the issue when it comes up, but having fixed it a couple of times I wrote a script to help.

If this is the problem that you are seeing too, then the following script can be used to fix it for a given PO number.

-- Runs inside a transaction and rolls back by default;
-- change ROLLBACK to COMMIT once happy with the rows affected
BEGIN TRANSACTION

UPDATE pl
SET CURNCYID = ph.CURNCYID
    ,CURRNIDX = ph.CURRNIDX
    ,XCHGRATE = ph.XCHGRATE
    ,RATECALC = ph.RATECALC
FROM POP10100 ph
JOIN POP10500 pl ON ph.PONUMBER = pl.PONUMBER
WHERE ph.PONUMBER = '{your po number}'
    AND pl.CURNCYID = ''

ROLLBACK TRANSACTION
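Before fixing anything, it helps to know which POs are affected in the first place. The following is a sketch that lists item rows where the line currency fields are blank but the PO header carries a currency (the symptom described above); it is untested outside our own environment, so check the results before acting on them:

```sql
-- List POP10500 item rows with a blank currency where the
-- PO header (POP10100) does have a currency set
SELECT pl.PONUMBER
    ,pl.CURNCYID
    ,pl.CURRNIDX
    ,pl.XCHGRATE
    ,pl.RATECALC
    ,ph.CURNCYID AS HeaderCurrency
FROM POP10500 pl
JOIN POP10100 ph ON ph.PONUMBER = pl.PONUMBER
WHERE pl.CURNCYID = ''
    AND ph.CURNCYID <> ''
```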

As always, I can’t know your GP environment, so this should be thoroughly tested in a test environment before running the script on production data. I can’t take any responsibility for what might happen on your systems.

Let me know in the comments if this helped you; it motivates me to blog more…

OLE Notes Migration

It was time to get our old OLE notes migrated. GP has stopped using OLE notes containers for attachments to notes in the system, mostly due to the need to work with the web browser hosting of GP. I didn’t want to import the attachments back into the new GP attachments feature, as no one had been screaming for the ones that had vanished after the upgrade (only a few enquiries). However, I did want to make them available should someone need them enough (don’t ask me for the criteria for “enough”!). So the plan is to run only the first part of the migration tool, the extract, and not the import. For details of the tool see the post by Encore:

The OLE note migration utility in Dynamics GP

The OLE notes directory totalled 46GB across 30k documents; running the migration utility brought the extracted size down to 3GB. Evidently the object containers were not efficient!

So the lesson is that you might want to extract your OLE notes even if you don’t intend to use them, as it will help your file sizes. In our case the storage is backed by a SAN that will be compressing things anyway, but extracting the files makes them usable straight from the directory.

What you get after extraction is a directory structure that starts with the Dynamics GP product ID, and then has a folder for each noteidx under that. The note index is the record ID for the note in the notes table. So for every product that uses notes and had attachments there should be a top-level folder, then subfolders for the note attachments.

It looks like the same kind of scheme used by the newer GP attachments feature, see the post:

Document attach feature Dynamics GP database and BusObjKey formats

The directory name is the hex version of the note index that the contents of that directory link to.

This is good enough for me to recover any documents requested by looking at the note index and converting it.
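As an illustration of that conversion, a note index can be turned into its hex folder name in T-SQL like this. This is a sketch only; the exact zero-padding of the folder name on disk is an assumption, so compare against a known folder first:

```sql
-- Hypothetical example: find the extraction folder for note index 1234567
DECLARE @noteindx INT = 1234567;
SELECT CONVERT(VARCHAR(8), CAST(@noteindx AS VARBINARY(4)), 2) AS FolderNameHex;
-- 1234567 decimal is 0012D687 in hex
```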

SSMS SQL Hazard

Let me point out a hazard of using SSMS when developing a SQL delete query, especially a more complex one that takes time to build up. I will also reflect on some of the ways I work and how they minimise the chances of damage to data.
Let us say we are developing this simple query to remove orphan records from the prices table of Dynamics GP:
 
DELETE IV00107
SELECT *
FROM IV00107
LEFT JOIN IV00108
ON IV00107.ITEMNMBR=IV00108.ITEMNMBR
and IV00107.CURNCYID=IV00108.CURNCYID
AND IV00107.PRCLEVEL=IV00108.PRCLEVEL
AND IV00107.UOFM=IV00108.UOFM
WHERE
IV00107.PRCLEVEL='LIST'
AND IV00108.ITEMNMBR IS NULL
 
…as we develop it, I highlight from SELECT down to NULL and hit F5 to run it. That will only run the “select”, to see what it will be deleting later. Later I would put a double dash in front of the SELECT and run the whole statement to get the DELETE to execute.
 
Let us say we take a break, then come back having got approval to delete the records. Our eye catches the DELETE, so we highlight from DELETE to NULL and hit F5. Oh no! We have just deleted everything in IV00107! Think about it: the DELETE and the SELECT are interpreted (correctly) as two different statements.
 
Working this way, developing the record set to remove as a SELECT, then adding the DELETE to the top ready to be run, is a common pattern for me. I usually put a double dash in front of the DELETE, as shown below, to prevent this mistake, or the mistake of just hitting F5 during testing without remembering to highlight the SELECT first. The action of uncommenting the DELETE triggers my mind to also comment out the SELECT.
 
--DELETE IV00107
SELECT *
FROM IV00107
LEFT JOIN IV00108
ON IV00107.ITEMNMBR=IV00108.ITEMNMBR
and IV00107.CURNCYID=IV00108.CURNCYID
AND IV00107.PRCLEVEL=IV00108.PRCLEVEL
AND IV00107.UOFM=IV00108.UOFM
WHERE
IV00107.PRCLEVEL='LIST'
AND IV00108.ITEMNMBR IS NULL

I made this mistake against a test SQL database today, a reminder of why you should develop against a test environment. Luckily I use SSMSBoost, a productivity plug-in for SSMS. This add-in warns when you are executing a DELETE without a WHERE clause, throwing up a fatal action guard window, so it prevented me from actually causing any damage.
 
ssmsboost fatal action guard window
 
Immediately I could see what I had done and hit the No button.
 
Another factor that led to this potential mistake was not aliasing my tables as I normally would. If I had written the following, the DELETE would not have been able to find the table i7, and hence would not have executed, returning:

Msg 208, Level 16, State 1, Line x
Invalid object name 'i7'

DELETE TOP(1000) i7
SELECT *
FROM IV00107 i7
LEFT JOIN IV00108 i8
ON i7.ITEMNMBR=i8.ITEMNMBR
and i7.CURNCYID=i8.CURNCYID
AND i7.PRCLEVEL=i8.PRCLEVEL
AND i7.UOFM=i8.UOFM
WHERE
i7.PRCLEVEL='LIST'
AND i8.ITEMNMBR IS NULL
 
Yet another practice that would have helped is that I would normally also have a TOP clause in the query. I would then press F5 repeatedly, in this case until there were no more rows to process. I use TOP to make each batch run faster and to avoid lock escalation (we have 2 million records in the price table). If there is too much to remove by hand with F5, I put it in a loop with a @@ROWCOUNT check to see if anything is left to process; that would not have stopped a disaster here, though. In reality a delete on this table would have taken so long that I would soon see my mistake and cancel the query before it committed, but I’m showing a principle in this post. So the TOP clause would have prevented quite so much data loss.
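For reference, the loop pattern mentioned above, batching the delete with TOP and a @@ROWCOUNT check, might look like this (a sketch only; test it against a non-production copy first):

```sql
-- Delete orphan price rows in batches of 1000 until none remain,
-- keeping each batch small to avoid lock escalation
DECLARE @deleted INT = 1;
WHILE @deleted > 0
BEGIN
    DELETE TOP (1000) i7
    FROM IV00107 i7
    LEFT JOIN IV00108 i8
        ON i7.ITEMNMBR = i8.ITEMNMBR
        AND i7.CURNCYID = i8.CURNCYID
        AND i7.PRCLEVEL = i8.PRCLEVEL
        AND i7.UOFM = i8.UOFM
    WHERE i7.PRCLEVEL = 'LIST'
        AND i8.ITEMNMBR IS NULL;

    SET @deleted = @@ROWCOUNT;
END
```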
 
Luckily I didn’t end up with any issues at all, but as is often the case, an accident only happens when a perfect storm of factors comes together. Keeping to my normal way of working protects me, but I thought others might learn from why I work the way I do.
 
If I had removed the records in production, it would not have been a big deal in this case, as we restore our production database into our test company regularly and automatically (see my post on how to do this). As these prices do not change much from day to day, I could have just squirted the missing records from that company into production, then restored the test company to a point in time and re-synced the prices from that restore, resulting in no impact on users. This is one of the reasons I’m a fan of having fresh copies of production available in test; it is also useful for patching up mistakes users make in a timely manner.
 

Disable Enhanced Intrastat Dynamics GP

I have fallen out with the GP Intrastat module; it has caused me problems with side effects in different parts of GP over the years. After another such incident, I decided it had to go, but I found it was not so obvious how to switch it off…

Microsoft Dynamics GP>> Tools>>Setup>> Company>>Company

Select the options button

Dynamics GP Company Options Window

The Company Setup Options window will allow access to the “Enable Intrastat Tracking” option. Try unchecking the checkbox. You will get the following dialog:

Please detach all debtors creditors and sites from all declarants before you unmark Intrastat Tracking

Microsoft Dynamics GP>> Tools>>Setup>> Company>>Enhanced Intrastat>>Setup

Intrastat Setup Window

You must ensure all users are out of the system before taking the next steps, as this changes the behaviour of the UI and back end. In the Intrastat Requirement Setup window, select Unmark All, then OK the window.

Go back to the Company Options window; you may now be able to deselect the Intrastat module.

You must then log out of GP and back in again; this will remove the buttons and other UI elements related to the Intrastat module.

Please be aware that this will stop the recording of Intrastat figures; make certain that your Intrastat returns can be completed in some other way before disabling this module.

I used the word “may” above because this worked for our test copy of our production company, but when it came to applying it to production over the weekend, it behaved differently, still throwing the above dialog.

Detaching declarants using SQL

As there does not seem to be any documentation around removing the module, and it was holding up some other critical projects, I ran the following SQL script to essentially do what the dialog was asking me to: remove the declarants.

Querying the system catalog views for tables containing the field DECLID, we find (for this version of GP and our modules) the following tables:

EDCEI016
VAT10302
*RM00102
*RM00101
*PM00200
*IV40700
EDCEI021
*PM00300
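
The catalog search itself was along these lines (any query over sys.columns and sys.tables will do):

```sql
-- Find every table in the company database that has a DECLID column
SELECT t.name AS TableName
FROM sys.columns c
JOIN sys.tables t ON t.object_id = c.object_id
WHERE c.name = 'DECLID'
ORDER BY t.name;
```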

The tables marked with a * are the ones linking the sites, debtors and vendors to declarants. So let’s remove the references to the declarants, backing up the data just in case…

SELECT CUSTNMBR, ADRSCODE, DECLID INTO RM00101_IntrastatBackup 
FROM RM00101 WHERE DECLID!=''
UPDATE RM00101 SET DECLID='' WHERE DECLID!=''

SELECT CUSTNMBR, ADRSCODE, DECLID INTO RM00102_IntrastatBackup
FROM RM00102 WHERE DECLID!=''
UPDATE RM00102 SET DECLID='' WHERE DECLID!=''

SELECT VENDORID, DECLID INTO PM00200_IntrastatBackup
FROM PM00200 WHERE DECLID!=''
UPDATE PM00200 SET DECLID='' WHERE DECLID!=''

SELECT VENDORID, ADRSCODE, DECLID INTO PM00300_IntrastatBackup
FROM PM00300 WHERE DECLID!=''
UPDATE PM00300 SET DECLID='' WHERE DECLID!=''

SELECT LOCNCODE, DECLID INTO IV40700_IntrastatBackup
FROM IV40700 WHERE DECLID!=''
UPDATE IV40700 SET DECLID='' WHERE DECLID!=''
 
The above script worked for me, but I cannot recommend you use it unless you have fully tested it and know your install of GP does not have any other quirks that need addressing (sensible disclaimer!). Certainly something like this should involve your support partner…
 
Remember to put those backup tables somewhere safe, so they are in no danger of upsetting a future upgrade or sitting there forever (move them to another database, or export them and then delete them).
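As a sketch of that tidy-up, assuming a separate archive database called ZARCHIVE exists (the name is hypothetical), the backup tables could be moved out of the company database like this:

```sql
-- Copy a backup table into the archive database, then drop the
-- original so it cannot interfere with a future GP upgrade
SELECT * INTO ZARCHIVE.dbo.RM00101_IntrastatBackup
FROM RM00101_IntrastatBackup;

DROP TABLE RM00101_IntrastatBackup;
-- ...repeat for the other *_IntrastatBackup tables
```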
 
Now you will be able to uncheck the intrastat check box.
Remember to log out of GP and back in to see the changes.
 
This will break and switch off your intrastat module installation and historical reporting, so make certain you are not going to need anything from it before you do this!

You may want to remove the Intrastat history too, using the GP remove history menu, before turning off the module. I’ve chosen to leave it there for now; I’ll remove it at a later point in time. I also cleared the VAT10201 and VAT10101 tables out to backup tables, just to highlight sooner rather than later where we might have integrations (I know we do) and jobs (I know we do) interacting with those tables.

After effects

After switching off the module, some flaky bits of GP started working again, such as product suggestions and rolling down address changes to lines after they have been changed in SOP.

rolldown address prompt

Bizarrely, some mods I did a long time ago that didn’t work, and that I had left in place, started working, even though I’d moved on and used a workaround since. Luckily the after effects were all positive; I’m now happy I’ve turned off this pesky module!

Also please note that if you use the country of origin and tax commodity code fields in Item Maintenance, these are hidden when the Intrastat module is disabled. I don’t see why even those who do not use Intrastat should be prevented from using these fields.

Item Maint window in GP showing optional content

I used a Visual Studio add-in to re-enable those fields on the form:

ItemMaint.TaxCommodityCode.Show()
ItemMaint.TaxCommCodeDesc.Show()

ItemMaint.CountryOrigin.Show()
ItemMaint.CountryOriginDescription.Show()

ItemMaint.LookupButton5.Show()
ItemMaint.LookupButton7.Show()

Do comment on your experiences too; it motivates me to write more!