eConnect Invalid object name ‘PA01901’ GP2010 sp1

After a test upgrade of one of our GP installations we were getting the above error when trying to create a purchase order via eConnect. ‘PA01901’ is one of the Project Accounting tables, a module we don’t have installed.

Late into the night I Googled the issue and found almost nothing about it. I also tried CustomerSource, where again there were whispers but no substance.


I downloaded and applied the eConnect version 11 service packs, up to Service Pack 2, but we still had the error.

After some pointers from a GP contact of mine, it turns out that the eConnect stored procedure taPoHdr is to blame. Below is the check that procedure makes for the existence of the table, and the point at which the failure occurs.

if exists(select 1 from dbo.sysobjects (nolock) where name  = 'PA01901')
and exists (select 1 from POP10110 (nolock) where PONUMBER = @I_vPONUMBER)
and not exists
(select 1 from PA01901 (nolock) where PATranType = 6 and PADocnumber20 = @I_vPONUMBER)


The statement checks whether the table exists and, in the same breath, whether it has any matching rows. The problem is that the whole IF predicate is compiled as a single statement, so SQL Server has to bind the reference to PA01901 before any of the conditions are evaluated; if the table does not exist, that binding fails with the ‘Invalid object name’ error even though the first condition would have been false. The existence check should have been nested as a separate statement. An easy scripting mistake to make, though a shame it made it into production code. It does prove the importance of test coverage and checking all code paths…
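The behaviour is easy to reproduce in isolation. A minimal sketch (the table and procedure names below are hypothetical): deferred name resolution lets a procedure referencing a missing table be created, but the flat IF is bound as one statement when it executes, so it fails regardless of the first predicate.

```sql
-- Hypothetical repro: dbo.NoSuchTable does not exist in the database.
create procedure dbo.ReproFlat as
    if exists (select 1 from sys.objects where name = 'NoSuchTable')
    and exists (select 1 from dbo.NoSuchTable)
        print 'rows found'
go
exec dbo.ReproFlat    -- fails: Msg 208, Invalid object name 'dbo.NoSuchTable'

-- Nesting moves the reference into a statement that is never
-- reached (and so never compiled) when the table is absent.
create procedure dbo.ReproNested as
    if exists (select 1 from sys.objects where name = 'NoSuchTable')
    begin
        if exists (select 1 from dbo.NoSuchTable)
            print 'rows found'
    end
go
exec dbo.ReproNested  -- completes without error
```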


This can be corrected by nesting the checks, as shown below; this is how the problem is fixed when GP2010 SP2 is applied. You cannot make this change to the stored procedure yourself, as it is a protected stored procedure, encrypted to prevent tinkering. I imagine that even if you managed to decrypt it and apply the fix to a live environment, it would not be supported.

if exists(select 1 from dbo.sysobjects (nolock) where name  = 'PA01901')
if exists (select 1 from POP10110 (nolock) where PONUMBER = @I_vPONUMBER)
and not exists
(select 1 from PA01901 (nolock) where PATranType = 6 and PADocnumber20 = @I_vPONUMBER)


You could try scripting out the table from your Fabrikam/TWO sample database and using that script to create an empty PA01901 table. However, if you have not installed Project Accounting, the table is unlikely to be there either. My other concern with that approach is that once the table is present, other scripts might start behaving as if Project Accounting is installed, trying to add data to other tables that may not exist.
It is safer to just get GP2010 SP2 installed.

The above post was the only related post I could find.

Dynamics GP Macro Reference II

After working to support Dynamics GP macros in our Visual Studio add-in for GP, I discovered a layer of the macro language not touched on in Mark Polno’s publication of Kevin Gross’s GP Macro Reference.

How to use macros to activate .NET add in forms

Use the following command:
NewActiveWin dictionary ‘default’ form [customID] window [.NET form name]

[customID] seems to be a jumble of characters that uniquely identifies this as an add-in form.
[.NET form name] is the name of the .NET form we are trying to show.
NewActiveWin dictionary 'default'  form pARthOSt window AuxFormIV00101


Macro commands passed to RecordMacroItem

Any commands sent to the macro subsystem from the .NET form are wrapped in a “ShellCommand” statement. To record macro commands into the currently recording macro, call the following method on a form derived from Microsoft.Dexterity.Shell.DexUIForm.

RecordMacroItem(MacroCommandText as string, MacroComment as string)

The text passed in MacroCommandText is wrapped in a “ShellCommand” statement in the resulting macro file, as shown here, where the quoted text is the string that was passed as MacroCommandText when the macro was recorded:

ShellCommand 'ViewWebsiteInformationToolStripMenuItem_Click'
ShellCommand 'ClickHit field "btnOK"'
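As a sketch, recording a command from a button handler on an add-in form might look like the following (the form class and button name are hypothetical; RecordMacroItem is inherited from DexUIForm as described above):

```csharp
using Microsoft.Dexterity.Shell;

// Hypothetical add-in form; DexUIForm supplies RecordMacroItem.
public partial class AuxFormIV00101 : DexUIForm
{
    private void btnOK_Click(object sender, System.EventArgs e)
    {
        // While a macro is being recorded, this emits:
        //   ShellCommand 'ClickHit field "btnOK"'
        RecordMacroItem("ClickHit field \"btnOK\"", "User clicked OK");

        // ...normal click handling continues here...
    }
}
```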


Long Macro Lines

There is another scenario that must be dealt with: the add-in macro equivalent of
ContTypeTo field 'Account Description' , 'nt 1'
This command continues typing into the field Account Description, appending to whatever is already there. When a recorded add-in command is too long for a single line, the macro system wraps the whole lot in a ShellCommandBegin block, with each continuation line starting ShellCommandAppend. The following is an example of this arrangement:

ShellCommandAppend 'TypeTo field "rebHTMLText", "<ul>
<li>HDMI cable with swivel ends - up to 180 De'
ShellCommandAppend 'grees </li>
<li>Reduces stress on your cables and risk of disconnecting </li>
ShellCommandAppend 'i>Provides both high definition video and multi-channel audio connection between'
ShellCommandAppend ' digital high definition AV sources such as Blu-ray, DVD players etc. </li>
ShellCommandAppend 'Transfer bandwidth: 10.2Gbps / 340Mhz (v1.3)</li>
<li>Signal Type: Transmission '
ShellCommandAppend 'minimised differential signalling (TMDS)</li>
<li>Connector Type: Gold plated </'
ShellCommandAppend 'li>


Dynamics GP Visual Studio Tools Toolbar

How annoying is it that docking can’t be used when developing add-ins for Dynamics GP forms? If a panel is docked, it slides under the toolbar at the top but upsets the visual styles, such as the separator lines on the buttons.

See below, where the panel has been put behind the toolbar, losing the button effects and toolbar visuals (see highlighted area). The toolbar, it seems, is painted onto the form itself.


Just one of those niggles. I end up floating all my controls in a container with anchors set in all directions, which is nothing like as robust as just setting Dock to Fill.

Price table replicated to website


To provide our website with bang up-to-date product prices, exactly as they are in our ERP system, we replicate the price table from the ERP database to the website’s SQL Server database. The price table holds nearly two million price rows, covering many combinations of currency, item, price quantity, unit of measure, discount break and customer-specific price list.

The replicated table works great until a big price update is required. If most of the prices are updated, say in line with inflation, a good number of rows in the table are hit, and a BIG transaction has to make its way over the relatively thin wire to our website. From opening the transaction (and thus locking the subscriber table) to committing it can take a long time locally, and then the whole thing must travel over the slow connection to the website and be committed there. While all this is happening, the lock on the price table in the website database blocks any reads of that table until everything has passed through, bringing the website to a halt for tens of minutes.

To avoid this, the website queries that read the replicated table could be run as READ UNCOMMITTED transactions. However, this could lead to reading “dirty” records that are not ready for public consumption. That is significant when you consider these are price tables: serving prices that are in an unknown state is a no-no.


First Idea

The first was to take a database snapshot before the bulk price update and switch all the views over the table to use the snapshot, letting replication take its time updating the records in the underlying table. Once finished, the views could be switched back to point at the original table again. This could perhaps be controlled by a replicated signalling table from the ERP system, so that we don’t have to issue SQL between the databases. It should work well in that once the all-clear signal is set in the publication database, it will not propagate to the subscriber until all the other changes in the log have been replayed and committed there.
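A rough sketch of the mechanics, with all database, file, view and column names hypothetical:

```sql
-- 1. Freeze the current state in a database snapshot before the bulk update.
create database ERPReplication_Snap
on (name = 'ERPReplicationdatabase', filename = 'E:\Snaps\ERPReplication.ss')
as snapshot of ERPReplicationdatabase;

-- 2. Point the website's price view at the frozen snapshot
--    while replication churns through the big update.
alter view dbo.WebPrices as
    select ItemNumber, CurrencyId, UnitPrice
    from ERPReplication_Snap.dbo.PriceTable;

-- 3. Once replication has caught up, switch back and drop the snapshot.
alter view dbo.WebPrices as
    select ItemNumber, CurrencyId, UnitPrice
    from ERPReplicationdatabase.dbo.PriceTable;

drop database ERPReplication_Snap;
```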

Second Idea

The second idea was to switch the subscriber database into snapshot isolation mode, simply by executing the following commands:

ALTER DATABASE [ERPReplicationdatabase]
SET ALLOW_SNAPSHOT_ISOLATION ON

ALTER DATABASE [ERPReplicationdatabase]
SET READ_COMMITTED_SNAPSHOT ON

Once the database is in snapshot isolation mode, reads use versioned rows from the version store. Reads are never blocked: while a write transaction is underway, readers are served the last row versions committed before it began. So reading at the website is not blocked while the big transaction is applied, and what is read is the state from before that transaction started, which for our application is perfect.

No DML commands need issuing and no signalling between the databases; this is the cleanest solution for what is required. The next task is to ensure that all the changes made to the price tables are made inside one transaction, keeping the reads off any of the changed data until it is fully committed to the database.
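At the publisher, that amounts to wrapping the bulk update in a single transaction. A sketch, with hypothetical table and column names:

```sql
-- One transaction: readers under row versioning keep seeing the
-- pre-update prices until the commit replicates through and completes.
begin transaction;

update dbo.PriceTable
set UnitPrice = UnitPrice * 1.05     -- e.g. a 5% inflationary uplift
where CurrencyId = 'GBP';

update dbo.PriceBreaks
set BreakPrice = BreakPrice * 1.05   -- keep discount breaks in step
where CurrencyId = 'GBP';

commit transaction;
```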