Dynamics GP Web Client is not compatible with Docker due to the Session Central Service not starting

Back in 2017 I attended NAVTechDays and, after seeing the benefits the Dynamics NAV community were getting from Docker containers with NAV, I could see similar benefits for Dynamics GP. I was also seeing interest in private conversations going on within the GP community. I decided to see if I could get a good selection of Dynamics GP versions into Docker. Word of that work spread and brought out more people interested in containers for GP.

I spent a lot of hours attempting to learn Docker and to containerise Dynamics GP. I could build Docker images for the GP server database and add in various GP components like eConnect and oData. Sadly I was unable to get the Dynamics GP Web Client working – I really was just a cat's whisker away from success too. I was defeated by the Domain Account the Web Client runs under and its use of System.DirectoryServices.AccountManagement. The following is a record of what I found when investigating that issue.

Although I can create a Docker container for the database and test company, and even things like eConnect, there is a big problem. You see, there is no GUI with Docker containers and you can't just remote desktop into one, so you really need the Web Client running in order to make a self-contained system that allows the host to interact with the Dynamics GP application in the container. If you can't access it from a browser, then you end up having to install the GP client on a non-Docker machine (the host) and use that to connect to the running Docker container. Although this works, it's not really in the spirit of what I was trying to achieve, which was a fully self-contained container for GP, allowing multiple versions of GP to be easily spun up for testing and demo purposes.

Why did I fail?

After the Docker image build installs GP and the web client product features, one of those features, the Session Central Service, fails to start within the container. The Session Central Service is the component of the Dynamics GP web client installation that creates and tracks individual web client sessions as users connect and disconnect via web browsers. It is essential to getting the web client to work.

At this point let me jump right to the core issue: essentially the problem is that the “Lanman” Windows service can’t be started in a Windows container. The Lanman service provides things like SMB file share services and is also referred to as the “Server service”, as that is how it is named in the services control panel list.

Containers are supposed to be stateless, isolated, single-app process groups, not mini virtual machines, so running Lanman would break that philosophy and it is not appropriate for it to be supported in Windows Docker containers. Containers are supposed to be created and torn down at the drop of a hat; participating in Active Directory goes against the grain. This is why, along with a few other Windows features that are inappropriate for containers, the Lanman service just isn’t supported in a Windows container.

So why is this a problem?

You find that the Session Central Service checks the user it is running as, see (2) in the following screenshot, and it does this using a call to the following method:

Dynamics.GP.Web.Foundation.DirectoryServices.PrincipalManager.GetPrincipal

This method in turn calls the .NET method:

System.DirectoryServices.AccountManagement.Principal.FindByIdentityWithType(PrincipalContext context, Type principalType, IdentityType identityType, String identityValue)

It is the above call that fails; looking in the event log we see it reporting, as indicated by (1) in the screenshot below, that:

Exception message:
                      The Server service is not started.

Possible root cause

Hey? – "The Server service is not started" – a confusing message, as it means that the Lanman service (aka the Server service) is not started. This was a super confusing error at the start of this investigation; not realising there was a "Server service", I assumed it was just a poor error message that wasn't telling me which service had failed. Once I realised what it meant, I found that, indeed, if you go into the Docker container with PowerShell you can check that the Server service is not started and even try to start it, however it will not start, due to the lack of support in Windows containers as previously stated. The Lanman service can't be run in a Windows container, thus preventing the use of .NET's System.DirectoryServices.AccountManagement. Microsoft have in the past also confirmed there are no plans for this to ever change, as it is fundamentally at odds with container philosophy to have this service running.

Without a small change to the Session Central Service code I was not going to get anywhere with this until…

Docker and Group Managed Service Accounts (gMSAs)

Hope followed when I discovered the existence of Group Managed Service Accounts (gMSAs), which I thought might provide a route around the lack of Lanman support. Using a gMSA we can authenticate to Active Directory resources from a Windows container that is not part of the domain (remember the container is not domain joined). A gMSA is a special account that (over-simplifying it) allows a domain account to be passed over for the purposes of running a service. This could be perfect for replacing the user running the Session Central Service; instead it can run using a gMSA for authentication, which in turn would bypass Lanman. When you run a container with a gMSA, the container host retrieves the gMSA password from an Active Directory domain controller and gives it to the container instance. The container will use the gMSA credentials whenever its computer account (SYSTEM) needs to access network resources. In our case, when building the Docker container we can use PowerShell to change the Session Central Service to use the SYSTEM account, having prepared the plumbing required for gMSA to authenticate against AD. SYSTEM becomes a kind of proxy for the AD account behind the gMSA. For this to work certain prerequisites need to be met, and there is quite a bit of learning involved that is more than I want to get into here. I built this up in Docker to test.

Sadly, on testing this thoroughly for a few days, it turns out there is another problem. A Docker container can access AD using the gMSA account, allowing FindByIdentity to work WHEN the constructor for PrincipalContext is created with “ContextType.Domain” and specifies an accessible AD. The group managed service account technique does not work when the context type “ContextType.Machine” is used to create the PrincipalContext.

If we look at what is going on in the class in the screenshot, we find that the class is created with a constructor specifying machine as the context, stumping our attempts to use the gMSA to get around the lack of Lanman. This is due to no user name getting passed to the IsMachineContext function (see the last screenshot). I'm guessing that using the built-in NETWORK user means no user name is passed into this method, resulting in the highlighted code (1) being run in the following screenshot. This was upsetting to find as I thought I had it cracked at this point.

Not all that learning is a loss, as it was interesting to learn about gMSAs, an important container and enterprise computing concept to understand.

The following screenshot shows the code causing our issue (1) in the Microsoft.Dynamics.GP.Web.Foundation.DirectoryServices.PrincipalManager.

 

GetPrincipal2

 

The user name being blank when the following IsMachineContext method is called means the split in the logic in the above screenshot causes it to create the PrincipalContext with the wrong context type for gMSA to be used.

ismachinecontext

Finally


We can create containers with the database and test company, but sadly we end up having to install the GP client on the host to access the container, which was not really my objective.

It would be amazing if the code could be updated by Microsoft, but I don’t see this happening. I need to refresh myself on the exact details of this issue and check again that nothing has changed in the years since I was trying this, as Docker is constantly being improved with new features.

Dynamics GP VAT rate change and how to deal with it

8th July 2020 and the UK government have announced a temporary reduction of the VAT rate from 20% to 5% for the hospitality sector (on food, accommodation and attractions), from 15th July until 12th January 2021. This gives you only one week to get advice from your ERP support on how to set up your system for the new VAT rate, and for some (depending on your services and product mix) this will involve, for the first time, having to deal with different rates of tax on the same documents. Some people are SO going to be regretting not originally setting up invoice reports etc. to dynamically pull the tax rates at the line level! Consultants are going to be super busy this week just helping people through the basics of reconfiguring their system to work with the new temporary VAT rate. For many this is also going to involve some consultancy to put the rates back in January (depending on how self-sufficient the GP site is).

What should you think about?

You must check your supporting systems and documents, such as Excel, your GP reports and Reporting Services reports, for places where VAT rates have been “hard coded” (written in rather than looked up from the system). These will need addressing, preferably by doing the extra work to dynamically look them up at run time, or by replacing them with more generic wording if appropriate (say in terms and conditions of sale where the VAT rate may have been specified in words). As time is short you may just have to be less perfect: note where they all are, change them, and get ready to swap the old report for the new report/document on the day and back again in January, assuming the rate actually goes back then…

Next you have three issues:

  • Ensure new transactions from 15th July get the new 5% rate
  • Dealing with existing transactions, particularly SOP documents that will be fulfilled/processed after the 15th July, as they will retain the previous tax rate unless further work to recalculate them is performed.
  • Putting the rate back again in January 2021

 

You have some avenues to explore:

  • Introduce a totally new Tax Schedule and Tax Detail IDs for the new rates. On the night of the change, update all the records to point at the new tax detail IDs (a query to find the affected schedules is sketched below).
  • Just add a new Tax Detail ID, then take the old one out of the existing schedule.
  • Change the value on the existing Tax Detail ID (not recommended).
  • Turn on the effective tax dates feature (GP2010+) and configure the effective records for the date in question. If no effective date exists it will fall back to the base tax rate, so this will work well.
  • Use the Regenerate Tax tool to update your existing transactions.
  • Use a macro to update each document (depending on your transaction volume, this could be very painful).

Some of the above options provide a clearer picture than others when it comes to reporting your VAT. In particular, don’t just change the tax percentage of the existing Tax Detail – this is bound to give you issues, so consider that too.
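If you go down the route of swapping detail IDs, it helps to know exactly which tax schedules reference the old detail ID before the night of the change. Below is a minimal sketch of that check – the table and column names (TX00102 as the tax schedule detail table, TAXSCHID, TAXDTLID) are from memory and the detail ID shown is hypothetical, so verify them against your own install before relying on this.

SELECT TAXSCHID, TAXDTLID
FROM TX00102                  -- tax schedule detail table (verify the name in your install)
WHERE TAXDTLID = 'UK-STD-20'  -- hypothetical old tax detail ID
ORDER BY TAXSCHID;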

Dynamics GP may, from version GP2010 (with an extra product), use Date-effective Tax. This allows date ranges to be stored against different tax rates for tax details, the appropriate tax rate then being used by a transaction based on the document/posting/tax date (as configured). This is an extra module that needs installing prior to GP 2015 R2, but after that it should be waiting for you to enable in company settings, if you haven't already.

There is also the useful “Regenerate Taxes” tool that allows existing transactions to have their tax values updated once the detail ID has been updated.

Andy in his blog goes through some of the options with older versions of GP: All Change on 20% Tax Again January 2011, which should be read together with Changing SOP Tax Rates using the Support Debugging Tool.

 

The “What’s New in Microsoft Dynamics GP 2015 R2” document explains all this nicely, as copied below:

Date effective tax

You can now enter the tax rates in advance for transactions, and calculate the taxes based on the rates specified for a date range. You can also recalculate the taxes for the saved transactions by modifying the specified tax rates based on the latest tax rates specified by the tax authorities.

You can mass modify the tax rates, and regenerate the taxes for the saved transaction batches.

Follow these steps to set up a date effective tax:

1. Open the Company Setup Options window.

Company Setup Dynamics GP Effective Tax Dates

2. Microsoft Dynamics GP >> Tools >> Set up >> Company >> Company >> Options

3. In the Company Setup Options window, mark the Enable Tax Date checkbox, to allow you to use the tax date option for calculating the tax rate for the transaction based on a tax date.

4. Mark the Use Date-effective Tax checkbox. Select the option to calculate the tax. The option that you select here is used for updating the taxes for saved transactions within the tax effective date range. Transactions that are corrected will be based on the date you select here. You can select the Document date, Posting date or the Tax date to calculate the tax.

NOTE: Select posting date if you want to calculate tax based on the posting date that you specify. Select Tax date if you want to calculate tax based on the tax date that you specify. Select Document date if you want to calculate tax based on the document date that you specify.

You must set up the date range for the specified Tax ID to calculate the tax amount or percentage for the transaction. Follow these steps to create the date range for the tax ID.

1. Open the Date effective Tax Rates window.

Date Effective Tax Rates Window Dynamics GP

GP Tax Detail Window with Date Effective Tax Rates

2. Microsoft Dynamics GP >> Tools >> Set up >> Company >> Tax Details >> Select the tax ID in the Tax Detail ID field to set up the date range for >> Date effective Tax Rates.

3. Specify the date range for the specified tax amount or percentage. You can mass modify the tax percentage of multiple tax IDs.

You can choose to modify the tax percentage or amount for the tax detail or the tax type based on a date range, or for the tax without specifying any date range, or both. Follow these steps to mass modify the tax percentage.

1. Open the Mass Modify Taxes window. Microsoft Dynamics GP >> Tools >> Utilities >> Company >> Mass Modify Tax Percentage

2. Select the option to modify the tax percentage.

You can choose:

• Date effective Tax – To update the dates specified in the date range, and the percentage that you have specified in the Date effective Tax rates window.
• Tax details – To update the tax percentage that you have specified in the Tax Detail Maintenance window.
• Both – To update the tax percentage that you have specified in the Date effective Tax rates window, and in the Tax Detail Maintenance window.

3. Click Insert to view the list of the tax IDs listed for the modification. NOTE: Within the specified Tax IDs, if you do not want to modify any Tax ID, you can select the Tax ID record, and click Remove.

4. Click Modify to modify the listed Tax IDs based on your specifications. You can also regenerate taxes for the transaction batches with the updated tax percentage. You can regenerate taxes for all the saved transactions of all the modules (except GL) or the selected module, only if you mark the use date effective tax in the Company Setup Options window.

You can specify the modules and the batches for which you want the taxes to be regenerated, or regenerate the taxes for all the batches in all the modules (except GL) at one time.

Follow these steps to regenerate the taxes for the saved batch transactions.

1. Open the Regenerate Taxes window. Microsoft Dynamics GP >> Tools >> Company >> Utilities >> Regenerate Taxes

2. Select the module to regenerate the updated tax for the batches. You can choose All to update the tax for the saved batch transactions in all the modules (except GL). Or, you can choose the module to update the tax for the saved batch transactions in the selected module.

3. If you choose a particular module to update the batch transactions, specify the batches in the batch range fields.

4. Click Insert to view the list of the batches that will be updated with the modified tax percentage. NOTE: You can select a batch ID record and click Remove, to prevent updating the transactions with the modified tax percentage.

5. Click Process to recalculate the tax for the transaction in the module and batch specified.

 Important:

1. The tax calculation of a transaction will be overridden if a date effective tax rate exists for any tax detail.

2. If the tax calculation routine does not find the rate for a particular date range, then the percentage in the tax detail maintenance will be taken.

3. For the receiving transaction entry, only shipment/Invoice will be considered for date effective tax calculation.

4. For the returns transaction entry, only return with credit and inventory with credit type of transactions will be considered for date effective tax calculation.

5. You can regenerate taxes for transactions when workflow is active for the Receivables Management Batch Approval, Payables Management Transaction and Payables Management Batch Approval.

 

References:

How to handle VAT rate changes in Dynamics GP (perhaps dated now)

Changing Tax Rates and Date Effective Tax Rates in Dynamics GP

What’s New in Microsoft Dynamics GP (see 2015 R2 section)

Changing SOP Tax Rates using the Support Debugging Tool

All Change on 20% Tax Again January 2011

Dynamics GP will not remove hold or why timestamp format in GP matters

If Dynamics GP will not let a SOP module transaction process hold be removed in the user interface, it may be due to the formatting of timestamps. Check that no integrating applications have created that hold incorrectly.

The issue is that GP is really fussy about the format of timestamps. One place timestamps can cause problems is in the SOP Transaction Hold table; the following query illustrates what you need to know.

SELECT [PRCHLDID], [HOLDDATE], [TIME1]
FROM SOP10104
WHERE SOPNUMBE='YourDocumentNumber' AND SOPTYPE=2

This query gives us some data to look at – that is, if some holds exist for the document supplied to the SQL query...

The first thing to notice is that GP often splits the time and date components of a point in time over two fields; in this case the point in time is represented by HOLDDATE and TIME1. These two fields are both of the SQL data type "datetime", something that can catch people out when new to GP databases, as it may not be totally obvious what is going on.

Let's look at the first row. It is for the hold type "CREDIT" and represents the point in time 3rd of May 2020 at eleven minutes past three and 31 seconds.

  • It is interesting and important to note that the HOLDDATE field has a date, but all the time elements of that datetime value are zeros. A default time of zero (midnight) is used to fill the unused time component, as the unused half of the datetime SQL data type may not be set to null (we need all those bytes).
  • Now also note how the TIME1 field value has a time component, but the date has been defaulted to the 1st of January 1900. By convention this date is used to fill the date component of the time, as the unused half of a datetime data type may not be null.
  • Further, it is vital to note that the milliseconds of the time have been zeroed too.

Now if you disobey these format rules, the GP user interface will do weird things. More often than not it just doesn't do anything, which can be really perplexing to troubleshoot because, to the untrained eye, the records look fine and similar to other records right next to them. Something as simple as including milliseconds (not zeroing them) can prevent a user from removing a hold in the GP user interface!

These timestamps are used in this way in many tables, where the same rules apply. It is very important to provide correctly formatted time and date fields if integrating with, or manually scripting against, the GP database. This is a really common rookie mistake for developers working with GP; it's easy to get lazy or not notice the formatting is wrong. It can also change behaviour and totals, as time span calculations may not give the same outcome if time components are left in date fields etc. Imagine the impact when totalling values over time spans, urgh.
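If you are scripting directly in SQL rather than .NET, the following is a minimal sketch of building a GP-style pair of values from the current time; the column aliases simply echo the hold table fields from the query above.

DECLARE @now datetime = GETDATE();

-- Date half: the real date with the time component left at midnight
DECLARE @gpDate datetime = CONVERT(datetime, CONVERT(date, @now));  -- e.g. 2020-05-03 00:00:00.000

-- Time half: whole seconds since midnight added to the 1900-01-01 base date,
-- which also leaves the milliseconds zeroed
DECLARE @gpTime datetime =
    DATEADD(SECOND, DATEDIFF(SECOND, CONVERT(date, @now), @now), '19000101');  -- e.g. 1900-01-01 15:11:31.000

SELECT @gpDate AS HOLDDATE, @gpTime AS TIME1;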

In .NET, to save some head scratching, use the following snippet to make a GP-compatible timestamp for the current time:

// Build the time-only half: today's time of day placed on the 1900-01-01 base date
DateTime now = new DateTime(1900, 1, 1).Add(DateTime.Now.TimeOfDay);
// Then strip the milliseconds so the value is in the format GP expects
DateTime forFieldUse = now - new TimeSpan(0, 0, 0, 0, now.TimeOfDay.Milliseconds);

Don't be tempted to simplify this by accessing DateTime.Now in two places in the same statement, as you may find the time has shifted enough between the calls that the milliseconds do not zero correctly – two different values would be returned and subtracted, not resulting in zero! DateTime values are immutable, so you can't simply zero the milliseconds component either.

To get the date-only value for use in GP date fields, you may simply use the built-in .NET property ".Today", as shown in this snippet:

DateTime dateOnly = DateTime.Today; // today's date with the time component left at midnight, as GP expects for date fields

 

Hopefully this will help you get back up and running after Googling the problem you've just encountered! If so, do comment – it keeps me motivated to blog more.

SQL indexed views are incompatible with Dynamics GP

Trying to use indexed views with Dynamics GP? – well I’ve been there and I have seen others attempt the same. The Dynamics GP database and application are not really compatible with SQL indexed views. To help out those searching around this subject I thought I should write up our experience.

What is an indexed view?

An indexed view is a SQL database view that has had an index applied to it. This sounds obvious, but the important thing to realise is that the index will be materialised as an index on disk. Think of it as another table created on disk representing the data held in the view; this is why it is quick to retrieve summary data from it. This index "table" is maintained whenever the data that it covers is altered. So it is easy to see that indexed views are great for creating summaries and filtered views of data that cross multiple tables, as SQL Server will keep track of changes automatically and update the index "table", keeping the data in sync with the data in the various tables it derives from. As the data that represents the SQL view has been pushed to disk and persisted there, querying that summary data is lightweight and quick, because the work to process the data – pre-joining between tables and/or summarising – was done when it was stored, rather than the database engine having to do all that work on the fly each time the query is run.

However, there is a downside. When any index is created, you pay for the benefit of fast reads with slightly longer write times, and those longer writes lead to longer-lived locks on the data, which can in turn lead to blocking. This may cause performance issues in some circumstances. The extra writing is because the data held in the index "table" that represents the view needs to be maintained (updated) whenever any of the data underpinning the view changes. Depending on the workload, table sizes, IO performance etc., this may be significant work if large numbers of records are updated at once. And, as with any SQL index, it takes space on disk too. For large, wide views over large tables this may be a consideration, bloating storage with the knock-on consequences that brings.
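If you want to see what that disk cost actually is, sp_spaceused reports the rows and space used by the materialised index behind an indexed view. The view name below is the example one created later in this post, so run it after building that view.

-- Space consumed by the materialised index behind the example indexed view
EXEC sp_spaceused N'SalesVoidedLinesSummary';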

One of the most frustrating matters when working with indexed views is that there is a whole heap of constraints and restrictions on what is permitted in the query that forms them. For example, when using GROUP BY, the view must contain COUNT_BIG(*) as a column, and various database setting restrictions apply too (there are many, many more). This means that when designing even the simplest views, SQL compile errors warning that it is not possible to do "this and that” frequently cause annoyance. It is obvious why this is the case when you think about it: because the data is persisted to disk, it needs to be unchanging to store it, thus any SQL function used by the view has to be deterministic – aha, no “getdate()” function! It is surprising how often this catches people out. I could go on and on about the restrictions and requirements of indexed views, but just go and try making one and you’ll discover the pain yourself, then go read the documentation to realise how much there is to it!

Another example, shown below, is the SQL SET options that are required by indexed views…

IndexViewSettings
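In case that graphic does not come through, these are the session settings SQL Server documents as required when creating an indexed view (and when changing data in the tables underneath one) – the same ones the example script further down sets explicitly:

SET ANSI_NULLS ON;
SET ANSI_PADDING ON;
SET ANSI_WARNINGS ON;
SET ARITHABORT ON;
SET CONCAT_NULL_YIELDS_NULL ON;
SET QUOTED_IDENTIFIER ON;
SET NUMERIC_ROUNDABORT OFF;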

Using indexed views with Dynamics GP

Professionally we use indexed views a lot in our applications for the speed of access and real-time integrity of summary data they offer, so sooner or later a GP admin or GP developer decides they would quite like to use an indexed view... – then it all ends in tears. Let us see why…

WITH SCHEMABINDING

This is the first problem. To build an indexed view the view must use SCHEMABINDING. This locks the database schema and the view together. Unfortunately this may then cause issues with application updates. When Dynamics GP is updated for a service pack or upgrade, sometimes tables or other database objects need to be dropped and recreated. This happens during the upgrade process; however, if the object to be dropped is schema bound, then the upgrade script will not be allowed to do what it wants, causing the upgrade to fall over. Hence if indexed views have been bound to the GP tables, this is a real risk when a site comes to do an upgrade. Obviously if the person performing the upgrade was aware, they could drop the view and recreate it after the upgrade, but in real life that knowledge is lost as employees leave or contractors move on, causing upgrade pain.
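A quick way to spot this risk before an upgrade is to list any schema-bound views in the company database; a sketch using the standard catalog views is below.

-- List schema-bound views that could block a GP upgrade script
SELECT s.name AS SchemaName, v.name AS ViewName
FROM sys.views AS v
JOIN sys.schemas AS s ON s.schema_id = v.schema_id
WHERE OBJECTPROPERTY(v.object_id, 'IsSchemaBound') = 1;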

The second problem is more terminal. Let's try it and see what happens if we create a view over the sales order header and lines to speed up counting the sales lines by country that are not voided in historical transactions. Create the view on a non-production database like this…

 

SET NUMERIC_ROUNDABORT OFF;
SET ANSI_PADDING, ANSI_WARNINGS, CONCAT_NULL_YIELDS_NULL, ARITHABORT,
    QUOTED_IDENTIFIER, ANSI_NULLS ON;
-- We must schema bind
IF OBJECT_ID ('SalesVoidedLinesSummary', 'view') IS NOT NULL
    DROP VIEW SalesVoidedLinesSummary ;
GO
CREATE VIEW SalesVoidedLinesSummary
    WITH SCHEMABINDING
    AS 
       SELECT  COUNT_BIG(*) Cnt,
               SOP30200.SOPTYPE,
               SOP30300.CCODE as Country
       FROM 
       dbo.SOP30200
       JOIN
       dbo.SOP30300 ON SOP30200.SOPTYPE=SOP30300.SOPTYPE AND SOP30200.SOPNUMBE=SOP30300.SOPNUMBE
       WHERE
       SOP30200.VOIDSTTS=0 --Not Voided
       AND SOP30200.SOPTYPE IN (1,2)
       GROUP BY  SOP30200.SOPTYPE, SOP30300.CCODE
GO

 

Then create the indexed view by adding the index. - REMEMBER I SAID NO PRODUCTION DATABASES WITH THIS LITTLE EXPERIMENT!


--We materialise the view by creating an index on it
CREATE UNIQUE CLUSTERED INDEX IX_SOP30200SOP30300VoidSummary
    ON SalesVoidedLinesSummary (SOPTYPE, Country);
GO

 

We can now query the view data like so..

SELECT  * FROM SalesVoidedLinesSummary where SOPTYPE=1 AND Country='PT'

and, for comparison, query the underlying tables directly (as we would have done before creating the view) with this…

SELECT  COUNT_BIG(*) Cnt,
               SOP30200.SOPTYPE,
               SOP30300.CCODE as Country
       FROM 
       dbo.SOP30200
       JOIN
       dbo.SOP30300 ON SOP30200.SOPTYPE=SOP30300.SOPTYPE AND SOP30200.SOPNUMBE=SOP30300.SOPNUMBE
       WHERE
       SOP30200.VOIDSTTS=0 --Not Voided
       AND SOP30200.SOPTYPE = 1
       AND SOP30300.CCODE='PT'
       GROUP BY  SOP30200.SOPTYPE, SOP30300.CCODE

 

...and we find that, from not having the view to having the view, we have dramatically reduced the query time, as shown below. I’m not sure about the scientific rigour of those figures, but this isn’t a discussion about query optimisation – just accept that indexed views can solve performance problems.

    Total execution time without view:  17390   
     Total execution time with view:        505   
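Those timings are from my own quick test; if you want to reproduce a similar comparison on your own data, one simple way is to switch timing statistics on around each query and read the elapsed times from the Messages tab.

SET STATISTICS TIME ON;

SELECT * FROM SalesVoidedLinesSummary WHERE SOPTYPE=1 AND Country='PT';

SET STATISTICS TIME OFF;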

 

In production

So the developer or IT pro, after testing that out on the test company database, says “great, now let’s create it on the production database as that worked like a charm”, and then goes to log in to GP and create a sales order, having created the view…

...to then get the dismay of the error that follows...

“An edit operation on table ‘SOP_Master_Number_SETP’ failed. A record was already locked.”

IndexViewGPError1

and

“INSERT failed because the following SET options have incorrect settings: ‘QUOTED_IDENTIFIER, CONCAT_NULL_YIELDS_NULL, ANSI_WARNINGS’. Verify that SET options are correct for use with indexed views and”…

GPIndexViewError2

… and then the support calls start flowing in from the users!

So why can’t we have nice things?

Go back up in this post: as you saw in the graphic, there are certain SQL SET options required to make indexed views work. Sadly these are not compatible with the SET options required to make Dynamics GP work (see the DSN configuration reference below) – thus the application breaks, simple as that! If those phones are still ringing from your users right now, then just drop the view and your users will be happy again.
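For completeness, this is all the clean-up takes – dropping the view also removes the clustered index that was materialising it, and GP behaves normally again (the IF guard mirrors the create script above):

-- Remove the indexed view so GP clients can insert/update again
IF OBJECT_ID ('SalesVoidedLinesSummary', 'view') IS NOT NULL
    DROP VIEW SalesVoidedLinesSummary;
GO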

We did play around a bit but never found a solution to this issue, but if you have a work around do let me know.

References

Mariano Gomez has a nice post about why this is required by Dexterity to make GP run correctly in this post:

Microsoft SQL Server DSN Configuration

Microsoft - Creating Indexed Views

Gavin wrote this which inspired me to document the issue before others fall into it...

Brief overview and comparison of how summary values are stored and calculated in Dynamics GP, Dynamics NAV and Dynamics 365 Business Central