Enforcing field level security in Dynamics GP

Field level security is a useful Dynamics GP feature. It is used to easily secure a field or form; however, I have seen the admin user interface confuse IT admins who are not used to the way GP does things, or simple mistakes made when setting up security, leading to some users not having the correct security applied.

fls

The problem I am solving in this post is where some of the field level security policies need to be always enforced/applied to (almost) all users in GP. To solve this I wrote a quick SQL script to enforce the field level security policies for one of the GP companies. The script ensures that all users who are not in the SYSADMIN class have the supplied list of field level security policies applied to them, for a particular company. This could be put in a stored procedure and run within a loop to apply to a list of GP companies, should that be required. It could also be set to run at frequent intervals via a SQL scheduled job, for peace of mind.
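
If you do go down the stored procedure route, a loop along these lines could apply it to each company. This is only a rough sketch: dbo.ApplyFieldLevelSecurityPolicies is a hypothetical wrapper around the script further below, and I am assuming the company IDs can be read from the SY01500 company master table in the DYNAMICS database.

-- Sketch only: dbo.ApplyFieldLevelSecurityPolicies is a hypothetical stored procedure
-- wrapping the MERGE script below, taking the company id as a parameter
DECLARE @LoopCompanyID smallint

DECLARE CompanyCursor CURSOR FOR
SELECT CMPANYID FROM DYNAMICS..SY01500

OPEN CompanyCursor
FETCH NEXT FROM CompanyCursor INTO @LoopCompanyID
WHILE @@FETCH_STATUS = 0
BEGIN
EXEC dbo.ApplyFieldLevelSecurityPolicies @CMPANYID = @LoopCompanyID
FETCH NEXT FROM CompanyCursor INTO @LoopCompanyID
END
CLOSE CompanyCursor
DEALLOCATE CompanyCursor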

If automating the script, I would recommend introducing a step to check that all the field security IDs hard-coded in the script still exist, perhaps raising an error if they don't, so the SQL job reports a failure to admins. For simplicity I have omitted that step from the main script below.
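
As a rough illustration of what such a check could look like, a guard along these lines at the top of the script would raise an error if any of the hard-coded IDs no longer exist in the field security master table (WDC41300), so a SQL Agent job step running it would fail and alert:

-- Sketch of a guard step: fail if any expected field level security id is missing
IF EXISTS (
SELECT expected.Field_Security_ID
FROM (
VALUES ('IV_INACTIVE')
,('PRICELEVEL')
,('SOPSHIPTO')
,('SOPSHIPTO2')
) AS expected(Field_Security_ID)
WHERE NOT EXISTS (
SELECT 1
FROM WDC41300 secid
WHERE secid.Field_Security_ID = expected.Field_Security_ID
)
)
BEGIN
RAISERROR ('One or more expected field level security IDs are missing from WDC41300.', 16, 1)
RETURN
END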


-- Script will apply the LISTED field level security ids
-- to ALL users that are not of class SYSADMIN
-- for the GP company id specified
--
-- Intention of the script is to prevent human error in setting field level security
-- allowing users to slip through without it applied to them.
-- It does this by enforcing the security on the selected company for all those users.
-- T.Wappat 2016-11-09
-- Check the validity of this script on a test instance before running in production
-- Author assumes no responsibility for errors and omissions, or for damages resulting
-- from the use of the information contained herein.

DECLARE @CMPANYID AS smallint
SET @CMPANYID = 2

MERGE WDC41500 AS target
USING (
SELECT ut.USERID
,ua.CMPANYID
,secid.Field_Security_ID
FROM SY01400 ut
JOIN SY60100 ua ON ut.USERID = ua.USERID
CROSS JOIN [WDC41300] secid
WHERE ua.CMPANYID = @CMPANYID
AND ut.UserStatus = 1
AND secid.Field_Security_ID IN (
'IV_INACTIVE'
,'PRICELEVEL'
,'SOPSHIPTO'
,'SOPSHIPTO2'
)
AND ut.USRCLASS != 'SYSADMIN'
) AS source(UserID, CompanyID, Field_Security_ID)
ON source.UserID = target.USERID
AND source.CompanyID = target.CMPANYID
AND source.Field_Security_ID = target.Field_Security_ID
WHEN NOT MATCHED BY TARGET
THEN
INSERT (
CMPANYID
,USERID
,Field_Security_ID
)
VALUES (
-- use the company id from the source rows rather than a hard-coded value
source.CompanyID
,source.UserID
,source.Field_Security_ID
);

Even if you don’t have the same example problem to solve, the above script may be a leg up and help orientate you so you can solve the problem that you do have. Apologies that it is written as a MERGE statement; when I started writing it I was expecting it to get more complicated than it ended up being, and an insert on a join would have done the job too.
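
For anyone who prefers the simpler form, the equivalent written as an INSERT guarded by NOT EXISTS might look roughly like this (same tables and hard-coded IDs as the MERGE above):

-- Sketch of the insert-on-a-join equivalent of the MERGE above
DECLARE @CMPANYID smallint
SET @CMPANYID = 2

INSERT INTO WDC41500 (CMPANYID, USERID, Field_Security_ID)
SELECT ua.CMPANYID
,ut.USERID
,secid.Field_Security_ID
FROM SY01400 ut
JOIN SY60100 ua ON ut.USERID = ua.USERID
CROSS JOIN WDC41300 secid
WHERE ua.CMPANYID = @CMPANYID
AND ut.UserStatus = 1
AND ut.USRCLASS != 'SYSADMIN'
AND secid.Field_Security_ID IN ('IV_INACTIVE', 'PRICELEVEL', 'SOPSHIPTO', 'SOPSHIPTO2')
AND NOT EXISTS (
SELECT 1
FROM WDC41500 applied
WHERE applied.USERID = ut.USERID
AND applied.CMPANYID = ua.CMPANYID
AND applied.Field_Security_ID = secid.Field_Security_ID
)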

 

Dynamics 365 Financials - Dynamics GP Migration Extension

Had a quick play with the migration tooling for Dynamics GP to Dynamics 365 Financials today, but failed pretty quickly due to the lack of a data export option in my version of GP. Here is where I got to:

In Dynamics 365 Financials go to: Services & Extensions –> Extensions –> Extension Marketplace, and download “Dynamics GP Data Migration”.

From the home screen go to Assisted Setup & Tasks –> Migrate business data

Use the … button to select Import from Dynamics GP rather than Excel.

GPMig0

The next step will ask for the file exported from Dynamics GP. I was puzzled as to how to generate this. To my knowledge there is no “Export Data” option in GP; I looked, and nope, none there. The instruction is to:

2) Open Export Data (File->Maintenance->Export Data)

GPMig1

From this reference, Migration options for Dynamics 365 for Financials, it turns out that this is a new GP option from the “GP 2016 June Hotfix, or GP 2013/2015 Nov 2016 Tax release” onward. There is some more information in the linked post about how the GP data will be imported too, so go read it.

So time to install and launch a newer version of GP before continuing….

Dynamics GP - Arithmetic overflow error converting IDENTITY to data type int

dexRowID

Anyone who looks at raw Dynamics GP tables is familiar with the column DEX_ROW_ID that occurs on most of them. It is an identity column: it auto-increments an integer for each new record inserted. In GP it is of integer data type, which means that if the column were seeded with an initial value of -2,147,483,648 it could increment up to 2,147,483,647 before running out of numbers to use. GP starts at zero as a seed, so this halves the quantity of numbers available.
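
If you want to confirm the seed, increment and last value generated for a particular table, the metadata is in sys.identity_columns. A minimal sketch, using IV00105 purely as an example table name:

-- Inspect the seed, increment and last value generated for a DEX_ROW_ID identity column
SELECT OBJECT_NAME(ic.object_id) AS TableName
,ic.name AS ColumnName
,ic.seed_value
,ic.increment_value
,ic.last_value
FROM sys.identity_columns ic
WHERE ic.name = 'DEX_ROW_ID'
AND OBJECT_NAME(ic.object_id) = 'IV00105'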

Today I started getting notifications from GP of an error that users were experiencing. It was essentially the following error when trying to use a product creation wizard add-in that I created in GP.

Type: eConnect.eConnectWebServiceException
Message: Arithmetic overflow error converting IDENTITY to data type int.
Arithmetic overflow occurred.

As it turns out, this is an issue I had expected to bite one day and never did anything about. I feel guilty; however, on the plus side I found the issue very quickly, it being related to the third table I checked. You may want to use the script further down to find the problem table if you are reading this while firefighting a similar production issue.

The error indicates that the IDENTITY column of a table is trying to use a value beyond what the data type (int) supports. In plain English, this means the DEX_ROW_ID of one of our tables is trying to use a number over 2,147,483,647, and it can't, as that is bigger than an integer allows. This does not mean there are that many records in the table, as some may have been deleted since they were created. DBCC CHECKIDENT in SQL can help find the next identity number about to be used.
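
As a quick first check while firefighting, something along these lines shows how much headroom an int identity column has left (IV00105 is used here only because it is the table from this story; substitute the table you suspect):

-- Minimal sketch: how close is the identity to the int ceiling?
-- IDENT_CURRENT returns the last identity value generated for the named table
SELECT IDENT_CURRENT('IV00105') AS LastIdentityValue
,2147483647 - IDENT_CURRENT('IV00105') AS ValuesRemaining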

In this case there are overnight processes that manage the 1.5 million selling price points, triggered by changes in item standard cost and exchange rates. Over the years this builds up to a lot of inserts and deletes of price (currency) records. It would seem that last night that process broke the IV00105 table, the currency price table. Later I found an error report from that process about this issue in my inbox; I would have found it after my morning coffee, had this not all kicked off first.

So what to do?

We have a large number of users hammering GP all day, so downtime is not desirable. Thinking on my feet, I looked to see what the min and max values were for DEX_ROW_ID in that table.

select MAX(DEX_ROW_ID), MIN(DEX_ROW_ID) from iv00105

Min: 637644
Max: 2147403452

I was originally running this intending to reseed back to zero, as my assumption was that there would be (as there is) a gap big enough to get through the day.

However, look again: 2,147,403,452 is not 2,147,483,647, the maximum integer value.

I checked what the next number the column would actually use with the DBCC CHECKIDENT command:

DBCC CHECKIDENT (IV00105)

Indeed, this verified the identity column was currently sitting at 2,147,483,647. This is helpful: some identity numbers must have been used up but the records never created, something that in itself needs checking out later, but for now it gave me the option of simply reseeding the identity column back down, to just above the last used number of 2,147,403,452. New records could then reuse the gap up to the ceiling limit. Executing the following SQL reset the seed on the table and let the users carry on.
DBCC CHECKIDENT (IV00105,RESEED,2147403453)

Problem solved for today, as there are now 2,147,483,647 - 2,147,403,452 = 80,195 identity values available before again reaching the limit of the column. That is plenty for one day. Tonight I will renumber the DEX_ROW_ID column to take out any gaps, compressing it to take up the minimum number of values, after which I will reseed to the new highest value. That should keep us going for another ten years…

Detecting DEX_ROW_ID identity columns near to the limit

Lastly, I was bothered that there may be other tables close to the limit; for example, I know IV00108 gets a lot of activity too. So I looked for a monitoring solution, as prevention is better than cure.

I found that Shaun J Stuart has a script, similar to others I've seen on Stack Overflow and elsewhere via Google, which would seem to suit my purpose!

I’ve added a parameter for setting the threshold and reproduced it here in case that source site ever goes down:

DECLARE @SchemaName sysname
DECLARE @TableName sysname
DECLARE @ColumnName sysname

DECLARE @threshold DECIMAL(3,2);
SET @threshold = .95;

DECLARE @Name sysname
DECLARE DBNameCursor CURSOR
FOR
SELECT name
FROM sys.databases
WHERE source_database_id IS NULL /* no database snapshots */
AND is_read_only = 0 /* no read-only dbs (log shipping) */
AND database_id <> 2 /* skip tempdb */
AND state = 0 /* online databases */
ORDER BY name;

CREATE TABLE #IdentityCheck
(
PK INT IDENTITY(1, 1)
,DatabaseName sysname
,SchemaName sysname
,TableName sysname
,ColumnName sysname
,ColumnType VARCHAR(8)
,MaxValue BIGINT
)

OPEN DBNameCursor
FETCH NEXT FROM DBNameCursor INTO @Name
WHILE @@fetch_status = 0
BEGIN

EXEC('USE [' + @Name + ']; INSERT INTO #IdentityCheck
SELECT ''' + @Name + ''' AS DatabaseName
,SCHEMA_NAME(o.SCHEMA_ID) AS SchemaName
,OBJECT_NAME(o.OBJECT_ID) AS TableName
,c.[name] AS ColumnName
,t.name AS ColumnType
,CAST(last_value AS bigint) AS MaxValue
FROM sys.identity_columns c
,sys.types t
,sys.objects o
WHERE c.is_identity = 1
AND t.system_type_id = c.system_type_id
AND o.object_id = c.object_id
AND o.type = ''U''
AND t.is_user_defined = 0 /* only look at system defined types */
AND t.name IN (''int'', ''bigint'', ''smallint'', ''tinyint'')')
FETCH NEXT FROM DBNameCursor INTO @Name
END

CLOSE DBNameCursor
DEALLOCATE DBNameCursor

SELECT 'Table [' + SchemaName + '].[' + TableName + '], column ['
+ ColumnName + '] in database [' + DatabaseName
+ '] is an identity column that has passed the configured threshold of the maximum value for that datatype.' AS Details
,'' AS ChangeRequirements
FROM #IdentityCheck
WHERE 1 = CASE WHEN ColumnType = 'bigint'
AND (CONVERT(DECIMAL(19, 0), MaxValue)
/ 9223372036854775807.0) > @threshold THEN 1
WHEN ColumnType = 'int'
AND (CONVERT(DECIMAL(10, 0), MaxValue)
/ 2147483648.0) > @threshold THEN 1
WHEN ColumnType = 'smallint'
AND (CONVERT(DECIMAL(5, 0), MaxValue) / 32767.0) > @threshold
THEN 1
WHEN ColumnType = 'tinyint'
AND (CONVERT(DECIMAL(3, 0), MaxValue) / 255.0) > @threshold
THEN 1
END

DROP TABLE #IdentityCheck



--Read more: http://shaunjstuart.com/archive/2013/05/identifying-identity-columns-approaching-their-limit/#ixzz4PPdawizh
--Under Creative Commons License: Attribution Non-Commercial No Derivatives

Running the above gives two result rows: the same table from the production company and from the test company (restored from production). Both are IV00105, the one causing me a problem today.

IdentResults

So this script should be added into our SQL alerting system, to email the result set whenever the query returns any rows, providing an early warning of this issue happening again in the future.
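
As a sketch of that alerting idea, assuming the check above writes its rows to a permanent table (dbo.IdentityCheckResults below is an illustrative name, as is the 'GPAlerts' Database Mail profile), a SQL Agent job step could do something like this:

-- Sketch only: email the results whenever the identity check finds anything
-- dbo.IdentityCheckResults and the 'GPAlerts' mail profile are illustrative names;
-- adjust the query to point at wherever the check writes its results
IF EXISTS (SELECT 1 FROM dbo.IdentityCheckResults)
BEGIN
EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'GPAlerts'
,@recipients = 'gp-admins@example.com'
,@subject = 'GP identity columns approaching their limit'
,@query = 'SELECT Details FROM dbo.IdentityCheckResults'
END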

Compressing the identity column

For this table I used the following script. I did so because I know there are no:

  • triggers
  • cascading deletes
  • any change tracking
  • table relationships

I would warn you to be very careful about how you decide to tackle this, and to think about what impact ripping the records out and putting them back again may have before doing it. There are different approaches that could be used; as always, check on a test environment first! I cannot recommend you use the same script for your system, as I don't know your install and how the script would interplay with it.

BEGIN TRANSACTION
--Create copy of data
select * into #temp From IV00105
--Empty existing table
truncate table IV00105
--Reseed identity to 1
DBCC CHECKIDENT (IV00105, RESEED, 1)
--Put the records back (listing columns so DEX_ROW_ID is regenerated by the reseeded identity)
insert into IV00105 (
[ITEMNMBR]
,[CURNCYID]
,[CURRNIDX]
,[DECPLCUR]
,[LISTPRCE]
)
SELECT [ITEMNMBR]
,[CURNCYID]
,[CURRNIDX]
,[DECPLCUR]
,[LISTPRCE]
from #temp
DROP table #temp
--Check it looks good
SELECT TOP (100) * FROM IV00105 ORDER BY DEX_ROW_ID ASC
SELECT TOP (100) * FROM IV00105 ORDER BY DEX_ROW_ID DESC
--Does the seed look good compared to the max record?
DBCC CHECKIDENT (IV00105)
ROLLBACK TRANSACTION
--COMMIT TRANSACTION

If this helped let me know with a comment…

GPUG Summit 2016 Tampa USA - Part 5

Sadly, the last Summit day started with a final walk into the conference centre along the river walk, with its artworks, miniature trams and other artefacts along the way.

miss

Today Bob started the day off by announcing next year's Summit details: Nashville, October 2017.

Summit 2017

He also let us know what the other events look like. Note how reIMAGINE has been moved earlier to deal with the issue of too many days out of the office close together. We also see AMPLIFY back again. Some were wondering if this would become a biannual or on-demand event, but it looks like it's annual for now.

calendar

 

The GPUG All Star awards were presented, and I got to have a close look at Amber's award; she was so happy!

Allstars[3]

Amber

Steve Endow and I had an advanced .NET developer session in the afternoon that was fun. I brought some IoT kit through airport security to demo in the session, and also talked about harnessing GP protocol handlers for your own applications. Steve showed some really cool demos, among other things using reflection to attach buttons dynamically, without programming, to GP windows. I was expecting eight people for a Friday afternoon session on something as specialist as ours, so when I counted thirty people while Steve was talking I was quite amazed. Thank you, every one of you, for turning up.

iot2

iot1

With that it was time for most people to go home. My flight was not until Saturday afternoon, so I was potentially in for a lonely last evening.
What better way to end the week: Belinda Allen asked me to join her, Jen Kuntz, Aaron Back, John Lowther and Mark Polino for dinner. I was feeling a bit starstruck in such company!

THANK YOU EVERYONE for the experience and friendship.

 

Saturday Morning

I explored Tampa's central library, which has linking walkways between buildings, some curious books and a train in the kids' book section.

last1last2

I then checked out the Art Museum, and as it was family day I got in free. Bonus!

We are spoilt with the big arts museums we have at home, but it was nice to potter around. While I was sitting deciding what to do next Jen tweeted that she was heading to the airport via a cheap shuttle. I contacted her to find the details and arranged to meet up for the shuttle.

last3last4

I grabbed a lovely lunch at the Duckweed Urban Grocery store/cafe, a great place to check out if you are in Tampa.

last5

I then got the hotel shuttle to meet Jen and started the long trek back to the United Kingdom, arriving home on Sunday afternoon having bypassed a night's sleep.

jen and Tim

GPUG Summit 2016 Tampa USA - Part 1
GPUG Summit 2016 Tampa USA - Part 2
GPUG Summit 2016 Tampa USA - Part 3
GPUG Summit 2016 Tampa USA - Part 4