Mark Polino has released Analytics for Microsoft Dynamics GP

This is a polished Excel document that you can point at your Dynamics GP database and get dashboard information presented in a nice interface.

See this post: Analytics For Dynamics GP – BI Solution From Mark Polino (DynamicAccounting.Net) #MSDYNG

Looking at the dashboard got me thinking. I really struggle with dashboards: I worry about what the summaries are hiding and always want to see more detail. That said, they are useful for some KPI measurements and as early, high-level warning signs too.

Personally, with 30,000+ customers and a similar number of products, typical dashboards need to take a different shape, so it's not for me. It is great to see, though, for others who may find it useful, especially where there are fewer customers and products to keep tabs on than we have.


Now back to my PowerPivot and DAX – loving the BI experience in Office 2013!

Case TSQL statement in WHERE clause causing scanning (ss2005)

AND 1 = CASE
            WHEN @IncludeFreight = 'Y' THEN 1
            WHEN (ItemNumber NOT LIKE '9898%') AND (ItemNumber <> 'FREIGHT') THEN 1
            ELSE 0
        END

The above is a snippet from a “catch all” query that I uncovered in one of our reporting SQL statements. It was declared WITH RECOMPILE, as it should be in order to optimise properly, but the query was refusing to finish executing against the 4.5 million rows in the table involved, generating lots of IO wait states on our table.

It seems that the query optimiser just couldn’t handle the CASE in the WHERE clause with parameters.

Changing it to the following made it execute in seconds.

AND ( @IncludeFreight = 'Y'
      OR ( @IncludeFreight = 'N' AND ItemNumber <> 'FREIGHT' AND ItemNumber NOT LIKE '9898%' ) )

Most of the theory behind this behaviour can be seen in this article: http://sqlinthewild.co.za/index.php/2009/03/19/catch-all-queries/
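As a sketch of how the rewritten predicate sits in a full query, assuming a hypothetical reporting table SalesLines (the table and its other columns are illustrative stand-ins, not the real report):

```sql
-- Illustrative only: SalesLines, DocumentDate and ExtendedPrice are hypothetical.
-- The CASE in the WHERE clause is replaced with plain boolean logic, which the
-- SQL Server 2005 optimiser can evaluate without scanning every row.
SELECT  ItemNumber, ExtendedPrice
FROM    SalesLines
WHERE   DocumentDate >= @FromDate
AND     ( @IncludeFreight = 'Y'
          OR ( @IncludeFreight = 'N'
               AND ItemNumber <> 'FREIGHT'
               AND ItemNumber NOT LIKE '9898%' ) );
```

On later SQL Server versions the linked article notes that a statement-level OPTION (RECOMPILE) can also let the optimiser embed the parameter values, but on 2005 restructuring the predicate as above is the reliable fix.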

Hacking Dynamics GP Perpetual Inventory

Background

It is a requirement that all inventory should be checked at least annually. There is a further desire to ensure that inventory on the shelf accurately reflects that shown in the Dynamics GP inventory module. Various factors can cause a mismatch, so for as long as inventory management has existed, stock checks have been used to verify quantities by physically counting the items in situ.

By stock counting items more frequently we are more likely to keep the levels in sync between the shelf and the inventory module of GP.

GP Stock Take

GP has a stock count feature where it is possible to assign a stock count frequency (how many times a year we seek to count an item). The dates when the last stock count was performed and when the next is due are stored against each item, for each location.

A stock take in GP is simply a list of all the items we want to stock check. The user can use various criteria to select the attributes of the items they want to put into this list. For the purposes of this discussion I am concentrating on using next count date as the basis for including items in the stock count. The user can select all items whose next stock count due date is today or in the past (i.e. overdue).

Counting

When a stock take is started, the current inventory levels are snapshotted from inventory and recorded against the items in the stock take list. The user then counts the items, entering the actual quantity found on the shelf next to the expected value from inventory.

An inventory transaction can then be built to correct the stock levels to those entered from the stock take.

The challenge

Setting the stock count intervals can become a challenge when item numbers reach this sort of scale:

image

Promoting a hands-off approach to the system, what is needed is a way to default the information for stock takes automatically.

SQL Help

It makes sense that the fastest turnover items should be counted more often, as they are more likely to drift from the values held in GP. Most companies will classify items; there is an item class in GP that could be used to classify the different types of items, or one of the user defined fields that GP provides.

A classic A,B,C,D code can be used for the frequency of item movements, stored in the user defined field.

A table can then be created that maps the classification to the frequency at which it should be checked. Let's call this table IV_STK_COUNT_INTERVAL, with the interval held in a COUNTINTERVAL column.

Classification (REORDERCAT)    Stock Check Frequency in working days (COUNTINTERVAL)
A                              30
B                              60
C                              90
D                              120
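A minimal sketch of creating and seeding this helper table (the column types are assumptions; REORDERCAT should match the type of the user defined field it joins to):

```sql
-- Helper table mapping item classification to count interval in working days.
-- Column types are assumed; align REORDERCAT with USCATVLS_6 in IV00101.
CREATE TABLE IV_STK_COUNT_INTERVAL (
    REORDERCAT    CHAR(11) NOT NULL PRIMARY KEY,  -- classification code (A, B, C, D)
    COUNTINTERVAL SMALLINT NOT NULL               -- working days between counts
);

-- Individual INSERTs rather than multi-row VALUES, for SQL Server 2005 compatibility.
INSERT INTO IV_STK_COUNT_INTERVAL (REORDERCAT, COUNTINTERVAL) VALUES ('A', 30);
INSERT INTO IV_STK_COUNT_INTERVAL (REORDERCAT, COUNTINTERVAL) VALUES ('B', 60);
INSERT INTO IV_STK_COUNT_INTERVAL (REORDERCAT, COUNTINTERVAL) VALUES ('C', 90);
INSERT INTO IV_STK_COUNT_INTERVAL (REORDERCAT, COUNTINTERVAL) VALUES ('D', 120);
```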

The stock count facility in GP has a calendar where working days can be defined, so all stock count activities are bound by working days.

For each item in the inventory sites table IV00102, there are five key fields of interest, held against each item in each site:

NXTCNTDT Next count date
NXTCNTTM Next count time
LSTCNTDT Last count date
LSTCNTTM Last count time
STCKCNTINTRVL Stock count interval

For the purposes of this example, ignore the time fields as that level of accuracy is not required.

Hence, by running a nightly UPDATE over a join between the classification/stock check interval table created earlier and the inventory table, the stock take interval can be tied to the classification of the product, automating the maintenance of this field.

Discontinued items and items that have never had inventory

Discontinued items with no stock left can have stock take intervals set to zero, so they don’t get counted. The next stock count date should also be “zeroed” to 1900-01-01. The same applies to items that have never had stock.

UPDATE IV00102
SET STCKCNTINTRVL = 0,
    NXTCNTDT = '1900-01-01',
    NXTCNTTM = '00:00:00'
FROM IV00101
INNER JOIN IV_STK_COUNT_INTERVAL
    ON USCATVLS_6 = REORDERCAT
INNER JOIN IV00102  -- effectively an inner join: the WHERE clause filters on IV00102 columns
    ON IV00101.ITEMNMBR = IV00102.ITEMNMBR
WHERE IV00102.LOCNCODE = 'MainWarehouse'
AND IV00102.STCKCNTINTRVL != 0
AND IV00102.RCRDTYPE = 2
AND (IV00101.ITEMTYPE = 2 -- discontinued
     OR NOT (IV00102.BGNGQTY > 0 OR IV00102.LRCPTQTY > 0)) -- no activity on this item, ever
;
-- Next, if the stock count interval is zero, set the next stock
-- count date to 1900-01-01 so it doesn't come out on reports
UPDATE IV00102
SET NXTCNTDT = '1900-01-01',
    NXTCNTTM = '00:00:00'
WHERE IV00102.LOCNCODE = 'MainWarehouse'
AND STCKCNTINTRVL = 0
AND NXTCNTDT != '1900-01-01'
AND IV00102.RCRDTYPE = 2;

Next Count Date

The next count date can be set by adding the stock take interval to the current date (taking into account non-working days). This is fine unless the creation of new products is lumpy in nature; in that case it is better to put some randomness into the interval to ensure a better spread of products, preventing days with no items to count followed by days with thousands!
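One way to introduce that randomness, as a sketch (the ±5 day jitter is an arbitrary illustration, not from the production script), is to use ABS(CHECKSUM(NEWID())), which yields a fresh random number per row:

```sql
-- Jitter the next count date by up to +/- 5 days so products created in bulk
-- do not all fall due on the same day. NEWID() is evaluated per row, unlike
-- RAND(), which would return the same value for every row in the UPDATE.
SET NXTCNTDT = DATEADD(day,
        IV_STK_COUNT_INTERVAL.COUNTINTERVAL
        + (ABS(CHECKSUM(NEWID())) % 11) - 5,   -- random offset in -5..+5
        LSTCNTDT)
```

This SET clause would slot into an UPDATE of IV00102 joined to IV_STK_COUNT_INTERVAL, in place of a plain DATEADD of the interval.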

In the script below, for the purposes of this article, the logic has been simplified to not use a calendar table; instead it assumes a five day working week. This is part of a stored procedure called nightly by a SQL Agent job.

UPDATE IV00102
SET STCKCNTINTRVL = CASE WHEN IV00102.QTYONHND = 0 THEN 0
                         ELSE IV_STK_COUNT_INTERVAL.COUNTINTERVAL END,
    NXTCNTDT =
        -- our stock interval does not include weekends, so add them on
        -- for this period before the date addition
        DATEADD(day, IV_STK_COUNT_INTERVAL.COUNTINTERVAL +
                ROUND(((IV_STK_COUNT_INTERVAL.COUNTINTERVAL / 5) * 2), 0),
            CASE WHEN LSTCNTDT = '1900-01-01'
                 THEN DATEADD(dd, 0, DATEDIFF(dd, 0, GETDATE()))
                 ELSE LSTCNTDT END)
FROM IV00101
INNER JOIN IV_STK_COUNT_INTERVAL
    ON USCATVLS_6 = REORDERCAT
INNER JOIN IV00102
    ON IV00101.ITEMNMBR = IV00102.ITEMNMBR
WHERE IV00102.LOCNCODE = 'MainWarehouse'
AND CASE WHEN IV00102.QTYONHND = 0 THEN 0
         ELSE IV_STK_COUNT_INTERVAL.COUNTINTERVAL END != STCKCNTINTRVL
AND IV00102.RCRDTYPE = 2
AND IV00101.ITEMTYPE = 1
AND (IV00102.BGNGQTY > 0 OR IV00102.LRCPTQTY > 0)

Stock take

In GP, select the items to add to the stock take by choosing next stock count date as your criteria, selecting your location and choosing 2nd January 1900 (to exclude the zeroed dates) through today's date as the date range. This will capture all overdue items and today's count items. You may find it useful to set up a Reporting Services report showing how many items are overdue, emailed daily as part of your KPI setup. This can be done by simply counting records, by site, where NXTCNTDT is not 1900-01-01 and is less than today.

After Stock Take

When an item is processed as part of the stock take, the next count date is updated to the current date plus the interval, and the last checked date is updated to the date of the stock take. This can cause a problem with uneven distributions over time: "hot" dates are created when too many items end up with similar dates to be checked. For this reason it is wise to create a report showing the distribution of items by date over a year, to anticipate and take action when peaks are identified. Run the report regularly, as the distribution is in constant flux while the different intervals are applied and the next check dates are pushed forward.

Set up

For the above reason, evenly distribute the dates over the year; this is especially relevant when seeding the dates for the first time.
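A hedged sketch of seeding first-time dates, scattering each item's first due date randomly across its own interval so the counts spread evenly from day one (the location code and never-counted test are illustrative):

```sql
-- Seed NXTCNTDT with a random offset of 0..COUNTINTERVAL-1 days from today,
-- so first-time counts spread across the whole interval instead of all
-- falling due on the same day. Only touches items never counted before.
UPDATE IV00102
SET NXTCNTDT = DATEADD(day,
        ABS(CHECKSUM(NEWID())) % IV_STK_COUNT_INTERVAL.COUNTINTERVAL,
        DATEADD(dd, 0, DATEDIFF(dd, 0, GETDATE())))  -- today at midnight
FROM IV00101
INNER JOIN IV_STK_COUNT_INTERVAL
    ON USCATVLS_6 = REORDERCAT
INNER JOIN IV00102
    ON IV00101.ITEMNMBR = IV00102.ITEMNMBR
WHERE IV00102.LOCNCODE = 'MainWarehouse'
AND IV00102.RCRDTYPE = 2
AND IV00102.LSTCNTDT = '1900-01-01';  -- never counted before
```

The same idea can serve as a levelling script later: rerun it against a date range that has become too "hot" to smooth the peak back out.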

One more note: a stock check should also be performed by location on a regular basis, in order to find items placed in the wrong bins or items showing no stock where stock exists.

Summary

Reporting services scheduled reports

  • Show number of items to count by date as distribution graph over year
  • Show number of items overdue count for KPI purposes

Counts

  • Count all the items due or over due a count
  • Check all locations regularly for items misplaced

SQL

  • Create regular job to align the stock take dates and frequency with the item class
  • Use a levelling script to smooth out the dates should the distribution get too lumpy
-- Overdue stock checks
SELECT NXTCNTDT AS DueDate, COUNT(*) AS OutstandingPIChecks
FROM IV00102
WHERE NXTCNTDT > '1900-01-01'
AND NXTCNTDT < GETDATE()
AND STCKCNTINTRVL > 0
AND LOCNCODE = '1'
GROUP BY NXTCNTDT
ORDER BY 1

-- Stock Calendar Exception Days
SELECT * FROM IV41001
-- Stock Calendar Setup
SELECT * FROM IV41000

--Show counts by week

--Uses helper table
--CREATE TABLE [CompanyCalendar](
-- [DateID] [datetime] NOT NULL,
-- [Day of Year] [smallint] NOT NULL,
-- [Day] [varchar](10) NOT NULL,
-- [Day of Week] [tinyint] NOT NULL,
-- [Day of Month] [tinyint] NOT NULL,
-- [Week] [tinyint] NOT NULL,
-- [Month] [tinyint] NOT NULL,
-- [Quarter] [tinyint] NOT NULL,
-- [Year] [smallint] NOT NULL,
-- [WeekDay] [bit] NOT NULL,
-- [Business_Day_Flag] [bit] NOT NULL,
-- CONSTRAINT [PK_CompanyCalendar] PRIMARY KEY CLUSTERED
--(
-- [DateID] ASC
--)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 90) ON [PRIMARY]
--) ON [PRIMARY]
--GO

SELECT [Week] AS CalendarWeek, COUNT(*) AS QtyPIChecks
FROM IV00102
JOIN CompanyCalendar
    ON NXTCNTDT = DateID
WHERE NXTCNTDT > '1900-01-01'
AND STCKCNTINTRVL > 0
AND LOCNCODE = '1'
GROUP BY [Week]
ORDER BY 1

Finding execution plan of executing query

If you have ever had a query that won’t run to completion, try this…

Captured here to preserve the information, and as a personal quick reference, from Viewing the Execution Plan of a Running Query in SQL Server.

  1. From SSMS find the session id of interest
  2. SELECT plan_handle FROM sys.dm_exec_requests WHERE session_id = {sessionid}
  3. SELECT query_plan FROM sys.dm_exec_query_plan (0x06001D009FE38431400399D4000000000000000000000000); -- substitute the plan_handle returned by step 2
  4. Save the result from above as a .sqlplan file and then open the file in Management Studio
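Steps 2 and 3 can also be combined into a single query with CROSS APPLY, avoiding the copy and paste of the plan handle (52 below stands in for the session id of interest):

```sql
-- Fetch the plan of the currently running request for one session in one step.
SELECT qp.query_plan
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_query_plan(r.plan_handle) AS qp
WHERE r.session_id = 52;  -- session id of interest
```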

There you go! Job saver!
