NAV TechDays 2017 - Day two

Day one is here: #NAV Tech Days 2017–Antwerp Belgium, Day one

Day two, and a tough start: it was Friday and a busy month was catching up with me. I dragged myself out of bed at a hideously early hour to catch the shuttle bus to the venue. Again the organisation was faultless, with stewards outside the hotel helping people to the buses, and said buses running right on time.

Breakfast was laid on again at the venue, with pastries, sandwiches, cereal and milk, yoghurt, coffee, fruit juices etc. A quick bite and then right into the first session of the day…

Find below some highlights of the day – I try to include enough to give you a taste of the sessions without reproducing them in full; you need to attend to get the proper experience (or watch the YouTube videos when they get released)!

PowerApps, Common Data Service and Common Data Model – Michael Nielsen, Mark Brummel

I was tired so didn't take many notes, but Mark and Michael walked us through the Common Data Model and the Common Data Service layer in Azure.

Michael pointed out that there are shortfalls in the CDM when it comes to basic ERP functionality. For example, sending an invoice to a different contact from the delivery contact is not possible, as the person's name is held at a level above the addresses. There are limitations and problems throughout the model, because it comes from the AX/CRM and other product teams and does not necessarily map well to NAV (or Dynamics GP, I would add).


Slide below shows the model, which you can navigate around from the repository if you so wish.


Slide below shows Swagger (an API documentation tool).



Easier and DevOps-friendly Dynamics NAV environments using Docker/Windows Containers – Freddy Kristiansen, Jakub Vanak, Tobias Fenster


This turned out to be the best session at NAV TechDays for me, partly due to timing, as I've been playing with Docker recently.

Here is a summary of the points from the presentation:

The NAV container is quite a large one due to the number of dependencies, such as .NET, SQL Server, IIS etc.

If you are new to Docker, don't go looking for a remote desktop connection into the container – you can't; it simply does not work that way.

One advantage of containers is that they use much less space: if you have multiple Docker images, only the differences between them are stored.

The base image is immutable and only changes from it are stored, so the more similar your containers are, the more efficient the storage.

Images are easily accessed and pulled down, giving a reliable, reproducible environment state for test, dev etc.

Resource governance: you can configure how resources are allocated to different containers.



Ports can be mapped from outside into the container using port mapping, if access is required from outside the host machine.

Docker uses NAT to allow access to the container from the host.

File System:

The file system is like chroot in Linux – the container's file system is mapped to a directory on the host.


Then came a demo of the first container.

Using docker run to start a new instance.
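From memory, starting one of these containers looked something like the following – the image name and the accept_eula parameter are as I recall the Docker Hub image of the time, so treat this as a sketch and check the current image documentation before use:

```shell
# Run a NAV container, accepting the EULA and publishing the web client
# port out to the host (port numbers illustrative):
docker run -e accept_eula=Y \
           -p 8080:80 \
           --name navserver \
           microsoft/dynamics-nav
```

The `-p` flag is the port mapping mentioned below: without it the container is only reachable from the host via Docker's NAT.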


How is the Docker NAV image created:

  • Base image based on Microsoft's Windows Server Core image
  • then SQL Server Express and IIS layered on top
  • plus installation scripts (PowerShell)
  • the NAV generic image applied
  • parts of the NAV installation DVD installed
  • then install scripts from the generic layer are added
  • the DVD is removed from the image after running
  • country databases added on top (the country images)

What happens when image is started?

The scripts can be overridden so that only the parts required are run. For instance, a corporate SQL Server could be used instead of SQL Server Express.

Note the "if necessary" qualifiers on what runs at startup.

Extending the NAV Docker image

The session then covered how to extend the standard NAV images:

Building your own image from scratch does not make sense, as you lose the benefit of the regular releases; it is better to extend the provided container.

The PowerShell scripts are written in a way that lets you extend the Docker images on the fly.

PowerShell starts everything up, running SetupConfiguration.ps1 from the start folder.

The scripts look for override scripts in a folder and run them if present.

Many of the scripts are empty placeholders provided as hooks for you to override, such as AdditionalSetup.ps1.

This provides many extension points.

Demo on extending the container

As best practice, your override should still call the original script, unless – as with certificate setup, for example – that does not make sense.

An example script could add domain users to the NAV Docker image, or do practically anything else.
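One way to supply these override scripts without rebuilding anything is to mount a host folder over the folder the start-up scripts check. The mount target below (c:\run\my) is as I recall the convention in these images – verify against the image documentation:

```shell
# Mount a host folder containing override hooks (e.g. AdditionalSetup.ps1)
# into the folder the container's start-up scripts look in:
docker run -e accept_eula=Y \
           -v c:\myoverrides:c:\run\my \
           microsoft/dynamics-nav
```

Anything in c:\myoverrides that matches a hook name then runs in place of (or in addition to) the placeholder script.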



Build and reuse your own images

After extending the standard NAV Docker images you need to persist that and reuse your images.

Dev, QA and test environments need to be as close to each other as possible, so this is a good use case for containers.

To capture an image, you can stop the container and then commit it, labelling the saved changes.

Normally images live in a public registry, but you can set up a private registry to store your own images in your environment and pull from there.

Create your own Dockerfile – an "image recipe" – and build images from it.
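A minimal, hypothetical recipe for this might look like the following – the FROM tag and the target folder are illustrative, not taken from the session:

```dockerfile
# Extend the standard NAV image rather than building from scratch,
# so you keep the benefit of Microsoft's regular releases.
FROM microsoft/dynamics-nav

# Drop our override hook where the start-up scripts look for it
COPY AdditionalSetup.ps1 C:/Run/my/
```

Build it with `docker build -t mypartner/nav-customised .` and push the result to your private registry.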

Resource governance:

Use resource limits on the containers so individual instances don't impact the whole host.

For example, on an out-of-memory error Docker stops and restarts the container, if configured that way.
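These are standard Docker flags rather than anything NAV-specific; a sketch with illustrative values:

```shell
# Cap memory and CPU for one container, and restart it automatically
# (up to 3 times) if it exits with an error:
docker run --memory 4g \
           --cpus 2 \
           --restart on-failure:3 \
           microsoft/dynamics-nav
```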

There is also a warm standby mode for scaling: run the same container on different hosts and the others take over if one fails. This is good for high availability.

Multi-cumulative update environment

Docker solves the massive headache of maintaining multiple cumulative update environments.

GitHub navcontainerhelper project

This provides a number of helpful PowerShell scripts for running NAV containers; they can be installed from the PowerShell Gallery too.
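From the PowerShell Gallery, getting going is roughly this – cmdlet and parameter names are as I recall them from the project, so check the README for the current syntax:

```powershell
# Install the helper module and spin up a NAV container with it.
Install-Module navcontainerhelper -Force

New-NavContainer -accept_eula `
                 -containerName "test" `
                 -imageName "microsoft/dynamics-nav"
```

The module wraps the docker run incantations (EULA flag, port mappings, credentials) shown earlier into one cmdlet.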

This session was interesting; I would like to see this applied to Dynamics GP, which is my usual development target. As Docker only deals with the service layer it suits the web client most, but with ClickOnce or copy deployment we could get value from it for the Windows client too.

There were many questions from the audience after this session. The buzz around Docker continued throughout the conference. Partners can see the benefit of managing many customer setups in a more efficient manner than VMs.

Source Code Management with Visual Studio Code Made Easy – Soren Klemmensen, Janas Andersen


The first half was an intro to source control: a rapid tour of WHY you should use source code management, then an introduction to Git, then how this looks in Visual Studio Code.


There are some good helper commands for NAV in VS Code – NAV export to Git, for example, which pushes all the objects from NAV into Git.
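The Git side of that round trip is simple enough to sketch at the command line; the object file below is a made-up stand-in for a real NAV export:

```shell
# Initialise a repository and commit an exported NAV object file,
# so every subsequent export shows up as a tracked change.
mkdir nav-objects && cd nav-objects
git init -q
git config user.email "dev@example.com"
git config user.name "NAV Dev"

# Stand-in for a file produced by a NAV object export
echo 'OBJECT Table 18 Customer' > TAB18.txt

git add TAB18.txt
git commit -q -m "Initial export of NAV objects"
git log --oneline    # shows the commit just made
```

From there a diff of two exports shows exactly which objects changed between releases.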

If you already work with source control then, apart from the VS Code Git and NAV integration parts, it was pretty slow, as it explained the basics of source control actions.

There was a demo of triggering automatic builds from check-ins, launching Docker for the build.

If this session represents the normal state of a NAV development team then I'm worried – he talked as if no one is doing Agile or using any kind of source control. Really?


Lunch was soup and a selection of filled buns with quality fresh salad, cheeses, fillings etc. A large number of these "stations" were dotted around the exhibition area, meaning no big queues for lunch and good footfall for the exhibitors.


The conversations on Docker were very prevalent in the airport shuttle queues and on the coaches back to the airports. Partner developers really bought into how Docker can radically change how they manage customer images.

It was also interesting to hear concerns about how to deal with the new world of GDPR, where having customers' data on your systems becomes a liability.

#NAV Tech Days 2017–Antwerp Belgium, Day one

For day two: Day two summary

So here I am in Antwerp, Belgium, attending my first NAV event. Being primarily involved with GP, this is both a research trip and a chance to network with the wider Microsoft Dynamics community.

NAV TechDays model

Venue and organisation

There are 1236 participants at today's event, all there to find out more about NAV development – that is a sizeable, focused audience! Look at the global magnet NAV TechDays has become!


That is an amazing turnout for such a niche – in fact, it is 200 up on the previous year. At this rate the event could well outgrow the venue in a couple more years if numbers continue to grow like this. The venue is a multiplex cinema complex that has been designed to double as a conference venue. It splits in two when a conference is on, providing very acceptable facilities. For the sessions it was nice to be somewhere with plush, comfy seats, and the projection screens were, obviously, giant and of excellent quality – big enough to show a real-time view of the presenter alongside the computer feed.

Exhibitors area

I feel it worthwhile to praise the NAV TechDays conference logistics and general arrangements: staying at the conference hotel, it has been effortless to attend from an attendee perspective. There were shuttle buses from the hotel to the venue, and shuttle buses laid on from the conference to the airports after the event. Food and drink were open and free throughout – coffee, chiller cabinets of Coke/Fanta etc., quality breads and soups, and a wonderful selection of excellent food choices in the evening. The venue is arranged with the food and beverage counters intermingled with the exhibitor stands. This makes for high-volume, repeated footfall for the exhibitors at each break, at lunch and at the walking dinner in the evening. The free bars in the evening were amazing, with a good selection of drinks, including Belgian beer tasting with a wide selection of beer strengths and types to try. All this makes it easier for the exhibitors to prey on the drunk and helpless! As the sponsors and exhibitors make these sorts of events economically viable, it is great to see them given the best opportunities. The signage around the venue and conference hotel meant all information was available up front, with shuttle timetables, cloakrooms etc. all well advertised. The social evening went well, with casino games and a busy floor right into the night. The sessions I attended all ran to time, starting promptly and ending on schedule.

Belgian beer

I found it interesting that there was a mini presentation area off the exhibition hall for short presentations by exhibitors during the breaks. As the area gets very loud at these times, with everyone chatting and getting drinks, Bluetooth headphones were worn by anyone wanting to watch. This solved the problem of condensing more into a limited time: those wishing to attend could grab a pastry and a drink and sit down for the mini session immediately. It was also good for the people running the sessions, as they were very visible, and the ones I saw were well attended.

The A/V set-up was on the whole good – helped by the fact that it was a cinema, so the acoustics were good, but the AV company did a good job in addition. I've seen it before, but it is worth pointing out that the Q&A at the end of each session used catch box microphones: basically a microphone mounted in a foam block, so the block can be thrown to the person wanting to ask a question, letting them quickly get the microphone and everyone else hear the question. The block also acts as a baton for whoever has the right to speak, as sometimes found in debates. The block shown has a headset mic lying on top of it that is nothing to do with the device.

catch box

This is an indie conference; these tend to be the best, as no commercial agenda (other than the conference's survival) influences them, generally meaning the content is much more in tune with what the visitors would like to see. It was nice to hear Microsoft say they will support the event as long as it is around. The NAV community seemed really friendly, and much of the day felt like a GP community event, just with fewer familiar faces.

Keynote opening Session – Vincent Nicolas, Thomas Hejlsberg

Next year NAV will be getting a rebrand, becoming Dynamics 365; the current working code name is "Tenerife". This led to speculation as to what #NAVTechDays will be called for 2018.

The Common Data Service (CDS) of Dynamics 365 has not been widely adopted yet; that must change.

Machine learning and other Azure services, such as Cognitive Services, are touted as areas to watch going forward, transforming the way businesses work in ways not seen before.

It was noted that machine learning examples can already be found in NAV (forecasting).

There was a push for adoption of the now comprehensive and constantly growing offerings in the Azure cloud.

A deeper insight was given into how the Service Fabric provisions a NAV instance so quickly while providing adequate reliability. Buffer tenants – pre-provisioned databases – wait in the wings and are dynamically allocated to a tenant when provisioning occurs, avoiding excessive preparation and completing within a few seconds. The database is shared between tenants on Azure SQL; the load each shard is under is monitored, and databases are moved around the elastic pool to even out the load on machines. Shared databases lead to extra work in backing up only the information belonging to a given tenant, but offer overall cost savings by sharing machine resources.

The elastic pool optimiser is responsible for making sure the load is balanced: if a database gets too hot, it is moved to a quieter host machine.

This is all the Service Fabric – it can kind of be thought of as the Azure operating system.

It was noted that the NAV team can now roll it out to a new datacentre quickly as required.

Telemetry is required to keep Azure running at the required service levels:

Monitoring – Diagnostics – Analysis

There is extensive telemetry used to keep the services running sweetly. Logs are stripped of personal information, as the monitoring software or nodes may not be in the same geo-location as the server; this is done to prevent personal info crossing country borders. Kusto, Geneva and other names were thrown around in the context of these operations.

ICM incidents are logged, and we saw behind the scenes how the network engineers can drill right into any problem, right down to the blocks of code that may be causing the issue.

Extensions 1 is dead, 2 is current, 3 is in planning.

This is a C/SIDE replacement with a new compiler.

We saw how the click-and-drag visual designer works for pages. This was impressive: it makes it easy to move a field, remove one, or add a new one. The visual designer mode seems very nice and helpful to use. The terms "page extension" and "form" must be used carefully or you'll get a laugh from the audience.

Once the visual design is in place, you use the Visual Studio Code editor with the AL extension to start coding against it. The design changes are exported to a zip file that is then extracted to the developer's file system, and VS Code is opened against it. Rather than being linked to the NAV database, Visual Studio Code simply uses the file system as its repository.

There is no project file, merely a folder structure.

The project is defined by the folders and the launch.json file.

The launch.json file contains the connection details of the server and is not normally checked into source control.

To know what NAV looks like, it is necessary to extract the symbol references from the NAV instance.

It was demonstrated how Visual Studio Code updates the NAV instance with F6, which overwrites any existing configuration.

It was shown that a startup object ID is also required in the launch.json file to get going.
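For reference, a launch.json for the AL extension looks roughly like this – the field names follow the AL documentation as I understand it, and the server values and object ID are placeholders:

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "al",
            "request": "launch",
            "name": "Local NAV sandbox",
            "server": "http://localhost",
            "serverInstance": "NAV",
            "authentication": "Windows",
            "startupObjectId": 22
        }
    ]
}
```

Because it carries environment-specific connection details, this is the file that stays out of source control.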

The PageExtension object type was shown in the context of extending the customer card.
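From memory, the AL shape of such an extension is roughly as below – the object IDs and the ShoeSize field are made up for illustration, not taken from the demo:

```al
// Add a field to the Customer table via a companion table...
tableextension 50100 "Customer Ext" extends Customer
{
    fields
    {
        field(50100; ShoeSize; Integer) { }
    }
}

// ...and surface it on the customer card.
pageextension 50100 "Customer Card Ext" extends "Customer Card"
{
    layout
    {
        addlast(General)
        {
            field(ShoeSize; ShoeSize) { ApplicationArea = All; }
        }
    }
}
```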

VS Code gets the symbol information from NAV so it knows what is present in the NAV instance – a bit like running DAG.exe against Dynamics GP to get the reference assemblies for GP add-ins.

F6 and F7 must be used carefully as you can overwrite changes if you don’t pay attention.

If further changes are made then they can be synchronised either way as required by VS Code.

It was announced that PageExtension has now been fixed so that it has access to CurrPage.

Some changes to the architecture are being made. As NAV holds its code in the database, the limit on record row size has caused problems for some, and multi-tenancy causes issues too. Each extension will now get its own table, making things better isolated. Companion tables are used for database extensions, which makes more sense on shared databases where one customer has extended the DB and others have not. The companion tables are SQL-joined to the base product tables, so the AL code does not care – it sees the record's fields through the join.

Deploy and Install

These operations have extra event triggers, introduced per database (I think) and per company, to allow upgrade code to run appropriately and to be able to tell the difference between upgrades and installs.

If schema changes mean a column is no longer used in the database, then the column is abandoned, to be cleaned up later by a routine that actually removes it. Upgrades also run like a transaction: if anything fails, everything reverts to how it was.


There are 44 APIs, all documented, for use in NAV, and the REST APIs now support actions too. The aim is for the APIs to be really easy for non-NAV people to consume.


For 2018 the UX team have worked hard. Although people tend to call the Windows client the Role Tailored client, the web client is really role-tailored too, so you still see this in the web client. New, cleaner, simplified navigation comes in; the Outlook-style sidebar is gone, as it doesn't fit the modern application. There is also a new click preview that pops up to preview things that can be clicked; this may eventually replace the fact boxes.

The grid views have been refined, inspired by the Financial Times website's treatment of figures in grids.


There was a quick introduction to Docker and how it can make life so much easier for getting the latest working images of the NAV product onto a local developer machine.

This included a demo, but as I’ve done this before, I didn’t pay attention.

Future is here…

There was emphasis that cognitive services will change the way we work and offer many new services, shown via the voice-comprehension trial video of McDonald's drive-through ordering that we saw at GPUG Summit in Tampa the previous October. It is still impressive, and it shows how, in controlled subsets, these technologies can make business processes more efficient and less error-prone.

It was also pointed out that SQL server now has the PREDICT() function so predictions using Azure ML can be made right from the database itself.

Deep Dive into the New Tools – Stanislaw Stempin, Jesper Schulz-Wedde

This was after the break, and it turned into an enjoyable, joined-up look at the development process in the new world of the designer and VS Code.

Get VS Code, install the AL extension.

Hit F1 and type al.go – this will kick you off.

After the wizard you have an empty project. The project is defined by the file structure; there are no project files. You do have a launch.json, which should not go into source control.

We saw how to build an extension that uses the Bing translation service to translate the language on an item.

Remember, VS Code is not connected to the NAV database; instead it is connected via a service, using symbol references to NAV. When finished, it compiles and zips up the content, sending it back to the service, which compiles it again and makes the objects that get bunged into the database.

Symbols are like the DAG in GP: they capture the current state of NAV so that at design time you can get IntelliSense etc.

The formats of the files are cleaned-up versions of the txt obtained by exporting code from NAV, made more structured and human-readable.

The many NAV designers are gone for now. You have to start with a blank code page, but IntelliSense, context-aware completion etc. in VS Code help you find your way.

The Code resembles the grids (so they say).

There are also a number of code snippets available for extending pages, tables etc. that make it easier.

Ctrl+F5 and F6/F7 are your friends in this world, but you've got to watch you don't overwrite changes in your DB with code from VS Code – take care with F6/F7.

Again it was pointed out that on the cloud you are not alone on the machine, so resources are shared and data is sandboxed.

Note that:

.NET interop is not available any more – although some useful, frequently used .NET functions have been reimplemented, like HttpClient. However, there is no access to the file system, as that is dangerous on a shared cloud machine. Replacements for file I/O in another form are being thought about.

Some platform APIs are not available

Some methods are not supported.

You can debug using a service. Debugging is started in a debugging context to allow others to continue using the application; it uses a bi-directional SignalR connection to debug remotely.

Multiple sessions may be debugged at the same time thanks to this context.

Apps can have dependencies on each other. They are compiled into a .dll-like assembly for referencing.

Client Addin

This used to take three pages to explain and was painful to implement, so improvements have been made – it is now really simple to do.

Client Translation

Resources can now be exported as XLIFF, a universal file format that language translators use and for which many editors are commonly available. Just set up the metadata appropriately to get it output for translation.
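To give a flavour, an XLIFF translation unit looks roughly like this – element names follow the general XLIFF 1.2 format, and the id and content here are invented:

```xml
<trans-unit id="Table 50100 - Field ShoeSize">
  <source>Shoe Size</source>
  <target>Schoenmaat</target>
</trans-unit>
```

A translator fills in the target elements in their editor of choice and hands the file back, with no access to the code needed.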


Legacy upgrades

It is possible using


and txt2al.exe

to export the code and get it converted to AL. It will not re-architect the code to be event-driven, but it helps get the legwork done.

Mark Brummel has a webcast about this that I've watched in the past on the NAV Skills YouTube channel (I think it's in this one?).

Improvements in testability.

There were improvements in the developer experience around testability – I had lost concentration by this point, but there are blog posts and webinars out there on this.

Azure Functions deep dive – Vjekoslav Babic

Most of the Azure Functions talk covered ground I'd seen before; it was more an introduction than a deep dive, although it did give the NAV context for their use. A common use case may be to work around the lack of .NET interop.

Performance considerations due to latency were also covered, demonstrated by stress-testing functions in different geo-locations. Something to be aware of if your organisation is split across the globe.


There was also a nice demo of how to get continuous deployment working with GitHub and Azure Functions.

Creating Great APIs – Anders Larsen, Nikola Kukrika


api entities

This turned out not to be about how to version, plan and document your API with Swagger. Instead it was about the NAV APIs, which to be fair was just as interesting. I tried to play with them back in the early preview days, but I guess nothing was ready back then, which is why I failed to get anything working.

The three ways to authenticate were covered, and how to get keys etc.

The APIs are off by default, so you have to turn them on; this was also shown.

end points overview

It seems the NAV team are being wagged by the Office team, as they have to conform to their standards: the API is part of the Office APIs, so it must be performant enough to be included in the Graph API. To get that performance it has been necessary to build shadow tables that pre-compute the computed columns and the like. I guess this is one advantage we have in Dynamics GP, as it holds the summary data in the DB rather than calculating it dynamically.

Complex types and parent–child relationships were also covered, along with reading binary data like images and PDFs.

The slides, when they come out, will be self-explanatory for this session, so I can't add much. We did get some handy URLs for getting started, though.


getstarted apis

Walking Dinner

It was a good chance to network – I got to meet Mark Brummel and James Crowter. Everyone says it, but one of the best things is meeting social media friends in person.


Finally I got a picture with #IamDynGP and NAVTechDays combined…