Bob Dylan sang “The Times They Are a-Changin’,” and his words could not be more true for the Microsoft ERP sector this year. We are all anticipating news on Dynamics 365 at the NAV User Group Summit in Tampa tomorrow. Later this month NAV 2017 will be released, and Project Madeira is in full swing.
Microsoft is making a huge push to the cloud, and the key is going to be a development concept called extensions. We already have examples of these built for Project Madeira in Microsoft’s AppSource.
Extensions were introduced in NAV 2016, but with limited capabilities. The idea is to code outside the main codebase of NAV and connect your code to the standard system with hooks. This of course makes upgrading the base much easier and should, in theory, also make upgrading extensions easier. The code is more modular and more transparent to handle. NAV 2017 is supposed to improve extensions significantly, and we can certainly expect that trend to continue.
The concept of extensions does raise some interesting questions. For example, when you need to run your code inside a standard codeunit, you only have a limited number of places where it can be executed. If none of those hooks work for you, then you might have issues. Additionally, I am not sure how you would extend an extension. I am sure Microsoft is working on those questions and might already have answers.
Obviously, with extensions you can upload your code changes to the cloud and improve an existing system without knowing the underlying base code, which I assume is the ultimate goal.
iNECTA will be in full swing at the NAV UG summit in Tampa. Please check us out at booth #1045.
Also, if you are new to NAV, check out the session “Stump the Experts: New 2 NAV”, at which I am speaking on Thursday, October 13th, 3:00 – 4:15.
I, along with the team at iNECTA, am very excited to attend NAV UG Summit this week. The NAV user group has grown tremendously over recent years to become the “go to” conference for Dynamics NAV users and partners alike.
One of the main things I like about NAV UG is that users can share experiences and solutions with one another. As a partner, our role is to elevate the user as much as we can. We are the trainers that teach the users to fly. Once the user has gained a basic understanding of the system and is starting to feel comfortable moving around, NAV UG is an excellent venue to take that knowledge to the next level.
I encourage everyone to socialize with other users and talk about their experiences. Even if you are not in the same field of business, the knowledge will usually apply. That is one of the greatest benefits of working with a system like NAV: it is built on generic business concepts that are meant to solve any type of operational issue that arises. The same instrument could solve one issue at your company and a different issue at another. A good example of this is an old favorite called item charges, which I have used to solve many different problems.
I want to send my thanks and admiration to the organizers of NAV UG for their efforts in building this group to such size and success. With all the new things being delivered by Microsoft every year, these conferences are now an absolute must-attend for the progressive company that puts technology at the heart of its success.
Cash flow is usually a concern for every business. Historically, ERP systems haven’t been very good at providing realistic cash flow reports, mainly because the computer assumes things happen on the day you predetermine in the system. For example, if you have an outstanding sales invoice, you’re supposed to get paid on the due date, and the system will assume you’ll get the funds into the bank on that day. This is, of course, not always the case. The same goes for your accounts payable.
Similarly, if you have investments being made on a schedule, you wouldn’t want to log those investments into your chart of accounts or bank before they happen, but you would want to see them on your cash flow statement.
This is the reason why Excel has been the tool of choice for CFOs managing cash flow. Although Excel is great at many things, it’s not connected to your ERP. There are solutions that allow you to connect Excel, but in my mind, it’s not the way to go. The best way to manage the process is inside the ERP, where you have access to all the data and can build logic to connect the dots.
A couple of years ago Microsoft released a cash-flow-specific module inside NAV. The idea is to maintain a separate cash flow ledger that is not connected to the General Ledger or any of the customer or vendor sub-ledgers. You start by creating a cash flow forecast and then suggest entries from the sub-ledgers into a worksheet.
This makes dealing with the ever changing cash flow easier. Your AR and AP are now suggestions, which you can manipulate. You can also add manual entries. Everything is posted and recorded, so you can go over your assumptions at a later date and see if you were on the right track or not.
Finally, the system generates cash flow forecast pages and charts so you can keep up with where you’re at, helping you make sure your business has enough cash to run comfortably. For further information about this module, check out the YouTube video.
When discussing the operational system of a company, the conversation usually revolves around how to push the right buttons to get a product in and out of the supply chain. Every department can detail what they think is the right way of receiving the baton and passing it on. In the countless meetings I’ve attended, I feel that, most of the time, the essential point is missed…
People will obsess whether or not it’s easy to work with the system:
Can the system copy/paste information into Excel?
Can I easily find things?
Is it easy to navigate?
Does it look nice?
Does it have lots of reports?
Although Dynamics NAV has all of those qualities, I want to take a step back and clear the table for a minute. Imagine we are looking for the perfect system…
The perfect system of course would look nice and be easy to work with but those would not be its greatest qualities. The single most important quality of a perfect system would be AUTOMATION. The perfect system would do everything for you!
I often wonder why this is not the subject of every meeting that I attend. Let’s try to install the perfect system. Is it that we’ve given up on the possibility? Or are we afraid of losing our jobs to the omnipotent computer system in the sky… or cloud rather?
Do we think that if we’re not coming to work and spending our mornings typing in bank transactions or matching our accounts receivable, then the world isn’t right? If the system makes it easier to key in and is pretty, then the world is great? We would be able to export to Excel to double-check our work and spend a couple of hours making a nice report for our managers.
The need for a new system is often driven by the people who work with it every day. They want a better system, but not a system that takes away their job. I believe this is the reason why we’re still missing the point.
The inevitable advance of technology will ultimately automate and optimize every process. The companies that are too slow to adapt will lose to the competition and be left by the wayside. All levels of the corporation must embrace technology every day and race onwards. The goal simply has to be the perfect system.
If you push for this within your organization, you will not be out of a job. Your value will INCREASE as a proponent of technology.
With the perfect system as a goal, we would then start talking about automating business processes. How can we absolutely minimize the human interaction with a particular process?
Some processes can be completely automated with today’s technology; others cannot. For the ones that still need human interaction, I emphasize adopting an exception-handling process. The system can be set up so it handles most of the process automatically, and for the instances it does not, there should be an exception list. An operator then deals with the exceptions as they come their way.
Dynamics NAV is an excellent choice for people adopting this way of thinking. The standard system already ships with many instruments to automate process flows and with the plethora of independent solution vendors available, the journey to the perfect system is made easier.
Finally, Dynamics NAV acts like clay in your hands. Even though I don’t advocate heavily customizing your system, you don’t want the system architecture to slow you down on your way to achieving ERP Nirvana. NAV practically begs you to align it to your needs.
Here at iNECTA, we wake up every morning with our guts full of ideas on how to streamline the world. Seeking technological perfection is our way of life. It’s our belief that as we’re improving each company, we are at the same time enhancing value in the world. Value that everyone can benefit from.
When implementing Dynamics NAV, I often find myself explaining supply and demand. Companies that trade products often manage the supply and demand parameters outside the ERP system. Often the ERP system does not support real planning tools, and therefore the staff is forced to come up with alternative ways. In those cases, I would go as far as saying the system is not an ERP system; it is more like an order and invoice printing system. In my opinion, QuickBooks falls into that category.
Real ERP systems, like Dynamics NAV, are built around supply and demand. That is the whole idea of an ERP. Whenever we have a demand, like a sales order, the system automatically triggers to fulfill that demand with either a purchase order or inventory. This can get complicated: you could have demand from sales orders, production orders, transfers and forecasts, to be filled by inventory, purchase orders, production orders and transfers.
Some people insist on filling each demand with a particular source of supply, such as a sales order fulfilled by a specific purchase order, and so on. If the item is homogeneous, there is no need to work this way. The system should aggregate the demand and generate purchase orders for the supply. Furthermore, it can forecast future demand and make sure you are covered for it. All you have to do is monitor the decisions.
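To make the aggregation idea concrete, here is a small sketch in Python; the data shapes are hypothetical and do not reflect NAV’s actual planning tables:

```python
from collections import defaultdict

# Sketch: aggregate demand per item, net it against inventory and open
# supply, and suggest one purchase line per item instead of order-by-order.
def suggest_purchases(demands, inventory, open_supply):
    need = defaultdict(float)
    for item, qty in demands:          # sales orders, production, forecast...
        need[item] += qty
    suggestions = {}
    for item, qty in need.items():
        shortfall = qty - inventory.get(item, 0) - open_supply.get(item, 0)
        if shortfall > 0:
            suggestions[item] = shortfall
    return suggestions

demands = [("SPEAKER", 10), ("SPEAKER", 5), ("CABLE", 3)]
inventory = {"SPEAKER": 4}
open_supply = {"SPEAKER": 6, "CABLE": 8}
print(suggest_purchases(demands, inventory, open_supply))  # {'SPEAKER': 5.0}
```

Two sales orders for the speaker collapse into one net shortfall of 5, and the cable needs nothing because open supply already covers it; that is the essence of planning by aggregate demand rather than order by order.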
All of the statements above seem natural and reasonable, but it is incredible how often companies think they are different and cannot adopt this principle. Most of the time they are wrong, and their business processes simply have to be refined.
In Dynamics NAV, the user uses worksheets to monitor supply and demand. The system suggests what to do given the current state and lists it in a worksheet. Sometimes it is difficult to see in the worksheet why the system is suggesting a particular action. Below is an example of worksheet output. We are particularly interested in the Black Loudspeaker; there seem to be a lot of suggestions for cancelling and changing quantities.
There is a new page that lists Supply and Demand by Event. I particularly like this format because, instead of listing periods the user had to scan for changes, it picks just the dates that are affected and lists those. (see below)
This page perfectly outlines all the documents affecting the decision the system is making regarding inventory. As you can see, there can be quite a few. An excellent contribution to an already great system!
Every year Microsoft hosts a conference called Convergence, meant for all of the Dynamics partners and customers. This year it was held in Atlanta, and it did not disappoint. Roughly 15,000 people attended. The audience this year was widened to also include people interested in any Microsoft business-related product, such as SQL Server, Office 365 and Azure.
This year Microsoft’s CEO, Satya Nadella, was the keynote speaker. I had never heard or seen him live before, and I thought he was a very engaging speaker. He spoke about Microsoft being a company set on empowering people. This is exactly what I think Microsoft has always done and should continue doing: all the way from productivity tools like Office and Dynamics down to developer tools like Visual Studio, everything is programmable, configurable and adaptable. The power is put in your hands.
Of the Dynamics products, it seems that NAV and AX have the most relative growth, with Dynamics NAV outselling the other products by a landslide. This is very exciting news for those of us who have sworn our lives to NAV. Last year NAV added 9,000 customers to reach around 130,000 worldwide. None of the other products come close to this number.
Microsoft’s goal is to unify all their business services so they function seamlessly together. In my opinion, they have done a great job at that. Once a customer goes on the cloud with Office 365 and Azure, adding on the Dynamics ERP/CRM systems is a breeze. All of the power behind the signature products in the Microsoft portfolio basically works in harmony as the data is being fed from one system to the next automatically.
Microsoft announced that it is renaming Lync to Skype for Business. This will probably move Lync closer to Skype, which is already a solid communication platform. Companies have been wanting a more corporate style of communication that behaves well with firewalls; they get that with Skype for Business, and hopefully they will also get all the benefits that come with Skype.
Azure and Office 365 have taken massive steps forward. What was once a mysterious concept, the cloud, has now taken form as a technology leap. Users are able to manage all kinds of things, including virtual servers, virtual domains, cloud storage, SQL Server and web services, right from the Azure platform. Office 365 gives you access to the complete Office suite, including e-mail, in the cloud as well as on the client. These products have now been integrated further into the Dynamics products to allow for an integrated experience. For example, pieces of Dynamics can be spliced into SharePoint, so the user does not know he is interacting with Dynamics while working in Office 365.
There were many memorable sessions outside the keynote. Claus Busk Andersen and Kurt Juvyns talked about deploying NAV on Azure. NAV 2015 is now available in the Azure gallery and can be deployed in minutes on a VM. I did try this, and it indeed works. The problem, if you want a demo machine, is that it does not include the Office package, so export to Excel or Word does not work. Configuring Office 365 and single sign-on takes a little effort, but PowerShell scripts exist to get that done without much thought.
I did, almost out of religion, attend Freddy Kristiansen’s session “Developing in Microsoft Dynamics NAV 2015”. He went through all the different ways you can extract and modify the customer table in NAV. Although I had seen most of them before, a couple of items stood out. He showed .NET code next to C/AL code referencing .NET libraries, and they were very similar; although referencing the .NET libraries can be a little tedious in NAV, it is clear that you are not limited in the C/AL environment. Additionally, he confirmed that getting data through queries, instead of pages, is faster. Both support the OData standard.
I was surprised to see that Tom Blaisdell did not talk about costing at this Convergence. He instead provided moral support to the speaker, Benjamin Leposa. Benjamin talked about the costing structure in general, outlining the item ledger entries and value entries. It was a good session as an introduction to costing, which is probably appropriate for Convergence. I would have liked a Q&A at the end. Perhaps next time.
One of the anticipated features of NAV is the set of improvements in cash management. Brian Nielsen did a good job of going through those in his session “Tips and Tricks: Cash Management in Microsoft Dynamics NAV”. NAV can now automatically interact with the bank through bank reconciliations and payment journals. This is done through web services with a third-party “clearing house” that manages the mapping of data to the banks. Finally, we can rival QuickBooks in this area! The system also tries to auto-reconcile the bank statement based on rules; a very welcome addition that unfortunately bulldozes some third parties out there.
I can’t do a rundown of important sessions at Convergence without mentioning Jesper Raebild. He did a few sessions, one of them called “Tips and Tricks: Understanding the Value of Microsoft Dynamics NAV in Office 365”. In this session he showed some very cool features where NAV data is pulled into Office 365 SharePoint and manipulated seamlessly. The user almost never has to know he is actually using Dynamics NAV inside Office 365. Jesper is a natural speaker with comedic timing, which is rare to find at technical conferences. It is always a treat to attend his talks.
Finally, I wanted to mention a session that was general to all the Dynamics products and focused on SQL performance, a pretty hot field. Rod “Hot Rod” Hansen and Pepe Sifuentes led a session called “Securing Performance with Performance Analyzer for Microsoft Dynamics”. They presented a tool they use in the field, DynamicsPerf, in which they have accumulated a lot of data-gathering scripts to troubleshoot where a performance issue lies. We at iNECTA have started using this tool and sincerely welcome efforts like this. Big thanks to “Hot Rod” and Pepe!
I have been attending the annual NAV conference since 2001, and things have certainly changed since then. It is amazing that every year I attend I find new things to be excited about and empowered by. It feels very good to be a part of the Microsoft team and to know that big bets are being made on the future. Convergence 2016 will be in New Orleans, and we at iNECTA are already booking our tickets!
The concept of catch weight in ERP is widely used, but often misunderstood. It has become a buzzword for companies specializing in the food industry: they claim they have provided a solution to the problem and that they, in essence, “support” catch weight.
But what is catch weight, and why is it so important to have this feature? In my analysis, working with different food companies, catch weight simply means that we have to account for the container of a product and the weight of the product at the same time. For example, if you buy two fish and they weigh 3.2 lbs together, you have a container of two weighing 3.2 lbs. If you then decide to sell one fish, you have to split the container into two and “catch” the weight of each. This usually means you weigh the fish on the way out and assume the leftover fish is the remainder.
So in essence the ERP system needs to be able to handle a “container” of weight, or more generally a container of quantity. The base unit of measure in this setup is usually weight, at least when dealing with food and perishables.
The interesting thing is that NAV already has an out-of-the-box feature that supports the container-of-quantity concept, called lot tracking. With a few precise modifications, you can set the lot tracking functionality to behave as catch weight. The lot is the container, and NAV provides isolation both in the cost and in the depletion process.
There are variations on the concept. Sometimes companies do not want to track the container/weight accurately until the outbound level. This is usually done to save effort by not having to weigh at every step, which is time consuming. The weight is assumed at the inbound transaction, and when the product is finally ready to be shipped, the weight is “caught”. In this case there is no need for a container until the final transaction. Since the weight was assumed all the way to the end, there is a good chance the actual weight is different, and the system needs to be able to adjust for the discrepancy. There is a way to handle that without introducing the concept of a container, again with some precise modifications. The system knows the unit of measure the inventory transaction is in, and it knows the base unit of measure, so when the shipment occurs and the weight is caught, we know how many containers and how much weight. We also know the entry being closed in the ledger and its assumed weight. The modification needed is the ability to post the discrepancy so that the ledger stays in balance.
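A rough sketch of the outbound “catch” described above, with made-up field names (this illustrates the arithmetic only, not the actual NAV modification):

```python
# Sketch of catching weight at shipment: inventory is carried at an assumed
# weight per container, the actual weight is caught on outbound, and the
# discrepancy is posted so the ledger stays in balance. All names are
# illustrative, not NAV fields.

def ship(containers, assumed_weight_each, caught_total_weight, unit_cost_per_lb):
    assumed_total = containers * assumed_weight_each
    discrepancy = caught_total_weight - assumed_total      # lbs over/under
    return {
        "relieve_inventory_lbs": round(assumed_total, 2),  # closes the entry at assumed weight
        "discrepancy_lbs": round(discrepancy, 2),
        "discrepancy_value": round(discrepancy * unit_cost_per_lb, 2),
    }

# Two fish assumed at 1.6 lbs each; the scale catches 3.45 lbs on the way out.
print(ship(2, 1.6, 3.45, 4.00))
# {'relieve_inventory_lbs': 3.2, 'discrepancy_lbs': 0.25, 'discrepancy_value': 1.0}
```

The ledger entry is relieved at the assumed 3.2 lbs, and the extra 0.25 lbs (worth $1.00 at $4/lb) is posted as a discrepancy, which is exactly the balancing adjustment the modification needs to make.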
So in the end, the heavily hyped catch weight term can be accomplished with a relatively minor adjustment to Dynamics NAV. This is good news for people like me, who like to keep NAV pure and dislike adding large amounts of code where it is not needed.
When companies outgrow QuickBooks and are faced with migrating to a more functionally rich package like Microsoft Dynamics NAV, there are a few things the users miss. Since QuickBooks is geared toward the small business user, its primary goal is to make the user experience smooth and easy. Sometimes the result is great, and something for other systems to learn from. Sometimes it highly compromises the integrity of the system. I am going to list a few examples in this blog that I have come across.
Automatic bank statement import
This has been a holy grail for QuickBooks and Quicken. They have somehow worked with most banks in the US and built relationships where the bank provides a QuickBooks button that automatically fetches transactions into QB. Dynamics NAV recently included this feature in NAV 2015. It introduces a middle layer (a third party) that takes care of mapping all the banks. This functionality has been taken further than QB: you can also send wires, stop payments and ACHs. The system also works with international banks and takes advantage of SEPA, a recently established European protocol for bank communication. In my opinion, Dynamics NAV is finally besting QB in this area.
Editing G/L transactions
In QB you are able to edit general ledger transactions directly. Although many people see this as an advantage, I cannot find anything helpful about it. It is an example where user convenience has drastically jeopardized the integrity of the system. You can basically put your G/L out of balance by editing the numbers, and it certainly breaks GAAP rules on audit trails.
Recently I came across a question from an ex-QB user regarding the posting of inventory. In QB you have two accounts that get hit when you post a receipt on a purchase order. In NAV you have four.
Let’s say you are buying an item A, quantity 1, for $100. In QB, posting the purchase order produces two entries: a $100 debit to Inventory and a $100 credit to Accounts Payable.
In NAV you get four: a $100 debit to Inventory, a $100 credit to Direct Cost Applied, a $100 debit to Purchases and a $100 credit to Accounts Payable.
Why does NAV add two extra accounts? The reason has to do with keeping inventory valuation correct in all cases.
Let’s say, for example, that you then return the goods to the vendor at $80. If you only had two accounts, the return would credit Inventory $80 and debit Accounts Payable $80, leaving a $20 balance in inventory with no quantity behind it.
This of course makes no sense. With NAV’s four accounts, the return relieves Inventory at its full $100 cost, so the inventory value goes to zero along with the quantity.
The $20 difference ends up between the direct cost applied and purchases accounts, where it can be treated as expense/income. This of course makes much more sense, and a return is only one of several cases that can offset your inventory.
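For the mechanically inclined, here is a minimal ledger sketch (in Python, not NAV’s actual posting routine, with simplified account names) of why the two extra accounts matter:

```python
from collections import defaultdict

# Minimal ledger sketch: debits are positive, credits negative.
def post(ledger, entries):
    for account, amount in entries:
        ledger[account] += amount

# Two-account scheme: receive at $100, return at $80.
two_acct = defaultdict(float)
post(two_acct, [("Inventory", 100), ("Accounts Payable", -100)])
post(two_acct, [("Inventory", -80), ("Accounts Payable", 80)])
print(two_acct["Inventory"])  # 20.0 of value left in inventory, but zero quantity

# Four-account scheme: the return relieves inventory at the full $100 cost,
# and the $20 difference lands outside inventory, as expense in Purchases.
four_acct = defaultdict(float)
post(four_acct, [("Inventory", 100), ("Direct Cost Applied", -100),
                 ("Purchases", 100), ("Accounts Payable", -100)])
post(four_acct, [("Inventory", -100), ("Direct Cost Applied", 100),
                 ("Purchases", -80), ("Accounts Payable", 80)])
print(four_acct["Inventory"], four_acct["Purchases"])  # 0.0 20.0
```

Both ledgers balance to zero overall, but only the four-account version keeps inventory value in sync with inventory quantity.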
QB is certainly a decent product for its price, but any serious growing business will eventually hit a wall with it. I don’t think Intuit will fix these issues in the near future, since many small business users actually celebrate the flaws: it gives them the flexibility they need, and often they only answer to themselves. NAV, however, is a much more grown-up package where exceptions are handled properly and audit trails are preserved.
To celebrate that iNECTA is turning 14 today, I am going to list out important points that are often missed in 14 different functional areas within NAV.
1. General Ledger
Accounting periods need to be created and managed for all years. It is possible to post and operate NAV normally without any accounting periods set, but this is not advisable. Make sure you have created all the periods for all the years you have been operating and for the year ahead: it is impossible to close the year and post to retained earnings without them.
2. Accounts Receivable

Make sure your accounts receivable G/L account does not allow direct posting. This is a simple but often overlooked detail. If you allow direct posting to your AR G/L account, then your AR aging and G/L might not tie out. That is the first flag your auditor will look for.
3. Accounts Payable
As with accounts receivable, the G/L account for AP also needs to have the direct posting flag off.
4. Sales Order Management
Do not overflow your customer list with accounts you never sold to. You can quote contacts in NAV, and when the quote becomes an order, the contact automatically turns into a customer. Your customer list can quickly become messy if you do not use this feature.
5. Purchase Order Management
If you are managing any type of distribution business and haven't looked at the requisition worksheets, you should. The requisition worksheet is the piece that connects supply with demand, allowing your business to elevate from managing order by order to managing aggregate demand against supply.
6. Inventory Control
Make sure you understand the automatic cost posting and expected cost posting fields. Many users just set these fields to what they think they should be and move on. Not fully understanding these fields will lead to confusion when deciphering the inventory and cost of goods accounts. I could probably write a whole separate blog entry on these features.
7. Warehouse Management
Understand the difference between bin management and shelf management. It is usually better to be very selective about when you turn on bin management in a location. Although bin management might solve the immediate problem of not being able to store things in bins in the system, it introduces the overhead of managing everything in a bin. If you are implementing scan guns at the same time, the overhead is smaller.
8. Fixed Assets
NAV has fixed assets, so use them! For some reason many users do not implement fixed assets in NAV, and prefer to run the depreciation in Excel or another program. NAV has excellent features in the fixed asset module and can handle almost any depreciation method. It ties to the sales and purchasing systems to automate acquisition and disposal postings among other things.
9. Manufacturing

Check out the Production Journal, found at the line level on a Production Order. It is a very handy way to post consumption and output on an all-in-one page. Users often go straight to the standard consumption and output journals without investigating this little feature.
10. Jobs

There is a relatively new page called the WIP Cockpit, which gives an excellent overview of what your WIP consists of. Work in process can be a difficult account to decipher if you do not have the right tools; the WIP Cockpit provides an on-screen overview of it.
11. Resource Planning
This entire area is often overlooked. Resources are an excellent feature that can be used for things that are bought and sold but for which you do not want to maintain inventory. They provide a separate ledger for each resource, which can be used for analysis. Although the system does not allow for resource purchases out of the box, it can easily be extended to do so.
12. Service Management
NAV is capable of managing service items that are not actual inventory items. For example, a service company can take over existing service items that the customer already has. Users often think the system will only handle items it sold.
13. Human Resource
Even if you outsource your payroll, it is still a good idea to maintain the employee file. One nice feature is that you can connect an employee to a fixed asset to record who is responsible for the property, such as a laptop.
Understand the posting date restrictions in NAV. You can set up restrictions on posting both at the top level, for all users, and at the individual user level. NAV does not hard-close periods, only years. By properly managing the allowed posting dates, you can take at least some of the chaos out of your chart of accounts.
Over the years, we at iNECTA have been brought in many times to troubleshoot performance: the system is, for some reason, slow in a business with high volume. There are many reasons why a system can get slow, and in this blog post I want to highlight the difference between infrastructure and application bottlenecks.
The first area people blame when the system slows down is infrastructure: either the machine is slow or the network is congested. This is the most visible layer to the user. There is a box on your desk and another box in a closet that blinks with lights, connected by a wire, so one of those three things must be at fault. The solution must be to improve or replace them. Although this is often the case, and great improvements can be made by upgrading infrastructure, there is another layer that often gets overlooked, namely the application.
The application layer is not as visible to the user. There is nothing you can physically touch or see; it's just a mysterious thing that happens when the user interacts with the computer. Most people misperceive its speed as being related only to the performance of the actual workstation, server or network. This could not be further from the truth. A badly designed or programmed application can cause much worse performance issues than any hardware could.
Most applications are installed on your computer, and there is not much you can do about the way they are designed. You might be able to tune some settings, but usually that is the extent of it. You are at the mercy of what is called the application black box: you send input into the box and it gives you output. How it goes about it, you have little or no control over.
With Dynamics NAV, the consultant has enormous control over the design and programming of the application. The consultant can change the way the system goes about accepting the input and producing the output. This can result in much more dramatic improvements than the infrastructure layer. I would like to outline a very simple example that occurs too often in the programming layer and might not be obvious to the inexperienced.
Let's assume in our example that we have N numbered documents scattered in random order on one table in a room. Your job is to find X of those documents and move them to another table across the room. One way to do this is to look for the first document to move out of the N documents and, when you find it, walk it over to the other table and back. This process would be repeated X times.
We can immediately see that this process can be sped up by first finding all the documents, putting them in a stack, and moving the stack over to the other table in one trip. If walking over to the other table takes one second, we have just shaved X-1 seconds off our process. In many cases X could be in the thousands.
How about that random order of documents? When we are looking for something that is not sorted, we have to go through the documents one by one. Imagine looking for a name in a phone book that is not sorted. Let's say it takes 0.5 seconds per comparison. If you are looking for one document out of 1,200, it would take you on average 600 seconds, or about 10 minutes, whereas if the documents were sorted it would take about 5 seconds.
So let's put these numbers into formulas to compare. Assume we have 10,000 documents and have to find 1,000 of them to move. Assume also that a computer is doing the work (not a very fast one), taking 0.01 seconds per move and 0.01 seconds per comparison.
In the first scenario our approximation formula would look something like this:
X(0.5 * 0.01 * N) + X * 0.01 = 1000(0.5 * 0.01 * 10000) + 1000 * 0.01
= 50,010 seconds or about 14 hours
In the second scenario our formula would look something like this:
X(Log2(N) * 0.01) + 0.01 = 1000(13.29 * 0.01) + 0.01
= 132.9 seconds or a little over 2 minutes
It is evident that the difference is incredible: about 2 minutes compared to 14 hours. When the number of documents is very low, the difference is small, but as the system accumulates data, the first scenario will pretty much render the system useless.
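The two formulas above are easy to sanity-check in a few lines of Python (the per-operation timings are illustrative, of course):

```python
import math

N = 10_000   # documents on the table
X = 1_000    # documents to move
compare = 0.01   # seconds per comparison
move = 0.01      # seconds per trip across the room

# Scenario 1: unsorted search (N/2 comparisons on average) and one trip per document.
naive = X * (0.5 * compare * N) + X * move

# Scenario 2: sorted (binary) search at log2(N) comparisons, one trip for the whole stack.
batched = X * (math.log2(N) * compare) + move

print(f"naive:   {naive:,.0f} s (~{naive / 3600:.1f} hours)")
print(f"batched: {batched:,.1f} s (~{batched / 60:.1f} minutes)")
```

Running this reproduces the 50,010-second figure for the naive approach and roughly 133 seconds for the sorted, batched approach: the same work, four hundred times faster, purely from how the application goes about it.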
In conclusion, when you improve your hardware you might get significant improvements, but very real improvements can also be made in the actual application, when the consultant has access to how the data is calculated and manipulated. It does require an experienced person to figure out where the bottlenecks are and how to solve them. We at iNECTA have solved many such problems over the years, transforming a system that people thought was unable to handle the amount of data into a lean, streamlined machine ready to take on a lot more.