Multi BC Company Master Data Sync with Power Automate

Let's say you have a multi-company database structure where some of the data is shared, e.g. the same Chart of Accounts. A user then says, "What!? I have to create the account in all the other companies as well!?". There are apps that handle this, and I've seen a colleague's version of it recently which is great. However, why code when you don't have to? The limiting factor with no code is that there is no restriction on the other companies, i.e. they can still create data of their own. Some solutions reduce this possibility by having a "master company". This is a basic version that doesn't cover that 🤪

So here's the goal – have one BC company create, modify or delete master data and pass the change on to the other companies.

"Why code when you don't have to" isn't quite what I've gone for, but it is certainly low code. My reasons are totally justified, as you'll see. This solution could be used on G/L Accounts, Customers, Vendors, Items, Dimensions, Fixed Assets, Posting Groups, etc. I have chosen to demo it with G/L Accounts as you need a little extra there. If you check the standard API for accounts (g/l accounts) it only has a GET method, which is fine for one part of the process but no good after that. I've knocked up a page, with the idea of using it as a web service, covering all the fields I felt were needed (link to the GitHub repo is at the end 😉).
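As a rough idea of the shape of that page, here is a minimal sketch; the object name, ID and field selection are my assumptions, and the author's actual version is in the GitHub repo linked at the end. Publishing an editable list page like this as an OData web service is what opens up POST, PATCH and DELETE:

page 50101 "GL Account WS" // name/ID assumed; publish this page as a web service
{
    PageType = List;
    SourceTable = "G/L Account";
    DelayedInsert = true; // helps OData inserts complete in one request

    layout
    {
        area(content)
        {
            repeater(Group)
            {
                field("No."; Rec."No.") { ApplicationArea = All; }
                field(Name; Rec.Name) { ApplicationArea = All; }
                field("Account Type"; Rec."Account Type") { ApplicationArea = All; }
                field("Income/Balance"; Rec."Income/Balance") { ApplicationArea = All; }
                // the posting group fields missing from the standard API page
                field("Gen. Posting Type"; Rec."Gen. Posting Type") { ApplicationArea = All; }
                field("Gen. Bus. Posting Group"; Rec."Gen. Bus. Posting Group") { ApplicationArea = All; }
                field("Gen. Prod. Posting Group"; Rec."Gen. Prod. Posting Group") { ApplicationArea = All; }
                field("Direct Posting"; Rec."Direct Posting") { ApplicationArea = All; }
            }
        }
    }
}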

Once you have that page you are able to POST, PATCH or DELETE. The start of the flow can use an event-based trigger from the BC connector for a record being created, modified or deleted. Insert and modify follow the same concept so I'll cover just one. For DELETE, something else is needed.

Here is the basic structure for MODIFY or INSERT. Use the BC trigger for a data insert or change and then follow that with a series of HTTP connections. The first one does a GET so that we have all the fields we want to pass over. I could have used the standard API page, but it is missing some extra fields like posting groups. Once the JSON has been parsed you can make parallel HTTP calls to the different companies.

The DELETE scenario can't follow this logic exactly, as the ID you get from the BC trigger won't match the IDs in the other companies, and the record holding the rest of the data has just been deleted 😒

The solution is to have a codeunit, table and page to pick up deleted data, store it and then show it.

A codeunit subscribing to the delete events of all the tables you want to watch is needed. I have created a supporting ENUM as well, but that is optional. Check the link to the GitHub repo for the page and table that also support this feature. You will need the page as a web service so that you can do the next Power Automate flow 👍
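To sketch the idea (object names, IDs and fields here are my assumptions; the author's real objects, including the supporting ENUM, are in the repo), the subscriber simply records what was deleted:

table 50102 "Master Data Deletion" // sketch only
{
    DataClassification = CustomerContent;
    fields
    {
        field(1; "Entry No."; Integer) { AutoIncrement = true; }
        field(2; "Data Type"; Text[30]) { } // the author uses an ENUM here instead
        field(3; "Deleted ID"; Guid) { }
        field(4; "Deleted Code"; Code[20]) { }
    }
    keys { key(PK; "Entry No.") { Clustered = true; } }
}

codeunit 50103 "Master Data Del. Watcher" // one subscriber per watched table
{
    [EventSubscriber(ObjectType::Table, Database::"G/L Account", 'OnAfterDeleteEvent', '', false, false)]
    local procedure OnAfterDeleteGLAccount(var Rec: Record "G/L Account"; RunTrigger: Boolean)
    var
        MasterDataDeletion: Record "Master Data Deletion";
    begin
        if Rec.IsTemporary then
            exit;
        // store enough detail for the flow to find the matching record in the other companies
        MasterDataDeletion.Init();
        MasterDataDeletion."Data Type" := 'G/L Account';
        MasterDataDeletion."Deleted ID" := Rec.SystemId;
        MasterDataDeletion."Deleted Code" := Rec."No.";
        MasterDataDeletion.Insert(true);
    end;
}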

The structure of the DELETE flow is slightly altered as you have to do a call on the newly stored deleted data before you can determine which record needs deleting from the other companies.

The flow takes on this type of structure, where the first HTTP call looks at the master company and checks the "Master Data Deletion" table for the ID that has just been deleted for that data type. Once the JSON has been parsed you can then make parallel calls to the other companies, looping through the JSON body. In this instance I have added an extra check on the child companies to see if the account exists before deleting it. An optional step, but it enables you to put a condition in at a later stage, if required, for when the account does not exist.
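Assuming the sketched table above, that first GET could use an ODATA filter along these lines (the field names depend entirely on how you expose the page, so treat this as illustrative):

?$filter=Data_Type eq 'G/L Account' and Deleted_Code eq '<No. from the trigger record>'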

Code I used: https://github.com/JAng13sea/Blogs/tree/master/Master%20Data%20Sync

If you need extra information on using the HTTP connector or parsing JSON check my previous posts as I go into more detail on that process.

PowerApps biz card reader to BC

This one comes at a premium, as the PowerApps business card reader isn't included in the licensing for BC. However, it's a really good use of AI and something that could be utilised in the right scenario. Ironic that I'm blogging about something that won't be getting much use right now; I haven't acquired a new business card since March 2020. However, it feels as though the exchange of fancy pieces of card might be behind us, and scanning something is a cleaner option, in more ways than one 😊

A number of other online resources explain the PowerApps build-up, so I will just highlight the important parts. The goal here is to take an image of a business card and have the details from the AI model sent to BC so a company or person contact can be created.

Add the business card reader from the AI Builder and then add text input boxes which reference the properties of the business card reader. Use text input boxes because the data from the reader won't always be perfect, so some editing might be needed.

As you can see here the Company Name doesn’t quite work out so some manual adjustment is needed.

The next thing is to pass the data across, and to do this I've used Power Automate. A button will be created in PowerApps that calls the flow that is needed. There are a few steps to the flow, so I'll break it into sections. Use a PowerApps trigger as the starting point:

In PowerApps we will build a JSON payload, so add a Parse JSON action and request the content from PowerApps. Note that Power Automate will most likely change the action's name, as mine has above. A sample schema is needed:

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Address1": { "type": "string" },
            "AddressCity": { "type": "string" },
            "CompanyName": { "type": "string" },
            "Country": { "type": "string" },
            "Email": { "type": "string" },
            "FirstName": { "type": "string" },
            "JobTitle": { "type": "string" },
            "LastName": { "type": "string" },
            "Mobile": { "type": "string" },
            "OfficePhone": { "type": "string" },
            "PostCode": { "type": "string" },
            "Website": { "type": "string" }
        },
        "required": [
            "Address1", "AddressCity", "CompanyName", "Country",
            "Email", "FirstName", "JobTitle", "LastName",
            "Mobile", "OfficePhone", "PostCode", "Website"
        ]
    }
}

Next up, an "Apply to each" section is required where the body of the JSON is used. In our case there will be one value at a time, but the JSON can handle multiples. A series of HTTP actions will follow:

  1. A GET to determine if the "Company Name" value can be found in BC already. For this to happen I have published page 5050 as a web service, and particular ODATA filtering is needed: ?$filter=Type%20eq%20%27Company%27%20and%20Company_Name%20eq%20%27<CompanyName>%27
  2. A POST for a Company Contact for the occasions where this detail isn't available yet. The displayed "Headers" will be needed, and the "Body" will be made up of a mix of static values and those from the PowerApps JSON (a sample body is sketched after this list):

  3. A final POST to handle the creation of a person contact. This will be repeated, so use the "Copy to Clipboard" feature to save time having to type it out again.
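As an illustration of the company contact POST body (the exact field names depend on the page you've published; these are my guesses for a page 5050 web service, with Content-Type: application/json in the headers):

{
    "Type": "Company",
    "Name": "<CompanyName from the PowerApps JSON>",
    "Address": "<Address1>",
    "City": "<AddressCity>",
    "Post_Code": "<PostCode>",
    "E_Mail": "<Email>",
    "Home_Page": "<Website>"
}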

A "Condition" has been added off the back of the GET command: the returned "Body" is checked to see if the Company Name from the PowerApps JSON exists in the JSON of the GET Company HTTP call. This will result in either the "Yes" branch or the "No" branch taking place. Once this is saved it is ready for hooking up to PowerApps.

Create a new button in PowerApps and use the "Action" part of the ribbon to call Power Automate. Choose the flow that was devised in the earlier steps. This will add one line to the formula area of PowerApps. Use Shift+Enter to add some additional lines above that inserted line, as some earlier steps are needed before the detail for Power Automate is ready. I'll break this into sections too:

A Collection will be built up of all the values from the Biz card reader. Refer to the online documentation to understand the PowerApps formulas further:

ClearCollect(BizCardContact,
{
    CompanyName: CompanyName.Text,
    FirstName: Concatenate(firstname.Text, " ", Lastname.Text),
    LastName: Lastname.Text,
    Email: Email.Text,
    Address1: BusinessCardReader1.AddressStreet,
    AddressCity: BusinessCardReader1.AddressCity,
    PostCode: BusinessCardReader1.AddressPostalCode,
    Country: BusinessCardReader1.AddressCountry,
    JobTitle: JobTitle.Text,
    OfficePhone: OfficePhone.Text,
    Mobile: MobilePhone.Text,
    Website: Website.Text
});

A JSON is required in Power Automate and it can be built in PowerApps using the collection from the formula above:

Set(ContactJSON, JSON(BizCardContact));

Last thing is to pass the JSON to Power Automate:

'Copyof-BizCardReadertoBC'.Run(ContactJSON)

In full flight you will get one of two results: two contacts, or just a person contact. Note the green ticks in the top right of each window showing what has been executed:

Power Automate with BC update via codeunit using SharePoint data

I'm writing this in a quicker style than usual, as the understanding I have around it was gained from another blog, so I deserve no credit on that front. Grazie Stefano: https://demiliani.com/2019/06/12/dynamics-365-business-central-using-odata-v4-bound-actions/

Also a mention to my colleague Dan Kinsella (https://dankinsella.blog/). He helped with the codeunit element of this solution. Cheers Dan 🍻 I owe you a beer!

Why have I decided to blog about this? Well, I had a meeting with a prospect recently that sparked it. The idea was that they would use SharePoint as a repository for website images. Naturally they wanted to see the images in BC. A modification to store more images could have been explored, but I left that alone. Instead I suggested using the "Links" feature. The user then clicks on the SharePoint URL to see the image… no need to store it twice, so better for database size 👍

Getting the data from SharePoint to BC, though, was another matter: the "Links" area is a system table, so there's no chance of a direct HTTP call with Power Automate. So the goal here is: create a codeunit, accessed via a web service, that stamps a link on an item record.

So we start off with a very basic codeunit:
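Something along these lines (names and IDs are my own; see the repo link at the end for the author's actual code). The key piece is the platform's AddLink method, which writes to the Record Link system table:

codeunit 50110 "Add Item URL" // name/ID assumed
{
    procedure AddItemURL(ItemNo: Code[20]; URL: Text; Description: Text)
    var
        Item: Record Item;
    begin
        // find the item and stamp the SharePoint URL on it as a record link
        Item.Get(ItemNo);
        Item.AddLink(URL, Description);
    end;
}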

To go along with that we need a page, as it is the page we publish as a web service. Read Stefano's blog for the reason why; he discovered it after all.

Add the fields that you need for the circumstance. In my case it is just the item number, system ID and the description; only the item number is really needed in my scenario. A function is then needed on the page, and you need to ensure you have the [ServiceEnabled] part. Again, check out Stefano's blog to understand why.
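A sketch of such a page (again, names and IDs are assumptions); the [ServiceEnabled] procedure becomes an ODataV4 bound action once the page is published as a web service:

page 50111 "Item Link WS" // name/ID assumed
{
    PageType = List;
    SourceTable = Item;

    layout
    {
        area(content)
        {
            repeater(Group)
            {
                field("No."; Rec."No.") { ApplicationArea = All; }
                field(SystemId; Rec.SystemId) { ApplicationArea = All; }
                field(Description; Rec.Description) { ApplicationArea = All; }
            }
        }
    }

    [ServiceEnabled]
    procedure AddItemURL(url: Text; description: Text)
    var
        AddItemURL: Codeunit "Add Item URL";
    begin
        // bound action: exposed on the published page's URL as /NAV.AddItemURL
        AddItemURL.AddItemURL(Rec."No.", url, description);
    end;
}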

Testing it in Postman, the final result looks like this. Note the addition to the URL, /NAV.AddItemURL, which is the name of the page function from the last screenshot.
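Under my assumed names from the sketches above, the call would look roughly like this (server, company and key format will vary with your setup; the body property names must match the procedure's parameter names):

POST https://<server>:<port>/<instance>/ODataV4/Company('<company>')/ItemLinkWS('<item no.>')/NAV.AddItemURL
Body: { "url": "https://<sharepoint-site>/<image>.png", "description": "Website image" }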

After posting with Power Automate I get the following:

Copy of the AL code is here: https://github.com/JAng13sea/Blogs/tree/master/Add%20Item%20Link%20-%20AL

Batch post and e-mail with Power Automate – no code

Batch posting has been in the product for some time, but I am not seeing a clear way (happy to be wrong) to batch post and e-mail. Here are my options (below) as a user conducting the posting. Yes, I have post via the job queue as well, but from what I've tested this isn't sending my e-mail for me. Even the print option yields no results here. I thought it would at least use the "Document Sending Profile" on the customer record.

Even when you use "Post Batch" you get nothing about e-mailing? 😬 Not forgetting that the workflows in BC have no option for "Post & Send"; it's just "Post". Usually these gaps would need filling with a code modification, but not in this blog.

Where there is a will there is a way, and my idea here is to use Power Automate to do the heavy lifting. On this occasion I will use a manual trigger, but if this was a production-ready scenario I would use a scheduled type flow.

The connector for BC in Power Automate is fairly small, but I'm sure it will get better over time. There is a slight restriction though: the action for posting and emailing exists, but it works at a singular level, like it does in the regular UI. So the first thing I need to do is fire in some data I can use in Power Automate from BC. I will of course need the GUID ID reference for the sales invoice, as that's what the BC Power Automate connector likes. A page that fits the bill here is available as a web service already; I will use the top one in the list, page 2811. This page is a mixture of open and posted sales invoices, so some filtering is needed:

Example of the payload from this page:

I'm adding a filter to the ODATA query so I only post invoices that are ready to be posted. In the case of this page the "status" values are different to regular BC: I have a choice between "Draft" (aka "Open") or "Open" (aka "Released"). So I have added ?$filter=status%20eq%20%27Draft%27%20and%20totalAmountIncludingTax%20gt%200 to the end of the web service URL I got from BC. The reason being that the feature in the Power Automate BC connector restricts me to draft only, and I don't want to post invoices with a 0 value. I will log something on the Power Automate forum about the connector wanting "Open" invoices rather than "Released". It's backwards logic otherwise; what if I'm using approvals, for instance? I think this is possibly wrong, so I have posted this to find out more: https://powerusers.microsoft.com/t5/Connecting-To-Data/Is-the-Business-Central-Post-amp-Send-Invoice-action-correct/m-p/637081#M9492

Anyway….

Our flow starts to take shape like this. Check out my last blog for a more detailed account of the Parse JSON part (https://d365bcangle.wordpress.com/2020/07/22/power-automate-purchase-order-alerts-to-vendor/)

Once we add the "id" from the Parse JSON step, an "Apply to each" wrap is automatically placed around the chosen BC action, which is exactly what we want. Remember our goal, batch post and e-mail invoices: so we are already better placed than standard.

In my case a small batch of invoices has been handled and emails dished out accordingly. I added an extra step to update the posting date as well, just so it takes something useful from the existing "Batch Posting" feature 👍

Power Automate purchase order alerts to vendor

I recently had a requirement where alerts needed sending to a vendor for orders expected in the next two weeks. A reminder is needed because the lead times for the goods are so long.

The goal is therefore: have an email sent to a vendor's email address for purchase order lines expected in the next two weeks from today's date.

To achieve this Power Automate or Logic Apps are the most appropriate choices. For this blog I have chosen Power Automate.

Given that I will need line data, and I want to minimise the need to make multiple HTTP calls, some small development on the BC side will help. I have produced a query object so that I have all the data I require from all the areas I want data from (link to my code is at the end). I've chosen header level here for the "Expected Receipt Date", but you could do it at line level if needed:
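A sketch of such a query (object name, ID and column choices are my own; the author's actual query is in the repo linked at the end). The column names become the ODATA field names used in the filter further down:

query 50112 "PO Alert Lines" // name/ID assumed
{
    QueryType = Normal;

    elements
    {
        dataitem(Purchase_Header; "Purchase Header")
        {
            DataItemTableFilter = "Document Type" = const(Order);
            column(Document_No; "No.") { }
            column(Expected_Receipt_Date; "Expected Receipt Date") { }
            dataitem(Vendor; Vendor)
            {
                // pull the vendor's email alongside the order data in one call
                DataItemLink = "No." = Purchase_Header."Buy-from Vendor No.";
                column(Vendor_Name; Name) { }
                column(Vendor_Email; "E-Mail") { }
                dataitem(Purchase_Line; "Purchase Line")
                {
                    DataItemLink = "Document Type" = Purchase_Header."Document Type",
                                   "Document No." = Purchase_Header."No.";
                    column(Line_No; "Line No.") { }
                    column(Item_No; "No.") { }
                    column(Description; Description) { }
                    column(Outstanding_Quantity; "Outstanding Quantity") { }
                }
            }
        }
    }
}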

Ensure the query is published as a web service and check you have output by using the web link or from an API test tool like Postman:

To get the exact data we want for the scenario, some ODATA query logic is needed. Add the following to the end of the web service URL, replacing the first <Your Date> with today's date and the second with the date two weeks out. The format must be like this: 2020-07-21T00:00:00Z:

?$filter=Expected_Receipt_Date ge (<Your Date>) and Expected_Receipt_Date le (<Your Date>) and Outstanding_Quantity gt 0

The date values will be replaced with calculated variables in the flow. It will be a scheduled type flow, so that is our starting point, along with initialising variables for the date/time values which are then used in our ODATA query above:
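One way to initialise the two variables (the variable names are my own; utcNow() and addDays() are standard Power Automate expressions, and the 14-day window matches the scenario):

CurrentTime: utcNow()
FutureTime: addDays(utcNow(), 14)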

Use the HTTP connector to call the web service with a GET command, using the current time and future time variables.

The HTTP call will produce the JSON we saw earlier from the web service call. To use the data from this we need the Parse JSON feature, and we'll then be able to select the content for use in subsequent steps, like sending the email. To generate the schema, just paste in an example from calling the web service. An API test tool like Postman works nicely here, as you get the output in a nicer structure.

Once the JSON is read by the Parse JSON step you are able to create an "Apply to each" step, which will iterate through each of the received rows of data until it has read them all. Whilst that is happening we can initiate further actions, like our email:

Each reference comes from the PARSE JSON and you can use format functions on certain data like I have for the date. The result of this flow is an email much like this one:

Code: https://github.com/JAng13sea/Blogs/tree/master/Purchase_Query

Demand Forecast – needs something extra!

Have you ever looked at the demand forecast in D365 BC and thought it is missing features that would actually make it usable! I mean look at it… but actually look at it:

Yes, I can toggle the matrix view to alter the dates and change to component items instead of sales, but if I want to forecast, surely I need to do it for items that actually need manufacturing first! After all, that is the premise of it: you populate it and then it gets used as part of MRP and MPS. How can we change this situation so the page is actually friendly and a user can apply sensible filters? Initial thought would be to just personalise it. Oh wait, you don't get any fields to choose…

Well, the answer lies in the make-up of the page object, which as it turns out is based on the Item table:

So our goal here is: add sensible fields to the page that can be used for better filtering when devising a demand forecast. This might seem overly simple as an idea, but the value comes from the fact that the item table, especially for a manufacturer, can become very large indeed. You will have items that get blocked, items that have been replaced by others, and items that are now defective. The list in its current state will, over time, become too difficult to work with.

The method to achieve this goal is simple. Create a page extension to page 9245 and throw in the necessary fields the user is most likely to plan with.
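A minimal sketch of the idea (the extension name, ID, anchor and field choices are my assumptions, and the object name for page 9245 may differ in your version; the author's actual code is linked below):

pageextension 50113 "Demand Forecast Ext." extends "Demand Forecast" // page 9245; object name assumed
{
    layout
    {
        addlast(content)
        {
            // Item table fields that help narrow the list down for planning
            field(Blocked; Rec.Blocked) { ApplicationArea = All; }
            field("Replenishment System"; Rec."Replenishment System") { ApplicationArea = All; }
            field("Item Category Code"; Rec."Item Category Code") { ApplicationArea = All; }
        }
    }
}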

Other fields can be added of course but straight away a user has something way more friendly:

For more information on demand forecasting check this out: https://docs.microsoft.com/en-gb/dynamics365/business-central/production-how-to-create-a-forecast

Code available here: https://github.com/JAng13sea/Blogs/tree/master/Demand%20Forecast

Gen. Jrnl. Imports with Data Exchange Definitions

Have you seen this button on the general journal page and thought that’s useful?

Some of you might not have seen it though as the button only appears if you complete some setup in the General Ledger Setup page:

The above field shows a list of Data Exchange Definitions of the "Payroll Import" type. So it is fit for purpose, but only usable for payroll, and more importantly, only one import file structure. For demos I have often mocked up a Data Exchange Definition as a payroll import type and just passed over a CSV file. This falls under my "one of many ways a journal can have external data added to it". That is fine for one-off demo sessions, but how can this idea be taken seriously in a project/production scenario? The answer lies in how the current "Payroll Import" button works and whether we can change it for our situation. My end goal here is to have the user pick from a range of different Data Exchange Definitions which work for different CSV files. In the past, customers may have had bespoke import routines passing data through the Excel buffer. If they can compromise on having the file converted to CSV, this could be a worthwhile solution!

Over in the base app the “Payroll Import” action button looks like this:

So it is only ever looking at that setup page to know which Data Exchange Definition to use. It also has its own codeunit to run the process. So the component parts are small, and we can adjust them with ease for our end goal.

Very simply, a page extension is needed for the general journal, and you can go with something like the below to add the type of page action we want. In my case I have opted to use the "Generic Import" type on the Data Exchange Definition, which I have set as a global variable on the page.
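A sketch of what that page action could look like (the extension, codeunit and procedure names are my assumptions; the codeunit call stands in for the renamed copy of the base import codeunit described below):

pageextension 50114 "Gen. Journal Import Ext." extends "General Journal" // name/ID assumed
{
    actions
    {
        addlast(processing)
        {
            action(ImportWithDataExchDef)
            {
                Caption = 'Import with Data Exch. Def.';
                ApplicationArea = All;
                Image = Import;

                trigger OnAction()
                var
                    DataExchDef: Record "Data Exch. Def";
                    JournalImport: Codeunit "Gen. Jnl. Import (Copy)"; // renamed copy of the base codeunit (assumption)
                begin
                    // let the user pick any "Generic Import" definition rather than
                    // the single one stamped on General Ledger Setup
                    DataExchDef.SetRange(Type, DataExchDef.Type::"Generic Import");
                    if Page.RunModal(0, DataExchDef) = Action::LookupOK then
                        JournalImport.ImportWithDataExchDef(Rec, DataExchDef); // procedure name assumed
                end;
            }
        }
    }
}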

Codeunit-wise, I have done a complete copy of the one in the base app and then renamed the functions. Very easy indeed. The process itself works fine, so there's no need to change anything; it is more about what you offer the user when they make that button selection.

The end result is a dynamic journal import selection. The user can now maintain and update a Data Exchange Definition, or add new ones as new journal import requirements come along! 👍😁

Don’t know much about data exchange definitions? Check this out: https://docs.microsoft.com/en-gb/dynamics365/business-central/across-how-to-set-up-data-exchange-definitions

Code available here: https://github.com/JAng13sea/DED-Jrnl-Import
