Application Integration: SAP B1: MYOB Advanced: Business Central


Application Integration

As our industry faces a new decade, one age-old statement rings true: “The only thing that is constant is change.” For those of us who have been in this industry for several decades, the rate of change, especially in recent times, has been exponential – we would say explosive and exciting – and this change must be embraced. But why?

Once upon a time it was good enough that your system could take a quote through to an order and an invoice (some systems didn’t even distinguish a quote or estimate from a confirmed order), or perhaps, wonder of wonders, could load up and manage an assembly process, create a Purchase Order, or update in real time rather than in batches. In essence, all of these systems are simply a collection of forms (admittedly many with complex capability) allowing us humans to enter data into a database from which we can run reports. Time has always been money, but it is only now that we can see the benefit of the explosion of understanding and capability in new Business Platforms in the areas of Application and Device Integration (the Internet of Things), Machine Learning, Analytics and Artificial Intelligence.


It is fair to say that the cost of delivering these capabilities was once the province of the major providers, but today we see this capability increasingly available even in entry-level solutions. This article focuses on the growth of integration and seeks to share a foundation of knowledge.

Implementing a “Business Platform” (BP) for your most valuable asset, your team – one that bridges the gap between all departments – can be a daunting task, although one could argue that any top-rated Business Platform will be a great choice. However, finding a single BP that connects every single aspect of a business without flaws is near impossible. With integration, processes that were once specific to a task can be simplified by deploying streamlined “app-specific” solutions for areas such as Project Management, Quality Control and even Payroll!

Project Management


Let’s examine an aspect that is close to our hearts, Project Management.

There are wonderful systems in-market, and one solution might suit better than another. But where is the join between this person-specific, departmental solution and the BP of choice? Possibly it’s in the area of Timesheets connecting to the billing system and/or payroll – and suddenly the integration has taken on a different nuance, with the requirement differing from project to project. With a BP featuring class-leading integration capability, we can now see that the more efficiently a business runs, and the less reconciliation required between the various aspects of that business, the more successful that business will be.

Integrating data from 3rd-party applications to the Business Platform

As a bare minimum, client projects seek to integrate data from a single, focused application to a Business Platform such as SAP Business One, MYOB Advanced, Business Central or HighJump (and yes, we consult on these solutions!).


Therefore, we are starting with two distinct applications, and we need to pull data from one into the other and/or exchange data in both directions. So, what must occur for this to happen? Essentially there are four key components:

  • An Initiating Action. Firstly, we need a “triggering action” to start the data interchange process. For an automatic data exchange, this is likely to be either a timed process, such as a scheduler kicking off a program at 2AM every night, or a user action – for instance, a timesheet coming in from the Project Management system.
  • A Data Format. We also need to consider any “data transformation” that may be required. The data to be transferred needs to be stored and transferred in some kind of logical data format – for instance, a comma-delimited text file or XML – that both systems can understand. What does your target application support?
  • A Data Transfer Mechanism. If both applications reside on your own network, then a transfer is likely to be straightforward – perhaps you can just write a file to a location where the other application can read it. But if one or both applications live offsite – Business Platform in Cloud 1 and Project Management in Cloud 2 – you will need a process that transfers the data over the internet.
  • A Transformation and Validation Process. Before data is loaded into the destination system, it often needs to be reshaped, checked for duplicates and validated.

Let’s look at these areas in more depth.

Initiating Action

An initiating action is what starts things rolling in the data exchange process. In most cases, it would take one of three forms:

  • Manual Kick-off. If you’re manually exporting and importing files or need to run a process on a schedule that’s hard to determine in advance, regular old human intervention can start the process. An administrator might download a file, run a command line program, or click a button in an admin interface.
  • Scheduled Process. Many data exchanges rely on a schedule – checking for new information every day, every hour, every five minutes, or some other period. These kinds of exchanges are initiated by a scheduler application. More complex applications might have a scheduler built in, or might integrate with Windows Task Scheduler or Unix/Linux cron jobs.
  • End User Action. If you want two applications to be constantly in sync, you’ll need to catch updates as they happen. Typically, this is done by initiating a data exchange from a real-time end-user action, such as a Consultant entering a Timesheet which has been authorised and now needs to update your Billing and Project Status as held within the central repository of truth, your BP!
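As an illustration of the scheduled form of initiating action, here is a minimal Python sketch that polls a drop folder for new timesheet files and fires a handler for each one. The folder path, the .csv naming convention and the handler are all illustrative assumptions, not tied to any particular Business Platform.

```python
import os
import time

def poll_for_timesheets(inbox_dir, handler, interval_seconds=300, max_polls=None):
    """Scheduled trigger: check a drop folder for new timesheet files.

    `inbox_dir` and the .csv convention are illustrative assumptions.
    With max_polls=None this runs until stopped, like a simple scheduler.
    """
    seen = set()
    polls = 0
    while max_polls is None or polls < max_polls:
        for name in sorted(os.listdir(inbox_dir)):
            if name.endswith(".csv") and name not in seen:
                seen.add(name)
                # The initiating action fires here for each new file.
                handler(os.path.join(inbox_dir, name))
        polls += 1
        if max_polls is None or polls < max_polls:
            time.sleep(interval_seconds)
    return seen
```

In practice the same loop is usually delegated to Windows Task Scheduler or cron, which simply runs a one-shot version of this script on the chosen interval.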

Data Formats


In order to transfer data from one system to another, the systems need to have a common understanding of how the data will be formatted. In the old days, things were simple: you could store data in fixed-format text files, or as pieces of information separated by standard delimiting characters, commonly called:

  • TXT (Fixed Text Format)
  • CSV (Comma-Separated Values)

Today, we have more dynamic formats:

  • XML (eXtensible Markup Language)
  • JSON (JavaScript Object Notation)

An example fixed-format file could be made up of three fields, with each record exactly 24 characters long:

Client (20 characters)

Chargeable (1 character)

Hours (3 characters)

A record of 25 chargeable hours for client Acme would look like this:

Acme                Y 25

A program receiving this data would have to understand the lengths and data types of each field and be programmed to receive data in that exact format.
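To make the receiving side concrete, here is a minimal Python sketch that parses the Client (20) / Chargeable (1) / Hours (3) layout described above. The offsets are hard-coded on purpose: that is exactly the fragility of fixed formats. The sample record is invented for illustration.

```python
def parse_fixed(line):
    """Parse one 24-character record: Client (20), Chargeable (1), Hours (3).

    The receiver must hard-code these offsets and types -- change the
    layout on either side and the exchange silently breaks.
    """
    return {
        "client": line[0:20].rstrip(),   # trailing padding stripped
        "chargeable": line[20:21],       # single Y/N flag
        "hours": int(line[21:24]),       # right-aligned 3-digit number
    }

# Build a sample record: "Acme" padded to 20 chars, flag, right-aligned hours.
sample = "Acme".ljust(20) + "Y" + "25".rjust(3)
record = parse_fixed(sample)
```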

CSV is easier to work with than fixed formats, because the receiving system does not have to be as explicitly informed about the incoming data. CSV is almost universally supported by applications, but it poses challenges as well. What if your data already has quotes and commas in it? And as with fixed formats, the receiving system will still need to be programmed (or “mapped”) to know what type of data it’s receiving.

CSV is the de facto data format standard for one-time exports and data migration projects. However, automating CSV transfers requires additional consideration – batch files or scripts that will work with a scheduling function. Newer standards, like XML, are web-based and work in browsers, allowing for a more dynamic relationship with the data sets and less external programming.
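The embedded-comma problem mentioned above is usually solved by quoting. A short Python sketch using the standard csv module, which applies and interprets RFC 4180-style quoting automatically; the field names and data are illustrative.

```python
import csv
import io

# A field containing a comma must be quoted, or the receiver will
# split it into two columns. The csv module handles this for us.
rows = [{"CLIENT": "Acme, Inc.", "CHARGEABLE": "Y", "HOURS": "25"}]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["CLIENT", "CHARGEABLE", "HOURS"])
writer.writeheader()
writer.writerows(rows)

# Reading it back recovers the embedded comma intact.
parsed = list(csv.DictReader(io.StringIO(buf.getvalue())))
```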

CSV (Comma-Separated Values). The same records in CSV form:

CLIENT,CHARGEABLE,HOURS
Acme,Y,25
BHP,N,37

XML (eXtensible Markup Language).

The XML format is known as a “self-describing” format, which makes it a bit harder to look at but far easier to work with. The information about the data, such as field names and types, is encoded with the data, so a receiving system that “speaks” XML can dynamically receive it. A simple XML file looks like this:

<?xml version="1.0" encoding="UTF-8" ?>
<timesheets>
  <timesheet>
    <client>Acme</client>
    <chargeable>Y</chargeable>
    <hours>25</hours>
  </timesheet>
  <timesheet>
    <client>BHP</client>
    <chargeable>N</chargeable>
    <hours>37</hours>
  </timesheet>
</timesheets>

An XML-friendly system can use the information in the file itself to dynamically map the data to its own database, making the process of getting a data set from one application to another far less laborious than with a CSV or fixed-width file. XML is the de facto standard for transferring data over the internet.

JSON (JavaScript Object Notation)



[
  {
    "CLIENT": "Acme",
    "CHARGEABLE": "Y",
    "HOURS": "25"
  },
  {
    "CLIENT": "BHP",
    "CHARGEABLE": "N",
    "HOURS": "37"
  }
]
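JSON is just as self-describing as XML and is trivially consumed in most modern languages. A minimal Python sketch using the standard json module; the field names and the billable-hours roll-up are illustrative.

```python
import json

# The field names ride along with the data, just as in XML.
payload = """[
  {"CLIENT": "Acme", "CHARGEABLE": "Y", "HOURS": "25"},
  {"CLIENT": "BHP", "CHARGEABLE": "N", "HOURS": "37"}
]"""

records = json.loads(payload)
# Once parsed, the records are ordinary dicts, ready for mapping or analysis.
billable_hours = sum(int(r["HOURS"]) for r in records if r["CHARGEABLE"] == "Y")
```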



Data Transfer Mechanisms

As we’ve talked about, an initiating action can spur an application to create a formatted data set. However, getting that data set from one application to another requires some additional work.

If both of your applications are sitting on the same network, then this work is likely pretty minimal. One application’s export file can easily be seen and uploaded by another, or you might even be able to establish a database connection directly from one application to another. However, what if the applications are in different locations? Or if one or both are hosted by outside vendors? This is where things get interesting.

There are multiple ways to exchange data over the web. Many of them are specific to the type of web server (Apache vs. Microsoft’s IIS) or operating system (Unix vs. Linux vs. Windows) you’re using. However, two standards – called “web services” – have emerged as by far the most common methods for simple transfers:

  • SOAP (Simple Object Access Protocol)
  • REST (REpresentational State Transfer)

Both SOAP and REST transfer data via the standard transfer protocol mechanism of the web: HTTP. To explain the difference between REST and SOAP, we’ll take a brief detour and look at HTTP itself.

HTTP is a very simple-minded thing. It allows you to send data from one place to another and, optionally, receive data back. Most of it is done via the familiar Uniform Resource Identifier (URI) that is typed into the address bar of a web browser, or encoded in a link on a web page, with a format similar to:

http://www.example.com/page?name=value&othername=othervalue

There are two methods built into HTTP for exchanging data: GET and POST.

  • GET exchanges data strictly through parameters appended to the URI, which are always in “this equals that” pairs. It is a one-way communication method – once the information is sent to the receiving page, the web server doesn’t retain the parameter data or do anything else with it.
  • POST stores the transferred information in a packet that is sent along with the URI – you don’t see the information attached to the URI in the address bar. POST values can be altered by the receiving page and returned. In almost any situation where you’re creating an account on a web page or completing a shopping transaction, POST is used.

The advantage to GET is that it’s very simple and easy to share. The advantages to POST are that it is more flexible and more secure. You can put a GET URI in any link, on or offline, while a POST transfer has to be initiated via an HTML Form.
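The two methods can be sketched with Python’s standard urllib. Here example.com is a placeholder and no request is actually sent; the point is simply where the data travels – in the URI for GET, in the request body for POST.

```python
from urllib.parse import urlencode
from urllib.request import Request

params = {"client": "Acme", "hours": "25"}

# GET: the parameters ride in the URI itself, visible and shareable.
get_uri = "http://www.example.com/timesheets?" + urlencode(params)

# POST: the same data travels in the request body, not the address bar.
post_req = Request(
    "http://www.example.com/timesheets",
    data=urlencode(params).encode("utf-8"),
    method="POST",
)
```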

REST uses these HTTP methods directly, while SOAP wraps each request and response in a standardised XML envelope, typically sent via POST. Add to the mix that Microsoft was one of the principal developers of the SOAP specification, and many Microsoft applications require that you use SOAP to transfer data. REST might be more appealing if you only need a simple data exchange, but if you’re working with Microsoft servers or applications, it may not be an option.

Transformation and Validation Processes

While this article is focused on the mechanics of extracting and moving data, it’s important not to lose sight of the fact that data often needs a lot of work before it should be loaded into another system. Automated data exchange processes need to be designed with extreme care, as it’s quite possible to trash an entire application by corrupting data, introducing errors, or flooding the system with duplicates.

In order to get the data ready for upload, use transformation and validation processes. These processes could be kicked off either before or after the data transfer, or multiple processes could even take place at different points in time. An automated process could be written in almost any programming language, depending on the requirements of your target applications and your technical environment. Typical tasks include:

  1. Converting file formats. Often, one application will export a data file with a particular layout of columns and field names, while the destination application will demand another.
  2. Preventing duplicates. Before loading in a new record, it’s important to ensure that it doesn’t already exist in the destination application.
  3. Backup and logging. It’s likely a good idea to kick off a backup of your destination database before importing the data, or at least to log what you’ve changed.
  4. User interface. For complex processes, it can be very useful to provide an administrative interface that allows someone to review what data will change and resolve errors prior to the import.

  5. Additional data mining. If you’re writing a process that analyses data, adding routines that flag unusual occurrences for review can be very useful. Or if you’re uploading Timesheet data that also has to go to Payroll, why not concurrently process it for Project Accounting in the BP? There are plenty of organisational efficiencies that can be folded into this process.
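Duplicate prevention, validation and logging can be sketched together in a few lines of Python. Here `existing_keys` stands in for a lookup against the destination application, and the field names and rules are illustrative assumptions, not a real API.

```python
def validate_and_dedupe(incoming, existing_keys, log):
    """Reshape each row, skip duplicates, reject bad values, log everything.

    `existing_keys` is a stand-in for a duplicate check against the
    destination system; in practice this would be a query or API call.
    """
    accepted = []
    for row in incoming:
        key = (row["CLIENT"], row["HOURS"])
        if key in existing_keys:
            log.append("duplicate skipped: %r" % (key,))
            continue
        if not row["HOURS"].isdigit():
            log.append("invalid hours rejected: %r" % (row,))
            continue
        # Transform to the destination layout (lower-case keys, numeric hours).
        accepted.append({"client": row["CLIENT"], "hours": int(row["HOURS"])})
        existing_keys.add(key)
    return accepted
```

A review interface (step 4 above) would typically present the contents of `log` and the accepted rows to an administrator before the final import runs.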

We hope you found this introduction helpful. If you have any questions on the topic of Integration or Cloud ERP, please give us a call or click below to contact us.


Free Initial Assessment

Book a free initial assessment with us, where we seek to understand your business problem and suggest the solution that suits you best and supports the growth of your business.