As our industry enters a new decade, one age-old statement rings true: "The only thing that is constant is change." For those of us who have been in this industry for several decades, the rate of change, especially in recent times, has been exponential. It is explosive, it is exciting, and it must be embraced. But why?
Once upon a time it was good enough that your system could take a quote through to an order and an invoice (some systems made no distinction between a quote or estimate and a confirmed order). Perhaps, wonder of wonders, it could manage an assembly process, create a purchase order, or update in real time rather than in batches. In essence, all of these systems are simply a collection of forms (admittedly, many with complex capabilities) that let us humans enter data into a database from which we can run reports. Time has always been money, but it is only now that we can see the benefit of the explosion of understanding and capability in new Business Platforms, in the areas of Application and Device Integration (the Internet of Things), Machine Learning, Analytics and Artificial Intelligence.
It is fair to say that the cost of delivering these capabilities once limited them to the major providers, but today we see them increasingly available even in entry-level solutions. This article focuses on the growth of integration and seeks to share a foundation of knowledge.
Implementing a "Business Platform" (BP) that bridges the gap between all departments for your most valuable asset, your team, can be a daunting task, although one could argue that any top-rated Business Platform will be a good choice. However, finding a single BP that connects every aspect of a business without flaws is near impossible. With integration, processes that were once specific to a task can be simplified by deploying streamlined "app-specific" solutions for areas such as Project Management, Quality Control and even Payroll!
Let’s examine an aspect that is close to our hearts, Project Management.
There are wonderful systems in-market, and one solution might suit better than another. But where is the join between this person-specific, departmental solution and the BP of choice? Possibly it is in the area of timesheets connecting to the billing system and/or payroll; suddenly the integration has taken on a different nuance, and the requirement differs from project to project. With a BP featuring class-leading integration capability, the more efficiently a business runs, and the less reconciliation required between its various aspects, the more successful that business will be.
At a bare minimum, client projects seek to integrate data from a single focused application into a Business Platform such as SAP Business One, MYOB Advanced, Business Central or HighJump (and yes, we consult for these solutions!).
Therefore, we are starting with two distinct applications, and we need to pull data from one into the other and/or exchange data in both directions. So, what must occur for this to happen? Essentially there are four key components:

An initiating action that starts the exchange.
An agreed data format.
A transfer mechanism to move the data between the systems.
Transformation and validation to make the data fit for the target system.
Let’s look at these areas in more depth.
An initiating action is what starts things rolling in the data exchange process. In most cases, it takes one of three forms:

A user manually triggering an export or upload.
A scheduled process that runs at set intervals.
An event within one of the applications, such as a record being created or changed.
In order to transfer data from one system to another, the systems need a common understanding of how the data will be formatted. In the old days, things were simple: you could store data in fixed-format text files, or as pieces of information separated by standard delimiting characters, commonly called delimited files.
Today, we have more dynamic formats:
An example fixed format file could be made up of three lines, each 24 characters long:
Client (20) | Chargeable (1) | Hours (3)

Acme                Y 25
BHP                 N 37
A program receiving this data would have to understand the lengths and data types of each field, and be programmed to receive data in that exact format.
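To make that concrete, here is a minimal Python sketch of a receiver for the fixed-format layout above (Client is 20 characters, Chargeable 1, Hours 3); the field names and widths are taken from the example, and the point is that they must be hard-coded:

```python
# Parse the 24-character fixed-format layout described above.
# The receiver must hard-code the exact widths and types of each field.

def parse_fixed(line):
    return {
        "client": line[0:20].strip(),   # 20-character client name, padded
        "chargeable": line[20:21],      # single Y/N flag
        "hours": int(line[21:24]),      # 3-character, right-aligned number
    }

# Build two 24-character lines like the example file.
lines = [
    "Acme".ljust(20) + "Y" + "25".rjust(3),
    "BHP".ljust(20) + "N" + "37".rjust(3),
]
records = [parse_fixed(line) for line in lines]
```

Change any width and the parse silently produces garbage, which is exactly why fixed formats are so brittle.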
CSV is easier to work with than fixed formats, because the receiving system does not have to be as explicitly informed about the incoming data. CSV is almost universally supported by applications, but it poses challenges as well. What if your data already contains quotes and commas? And as with fixed formats, the receiving system still needs to be programmed (or "mapped") to know what type of data it is receiving.
CSV is the de facto data format standard for one-time exports and data migration projects. However, automating CSV transfers requires additional consideration: batch files or scripts driven by a scheduling function. Newer standards, like XML, are web-based and work in browsers, allowing for a more dynamic relationship with the data sets and less external programming.
CSV (Comma Separated Values).
Acme,Y,25
BHP,N,37
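The quotes-and-commas problem mentioned above is handled by quoting, which any proper CSV parser understands. A small Python sketch, using a hypothetical third client name to show an embedded comma surviving the parse:

```python
import csv
import io

# The example rows above, plus a client name that contains a comma.
# Quoting the field keeps the parser from splitting it in two.
data = 'Acme,Y,25\nBHP,N,37\n"Smith, Jones & Co",Y,12\n'
rows = list(csv.reader(io.StringIO(data)))
# rows[2][0] comes back as a single field: Smith, Jones & Co
```

Hand-rolled splitting on commas would break on that third row; a real CSV library does not.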
XML (eXtensible Markup Language).
The XML format is known as a "self-describing" format, which makes it a bit harder to look at but far easier to work with. The information about the data, such as field names and types, is encoded with the data, so a receiving system that "speaks" XML can dynamically receive it. A simple XML file looks like this:
<?xml version="1.0" encoding="UTF-8" ?>
<BILLING>
  <DETAIL>
    <CLIENT>Acme</CLIENT>
    <CHARGEABLE>Y</CHARGEABLE>
    <HOURS>25</HOURS>
  </DETAIL>
  <DETAIL>
    <CLIENT>BHP</CLIENT>
    <CHARGEABLE>N</CHARGEABLE>
    <HOURS>37</HOURS>
  </DETAIL>
</BILLING>
An XML-friendly system can use the information in the file itself to dynamically map the data to its own database, making the process of getting a data set from one application to another far less laborious than with a CSV or fixed-width file. XML is the de facto standard for transferring data over the internet.
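A short Python sketch of that dynamic mapping, using the billing example above: the receiver discovers the field names from the tags at parse time rather than hard-coding a layout.

```python
import xml.etree.ElementTree as ET

# The billing data from the example above (declaration omitted for brevity).
xml_data = """<BILLING>
  <DETAIL><CLIENT>Acme</CLIENT><CHARGEABLE>Y</CHARGEABLE><HOURS>25</HOURS></DETAIL>
  <DETAIL><CLIENT>BHP</CLIENT><CHARGEABLE>N</CHARGEABLE><HOURS>37</HOURS></DETAIL>
</BILLING>"""

root = ET.fromstring(xml_data)
# Field names come from the tags themselves -- no external layout spec needed.
records = [{field.tag: field.text for field in detail} for detail in root]
```

Add a new field to the XML and it simply appears in each record; a fixed-width receiver would have needed reprogramming.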
JSON (JavaScript Object Notation).

JSON is a lighter-weight self-describing format that has become the common choice for web APIs. The same billing data in JSON looks like this:
[
  {
    "CLIENT": "Acme",
    "CHARGEABLE": "Y",
    "HOURS": "25"
  },
  {
    "CLIENT": "BHP",
    "CHARGEABLE": "N",
    "HOURS": "37"
  }
]
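As with XML, the field names travel with the data, and JSON has the added convenience of parsing straight into native data structures. A minimal Python sketch using the example above:

```python
import json

# The billing data from the JSON example above, as a single string.
json_data = ('[{"CLIENT": "Acme", "CHARGEABLE": "Y", "HOURS": "25"},'
             ' {"CLIENT": "BHP", "CHARGEABLE": "N", "HOURS": "37"}]')

# json.loads turns the text directly into a list of dictionaries.
records = json.loads(json_data)
total_hours = sum(int(r["HOURS"]) for r in records)
```

No mapping step is needed before the receiving code can start working with the values.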
As we’ve talked about, an initiating action can spur an application to create a formatted data set. However, getting that data set from one application to another requires some additional work.
If both of your applications are sitting on the same network, then this work is likely pretty minimal. One application’s export file can easily be seen and uploaded by another, or you might even be able to establish a database connection directly from one application to another. However, what if the applications are in different locations? Or if one or both are hosted by outside vendors? This is where things get interesting.
There are multiple ways to exchange data over the web. Many of them are specific to the type of web server (Apache vs. Microsoft's IIS) or operating system (Unix vs Linux vs Microsoft) you're using. However, two standards, called "web services", have emerged as by far the most common methods for simple transfers:

SOAP (Simple Object Access Protocol).
REST (Representational State Transfer).
Both SOAP and REST transfer data via the standard transfer protocol mechanism of the web: HTTP. To explain the difference between REST and SOAP, we’ll take a brief detour and look at HTTP itself.
HTTP is a very simple-minded thing. It allows you to send data from one place to another and, optionally, receive data back. Most of it is done via the familiar Uniform Resource Identifier (URI) that is typed into the address bar of a web browser, or encoded in a link on a web page, with a format similar to:
http://www.cloudfactory.co?parameter1=something&parameter2=somethingelse
There are two methods built into HTTP for exchanging data: GET and POST.
The advantage to GET is that it’s very simple and easy to share. The advantages to POST are that it is more flexible and more secure. You can put a GET URI in any link, on or offline, while a POST transfer has to be initiated via an HTML Form.
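The difference can be seen by building both kinds of request in Python. Nothing is actually sent here; the sketch just shows where the parameters live in each method, reusing the example URI above:

```python
from urllib.parse import urlencode

# The parameters from the example URI above.
params = {"parameter1": "something", "parameter2": "somethingelse"}

# GET: parameters are encoded into the URI itself, so the whole
# request can be shared as a plain link.
get_uri = "http://www.cloudfactory.co?" + urlencode(params)

# POST: the same parameters travel in the request body, not the URI,
# which keeps them out of links, logs and browser history.
post_body = urlencode(params).encode("utf-8")
```

The GET version is visible to anyone who sees the link; the POST version carries the same data privately in the body of the request.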
Add to the mix that Microsoft was one of the principal developers of the SOAP specification, and most Microsoft applications require SOAP for data transfer. REST might be more appealing if you only need a simple data exchange, but if you're working with Microsoft servers or applications, it may not be an option.
While this article is focused on the mechanics of extracting and moving data, it’s important not to lose sight of the fact that data often needs a lot of work before it should be loaded into another system. Automated data exchange processes need to be designed with extreme care, as it’s quite possible to trash an entire application by corrupting data, introducing errors, or flooding the system with duplicates.
To get the data ready for upload, use transformation and validation processes. These can be kicked off before or after the data transfer, or multiple processes can take place at different points in time. An automated process can be written in almost any programming language, depending on the requirements of your target applications and your technical environment.
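A hypothetical validation pass over the timesheet rows from the earlier examples, sketched in Python; the field names match those examples and the rules (valid flag, numeric hours, no duplicates) are illustrative only:

```python
# Validate rows before upload: reject bad values and likely duplicates
# rather than loading them blindly into the target system.

def validate(rows):
    clean, rejected, seen = [], [], set()
    for row in rows:
        key = (row["CLIENT"], row["HOURS"])
        if row["CHARGEABLE"] not in ("Y", "N"):
            rejected.append((row, "invalid chargeable flag"))
        elif not str(row["HOURS"]).isdigit():
            rejected.append((row, "hours not numeric"))
        elif key in seen:
            rejected.append((row, "possible duplicate"))
        else:
            seen.add(key)
            clean.append(row)
    return clean, rejected

rows = [
    {"CLIENT": "Acme", "CHARGEABLE": "Y", "HOURS": "25"},
    {"CLIENT": "Acme", "CHARGEABLE": "Y", "HOURS": "25"},  # duplicate
    {"CLIENT": "BHP", "CHARGEABLE": "X", "HOURS": "37"},   # bad flag
]
clean, rejected = validate(rows)
```

Only the clean rows go on to the upload; the rejected rows, with their reasons attached, go to a human for review instead of corrupting the target application.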
Additional Data Mining. If you're writing a process that analyses data, adding routines that flag unusual occurrences for review can be very useful. Or if you're uploading timesheet data that also has to go to Payroll, why not concurrently process it for Project Accounting in the BP? There are plenty of organisational efficiencies that can be folded into this process.
We hope you found this introduction helpful. If you have any questions on the topic of Integration or Cloud ERP, please give us a call or click below to contact us.
Book a free initial assessment with us, where we'll work to understand your business problem and suggest the solution that suits you best and supports growth for your business.
Cloud Factory is an ERP specialist, helping you implement ERP systems such as MYOB Advanced, SAP Business One, Microsoft Dynamics 365, Microsoft Dynamics 365 Business Central, Wiise, PowerPlatform and Korber K.Motion Warehouse Edge.