The role of interoperability in the evolution of generalist software

By Team bluQube

 

The traditional model of software delivery, often referred to as the ‘generalist’ or super-app approach, is the direction many providers were heading in around 40 years ago, broadening their tools and applications to deliver a wider range of the services a business needs. In this way, many were becoming more like ERPs, says Simon Kearsley, CEO of bluQube.

The generalist model is often advantageous in terms of cost and convenience. A single vendor handles all aspects of the IT system, which makes sense for some industries. For others, however, the generalist route means making compromises: buying the software for the one or two functionalities that deliver exactly what they need, and ‘making do’ with the rest. While keeping all data in one system might seem easy, no single product will tick every box.

As a result, the super-app or generalist model is quickly becoming redundant in favour of interoperability. Interoperability gives businesses the liberty to choose exactly the software they want from different specialist providers, knowing that it will automatically share data with other systems in the organisation.

 

The role of interoperability

Interoperable accounting software provides unparalleled advantages for businesses, allowing them to manage their finances more efficiently and effectively. Rather than manually entering data and updating disparate systems, interoperable software interacts and shares information across mission-critical programmes within the organisation automatically, regardless of vendor. This improves data visibility, permitting greater oversight of operations and revealing opportunities for revenue growth.

 

Interoperability vs integration

There are three methods of synchronising data across systems: manual intervention, integration and interoperability. Manual intervention is arguably the least effective method, despite having the benefit of not tying the business to one supplier.

It relies on different departments within the business moving relevant data across systems. This is unnecessary additional work that is easily forgotten and prone to human error. Using software that connects systems seamlessly is far more practical for modern businesses.

This is where integration and interoperability come into play. Often the terms ‘integration’ and ‘interoperability’ get confused, or even used interchangeably. Integration does have its benefits as it means all data is in one place and it removes the need to move data that comes with manual intervention. However, it does also have some limitations.

Unlike with manual intervention or interoperability, data integration usually relies on the organisation using systems from the same supplier. This is necessary to keep the system versions in sync and running. However, it is unlikely that one supplier will provide systems that suit the needs of the entire business.

This is not the case with interoperability. Interoperable software enables data to be shared freely with third-party systems throughout the organisation regardless of supplier. This means businesses can connect finance and HR systems to each other, for example, to automate the transfer of valuable data between them. It removes the risk of human error, whilst allowing the company to choose the best systems for them and enjoy access to accurate and insightful data that can be used to inform key business decisions.
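As a simple illustration, the finance–HR hand-off described above can be sketched in a few lines of Python. Everything here is hypothetical — the field names, the sample record and the journal-entry shape are invented for illustration, not any vendor's actual API — but it shows the idea: once systems can exchange structured data, a record entered once in HR becomes a finance entry with no manual re-keying.

```python
# Hypothetical sketch: automatically turning an HR payroll record
# into a finance-system journal entry, with no manual re-entry.

def payroll_to_journal(hr_record):
    """Map a (hypothetical) HR payroll record to a balanced journal entry."""
    gross = hr_record["gross_pay"]
    return {
        "description": f"Payroll for {hr_record['employee_id']}",
        "lines": [
            # Double-entry: the expense is debited, the liability credited
            {"account": "salaries_expense", "debit": gross, "credit": 0.0},
            {"account": "payroll_liability", "debit": 0.0, "credit": gross},
        ],
    }

# Example HR record (invented data)
record = {"employee_id": "E042", "gross_pay": 2500.00}
entry = payroll_to_journal(record)

# The generated entry balances: total debits equal total credits
assert (sum(l["debit"] for l in entry["lines"])
        == sum(l["credit"] for l in entry["lines"]))
```

In practice the hand-off would run over each supplier's published interface rather than an in-process function call, but the principle is the same: the data moves automatically, and the receiving system gets it in a form it can validate.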

 

The ‘post office’ approach to interoperability

We’re seeing some providers take a ‘post office’ approach to interoperability. This involves suppliers using APIs and connection points to integrate with different applications, and actively reaching out to new partners to expand their portfolio. Ultimately, suppliers are working to their own strengths and doing what they do best, whilst relying on other providers to fulfil other functions through interoperability.
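The ‘post office’ idea can be sketched as a small message hub: each application posts messages to the hub, and the hub delivers them to whichever other applications have registered an interest. This is a minimal, hypothetical sketch — the class and topic names are invented — not a description of any particular supplier's implementation.

```python
# Hypothetical sketch of the 'post office' pattern: a hub that accepts
# messages from one application and delivers them to every subscriber
# that registered an interest in that topic. All names are invented.

class PostOffice:
    def __init__(self):
        self._subscribers = {}  # topic -> list of handler callables

    def subscribe(self, topic, handler):
        """Register a handler to receive messages published on a topic."""
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        """Deliver a message to every handler subscribed to the topic."""
        for handler in self._subscribers.get(topic, []):
            handler(message)

# Two 'systems' from different suppliers, connected only via the hub:
received = []
office = PostOffice()
office.subscribe("invoice.created", lambda msg: received.append(msg))
office.publish("invoice.created", {"id": 101, "amount": 250.0})
```

The appeal of the pattern is that neither system needs to know about the other: each talks only to the hub, so a new application can be plugged in by registering a subscription rather than by building a bespoke point-to-point connection.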

Additionally, it eliminates the need for expensive custom solutions that may not always remain secure or function correctly over time. This approach enables companies to focus on tailoring a solution that meets their specific needs without compromising on safety or reliability. Therefore, this is an attractive option for any business seeking to benefit from reliable and cost-effective software interoperability. It’s likely that this will continue to mature to become a standard B2B software feature in the future.

 

Final thoughts

The emergence of interoperability has changed the software industry. Companies are no longer constrained by rigid generalist models that rely on businesses only using one software platform across departments.

Instead, they can take advantage of technology like cloud computing to create bespoke solutions for their specific requirements. By connecting various systems within the company, businesses enjoy the streamlined processes that effective data synchronisation provides. Having easily accessible data across the company allows for strategic decision-making for commercial growth. Ultimately, this evolution has enabled businesses to build closely tailored IT systems without compromising on quality or reliability.

The author is Simon Kearsley, CEO of bluQube.
