Comfort for data’s quality control problem


We live in the age of data, but we all know that data doesn't necessarily mean useful information. Enterprises face major challenges in extracting value from all the data that flows through an organization, and the usual solutions add to the strain – can blockchain help to soothe the anxiety?

The Information Age has led business ever deeper into a data quagmire. It’s inescapable and increasingly unmanageable. Even as the field of data science evolves, expands and chases trends – from big data to small data, wide data, right data – businesses in every field are struggling to get on top of the mess on a basic level. It’s not just a question of analytics, or storage, or access, and it’s not just about trends such as remote working and cloud computing… it’s everything, all the time.

Consider these facts: a typical large organization runs hundreds of separate applications, and as a result, according to Economist Intelligence Unit (EIU) research, employees spend a quarter of their time just hunting across applications for the information they need. We know this is a huge problem; in a recent survey, 90% of the 800 respondents said they faced "significant problems" as a result of data fragmentation. Meanwhile, MIT research points to the crucial role of data liquidity – the ease with which data can be used across departments – in unlocking value.

The problems with bad data

The fragmented, inconsistent data that results is quite simply bad data. While "good data" is characterized by consistency, accuracy, completeness, auditability and orderliness, data that lacks these qualities isn't just hard to use effectively, it carries direct and indirect costs to the business. Data industry body TDWI estimates that these costs can reach a shocking 25% of total revenue!

If this all seems a bit abstract, consider the pedestrian example of a poorly maintained customer database, with records added and edited inconsistently from a range of access points. The consequences could include wasted costs, with duplicate or misaddressed mailings being sent out; lost revenue, from marketing efforts that never reach their audience; customer frustration and loss of reputation, resulting from inaccurate transaction histories and account records… the list goes on. And given that most organizations face similar challenges across departments from purchasing to payroll, the figure of 25% of revenue perhaps isn't so surprising after all. So what is to be done?

Any attempt to tackle the challenges of data fragmentation must address a host of factors. It must aim to make it easier to integrate data from disparate sources, with consistent indexing. It must provide robust security and confidentiality, while allowing the right people within an organization to access the data they need easily. It must provide reliable accuracy, so that users can trust their data quality. And it must certainly offer flexibility, since organizational change is the only constant.

Are such systems available? Of course. Enterprise data integration (EDI) is a burgeoning field, with a host of consultancies and other providers competing to solve these problems. But their solutions are as complex as the problems themselves. EDI projects are typically expensive, slow and disruptive. Organizations are likely to face internal resistance in implementing the new systems (nobody enjoys having to rewrite their work processes!) and will only start to see a return on their significant investment after months of effort. On top of that, new data management systems inevitably introduce new security risks, and may not scale easily as the organization evolves.

All in all, making an EDI decision is such a commitment that the risk of getting stuck in the selection process can itself be added to the list of problems generated by fragmented data!

A blanket solution?

A hot trend in data integration is the creation of a kind of connective tissue that links the various data silos, rather than point-to-point integration (which must be applied individually, and expensively, to every additional point) or a data hub (which makes it harder to maintain control over data quality). The buzzwords are data fabric and data mesh; strictly speaking, a fabric is a technology layer that integrates data across silos, while a mesh is an organizational pattern of decentralized, domain-owned data, but in practice the terms are rarely used consistently. Either way, it's a helpful visual: a decentralized layer that spreads across an organization, providing connections and coordination between existing data points.
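The cost argument against point-to-point integration is easy to quantify: connecting n silos pairwise requires n(n-1)/2 separate links, while attaching every silo to one shared connective layer requires only n. A quick back-of-the-envelope sketch (the silo counts below are illustrative, not figures from this article):

```python
def point_to_point_links(n: int) -> int:
    """Pairwise integrations needed to connect n data silos directly."""
    return n * (n - 1) // 2

def shared_layer_links(n: int) -> int:
    """Connections needed when each silo attaches to one shared layer."""
    return n

for silos in (5, 20, 100):
    print(f"{silos} silos: {point_to_point_links(silos)} pairwise links "
          f"vs {shared_layer_links(silos)} shared-layer connections")
```

At 100 silos the pairwise approach needs 4,950 integrations against 100 for a shared layer, which is why the "connective tissue" model scales where point-to-point does not.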

Of course, once you’re looking at a decentralized system, and with tamper-proof security a high priority, blockchain presents itself as a helpful platform. Geeq Data follows this logical path: a blockchain database provides the secure, flexible consistency layer that can be attached to an existing architecture, even one that encompasses a wide and uncoordinated range of applications, locations and processes. It’s a plug-in solution that delivers company-wide data visibility and utility, while maintaining record integrity and traceability.
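Geeq's actual protocol is not described in this article, so purely as an illustration of why a blockchain-style layer provides record integrity and traceability, here is a minimal hash-chained audit log. Each entry's hash incorporates the previous entry's hash, so editing any earlier record invalidates everything after it (a toy sketch, not Geeq Data's implementation):

```python
import hashlib
import json

def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous entry's hash,
    chaining entries so any later edit becomes detectable."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class AuditChain:
    """Toy append-only log illustrating tamper-evidence via hash chaining."""
    def __init__(self):
        self.entries = []  # list of (record, hash) pairs

    def append(self, record: dict) -> str:
        prev = self.entries[-1][1] if self.entries else "genesis"
        h = record_hash(record, prev)
        self.entries.append((record, h))
        return h

    def verify(self) -> bool:
        prev = "genesis"
        for record, h in self.entries:
            if record_hash(record, prev) != h:
                return False
            prev = h
        return True

chain = AuditChain()
chain.append({"customer": "A-100", "action": "address_update"})
chain.append({"customer": "A-100", "action": "refund"})
assert chain.verify()

# Tampering with an earlier record breaks verification:
chain.entries[0] = ({"customer": "A-100", "action": "deleted"},
                    chain.entries[0][1])
assert not chain.verify()
```

A real blockchain adds distributed consensus on top of this chaining, so no single access point can quietly rewrite history, which is what makes the layer trustworthy across an uncoordinated range of applications and locations.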

As an add-on layer, it can be implemented far faster than conventional EDI solutions (including building a custom data mesh) and at far lower cost; perhaps best of all, it doesn’t demand that anyone migrate systems or change their workflow, making for much easier (and less fraught) adoption. And because it’s modular, it can be brought on in stages, allowing time for the organization to familiarize itself with the system and reducing the fear of new tech.

This approach ticks all the boxes for an effective, and cost-effective, solution to the usual data headaches. It’s simple, secure, platform-agnostic. It’s flexible and can respond easily to changing enterprise needs. It provides the necessary security and confidentiality to ensure compliance with increasingly stringent privacy regulations – access rights for each record can be closely controlled.

As data management becomes both increasingly challenging, and increasingly critical to business success, enterprises urgently need to find a better way of handling things. Building a complete architecture to suit current conditions, sticking with it until it creaks under the strain of change and then redesigning everything is not an option in a world that changes this fast. A plug-and-play data layer that delivers improvements from day one, doesn’t add to departmental stress, and adapts to changing circumstances? That sounds like an IT comfort blanket.

Story contributed by Ric Asselstine, CEO of Geeq Corporation
