The Data Maturity Curve Leads To Microsecond Business Operations

Things are getting smaller. Not just in terms of gadget miniaturization, medical nanotechnology, increasingly sophisticated industrial electromechanical units and the process of so-called shrinkflation that leaves our candy bars thinner or shorter at the same price, but also data: data is getting smaller too.

Data is getting smaller in two key senses: a) we are breaking down the component parts of application data flows into smaller containerized elements to work inside similarly compartmentalized and containerized application services and b) the time windows within which a business needs to react to data events are shrinking.

This latter time constraint on data of course leads us to the reality of real-time data and the need to be able to work with it.

In terms of how the spacetime universe we live in actually works, real-time data is something of a misnomer, i.e. data always carries some time cost that has to be paid in order for it to exist. Data might travel at light speed, but it’s still a speed. When we talk about real-time, we mean data transports that work fast enough for a human not to perceive any time lag. Thus, real-time expresses a human perception of time rather than a machine perception or definition.

This is all important stuff because we’re now supposed to be embracing an Industry 4.0 world where our factories are run by AI-augmented intelligence and smart automation. But manufacturers may not be ready for Industry 4.0 if they are sitting on complex data issues thrown up by production bottlenecks caused by disparate information systems within an organization, many of which will still require human intervention: from manually inputting sensor readings into databases to inefficient Clear-To-Build (i.e. ready to go) status monitoring and a lack of integration with Enterprise Resource Planning (ERP) systems.

Keen to right a few wrongs in this space is Palo Alto-headquartered KX. Variously known as KX and KX Systems, the company is recognized for its work in high-speed real-time data streaming analytics inside intelligent systems that can also concurrently shoulder tasks related to historical data workloads.

Looking at the speed of current industrial data processing and the need to achieve its own Nirvana state of speedy streamed data-intensive analytics, KX calls any given firm’s state of evolution its point on the data ‘analytics maturity curve’. Marketing-spun try-hard naming pursuits notwithstanding, KX does have a point: the commercial window in which to create differentiated value is narrowing for organizations in every market and sector. Logically, then, the faster they can act on insights derived from data created in the moment, the better the outcome.

As KX CTO Eric Raab has stated before, “The opportunities for streaming analytics have never been greater. In fact, according to my company's research, 90% of firms believe that in order to remain competitive over the next three years, they need to increase investment in real-time data analytics solutions. Whether it's a financial institution that needs to adjust customer portfolio settings according to ever-changing stock prices, a utility monitoring throughput across the power grid or an e-commerce site that needs to generate a monthly report, data accuracy at speed is enormously challenging.”

What kind of data analytics can we get from enterprise software platforms that can perform at this kind of speed? KX says finding (and acting upon) anomalous data will be a key use case.

Generally defined as data points, events or observations outside of a dataset’s normal behavior, anomalous data can be a key flag to alert a business that something has either already caused (or is likely to cause) an issue somewhere in the organization.

“The ability to detect and respond to anomalous incidents quickly is critical, particularly because gaining the ability to react in real-time can limit the cost of anomalies. As well as preventing problems from lingering within the business, the adoption of real-time data can also improve process efficiency. The types of positive [advancements and innovations possible here include] faster services, increased sales, better product quality and reduced prices – showing how far-reaching and varied the impact of real-time data can be,” notes KX in its speed-to-business-value research report.
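To make the mechanics a little more concrete, the sketch below shows one of the simplest streaming anomaly detection techniques: flagging any reading that lands more than a few standard deviations away from the mean of a rolling window of recent values. It is a minimal illustration in Python, not KX’s implementation (KX products are built on the kdb+ database and q language), and the RollingZScoreDetector class, its parameters and the simulated sensor feed are all hypothetical.

```python
from collections import deque
import math
import random

class RollingZScoreDetector:
    """Flags readings that deviate sharply from a rolling window of recent values.

    Hypothetical, illustrative code only; real streaming platforms use far more
    sophisticated, vectorized techniques than this.
    """

    def __init__(self, window_size=100, threshold=3.0):
        self.window = deque(maxlen=window_size)  # recent history only
        self.threshold = threshold  # flag points more than N std devs out

    def observe(self, value):
        """Return True if `value` is anomalous relative to recent history."""
        is_anomaly = False
        if len(self.window) >= 10:  # wait for a minimal history first
            mean = sum(self.window) / len(self.window)
            variance = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(variance)
            if std > 0 and abs(value - mean) / std > self.threshold:
                is_anomaly = True
        self.window.append(value)
        return is_anomaly


if __name__ == "__main__":
    detector = RollingZScoreDetector(window_size=50, threshold=3.0)
    random.seed(42)
    for i in range(500):
        # Simulated sensor feed: a steady signal with one injected spike.
        reading = random.gauss(20.0, 0.5)
        if i == 300:
            reading += 10.0  # the kind of anomaly a business wants flagged
        if detector.observe(reading):
            print(f"tick {i}: anomalous reading {reading:.2f}")
```

Production systems would also contend with concept drift, seasonality and multivariate signals, but the core idea stands: compare each event with recent history quickly enough to act on it, which is precisely the loop that microsecond-scale platforms industrialize.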

The company insists that the use of real-time data systems brings productivity gains by reducing the person-hours spent processing and managing data. This type of platform enables users to automate complex workflows that would otherwise be time-consuming and to apply tested Machine Learning (ML) models that provide some level of practical, workable insight to direct business actions.

If we, collectively, have been through this argument and agreed (even by one percentage point) that we need an increased focus on real-time data and analytics technologies capable of working with complex high-speed information sources, then we may be on the way to implementing platforms from KX and/or its competitors.

On this orange tree, KX is not the only fruit. A list of data streaming specialists of note today might include Confluent for its fully managed Kafka services, Tibco for its Tibco Spotfire product, Amazon Web Services’ Kinesis, Microsoft Azure’s IoT offerings and, of course, Apache Kafka itself for open source purists. That’s not to say that KX isn’t special; it merely underlines and perhaps validates the company’s position in what is clearly a defined technology discipline working to solve a critical need.

Businesses in any industry vertical implementing this level of technology are on the road to what we might soon call ‘microsecond business operations’, a term that just might stick.
