
5 reasons why object-centric process mining is the fastest and most scalable way to realize process transformation

It’s no secret that companies understand the value of their data. But not all data is equally valuable: the data that holds the most value is mature, process-centric data that is clean, quantified and actionable, with a shelf life beyond its originating systems.

“Data is a precious thing and will last longer than the systems themselves.” ~ Tim Berners-Lee

“We are surrounded by data, but starved for insights.” ~ Jay Baer

Tim Berners-Lee and Jay Baer express the heart of what customers really want: the timeless value of common, actionable intelligence beyond system constraints. Data is precious and there is a hunger for insights. Systems come and go, and the data they generate often remains trapped within them or, when extracted, lacks the end-to-end process context that makes it most valuable. To make matters worse, companies operate within increasingly heterogeneous application landscapes that make understanding end-to-end processes across systems very challenging.

Process data is the key to driving performance improvement, but to do that it must be free from the constraints of the underlying systems. And it must be liberated from those systems in a manner that is efficient, portable and constant.

The quest is to have the right intelligence at the right time to capitalize on in-flight process opportunities. It’s not just about understanding how processes were executed in the past, but what is happening now and how human and digital workers can act together with quantified intelligence at the right moment. This ability to respond to valuable business moments requires a highly refined form of data that is universally maintained with its process context and quantifiable for value and action. A process digital twin, if you will.

Process mining has taken a significant step toward helping companies realize this vision, generating substantial results in terms of cash value and business impact. But traditional process mining is limited by the source-system dependencies of the data models used to support classical process mining analysis, which has made it challenging to scale its benefits widely.

With the recent launch of Celonis’ Object-Centric Data Model (OCDM), these limits have been overcome through a new object-centric process mining (OCPM) data foundation that aims to help companies realize the actionable and constant value of their process data at speed and scale. This new data foundation serves as a standardized way for companies to create actionable value from their data across systems without being constrained by those systems. It changes the game for how quickly process mining delivers value and the new value that can be realized from it.

To help you understand this more, here are five reasons why object-centric process mining is the fastest and most scalable way to realize process transformation.

1. Singularity

Traditional process mining requires an event log data model for each process mining use case, and that model is very much tied to the system it comes from. As companies want to deploy the benefits of process mining across many use cases, they need many system-dependent data models. The resulting models are put at risk because they are susceptible to the changes and evolution of the underlying source systems. The Object-Centric Data Model is a single, common model of the object and event types used across business processes, from which all use cases are delivered. Every object and event type is modeled and defined only once. This drastically reduces the modeling effort involved and leaves you with a process digital twin that is free from system dependencies. The output is a constant repository of system-agnostic process data that can be measured and acted on to improve business performance. In the Object-Centric Data Model, the data lives beyond systems, making it a more valuable asset to the business in the context of fluid and changing application landscapes.
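To make the idea concrete, here is a minimal sketch in Python of what "modeled once, related to many objects" means. The names, fields and structure are invented for illustration; they do not reflect the actual Celonis OCDM schema.

```python
from dataclasses import dataclass, field

# Hypothetical illustration of the object-centric idea: each object and
# event type is defined once, and a single event can reference objects of
# several types. All names below are invented for this sketch.

@dataclass
class BusinessObject:
    object_id: str      # e.g. "order-4711"
    object_type: str    # e.g. "SalesOrder", defined once for all use cases

@dataclass
class Event:
    event_id: str
    event_type: str     # e.g. "Create Delivery", also defined once
    timestamp: str
    objects: list[str] = field(default_factory=list)  # IDs of all related objects

# One shared pool of objects and events, reused by every use case
objects = [
    BusinessObject("order-4711", "SalesOrder"),
    BusinessObject("item-1", "SalesOrderItem"),
    BusinessObject("delivery-9", "Delivery"),
]

events = [
    Event("e1", "Create Sales Order", "2024-05-01T09:00", ["order-4711", "item-1"]),
    Event("e2", "Create Delivery", "2024-05-03T14:30", ["order-4711", "item-1", "delivery-9"]),
]
```

Because every event carries references to all the objects it touches, a new use case can be added by querying the same pool differently rather than by building another system-specific event log.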

2. Modularity

The Object-Centric Data Model consists of reusable object and event types. This is a significant innovation that reduces the effort to deploy new use cases. Because the data is modeled only once, with events related to multiple objects, the objects become interchangeable and can easily be related in many ways to create the most accurate three-dimensional view of how processes actually unfold in reality. A helpful comparison is the difference between how NASA developed the Apollo rockets and how SpaceX reinvented the space industry. The Apollo rockets were a bespoke solution: a rocket was created for every mission. Those bespoke rockets are like data models in traditional process mining, where you need one for each mission (use case). SpaceX disrupted the way rockets are built by deploying a platform strategy with reusable rocket components; they can relaunch rockets for multiple missions. This led to more missions at lower cost, making space exploration more scalable than ever. The same is true with object-centric process mining: reusing object and event types makes existing use cases more powerful and enables new use cases much more quickly, at lower cost and with greater efficiency, for more value.

3. Appearance

Sticking to our rocket analogy: when peering into the cockpit of a NASA Apollo rocket, you will see many buttons without labels or explanations. Each of these serves an independent function and is engineered individually. To fly this rocket you need extensive training. Users interacting with a traditional process mining tool or SQL engine today face a similar challenge: transformations are everywhere, without a human-readable explanation of what they do or when they need to be executed. With the Object-Centric Data Model, each transformation is bound to the object or event type it connects to the source system. This makes transformations easy to find, and the order of execution is determined automatically. This is like a SpaceX experience for the astronaut. In a SpaceX cockpit, users navigate a central, intuitive interface that simplifies the experience and guides the astronaut toward the right actions. This simplifies the management and maintenance of the rocket and improves the astronaut's experience.
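As an illustration of the principle, not of the Celonis implementation, the sketch below shows how binding each transformation to the object or event type it populates lets the execution order be derived automatically from the declared dependencies. The transformation names and SQL snippets are placeholders.

```python
from graphlib import TopologicalSorter

# Illustrative sketch only: each transformation is registered against the
# object or event type it populates, together with the types it depends on.
# The run order can then be computed instead of being maintained by hand.
transformations = {
    "SalesOrder":      {"depends_on": [],                 "sql": "SELECT ... FROM source_orders"},
    "SalesOrderItem":  {"depends_on": ["SalesOrder"],     "sql": "SELECT ... FROM source_items"},
    "Create Delivery": {"depends_on": ["SalesOrderItem"], "sql": "SELECT ... FROM source_deliveries"},
}

# Build a dependency graph and resolve the execution order automatically
graph = {name: spec["depends_on"] for name, spec in transformations.items()}
for name in TopologicalSorter(graph).static_order():
    print(f"run transformation for {name}: {transformations[name]['sql']}")
```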

4. Flexibility

There is a scene in Iron Man 2 where Tony Stark interacts with his computer, Jarvis. What is fascinating is how he changes the perspective of his analysis several times, adding and removing objects from his three-dimensional field of view to get at root causes and insights. This is a great example of how the Object-Centric Data Model enables flexible process analysis: the user can easily change the perspective of their process view from the same three-dimensional model of data. This enables users to see the intersection points of connected processes with solutions like Process Sphere™ in order to identify the failure points in end-to-end value chains so they can be easily corrected.
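A rough way to picture this change of perspective, using invented event data rather than the Process Sphere™ implementation, is that the same object-centric events can be projected, or "flattened", onto any object type without building a new data model:

```python
# Hypothetical sketch: the same object-centric events viewed from two
# different perspectives. The data and structure are invented for illustration.
events = [
    {"event_type": "Create Sales Order", "timestamp": "2024-05-01T09:00",
     "objects": {"SalesOrder": ["order-4711"], "Delivery": []}},
    {"event_type": "Create Delivery", "timestamp": "2024-05-03T14:30",
     "objects": {"SalesOrder": ["order-4711"], "Delivery": ["delivery-9"]}},
]

def flatten(events, object_type):
    """Project events onto one object type, yielding a classic case-centric log."""
    log = {}
    for event in events:
        for case_id in event["objects"].get(object_type, []):
            log.setdefault(case_id, []).append((event["timestamp"], event["event_type"]))
    return {case: sorted(rows) for case, rows in log.items()}

print(flatten(events, "SalesOrder"))  # order perspective
print(flatten(events, "Delivery"))    # delivery perspective of the same events
```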

5. Understandability

“Can you tell me how many VBAP are on this VBAK?” Do you know what you are supposed to do? No? Oh, well…

Translation is a critical part of communication. Much time is needed to arrive at a common understanding when things are spoken in different languages. The Object-Centric Data Model does away with the complex language of system tables by modeling the data in a common business language that is understandable by humans and machines. This removes the technical language barrier that is often an issue with traditional process mining, where analysts and data scientists have had to understand the nuances of source system tables to ensure the accuracy of statistics. By modeling data in business language, we open the door for greater efficiency and for the application of technologies such as generative AI and large language models, which have the potential to drive significant value in process transformation. A process digital twin modeled in a common language removes the time needed for translation and the risk of misunderstandings, making for much quicker deployments.
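For illustration only: VBAK and VBAP are SAP's sales document header and item tables, and a business-language layer simply exposes them under names everyone understands. The mapping structure below is a hypothetical sketch, not the Celonis OCDM.

```python
# Hypothetical mapping from source-system table names to business objects.
# VBAK (sales document header) and VBAP (sales document item) are standard
# SAP tables; the additional entries and the mapping itself are assumptions
# made for this sketch.
table_to_business_object = {
    "VBAK": "Sales Order",
    "VBAP": "Sales Order Item",
    "LIKP": "Delivery",
}

def translate(question: str) -> str:
    """Rewrite a table-centric question into business language."""
    for table, business_name in table_to_business_object.items():
        question = question.replace(table, business_name)
    return question

print(translate("Can you tell me how many VBAP are on this VBAK?"))
# -> "Can you tell me how many Sales Order Item are on this Sales Order?"
```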

Process-centric data with a shelf life beyond the source systems

Not all data has the same value. Data that is trapped in systems is worth less than real-time, quantified operational data that can be used to generate prescriptive actions. Object-centric process mining enables the latter by transforming the former. It creates a future-proof repository of the most refined and valuable data to drive business performance. As systems are installed, evolve or are even decommissioned, the intelligence remains constant in the Object-Centric Data Model. This process digital twin provides a significantly faster and more scalable approach to process mining and becomes the foundation from which operational intelligence can be derived.

Object-centric process mining, implemented through the Celonis Object-Centric Data Model, enables data that is trusted, free from system dependencies, embedded with process context, quantified and actionable. As R “Ray” Wang, CEO and Principal Analyst at Constellation Research, said in a statement to Celonis, features such as the OCDM allow organizations to “rapidly improve processes to achieve cost savings, increase customer satisfaction, meet regulatory requirements and boost overall performance.”

Editor's Note: Svenja Matthaei, product manager at Celonis, contributed to this article.

John Santic
Director of Product Marketing

John Santic is Director of Product Marketing at Celonis. John has over twenty years of experience in the software industry in marketing and GTM roles, focusing on analytics, data management and process mining.
