By Maureen Fleming, VP, Intelligent Process Automation Research, IDC

As a new phase of process design and execution sweeps across enterprises, connected business processes will run more autonomously and at higher speeds, lowering costs and transforming into highly efficient value streams focused on delivering quality customer experiences.

Three major data-driven technological advances enable this type of change: agentic AI, end-to-end orchestration, and ML models supporting continuous improvements and dynamic behaviors.

The success of these advances depends on data: they require access to data assets drawn from enterprise and custom applications, data warehouses, lakehouses, and event logs for planning, execution, continuous optimization, and performance measurement.

Agentic AI shifts manual tasks to digital and augmented labor

In business processes, AI agents enable broader and more sophisticated task automation than traditional rules-based, or deterministic, technologies. To test the promise and benefits of agentic AI, enterprises are beginning to explore activities that require large amounts of skilled manual labor and to build AI agents that automate and augment those activities. Successful examples of these exploratory efforts include:

  • Automated processing of inbound emails from customers, partners, and automated systems.
  • Shifting from manual sampling of checks to automated fraud detection across all checking transactions.
  • Improved customer onboarding, using AI agents to replace manual rekeying of contract terms with an automated process that identifies and extracts terms from a document, matches them against the terms on file, and either updates the database or flags anomalies to a digital assistant with a human on the loop (see the sketch after this list).
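To make that onboarding pattern concrete, here is a minimal sketch. The helpers extract_terms, crm_lookup, update_db, and notify_assistant are hypothetical stand-ins (an LLM or document-processing extraction call, a system-of-record query, a database update, and a digital-assistant notification); they are assumptions for illustration, not a specific product API.

    # Sketch of contract-term onboarding: extract, match, update or flag.
    # All helpers here are hypothetical placeholders, not a real product API.
    from dataclasses import dataclass

    @dataclass
    class ContractTerms:
        customer_id: str
        payment_terms: str
        renewal_date: str

    def extract_terms(document_text: str) -> ContractTerms:
        """Placeholder for an LLM/intelligent-document-processing extraction step."""
        raise NotImplementedError

    def onboard(document_text: str, crm_lookup, update_db, notify_assistant) -> None:
        terms = extract_terms(document_text)
        expected = crm_lookup(terms.customer_id)     # terms already on file
        mismatches = {
            field: (getattr(terms, field), expected.get(field))
            for field in ("payment_terms", "renewal_date")
            if getattr(terms, field) != expected.get(field)
        }
        if not mismatches:
            update_db(terms)                         # straight-through update
        else:
            # Anomalies go to a digital assistant; a human stays on the loop.
            notify_assistant(terms.customer_id, mismatches)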

Early adopters are finding that combining AI agents with GenAI-based digital assistants to automate and augment work improves quality. They are discovering that most of these tasks are performed outside the enterprise or custom application that manages the business process. They also are able to reassign workers and decrease the number of seats used to execute the business process, lowering its labor and software costs.

Aligning agentic AI opportunities with systematic, data-driven design

As organizations become successful in their early use cases, they are beginning to focus on becoming more systematic, including the need for:

  • Credible and consistent performance metrics. They need to shift from anecdotal successes to a more systematic way to measure their investments.
  • Agentic AI roadmaps. They need to understand and identify the areas in a business process where greater autonomy, shifting from fully manual labor to augmented and digital labor, drives efficiency or innovation.
  • Data-centric tools to assist with planning. To get started with standalone AI agents, teams initially relied on surveys and anecdotes to identify agentic opportunities. This changes as teams begin to build roadmaps aimed at improving a business process, using process mining to identify and investigate manual tasks and calculate the improvement potential across the process (a minimal sketch of that calculation follows this list).
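As a rough illustration of how process-mining output can feed that calculation, the sketch below ranks activities by the share of total process time spent on manual work. The event-log file and column names (activity, is_manual, start, end) are assumptions about how such a log might be structured, not the schema of any particular tool.

    # Illustrative only: estimate automation potential from an event log.
    import pandas as pd

    events = pd.read_csv("event_log.csv", parse_dates=["start", "end"])
    events["duration_h"] = (events["end"] - events["start"]).dt.total_seconds() / 3600

    # Hours of manual work per activity (is_manual assumed to be a boolean flag).
    manual_hours = (
        events[events["is_manual"]]
        .groupby("activity")["duration_h"]
        .sum()
    )

    # Share of total process time tied up in each manual activity; the
    # activities at the top are the first candidates for agentic automation.
    potential = (manual_hours / events["duration_h"].sum()).sort_values(ascending=False)
    print(potential.head(10))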

Moving beyond application silos to a focus on end-to-end orchestration

As automation replaces and augments manual tasks, work executes more rapidly and lends itself to extending across business and IT processes. Process orchestration is emerging as a way to execute work straight through using multiple types of automation and connectivity tools, including robotic process automation, intelligent document processing, AI agents, APIs, data connectors, and different types of event-driven technologies. Orchestration also supports human-in-the-loop tasks.
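A minimal sketch of what straight-through orchestration across mixed automation types can look like is below. The step names, worker registry, and shared context are assumptions chosen for illustration; they do not describe any specific orchestration product.

    # Sketch: each step in a flow is handled by whichever automation type is
    # registered for it (RPA bot, document extraction, AI agent, API call, or
    # a human task queue). A shared context carries state straight through.
    from typing import Any, Callable, Dict, List

    Context = Dict[str, Any]
    Worker = Callable[[Context], Context]

    def run_flow(steps: List[str], workers: Dict[str, Worker], context: Context) -> Context:
        """Run each step with the worker registered for it, passing state along."""
        for step in steps:
            context = workers[step](context)      # each worker enriches the context
        return context

    # Example wiring: document processing, an AI agent, an API call, and a
    # human-in-the-loop approval step (all placeholders here).
    workers: Dict[str, Worker] = {
        "document_extraction": lambda ctx: {**ctx, "terms": "extracted terms"},
        "ai_agent_review":     lambda ctx: {**ctx, "anomalies": []},
        "api_update":          lambda ctx: {**ctx, "record_updated": True},
        "human_approval":      lambda ctx: {**ctx, "approved": None},  # awaits a person
    }
    result = run_flow(["document_extraction", "ai_agent_review", "api_update",
                       "human_approval"], workers, {"document": "contract.pdf"})
    print(result)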

End-to-end orchestration supports the growing need for managing value streams that span across business applications and IT systems. Increasingly, value streams are also extending to automate partner ecosystems.

End-to-end orchestration requires a detailed understanding of how an individual process or sub-process works and where one process links with another. Capturing data from all related systems and analyzing it with process mining supports this understanding and assists in planning a value stream. Additional data from trouble tickets and customer service requests is often used to identify common problems, trace them to their root causes and the tasks involved, and then automate the value stream using process orchestration.

Merging machine learning and event logs for continuous optimization

While GenAI is ideally suited to content understanding and content generation, machine learning is used to adapt systems, applying algorithms and statistical models to analyze and draw inferences from patterns in the data. The use cases vary widely. One example is applying ML models to real-time event logs to detect problems, or anomalies, that require immediate investigation and mitigation.
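As one hedged illustration of that pattern, the sketch below fits scikit-learn's IsolationForest to simple per-case features derived from an event log and flags outliers for investigation. The file, column names, and contamination rate are assumptions for illustration only.

    # Illustrative anomaly detection over an event log (assumed schema).
    import pandas as pd
    from sklearn.ensemble import IsolationForest

    events = pd.read_csv("event_log.csv")

    # Aggregate raw events into per-case features: number of steps and total time.
    features = (
        events.groupby("case_id")
        .agg(step_count=("activity", "count"),
             total_minutes=("duration_minutes", "sum"))
    )

    # Unsupervised outlier detection; roughly 1% of cases assumed anomalous.
    model = IsolationForest(contamination=0.01, random_state=0)
    features["is_anomaly"] = model.fit_predict(features) == -1   # -1 marks outliers

    # Cases flagged here would be routed for immediate investigation and mitigation.
    print(features[features["is_anomaly"]].head())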

ML is also used in forecasting, continuously predicting whether a forecast remains on track as new data is ingested. This is used for financial systems, technology systems, manufacturing, and many other use cases.

Another increasingly critical use case applies ML to routing within an end-to-end orchestration or process, enabling the process to execute dynamically. An example is determining how to follow up on a new lead or interaction from a prospect: a scoring model produces a score that is used to route the lead, replacing rules-based decisioning.
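A minimal sketch of that scoring-and-routing idea, using a logistic regression as the scoring model, is below. The features, toy training data, and routing thresholds are assumptions for illustration, not a recommended model.

    # Illustrative lead scoring and routing; data and thresholds are made up.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Historical leads: [company_size, pages_viewed, prior_purchases] -> converted?
    X_train = np.array([[500, 12, 2], [20, 3, 0], [1500, 25, 5], [10, 1, 0]])
    y_train = np.array([1, 0, 1, 0])

    scorer = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    def route(lead: list) -> str:
        """Score a new lead and pick a follow-up path instead of fixed rules."""
        score = scorer.predict_proba([lead])[0, 1]
        if score >= 0.7:
            return "assign_to_sales_rep"    # high propensity: immediate follow-up
        if score >= 0.3:
            return "nurture_campaign"       # medium: automated nurturing
        return "low_touch_queue"            # low: no active follow-up

    print(route([800, 15, 1]))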

All of these use cases are aimed at continuous improvement and continuous optimization, spanning quality of customer delivery, reduction of waste, and system health.

Process innovation is built on enterprises' access to their own data

For decades, enterprises have owned their own data, whether that data is structured or unstructured, streamed or stored in a mainframe, a third-party packaged enterprise application, a custom application, third-party cloud storage, a data warehouse, or other assets used to manage data.

With this assumption, software markets formed to enable enterprises to extract their own data, consolidate and normalize it, and analyze it.

An emerging trend is that some application vendors are narrowing or restricting the scope of data enterprises can access, or adding cost for enterprises to access their own data. Much of this is done to exploit growth opportunities; other motivations are tied to a vendor's desire to build and sell proprietary AI agents that customers, lacking access to the underlying data, would be unable to replicate.

This will be problematic for many customers. An IDC survey conducted in June 2025 indicated that 52% of respondents plan to build their own AI agents, while 48% plan to adopt pre-built agents (N=2,296, conducted worldwide). More than 97% of respondents already use AI in their organization. Out of that 97%, 42% are also already using AI agents, 27% are exploring use cases and 31% plan to invest in agentic AI in 2026. Compared with other emerging technologies in the past, adoption of AI agents is explosive.

As business processes transform from silos into value streams, enterprises should not have to work around restrictions aimed at protecting application vendors from AI competition. Nor should they have to incur artificial costs to use data they have had access to for decades.

This is especially true because such restrictions make it more difficult to build optimal AI agents. That said, restrictions are unlikely to hold back the wave of innovation that will occur as agentic AI is fully unlocked in the enterprise.