Throughput is a core factor in bending the healthcare cost curve, whether by reducing readmissions, increasing access to care, or driving better service quality. It is also a metric that can be fraught with complications when we expand our focus from a single service area to processes that span multiple service areas, a situation becoming increasingly common in the era of longitudinal patient care. Of course, process complexity itself has a direct and substantial impact on our ability to understand and optimize throughput in any scenario.
Consider the range of processes in scope: orderly and efficient patient flow, timely processing of a lab test, the reading of an X-ray, accurate processing of a claim, and hundreds of other processes executed thousands of times a day. The challenge is further aggravated because, in many of these examples, the data recording each step of each process may be stored across multiple systems (EMR, LIS, RIS, PACS, ERP, etc.). Consequently, no single application can provide a comprehensive view of the process histories.
Enter an empirical approach that can deliver 100% process visibility for existing processes: one that visualizes how those myriad processes are executed across all patients and all systems, and that maintains the nuance of execution details as a core analytics element.
Do I have your attention?
The digital transformation taking place in healthcare also enables a new technology that digitally recreates any process flow, along with all of the relevant process artifacts. Historically, there was no centralized system to facilitate this analysis, so providers were forced to rely on highly manual, labor- and consulting-intensive methods to establish even a basic level of process visibility. Unfortunately, even organizations awash in BI, analytics, and related tools find that those tools simply are not designed to understand "process" (how events relate to one another over time). As the focus on process execution increased, a need emerged for a new approach to the problem; at TimelinePI we call this Intelligent Process Mining.
While the technology that enables Intelligent Process Mining is very advanced, the concept is straightforward. By analyzing the digital "bread crumbs" left behind when we process transactions or record patient information, we can reconstruct every process instance exactly as it was originally performed, including the timing of events and other critical data collected as related process artifacts. These recreated digital process instances, or timelines as we like to call them, collectively provide a comprehensive interactive model, a "digital process twin" if you will, that allows us to analyze the patient's care in a way never before possible.
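To make the idea concrete, here is a minimal sketch of timeline reconstruction. All of the event records, case identifiers, and activity names below are illustrative assumptions, not data from any real system; the point is simply that scattered event rows, once grouped by case and sorted by timestamp, become per-patient timelines:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical event rows as they might be exported from systems like an
# EMR or LIS: (case identifier, activity name, timestamp). The rows arrive
# interleaved across cases, just as real transaction logs do.
events = [
    ("case-001", "Triage",     "2019-03-01T08:05"),
    ("case-001", "Lab Order",  "2019-03-01T08:40"),
    ("case-002", "Triage",     "2019-03-01T08:10"),
    ("case-001", "Lab Result", "2019-03-01T09:55"),
    ("case-002", "Discharge",  "2019-03-01T09:15"),
]

def build_timelines(events):
    """Reconstruct one chronologically ordered timeline per case."""
    timelines = defaultdict(list)
    for case_id, activity, ts in events:
        timelines[case_id].append((datetime.fromisoformat(ts), activity))
    # Sorting each case's events by timestamp recovers the execution order.
    return {cid: [a for _, a in sorted(evts)] for cid, evts in timelines.items()}

print(build_timelines(events)["case-001"])
# ['Triage', 'Lab Order', 'Lab Result']
```

In a production process-mining tool this grouping happens across millions of events from many source systems, but the core operation is the same: correlate by case, order by time.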
Intelligent Process Mining combines this interactive process model with a variety of visualization tools designed specifically for analyzing processes. These process-centric tools empower healthcare providers to understand throughput improvement opportunities based on real data, presented in an understandable format. The process-aware analytics are designed specifically to discover, segment, and filter the numerous process patterns found in patient populations. Providers can identify and remediate the root causes of poor process execution, compare cohorts on a variety of variables, and search for other process anomalies.
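One common form of pattern discovery is variant analysis: treating each distinct sequence of activities as a "variant" and counting how often each occurs. The sketch below assumes hypothetical timelines and activity names; it shows how dominant paths and rare anomalies fall out of a simple frequency count:

```python
from collections import Counter

# Hypothetical per-case timelines (the output of a reconstruction step);
# all case IDs and activity names are invented for illustration.
timelines = {
    "case-001": ["Triage", "Lab Order", "Lab Result", "Discharge"],
    "case-002": ["Triage", "Discharge"],
    "case-003": ["Triage", "Lab Order", "Lab Result", "Discharge"],
    "case-004": ["Triage", "Lab Order", "Discharge"],
}

# A process variant is a distinct activity sequence. Counting variants
# segments the population: frequent paths are the norm, infrequent ones
# are candidates for root-cause investigation.
variants = Counter(tuple(seq) for seq in timelines.values())
for path, count in variants.most_common():
    print(count, " -> ".join(path))
```

Real tooling layers filtering, cohort comparison, and visualization on top of this, but variant counting is the backbone of segmenting "the numerous process patterns found in populations."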
This fact-based approach recreates a process and its sub-processes in their entirety, allowing you to derive cause-and-effect dynamics and expose key process metrics. In some scenarios, these facts may reveal unique patterns of poor execution when certain personnel are involved in a specific situation. In others, they may identify interdependencies between equipment, administrators, and certain care paths. The approach can also expose previously unrecognized best practices so they can be adopted more broadly by the organization.
Significantly impacting a provider's throughput requires scalability to handle the volume of cases providers face. This is where the digital process twin delivers its greatest benefit: automated monitoring. By maintaining a digital process twin for all processes, continuously updated with the latest case and operational data, it is possible to automatically monitor for any condition that violates a prescribed protocol or operational process. When such conditions are detected, they can trigger real-time alerts to the appropriate parties, allowing them to take timely action and mitigate the impact of sub-optimal execution.
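A monitoring rule of this kind can be sketched in a few lines. The rule below, that a lab result should follow its lab order within 90 minutes, is an invented example, not a real clinical protocol, and the activity names and threshold are assumptions:

```python
from datetime import datetime, timedelta

# Illustrative threshold only; a real protocol would be configured per
# organization and per process.
MAX_LAB_TURNAROUND = timedelta(minutes=90)

def check_lab_turnaround(timeline):
    """Return an alert message if a lab result arrives too long after its
    order, or None if the timeline satisfies the rule."""
    order_time = None
    for activity, ts in timeline:
        t = datetime.fromisoformat(ts)
        if activity == "Lab Order":
            order_time = t
        elif activity == "Lab Result" and order_time is not None:
            if t - order_time > MAX_LAB_TURNAROUND:
                return f"ALERT: lab turnaround exceeded ({t - order_time})"
            order_time = None  # rule satisfied; reset for any later order
    return None

timeline = [("Triage", "2019-03-01T08:05"),
            ("Lab Order", "2019-03-01T08:40"),
            ("Lab Result", "2019-03-01T10:30")]
print(check_lab_turnaround(timeline))
# ALERT: lab turnaround exceeded (1:50:00)
```

In an automated-monitoring setup, a check like this runs each time a case's digital twin is updated with new events, and a non-None result is routed to the appropriate staff in near real-time.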
So, let’s recap. Intelligent Process Mining revolutionizes healthcare providers’ understanding of their throughput improvement opportunities, based on real data presented in an understandable format. More importantly, this new approach is specifically designed to document myriad types of process patterns, filter and group processes into logical cohorts, and identify and remediate the root causes of process misbehavior. Finally, it makes it possible to monitor processes automatically in near real-time and address issues as they arise, enabling truly sustainable process improvement.
I am sure some readers will be skeptical of what I have described here; that’s natural. I have seen many people’s reactions to this technology when they load their data and view their actual process behavior for the first time. It is not uncommon for them to observe a behavior and declare, “That’s not possible!” I have also seen their astonishment when they realize it is real and they simply were not aware.
If I have piqued your interest, this video demonstrates how this approach can help you understand a potential throughput issue in an emergency room.
About the author
Scott Opitz is a 30-year veteran of the computer industry who has founded and built companies in the application integration, business process management, and business intelligence spaces. In 2014 he founded TimelinePI, where he serves as President and CEO.