In this article, I’ll explain the flow of using process mining to improve business processes, contrasting it with the procedure of treatment in a hospital.
Process mining aims to discover various issues and problems hidden in the process by visualizing invisible business processes from the event log data.
In terms of this “visualization of the process”, process mining is often likened to an X-ray. However, just as in the treatment of disease, the ultimate goal is not the discovery of the lesion (inefficiencies and bottlenecks) but the implementation of appropriate treatment (improvement measures) and the return to a healthy state, in other words, the realization of an improved “ideal process (to-be process)”.
Let’s start by outlining the flow of medical activities in a hospital. Broadly speaking, there are two stages: the “diagnostic stage” and the “treatment stage”.
The starting point for treatment is when a patient comes in with some kind of symptom such as fever or cough.
First, the doctor asks the patient about the extent of their current symptoms and conducts an interview.
Using an X-ray machine, the area where the lesion is thought to exist will be photographed.
The presence of the lesion is confirmed by looking at the X-ray photograph.
From the results of the X-ray photos, the doctor can determine what disease the patient has.
In addition, various physical exams and tests are performed to verify the diagnosis.
The course of treatment is decided based on the results of the diagnosis and the patient’s wishes: for example, whether to perform surgery or to rely on medication.
If it is better to remove the lesion, surgery will be performed.
Treatment is performed by administering medications, either alone or in conjunction with surgery.
Once the cause has been eliminated and the symptoms are gone, treatment is complete.
Next, we’ll outline the steps to improve business processes along the path of diagnosis and treatment at the above hospital.
●Business Process Improvement
Understanding the current situation – Diagnostic stage
Process with problems – Patient
Select processes that are showing problems, such as long throughput times, high operating costs, or customer complaints, as targets for improvement.
Process Setup – Preliminary interview
Basic information related to the process to be improved, such as an overview of the process, the number of processes, and the department or person in charge, will be organized through interviews. If there are any specifications or manuals for the system involved in the process, check them as well.
Process Mining – X-ray
Based on the event log data of the process to be improved, we analyze it using a process mining tool and create a flowchart of the current process.
As is process – X-ray photograph
We analyze the current process from various perspectives, such as frequency and time required.
Problem identification – Diagnosis
Based on the results of the above analysis, we identify the areas that are causing problems or issues, such as inefficient procedures that take too long, or bottlenecks where pending cases pile up.
On-site interview and observation – physical examination
To get to the bottom of the problem areas, we conduct interviews with the people in charge on site and carry out observational surveys to identify the root causes.
Typical root causes of process inefficiencies and bottlenecks include meaningless steps, frequent mistakes, frequent rework, and too few people assigned to the work that needs to be done.
Improvement Activities – Treatment Stage
Improvement Policy – Treatment Policy
Once we have identified the various problems and issues related to the process and the root causes of these problems and issues, we plan improvement measures.
As a major improvement policy, it is important to first clarify the objectives, such as reducing throughput, reducing costs, and improving customer satisfaction.
Implementation of improvement measures – Surgery and Medication
There are a variety of options for improvement measures, ranging from major to minor modifications.
BPR (Business Process Re-engineering), which is a zero-based redesign of the process, can be compared to surgery. Replacing manual tasks with RPA software robots might be likened to implanting an artificial heart.
If a small change in procedure can improve the time required, it is like a disease that can be treated with simple medication.
Improved Process (To be process) – Recovery
Once the desired process has been achieved as a result of effective improvement measures, the project is complete.
Just as regular check-ups are necessary after treating a disease, it is important to continuously monitor the target process to ensure that problems do not recur and that no new problems arise.
Thanks to process mining and task mining, you can find inefficient processes and bottlenecks. But that is not the end of the job.
Needless to say, process mining and task mining make it easy to uncover problems through data analysis, but they do not tell you how to solve those problems. (It is plausible that, in the future, advanced AI capabilities will be added to process mining tools to hint at ways to improve.)
Therefore, once a problem with the process has been found, process mining and task mining are no longer needed for the time being.
What comes into play after finding a problem is problem-solving techniques, including Lean Six Sigma. With these problem-solving methods, you can use frameworks such as 5 Whys and factor analysis (fishbone diagrams) to find the root cause behind the problem, and then plan and implement specific improvement measures.
This article introduces nine redesign methods that can serve as a reference when considering specific improvement measures. These redesign methods are systematized in BPM (Business Process Management); they are practical methods developed empirically through the many process improvement projects implemented in the past.
In fact, there are nearly 30 rules of thumb for process redesign. Of these, the nine methods I am going to share with you are the most common and the most likely to produce improvements.
The nine process redesign methods can be broadly categorized into three levels (task level, flow level, and process level). Each method is explained below at its respective level.
This is a change of any kind to the individual tasks (activities) that make up the process.
1 Task Elimination
For tasks that take a long time, ask whether the task is worth doing in the first place, and eliminate it or reduce the number of times it is done.
For example, if there are three levels of approval tasks and they have become a mere formality, reduce them by one level to two. In an inspection process, the number of inspection tasks could be cut dramatically by switching from inspecting all products to a statistical method that inspects only a small, randomly selected portion.
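As a rough sketch of the sampling-inspection idea above (the product names, sample rate, and counts here are hypothetical, assumed only for illustration):

```python
import random

def inspect_all(products):
    """Full inspection: every product is checked."""
    return len(products)

def inspect_sample(products, sample_rate=0.05, seed=42):
    """Statistical sampling: inspect only a random fraction of products."""
    rng = random.Random(seed)
    sample_size = max(1, int(len(products) * sample_rate))
    sample = rng.sample(products, sample_size)
    return len(sample)

products = [f"unit-{i}" for i in range(1000)]
full = inspect_all(products)        # 1000 inspection tasks
sampled = inspect_sample(products)  # 50 inspection tasks (5% sample)
print(f"Inspection tasks reduced from {full} to {sampled}")
```

The same elimination logic applies to approval levels: each step removed is work that simply no longer has to be done.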
2 Task composition (decomposition)
When tasks are subdivided into small chunks, or when tasks are passed between multiple departments, the time required is often longer. Therefore, it may be effective to consolidate multiple tasks into a single task, or to keep tasks within your own department rather than passing them on to other departments. (Conversely, tasks can become inefficient when too many are combined into a single task; in that case, it may be useful to break the task down.)
3 Triage
In some cases, running different sub-processes from a conditional branch may reduce the time required for a process. For example, in the procurement process, the task following receipt of a purchase application could branch into different processes for amounts of 10 million and above versus amounts below it.
Conversely, if too many sub-processes add complexity, you may want to consider consolidating some of them.
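The conditional branching above can be sketched as follows; the approval step names and the threshold are hypothetical, chosen only to mirror the procurement example:

```python
def route_purchase_application(amount, threshold=10_000_000):
    """Route a purchase application to a sub-process based on its amount.

    Applications at or above the threshold (hypothetical: 10 million) go
    through a stricter, longer approval path; small ones take a short path.
    """
    if amount >= threshold:
        return ["manager_approval", "director_approval", "cfo_approval"]
    return ["manager_approval"]

# A small application needs only one approval step; a large one needs three.
print(route_purchase_application(500_000))
print(route_purchase_application(25_000_000))
```

The benefit is that the many small cases no longer wait behind the heavyweight process designed for the few large ones.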
These are improvement methods that address the order of tasks rather than a single task.
4 Re-sequencing
Re-sequencing is about reviewing the flow of tasks and rearranging them in the order that is most efficient and requires the least amount of work.
For example, suppose the procurement process includes two approval tasks, A and B, and that on average 1% of cases are rejected at task A and 10% at task B. In this case, moving task B, which has the higher rejection rate, before task A reduces the number of cases that reach A, making the process more efficient and reducing the overall workload.
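The arithmetic behind this re-sequencing example can be checked with a small calculation (the case volume of 1,000 is an assumed figure; the 1% and 10% rejection rates come from the example above):

```python
def expected_workload(order, cases, reject_rates):
    """Expected number of approval tasks executed, given a task order.

    A case rejected at one task never reaches the tasks after it,
    so putting the high-rejection task first filters cases out early.
    """
    remaining = cases
    total = 0.0
    for task in order:
        total += remaining                     # every remaining case hits this task
        remaining *= (1 - reject_rates[task])  # rejected cases drop out here
    return total

rates = {"A": 0.01, "B": 0.10}
before = expected_workload(["A", "B"], 1000, rates)  # A first: 1000 + 990 = 1990 tasks
after = expected_workload(["B", "A"], 1000, rates)   # B first: 1000 + 900 = 1900 tasks
print(before, after)
```

Per 1,000 cases, roughly 90 fewer approval tasks are executed simply by swapping the order.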
5 Parallelism enhancement
Some processes use sequential processing, where the next task starts only after the previous task is completed. Changing such a process so that multiple independent tasks are processed in parallel can be expected to reduce the time required for the entire process.
Changing sequentially processed tasks to concurrent processing often has a significant effect on throughput reduction.
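A minimal sketch of the sequential-versus-parallel difference, using threads to stand in for independent departments working at the same time (the step names and durations are hypothetical):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def task(name, duration=0.3):
    """A stand-in for one independent process step (e.g. a check by one department)."""
    time.sleep(duration)
    return f"{name} done"

steps = ["legal_check", "credit_check", "stock_check"]

# Sequential: each step waits for the previous one to finish.
start = time.perf_counter()
sequential_results = [task(s) for s in steps]
sequential_elapsed = time.perf_counter() - start

# Parallel: independent steps run at the same time.
start = time.perf_counter()
with ThreadPoolExecutor() as pool:
    parallel_results = list(pool.map(task, steps))
parallel_elapsed = time.perf_counter() - start

print(f"sequential: {sequential_elapsed:.2f}s, parallel: {parallel_elapsed:.2f}s")
```

With three independent 0.3-second steps, the elapsed time drops from roughly the sum of the steps to roughly the longest single step, which is exactly the throughput effect described above.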
These are improvements based on perspectives other than individual tasks and the order between them.
6 Specialization and standardization
Specialization aims to improve efficiency and customer satisfaction by dividing a process into multiple sub-processes and assigning a dedicated person to each, increasing expertise. For example, a process could be split between VIPs and general customers, providing a speedy and courteous service process especially for VIPs.
Standardization, conversely, is an attempt to unify multiple processes that handle the same business but have been separated by product line or similar factors.
7 Resource Optimization
When multiple people run the same business process, the workload may be concentrated on a particular person while others sit idle, or a bottleneck may arise because too few people are assigned to the volume of work. In such cases, it is necessary to “optimize resources” by rethinking staff assignments or reviewing shifts.
8 Communication Optimization
If the process flow is driven by some kind of communication, such as a phone call, fax, or email, you may be able to improve efficiency and customer satisfaction by changing, for example, the timing at which communications are received or processed.
9 Automation
For routine tasks with a clear set of procedures, automation with RPA can be effective. There are also other automation options, such as developing an application that makes automatic decisions based on the input information.
Consider first whether these nine rules of thumb can be applied when devising a solution to an individual problem or issue in your own process improvement/innovation project or DX (Digital Transformation) promotion project.
As mentioned at the beginning, 29 rules of thumb for process redesign are presented in the “Fundamentals of Business Process Management”.
In addition, for detailed explanations of the nine process redesign methods introduced here, please take a look at the MOOC (e-learning) based on that book.
Process mining is an “analytical method”. The mere introduction of a process mining tool doesn’t start anything. You will need to plan a series of steps as an “analytical project” and manage their execution.
However, if you have never run a research or analysis project before, the steps of an analysis project may not be easy to grasp. Therefore, I would like to explain the flow of a process mining analysis by comparing it to the flow of cooking.
First, let’s look at the flow of cooking. The assumed setting is the kitchen of a restaurant. The first activity is “purchasing foodstuffs” and the last is serving the dished-up food to the customers’ tables.
1 Purchase of foodstuffs
Purchase a variety of foodstuffs from all over the world through food wholesalers.
2 Foodstuffs
The foods to be cooked are now at hand. Check whether any are insect-damaged or rotten.
3 Precooking
Prepare the food by chopping it with a knife or parboiling it in a pot of boiling water to remove bitterness.
4 Cooking
Cook the food using a variety of cooking utensils.
5 Dishing-up and serving
Dish up the cooked food and serve the finished dishes to the customers.
Role of Master Chef
Note that the role of the master chef is to oversee the entire cooking process of the restaurant.
Next, let’s explain the steps of the process mining analysis, corresponding to the above cooking steps.
Process mining procedure
1 Extraction of data = Purchase of foodstuff
Extract data from the various systems that record and accumulate the event logs targeted for analysis, such as an ERP like SAP, a CRM system like Salesforce, or a proprietary business system.
As a method of data extraction, it is common to extract data directly from a DB by SQL.
Data extraction is basically done by system engineers or system administrators. When the database structure is complex, as in an ERP, it may be necessary to determine where the target data is located with the assistance of experts, for example SAP specialists with deep knowledge of SAP.
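As a minimal sketch of SQL-based extraction, the following uses an in-memory SQLite database to stand in for a business system; the table name, columns, and records are all hypothetical:

```python
import sqlite3

# Build a tiny stand-in for a business system database (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE order_events (
    case_id TEXT, activity TEXT, timestamp TEXT, user TEXT)""")
conn.executemany(
    "INSERT INTO order_events VALUES (?, ?, ?, ?)",
    [("PO-001", "Create Order", "2020-04-01 09:00", "alice"),
     ("PO-001", "Approve Order", "2020-04-01 11:30", "bob"),
     ("PO-002", "Create Order", "2020-04-01 10:15", "alice")])

# Extract the event log with SQL: one row per event, ordered per case.
rows = conn.execute(
    "SELECT case_id, activity, timestamp FROM order_events "
    "ORDER BY case_id, timestamp").fetchall()
for row in rows:
    print(row)
```

In a real project the same kind of SELECT would run against the production database (or a replica), typically written by the system engineers mentioned above.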
2 Data to be analyzed = Foodstuff
The data extracted from the system is collectively referred to as the “event log”, because the history of operations on the system is recorded event by event with a time stamp.
As for data format, pre-processing in the next step is easier if the log is provided in CSV format. In some cases the event log is provided in JSON format, and pre-processing a JSON event log can be a bit more cumbersome.
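A minimal event log in CSV form might look like the following; the case IDs, activities, and timestamps are invented for illustration, but the three columns (case ID, activity, timestamp) are the ones process mining requires:

```python
import csv
import io

# A tiny hypothetical event log: one row per event, three required columns.
raw = """case_id,activity,timestamp
PO-001,Create Order,2020-04-01 09:00
PO-001,Approve Order,2020-04-01 11:30
PO-001,Ship Order,2020-04-02 14:00
"""

events = list(csv.DictReader(io.StringIO(raw)))
activities = [e["activity"] for e in events]
print(activities)
```

Each case ID groups the events of one process instance, and sorting by timestamp within a case yields the path that instance took through the process.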
3 Data preparation = Precooking
The event log data extracted from the system is often composed of multiple files, often ten or more: files recording activities and time stamps, as well as files containing master data.
Basically, all the files must be combined into a single file before analysis with a process mining tool. In addition, the original files contain a lot of data that cannot be analyzed as is, such as garbled text and empty cells that should have contained a value.
Therefore, it is necessary to remove or adjust this noisy data, that is, to perform data cleaning, similar to removing the inedible parts of food. Data preparation is the process of turning the original data into clean data that a process mining tool can analyze.
Data preparation is done by data scientists who know how to process data to make it clean, using ETL tools, Python, and other tools and languages.
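The two core preparation steps described above, cleaning noisy rows and combining event data with master data into one flat table, can be sketched as follows (the records and the customer master are hypothetical):

```python
# Hypothetical raw extracts: an event file and a master file. One event row
# has an empty activity cell, the kind of noise that must be cleaned out.
events = [
    {"case_id": "PO-001", "activity": "Create Order", "timestamp": "2020-04-01 09:00"},
    {"case_id": "PO-001", "activity": "", "timestamp": "2020-04-01 10:00"},  # noisy row
    {"case_id": "PO-002", "activity": "Create Order", "timestamp": "2020-04-01 10:15"},
]
customer_master = {"PO-001": "ACME Corp", "PO-002": "Globex Inc"}

# Step 1, data cleaning: drop rows with missing values.
clean = [e for e in events if all(e.values())]

# Step 2, combining files: enrich each event with master data,
# yielding the single flat table a process mining tool expects.
for e in clean:
    e["customer"] = customer_master[e["case_id"]]

print(clean)
```

In practice the same joins and filters are done with ETL tools or pandas over files with millions of rows, but the logic is the same as in this sketch.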
4 Analysis = Cooking
Once the data has been pre-processed and the clean data is ready for analysis, it can finally be fed into process mining tools for various analyses.
Process mining tools are very versatile. It takes some training and experience to become proficient, but it is fun to visualize business processes as flowcharts from event log data that looks like nothing more than a litany of numbers, and to uncover inefficiencies and bottlenecks.
Analysis with process mining tools requires tool experts who are familiar with the tools used, but it is the process analyst who gives the analytical perspective on how to do the analysis. The data scientist also has a better understanding of the original data through pre-processing of the data, so they can assist in the analytical work.
5 Reporting = Dishing up and serving
Create reports using graphs, tables, and so on about the issues and problems in the target process identified through analysis with the process mining tool. Since the people receiving the report are not necessarily familiar with data analysis, keep in mind a visual presentation that makes it easy to understand what the issue or problem is.
Ideally, the report should be written by a process analyst, with the assistance of a process consultant with process improvement know-how (Lean, Six Sigma, etc.). It’s also good to have the support of a data scientist or tool expert, as additional analysis may be required.
Role of Project Manager
It is the project manager, corresponding to the restaurant’s master chef, who runs the entire process mining analysis project. A project manager does not have to be intimately familiar with every step, but must understand each step well and, above all, have the skills to execute the project smoothly.
So far I have used the culinary metaphor to explain the standard procedure for process mining analysis. Each process is a highly challenging one that requires a certain level of skill and experience, so it is necessary for experts in each field to work well together to advance the project.
This is an approach that strengthens the current way of working. It includes TRIZ, the Theory of Constraints, Lean Management, Six Sigma, and BPR (Business Process Re-engineering). These are methods mainly for improvement rather than innovation: you grasp the current business processes and operations, discover problems such as inefficiencies and bottlenecks, and implement improvement measures. BPR carries out fundamental, company-wide improvement, including organizational restructuring.
Organizing the functionality of a process mining tool based on the purpose of analysis
Process mining tools, which perform analysis based on event logs, are basically very versatile and keep evolving, with new features added every day. It is not easy to get an overview of a tool’s features from a one-size-fits-all explanation or demo.
So, in this article, let’s start with what kind of analysis you want to do, that is, the “purpose of the analysis”, and organize what kind of function it has.
Please note that we have deliberately left out task mining because it is a technologically immature feature and we are focusing on the main features.
Now, there are many ways to analyze using process mining tools, but I would like to divide them into the following four main categories.
1 Process Focus
This is the basic analytical perspective of process mining. The analysis focuses on the flow of the target process.
2 Organizational Focus
The three required data items for process mining analysis are process ID, activity, and time stamp. In addition to these three items, “resource (user in charge)” and “role (department and position)” are typically analyzed as semi-requisite items.
In addition to the process itself, the Organizational Focus analyzes the process from the perspective of the people in charge of executing the process and their departments and positions.
This approach is sometimes referred to as “organizational mining”.
3 Simulation Focus
Literally, it’s an approach to simulating by setting up some parameters.
4 Operational Focus
Process mining is essentially an approach that analyzes data from already completed processes; the operational focus, by contrast, analyzes currently running, uncompleted processes in real time.
Let’s take a look at the analysis objectives and corresponding functions for each focus.
1 Process Focus
1.1 I want to know what the variations of the process are.
⇒ Variant Analysis
1.2 I’d like to see the number of cases flowing through the process.
⇒ Frequency analysis function
1.3 I want to see the time required for a process (throughput, lead time between activities, etc.)
⇒ Performance analysis function
1.4 I want to discover deviating processes compared to the standard process (to-be process).
⇒ Conformance checking function
1.5 I want to compare multiple process variations.
⇒ Comparative analysis function
1.6 I would like to delve deeper into the causes of the problem regarding inefficiencies and bottlenecks in the process.
⇒ Root cause analysis function
1.7 I want to understand the deviation from the KPI target values (throughput, processing time, etc.).
⇒ KPI setting function
1.8 I want to understand the business rules in the process branch (gateway).
⇒ Business Rule Mining Function
1.9 I want to create a BPMN-compliant model.
⇒ BPMN model conversion function
⇒ BPMN model creation and editing functions
2 Organizational Focus
2.1 I want to know which person is in charge of which activity.
⇒ Activity Map Function
2.2 I want to calculate the number of processes and processing time for each person in charge.
⇒ Customized dashboard function
2.3 I would like to understand how those in charge of the process relate to each other in the target process.
⇒ Social network function
3. Simulation Focus
3.1 We want to verify the effects of changing a part of the process or implementing RPA.
⇒ Simulation function
4 Operational Focus
4.1 I want to estimate how much more time it will take to complete an incomplete process.
⇒ Predictive analysis function
4.2 I’d like to estimate the steps to be taken to shorten the throughput of an incomplete process.
⇒ Process recommendation function
4.3 I want to send an alert to the person in charge when a deviating process occurs.
⇒ Alert function
Above, we have organized the features of process mining tools according to the purpose of analysis. Please note that function names differ from tool to tool.
When selecting a tool, understand from your company’s perspective how you want to analyze the target process, and then confirm whether the candidate tool has the corresponding functions.
Introduction to Process Mining (3) Business environments which make process mining indispensable
In this article, I will explain the changes in the business environment in which process mining is becoming more important and indispensable.
First, there are two environmental changes that I would like to highlight as major trends in society as a whole: a “service economy” driven by servitization, and a “digital economy” driven by digitalization.
Service Economy – Servitization
Economic development to date has been supported by the mass production and sale of a variety of products, mainly by the manufacturing industry. Excellent products are made in large quantities to lower the unit cost of production and are sold efficiently, mainly through wholesale and retail channels.
The important thing for manufacturers was to develop, manufacture, and ship quality products, leaving the process of delivering them to consumers to distributors. Once purchased, products were literally free to be used and consumed by individual households and consumers. The manufacturer was basically not involved in the use, consumption, or disposal of the product, although it did, of course, handle repairs when products broke down.
However, with the rise and development of various service industries outside manufacturing, as well as increased competition among manufacturers, more and more companies are offering services combined with their products (e.g., installation, insurance, or content that teaches how to use the product). In other words, the approach is to provide a “total solution” that includes a variety of services, rather than the product alone. (Incidentally, a total solution is called a “whole product” in marketing.)
Furthermore, rather than selling the product outright, companies increasingly continue to provide value in use, that is, lending the product for a monthly or even annual fee. This is the so-called “subscription model”, which has been growing in recent years.
This shift in what customers pay for, from product to service, is happening in every industry. That is the progress of the “service economy”.
Now, services have four characteristics.
Intangibility:
A service is not a physical entity.
Simultaneity:
Services are consumed as soon as they are produced. For example, hair cutting and styling at barber shops and beauty salons are provided to customers in real time.
Variability in service quality:
The quality of service delivery can vary, especially when many human factors are involved: some people provide good service, while others may provide terrible service.
Perishability:
Related to intangibility and simultaneity, services disappear as soon as they are provided.
Of these characteristics, the ones most relevant to process are simultaneity and variability. A service is provided in real time, and the value or quality delivered varies each time. Therefore, it is crucial for a service provider to manage the delivery process properly.
The Digital Economy – Digitalization
The beginning of digitalization was the lifting of the commercial ban on the Internet in 1995. Since then, a variety of services utilizing the Internet have been born one after another. Consumers can now easily use the Internet through mobile phones as well as PCs, making the use of digital devices and services in all aspects of their lives indispensable. Therefore, it can be said that the progress of digitalization has greatly facilitated the transition to services mentioned in the previous section.
In this digital economy, proper management of the processes that deliver value has become a major challenge for companies. Various online services are essentially “services” with the four characteristics presented above, even when they involve the movement and consumption of physical products. Quality control is therefore essential, especially given the difficulties posed by simultaneity and variability.
Now, let’s look at the factors that companies need to respond to in the larger macro trend of the service economy due to the increasing use of services and the digital economy due to the increasing use of digital technology.
In terms of the external environment, customer experience and operational excellence are two keywords.
Customer experience is deeply related to servitization. For manufacturers, it used to be that if you made a good product and sold it, that was the end of it. Little attention was paid to how purchasers would use and consume their products.
However, manufacturers began to offer a variety of services to accompany their products, and subscriptions increased. As companies increasingly sell directly through their own websites, it has also become important to optimize the buying behavior of prospects and the processes of using and disposing of products.
It has become necessary not only to design the specifications of the product itself properly, but also to “design the customer experience” to ensure that the customer experience associated with the product, from purchase to disposal, is the best it can be.
As the overall customer experience becomes increasingly important, operational excellence has become a fundamental strategy for establishing a competitive advantage.
There are three basic strategies for establishing a competitive advantage: product leadership, customer intimacy, and operational excellence. Each company has decided on a strategic direction to focus on based on its own management resources, but the effectiveness of the “product leadership” strategy has declined as it has become increasingly difficult to differentiate the product itself. In addition, customer intimacy, which aims to form an intimate relationship with customers, is no longer sufficiently effective as a point of differentiation due to the progress of digitalization.
Operational excellence, meanwhile, has a great deal to do with the proper management of processes. Because products and services keep growing more sophisticated and complex, achieving it is far from straightforward, and the gap between companies that can do it well and those that cannot shows clearly. Therefore, in order to establish a competitive advantage, it is necessary to pursue operational excellence. What’s more, performing operations well can increase customer satisfaction and have a positive impact on “customer intimacy”.
Now let’s consider the internal environment. While much of the change in the internal environment of companies and organizations is due to digitalization, let’s take up two key phrases: “invisibility of processes” and “digital footprint”.
It can be said that digitalization, that is, the systematization of various business operations in companies, began with the emergence of ERP in the 1990s. As mentioned above, after the lifting of the ban on commercial use of the Internet in 1995, the so-called “Internet revolution”, business systemization based on Internet technology progressed. In addition, SaaS offerings that avoid huge initial development costs, such as Salesforce.com, have appeared one after another, and the operations of many companies, from large corporations to small and medium-sized enterprises, are being systematized.
The problem is that the systematization of business has made it impossible to see from the outside how the work is done. When all employees were in the office working with paper, phone, and fax, it was possible to know, at least intuitively, who was doing what and how.
Now, however, the phones rarely ring, and employees are working in silence at their computers. When it comes to telecommuting, you can no longer visually see how your work is being done.
In other words, the digitalization of operations has made many of them invisible, making it very difficult for management to properly control progress.
On the other hand, since most business operations have been digitized and systematized, the status of operations can be recorded on the system exactly as data. This is called a “digital footprint”: by capturing, recording, and analyzing each user’s operation history, whether in an application such as ERP or CRM, or in office software such as Excel or PowerPoint, it is possible to “visualize” business processes that had become invisible.
In other words, process mining, which automatically reconstructs business processes from event logs extracted from business systems and supports continuous business process improvement, has emerged as an indispensable analysis method for the management of companies and organizations today and in the future.
INTRODUCTION TO PROCESS MINING IN PRACTICE – e-learning course on Udemy
An e-learning course through Udemy that will teach you the basic knowledge you need when implementing process mining, scheduled for release in early May 2020.
Person in charge of implementing process mining in a company or organization
Consultants who are helping to implement process mining
Those who want to become an expert in process mining
It focuses not on the theoretical aspects of process mining, but on content that will help you successfully apply it to business process improvement.
A comprehensive e-learning course covering the principles of process mining has been offered since 2014 through Coursera by the godfather of process mining, Professor Wil van der Aalst.
This course, however, does not rely on a specific process mining tool, and no practical e-learning course focused on business applications for process improvement has yet been offered in Japan or elsewhere, making it the world’s first practical introductory course to process mining.
An English version will be released at a later date.
Benefits for participants
You will learn the basics of process mining from a practical perspective.
You will be able to effectively communicate the necessity of introducing process mining to your supervisors and other internal stakeholders (persons in charge).
You’ll be able to convincingly communicate the value of process mining to your prospects (Process Mining Consultant).
What is process mining?
History of Process Mining
Business environments that make process mining indispensable
Benefits and Expected Returns of Process Mining
Processes to be analyzed
Process Mining and Related Solutions (ETL, RPA, BPMs, DWH/Datalake)
What is the event log?
Principles of Process Mining Algorithms
Four Approaches to Process Mining
How to manage a process mining project
Task Mining (Robotic Process Mining)
Skill sets required for process mining practitioners
process mining tool
– What is process mining
– History of process mining
– Business environments that make process mining indispensable
– Benefits and expected returns of process mining
– Target processes to be analyzed
– Use cases
– Process mining and related solutions (ETL, RPA, BPMs, DWH/Datalake)
– What is an event log
– Principles of process mining algorithms
– Four approaches of process mining
– Process discovery
– Conformance checking
– Process enhancement
– Operational support
– How to manage a process mining project
– Basics of data preparation
– Task mining / Robotic process mining
– Necessary skill set for a process miner
– Process mining tools
Task Mining – Three Analytical Perspectives for Improving Labor Productivity.
Based on PC operation logs, “task mining” visualizes the tasks performed on each user’s PC.
In this article, let me explain the three analytical perspectives for task mining.
First, let’s be clear about the purpose of doing task mining. That is improving labor productivity.
Productivity is generally defined as output (the value created) divided by input (the resources spent).
The "labor productivity" addressed in task mining can then be expressed by the following formula:
Labor productivity = amount of value created / labor time (cost) spent
Here, labor input is generally 8 hours per day, 40 hours per week, and 160 hours per month (assuming two days off per week).
In a nutshell, increased labor productivity is about creating more value with the hours worked. The point is not to increase value by working longer, but to increase the value you create in the same amount of time(cost).
Now, with task mining, PC operations can be recorded and accumulated in detail through sensors installed on individual PCs, allowing analysis to be performed in order to consider improvement measures to be taken to improve labor productivity.
This analysis aimed at improving labor productivity includes the following three analytical perspectives
1 Created Value
2 Efficiency
3 Task to be improved
I will outline one by one.
1 Created Value
The first perspective of a task mining analysis is how much of your business time is spent engaged in value-creating activities.
As the labor productivity formula shows, labor is about creating value. Value, plainly put, is whatever contributes directly or indirectly to sales. In factory labor, it is precisely the "product" that results from the work.
In office work of various kinds it is less clear-cut than in factory work, but if you are in charge of sales, preparing proposals and quotations are important value-creating activities that lead to sales. The time in which any department or job is creating value in some way is called "value creation time".
On the other hand, time spent watching YouTube videos or just zoning out during work hours creates no value. This is "non-value creation time". (Note that lunch and break times are excluded from the analysis in the first place, as they are not business hours.)
The way to improve labor productivity is to increase value-creating activities as much as possible. However, it is assumed that the unit time of 8 hours will not be increased for a day. Therefore, it is necessary to work on how to reduce the amount of non-value-creating activities, in other words, the amount of slacking and idleness in the eight hours.
Therefore, first of all, task mining classifies business time into “value creation time” and “non-value creation time” from the perspective of value creation.
Value creation activities can in turn be divided into two categories: "high value" and "low value". For a salesperson, high-value tasks are the aforementioned proposal and quotation writing. Low-value tasks include expense reimbursement and travel to customers.
We should aim to reduce low-value operations as much as possible. For example, for expense reimbursement, you can simplify the procedure with a dedicated application, automate it with RPA, and eliminate travel time with web conferencing.
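As a rough illustration of this classification, here is a minimal sketch in Python. The one-day log and the app-to-category mapping are invented for the example; a real task mining tool would derive both from sensor data and a configured dictionary.

```python
# Hypothetical sketch: classifying a day's PC operation log into value
# categories and summing the time per category. Data and mapping are invented.
from collections import defaultdict

# (app_name, hours) entries from a hypothetical one-day PC operation log
log = [("PowerPoint", 3.0), ("Excel", 2.0), ("ExpenseApp", 1.0),
       ("YouTube", 0.5), ("Idle", 1.5)]

CATEGORY = {
    "PowerPoint": "high_value",   # e.g. proposal writing
    "Excel": "high_value",        # e.g. quotation preparation
    "ExpenseApp": "low_value",    # e.g. expense reimbursement
    "YouTube": "non_value",
    "Idle": "non_value",
}

hours = defaultdict(float)
for app, h in log:
    hours[CATEGORY[app]] += h

total = sum(hours.values())                          # the 8-hour working day
value_time = hours["high_value"] + hours["low_value"]
print(f"value-creation time: {value_time}h of {total}h")
print(f"non-value share: {hours['non_value'] / total:.0%}")
```

The split immediately shows how much of the 8 hours is candidate for reduction (non-value time) versus candidate for streamlining (low-value time).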
2 Efficiency
Even when they create the same value, different people take different amounts of time: some work faster, some slower. Therefore, after sorting activities into value creation and non-creation, the next step is to pursue efficiency, in other words, to reduce the time needed to create the same value.
When analyzing efficiency in task mining, it is necessary to set a reference value. In short, even if we do the same work, we can’t judge whether the work is highly efficient without setting some kind of evaluation criteria.
In general, this standard value is based on the average processing time by department or job category. The good thing is, unlike interview-based business analysis, task mining can be analyzed based on the actual “PC operation time”.
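A hedged sketch of how such reference values might be derived with pandas. The column names, departments, and the 1.5x review threshold are illustrative assumptions, not features of any particular tool.

```python
# Sketch: deriving reference processing times from measured PC operation time
# per department and task, then flagging records far above the reference.
import pandas as pd

df = pd.DataFrame({
    "department": ["Sales", "Sales", "Sales", "Accounting", "Accounting"],
    "task":       ["quote", "quote", "quote", "invoice", "invoice"],
    "minutes":    [30, 45, 90, 20, 25],
})

# The average processing time per department/task serves as the reference value
reference = df.groupby(["department", "task"])["minutes"].mean()
print(reference)

# Flag records taking more than 1.5x the reference time (assumed threshold)
df["reference"] = df.groupby(["department", "task"])["minutes"].transform("mean")
df["needs_review"] = df["minutes"] > 1.5 * df["reference"]
print(df[df["needs_review"]])
```

Because the reference comes from measured operation time rather than interviews, outliers like the 90-minute quote stand out objectively.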
3 Task to be improved
The third analytical perspective of task mining is the discovery of tasks with improvement potential. While the previous two perspectives (created value and efficiency) focus on processing time, this one focuses on the flow of work.
First, we find and extract the tasks that look like they could be improved in some way. The main targets for extraction are "routine patterns", "multiple mistakes" and "repetition".
A "routine pattern" is a series of steps performed in a fixed sequence; day-off requests and business-travel settlements are typical examples. Such procedures are often systematized in business systems, in which case they can be analyzed by process mining, but even when they are not, they can be discovered by task mining. "Multiple mistakes" and "repetitions" are outliers found in the flow of app and file operations, many of which involve a large amount of copy and paste.
Although specific improvement measures for these potential improvement tasks may occur on a case-by-case basis, automation with RPA is the most likely solution.
Above, we have explained that in task mining, the analysis is carried out from the three perspectives of “created value,” “efficiency,” and “task to be improved”.
Task mining can also do other things, such as detecting non-compliant operations, but this is less relevant to improving labor productivity and will be discussed at another time.
What is RPD – Robotic Process Discovery?
Robotic Process Discovery (RPD) is essentially synonymous with "task mining". That is, it collects and analyzes PC interaction logs, the history of a user's operations on applications and files such as Excel, PowerPoint, and browsers on his or her own PC.
“Task mining” is an expression first proposed by US IT advisory firm Gartner in its report, “Gartner, Market Guide for Process Mining, Marc Kerremans, 17 Jun 2019”. The term “task mining” is already gaining fairly high recognition around the world and in Japan as a general name that includes all solutions for “business visualization” based on PC interaction logs.
“RPD”, on the other hand, is a methodology proposed by Marlon Dumas (Professor at Tartu University) and Marcello La Rosa (Professor at Melbourne University) in 2018 through their research on PC interaction log analysis.
Whereas "task mining" only denotes the broad framework of PC interaction log analysis, RPD specifies a basic procedure for analyzing PC interaction logs aimed mainly at "automation of tasks by RPA".
The following is an overview of how RPD proceeds to analyze the PC interaction log. The references are shown at the end.
Please note that this is a simplified version based on my original understanding. And it should also be noted that the above researchers have recently started to call it RPM (Robotic Process Mining) instead of RPD (Robotic Process Discovery), but I will use RPD in this article.
1 Collection and storage of PC interaction logs
A sensor installed on each PC records the user's operations in detail and accumulates them. Each detailed activity captured by the sensor is called an "atomic activity", because it is the smallest unit that cannot be decomposed any further.
2 Data Extraction and Noise Filtering
PC operation logs are very detailed (hence "atomic activity"), and they contain a lot of noise that cannot be analyzed as-is, such as activities corrected after user error.
Therefore, after extracting PC interaction log data under certain conditions (target period, target PCs, etc.), the noise must first be removed (filtered). In addition, if the same application is recorded under slightly different notations, it will be treated as a different application, so we also unify notation, correct garbled characters, and perform various other data processing besides noise removal. This work is commonly referred to as "data preparation".
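The notation unification and noise filtering described above can be sketched as follows. The alias table and the rule of dropping "undo" actions are invented assumptions for illustration; real preparation rules depend on the sensor and the tools in use.

```python
# Sketch of "data preparation" for PC operation logs: unify application
# notation, then filter obvious noise such as corrective actions.
import pandas as pd

raw = pd.DataFrame({
    "user":   ["u1", "u1", "u1", "u2"],
    "app":    ["EXCEL.EXE", "Microsoft Excel", "chrome.exe", "Chrome"],
    "action": ["open", "paste", "copy", "undo"],
})

# Unify notation: the same application recorded under different names
APP_ALIASES = {"EXCEL.EXE": "Excel", "Microsoft Excel": "Excel",
               "chrome.exe": "Chrome"}
raw["app"] = raw["app"].replace(APP_ALIASES)

# Noise filtering: drop corrective actions (assumed here to be "undo")
clean = raw[raw["action"] != "undo"].reset_index(drop=True)
print(clean)
```

Without the alias step, "EXCEL.EXE" and "Microsoft Excel" would be counted as two different applications and distort every downstream analysis.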
3 Task Segmentation
In RPD, "segmentation" means isolating, from the PC operation log, a group of operations that are assumed to follow a certain procedure. For example, "copying information displayed on a browser screen and pasting it into an Excel file" is a series of operations performed to accomplish some purpose, a so-called "routine task".
Unlike business systems (e.g., procurement systems) with predefined procedures, PC operation gives the user a high degree of freedom; at a glance, PC operation logs look like arbitrary movements between various applications and files, and the business procedures are not obvious.
Therefore "task segmentation" is needed, that is, isolating from the PC operation log only the data related to a single task.
4 Task Simplification
The tasks extracted by segmentation, such as data transcription, still contain some noise, much of it caused by user mistakes or parallel operation of other applications. Once this noise is removed, the steps of each PC operation become clear. The aforementioned example reveals the following clear flow of steps:
Excel File Open (Excel) ⇒ Data Display Screen Access (Browser) ⇒ Data Copy (Browser) ⇒ Paste (Excel) ⇒ Data Display Screen Access (Browser) ⇒ Data Copy (Browser) ⇒ Paste (Excel)…
This finishing work, which makes the procedure clearly understandable, is called "task simplification".
5 Identification of candidate tasks which can be automated
From the PC interaction log data extracted for analysis, we were able to isolate multiple tasks through task segmentation and clearly understand the flow of each task through task simplification.
The next step is to consider which of these tasks are suitable for RPA automation and whether they are likely to produce a reasonable effect. At this stage, it is advisable to interview the person in charge in the field who is actually performing the candidate task in detail. (In reality, even the task segmentation and task-simplification stages can be done quickly with the help of field personnel.)
6 Automatable procedure discovery
This is the stage where the scope of automation with RPA is determined. The tasks identified in the previous section as being better suited for automation are not necessarily all automatable from beginning to end.
So, we will further narrow down the steps that can be automated. For example, if the procedure of the automation candidate task identified in the previous section is [A ⇒ B ⇒ C ⇒ D ⇒ E ⇒ F], then only [C ⇒ D ⇒ E ⇒ F] is to be automated (A ⇒ B remains the current one).
7 Create specifications for automation procedures
Once you have narrowed down the steps that can be automated, consider the requirements for the automatic execution of the task by any RPA tool and create a “basic design document” for the programming in the next section.
8 RPA programming
The actual automation procedure is implemented in an RPA tool. After testing in the production environment and verifying that it works without problems, the RPA robot is ready to go live.
RPD may be hard to picture from a verbal explanation alone, but the above is its general procedure.
Whatever you call it, RPD, RPM or task mining, the main focus is to develop improvement measures, with the primary objective of improving productivity through visualization of the work done on each PC. Specific improvement measures vary, but I hope it is clear that RPD is an analysis method whose basic purpose is, in particular, "task automation".
It should also be emphasized that RPD, or task mining, is most effective when combined with “process mining”, which visualizes business processes across multiple departments.
References:
Robotic Process Mining: Vision and Challenges. Volodymyr Leno, Artem Polyvyanyy, Marlon Dumas, Marcello La Rosa, Fabrizio Maria Maggi.
Discovering Automatable Routines From User Interaction Logs. Antonio Bosco, Adriano Augusto, Marlon Dumas, Marcello La Rosa, Giancarlo Fortino.
AI for Business Process Management: From Process Mining to Automated Process Improvement. Marlon Dumas, University of Tartu Institute of Computer Science.
This article describes standard procedures for analyzing target processes using a process mining tool.
Please note the following points in advance.
The flow of analysis is not usually linear; it is deepened through iterative work, going back and forth between various analytical viewpoints. In particular, trial and error is indispensable in process mining to extract new insights from large and complex event data. There is no guaranteed winning pattern such as "analyze this way and you will discover new insights".
The available features and their names vary depending on the analysis tool used. This article uses general expressions; for your particular tool, map each step to the functionally similar feature it provides.
Now, the basic process mining analysis procedure is as follows.
1 Confirmation of statistical figures for the process to be analyzed
2 Variant analysis
3 Frequency analysis
4 Performance Analysis
5 Rework analysis
6 Problem evaluation meeting
7 Root-cause analysis
8 Planning improvement measures
9 Improved process model creation / effect simulation
10 Report making
Let’s outline the content of each step.
In addition, it is assumed that pre-processed "event log data" has already been uploaded to the process mining tool, that analysis items have been linked (mapped), and that preparation for analysis is complete.
1 Confirmation of statistical figures for the process to be analyzed
Start the analysis by pressing the analysis execution button of the process mining tool. The default screen varies depending on the tool, but first check the screen showing the following basic statistical values for the process to be analyzed, in order to get an overview of the data.
Number of cases to be analyzed (number of case IDs)
Number of completed cases / number of uncompleted cases
Number of activities to be analyzed (total number of activities)
Average / maximum / minimum number of activities per case
Number of occurrences per activity
Analysis target period (From: yyyy/mm/dd To: yyyy/mm/dd)
Average / longest / shortest / median / modal throughput (time)
Throughput distribution (histogram)
Throughput standard deviation
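The statistics above can be computed directly from a minimal event log along these lines. The three-column shape (case ID, activity, timestamp) matches the event-log items discussed throughout this article; the sample data itself is invented.

```python
# Sketch: basic overview statistics a process mining tool shows first,
# computed from a minimal (case_id, activity, timestamp) event log.
import pandas as pd

log = pd.DataFrame({
    "case_id":  [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "activity": ["Start", "Check", "End", "Start", "End",
                 "Start", "Check", "Check", "End"],
    "timestamp": pd.to_datetime([
        "2020-01-01", "2020-01-02", "2020-01-05",
        "2020-01-03", "2020-01-04",
        "2020-01-02", "2020-01-03", "2020-01-06", "2020-01-10",
    ]),
})

n_cases = log["case_id"].nunique()                 # number of case IDs
n_events = len(log)                                # total number of activities
per_case = log.groupby("case_id").size()           # activities per case
# Throughput: time from first to last event of each case
throughput = (log.groupby("case_id")["timestamp"].max()
              - log.groupby("case_id")["timestamp"].min())

print(f"cases: {n_cases}, events: {n_events}")
print(f"activities per case: avg {per_case.mean():.1f}, "
      f"max {per_case.max()}, min {per_case.min()}")
print(f"throughput: avg {throughput.mean()}, max {throughput.max()}")
```

A histogram of `throughput` (e.g. `throughput.dt.days.hist()`) would then show the distribution shape discussed below.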
The most important point to keep in mind when checking the above figures is whether to analyze only completed cases. A completed case is one that includes both its "start activity" and its "end activity" within the analysis period.
In real-world event logs, there are cases whose start activity occurred before the analysis period began, and cases whose "end activity" was not reached within the analysis period. Leaving these "uncompleted cases" in the data will cause problems when calculating throughput.
As a rule, use the filter function at this point to restrict the analysis target to cases that include both the start activity and the end activity within the analysis period. (Of course, depending on the purpose of the analysis, uncompleted cases may also be analyzed.)
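One way this completed-case filter might look, sketched with pandas; the start/end activity names and the tiny log are invented assumptions.

```python
# Sketch: keep only completed cases, i.e. those containing both the start
# activity and the end activity within the analysis period.
import pandas as pd

log = pd.DataFrame({
    "case_id":  [1, 1, 2, 3, 3],
    "activity": ["Order received", "Goods shipped",   # case 1: completed
                 "Goods shipped",                     # case 2: started before period
                 "Order received", "Invoice sent"],   # case 3: end not reached
})

START, END = "Order received", "Goods shipped"

def is_completed(activities):
    """A case is completed if both the start and end activity occur in it."""
    return START in activities.values and END in activities.values

completed_ids = log.groupby("case_id")["activity"].apply(is_completed)
completed_log = log[log["case_id"].map(completed_ids)]
print(completed_log)
```

Only case 1 survives the filter, so throughput statistics computed afterwards are not distorted by truncated cases.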
As for throughput, check not only the average, maximum, minimum, median, and modal values but also the shape of the distribution using a histogram. Process data almost never follows a normal distribution.
Also, a "multi-peak" histogram with several peaks may indicate that, even within the same process, cases that go through every step are jumbled together with simpler cases in which intermediate steps are skipped under certain conditions. (This is a point that should be clearly identified in the next step, variant analysis.)
2 Variant analysis
The flow of the process to be analyzed splits into multiple patterns, triggered by business rules (for example, the flow after a loan application divides into approval and non-approval depending on the screening result), returns to earlier steps, repeated operations, deviations, and so on.
This means there are various variations in the path from the start activity to the end activity. Variant analysis examines each of these variations to find the typical flows as well as the patterns that include deviations.
The first thing to look at in variant analysis is the process pattern with the largest number of cases. This pattern is sometimes referred to as the "happy process" or "happy path", but a large number of processed cases does not necessarily mean it is the ideal flow.
Therefore, check with the process owner or a domain expert (site staff) to confirm which flow is the problem-free, ideal process, that is, the happy process.
Also, if business rules or other conditions split the flow into multiple process patterns midway, a happy path can be considered to exist for each pattern.
On the other hand, process patterns that are considered to deviate from the ideal process described above are identified and recorded. These are issues to raise.
Note that "conformance checking" can also be used to find deviations. However, this functionality should be regarded as advanced analysis rather than standard.
3 Frequency analysis
The numerical values shown by default on the process model, that is, the flowchart screen displaying the process flow, are the number of times each activity in the process was executed and the number of cases that flowed from one activity to another.
Frequency analysis is a detailed examination of these counts. The points of interest are, of course, the locations with large counts: a heavy caseload means a large burden on site staff and long processing times, making them likely bottlenecks where cases pile up.
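Both numbers shown on the process model, activity counts and transition counts, can be sketched from an event log as follows; the toy log and column names are assumptions.

```python
# Sketch: frequency analysis = counting activity occurrences and direct
# transitions (activity A followed by activity B within the same case).
import pandas as pd

log = pd.DataFrame({
    "case_id":  [1, 1, 1, 2, 2, 2],
    "activity": ["Receive", "Approve", "Ship",
                 "Receive", "Approve", "Ship"],
})

# How many times each activity was executed
activity_counts = log["activity"].value_counts()
print(activity_counts)

# Transition counts: pair each event with the next event in the same case
log["next"] = log.groupby("case_id")["activity"].shift(-1)
transitions = log.dropna(subset=["next"]).groupby(["activity", "next"]).size()
print(transitions)
```

On a real log the largest `transitions` entries correspond to the thick arrows a process mining tool draws on the discovered model.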
4 Performance Analysis
Performance analysis is essentially analysis on a "time" scale. It looks at the total time from the start activity to the end activity of a case, that is, the throughput, the processing time of each activity, and the transition time from one activity to the next (waiting time), in order to find inefficient locations and "bottlenecks" with long waits.
For performance analysis, it is important to have a clear reference value for judging whether a time is long or short. For example, in the case of throughput, cases that take longer than the average or median throughput can be flagged as having an overall delay problem.
Likewise, for the processing time and waiting time of each activity, set the average or median as the reference value, or compare against a KPI (Key Performance Indicator) target stating how long the step should take. This is an effective way to extract problems.
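A sketch of the waiting-time side of performance analysis. Treating the gap before an activity as its waiting time is a simplification, and the log is invented; with only start-or-complete timestamps per event, processing and waiting time cannot be separated.

```python
# Sketch: compute waiting time as the gap between consecutive activities in a
# case, then average per activity to surface bottleneck candidates.
import pandas as pd

log = pd.DataFrame({
    "case_id":  [1, 1, 1, 2, 2, 2],
    "activity": ["Apply", "Review", "Approve"] * 2,
    "timestamp": pd.to_datetime([
        "2020-01-01", "2020-01-02", "2020-01-03",
        "2020-01-01", "2020-01-06", "2020-01-07",
    ]),
}).sort_values(["case_id", "timestamp"])

# Waiting time before each activity = gap since the previous event in the case
log["wait"] = log.groupby("case_id")["timestamp"].diff()

# Average wait per activity; long averages point to bottlenecks
bottlenecks = log.dropna(subset=["wait"]).groupby("activity")["wait"].mean()
print(bottlenecks.sort_values(ascending=False))
```

Here "Review" averages a three-day wait against one day for "Approve", so it would be the first bottleneck to investigate.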
5 Rework analysis
Rework, that is, repeated work, should be kept to a minimum; where rework occurs, there may be inefficient procedures or problems that easily trigger it.
There are two main types of rework: a single activity being repeated, and a loop back over multiple activities. The former often involves simple errors and repetitive tasks; the latter means a whole sequence had to be redone, so there are underlying causes here as well.
The problem spots highlighted by rework analysis will also have their root causes pursued as topics in the next step, the "problem evaluation meeting".
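The two rework patterns above can be detected from activity traces roughly as follows; the traces are invented examples.

```python
# Sketch: detect the two rework patterns in activity traces:
# (a) the same activity repeated directly, (b) an activity revisited later
# in the trace, i.e. a loop back over multiple activities.

def has_direct_repetition(trace):
    """Same activity occurs twice in a row, e.g. Check -> Check."""
    return any(a == b for a, b in zip(trace, trace[1:]))

def has_revisit(trace):
    """Some activity occurs more than once anywhere in the trace."""
    return len(set(trace)) < len(trace)

traces = {
    1: ["Receive", "Check", "Check", "Ship"],                # direct repetition
    2: ["Receive", "Check", "Fix", "Check", "Fix", "Ship"],  # loop / rework
    3: ["Receive", "Check", "Ship"],                         # no rework
}

for case_id, trace in traces.items():
    print(case_id, "direct repetition:", has_direct_repetition(trace),
          "revisit/loop:", has_revisit(trace))
```

Cases flagged by `has_revisit` but not `has_direct_repetition` are the multi-activity loops, the pattern most worth investigating for underlying causes.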
6 Problem evaluation meeting
The issues identified through variant analysis, frequency analysis, performance analysis, and rework analysis are listed, and an evaluation meeting with stakeholders should be held.
While showing the analysis results to the process owner and domain experts (on-site staff), confirm problems and potential issues such as the following:
The cause of the large number of cases for a specific activity or flow
The cause of throughput being longer (or shorter) than the reference value
The cause of an activity's processing time or waiting time being longer than the reference value
The reason procedures deviating from or skipping the standard procedure occur
The reason reworks occur
Checking with the process owner or domain expert may reveal, for example, that what looked like a deviating procedure was actually a predetermined one, or that a longer processing time at a certain point is not a problem in practice.
The problems revealed by process mining analysis are only phenomena; it is necessary to determine whether they actually harm the company, whether they should be corrected, and whether they can be corrected. To do so, it is essential to narrow the list down to the "real problems to be solved" through interviews with stakeholders who understand the actual process well.
7 Root-cause analysis
After examining the issues, we conduct deeper analysis to elucidate the root causes. In process mining analysis, we look for clues to problems through cross-analysis with various attributes.
Which attributes to analyze depends on the target process, but the following attribute analyses are common:
By customer (type)
By procurement department
By purchase price
By order price
For example, in the purchasing process, such cross-analysis can show which supplier or procured material tends to cause rework, under what conditions business rules are applied, and whether the business rules are operated properly, helping you get closer to the root cause.
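A minimal sketch of such a cross-analysis, a rework rate per supplier; the data and column names are invented for illustration.

```python
# Sketch: root-cause clues via attribute cross-analysis -- compare the
# rework rate across suppliers in a purchasing process.
import pandas as pd

cases = pd.DataFrame({
    "case_id":    [1, 2, 3, 4, 5, 6],
    "supplier":   ["A", "A", "A", "B", "B", "B"],
    "had_rework": [True, True, False, False, False, True],
})

# Share of cases with rework, per supplier
rework_rate = cases.groupby("supplier")["had_rework"].mean()
print(rework_rate)
```

Supplier A's rate (2 of 3 cases) clearly stands out against supplier B's (1 of 3), which is exactly the kind of clue that directs follow-up interviews.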
However, when the manual work of field staff is what ultimately causes the inefficiency, that work is not recorded in the event log, so data analysis alone cannot reach the true cause.
Therefore, root cause analysis must use every available means, regardless of whether data exists, combining interviews, workshops, observational surveys of field staff, and the like.
8 Planning improvement measures
Once the real problems to be solved and their root causes have been identified, you can finally plan improvements. A draft improvement plan is created by, for example, a process analyst or business consultant. Which improvement measures are effective, and which should be given the highest priority, is then decided by a study group that includes process owners and domain experts.
What kind of process improvement to make is, of course, case by case, but the following are standard approaches:
Reorganization of business procedures
Removal of problem activity
Parallel processing of multiple activities
Optimization of staffing
9 Improved process model creation and effect simulation
After the improvement measures have been drafted, create the "improved process" that would result from implementing them, and simulate it as far as possible. Examine how much the number of process steps can be reduced, and how much time and cost can be saved, by comparing the current process (as-is process) with the improved process (to-be process).
When creating the improved process, you should create a "BPMN-compliant" flow using a modeling tool. Some process mining tools include BPMN-compliant modeling capabilities; if yours does not, you will need a separate modeling tool.
10 Report making
A report is created summarizing the analysis results so far, the improved process based on the improvement measures, and the results of the effect-verification simulation. A typical report title is "Process Mining Analysis Report for the XXX Process: Problems and Solutions".
Depending on the purpose and situation of the analysis project, the report may stop at the stage before improvement measures are planned, that is, at problem identification.
Experts who are well versed in process mining are called "process miners".
Process mining itself can be narrowly defined as an "analysis method", but if it is viewed broadly as the main methodology for business process reform/improvement projects and continuous process improvement, the domains a process miner is involved in are as follows:
Overall management of the business reform/improvement project
Scoping (analysis planning)
Extracting event log data from various systems
Data preprocessing (cleaning, etc.)
Creating operation/analysis dashboards in process mining tools
Root cause analysis and improvement planning
Developing and establishing improvement measures
Establishing a continuous monitoring and improvement system
Of course, it would be difficult for one individual to do all of the above. In reality, a team of data scientists, system administrators, tool operators, business analysts, business consultants, and others will work together.
I believe that a process miner, whatever the job title, is someone with the knowledge and experience to derive insights from data analysis of business processes and turn them into improvements.
Therefore, to be recognized as a process miner, each person needs to acquire a wide range of process-related knowledge on top of his or her own specialty.
Here is the knowledge you need to acquire as a process miner.
1 Basic knowledge of process mining
It is desirable to understand what process mining is in the first place, its underlying principles, and the basic approach of the algorithms that reconstruct a process model from event log data.
I regard basic knowledge of process mining as a fundamental subject for everyone involved in it.
2 Business analysis
Business analysis is a comprehensive methodology for collecting and analyzing data about various aspects of a business. Its main aims are to understand the current situation as accurately as possible and to clearly identify business issues, leading to better strategy planning, effective system requirements definition, and organizational restructuring.
Viewed as an analysis method, process mining can be positioned as a process-specialized technique within business analysis. In a real process mining or business reform/improvement project, analyzing event log data alone is not enough: peripheral investigation and analysis, root-cause interviews, workshops, and so on are often performed as well. It is therefore better to have knowledge of business analysis.
Business analysis knowledge is something a business analyst needs, of course, but it is also desirable for data scientists.
3 Business Process Management (BPM)
Business process management (BPM) literally means managing business processes appropriately. Its purpose is to increase profits, reduce costs, and improve customer satisfaction through continuous improvement of business processes.
To that end, BPM develops and organizes various ideas and methods covering the entire cycle from process development to operation and transformation. Process mining can support each phase of BPM in various forms, such as understanding the current situation based on data, simulating improvement measures, and providing operational support.
Since the ultimate goal of process mining is to reform and improve processes, knowledge of BPM is very important. It is a must for business analysts and business consultants involved in process mining.
4 Databases
Most event logs analyzed by process mining tools are extracted from transaction data stored in the databases of business systems.
Depending on the system, a function to download past operation and transaction history in bulk may be available. If not, you need to analyze the structure of the database and determine which tables contain the data items required for analysis, which data items should be used as keys to join the related tables, and how to extract the three required items of an event log: case ID (process ID), activity, and timestamp.
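A minimal sketch of such an extraction using SQLite in Python. The table and column names are hypothetical, standing in for a real business system's schema; the point is joining the history table to its master table and selecting the three event-log items.

```python
# Sketch: extract the three event-log items (case ID, activity, timestamp)
# by joining hypothetical business-system tables with SQL.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (order_id INTEGER, customer TEXT);
CREATE TABLE order_history (order_id INTEGER, status TEXT, changed_at TEXT);
INSERT INTO orders VALUES (1, 'ACME');
INSERT INTO order_history VALUES
  (1, 'Order received', '2020-01-01'),
  (1, 'Goods shipped',  '2020-01-03');
""")

# order_id serves as the case ID, status as the activity,
# changed_at as the timestamp; order_id is the join key between tables.
rows = con.execute("""
    SELECT h.order_id AS case_id, h.status AS activity, h.changed_at AS ts
    FROM order_history h
    JOIN orders o ON o.order_id = h.order_id
    ORDER BY h.order_id, h.changed_at
""").fetchall()
print(rows)
```

The result is already in the flat (case ID, activity, timestamp) shape that process mining tools expect as input.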
To do this, you need to understand at least how databases, especially relational databases, are structured, in addition to understanding business systems such as the ERP described below.
Basic database knowledge is essential for data scientists, IT managers, and process analysts involved in process mining.
5 Business system (ERP, CRM, etc…)
The processes most often analyzed in process mining are P2P (Procure-to-Pay), the purchasing process, and O2C (Order-to-Cash), the order-to-payment process, which are typically run on ERP systems such as SAP and Oracle.
ERP is a business system that covers all corporate activities, so its database structure is complicated and its functions are diverse. Understanding the basic architecture, DB structure, and functional relationships of ERPs such as SAP and Oracle, on top of basic database knowledge, makes event log extraction and preprocessing go much more smoothly.
ERP knowledge is a must for data scientists, IT managers, and business consultants engaged in process mining.
6 ETL & Scripts(SQL, Python, etc…)
Data (transaction data) extracted from various systems cannot be uploaded to a process mining tool as-is. It must be transformed into a format the tool can analyze and cleaned of noise and other defects.
This step is called "data preprocessing", and "ETL" tools are useful for it. I recommend the open-source "KNIME", but many other tools offer equivalent functionality, so adopt whichever is easiest for you to use. In any case, if you are in charge of data preprocessing, learning an ETL tool pays off.
Of course, engineers proficient in SQL, Python, R, and the like can perform data preprocessing with scripts. However, ETL tools have the advantage that the preprocessing flow can be shown visually as a "workflow", that is, a processing procedure, which makes it easy to explain to stakeholders.
Some process mining tools have built-in connectors that use API or ODBC connections to various systems and databases to extract data and perform some preprocessing. However, if the source system is customized, or garbled characters and missing data need to be corrected, separate processing with an ETL tool or scripts will still be necessary.
ETL and scripting are essential knowledge and skills for data scientists and IT managers.
7 Tool operation
Tool operation differs considerably between process mining tools, so you have to learn by using the tool you actually operate. For advanced analysis, however, standard functions are often insufficient, and dashboards are customized with various charts. This requires the same approach as BI design (setting analysis dimensions and aggregation measures) and SQL-like formulas, so you must learn and adapt accordingly.
8 Lean Six Sigma
Lean Six Sigma, which originates from the Toyota Production System, has been adopted by many companies worldwide as an approach to help identify root causes and plan solutions for business process reforms and improvements.
Process mining merely exposes process inefficiencies and bottlenecks as phenomena; it does not tell us the root cause of why those problems occur, and naturally it does not hand us solutions either (although it gives us clues).
Therefore, a separate methodology is needed to identify root causes, plan solutions, prioritize them, and turn them into a concrete implementation plan based on the current-state findings from process mining and other analyses.
Lean Six Sigma is essential knowledge and
skills for business consultants engaged in process mining.
Market leader Celonis already has 800 employees, has succeeded in large-scale fundraising, and is recognized as a “unicorn”. Software AG, which occupies a corner of the leader group, is known for its “ARIS” brand and has been putting effort into selling “ARIS Process Mining”. In addition, UiPath has renamed the recently acquired ProcessGold to “UiPath Process Mining” and is strengthening its offering as a total solution that includes RPA, UiPath’s strength.
The “Major Contenders”, the main competing vendors challenging the leader group, are in a state of fierce rivalry. As far as I know, the only three with some kind of contact point in Japan are ABBYY Timeline, LANA Lab, and myInvenio.
Let’s consider the value of process mining from the viewpoint of “business strategy”.
The strategic framework used this time is the “Value Discipline Model” proposed by M. Treacy and F. Wiersema in “The Discipline of Market Leaders”.
In the book, they argued that businesses
should focus on where each company can provide unique value, rather than trying
to provide everything to every customer. They state that there are three
strategic directions as shown in the value model below.
1 PRODUCT LEADERSHIP
- Enhance the competitive advantage of products and provide the best products.
2 CUSTOMER INTIMACY
- Build close relationships with customers and provide the best service.
3 OPERATIONAL EXCELLENCE
- Improve business efficiency and offer customers the best price.
Product leadership is an approach that enhances competitive advantage by offering unique features not found in competing products. Apple is a typical example of product leadership: it has introduced the iPod, iPhone, iPad, and so on one after another, combining superior functionality with attractive external design.
Customer intimacy is an approach built on “intimate” service that treats customers with tailored care and builds and maintains strong ties, as the Ritz-Carlton Hotel does.
Operational excellence is an approach that maximizes efficiency, lowers costs, and offers customers a competitive price. McDonald’s is a company primarily pursuing this approach in the hamburger industry.
By the way, these three basic strategies were proposed in 1995, 25 years ago, but they can still be said to be effective as a way of thinking, since each of them requires a huge investment of management resources and excellent business operations.
However, it is important to note that the dramatic advances in various technologies, and especially the advance of digitalization, have made “product leadership”, that is, continuing to differentiate products, extremely difficult.
For example, in the automobile industry, in the era of gasoline cars, proprietary technology was required for each component, whether engine or transmission, and imitation was not so easy. However, many parts are modularized in electric vehicles, making it difficult to create a difference in basic performance.
As for “customer intimacy”, manufacturers and consumers can now interact directly through the Internet, and because much consumer behavior takes place online, vast amounts of data on individual customers can be accumulated and analyzed, making finely tailored responses possible.
The ultimate form of “customer intimacy” is not “one-to-many” but “one-to-one”, and digital technology is making that increasingly realistic. As a result, many companies will be able to provide the same level of customer intimacy, and may no longer be able to fully demonstrate unique value there.
But what about “operational excellence”? Even if processes can be systematized and automated to some extent, it will become increasingly difficult to keep operating costs appropriate and maintain margins while responding to sophisticated and complex customer needs with an appropriate level of customer intimacy.
The pursuit of operational excellence is,
after all, the optimal design of end-to-end business processes and ensuring the
execution of daily business processes.
In other words, how thoroughly business
process management (BPM) is performed is the key to achieving operational
excellence. And, process mining helps BPM to understand the current process
based on data and identify inefficiencies and bottlenecks. It can also be used
for simulations to develop ideal processes and for real-time monitoring of
daily business processes.
Given this, I think process mining can be said to have the highest value for companies that pursue operational excellence among the three value disciplines.
I should add that, in the first place, the execution of any strategy breaks down into business processes, and those business processes must be executed as expected, so the value of process mining is not limited to the pursuit of operational excellence.
The term “Process Intelligence” is often used in process mining documentation and vendor information.
The definition of “Process Intelligence” varies from vendor to vendor and there is no single, clear definition; therefore many people are confused about how it differs from process mining.
In this article, I would like to explain the relationship and differences between “process mining” and “Process Intelligence”, touching on “Business Intelligence” at the same time.
In fact, the relationship between the two is clear, as illustrated in the “Process Mining Manifesto” published in 2011.
As you can see, the largest frame is “Business Intelligence”, inside which is “Process Intelligence”, and inside which is “process mining”. They form a nested relationship.
Business intelligence is literally the collection and analysis of all business-related data and information. The analysis is often conducted using so-called BI tools. Typically, starting from financial data such as sales and profits, we look at trends by fiscal year, month, and week, and drill down by area and product to identify the areas and product categories that contribute to sales and profits, as well as the factors hindering them. This is business intelligence.
“Process Intelligence” is the part of business intelligence analysis that focuses on data and information related to business processes. Furthermore, “process mining” is an analytical method within process intelligence based on the flow of business processes, or “control flow”.
Some people ask: since process mining is basically included in business intelligence, can it be replaced by business intelligence?
However, the basic function of process mining, “(automatic) process discovery”, requires special algorithms that BI tools are typically not equipped with. And it is not practical to build process discovery algorithms from scratch on top of BI capabilities (even if you could build an elementary one, the process model it reproduced would be unreliable).
Therefore, if you want to do process mining, you need to use a dedicated process mining tool, which BI cannot replace.
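To see why process discovery is more than simple aggregation, consider the “directly-follows” relation that discovery algorithms start from. Below is a minimal Python sketch over a toy event log with invented case IDs and activity names; real algorithms such as the Alpha Algorithm or Inductive Miner go far beyond this counting step to infer concurrency and loops.

```python
from collections import Counter

# A toy event log: each case is an ordered trace of activities
# (case IDs and activity names are invented for illustration).
cases = {
    "c1": ["Register", "Check", "Approve", "Pay"],
    "c2": ["Register", "Check", "Reject"],
    "c3": ["Register", "Approve", "Pay"],
}

def directly_follows(cases):
    """Count how often activity a is immediately followed by activity b."""
    df = Counter()
    for trace in cases.values():
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

df = directly_follows(cases)
print(df[("Register", "Check")])  # 2
```

A BI tool can produce counts like these, but turning them into a sound process model (deciding which relations mean choice, parallelism, or loops) is exactly what the dedicated algorithms are for.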
So what does process intelligence cover?
In addition to process discovery using special algorithms, process mining tools calculate various statistical values and present them in various tables and graphs.
For example: the number of cases in the process being analyzed, the throughput time from start to finish of a case (cycle time), the frequency and processing time of each activity, the transition time (path time) from one activity to another, and the waiting time. The average, maximum, minimum, median, and standard deviation of these values can also be checked.
These statistics can be calculated with simple arithmetic operations, without the need for special algorithms; they are easy to produce with BI. That is the area “Process Intelligence” covers.
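As an illustration of how simple these statistics are, the Python sketch below computes per-case cycle times and their summary statistics from a tiny event log; the case IDs, activities, and timestamps are invented for illustration.

```python
from datetime import datetime
from statistics import mean, median

# Hypothetical events as (case_id, activity, timestamp);
# names and times are invented for illustration.
events = [
    ("c1", "Create Order", datetime(2023, 1, 1, 9, 0)),
    ("c1", "Ship", datetime(2023, 1, 1, 17, 0)),
    ("c2", "Create Order", datetime(2023, 1, 2, 9, 0)),
    ("c2", "Ship", datetime(2023, 1, 2, 13, 0)),
]

def cycle_times_hours(events):
    """Throughput (cycle) time per case: last timestamp minus first."""
    first, last = {}, {}
    for case, _, ts in events:
        first[case] = min(first.get(case, ts), ts)
        last[case] = max(last.get(case, ts), ts)
    return {c: (last[c] - first[c]).total_seconds() / 3600 for c in first}

ct = cycle_times_hours(events)
print(mean(ct.values()), median(ct.values()), max(ct.values()))  # 6.0 6.0 8.0
```

This is plain arithmetic over timestamps, which is why any BI tool can cover this part of process intelligence.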
In process mining analysis, based on the as-is process model discovered by the algorithm, the various paths cases take are verified through “variant analysis”, and comparative analysis against the ideal to-be process, that is, conformance checking, is performed.
In addition, problem activities whose processing time exceeds KPIs and bottlenecks where waiting time is too long are identified; basic statistics such as case counts, processing times, and waiting times are important here.
In other words, process mining drills down into the process intelligence figures from various perspectives in conjunction with the process model.
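As a minimal sketch of what variant analysis looks like, the Python below groups cases by their activity sequence and counts those matching an assumed to-be trace. The case IDs, activities, and to-be path are invented, and the exact-match check is deliberately naive; production conformance checking uses token replay or alignment algorithms instead.

```python
from collections import Counter

# Each case's trace of activities; a "variant" is a distinct trace.
# Case IDs, activities, and the to-be path are invented for illustration.
traces = {
    "c1": ("Register", "Check", "Approve", "Pay"),
    "c2": ("Register", "Check", "Approve", "Pay"),
    "c3": ("Register", "Approve", "Pay"),
    "c4": ("Register", "Check", "Reject"),
}

variants = Counter(traces.values())

# Naive conformance check: count cases that exactly match the to-be trace.
# (Real conformance checking uses token replay or alignments instead.)
to_be = ("Register", "Check", "Approve", "Pay")
conforming = sum(n for t, n in variants.items() if t == to_be)
print(conforming, "of", sum(traces.values().__len__() * [1]) and len(traces), "cases follow the to-be path")
```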
The major process mining tools have standard process intelligence capabilities, with dashboards that visually represent the various numbers, as well as algorithms for creating process models. In this sense, it is safe to say that current process mining tools are “process intelligence tools”.
Process mining is a young technology that
just turned 20 in 2019. Here’s a brief history of process mining.
The creator of process mining is the Dutch researcher Wil van der Aalst, professor at RWTH Aachen University, who is called the “Godfather of Process Mining”.
A world-renowned computer scientist, van der Aalst’s key areas of expertise include information systems, workflow management, and process mining, and at RWTH Aachen he leads the Process and Data Science group.
In the late 1990s, while studying workflow and workflow management at Eindhoven University of Technology (TUe) in the Netherlands, van der Aalst was challenged by the fact that the existing methods for understanding current business processes, interviews and workshops, could only draw incomplete process models based on subjective and fragmented information.
On the other hand, in the 1990s, business systems such as SAP’s ERP became widespread, and much of the work in various departments of companies and organizations came to be conducted on IT systems.
So Prof. Aalst came up with the idea that
business processes could be replicated from the operational history, or event
log, recorded in IT systems. According to Aalst, the term “process
mining” was first used in a research proposal he wrote in 1998.
Van der Aalst began working on process mining in earnest in 1999. Therefore, 1999 is the year of the birth of process mining, and the Netherlands is its birthplace. Since the early 2000s, academic research has been actively carried out by TUe and other European universities.
The first algorithm used to reconstruct a process model from the event log was the “Alpha Algorithm”. Since then, various algorithms such as “Heuristic Miner” and “Inductive Miner” have been developed to produce more reliable process models.
In 2004, the first version of “ProM”, an open-source process mining tool, was developed. ProM is still being upgraded and is mainly used for research; the latest algorithms and new functions are provided as plug-ins one after another.
The first company specializing in process mining, “Futura” (which no longer exists), was founded in 2007. Since around 2010, companies specializing in process mining have emerged one after another.
ProcessGold, which was acquired by UiPath in October 2019, was established in 2009. Anne Rozinat, who studied process mining under van der Aalst, founded Fluxicon in 2010 after graduating to develop a process mining tool called “Disco”. In 2011, the current industry leader, Celonis, was born.
With new process mining tools coming onto the market since 2010, it is Anne Rozinat of Fluxicon who, besides Prof. van der Aalst, has done the most to raise awareness and understanding of process mining in Europe, holding the “Process Mining Camp” every year since 2012.
Van der Aalst published his first book on process mining, “Process Mining: Data Science in Action”, in 2011 (the current edition is from 2016), and in 2014 developed and began offering a MOOC, an e-learning course with the same title as the book, on Coursera. Tens of thousands of people around the world have taken this e-learning course, helping to spread the basic knowledge and know-how of process mining.
In Europe, process mining started becoming widely used around 2015. Since 2018, it has gained attention as a major growth area following RPA. In 2019, the first international process mining conference, the “International Conference on Process Mining 2019”, was held in Aachen, Germany. In 2020, the conference will be held in Padua, Italy.
Outside of Europe, Australian practice is centered around researchers at the University of Melbourne, where the open-source process mining tool “Apromore” was developed.
Process mining has been practically introduced in both the U.S. and Japan since 2019. In Japan, the “Process Mining Conference 2019” hosted by Impress attracted 500 participants and showed great interest.
In Korea, Puzzle Data, whose “ProDiscovery” tool was developed by a researcher who studied under van der Aalst, has a track record of introducing process mining at Korean companies.
In the case of Japan, although the two tools Celonis and myInvenio currently have a strong presence in the market, it is certain that various process mining tools will be introduced to Japan and that the market will expand.