Scott Menter – BP Logix
In this demo, BP Logix shows how they have integrated machine learning into their Process Director, so that it can be used in combination with processes.
In this case we are trying to predict employee attrition (whether an employee is likely to leave the company). You start by creating a learner object. After selecting a datasource (a database) and optionally applying some transformations, you select which inputs to use (the tool offers information and even suggestions about the available data, and can visualize characteristics of the selected data) and train the model on it.
The trained model can then be used in, for example, a form to show the predicted attrition risk while you are filling in information about an employee, or in a process to drive a decision.
By integrating the learner objects into Process Director, the learning curve to start using this is much lower, as it's all integrated in one solution (even if the learner objects might actually be built by a different actor).
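Conceptually, the learner-object workflow (pick a datasource, select input features, train, then query the model from a form or process) maps onto a standard classification setup. Below is a minimal sketch in plain Python with invented feature names and toy data — this illustrates the idea only, not BP Logix's actual implementation:

```python
import math

# Toy training data: one row per (former) employee.
# Hypothetical features: (years_at_company, overtime_hours_per_week).
# Label: 1 = left the company, 0 = stayed.
TRAINING_DATA = [
    ((1.0, 20.0), 1),
    ((2.0, 15.0), 1),
    ((8.0, 2.0), 0),
    ((10.0, 5.0), 0),
    ((3.0, 12.0), 1),
    ((7.0, 3.0), 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.1, epochs=2000):
    """Fit a logistic-regression model with plain stochastic gradient descent."""
    n_features = len(data[0][0])
    weights = [0.0] * n_features
    bias = 0.0
    for _ in range(epochs):
        for features, label in data:
            pred = sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)
            error = pred - label
            weights = [w - lr * error * x for w, x in zip(weights, features)]
            bias -= lr * error
    return weights, bias

def predict_attrition_risk(model, features):
    """Return the predicted probability that the employee leaves."""
    weights, bias = model
    return sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)

model = train(TRAINING_DATA)
# A form could call this while data about an employee is being entered:
risk = predict_attrition_risk(model, (2.0, 18.0))  # short tenure, lots of overtime
print(f"attrition risk: {risk:.2f}")
```

The point of embedding this in one platform is that a form or process only ever sees the last call: hand it the current field values, get a risk score back.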
Leveraging process mining to enable human and robot collaboration
Michal Rosik – Minit
Minit suggests using process mining to improve your RPA strategy. The strategy is two-fold: (1) use process mining to pick the right process to apply RPA to, and to select the right activity and person, for a higher chance of success (as 40% of RPA projects fail); and (2) monitor the results afterwards to make sure everyone is happy.
They apply this to a purchase process, where various bottlenecks are detected, such as filling in the right order number (using standard process mining). The tool lets you drill down several layers to inspect the details of a selected activity: how, for example, the human actor uses a combination of the browser, Skype, etc., and the steps they take (possibly in multiple variations) to get the necessary information. These detailed steps can then be used as a basis to generate the RPA script.
After applying the RPA robots to automate some of the steps, the same process mining can be used to monitor and compare the results. For example, the average completion time might not have improved as expected, in which case we can analyze why (for example, the bots might be creating an increased load on the system, causing performance issues).
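The monitoring side is a before/after comparison on the same kind of log data. A toy sketch with invented case timings — deliberately chosen so the average actually got slightly worse after the bots, which is exactly the situation where you would start drilling down:

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

def average_completion_minutes(cases):
    """cases: {case id: (time of first event, time of last event)}."""
    total = sum(
        (datetime.strptime(end, FMT)
         - datetime.strptime(start, FMT)).total_seconds() / 60
        for start, end in cases.values()
    )
    return total / len(cases)

# Hypothetical end-to-end case timings before and after introducing the bots.
before = {"PO-1": ("2019-04-01 09:00", "2019-04-01 10:30"),
          "PO-2": ("2019-04-01 11:00", "2019-04-01 12:20")}
after  = {"PO-9":  ("2019-05-01 09:00", "2019-05-01 10:25"),
          "PO-10": ("2019-05-01 11:00", "2019-05-01 12:30")}

delta = average_completion_minutes(before) - average_completion_minutes(after)
# In this made-up data the delta is negative: cases got slower on average,
# so the next step is to investigate (e.g. increased system load from the bots).
print(f"average completion time changed by {delta:.1f} minutes")
```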
Finally, Minit dashboards expose all this information in interactive BI charts.
Process mining and DTO – How to derive business rules and ROI from the data
Massimiliano Delsante, Luca Fontanili – Cognitive Technology
Cognitive Technology is moving from traditional process mining to creating a Digital Twin of your Organization (DTO). This includes process discovery, cost analysis, simulation, etc., but also, for example, a new feature to derive actual business rules from the data (rather than the traditional probabilities).
The demo shows the use case of closing a bank account. They can generate a BPMN diagram from the mining data, but now they also detect correlations for decisions (gateways) using machine learning, to discover the underlying conditions. After verification by a human actor and/or through simulation, these conditions can be added to the process. The decision can also be extracted separately as DMN, called from the process model. Finally, simulation can be used to identify possible improvements, for example by applying RPA to automate some of the tasks: the simulation engine generates new data reflecting the suggested improvements, and this data can then be mined again to verify the results.
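Discovering the condition behind a gateway is essentially a supervised-learning problem: each historical case carries its attribute values plus the branch it took, and the learner looks for the rule that separates them. A toy sketch that learns a single-threshold rule — real tools use full ML models, and the attribute names, branch names, and data here are all invented:

```python
def learn_threshold_rule(cases, attribute, branch_label):
    """Find the numeric threshold on `attribute` that best predicts taking
    `branch_label`, by trying each observed value as a candidate split."""
    best = (None, -1)
    for threshold in sorted({c[attribute] for c in cases}):
        correct = sum(
            1 for c in cases
            if (c[attribute] >= threshold) == (c["branch"] == branch_label)
        )
        if correct > best[1]:
            best = (threshold, correct)
    threshold, correct = best
    return threshold, correct / len(cases)

# Mined cases for a "close bank account" process; the gateway routes either
# to "manual review" or "auto close". Attribute and branch names are hypothetical.
CASES = [
    {"balance": 15000, "branch": "manual review"},
    {"balance": 9000,  "branch": "auto close"},
    {"balance": 20000, "branch": "manual review"},
    {"balance": 500,   "branch": "auto close"},
    {"balance": 12000, "branch": "manual review"},
    {"balance": 3000,  "branch": "auto close"},
]

threshold, accuracy = learn_threshold_rule(CASES, "balance", "manual review")
print(f"rule: balance >= {threshold} -> manual review ({accuracy:.0%} of cases)")
```

The human-verification step in the demo fits naturally here: an analyst confirms (or rejects) the discovered rule before it is attached to the gateway, or extracted into a DMN decision table.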
Is the Citizen Developer Story a Fairytale?
Neil Miller, KissFlow
KissFlow is a no-code platform for citizen developers. Neil starts by showing the runtime application: various kinds of forms to start a process, track current state, perform work, etc. These forms have some fairly advanced features for loading form data, printing, getting assistance, etc.
Next, we shifted to the tool used to create all this. First the forms, composed of various field types: from text fields and dropdowns to tables and advanced fields like signatures. The process itself is built with a drag-and-drop tool that uses quite a different visualization – still a flow chart, but one that tries to be as simple as possible for citizen developers (with inline editing of actions inside the diagram, etc.), which reminded me a lot of Zapier for defining integrations.
They are also working on a new KissFlow version 3.0, which will be available soon. The forms and process modeling still look pretty similar, but this new version adds various features to simplify collaboration: threads where people can collaborate, more adaptive processes, kanban boards, more extensive reports, etc.
Insightful process analysis
Jude Chagas Pereira, Frank Kowalkowski, Gil Laware
Wizly is a tool that lets you run analyses on collected log data: compliance checks, correlation checks, relationship and sentiment analysis, etc.
The demo shows a call center use case. After loading the data of about 2000 cases into the tool, a flow model can be generated from the log data and we can start running analytics. The compliance analysis shows various information about the paths that are (or are not) being executed. Next, we can run further analyses, in this case zooming in on baggage-related problems. This allows us to find possible causes (like canceled flights), but also to filter down further to get even more insights.
The DNA analysis detects possible paths and can visualize relations in your data (with the ability to filter down further if necessary). Finally, the fourbox plots the data on a form of bubble chart. They were only able to show some of the features; as they explained, there are a lot more analytical capabilities under the hood.
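The path and compliance analysis described above can be pictured as grouping cases into variants (distinct paths) and checking each variant against a set of allowed paths. A minimal sketch on an invented call-center log — this is just the general idea, not Wizly's actual model:

```python
from collections import Counter

# Invented call-center log: the ordered activities of each case.
CASES = {
    "c1": ["Call received", "Identify issue", "Resolve", "Close"],
    "c2": ["Call received", "Identify issue", "Escalate", "Resolve", "Close"],
    "c3": ["Call received", "Identify issue", "Resolve", "Close"],
    "c4": ["Call received", "Close"],  # closed without any resolution step
}

# The paths the process is supposed to follow.
ALLOWED_PATHS = {
    ("Call received", "Identify issue", "Resolve", "Close"),
    ("Call received", "Identify issue", "Escalate", "Resolve", "Close"),
}

# Group cases into variants with their frequencies.
variants = Counter(tuple(path) for path in CASES.values())

# Compliance check: which variants fall outside the allowed paths, and how often?
violations = {variant: count for variant, count in variants.items()
              if variant not in ALLOWED_PATHS}

for variant, count in violations.items():
    print(f"{count} case(s) took a non-compliant path: {' -> '.join(variant)}")
```

Filtering down, as in the demo, then amounts to re-running the same grouping on a subset of cases (e.g. only the baggage-related ones).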
Improving the execution of work with an AI driven automation platform
Kramer Reeves, Michael Lim, Jeff Goodhue – IBM
Over the last few years, IBM has worked hard to integrate some of their offerings into the unified platform they are presenting here.
This demo starts with authoring, where the case builder, process designer and decision center are combined to define the business logic. Next, we switched to the runtime UI, where new cases can be started and managed, and we ran through a few steps of a case.
Next, they showed some more advanced integrations: a robot launched to automatically perform one of the steps, interaction with a chatbot to help find the data you need, analysis charts to help with decision making, etc. The final step was using Watson AI to make recommendations.
Finally, we got a look at the new Business Automation Studio, where you can build business applications in a low-code manner. You can create forms for business users, and these can be linked (by associating actions with the different buttons) to call new pages or backend functions.
That concludes day 1 (at least for you – we still have a wine and beer tasting and dinner :-)). If you are interested in another view of what's happening here, be sure to check out Sandy Kemsley's blog: she is covering the different presentations as well, and has a lot more experience doing this, having worked for many years as an independent analyst on BPM and related technologies 🙂