Requires Java 8.
Issues which made ProM incompatible with Java 9/10 have been fixed:
- The Framework now includes support for commons-logging (by default, this support is not present on Java 9/10).
- The Framework now comes with its own class loader, which should be used by default. In Java 9/10, the default class loader no longer subclasses URLClassLoader, which ProM requires. The new ProM class loader does subclass URLClassLoader.
- The Package Manager now tries two alternatives for obtaining the amount of memory (RAM) in the local computer. If both fail, 1 GB is assumed. The Package Manager of ProM 6.7 crashes on Java 9/10 while attempting to obtain the amount of memory.
The Framework itself requires Java 7, but some of the ProM packages require Java 8 (mostly for JavaFX).
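The two-alternative memory probe with a 1 GB fallback can be sketched as follows. This is an illustrative assumption, not the actual Package Manager code: the `MemoryProbe` class, the probe order, and the use of the `com.sun.management` bean are all stand-ins.

```java
import java.lang.management.ManagementFactory;
import java.util.function.LongSupplier;

public class MemoryProbe {

    static final long ONE_GB = 1024L * 1024L * 1024L;

    // Tries each probe in turn and returns the first positive answer;
    // if every probe fails, 1 GB is assumed to be available.
    public static long totalMemory(LongSupplier... probes) {
        for (LongSupplier probe : probes) {
            try {
                long bytes = probe.getAsLong();
                if (bytes > 0) {
                    return bytes;
                }
            } catch (Throwable t) {
                // Probe unavailable on this JVM; try the next one.
            }
        }
        return ONE_GB; // Safe default when no probe works.
    }

    public static void main(String[] args) {
        long bytes = totalMemory(
            // One possible probe: the com.sun.management extension,
            // available on common JDKs but not guaranteed everywhere.
            () -> ((com.sun.management.OperatingSystemMXBean)
                       ManagementFactory.getOperatingSystemMXBean())
                       .getTotalPhysicalMemorySize());
        System.out.println(bytes + " bytes of RAM detected (or assumed)");
    }
}
```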
New Established packages
A (block-structured) graph layout algorithm based on the reduction rules for free-choice Petri nets. Works very well on block-structured graphs, such as block-structured Petri nets, but may get messy if the graph (or Petri net) is not block-structured.
A classifier algorithm for event logs (classifies 194 out of 200 traces correctly in the Process Discovery Contest 2017). The classifier uses rule-based models (named log skeletons) as its basis. These log skeletons can be discovered from an event log, or event logs can be visualized through these log skeletons.
PDC2016 and PDC2017
All event logs as used for the Process Discovery Contest 2016 and 2017. This way, you do not have to download these logs, as they are available in ProM 6.8.
The SLOG package offers support for importing logs in 3 simple, easy-to-read plain-text formats: simple log (.slog), simple resource log (.srlog), and TR log (.tr). The package includes plugins to import these 3 formats, as well as a graphical ProM editor to create XLogs on the fly using the same formats.
The Statechart Workbench can be used to discover models with hierarchy, recursion and cancellation from ordinary event logs. The models can be displayed as statecharts, sequence diagrams, Petri nets and other visual formalisms. The Statechart Workbench is a process exploration workflow tool and it provides an easy and guided approach to apply a collection of advanced algorithms.
A reference implementation of the IEEE XES Standard. It uses different interfaces than OpenXES does.
A two-way bridge between XESStandard and OpenXES. Includes importers (from IEEE XES files to OpenXES) and exporters (from OpenXES to IEEE XES files).
New RunnerUp packages
Offers a novel log visualizer: the auto-association plot. The auto-association plot gives insight into how structured the behavior in the log is by plotting the “correlation” of the activities at timestep t with the activities at timesteps t-k. All values of k are on the horizontal axis, while the association/correlation is shown on the vertical axis.
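One simple lag-k association measure, sketched below, is the fraction of positions whose activity recurs exactly k steps earlier; the package's actual “correlation” definition is not given in this text and may well differ, so treat `association` as an illustrative stand-in.

```java
import java.util.Arrays;
import java.util.List;

public class AutoAssociation {

    // Fraction of positions t whose activity equals the activity k steps
    // earlier. This equality-based measure is only a stand-in for the
    // package's actual association/"correlation" definition.
    public static double association(List<String> activities, int k) {
        int matches = 0, comparisons = 0;
        for (int t = k; t < activities.size(); t++) {
            comparisons++;
            if (activities.get(t).equals(activities.get(t - k))) {
                matches++;
            }
        }
        return comparisons == 0 ? 0.0 : (double) matches / comparisons;
    }

    public static void main(String[] args) {
        // A perfectly periodic trace peaks at its period (k = 2).
        List<String> trace = Arrays.asList("a", "b", "a", "b", "a", "b");
        for (int k = 1; k <= 3; k++) {
            System.out.println("k=" + k + ": " + association(trace, k));
        }
    }
}
```

Plotting such values for all k on the horizontal axis yields the kind of structure profile the visualizer shows: a strongly structured log produces pronounced peaks, an unstructured one a flat line.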
In this package we have a novel and versatile algorithm to optimally fold a linear layout of a graph such that it can be drawn effectively in a specified aspect ratio, while still clearly communicating the linearity of the layout. The algorithm allows vertices to be drawn as blocks or rectangles of specified sizes to incorporate different drawing styles, label sizes, and even recursive structures. For reasonably-sized drawings the folded layout can be computed interactively.
In this package, we demonstrate the applicability of the linear layout algorithm on graphs that represent process trees, a particular type of process model. Our algorithm arguably produces much more readable layouts than existing methods.
New Starter packages
The ActiTraC package contains a plugin for Active Trace Clustering. It can be used to divide an event log into a number of clusters, such that the fitness between the traces in each cluster and a process model mined from these traces is optimized. Some of the settings of the learning algorithm and the underlying process discovery technique can be configured.
Offers a collection of techniques to filter activities from event logs based on their degree of structuredness instead of on their frequency. Unstructured activities can distort directly-follows relations, and having such activities in your log can therefore ruin the results of process discovery. All filters implemented in this package are described in the following recent publication in Journal of Intelligent Information Systems: https://doi.org/10.1007/s10844-018-0507-6
Boudewijn van Dongen
In this package we have a layout algorithm that can be used on a Petri net, an accepting Petri net, a BPMN diagram, and a transition system. The layout algorithm uses horizontal and vertical blocks to lay out all the objects in the provided directed graph. The algorithm changes only the positions of the nodes and the positions of the edge points. To obtain the horizontal and vertical blocks, the well-known reduction rules for free-choice Petri nets (Desel and Esparza) are used as a basis. To ensure that we have a free-choice Petri net, the directed graph (be it a Petri net, an accepting Petri net, a BPMN diagram, or a transition system) is first converted into a free-choice Petri net by replacing every node with a place and every edge with a transition with two arcs. Costs are associated with the application of every reduction rule. The cheapest applicable reduction rule is applied, building either a vertical or a horizontal block, until only a single node remains. The nodes and edges are then positioned according to the constructed blocks.
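The conversion step described above can be sketched as follows; the class and the string-based net representation are illustrative, not the package's actual data structures. Note that because every transition in the result has exactly one input place, the net is trivially free-choice.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class FreeChoiceConversion {

    // Every node of the directed graph becomes a place, and every edge
    // (src, tgt) becomes a transition with one arc from place src and
    // one arc to place tgt.
    public static Map<String, List<String>> toFreeChoiceNet(
            List<String> nodes, List<String[]> edges) {
        List<String> places = new ArrayList<>(nodes);
        List<String> transitions = new ArrayList<>();
        List<String> arcs = new ArrayList<>();
        for (String[] edge : edges) {
            String t = "t_" + edge[0] + "->" + edge[1];
            transitions.add(t);
            arcs.add(edge[0] + " -> " + t); // arc: place to transition
            arcs.add(t + " -> " + edge[1]); // arc: transition to place
        }
        Map<String, List<String>> net = new HashMap<>();
        net.put("places", places);
        net.put("transitions", transitions);
        net.put("arcs", arcs);
        return net;
    }

    public static void main(String[] args) {
        // A three-node chain a -> b -> c becomes 3 places, 2 transitions,
        // and 4 arcs.
        Map<String, List<String>> net = toFreeChoiceNet(
            Arrays.asList("a", "b", "c"),
            Arrays.asList(new String[]{"a", "b"}, new String[]{"b", "c"}));
        System.out.println(net);
    }
}
```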
The ChangePatterns package provides several plugins for injecting control-flow change patterns (e.g., remove, delete and swap activities) into process models (e.g., process trees). As a result, the control-flow structure of the process models is modified.
This plugin allows inferring the case identifier (case id) attribute from CSV log files and produces XES event logs in which the case id is automatically identified. The implemented approach is suitable for log files produced by process-unaware software systems, where the case id is a hidden attribute in the log. By evaluating the grouping ratios and the control-flow quality dimensions (fitness, precision, simplicity, and generalization) for the different log attributes, this plugin identifies the case id and enables process mining on a wider range of log files (cf. https://doi.org/10.1007/978-3-319-91704-7_6).
DeclareAnalyzer allows users to check the conformance of an event log with respect to Multi-Perspective Declare (MP-Declare), the extension of the Declare language that supports constraints over data payloads and timestamps.
The ExpertTraceClustering package contains plugins for active trace clustering using expert knowledge. It can be used to divide an event log into a number of clusters, based on expert knowledge. The goal of the technique is to optimize correspondence between a process model mined to represent each cluster, and the traces in that cluster. The expert knowledge can come in the form of a (partial) pre-clustering, or trace-level constraints. Some of the settings of the learning algorithm and the underlying process discovery technique can be configured.
Genetic Miner Using Partial Knowledge is a package that produces an initial population for a genetic algorithm based on partial knowledge of the model about to be mined. The knowledge is provided by the user in the form of a Petri net. The plugin then uses the Genetic Miner to mine a model using the generated initial population.
The Hybrid Miner plugin allows for the discovery of Hybrid Petri nets, a new class of Petri nets combining the classic Petri net formal dependencies (normal Petri net arcs) with informal annotations (sure and unsure arcs). Intuitively, whenever there is enough structure and evidence in the data, explicit routing constructs are formally represented in the Petri net, otherwise informal constructs are used.
Indulpet Miner is a business process discovery technique that combines several other techniques: it applies Inductive Miner, bottom-up recursion, Local Process Models and the Evolutionary Tree Miner.
In this package we provide plug-ins that enable interactive process mining. Most of the plug-ins are based on the synthesis rules (by Desel and Esparza) that enable modeling/discovery of free-choice workflow nets. The data from the event log is projected on top of the workflow nets to guide the user. The user can model duplicate tasks, silent tasks, etc., and the synthesis rules engine in the background guarantees the soundness of the discovered process models.
Offers a set of techniques to evaluate a set of Local Process Models (which can be mined with techniques available in the LocalProcessModelDiscovery package). In contrast to the standard evaluation techniques for Local Process Models, which evaluate per model, these techniques evaluate a collection of Local Process Models as a whole. Furthermore, it offers a set of techniques to post-process a mined collection of Local Process Models such that the quality of the whole set of Local Process Models improves. The package implements the techniques described in: https://arxiv.org/abs/1712.04159
SUBDUE pattern mining is a pattern mining technique that is orthogonal to Local Process Models in the sense that they can complement each other in the log insights that you obtain. This package implements techniques to constrain the Local Process Model search space in a lossless way such that results can be obtained faster, by extracting constraints on Local Process Models from a set of SUBDUE patterns that you provide as additional input. The package implements the techniques described in: http://ceur-ws.org/Vol-2016/paper2.pdf
One of the main challenges in applying process mining to real event data is the presence of noise and rare behaviour. Applying process mining algorithms directly to raw event data typically results in complex, incomprehensible, and, in some cases, even inaccurate analyses. The Log Filtering package provides plug-ins that let users separate outlier behaviour from general behaviour. Using the Repair Event Log plug-in, noisy traces can also be modified. In general, most plug-ins in this package need an event log as input and return a filtered event log without (or with only) outlier behaviour.
In this package we assume a scenario where the alignment cost function is set to epsilon, 0, 1, and 0 for model, synchronous, log, and silent moves, respectively. We provide an algorithm, based on computing the transitive closure of the marking graph, for computing alignments. We exploit the specific cost function by performing a precomputation step on the model before aligning with log traces.
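The assumed cost function can be written down directly; the concrete value of epsilon below is a hypothetical choice, as the text only says it is a small tie-breaker.

```java
public class AlignmentCost {

    // Hypothetical tie-breaking value; the package's actual epsilon is
    // not specified in the text above.
    public static final double EPSILON = 1e-6;

    public enum Move { MODEL, SYNCHRONOUS, LOG, SILENT }

    // The assumed cost function: epsilon, 0, 1, 0 for model,
    // synchronous, log, and silent moves, respectively.
    public static double cost(Move move) {
        switch (move) {
            case MODEL:       return EPSILON;
            case SYNCHRONOUS: return 0.0;
            case LOG:         return 1.0;
            case SILENT:      return 0.0;
            default:          throw new IllegalArgumentException();
        }
    }

    public static void main(String[] args) {
        for (Move m : Move.values()) {
            System.out.println(m + " costs " + cost(m));
        }
    }
}
```

Under this function, an alignment's total cost is essentially the number of log moves (skipped log events), with epsilon nudging the search toward alignments that use fewer model moves.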
MobuconLDL is exactly the same as MobuconLTL, except that here constraints can be represented in LDL, which is more expressive than LTL and can be used to express constraints over constraints (meta-constraints).
Giacomo Lanciano and Massimiliano de Leoni
This package allows mining decision models in the form of decision requirements graphs (DRGs), as formalized by the Decision Model and Notation (DMN) standard. The technique does not incorporate control-flow information (e.g., places), but is consistent with the behavioural information in the event log. The decision models capture decisions over loops and long-distance dependencies, and handle autocorrelations as well as case attributes. The results are bundled per trace cluster supporting compatible decisions, or per top-level decision node and its decision model variants over different sets of traces.
Jonathan Wai Lam Lee
Another plug-in to compute precision, based on alignments.
This package offers functionality to use a Petri net as a sequence model in the machine-learning sense, i.e., to use a Petri net to make a series of consecutively dependent predictions. Furthermore, it offers functionality to make train/test splits on logs, and functionality to evaluate a Petri net that is used as a sequence model using the Brier score, a common evaluation measure from the machine-learning domain. The theory behind the content of this package is described in: https://doi.org/10.1007/978-3-319-91704-7_11
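For the binary case, the Brier score is simply the mean squared difference between predicted probabilities and the observed 0/1 outcomes; a minimal sketch follows (the package may use the multi-class generalization over next-activity distributions):

```java
public class BrierScore {

    // Mean squared difference between predicted probabilities and the
    // observed 0/1 outcomes; 0 is a perfect prediction, lower is better.
    public static double brier(double[] predicted, int[] outcomes) {
        if (predicted.length != outcomes.length) {
            throw new IllegalArgumentException("length mismatch");
        }
        double sum = 0.0;
        for (int i = 0; i < predicted.length; i++) {
            double diff = predicted[i] - outcomes[i];
            sum += diff * diff;
        }
        return sum / predicted.length;
    }

    public static void main(String[] args) {
        // Two predictions of "the next activity is X", with outcomes 1, 0:
        System.out.println(
            brier(new double[]{0.5, 0.0}, new int[]{1, 0})); // prints 0.125
    }
}
```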
The StreamBasedEventFilter package represents an implementation of the algorithm described in the paper entitled “Filtering Spurious Events from Event Streams of Business Processes” by Sebastiaan J. van Zelst et al., which was published at CAiSE’18 (https://doi.org/10.1007/978-3-319-91704-7_11). The package contains a plugin that takes an event stream (XSEventStream object) as input and produces an output event stream. The output event stream potentially omits certain events of the input stream if these events seem to be related to spurious behaviour. Note that at this point the user cannot change the parameters of the filter; this functionality is integrated in the RapidProM release of the filter.
This package contains code/plugins that allow the user to transform a live event stream into an event log. The user can use a simple sliding window as a basis, a prefix-tree-based supplementary storage, or reservoir sampling. The plugins result in a “Stream Reader” entity. The current visualizer of the plugin does not allow the user to effectively query the reader for an event log. This functionality is provided in the RapidProM framework, where the corresponding plugins are provided as an iterative subprocess on top of a given static event stream. Note that the code in this package is mainly suited for academic experiments.
Acts as a “library” package for the StreamBasedEventLog package. Allows us to construct “Event Stores” as described in Chapter 3 of the PhD thesis of S.J. van Zelst.
Bas van Zelst and Andrea Burattin