(Optional) If the use of a [Prolog-based knowledge base](https://www.lamda.nju.edu.cn/abl_test/docs/build/html/Intro/Reasoning.html#prolog) is necessary, installing [SWI-Prolog](https://www.swi-prolog.org/) is also required:
For Linux users:
```bash
$ sudo apt-get install swi-prolog
```
For Windows and macOS users, please refer to the [SWI-Prolog Download Page](https://www.swi-prolog.org/Download.html).
We provide several examples in `examples/`. Each example is stored in a separate folder:
+ [Handwritten Equation Decipherment](https://github.com/AbductiveLearning/ABL-Package/tree/Dev/examples/hed)
We use the MNIST Addition task as a quick start example. In this task, pairs of MNIST handwritten digit images and their sums are given, along with a domain knowledge base containing information on how to perform addition. Our objective is to take a pair of handwritten images as input and accurately determine their sum (e.g., given images of the digits 3 and 5, the model should output 8).
### Working with Data
ABL-Package requires data in the format of `(X, gt_pseudo_label, Y)`, where `X` is a list of input examples containing instances, `gt_pseudo_label` is the ground-truth label of each example in `X`, and `Y` is the ground-truth reasoning result of each example in `X`. Note that `gt_pseudo_label` is only used to evaluate the machine learning model's performance, not to train it. If examples in `X` are unlabeled, `gt_pseudo_label` should be `None`.
In the MNIST Addition task, the data loading looks like:
```python
# The 'datasets' module below is located in 'examples/mnist_add/'
from datasets import get_dataset

# train_data and test_data are tuples in the format (X, gt_pseudo_label, Y).
# If get_pseudo_label is set to False, the gt_pseudo_label in each tuple will be None.
train_data = get_dataset(train=True, get_pseudo_label=True)
test_data = get_dataset(train=False, get_pseudo_label=True)
```
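For intuition, a single example in this format can be unpacked as follows (the concrete values in the comments are hypothetical illustrations):
```python
X, gt_pseudo_label, Y = train_data
# X[0] is a list of two MNIST images, e.g., a handwritten '7' and '2'
print(len(X[0]))
# gt_pseudo_label[0] holds the digit labels, e.g., [7, 2] (used only for evaluation)
print(gt_pseudo_label[0])
# Y[0] is the ground-truth reasoning result, e.g., 9 (the sum)
print(Y[0])
```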
### Building the Learning Part
The learning part is constructed by first defining a base model for machine learning. ABL-Package offers considerable flexibility, supporting any base model that conforms to the scikit-learn style (i.e., implements the `fit` and `predict` methods) or a PyTorch-based neural network (i.e., one with a defined architecture and an implemented `forward` method). In this example, we build a simple LeNet5 network as the base model.
```python
# The 'models' module below is located in 'examples/mnist_add/'
from models.nn import LeNet5

# The number of pseudo-labels is 10 (the digits 0-9)
cls = LeNet5(num_classes=10)
```
To facilitate uniform processing, ABL-Package provides the `BasicNN` class to convert a PyTorch-based neural network into a format compatible with scikit-learn models. To construct a `BasicNN` instance, aside from the network itself, we also need to define a loss function, an optimizer, and the computing device.
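A minimal sketch of this construction (the loss function, optimizer, and learning rate below are illustrative choices, not requirements of the package):
```python
import torch
from abl.learning import BasicNN

loss_fn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.RMSprop(cls.parameters(), lr=0.001)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Wrap the PyTorch network so it behaves like a scikit-learn model
base_model = BasicNN(cls, loss_fn, optimizer, device)
```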
The base model built above is trained to make predictions on instance-level data (e.g., a single image), while ABL deals with example-level data. To bridge this gap, we wrap the `base_model` into an instance of `ABLModel`. This class serves as a unified wrapper for base models, enabling the learning part to train, test, and predict on example-level data (e.g., the images that comprise an equation).
```python
from abl.learning import ABLModel
model = ABLModel(base_model)
```
### Building the Reasoning Part
To build the reasoning part, we first define a knowledge base by creating a subclass of `KBBase`. In the subclass, we initialize the `pseudo_label_list` parameter and override the `logic_forward` method, which specifies how to perform (deductive) reasoning, i.e., how to map the pseudo-labels of an example to the corresponding reasoning result. For the MNIST Addition task, `logic_forward` simply performs the sum operation.
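A minimal sketch of such a subclass (the class name `AddKB` is illustrative):
```python
from abl.reasoning import KBBase

class AddKB(KBBase):
    def __init__(self, pseudo_label_list=list(range(10))):
        super().__init__(pseudo_label_list)

    # Deductive reasoning: the reasoning result of an example is
    # the sum of its pseudo-labels (the two digits)
    def logic_forward(self, nums):
        return sum(nums)

kb = AddKB()
```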
Next, we create a reasoner by instantiating the `Reasoner` class, passing the knowledge base as a parameter. Due to the non-determinism of abductive reasoning, there may be multiple candidate pseudo-labels compatible with the knowledge base. In such cases, the reasoner minimizes inconsistency and returns the candidate with the highest consistency.
```python
from abl.reasoning import Reasoner
reasoner = Reasoner(kb)
```
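Assuming the optional `dist_func` parameter described in the package documentation, the measure used to rank candidate pseudo-labels can also be configured:
```python
# Assumed parameter: rank candidates by model confidence
# instead of the default hamming distance
reasoner = Reasoner(kb, dist_func="confidence")
```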
### Building Evaluation Metrics
ABL-Package provides two basic metrics, namely `SymbolAccuracy` and `ReasoningMetric`, which are used to evaluate the accuracy of the machine learning model's predictions and the accuracy of the `logic_forward` results, respectively.
```python
from abl.data.evaluation import ReasoningMetric, SymbolAccuracy

# Symbol-level accuracy of the model's predictions, and accuracy of
# the reasoning results obtained via logic_forward
metric_list = [SymbolAccuracy(), ReasoningMetric(kb=kb)]
```