Python libraries required:
six (1.16.0)
numpy (1.26.4)
scipy (1.13.0)
scikit-learn (0.23.0)
cvxopt (1.3.2)
pandas (2.2.2)
ranking (0.3.2)
statsmodels (0.14.2)
matplotlib (3.8.4)
tensorflow (1.15.4)
requirements.txt lists all these libraries. To install:
pip install -r requirements.txt
Installation with pip:
Execute the following to install the library from git.
pip install git+https://github.com/shubhomoydas/ad_examples.git
To check the installed library version:
pip list | grep ad-examples
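The same check can be done from within Python using the standard library's importlib.metadata (available in Python 3.8+); this is a generic pattern, not code from this repository:

```python
from importlib import metadata

# Look up the installed distribution by its pip name ("ad-examples").
try:
    msg = "ad-examples version: " + metadata.version("ad-examples")
except metadata.PackageNotFoundError:
    msg = "ad-examples is not installed"
print(msg)
```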
IMPORTANT: In order for the logs and plots to be generated by the illustrative examples below, make sure that the current working directory has a temp folder.
To run demo_aad:
python -m ad_examples.aad.demo_aad
Check output:
baseline found:
[0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 3, 3, 3, 3, 3, 4, 4, 5, 6, 6, 6, 6, 7, 8, 8, 8]
AAD found:
[0, 0, 1, 1, 1, 1, 1, 1, 1, 2, 3, 4, 5, 5, 6, 7, 7, 7, 8, 9, 9, 9, 10, 11, 12, 13, 14, 14, 14, 15]
To uninstall:
pip uninstall ad-examples
Jupyter notebook usage:
See test_aad.ipynb for sample notebook usage. This notebook should run without the pip install step since the package ad_examples is directly under the notebook's working folder.
Note(s):
- The code has been tested with python 3.6+.
- Although the package has a dependency on tensorflow, tensorflow is not required for AAD and hence will not be installed automatically.
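Code that wants to degrade gracefully when tensorflow is absent can use a guarded import along these lines (an illustrative pattern, not code from this repository):

```python
# Guarded import: tensorflow is only needed for the deep-learning
# examples, so its absence should not break the AAD code paths.
try:
    import tensorflow as tf
    HAVE_TF = True
except ImportError:
    tf = None
    HAVE_TF = False

print("tensorflow available:", HAVE_TF)
```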
This repository includes, among other examples, my own original research in active learning and data drift detection:
- AAD and HiLAD: Human in the Loop Anomaly Discovery (cite) (Das, Islam, et al. 2024), (Das, Wong, et al. 2020), (Das, Wong, et al. 2016), (Das, Wong, et al. 2017)
- GLAD: GLocalized Anomaly Detection (cite) (Islam, Das, et al. 2020)
- Data drift detection (cite) (Das, Islam, et al. 2019)
Anomaly Detection Examples
This is a collection of anomaly detection examples for detection methods popular in academic literature and in practice. I will include more examples as and when I find time.
Some techniques covered are listed below. These are a mere drop in the ocean of all anomaly detectors and are only meant to highlight some broad categories. Apologies if your favorite one is currently not included -- hopefully in time...
- i.i.d setting:
- Standard unsupervised anomaly detectors (Isolation Forest, LODA, One-class SVM, LOF)
- Clustering and density-based
- Density estimation based
- PCA Reconstruction-based
- Autoencoder Reconstruction-based
- Classifier and pseudo-anomaly based
- Ensemble/Projection-based
- A demonstration of outlier influence
- Spectral-based code
- timeseries (Jump to illustrations)
- Forecasting-based
- Exploratory Analysis
- ARIMA
- Regression (SVM, Random Forest, Neural Network)
- Recurrent Neural Networks (RNN/LSTM)
- i.i.d
- Windows/Shingle based (Isolation Forest, One-class SVM, LOF, Autoencoder)
- human-in-the-loop (active learning)
- Active Anomaly Discovery (batch setup, streaming setup) -- Includes plots and illustrations (see sections below)
- High-level summary of the approach
- Cite this work
- Jump right in: General instructions on running AAD
- Descriptions and Interpretability: Generating anomaly descriptions with tree-based ensembles
- Bayesian Rulesets with AAD
- Query strategies: Diversifying query instances using the descriptions and its evaluation
- GLAD: GLocalized Anomaly Detection (glad_batch.py)
- Aside: When we have a lot of labeled data (both anomalies and nominals), should we employ a classifier instead of an anomaly detector?
- Some properties of different tree-based detectors
- Running AAD with precomputed ensemble scores
- API Usage: How to employ AAD in your own application
- Comparing AAD with related work
- Data drift detection and model update with streaming data
- Aside: Applying drift detection to tree-based classifiers
- A bit of theoretical intuition
- Generative Adversarial Nets (GAN) based Anomaly Detection
- Anomaly injection by adversarial behavior and Graph Convolutional Networks
- Reducing activity sequences to i.i.d -- This illustrates an approach that is becoming increasingly popular as a starting-point for anomaly detection on activity sequences and transfer learning.
There are other important data types/scenarios such as static and dynamic graphs ((Akoglu, Tong, Koutra 2015), (Bhatia, S. et al. 2020)) where anomaly detection is highly relevant for real-world applications, but which are not covered in this repository. Interested readers may instead refer to the references provided.
There are multiple datasets (synthetic/real) supported. Change the code to work with whichever dataset or algorithm is desired. Most of the demos will output pdf plots under the 'temp' folder when executed.
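As a minimal, self-contained illustration of one of the standard unsupervised detectors listed above, the sketch below runs scikit-learn's IsolationForest on synthetic data; the dataset and the top-k inspection are made up for illustration:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.RandomState(42)
# Synthetic data: a dense nominal cluster plus a few scattered anomalies.
nominals = rng.normal(loc=0.0, scale=1.0, size=(200, 2))
anomalies = rng.uniform(low=-6.0, high=6.0, size=(10, 2))
x = np.vstack([nominals, anomalies])

iforest = IsolationForest(n_estimators=100, random_state=42).fit(x)
# score_samples() returns higher values for inliers; negate so that
# higher score = more anomalous.
scores = -iforest.score_samples(x)
top10 = np.argsort(-scores)[:10]
print("Top-10 most anomalous indices:", top10)
```

Since the anomalous points (indices 200-209) lie far from the nominal cluster, most of the top-ranked indices should come from that range.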
AUC is the most common metric used to report anomaly detection performance. See here for a complete example with standard datasets.
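With scikit-learn, AUC can be computed directly from raw anomaly scores; the labels and scores below are made-up toy values:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Toy ground truth (1 = anomaly, 0 = nominal) and detector scores,
# where higher scores mean "more anomalous".
y_true = np.array([0, 0, 0, 1, 0, 1, 0, 0, 1, 0])
scores = np.array([0.1, 0.3, 0.2, 0.9, 0.4, 0.8, 0.1, 0.2, 0.35, 0.3])

# One anomaly (score 0.35) ranks below one nominal (score 0.4),
# so 20 of the 21 anomaly-nominal pairs are ordered correctly.
auc = roc_auc_score(y_true, scores)
print("AUC: %.4f" % auc)  # -> AUC: 0.9524
```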
To execute the code:
- Run code from the checkout folder. The outputs will be generated under the 'temp' folder.
- To avoid import errors, make sure that PYTHONPATH is configured correctly to include the ad_examples source dir: .:/usr/local/lib/python
- The run commands are at the top of the python source code files.
- Check the log file in the temp folder. Usually it will be named <demo_code>.log. Timeseries demos will output logs under the timeseries folder.
Active Anomaly Discovery (AAD)
This codebase replaces the older 'pyaad' project (https://github.com/shubhomoydas/pyaad). It implements an algorithm (AAD) to actively explore anomalies.
Motivation and intuition
Our motivation for exploring active anomaly detection with ensembles is presented in Motivations.md.
Approach
The approach is explained in more detail in (Das, S., Islam, R., et al. 2019).
Demonstration of the basic idea
Assuming that the ensemble scores have already been computed, the demo code percept.py implements AAD in a simplified manner.
To run percept.py:
python -m ad_examples.percept.percept
The above command will generate a pdf file with plots illustrating how the data was actively labeled.
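The core idea (score each instance by a weighted linear combination of ensemble member scores, then adjust the weights using the analyst's labels) can be sketched in numpy. The update rule and the label oracle below are simplified, hypothetical stand-ins for AAD's actual optimization:

```python
import numpy as np

rng = np.random.RandomState(0)
n, m = 100, 4                    # 100 instances, 4 ensemble members
scores = rng.rand(n, m)          # per-member anomaly scores (higher = more anomalous)
w = np.ones(m) / np.sqrt(m)      # uniform initial weights

def label_oracle(i):
    # Hypothetical analyst: calls an instance anomalous if its mean
    # ensemble score is high (a stand-in for ground truth).
    return 1 if scores[i].mean() > 0.7 else 0

queried = set()
for _ in range(10):
    combined = scores.dot(w)                        # combined anomaly score
    order = np.argsort(-combined)
    i = next(j for j in order if j not in queried)  # top-ranked unlabeled instance
    queried.add(i)
    y = label_oracle(i)
    # Simplified update: push weights toward the scores of instances
    # labeled anomalous, away from those labeled nominal.
    step = 0.1 if y == 1 else -0.1
    w = np.maximum(w + step * scores[i], 0.0)
    w = w / np.linalg.norm(w)    # keep weights on the unit sphere
print("final weights:", w)
```

Real AAD replaces the ad-hoc step above with a constrained optimization that keeps labeled anomalies ranked above a quantile of the combined scores.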
Reference(s):
- Das, S., Islam, R., Jayakodi, N.K. and Doppa, J.R. (2024). Effectiveness of Tree-based Ensembles for Anomaly Discovery: Insights, Batch and Streaming Active Learning. Journal of Artificial Intelligence Research 80 (2024) 127-172. (pdf) (This is the most comprehensive version.)
- Das, S., Wong, W-K., Dietterich, T., Fern, A. and Emmott, A. (2020). Discovering Anomalies by Incorporating Feedback from an Expert. ACM Transactions on Knowledge Discovery from Data (TKDD) 14, 4, Article 49 (July 2020), 32 pages. DOI: https://doi.org/10.1145/3396608.
- Islam, R., Das, S., Doppa, J.R. and Natarajan, S. (2020). GLAD: GLocalized Anomaly Detection via Human-in-the-Loop Learning. Workshop on Human in the Loop Learning at the 37th International Conference on Machine Learning (ICML). (pdf)
- Das, S., Islam, R., Jayakodi, N.K. and Doppa, J.R. (2018). Active Anomaly Detection via Ensembles. (pdf)
- Das, S., Wong, W-K., Fern, A., Dietterich, T. and Siddiqui, A. (2017). Incorporating Feedback into Tree-based Anomaly Detection. KDD Interactive Data Exploration and Analytics (IDEA) Workshop. (pdf) (presentation)
- Das, S., Wong, W-K., Dietterich, T., Fern, A. and Emmott, A. (2016). Incorporating Expert Feedback into Active Anomaly Discovery. In Proceedings of the IEEE International Conference on Data Mining. (pdf) (presentation)
- Das, S. (2017). Incorporating User Feedback into Machine Learning Systems. PhD Thesis. (pdf) -- The work on AAD in this repository was developed during my PhD and post-doctoral research.
- Akoglu, L., Tong, H. and Koutra, D. (2015). Graph based anomaly detection and description: a survey. Data Mining and Knowledge Discovery. (pdf)
- Bhatia, S., Hooi, B., Yoon, M., Shin, K. and Faloutsos, C. (2020). MIDAS: Microcluster-Based Detector of Anomalies in Edge Streams. (pdf) (code)
Cite this work
If you find this repository useful or use it in your own work, please cite it with the following BibTeX references:
@article{das:2020,
author = {Das, Shubhomoy and Wong, Weng-Keen and Dietterich, Thomas and Fern, Alan and Emmott, Andrew},
title = {Discovering Anomalies by Incorporating Feedback from an Expert},
year = {2020},
issue_date = {July 2020},
publisher = {Association for Computing Machinery},
volume = {14},
number = {4},
issn = {1556-4681},
url = {https://doi.org/10.1145/3396608},
doi = {10.1145/3396608},
journal = {ACM Trans. Knowl. Discov. Data},
month = jun,
articleno = {49},
numpages = {32}
}
@article{das:2024,
author = {Shubhomoy Das and Md Rakibul Islam and Nitthilan Kannappan Jayakodi and Janardhan Rao Doppa},
title = {Effectiveness of Tree-based Ensembles for Anomaly Discovery: Insights, Batch and Streaming Active Learning},
year = {2024},
issue_date = {May 2024},
volume = {80},
journal = {J. Artif. Int. Res.},
month = {may},
numpages = {46},
pages = {127--172}
}
@misc{github:shubhomoydas:ad_examples,
author = {Shubhomoy Das},
title = {Active Anomaly Discovery},
year = {2018},
howpublished = {\url{https://github.com/shubhomoydas/ad_examples}},
note = {[Online; accessed 19-Sep-2018]}
}
Other publications may be cited as:
@inproceedings{islam:2020b,
author = {Md Rakibul Islam and Shubhomoy Das and Janardhan Rao Doppa and Sriraam Natarajan},
title = {GLAD: GLocalized Anomaly Detection via Human-in-the-Loop Learning},
year = {2020},
booktitle={ICML Workshop on Human in the Loop Learning},
howpublished = {\url{https://arxiv.org/abs/1810.01403}},
note = {[Online; accessed 15-Jul-2020]}
}
@article{das:2018a,
author = {Shubhomoy Das and Md Rakibul Islam and Nitthilan Kannappan Jayakodi and Janardhan Rao Doppa},
title = {Active Anomaly Detection via Ensembles},
year = {2018},
journal = {arXiv:1809.06477},
howpublished = {\url{https://arxiv.org/abs/1809.06477}},
note = {[Online; accessed 19-Sep-2018]}
}
@inproceedings{das:2016,
author={Shubhomoy Das and Weng-Keen Wong and Thomas G. Dietterich and Alan Fern and Andrew Emmott},
title={Incorporating Expert Feedback into Active Anomaly Discovery},
booktitle={IEEE ICDM},
year={2016}
}
@inproceedings{das:2017,
author={Shubhomoy Das and Weng-Keen Wong and Alan Fern and Thomas G. Dietterich and Md Amran Siddiqui},
title={Incorporating Expert Feedback into Tree-based Anomaly Detection},
booktitle={KDD IDEA Workshop},
year={2017}
}
Running AAD
This codebase is my research platform. The main bash script aad.sh makes it easier to run all AAD experiments multiple times (in the spirit of scientific inquiry) so that final results can be averaged. I try