After cloning, move into the repository and:

- create a conda environment using `conda env create -f environment.yml`
- activate the environment using `conda activate audit_vis`
- run `pip install .`
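Put together, the setup is:

```bash
# From the repository root:
conda env create -f environment.yml   # create the environment
conda activate audit_vis              # activate it
pip install .                         # install the package
```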
Then, download `results.tar.gz`, `checkpoints.tar.gz` and `images.tar.gz` from our Zenodo archive, untar them, and move the resulting folders inside the repository:
- `checkpoints` contains the weights of the null and anomalous models
- `images` contains the images we used to compute the explanations
- `results` contains precomputed visualizations, anomaly scores, and the final results of the paper
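A minimal extraction step, assuming the three archives have been downloaded to the repository root and each unpacks to a folder of the same name:

```bash
# Extract each archive in place at the repository root
for archive in checkpoints.tar.gz images.tar.gz results.tar.gz; do
    tar -xzf "$archive"
done
```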
The most important folders in this repository are:
- `visualize/`, which computes the model explanations and stores them in `results/visualizations/`
- `analysis/`, which uses precomputed explanations to compute the anomaly scores and stores them in `results/anomaly_scores/`
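For orientation, the resulting layout is roughly the following (a sketch inferred from the paths above):

```
.
├── visualize/              # computes explanations
├── analysis/               # computes anomaly scores and final results
├── checkpoints/            # null and anomalous model weights
├── images/                 # input images for the explanations
└── results/
    ├── visualizations/     # written by visualize/
    └── anomaly_scores/     # written by analysis/
```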
In each of these two folders (`visualize/` and `analysis/`), there is:
- a `detection_script.py` script, which runs the jobs relevant for the detection task
- a `localization_script.py` script, which runs the jobs relevant for the localization task
These four scripts use `submitit` to schedule SLURM jobs on a cluster with GPUs (a minimal sketch of this pattern follows the list below). Before running each script:
- set `partition_name` to the name of your partition
- for the `visualize` scripts, set the `list_anomalies` and `list_methods` variables
- for the `analysis` scripts, set the `list_anomalies`, `list_lpips_nets` and `list_methods` variables
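Concretely, the scheduling pattern these scripts follow looks roughly like the sketch below; `compute_explanations` and all the values are placeholders, not the scripts' actual code:

```python
import submitit

def compute_explanations(anomaly: str, method: str) -> str:
    # Hypothetical job body; the real scripts define the actual work.
    return f"explained {anomaly} with {method}"

# Placeholders for the variables described above; fill in your own values.
partition_name = "gpu"
list_anomalies = ["anomaly_a", "anomaly_b"]
list_methods = ["method_x"]

# One SLURM job per (anomaly, method) pair, with logs under slurm_logs/
executor = submitit.AutoExecutor(folder="slurm_logs")
executor.update_parameters(
    slurm_partition=partition_name,
    gpus_per_node=1,
    timeout_min=60,
)

jobs = [
    executor.submit(compute_explanations, anomaly, method)
    for anomaly in list_anomalies
    for method in list_methods
]
outputs = [job.result() for job in jobs]  # blocks until every job finishes
```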
Finally, `analysis/get_results.py` computes the final results for the detection and localization tasks from the anomaly scores.
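Once the anomaly scores are in `results/anomaly_scores/`, the final numbers are produced with a plain Python invocation (we assume the script needs no extra arguments; adjust if your setup differs):

```bash
python analysis/get_results.py
```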