Author: Symeon Papadopoulos
InDistill: Information flow-preserving knowledge distillation for model compression
Towards Quantitative Evaluation of Explainable AI Methods for Deepfake Detection
MAD ’24: Proceedings of the 3rd ACM International Workshop on Multimedia AI against Disinformation (front matter)
MAD ’24 Workshop: Multimedia AI against Disinformation
Towards Optimal Trade-offs in Knowledge Distillation for CNNs and Vision Transformers at the Edge
Universal Local Attractors on Graphs
Leveraging Representations from Intermediate Encoder-blocks for Synthetic Image Detection
JGNN: Graph Neural Networks on Native Java
We introduce JGNN, an open source Java library to define, train, and run Graph Neural Networks (GNNs) under limited resources. The library is cross-platform and implements memory-efficient machine learning components without external dependencies. Model definition is simplified by parsing Python-like expressions, including interoperable dense and sparse matrix operations and inline parameter definitions. GNN models can be deployed on smart devices and trained on local data.
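As a purely illustrative sketch of the expression-based model definition described in the abstract, the snippet below shows how a small two-layer GNN classifier might be declared through a builder that parses Python-like expressions with inline parameter definitions. The class and method names (ModelBuilder, config, var, operation, out, getModel) and the expression syntax are assumptions made for this example and are not necessarily JGNN's actual API; the library's documentation should be consulted for the real interface.

    // Hypothetical sketch; identifiers below are assumptions, not verified JGNN API.
    ModelBuilder builder = new ModelBuilder()
            .config("features", 64)     // assumed: named constant usable inside expressions
            .config("classes", 7)
            .var("x")                   // node feature matrix supplied at run time
            // inline parameter definitions: matrix(rows, cols) and vector(dims)
            // create trainable weights directly inside the parsed expression
            .operation("h = relu(x @ matrix(features, 16) + vector(16))")
            .operation("yhat = softmax(h @ matrix(16, classes) + vector(classes))")
            .out("yhat");               // declare the model output
    Model model = builder.getModel();   // retrieve the parsed, trainable model

In a full GNN, the graph's (sparse) adjacency matrix would typically enter such expressions as an additional variable, relying on the interoperable dense and sparse matrix operations that the abstract mentions.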