Knowledge distillation (KD) aims to transfer knowledge from a larger teacher model to a smaller student model via soft labels, yielding an efficient neural network. In general, a model's performance is judged by accuracy, which is measured against labels. However, existing KD approaches usually use the teacher's original output distribution, overlooking the possibility that the teacher's prediction is incorrect. This contradicts the motivation of hard-label learning through the cross-entropy loss and may lead to sub-optimal knowledge distillation on certain samples. To address this issue, we propose a novel logit processing scheme based on a sorting mechanism. Specifically, our method has a two-fold goal: (1) correcting the teacher's incorrect predictions using the labels and (2) reordering the distribution in a natural way according to priority rank, both in a single step. As an easy-to-use, plug-and-play pre-processing step, our sorting method can be effectively applied to existing logit-based KD methods. Extensive experiments on the CIFAR-100 and ImageNet datasets demonstrate the effectiveness of our method.
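The PyTorch sketch below is only a rough illustration of what such a pre-processing step could look like, not the official implementation: per sample, the teacher logits are permuted so that the ground-truth class receives the largest value while the relative ranking of the remaining classes is kept, and the permuted logits are then fed to a standard temperature-scaled KD loss. The names sort_teacher_logits and kd_loss and the temperature T=4.0 are illustrative choices; refer to the paper and the configs in this repo for the actual method and hyper-parameters.

import torch
import torch.nn.functional as F

def sort_teacher_logits(logits, labels):
    # logits: (B, C) teacher logits, labels: (B,) ground-truth class indices.
    sorted_vals = logits.sort(dim=1, descending=True).values      # logit values, high to low
    rank = logits.argsort(dim=1, descending=True).argsort(dim=1)  # current rank of every class
    gt_rank = rank.gather(1, labels.unsqueeze(1))                 # rank held by the ground truth
    # Classes that outranked the ground truth slide down one position ...
    new_rank = torch.where(rank < gt_rank, rank + 1, rank)
    # ... and the ground-truth class takes the top rank.
    new_rank.scatter_(1, labels.unsqueeze(1), 0)
    return sorted_vals.gather(1, new_rank)

def kd_loss(student_logits, teacher_logits, labels, T=4.0):
    # Plug-and-play usage: pre-process the teacher logits, then apply the
    # usual temperature-scaled KL divergence of vanilla KD.
    p_t = F.softmax(sort_teacher_logits(teacher_logits, labels) / T, dim=1)
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T * T)

# Example: the teacher mispredicts class 1 although the label is class 2.
t = torch.tensor([[2.0, 5.0, 1.0]])
print(sort_teacher_logits(t, torch.tensor([2])))  # tensor([[1., 2., 5.]])

Note that this sketch only permutes the teacher's logit values, so the shape of the soft distribution itself is unchanged; this is one reading of "reordering the distribution in a natural way", and the paper's exact rule may differ.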
Environments:
- Python 3.6
- PyTorch 1.9.0
- torchvision 0.10.0
Install the package:
sudo pip3 install -r requirements.txt
sudo python setup.py develop
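To verify the editable install, a quick check (assuming the package installed by setup.py is importable as mdistiller, as the repo's directory layout suggests):

# Sanity check that the editable install succeeded.
import mdistiller
print(mdistiller.__file__)  # should point into this repo's checkout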
Wandb logger:
- Register for a wandb account at https://wandb.ai/home.
- If you don't want wandb as your logger, set CFG.LOG.WANDB to False in mdistiller/engine/cfg.py (see the one-line sketch below).
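For reference, the change is a single assignment in that file (a sketch; the surrounding config definitions are omitted and may differ):

# mdistiller/engine/cfg.py
CFG.LOG.WANDB = False  # turn off wandb logging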
Training on CIFAR-100:
Download cifar_teachers.tar from https://github.com/megvii-research/mdistiller/releases/tag/checkpoints and untar it into ./download_ckpts via tar xvf cifar_teachers.tar.
For KD:
python3 tools/train.py --cfg configs/cifar100/kd.yaml
For DKD:
python3 tools/train.py --cfg configs/cifar100/dkd/res32x4_res8x4.yaml
For CTKD & LSKD: Download CTKD and LSKD.
Training on ImageNet:
Download the dataset from https://image-net.org/ and put it in ./data/imagenet.
For KD:
python3 tools/train.py --cfg configs/imagenet/r34_r18/kd.yaml
For DKD:
python3 tools/train.py --cfg configs/imagenet/r34_r18/dkd.yaml
For CTKD & LSKD: Download CTKD and LSKD.
If this repo is helpful for your research, please consider citing the paper:
@article{sortkd25,
title={Parameter-Free Logit Distillation via Sorting Mechanism},
author={Limantoro, Stephen Ekaputra},
journal={IEEE Signal Processing Letters},
volume={32},
pages={3849--3853},
year={2025},
publisher={IEEE}
}

This work is based on mdistiller. Sincere gratitude to mdistiller, CTKD, and LSKD for their remarkable contributions.

