Dual-Level Knowledge Distillation Via Knowledge Alignment and Correlation

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2024)

Keywords
Correlation, Knowledge engineering, Task analysis, Standards, Network architecture, Prototypes, Training, Convolutional neural networks, dual-level knowledge, knowledge distillation (KD), representation learning, teacher-student model
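The title and keywords indicate a teacher-student knowledge distillation (KD) setup. As background only, the sketch below shows a generic Hinton-style KD objective (hard-label cross-entropy plus temperature-softened KL divergence to the teacher); it is a minimal illustration, not the paper's dual-level alignment and correlation method, and the function name and hyperparameters T and alpha are assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic teacher-student KD objective (illustrative, not the
    paper's dual-level method): cross-entropy on ground-truth labels
    plus KL divergence between temperature-softened distributions."""
    # Hard-label supervision from the ground truth.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label supervision from the teacher; the T**2 factor keeps
    # gradient magnitudes comparable across temperatures.
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T ** 2)
    return alpha * ce + (1.0 - alpha) * kl
```

In practice the teacher's logits are computed under torch.no_grad() so that only the student receives gradients.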