An Optimal Knowledge Distillation for Formulating an Effective Defense Model Against Membership Inference Attacks

International Journal of Advanced Computer Science and Applications (2024)

Key words
Knowledge distillation, membership inference attack, teacher model, student model, privacy-utility trade-off