ziqi-zhang/LLM_Distillation_Privacy


Membership and Memorization in LLM Knowledge Distillation

PyTorch implementation for the paper:
Membership and Memorization in LLM Knowledge Distillation
Ziqi Zhang, Ali Shahin Shamsabadi, Hanxiao Lu, Yifeng Cai, Hamed Haddadi

Scripts

The scripts for membership inference on distilled models are in scripts/model_family_6k. The scripts for memorization are in mia.
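As background for the membership-inference scripts, here is a minimal sketch of a classic loss-threshold membership-inference attack (this is not the repository's code; the function name and toy values are illustrative): a model typically assigns lower loss to its training members than to non-members, so samples whose loss falls below a threshold are flagged as members.

```python
import numpy as np

def loss_threshold_mia(losses, threshold):
    """Flag samples with loss below the threshold as training members.

    Classic loss-based membership inference: training members tend to
    receive lower loss from the model than held-out non-members.
    """
    return np.asarray(losses) < threshold

# Toy example: two low-loss (member-like) and two high-loss samples.
preds = loss_threshold_mia([0.1, 0.3, 2.5, 3.0], threshold=1.0)
# preds -> array([ True,  True, False, False])
```

The threshold is typically calibrated on held-out data (e.g., to a target false-positive rate) rather than fixed by hand as in this toy example.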

Environment Configuration

Please see README_distiLLM.md.

Acknowledgement

The code framework is based on DistiLLM and MiniLLM.
