Federated Self-supervised Learning (FedSSL)

Also known as Federated Unsupervised Representation Learning (FedU)

A common limitation of existing federated learning (FL) methods is that they rely heavily on labeled data on decentralized clients. We propose a federated self-supervised learning framework (FedSSL) to learn visual representations from decentralized data without labels.

This repository contains the code for two papers:

  • Divergence-aware Federated Self-Supervised Learning, ICLR'2022. [paper]
  • Collaborative Unsupervised Visual Representation Learning From Decentralized Data, ICCV'2021. [paper]

The framework implements four self-supervised learning (SSL) methods based on Siamese networks in a federated manner; a minimal sketch of the shared Siamese structure follows the list below:

  1. BYOL
  2. SimSiam
  3. MoCo (MoCoV1 & MoCoV2)
  4. SimCLR
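
All four methods share the same Siamese skeleton: two augmented views of an image go through an online branch (encoder plus projector, and for BYOL/SimSiam a predictor) and a second branch, and the online branch is trained to match the other one. The BYOL-style sketch below is only meant to illustrate that structure; the class and method names are not taken from model.py.

    import copy
    import torch
    import torch.nn.functional as F

    def byol_loss(p, z):
        # Negative cosine similarity between the online prediction p and the
        # stop-gradient target projection z (the BYOL/SimSiam objective).
        p = F.normalize(p, dim=-1)
        z = F.normalize(z.detach(), dim=-1)
        return 2 - 2 * (p * z).sum(dim=-1).mean()

    class BYOLSketch(torch.nn.Module):
        """Illustrative Siamese pair: online encoder + predictor vs. EMA target encoder."""

        def __init__(self, encoder, predictor, ema=0.99):
            super().__init__()
            self.online_encoder = encoder
            self.predictor = predictor
            self.target_encoder = copy.deepcopy(encoder)  # updated by EMA, not by SGD
            for p in self.target_encoder.parameters():
                p.requires_grad = False
            self.ema = ema

        @torch.no_grad()
        def update_target(self):
            for po, pt in zip(self.online_encoder.parameters(),
                              self.target_encoder.parameters()):
                pt.mul_(self.ema).add_(po, alpha=1 - self.ema)

        def forward(self, view1, view2):
            p1 = self.predictor(self.online_encoder(view1))
            p2 = self.predictor(self.online_encoder(view2))
            with torch.no_grad():
                z1 = self.target_encoder(view1)
                z2 = self.target_encoder(view2)
            # Symmetric loss over the two augmented views.
            return 0.5 * (byol_loss(p1, z2) + byol_loss(p2, z1))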

Training

You can run training with any of the FedSSL methods above as well as with our proposed FedEMA method.

Make sure the global model is saved during training; it is required for the evaluation steps below.
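
As a quick sanity check, the saved checkpoint (a '.pth' file, consumed later via --model_path) can be loaded back with torch.load; the path below is only an example, and the exact key layout of the checkpoint depends on how the server saves it.

    import torch

    # Example path only; use whatever path your training run wrote the global model to.
    state = torch.load("saved_models/fedema_global_model.pth", map_location="cpu")
    print(type(state))  # typically a state_dict (an OrderedDict of tensors) or a full model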

FedEMA

Run FedEMA with the auto scaler $\tau=0.7$:

python applications/fedssl/main.py --task_id fedema --model byol \
      --aggregate_encoder online --update_encoder dynamic_ema_online --update_predictor dynamic_dapu \
      --auto_scaler y --auto_scaler_target 0.7 2>&1 | tee log/${task_id}.log

Run FedEMA with constant weight scaler $\lambda=1$:

python applications/fedssl/main.py --task_id fedema --model byol \
      --aggregate_encoder online --update_encoder dynamic_ema_online --update_predictor dynamic_dapu \
      --weight_scaler 1 2>&1 | tee log/${task_id}.log
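
For orientation, as we read the ICLR'22 paper, FedEMA replaces plain model overwriting with a divergence-aware EMA on the client: $\theta_l \leftarrow \mu\,\theta_l + (1-\mu)\,\theta_g$ with $\mu = \min(\lambda \lVert\theta_g - \theta_l\rVert, 1)$, where $\lambda$ either comes from --weight_scaler or, with --auto_scaler y, is calibrated in the first round so that $\mu$ hits the target $\tau$ given by --auto_scaler_target. The sketch below paraphrases that rule; the function names are illustrative and not the repository's API.

    import torch

    def l2_divergence(global_sd, local_sd):
        # ||theta_g - theta_l||_2 over all matching parameters of the online encoder.
        total = 0.0
        for k in global_sd:
            total += (global_sd[k].float() - local_sd[k].float()).pow(2).sum().item()
        return total ** 0.5

    def auto_scale_lambda(global_sd, local_sd, tau=0.7):
        # --auto_scaler y: pick lambda once (first round) so that mu == tau.
        return tau / max(l2_divergence(global_sd, local_sd), 1e-12)

    def fedema_update(global_sd, local_sd, lam):
        # mu = min(lambda * divergence, 1); theta_l <- mu*theta_l + (1-mu)*theta_g
        mu = min(lam * l2_divergence(global_sd, local_sd), 1.0)
        return {k: mu * local_sd[k] + (1.0 - mu) * global_sd[k] for k in local_sd}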

Other SSL methods

Run other FedSSL methods:

python applications/fedssl/main.py --task_id fedbyol --model byol  \
      --aggregate_encoder online --update_encoder online --update_predictor global

Replace byol in --model byol with other SSL methods: simclr, simsiam, moco, or moco_v2.

Evaluation

You can evaluate the saved model with either linear evaluation or semi-supervised evaluation.

Linear Evaluation

python applications/fedssl/linear_evaluation.py --dataset cifar10 \
      --model byol --encoder_network resnet18 \
      --model_path <path to the saved model with postfix '.pth'> \
      2>&1 | tee log/linear_evaluation.log
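
Linear evaluation follows the standard linear-probe protocol: the pretrained encoder stays frozen and only a linear classifier trained on top of its features is updated. The step below is a schematic of that protocol, not the script's actual code.

    import torch
    import torch.nn as nn

    def linear_eval_step(encoder, classifier, optimizer, images, labels):
        # Encoder frozen (no gradients); only the linear head is updated.
        encoder.eval()
        with torch.no_grad():
            feats = encoder(images)              # e.g. 512-d ResNet-18 features
        logits = classifier(feats)               # classifier = nn.Linear(512, 10) for CIFAR-10
        loss = nn.functional.cross_entropy(logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()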

Semi-supervised Evaluation

python applications/fedssl/semi_supervised_evaluation.py --dataset cifar10 \
      --model byol --encoder_network resnet18 \
      --model_path <path to the saved model with postfix '.pth'> \
      --label_ratio 0.1 --use_MLP \
      2>&1 | tee log/semi_supervised_evaluation.log
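
Here --label_ratio 0.1 means the model is fine-tuned on a 10% labeled subset of CIFAR-10 (with an MLP head when --use_MLP is set). The snippet below only illustrates drawing such a random labeled subset; the script's own sampling (e.g. whether it is class-balanced) may differ.

    import torch
    from torch.utils.data import Subset
    from torchvision.datasets import CIFAR10
    import torchvision.transforms as T

    def labeled_subset(dataset, label_ratio=0.1, seed=0):
        # Randomly keep a fraction of the training set as the labeled split.
        g = torch.Generator().manual_seed(seed)
        n = int(len(dataset) * label_ratio)
        idx = torch.randperm(len(dataset), generator=g)[:n]
        return Subset(dataset, idx.tolist())

    train = CIFAR10("./data", train=True, download=True, transform=T.ToTensor())
    small = labeled_subset(train, 0.1)   # ~5,000 labeled images for fine-tuning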

File Structure

├── client.py <client implementation of federated learning>
├── communication.py <constants for model update>
├── dataset.py <dataset for semi-supervised learning>
├── eval_dataset.py <dataset preprocessing for evaluation>
├── knn_monitor.py <kNN monitoring>
├── main.py <file for start running>
├── model.py <ssl models>
├── resnet.py <network architectures used>
├── server.py <server implementation of federated learning>
├── transform.py <image transformations>
├── linear_evaluation.py <linear evaluation of models after training>
├── semi_supervised_evaluation.py <semi-supervised evaluation of models after training>
└── utils.py <utility functions>

With PGFed

Use the --use_pgfed argument to control whether PGFed is used during training.
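
A rough sketch of how such a flag can route training to the PGFed variants (the actual wiring lives in main.py; this is not its code):

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("--use_pgfed", action="store_true",
                        help="train with the PGFed client/server variants")
    args, _ = parser.parse_known_args()

    # The PGFed variants live in client_with_pgfed.py / server_with_pgfed.py.
    if args.use_pgfed:
        client_module, server_module = "client_with_pgfed", "server_with_pgfed"
    else:
        client_module, server_module = "client", "server"
    print(f"using {client_module}.py / {server_module}.py")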

Current scope of support

Of the four SSL methods (SimCLR, MoCo, BYOL, and SimSiam): BYOL (verified) and MoCo (untested, but the code is written the same way) do not use PyTorch's gradient-based update, so they still need separate adaptation. SimCLR (verified) and SimSiam (untested, but the code looks fine) can be run directly.

PGFed parameter settings

lambdaa = 1.0

mu = 0

momentum = 0.0

These values are hard-coded; change them in the code if you need different settings.

(There is no flag for momentum updates; momentum = 0.0 simply means momentum updates are disabled.)
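
Purely to illustrate the last point (this is not the code in client_with_pgfed.py): with a momentum coefficient of 0.0, a momentum buffer collapses to the most recent value, so no momentum accumulation takes place.

    # buf_t = momentum * buf_{t-1} + g_t ;  momentum = 0.0  =>  buf_t == g_t
    def update_buffer(buf, grad, momentum=0.0):
        return momentum * buf + grad

    buf = 0.0
    for g in [0.5, -0.2, 0.1]:
        buf = update_buffer(buf, g, momentum=0.0)
        assert buf == g   # with momentum disabled, the buffer is just the latest gradient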

Changes for self-supervised learning


The update step is changed to the self-supervised learning update scheme.

Performance test (10 rounds)

PGFed

(results screenshot omitted)

Without PGFed

(results screenshot omitted)

Citation

If you use this code in your research, please cite the following works.

@inproceedings{zhuang2022fedema,
  title={Divergence-aware Federated Self-Supervised Learning},
  author={Weiming Zhuang and Yonggang Wen and Shuai Zhang},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=oVE1z8NlNe}
}

@inproceedings{zhuang2021fedu,
  title={Collaborative Unsupervised Visual Representation Learning from Decentralized Data},
  author={Zhuang, Weiming and Gan, Xin and Wen, Yonggang and Zhang, Shuai and Yi, Shuai},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  pages={4912--4921},
  year={2021}
}

@article{zhuang2022easyfl,
  title={Easyfl: A low-code federated learning platform for dummies},
  author={Zhuang, Weiming and Gan, Xin and Wen, Yonggang and Zhang, Shuai},
  journal={IEEE Internet of Things Journal},
  year={2022},
  publisher={IEEE}
}