Also named Federated Unsupervised Representation Learning (FedU).
A common limitation of existing federated learning (FL) methods is that they rely heavily on data labels on decentralized clients. We propose a federated self-supervised learning framework (FedSSL) to learn visual representations from decentralized data without labels.
This repository contains the code for two papers: FedEMA (ICLR 2022) and FedU (ICCV 2021); see the citations below.
The framework implements four self-supervised learning (SSL) methods based on Siamese networks in a federated manner: SimCLR, SimSiam, MoCo, and BYOL.
You can conduct training using different FedSSL methods and our proposed FedEMA method.
You need to save the global model for further evaluation.
Run FedEMA with auto scaler $\tau=0.7$:

```bash
python applications/fedssl/main.py --task_id fedema --model byol \
    --aggregate_encoder online --update_encoder dynamic_ema_online --update_predictor dynamic_dapu \
    --auto_scaler y --auto_scaler_target 0.7 2>&1 | tee log/${task_id}.log
```
Run FedEMA with constant weight scaler $\lambda=1$:

```bash
python applications/fedssl/main.py --task_id fedema --model byol \
    --aggregate_encoder online --update_encoder dynamic_ema_online --update_predictor dynamic_dapu \
    --weight_scaler 1 2>&1 | tee log/${task_id}.log
```
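For intuition, the two scalers enter the divergence-aware EMA update of the online encoder roughly as follows (a paraphrase of the FedEMA paper; the exact form used in the code may differ):

$$
\mu = \min\big(\lambda \,\lVert W_g - W_k \rVert,\, 1\big), \qquad W_k \leftarrow \mu\, W_k + (1 - \mu)\, W_g
$$

Here $W_g$ is the aggregated global online encoder and $W_k$ is client $k$'s local one. With `--weight_scaler`, $\lambda$ is a fixed constant; with `--auto_scaler`, $\lambda$ is calibrated in the first round so that $\mu$ starts at the target value $\tau$ set by `--auto_scaler_target`.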
Run other FedSSL methods:

```bash
python applications/fedssl/main.py --task_id fedbyol --model byol \
    --aggregate_encoder online --update_encoder online --update_predictor global
```
Replace `byol` in `--model byol` with other SSL methods, including `simclr`, `simsiam`, `moco`, and `moco_v2`.
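For example, a federated SimCLR run could look like the following (the task id is illustrative; the remaining flags mirror the `fedbyol` example above):

```bash
python applications/fedssl/main.py --task_id fedsimclr --model simclr \
    --aggregate_encoder online --update_encoder online --update_predictor global
```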
You can evaluate the saved model with either linear evaluation or semi-supervised evaluation.
```bash
python applications/fedssl/linear_evaluation.py --dataset cifar10 \
    --model byol --encoder_network resnet18 \
    --model_path <path to the saved model with postfix '.pth'> \
    2>&1 | tee log/linear_evaluation.log
```
```bash
python applications/fedssl/semi_supervised_evaluation.py --dataset cifar10 \
    --model byol --encoder_network resnet18 \
    --model_path <path to the saved model with postfix '.pth'> \
    --label_ratio 0.1 --use_MLP \
    2>&1 | tee log/semi_supervised_evaluation.log
```
```
├── client.py <client implementation of federated learning>
├── communication.py <constants for model update>
├── dataset.py <dataset for semi-supervised learning>
├── eval_dataset.py <dataset preprocessing for evaluation>
├── knn_monitor.py <kNN monitoring>
├── main.py <entry point for training>
├── model.py <ssl models>
├── resnet.py <network architectures used>
├── server.py <server implementation of federated learning>
├── transform.py <image transformations>
├── linear_evaluation.py <linear evaluation of models after training>
├── semi_supervised_evaluation.py <semi-supervised evaluation of models after training>
└── utils.py
```
Use the `--use_pgfed` argument to control whether PGFed is used during training.
Regarding the four SSL methods (SimCLR, MoCo, BYOL, and SimSiam): BYOL (verified) and MoCo (untested, but the code is written the same way) do not use PyTorch gradient updates, so they still require separate adaptation. SimCLR (verified) and SimSiam (untested, but the code looks fine) can be run directly.
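For example, a SimCLR run with PGFed enabled might look like the following sketch (the task id is illustrative, and whether `--use_pgfed` takes a value should be checked against `main.py`; the other flags mirror the training examples above):

```bash
# Illustrative sketch: enable PGFed on top of federated SimCLR.
python applications/fedssl/main.py --task_id fedsimclr_pgfed --model simclr \
    --aggregate_encoder online --update_encoder online --update_predictor global \
    --use_pgfed 2>&1 | tee log/fedsimclr_pgfed.log
```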
The following PGFed hyperparameters are hard-coded; change them in the code yourself if needed:

```
lambdaa = 1.0
mu = 0
momentum = 0.0
```

(There is no flag for momentum updates; `momentum = 0.0` means momentum updates are disabled.)
The model update has been changed to the self-supervised learning update scheme.
If you use this code in your research, please cite the following papers.
```
@inproceedings{zhuang2022fedema,
  title={Divergence-aware Federated Self-Supervised Learning},
  author={Zhuang, Weiming and Wen, Yonggang and Zhang, Shuai},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=oVE1z8NlNe}
}

@inproceedings{zhuang2021fedu,
  title={Collaborative Unsupervised Visual Representation Learning from Decentralized Data},
  author={Zhuang, Weiming and Gan, Xin and Wen, Yonggang and Zhang, Shuai and Yi, Shuai},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  pages={4912--4921},
  year={2021}
}

@article{zhuang2022easyfl,
  title={EasyFL: A Low-Code Federated Learning Platform for Dummies},
  author={Zhuang, Weiming and Gan, Xin and Wen, Yonggang and Zhang, Shuai},
  journal={IEEE Internet of Things Journal},
  year={2022},
  publisher={IEEE}
}
```