Unsupervised Federated Continual Learning (UFCL) is a new learning paradigm that embeds unsupervised representation learning techniques into the Federated Learning (FL) framework, enabling continuous training of a shared representation model without compromising individual participants’ data privacy. It is particularly well suited to Internet of Things and edge computing scenarios. However, the continual learning process may cause catastrophic forgetting in the model, degrading the quality of the generated representations. The research findings suggest that limited model capacity and undifferentiated weight aggregation in UFCL are the main causes of the decreased model performance. As shown in Table 1, the research team trained the unsupervised SimSiam model with 2, 4, and 6 linear layers under both centralized and FL paradigms. The results show that the larger the model capacity, the better the memory retention, and that weight aggregation in FL brought an average increase in forgetting of 6.6%.
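As a rough illustration of what varying the model capacity in this experiment might look like, the sketch below (a hypothetical PyTorch snippet, not the team's released code) builds a SimSiam-style projector/predictor head whose number of linear layers can be set to 2, 4, or 6, as in the Table 1 comparison; the names `make_mlp` and `SimSiamHead`, the feature dimension, and the layer widths are assumptions for illustration.

```python
import torch
import torch.nn as nn

def make_mlp(dim: int, num_layers: int) -> nn.Sequential:
    """Stack `num_layers` linear layers of width `dim`, with BN + ReLU between them."""
    layers = []
    for _ in range(num_layers - 1):
        layers += [nn.Linear(dim, dim), nn.BatchNorm1d(dim), nn.ReLU(inplace=True)]
    layers.append(nn.Linear(dim, dim))  # final linear layer, no activation
    return nn.Sequential(*layers)

class SimSiamHead(nn.Module):
    """SimSiam-style projector + predictor whose depth controls model capacity."""
    def __init__(self, feat_dim: int = 2048, num_layers: int = 2):
        super().__init__()
        self.projector = make_mlp(feat_dim, num_layers)
        self.predictor = make_mlp(feat_dim, 2)

    def forward(self, f1: torch.Tensor, f2: torch.Tensor):
        z1, z2 = self.projector(f1), self.projector(f2)
        p1, p2 = self.predictor(z1), self.predictor(z2)
        # stop-gradient on the targets, as in the original SimSiam formulation
        return p1, p2, z1.detach(), z2.detach()

# Capacity variants compared in the article's experiment: 2, 4, and 6 linear layers.
heads = {k: SimSiamHead(num_layers=k) for k in (2, 4, 6)}
```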
To solve the aforementioned problems, a research team led by Xudong LIU published their new research on 15 June 2025 in Frontiers of Computer Science, co-published by Higher Education Press and Springer Nature.
The team proposes an anti-forgetting representation learning method based on a new hybrid model architecture and a gradient truncation technique. Specifically, the method introduces a new model architecture built on Kolmogorov–Arnold networks and pluggable structures, which effectively improves the model’s memory capacity and anti-forgetting ability. The method also designs a gradient truncation technique to reduce the interference of weight aggregation with the model’s memory, and uses an ordinary differential equation (ODE) sampler to augment the representation performance.
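The paper's exact truncation rule is not spelled out here, but the general idea of truncating client updates before aggregation can be sketched as below. This hypothetical Python snippet (not the authors' method) zeroes all but each client's largest-magnitude update components before FedAvg-style averaging, so that aggregation disturbs the parameters the global model relies on for memory less; the function name `truncated_fedavg`, the top-k rule, and the `keep_ratio` parameter are assumptions for illustration.

```python
import numpy as np

def truncated_fedavg(global_w: np.ndarray,
                     client_ws: list,
                     keep_ratio: float = 0.5) -> np.ndarray:
    """Aggregate client weights after truncating each client's update to its
    largest-magnitude entries (an assumed stand-in for gradient truncation)."""
    updates = [w - global_w for w in client_ws]            # per-client deltas
    k = max(1, int(keep_ratio * global_w.size))
    truncated = []
    for u in updates:
        flat = u.ravel().copy()
        # zero out all but the k largest-magnitude components of this update
        cutoff = np.partition(np.abs(flat), -k)[-k]
        flat[np.abs(flat) < cutoff] = 0.0
        truncated.append(flat.reshape(u.shape))
    return global_w + np.mean(truncated, axis=0)           # FedAvg on truncated deltas

# Toy usage: three clients drift from a shared 4-parameter global model.
g = np.zeros(4)
clients = [g + np.array([0.9, 0.01, -0.02, 0.0]),
           g + np.array([0.02, 1.1, 0.0, -0.01]),
           g + np.array([0.0, 0.03, 0.8, 0.02])]
print(truncated_fedavg(g, clients, keep_ratio=0.25))
```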
DOI: 10.1007/s11704-024-40557-w