Kazan (Volga region) Federal University, KFU
 
USING NON-LIPSCHITZ SIGNUM-BASED FUNCTIONS FOR DISTRIBUTED OPTIMIZATION AND MACHINE LEARNING: TRADE-OFF BETWEEN CONVERGENCE RATE AND OPTIMALITY GAP
Form of presentation: Articles in international journals and collections
Year of publication: 2025
Language: English
  • Gabidullina Zulfiya Ravilevna, author
  • Aghasi Alireza, author
  • Doostmohammadian Mohammadreza, author
  • Ghods Amir Ahmad, author
  • Rabiee Hamid R., author
  • Bibliographic description in the original language: Mohammadreza Doostmohammadian, Amir Ahmad Ghods, Alireza Aghasi, Zulfiya R. Gabidullina, Hamid R. Rabiee. Using Non-Lipschitz Signum-Based Functions for Distributed Optimization and Machine Learning: Trade-Off Between Convergence Rate and Optimality Gap. Mathematical and Computational Applications. 2025, 30(5), 108; https://doi.org/10.3390/mca30050108
    Annotation: In recent years, the prevalence of large-scale datasets and the demand for sophisticated learning models have necessitated the development of efficient distributed machine learning (ML) solutions. Convergence speed is a critical factor in the practicality and effectiveness of these distributed frameworks. Recently, non-Lipschitz continuous optimization algorithms have been proposed to improve on the slow convergence rate of existing linear solutions. Signum-based functions were previously used in the consensus and control literature to reach fast, prescribed-time convergence and to provide algorithms that are robust to noisy/outlier data. However, as shown in this work, these algorithms lead to an optimality gap and a steady-state residual of the objective function in the discrete-time setup. This motivates us to investigate distributed optimization and ML algorithms in terms of the trade-off between convergence rate and optimality gap. In this direction, we specifically consider the distributed regression problem and examine its convergence rate under both linear and non-Lipschitz signum-based functions. We evaluate our distributed regression approach through extensive simulations. Our results show that although adopting signum-based functions may yield faster convergence, it results in large optimality gaps. The findings presented in this paper may contribute to and advance the ongoing discourse on similar distributed algorithms, e.g., for distributed constrained optimization and distributed estimation.
    Keywords: linear regression, distributed optimization, network and graph theory, Lipschitz continuity
    The name of the journal: MATHEMATICAL AND COMPUTATIONAL APPLICATIONS
    Please use this ID to quote from or refer to the card: https://repository.kpfu.ru/eng/?p_id=320950&p_lang=2
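
    The sketch below is a rough, hand-rolled illustration of the trade-off described in the annotation, not code from the paper: it runs a toy distributed least-squares problem over a 4-node ring and compares a linear gradient update with a signum-based one under a fixed step size. The network, mixing weights, step size, and synthetic data are all assumptions chosen for the example.

    ```python
    import numpy as np

    # Toy distributed least-squares over a 4-node ring (illustrative sketch only).
    rng = np.random.default_rng(0)
    n_nodes, dim, m = 4, 3, 20
    x_true = rng.normal(size=dim)

    # Local data: node i holds (A_i, b_i) with b_i = A_i x_true + noise
    A = [rng.normal(size=(m, dim)) for _ in range(n_nodes)]
    b = [A[i] @ x_true + 0.05 * rng.normal(size=m) for i in range(n_nodes)]

    # Doubly stochastic mixing weights for the ring (hand-set, assumed)
    W = np.array([[0.50, 0.25, 0.00, 0.25],
                  [0.25, 0.50, 0.25, 0.00],
                  [0.00, 0.25, 0.50, 0.25],
                  [0.25, 0.00, 0.25, 0.50]])

    def local_grad(i, x):
        # Gradient of the local least-squares cost at node i
        return A[i].T @ (A[i] @ x - b[i]) / m

    def run(signum=False, steps=300, alpha=0.05):
        X = np.zeros((n_nodes, dim))          # one estimate per node
        for _ in range(steps):
            mixed = W @ X                      # consensus (averaging) step
            for i in range(n_nodes):
                g = local_grad(i, X[i])
                if signum:
                    g = np.sign(g)             # non-Lipschitz signum-based update
                X[i] = mixed[i] - alpha * g
        return X

    # Centralized least-squares solution as the reference optimum
    x_star = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]
    for name, X in [("linear", run(False)), ("signum", run(True))]:
        gap = np.mean(np.linalg.norm(X - x_star, axis=1))
        print(f"{name:6s} update: mean distance to centralized solution = {gap:.4f}")
    ```

    With a fixed step size, the signum-based run would be expected to settle at a visibly larger distance from the centralized solution, which corresponds to the steady-state residual and optimality gap discussed in the annotation.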
