Switched Neural Networks for Simultaneous Learning of Multiple Functions


Efe M. Ö., Kurkcu B., Kasnakoglu C., Mohamed Z., Liu Z.

IEEE Transactions on Emerging Topics in Computational Intelligence, 2024 (SCI-Expanded)

  • Publication Type: Article
  • Publication Date: 2024
  • DOI Number: 10.1109/TETCI.2024.3369981
  • Journal Name: IEEE Transactions on Emerging Topics in Computational Intelligence
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Keywords: Artificial neural networks, Genetic algorithms, Learning multiple functions, Neural networks, Optimization, Parameter switching, Switches, Task analysis, Training, Vectors
  • Hacettepe University Affiliated: Yes

Abstract

This paper introduces the notion of switched neural networks for learning multiple functions under different switching configurations. The neural network has adjustable parameters, and for each function the state of the parameter vector is determined by a mask vector: 1/0 for active/inactive or +1/-1 for plain/inverted. The optimization problem is to find, for each function, the switching strategy (mask vector) together with the single shared parameter vector (weights/biases) that minimizes the loss function. This requires a procedure that simultaneously optimizes a vector containing both real and binary values in order to discover commonalities among the functions. Our studies show that a small-sized neural network with an appropriate switching regime can learn multiple functions successfully. In the classification tests, we considered 2-variable binary functions, choosing all 16 possible combinations as the target functions; the regression tests consider four functions of two variables. Overall, the results show that simple NN structures can store multiple pieces of information through appropriate switching.
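To make the masking idea concrete, the following is a minimal, hand-worked sketch (not the authors' implementation, which jointly optimizes masks and weights, e.g. with a genetic algorithm over mixed real/binary vectors). A single threshold neuron with one shared parameter vector computes four of the 16 two-variable binary functions, using 0 to deactivate a parameter and -1 to invert it; all names and the parameter choices below are illustrative assumptions.

```python
import numpy as np

# Shared parameter vector [w1, w2, b1, b2]; every function below reuses
# these same four numbers, switched per-function by a mask vector.
params = np.array([1.0, 1.0, -1.0, -0.5])

# Mask entries: 1 = parameter active as-is, 0 = inactive, -1 = inverted.
masks = {
    "AND":  np.array([ 1,  1,  1,  1]),   # x1 + x2 - 1.5 > 0
    "OR":   np.array([ 1,  1,  0,  1]),   # 0 switches b1 off: x1 + x2 - 0.5 > 0
    "NAND": np.array([-1, -1, -1, -1]),   # -1 inverts AND: -x1 - x2 + 1.5 > 0
    "NOR":  np.array([-1, -1,  0, -1]),   # inverted OR: -x1 - x2 + 0.5 > 0
}

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

def switched_neuron(x, params, mask):
    # Apply the per-function mask elementwise, then threshold.
    p = params * mask
    return (x @ p[:2] + p[2] + p[3] > 0).astype(int)

for name, m in masks.items():
    print(name, switched_neuron(X, params, m))
```

Running this prints the correct truth table for each function (AND → [0 0 0 1], OR → [0 1 1 1], NAND → [1 1 1 0], NOR → [1 0 0 0]), illustrating how a single small parameter set can store several functions once the switching regime is chosen appropriately; in the paper the masks themselves are decision variables in the optimization.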