Preprint / Version 1

Bidirectional Pre-training Method for Neural Networks

Authors

  • Volkan Sezar, University of Ankara
  • Mohammad Hasan Ahmadian
  • Vahid Vaseghi
  • Bita Omid

DOI:

https://doi.org/10.31224/2870

Abstract

In this paper, a bidirectional pre-training method is presented that combines the training of a deep neural network with the training of auxiliary networks. Training deep networks often fails to converge because of the large number of local minima in the error surface; many of these minima can be avoided by choosing suitable initial values for the network weights. The proposed bidirectional layer-by-layer pre-training method is fast and efficient: it adjusts the initial weight values in both the forward and backward directions using the desired inputs and outputs of the network. For each hidden layer, an auxiliary network is built from the weights of the already pre-trained layers of the deep network together with auxiliary weights, and this auxiliary network is trained. The weight values obtained from this training are then placed into the corresponding layers of the main network, and finally the whole network is trained end to end to fine-tune the weights. The method was used to pre-train the weights of three deep neural networks that recognize persons, emotional states, and handwritten digits, and it was shown that this pre-training dramatically increases the speed of learning convergence. The recognition rate on the face database also improves significantly, which indicates an increase in the network's generalization power obtained with this method.
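To make the layer-by-layer scheme concrete, the sketch below shows a generic greedy layer-wise pre-training loop followed by whole-network fine-tuning. This is only an illustration of the general pattern the abstract describes: the paper's specific bidirectional (forward-and-backward) initialization rule and its auxiliary-weight construction are not detailed here, so an autoencoder-style proxy task stands in for the auxiliary networks, and all names, layer sizes, and data are hypothetical.

```python
# Hypothetical sketch: greedy layer-wise pre-training via small auxiliary
# networks, then end-to-end fine-tuning of the main deep network.
# This is NOT the paper's exact bidirectional rule, which the abstract
# does not fully specify; it illustrates the overall training pattern.
import torch
import torch.nn as nn

def pretrain_layer(layer, data, epochs=5, lr=1e-2):
    """Train `layer` inside an auxiliary network that reconstructs its
    own input (an autoencoder-style stand-in for the auxiliary networks
    described in the abstract)."""
    decoder = nn.Linear(layer.out_features, layer.in_features)
    aux = nn.Sequential(layer, nn.Sigmoid(), decoder)
    opt = torch.optim.SGD(aux.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(aux(data), data).backward()
        opt.step()
    return layer  # pre-trained weights remain in `layer`

torch.manual_seed(0)
x = torch.randn(256, 64)            # stand-in inputs
y = torch.randint(0, 10, (256,))    # stand-in class labels

# Main deep network: input -> hidden1 -> hidden2 -> output.
l1, l2, out = nn.Linear(64, 32), nn.Linear(32, 16), nn.Linear(16, 10)

# Pre-train each hidden layer on the activations of the previous one.
pretrain_layer(l1, x)
with torch.no_grad():
    h1 = torch.sigmoid(l1(x))
pretrain_layer(l2, h1)

# Place the pre-trained layers in the main network and fine-tune the
# whole network end to end.
net = nn.Sequential(l1, nn.Sigmoid(), l2, nn.Sigmoid(), out)
opt = torch.optim.SGD(net.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()
for _ in range(10):
    opt.zero_grad()
    loss_fn(net(x), y).backward()
    opt.step()
```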


Posted

2023-03-11