Deep neural network architecture optimization based on the self-organizing feature map algorithm

Muthna Jasim Fadhil, Majli Nema Hawas, Maitham Ali Naji


Forward neural network (FNN) performance depends on the training algorithm and the selected architecture. Several parameters define an FNN architecture, such as the number of connections between layers, the number of hidden neurons in each hidden layer, and the number of hidden layers. The number of possible architectural combinations grows exponentially and quickly becomes unmanageable by hand, so a dedicated algorithm is needed to design the architecture automatically and thereby build a system with better generalization ability. Numerous optimization algorithms can be used to determine an FNN architecture. This paper proposes a new methodology for estimating the number of hidden layers and their respective neurons: the training algorithm is combined with the self-organizing feature map (SOFM), which automatically selects the best architecture from a population of candidate architectures according to a testing-error criterion. The proposed approach is tested on four benchmark classification datasets of different sizes.
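To make the SOFM component concrete, the following is a minimal sketch of training a 2-D self-organizing feature map with NumPy. It is not the authors' implementation; the grid size, decay schedules, and the `train_som` function are illustrative assumptions.

```python
import numpy as np

def train_som(data, grid_shape=(5, 5), n_iters=500, lr0=0.5, sigma0=2.0, seed=0):
    """Train a 2-D self-organizing feature map on `data` (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid_shape
    n_features = data.shape[1]
    # Grid coordinates of each node, used by the neighborhood function.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    # Randomly initialized codebook (weight) vectors, one per grid node.
    weights = rng.random((rows * cols, n_features))
    for t in range(n_iters):
        # Exponentially decay the learning rate and neighborhood radius.
        lr = lr0 * np.exp(-t / n_iters)
        sigma = sigma0 * np.exp(-t / n_iters)
        # Pick a random training sample.
        x = data[rng.integers(len(data))]
        # Best-matching unit: node whose weight vector is closest to x.
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
        # Gaussian neighborhood strength around the BMU on the grid.
        dist2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        h = np.exp(-dist2 / (2 * sigma ** 2))
        # Pull every node's weights toward x, scaled by neighborhood strength.
        weights += lr * h[:, None] * (x - weights)
    return weights

# Toy usage: map 200 three-dimensional points onto a 5x5 grid.
data = np.random.default_rng(1).random((200, 3))
codebook = train_som(data)
print(codebook.shape)  # (25, 3)
```

In the paper's setting, each input vector would encode a candidate architecture together with its testing error, so that the map organizes the population of architectures and the best-performing region can be read off the trained grid.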


Architecture optimization; Hidden layers; Hidden neurons; Neural network; Self-organizing feature map





