Using Persistence Barcode to Show the Impact of Data Complexity on the Neural Network Architecture

Authors

  • Labiba M. Alhelfi Department of Mathematics, College of Science, University of Basrah, Basrah, Iraq
  • Hana M. Ali Department of Mathematics, College of Science, University of Basrah, Basrah, Iraq

DOI:

https://doi.org/10.24996/ijs.2022.63.5.37

Keywords:

ANN Architecture, Alpha Complexity, Topological Data Analysis, Betti Number, Persistence Barcode

Abstract

It is well known that the initialization of architectural parameters has a great impact on the whole learning process, so knowing the mathematical properties of a dataset helps provide a neural network architecture with better expressivity and capacity. In this paper, five random samples of the Volve field dataset were taken. A training set was then specified, and the persistent homology of the dataset was computed to show the impact of data complexity on the selection of a multilayer perceptron regressor (MLPR) architecture. The proposed method provides a well-rounded strategy for computing data complexity: it is a compound algorithm composed of the t-SNE method, an alpha-complex algorithm, and a persistence barcode reading method that extracts the Betti numbers of a dataset. An MLPR was then trained on the dataset, first with a single hidden layer and an increasing number of hidden neurons, and then with both the number of hidden layers and the number of hidden neurons increased. Our empirical analysis shows that the training efficiency of the MLPR depends heavily on its architecture's ability to express the homology of the dataset.
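To illustrate the barcode-and-Betti-number step of the pipeline described above, the following is a minimal sketch, not the paper's method: the paper uses alpha complexes (e.g. via a library such as GUDHI), whereas this sketch computes only the 0-dimensional persistence barcode of a point cloud using the equivalent single-linkage (union-find) formulation, which needs no external dependencies. The function names `persistence_barcode_dim0` and `betti0` are hypothetical, chosen for this example.

```python
import math
from itertools import combinations

def persistence_barcode_dim0(points):
    """0-dimensional persistence barcode via single-linkage (union-find).

    Each point (component) is born at scale 0; a bar dies at the edge
    length at which its component merges into another. One bar, for the
    final surviving component, never dies.
    """
    n = len(points)
    parent = list(range(n))

    def find(i):
        # Find the component root, with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Process all pairwise edges in order of increasing Euclidean length.
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in combinations(range(n), 2)
    )
    bars = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            bars.append((0.0, d))     # a component dies (merges) at scale d
    bars.append((0.0, math.inf))      # the surviving component
    return bars

def betti0(bars, eps):
    """Betti number b0 at scale eps = number of bars still alive at eps."""
    return sum(1 for birth, death in bars if birth <= eps < death)
```

For example, for four points forming two well-separated pairs, `betti0` reads off two connected components at an intermediate scale, matching how the paper reads the Betti number from the barcode: bars alive at a chosen scale count the topological features present at that resolution.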


Published

2022-05-25

Issue

Section

Computer Science

How to Cite

Using Persistence Barcode to Show the Impact of Data Complexity on the Neural Network Architecture. (2022). Iraqi Journal of Science, 63(5), 2262-2278. https://doi.org/10.24996/ijs.2022.63.5.37
