Evaluation of Neural Network Architectures for Predicting Material Elastic Properties
Andi M. N. F. Syamsul (a*), Agoes Soehianie (a), Abdul M. T. Pradipto (a)

a) Physics Study Program, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Jalan Ganesha 10, Bandung 40132, Indonesia.
*andimuhammad164[at]gmail.com


Abstract

This study evaluated neural network architectures for predicting material elastic properties from a materials dataset. Sequential models with one to three dense layers and 32-112 neurons per layer (in steps of 16) were constructed in TensorFlow. Each model was trained for 10 epochs, with 80% of the data used for training and 20% for validation, minimizing a mean absolute error loss with the Adam optimizer. Models with more dense layers achieved lower validation losses, indicating a greater capacity to capture complex relationships in the data. Neuron count also affected performance, with wider layers tending to yield lower losses. The key finding is that sufficient hidden layers and neurons allow the models to learn the representations and discriminate the patterns needed to accurately predict elasticity and related material properties. These results provide guidance for designing neural network architectures for material property prediction and inform new model designs for enhanced materials simulations.
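The architecture search described above can be sketched as follows. This is a minimal illustration, not the authors' code: the single regression output, ReLU activations, and the helper names (`architecture_grid`, `build_model`) are assumptions, since the abstract specifies only the layer counts, neuron range, loss, optimizer, epochs, and data split.

```python
from itertools import product

# Hyperparameter grid from the abstract: 1-3 hidden dense layers,
# 32-112 neurons per layer in steps of 16.
LAYER_COUNTS = [1, 2, 3]
NEURON_COUNTS = list(range(32, 113, 16))  # [32, 48, 64, 80, 96, 112]

def architecture_grid():
    """Enumerate every (n_layers, n_neurons) combination evaluated."""
    return list(product(LAYER_COUNTS, NEURON_COUNTS))

def build_model(n_layers, n_neurons, n_features):
    """Build one Sequential regressor (TensorFlow imported lazily)."""
    import tensorflow as tf  # assumed available, as in the study
    model = tf.keras.Sequential(
        [tf.keras.layers.Input(shape=(n_features,))]
        + [tf.keras.layers.Dense(n_neurons, activation="relu")  # activation assumed
           for _ in range(n_layers)]
        + [tf.keras.layers.Dense(1)]  # single elastic-property target (assumed)
    )
    # Mean absolute error loss with the Adam optimizer, as stated in the abstract.
    model.compile(optimizer="adam", loss="mean_absolute_error")
    return model

# Per-candidate training call, matching the abstract's setup
# (X, y are placeholder feature/target arrays):
#   model.fit(X, y, epochs=10, validation_split=0.2)
```

With 3 layer counts and 6 layer widths, the grid covers 18 candidate architectures, each trained under the same 10-epoch, 80/20 protocol so their validation losses are directly comparable.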

Keywords: Elastic properties; Neural network architecture; Material property prediction

Topic: Material Physics

APS 2023 Conference