Deep learning for computational structural optimization.

Abstract

We investigate a novel approach to computational structural optimization based on deep learning. After employing algorithms to solve the stiffness formulation of the structures, we used the resulting improvements to accelerate the structural computation. The standard 10-bar truss example was revisited to illustrate the mechanism of neural networks and deep learning. Several benchmark problems of 2D and 3D truss structures were then used to verify the reliability of the present approach, and its extension to other engineering structures is straightforward. To enhance computational efficiency, a constant-sum technique was proposed to generate input data for multiple similar variables. Both displacement and stress limits were enforced as constraints of the optimization problem. The optimized cross-section data, with total weight as the objective function, were then employed in the context of deep learning. Stochastic gradient descent (SGD) with Nesterov's accelerated gradient (NAG), root mean square propagation (RMSProp), and adaptive moment estimation (Adam) optimizers were compared in terms of convergence. In addition, this paper devised Chebyshev polynomials as a new approach to activation functions in single-layer neural networks. As expected, their convergence was quicker than that of the popular activation functions, especially over short training runs with a small number of epochs on the tested problems. Finally, a split-data technique for linear regression was proposed to deal with sensitive data.
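The "constant sum" data-generation idea mentioned above can be sketched as follows. The abstract does not specify the exact sampling scheme, so this is a hypothetical reconstruction: cross-section variables are drawn so that each sample vector sums to a fixed total (here via a scaled Dirichlet distribution), which keeps a quantity such as total material volume controlled while the individual variables vary.

```python
import numpy as np

def constant_sum_samples(n_samples, n_vars, total, rng=None):
    """Draw `n_samples` vectors of `n_vars` non-negative values,
    each summing to `total`.

    Hypothetical sketch of a constant-sum generator: Dirichlet
    weights sum to 1, so scaling by `total` preserves the sum.
    """
    rng = np.random.default_rng(rng)
    weights = rng.dirichlet(np.ones(n_vars), size=n_samples)
    return weights * total

# e.g. 4 candidate designs of 10 cross-sectional areas with a
# fixed combined "budget" of 100 units:
samples = constant_sum_samples(4, 10, total=100.0, rng=0)
# every row of `samples` sums to 100 (up to floating-point error)
```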
Copyright © 2020 ISA. Published by Elsevier Ltd. All rights reserved.
