
The application of neural network techniques to structural analysis by implementing an adaptive finite-element mesh generation

Published online by Cambridge University Press:  27 February 2009

Mansour Nasser Jadid
Affiliation:
Department of Building Technology, King Faisal University, Kingdom of Saudi Arabia.
Daniel R. Fairbairn
Affiliation:
Department of Building Technology, King Faisal University, Kingdom of Saudi Arabia. Department of Civil Engineering and Building Science, The University of Edinburgh, Edinburgh, U.K.

Abstract

This study focuses on the application of neural network techniques to the adaptive remeshing, with triangular elements, of an idealized square-shaped structure and of an individual triangle. The backpropagation learning algorithm, implemented as a supervised training technique, is used to address the problem of remeshing structural elements in a structural analysis. A recent study introduced a structural remeshing approach that incorporated finite-element analysis with an adaptive mesh generation technique. The main objective of the present study is to demonstrate how neural networks can be employed to remesh structural elements without numerically intensive computations. One essential requirement of this approach is the selection of feasible and appropriate training and testing data. The exploration of neural networks for remeshing structural elements is a fundamental technique that looks beyond the finite-element and adaptive mesh generation techniques. It also demonstrates the capability of a neural network to represent an n-dimensional space and to track each individual characteristic within that space. In general, an overview is presented, and the potential of neural networks using the backpropagation algorithm, instead of the more conventional numerical methods, is demonstrated.
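The supervised backpropagation setup the abstract describes can be sketched in miniature. The following is only an illustration, not the paper's model: the input features (element centroid coordinates on a unit square) and the target (a relative mesh-density value, here assumed to peak near one corner) are hypothetical stand-ins, and the network size and learning rate are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: element centroid (x, y) -> desired relative mesh
# density, assumed to peak near the corner (0, 0) of a unit square.
X = rng.random((200, 2))
y = np.exp(-3.0 * (X[:, 0] ** 2 + X[:, 1] ** 2)).reshape(-1, 1)

# One hidden layer of sigmoid units, linear output unit.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # fixed learning rate (the cited literature also adapts it)
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y
    # Backward pass: propagate the output error to both weight layers
    dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1.0 - h)   # sigmoid derivative
    dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0)
    # Gradient-descent weight updates
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

mse = float(np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final training MSE: {mse:.4f}")
```

After training, the network maps any element position to a predicted density without re-running a numerically intensive mesh-generation pass, which is the trade the abstract argues for.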

Type
Research Article
Copyright
Copyright © Cambridge University Press 1994

