
A neural network-based machine learning approach for supporting synthesis

Published online by Cambridge University Press:  27 February 2009

Nenad Ivezic
Affiliation:
Department of Civil Engineering, Carnegie Mellon University, Pittsburgh, PA 15213–3890
James H. Garrett Jr
Affiliation:
Department of Civil Engineering, Carnegie Mellon University, Pittsburgh, PA 15213–3890

Abstract

The goal of machine learning for artifact synthesis is to acquire the relationships among form, function, and behavior properties so that form attributes satisfying the design requirements can be determined more directly. The approach to synthesis knowledge acquisition and use (SKAU) described in this paper, called NETSYN, creates a function that estimates the probability of each possible value of each design property being used in a given design context. NETSYN uses a connectionist learning approach to acquire and represent this probability estimation function and exhibits good performance when tested on an artificial design problem. This paper presents the NETSYN approach to SKAU, a preliminary test of its capability, and a discussion of issues to be addressed in future work.
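The core idea in the abstract, a network that maps an encoded design context to a probability distribution over the candidate values of a design property, can be sketched as a small feedforward classifier with a softmax output. The layer sizes, the synthetic training data, and the training loop below are illustrative assumptions for the sketch, not the architecture or data actually used by NETSYN:

```python
import numpy as np

rng = np.random.default_rng(0)

N_CONTEXT = 4   # features encoding function/behavior requirements (assumed)
N_HIDDEN = 8    # hidden units (assumed)
N_VALUES = 3    # discrete candidate values for one form property (assumed)

# Synthetic stand-in for design examples: the "correct" property value is
# determined by which of the first three context features is largest.
X = rng.random((200, N_CONTEXT))
y = np.argmax(X[:, :N_VALUES], axis=1)
targets = np.eye(N_VALUES)[y]        # one-hot targets

W1 = rng.normal(0.0, 0.5, (N_CONTEXT, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.5, (N_HIDDEN, N_VALUES))
b2 = np.zeros(N_VALUES)

def forward(X):
    """Return hidden activations and softmax outputs P(value | context)."""
    h = np.tanh(X @ W1 + b1)
    z = h @ W2 + b2
    e = np.exp(z - z.max(axis=1, keepdims=True))   # stable softmax
    return h, e / e.sum(axis=1, keepdims=True)

# Full-batch gradient descent on the cross-entropy error.
lr = 1.0
for _ in range(2000):
    h, p = forward(X)
    dz = (p - targets) / len(X)       # gradient w.r.t. output logits
    dh = (dz @ W2.T) * (1.0 - h**2)   # backpropagate through tanh
    W2 -= lr * (h.T @ dz)
    b2 -= lr * dz.sum(axis=0)
    W1 -= lr * (X.T @ dh)
    b1 -= lr * dh.sum(axis=0)

_, probs = forward(X)
accuracy = float(np.mean(np.argmax(probs, axis=1) == y))
```

After training, each output row sums to one and can be read as an estimate of the probability of each candidate property value given the design context; picking the highest-probability value plays the role of proposing a form attribute for that context.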

Type: Articles
Copyright © Cambridge University Press 1994

