JAXA Repository / AIREX

There are no files associated with this item.

Title: Reducing neural network training time with parallel processing
Full text (external site): http://hdl.handle.net/2060/19950017789
Authors (English): Lamarsh, William J., II; Rogers, James L., Jr.
Author affiliation (English): NASA Langley Research Center
Publication date: 1995-02-01
Language: eng
Description: Obtaining optimal solutions for engineering design problems is often expensive because the process typically requires numerous iterations involving analysis and optimization programs. Previous research has shown that a near-optimum solution can be obtained in less time by simulating a slow, expensive analysis with a fast, inexpensive neural network. A new approach has been developed to further reduce this time. This approach decomposes a large neural network into many smaller neural networks that can be trained in parallel. Guidelines are developed to avoid some of the pitfalls of training smaller neural networks in parallel. These guidelines help the engineer determine the number of nodes on the hidden layer of the smaller neural networks, choose the initial training weights, and select a network configuration that will capture the interactions among the smaller neural networks. This paper presents results describing how these guidelines were developed.
NASA classification: COMPUTER PROGRAMMING AND SOFTWARE
Report Nos.: 95N24209; NASA-TM-110154; NAS 1.15:110154
Rights: No Copyright
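The decomposition idea described in the abstract can be illustrated with a rough sketch: a problem whose response splits over disjoint groups of inputs is modeled by one small single-hidden-layer network per group, and the small networks are trained concurrently with fixed per-network initial weights. The network sizes, learning rate, toy target function, and use of threads below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def train_small_net(X, y, hidden, seed, iters=6000, lr=0.1):
    """Train one small single-hidden-layer network with batch gradient descent.

    A fixed seed per network makes the initial weights reproducible; the
    abstract lists choosing initial weights as one of its guidelines, but
    the specific scheme here is only an assumption for this sketch."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    n = len(X)
    for _ in range(iters):
        h = np.tanh(X @ W1 + b1)           # hidden-layer activations
        err = (h @ W2 + b2) - y            # output-layer error
        dh = (err @ W2.T) * (1.0 - h**2)   # backpropagate through tanh
        W2 -= lr * (h.T @ err) / n
        b2 -= lr * err.mean(axis=0)
        W1 -= lr * (X.T @ dh) / n
        b1 -= lr * dh.mean(axis=0)
    return W1, b1, W2, b2

def predict(X, net):
    W1, b1, W2, b2 = net
    return np.tanh(X @ W1 + b1) @ W2 + b2

# Toy problem whose target splits over two disjoint feature groups, so one
# small network per group can be trained independently and the outputs summed.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (200, 4))
y_a = np.sin(X[:, :2].sum(axis=1, keepdims=True))
y_b = 0.5 * (X[:, 2:] ** 2).sum(axis=1, keepdims=True)
y = y_a + y_b

# Train the two small networks concurrently (threads here for portability;
# the paper's setting was parallel processors).
with ThreadPoolExecutor(max_workers=2) as pool:
    fut_a = pool.submit(train_small_net, X[:, :2], y_a, 6, 1)
    fut_b = pool.submit(train_small_net, X[:, 2:], y_b, 6, 2)
    nets = [fut_a.result(), fut_b.result()]

combined = predict(X[:, :2], nets[0]) + predict(X[:, 2:], nets[1])
rmse = float(np.sqrt(np.mean((combined - y) ** 2)))
print(f"RMSE of recombined small networks: {rmse:.3f}")
```

When the target genuinely decomposes like this, each small network has far fewer weights than one monolithic network over all four inputs, so each training job is cheaper and the jobs run side by side; a fully coupled target would instead need the interaction-capturing configuration the abstract alludes to.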


Items stored in this repository are protected by copyright, except where otherwise noted.