JAXA Repository / AIREX

There are no files associated with this item.

Title: Accelerated Training for Large Feedforward Neural Networks
Full text (external site): http://hdl.handle.net/2060/19990008890
Author(s) (English): Stepniewski, Slawomir W.; Jorgensen, Charles C.
Author affiliation (English): NASA Ames Research Center
Publication date: 1998-11-01
Language: eng
Description: In this paper we introduce a new training algorithm, the scaled variable metric (SVM) method. Our approach attempts to increase the convergence rate of the modified variable metric method. It is also combined with the RBackprop algorithm, which computes the product of the matrix of second derivatives (Hessian) with an arbitrary vector. The RBackprop method allows us to avoid computationally expensive, direct line searches. In addition, it can be utilized in the new, 'predictive' updating technique of the inverse Hessian approximation. We have used directional slope testing to adjust the step size and found that this strategy works exceptionally well in conjunction with the RBackprop algorithm. Some supplementary, but nevertheless important, enhancements to the basic training scheme are presented as well, such as an improved setting of the scaling factor for the variable metric update and a computationally more efficient procedure for updating the inverse Hessian approximation. We conclude by comparing the SVM method with four first- and second-order optimization algorithms, including a very effective implementation of the Levenberg-Marquardt method. Our tests indicate promising computational speed gains for the new training technique, particularly for large feedforward networks, i.e., for problems where the training process may be the most laborious.
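The abstract's key computational primitive is a Hessian-vector product computed without ever forming the Hessian. The paper's RBackprop method obtains this product exactly via an extra backward pass; as a minimal illustrative sketch (not the paper's algorithm), the same quantity can be approximated by a central finite difference of the gradient, shown here on a toy quadratic loss where the exact Hessian is known. All names and numbers below are illustrative assumptions, not from the report.

```python
import numpy as np

# Toy quadratic loss f(w) = 0.5 * w^T A w + b^T w, whose exact
# Hessian is A, so the true Hessian-vector product H v is A @ v.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A @ A.T  # symmetrize so A is a valid Hessian
b = rng.standard_normal(5)

def grad(w):
    # Gradient of the quadratic loss: A w + b.
    return A @ w + b

def hessian_vector_product(w, v, eps=1e-6):
    # Central finite difference of the gradient along direction v:
    #   H v ~= (g(w + eps*v) - g(w - eps*v)) / (2*eps)
    # RBackprop computes this product exactly instead of numerically.
    return (grad(w + eps * v) - grad(w - eps * v)) / (2.0 * eps)

w = rng.standard_normal(5)
v = rng.standard_normal(5)
hv = hessian_vector_product(w, v)
print(np.allclose(hv, A @ v, atol=1e-4))  # True: exact for quadratics up to rounding
```

Products like `hv` are what let a second-order trainer probe curvature along a search direction (e.g., for step-size adjustment via directional slope tests) at roughly the cost of two gradient evaluations, rather than the O(n^2) cost of building the Hessian.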
NASA classification: Mathematical and Computer Sciences (General)
Report No.: A-9812323
NASA/TM-1998-112239
NAS 1.15:112239
Rights: No Copyright
URI: https://repository.exst.jaxa.jp/dspace/handle/a-is/97090


Items stored in this repository are protected by copyright, unless otherwise specified.