Abstract
Summary form only given. The authors applied a simple backpropagation neural network on a very large scale in an attempt to associate primary amino-acid sequences with representations of the corresponding three-dimensional structures. The training set consisted of 25 sequences (the input layer, 130 amino acids long) associated with 25 130 × 130 distance matrices (the output layer, 16,900 neurons). Each amino acid was coded by its hydrophobicity (range ±1; the degree to which it avoids contact with water), and the Euclidean distances in the distance matrices were normalized to the largest distance in the training set (range 0–1; about 40 angstroms). The network was configured with a single fully connected hidden layer of 50 to 1000 neurons using the network description language (NDL, also called BigNet). The simulation was run on a Cray-2 supercomputer with four processors and 512 million words of random-access memory. The network achieved rates of 2 million connections per second in full backpropagation learning mode and was able to learn some aspects of the sequence-to-structure mapping.
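The architecture described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the layer sizes (130 inputs, a 50-neuron hidden layer, 16,900 outputs) come from the abstract, while the weight initialization, activation function, and example input are assumptions.

```python
import numpy as np

# Sizes taken from the abstract; everything else is illustrative.
SEQ_LEN = 130            # input layer: one hydrophobicity value per residue
HIDDEN = 50              # hidden layer (the paper explored 50 to 1000 neurons)
OUT = SEQ_LEN * SEQ_LEN  # 16,900 output neurons, one per distance-matrix entry

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(SEQ_LEN, HIDDEN))  # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(HIDDEN, OUT))      # hidden -> output weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(seq):
    """Map hydrophobicity codes (range ±1, length 130) to a flattened
    130 x 130 matrix of normalized distances (range 0-1)."""
    h = sigmoid(seq @ W1)
    return sigmoid(h @ W2)

# One hypothetical sequence, coded as hydrophobicity values in [-1, 1].
seq = rng.uniform(-1.0, 1.0, size=SEQ_LEN)
pred = forward(seq).reshape(SEQ_LEN, SEQ_LEN)
print(pred.shape)
```

With untrained random weights this only demonstrates the shapes involved; training would adjust W1 and W2 by backpropagating the error between the predicted and the true normalized distance matrices.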
| Original language | English (US) |
| --- | --- |
| Title of host publication | IJCNN Int Jt Conf Neural Network |
| Editors | Anon |
| Publisher | Publ by IEEE |
| Number of pages | 1 |
| State | Published - Dec 1 1989 |
| Event | IJCNN International Joint Conference on Neural Networks - Washington, DC, USA. Duration: Jun 18 1989 → Jun 22 1989 |
Other
| Other | IJCNN International Joint Conference on Neural Networks |
| --- | --- |
| City | Washington, DC, USA |
| Period | 6/18/89 → 6/22/89 |