On sparsely connected optimal neural networks
Abstract
This paper uses two different approaches to show that VLSI- and size-optimal discrete neural networks are obtained for small fan-in values. These have applications to hardware implementations of neural networks, but also reveal an intrinsic limitation of digital VLSI technology: its inability to cope with highly connected structures. The first approach is based on implementing F_{n,m} functions. The authors show that this class of functions can be implemented in VLSI-optimal (i.e., minimizing AT²) neural networks of small constant fan-ins. In order to estimate the area (A) and the delay (T) of such networks, the following cost functions will be used: (i) the connectivity and the number of bits for representing the weights and thresholds, for good estimates of the area; and (ii) the fan-ins and the length of the wires, for good approximations of the delay. The second approach is based on implementing Boolean functions for which the classical Shannon decomposition can be used. Such a solution has already been used to prove bounds on the size of fan-in-2 neural networks. The authors generalize that result to arbitrary fan-in, and prove that the size is minimized by small fan-in values. Finally, a size-optimal neural network of small constant fan-ins will be suggested for F_{n,m} functions.
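The second approach above rests on Shannon's decomposition, f(x1, x2, …, xn) = (¬x1 ∧ f(0, x2, …, xn)) ∨ (x1 ∧ f(1, x2, …, xn)), applied recursively to realize any Boolean function with bounded-fan-in gates. The following Python sketch (not from the paper; the function names and truth-table encoding are illustrative assumptions) shows the recursive split on one variable:

```python
# Illustrative sketch of Shannon's decomposition:
#   f(x1,...,xn) = (~x1 & f0) | (x1 & f1)
# where f0, f1 are the cofactors of f with x1 fixed to 0 and 1.
# Applying it recursively yields a circuit of fan-in-2 gates,
# the construction the paper generalizes to arbitrary fan-in.

def shannon_eval(truth_table, bits):
    """Evaluate a Boolean function by recursive Shannon decomposition.

    truth_table: list of 2**n Booleans, indexed so that x1 is the
                 most significant input bit.
    bits: list of n Boolean input values [x1, ..., xn].
    """
    if len(truth_table) == 1:          # no variables left: a constant
        return truth_table[0]
    half = len(truth_table) // 2
    f0 = shannon_eval(truth_table[:half], bits[1:])  # cofactor x1 = 0
    f1 = shannon_eval(truth_table[half:], bits[1:])  # cofactor x1 = 1
    x1 = bits[0]
    return (not x1 and f0) or (x1 and f1)            # the fan-in-2 merge

# Example: XOR of two inputs, truth table for (x1, x2) = 00, 01, 10, 11.
xor_tt = [False, True, True, False]
print(shannon_eval(xor_tt, [True, False]))   # True  (1 XOR 0)
print(shannon_eval(xor_tt, [True, True]))    # False (1 XOR 1)
```

Each recursive level consumes one variable and adds a constant number of bounded-fan-in gates, which is why the depth and size of the resulting network can be bounded as a function of the fan-in.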
 Authors:
 Beiu, V (Los Alamos National Lab., NM (United States)); Draghici, S (Wayne State Univ., Detroit, MI (United States))
 Publication Date:
 Research Org.:
 Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
 Sponsoring Org.:
 USDOE Assistant Secretary for Human Resources and Administration, Washington, DC (United States)
 OSTI Identifier:
 532531
 Report Number(s):
 LA-UR-97-1567; CONF-9709841
ON: DE97008308; TRN: AHC29721%%81
 DOE Contract Number:
 W-7405-ENG-36
 Resource Type:
 Conference
 Resource Relation:
 Conference: International conference on microelectronics for neural networks, evolution and fuzzy systems, Dresden (Germany), 24-26 Sep 1997; Other Information: PBD: [1997]
 Country of Publication:
 United States
 Language:
 English
 Subject:
 99 MATHEMATICS, COMPUTERS, INFORMATION SCIENCE, MANAGEMENT, LAW, MISCELLANEOUS; NEURAL NETWORKS; FUNCTIONS; INTEGRATED CIRCUITS; OPTIMIZATION; IMPLEMENTATION; DESIGN
Citation Formats
Beiu, V, and Draghici, S. On sparsely connected optimal neural networks. United States: N. p., 1997. Web. https://www.osti.gov/servlets/purl/532531.
@article{osti_532531,
title = {On sparsely connected optimal neural networks},
author = {Beiu, V and Draghici, S},
abstractNote = {This paper uses two different approaches to show that VLSI- and size-optimal discrete neural networks are obtained for small fan-in values. These have applications to hardware implementations of neural networks, but also reveal an intrinsic limitation of digital VLSI technology: its inability to cope with highly connected structures. The first approach is based on implementing F_{n,m} functions. The authors show that this class of functions can be implemented in VLSI-optimal (i.e., minimizing AT²) neural networks of small constant fan-ins. In order to estimate the area (A) and the delay (T) of such networks, the following cost functions will be used: (i) the connectivity and the number of bits for representing the weights and thresholds, for good estimates of the area; and (ii) the fan-ins and the length of the wires, for good approximations of the delay. The second approach is based on implementing Boolean functions for which the classical Shannon decomposition can be used. Such a solution has already been used to prove bounds on the size of fan-in-2 neural networks. The authors generalize that result to arbitrary fan-in, and prove that the size is minimized by small fan-in values. Finally, a size-optimal neural network of small constant fan-ins will be suggested for F_{n,m} functions.},
doi = {},
url = {https://www.osti.gov/biblio/532531},
journal = {},
number = {},
volume = {},
place = {United States},
year = {1997},
month = {10}
}