BACKGROUND
Neural networks are built from a basic building block called the perceptron, an algorithm that computes a linear combination of its inputs. Perceptrons are currently implemented in software, with the most performance-intensive applications accelerated by application-specific integrated circuits (ASICs). These circuits consist primarily of fused multiply-add datapaths that compute a perceptron's linear combination as quickly as possible. However, because the perceptron itself exists only as a software model, execution is only as parallel as the number of datapaths a given ASIC provides.
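The linear combination described above can be sketched in a few lines of Python; the function name, weights, and threshold activation here are illustrative, not part of the OSU technology.

```python
# Minimal sketch of the computation a perceptron performs:
# y = step(sum(w_i * x_i) + b). All names and values are illustrative.

def perceptron(weights, inputs, bias):
    # Accumulate the weighted sum using the fused multiply-add pattern
    # (acc = acc + w * x) that ASIC datapaths implement in hardware.
    acc = bias
    for w, x in zip(weights, inputs):
        acc += w * x
    # A simple step activation yields the perceptron's binary output.
    return 1 if acc >= 0 else 0

# Example: a single perceptron realizing logical AND.
print(perceptron([1, 1], [1, 1], -1.5))  # -> 1
print(perceptron([1, 1], [1, 0], -1.5))  # -> 0
```

In a software implementation, the multiply-add loop runs once per input; a hardware datapath performs the same accumulation with dedicated multiply-add circuitry.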
SUMMARY OF TECHNOLOGY
Researchers at OSU have developed a digital perceptron circuit designed at the gate level of digital logic abstraction. In contrast to current technology, which relies on software models, this technology serves as a fundamental hardware building block for neural networks. It enables neural network execution to be parallelized in hardware, allowing a given trained network to be implemented with higher performance than current approaches permit.
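The parallelization benefit can be illustrated with a hedged software sketch (this is not OSU's circuit): each perceptron in a layer reads only the shared input vector, so when every unit is a dedicated hardware block, all outputs can settle concurrently rather than being computed one datapath at a time.

```python
# Illustrative sketch: a layer of perceptrons whose outputs depend only
# on a shared input vector. In software this is a sequential loop; in a
# hardware implementation each unit is an independent circuit, so the
# whole layer evaluates in parallel.

def perceptron(weights, inputs, bias):
    acc = bias
    for w, x in zip(weights, inputs):
        acc += w * x
    return 1 if acc >= 0 else 0

def layer(units, inputs):
    # Each (weights, bias) unit is independent of the others, which is
    # what makes a one-circuit-per-perceptron hardware layout possible.
    return [perceptron(w, inputs, b) for w, b in units]

# Two independent units over the same inputs (thresholds illustrative).
units = [([1, 1], -1.5),   # acts as AND
         ([1, 1], -0.5)]   # acts as OR
print(layer(units, [1, 0]))  # -> [0, 1]
```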
POTENTIAL AREAS OF APPLICATION
MAIN ADVANTAGES
STAGE OF DEVELOPMENT