Our research, which I started in 2004, was motivated by two questions. Can we build a single electronic element – the equivalent of a transistor or switch – that performs most of the known functions of neurons in a brain? If so, can we use it as a building block to build useful computers?
Neurons are very finely tuned, and so must be the electronic elements that emulate them. I co-authored a research paper in 2013 that set out the principle of what to do. It took my colleague Suhas Kumar and others five years of careful research to obtain just the right material composition and structure to produce the necessary properties predicted from the theory.
Kumar then went a big step further and built a circuit of 20 of these elements linked together by a network of devices that can be programmed to have specific capacitances – abilities to store electrical charge. He then mapped a mathematical problem onto the capacitances in the network so that he could use the device to find the solution to a small version of a problem that is important for a wide range of modern analyses.
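A loose software analogue of encoding a problem in a network of coupled elements is the Hopfield network, in which a cost function is written into a coupling matrix and the network's dynamics settle into a low-energy state that represents the answer. The toy sketch below (my own illustration, not the actual hardware mapping used in the study) recovers a stored pattern from a corrupted starting state:

```python
import numpy as np

# Toy Hopfield network: the "problem" (here, a stored pattern) is encoded
# in the coupling matrix W, and repeated updates relax the state toward a
# minimum of the energy E(s) = -1/2 * s.W.s, recovering the pattern.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0)  # no self-coupling

state = pattern.copy()
state[0] *= -1  # corrupt one element of the pattern

for _ in range(5):  # iterate until the state stops changing
    new = np.sign(W @ state).astype(int)
    if np.array_equal(new, state):
        break
    state = new

print(state.tolist())  # relaxes back to the stored pattern
```

In the hardware version, the analogous "weights" are set physically, by programming capacitances, and the relaxation happens at circuit speed rather than in software.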
The simple example we used was to examine the possible mutations that may have occurred in a family of viruses by comparing pieces of their genetic information.
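Comparing pieces of genetic information in this way is commonly framed as an edit-distance computation: the minimum number of single-character changes (substitutions, insertions, deletions) that separates two sequences. A minimal sketch, with made-up fragments for illustration:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: fewest single-character insertions,
    deletions, and substitutions needed to turn a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # delete ca
                            curr[j - 1] + 1,             # insert cb
                            prev[j - 1] + (ca != cb)))   # substitute
        prev = curr
    return prev[-1]

# Two short DNA fragments differing by a single inserted base:
print(edit_distance("ACGTTA", "ACGATTA"))  # prints 1
```

The cost of such comparisons grows quickly with sequence length and number of sequences, which is part of why a fast analog solver is attractive.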
Why it matters
Computer performance is quickly reaching its limits, as the smallest transistors in integrated circuits are now approaching 20 atoms wide. Any smaller, and the physical principles that determine transistor behavior no longer apply. There is a high-stakes competition to see if anyone can build a much better transistor, a method of stacking transistors, or some other device that can do the jobs that currently require thousands of transistors.
This search is important because people have become used to the exponential improvements in computing capacity and efficiency over the past 40 years, and many business models and our economies are based on this expectation. Engineers and computer scientists have now constructed machines that collect enormous amounts of data. This is the ore from which the most valuable commodity, information, is refined. The volume of this data almost doubles every year, which exceeds the ability of today’s computers to analyze it.
What other research is being done in this area?
The basic theory of neuron function was first proposed by Alan Hodgkin and Andrew Huxley about 70 years ago and is still used today. It is very complex and difficult to simulate on a computer, and it was only recently that Leon Chua re-analyzed it and incorporated it into the mathematics of modern nonlinear dynamics theory.
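To see why the Hodgkin-Huxley model is expensive to simulate: each neuron requires four coupled nonlinear differential equations stepped at very fine time resolution. A minimal sketch using the standard squid-axon parameters and a simple forward-Euler integrator (my own illustration, not code from the research described here):

```python
import math

# Minimal Hodgkin-Huxley neuron: membrane voltage V plus three gating
# variables (m, h, n), all coupled and nonlinear. Units: mV, ms,
# uA/cm^2, mS/cm^2, uF/cm^2. Standard squid-axon parameter values.
C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3
ENa, EK, EL = 50.0, -77.0, -54.387

# Voltage-dependent opening/closing rates for each gate
def a_m(V): return 0.1 * (V + 40) / (1 - math.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * math.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * math.exp(-(V + 65) / 20)
def b_h(V): return 1.0 / (1 + math.exp(-(V + 35) / 10))
def a_n(V): return 0.01 * (V + 55) / (1 - math.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * math.exp(-(V + 65) / 80)

def simulate(I=10.0, dt=0.01, t_max=50.0):
    """Inject a constant current I and return the voltage trace."""
    V = -65.0  # resting potential; gates start at steady state
    m = a_m(V) / (a_m(V) + b_m(V))
    h = a_h(V) / (a_h(V) + b_h(V))
    n = a_n(V) / (a_n(V) + b_n(V))
    trace = []
    for _ in range(int(t_max / dt)):
        INa = gNa * m**3 * h * (V - ENa)  # sodium current
        IK = gK * n**4 * (V - EK)         # potassium current
        IL = gL * (V - EL)                # leak current
        V += dt * (I - INa - IK - IL) / C
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        trace.append(V)
    return trace

trace = simulate()
print(f"peak membrane voltage: {max(trace):.1f} mV")  # spikes crest above 0 mV
```

Even this single neuron needs thousands of tiny time steps per simulated millisecond; a hardware element that exhibits the same dynamics intrinsically sidesteps that cost entirely.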
I was inspired by this work and have spent a lot of time over the past 10 years learning the necessary math and figuring out how to build a real electronic device that works as the theory predicts.
There are numerous research teams around the world who are taking different approaches to building brain-like or neuromorphic computer chips.
The technological challenge now is to scale up our proof-of-principle demonstration to something that can compete with today's digital giants.
This article was republished from The Conversation by R. Stanley Williams, Professor of Electrical and Computer Engineering at Texas A&M University, under a Creative Commons license. Read the original article.
Published on October 11, 2020 – 18:00 UTC