In the era of artificial intelligence, neural networks play a central role in machine learning. A neural network is an algorithm that learns to recognize patterns by analyzing a set of training examples, and is loosely modeled on the way the brain processes information. To handle tasks such as autonomous driving, robot control, and medical diagnosis, a neural network must adapt to rapidly changing conditions.
Recently, MIT researchers announced that they had designed a "liquid" neural network with a notable improvement: it can keep changing even after the training phase ends, greatly expanding the flexibility of AI systems.
The researchers drew inspiration from a tiny nematode whose nervous system contains only 302 neurons yet generates unexpectedly complex dynamics. They studied the worm's neural circuitry closely, including the process by which its neurons activate and communicate with one another through electrical impulses.
In the equations they use to construct the new network, the researchers allow the parameters to change over time based on the results of a set of nested differential equations.
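The idea of input-dependent, time-varying dynamics can be illustrated with a minimal sketch. The code below is not the researchers' implementation; it assumes a simplified liquid-time-constant-style update in which a gate `f`, computed from the current state and input, modulates each neuron's effective time constant, and the resulting differential equation is integrated with a single Euler step. All names (`ltc_step`, `W`, `b`, `tau`, `A`) are hypothetical.

```python
import numpy as np

def ltc_step(x, u, W, b, tau, A, dt=0.01):
    """One Euler step of a simplified liquid-style neuron model (illustrative only).

    x   : hidden state vector
    u   : input vector at this time step
    W,b : weight matrix and bias of the gating nonlinearity f
    tau : base time constants of the neurons
    A   : bias vector the state is pulled toward
    """
    # Input-dependent gate: because f depends on the current input u,
    # the effective time constant (1/tau + f) of each neuron changes
    # over time -- this is the "liquid" behavior described above.
    f = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, u]) + b)))  # sigmoid gate
    dxdt = -(1.0 / tau + f) * x + f * A                          # state dynamics
    return x + dt * dxdt                                         # Euler integration
```

Here the network's behavior is not fixed by `W` and `b` alone: the same weights produce different dynamics depending on the input stream, which is one way to read the claim that the parameters "change over time."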
This flexibility is the key to the study's success. The behavior of most neural networks is fixed after the training phase, which makes it hard for them to adapt to changes in the incoming data stream. The fluidity of the "liquid" network makes it more resilient to unexpected or noisy data. For example, if the perception network on a self-driving car can distinguish conditions such as clear skies from heavy snow, it can better adapt to a changing situation and maintain higher performance.
The flexibility of these networks has another advantage: they are easier to interpret. Because the liquid network gains expressiveness by changing the representation of individual neurons rather than by growing in scale, it is easier to peek into the "black box" of the network's decision-making and diagnose why it reaches a particular decision. This can help engineers understand and improve the performance of liquid networks.
So far, the "liquid" neural network has performed well in a series of tests. It is several percentage points ahead of other state-of-the-art time-series algorithms at accurately predicting future values in data sets ranging from atmospheric chemistry to traffic patterns. The researchers also note that the network's small size keeps its computational cost low during these tests, which makes further research promising.