Modular neural network
A modular neural network is an artificial neural network characterized by a series of independent neural networks moderated by some intermediary. Each independent neural network serves as a module and operates on separate inputs to accomplish some subtask of the task the network hopes to perform.[1] The intermediary takes the outputs of each module and processes them to produce the output of the network as a whole. The intermediary only accepts the modules' outputs; it does not respond to, nor otherwise signal, the modules, and the modules do not interact with one another.
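For concreteness, the sketch below (in PyTorch) shows one way such an architecture might be wired: two modules each receive their own slice of the input, and an intermediary layer combines their outputs without ever signalling back to them. The module sizes, the two-way input split, and the linear combining layer are illustrative assumptions, not a standard design.

    import torch
    import torch.nn as nn

    class ModularNetwork(nn.Module):
        def __init__(self, in_a=8, in_b=8, hidden=16, out_dim=4):
            super().__init__()
            # Two independent modules, each seeing only its own slice of the input.
            self.module_a = nn.Sequential(
                nn.Linear(in_a, hidden), nn.ReLU(), nn.Linear(hidden, out_dim))
            self.module_b = nn.Sequential(
                nn.Linear(in_b, hidden), nn.ReLU(), nn.Linear(hidden, out_dim))
            # The intermediary sees only the modules' outputs and never signals back.
            self.intermediary = nn.Linear(2 * out_dim, out_dim)

        def forward(self, x_a, x_b):
            out_a = self.module_a(x_a)   # subtask A, computed on its own input
            out_b = self.module_b(x_b)   # subtask B, computed on its own input
            return self.intermediary(torch.cat([out_a, out_b], dim=-1))

    net = ModularNetwork()
    y = net(torch.randn(1, 8), torch.randn(1, 8))   # combined output, shape (1, 4)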
Biological basis
As artificial neural network research progresses, it is appropriate that artificial neural networks continue to draw on their biological inspiration and emulate the segmentation and modularization found in the brain. The brain, for example, divides the complex task of visual perception into many subtasks.[2] Within a part of the brain called the thalamus lies the lateral geniculate nucleus (LGN), which is divided into different layers that separately process color and contrast, two major components of vision.[3] After the LGN processes each component in parallel, it passes the result to another region to compile the results.
Certainly some tasks that the brain handles, like vision, have a hierarchy of sub-networks. However, it is not clear whether there is some intermediary which ties these separate processes together on a grander scale. Rather, as the tasks grow more abstract, the isolation and compartmentalization between the modules break down and they begin to communicate back and forth. At this point, the modular neural network analogy is either incomplete or inadequate.
Complexity
One of the major benefits of a modular neural network is the ability to reduce a large, unwieldy neural network to smaller, more manageable components.[1] Some tasks appear to be, for practical purposes, intractable for a single neural network as its size increases. The following are benefits of using a modular neural network over a single all-encompassing neural network.
Efficiency
The number of possible connections increases at a daunting rate as nodes are added to the network. Since computation time depends on the number of nodes and their connections, any increase here has drastic consequences for processing time. As the greater task is further compartmentalized, the possible connections each node can make are limited, and the subtasks will hopefully execute more efficiently than an attempt to tackle the whole task at once.
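As a rough illustration of this growth, the snippet below compares the number of possible pairwise connections in a single 1,000-node network with the total when the same nodes are split into ten independent modules; the node and module counts are arbitrary assumptions chosen for the arithmetic.

    def possible_connections(n):
        # Undirected pairwise connections among n nodes: n * (n - 1) / 2.
        return n * (n - 1) // 2

    n_nodes, n_modules = 1000, 10
    monolithic = possible_connections(n_nodes)                         # 499,500
    modular = n_modules * possible_connections(n_nodes // n_modules)   # 10 * 4,950 = 49,500
    print(monolithic, modular)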
Training
A large neural network attempting to model multiple parameters can suffer from interference, as new data can dramatically alter existing connections or simply add confusion. With some foresight into the subtasks to be solved, each neural network can be tailored to its task. This means the training algorithm and the training data can be unique to each sub-network, and training can be carried out much more quickly. In large part this is because the number of possible combinations of relevant factors diminishes as the number of inputs decreases.
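A hedged sketch of this idea, again in PyTorch: each module is trained on its own data with its own loss and optimizer, so updates for one subtask never disturb the weights of the other. The datasets here are randomly generated placeholders and the hyperparameters are illustrative.

    import torch
    import torch.nn as nn

    def train_module(module, inputs, targets, epochs=100, lr=1e-2):
        # Each module has its own optimizer and loss; nothing is shared between modules.
        optimizer = torch.optim.SGD(module.parameters(), lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            optimizer.zero_grad()
            loss = loss_fn(module(inputs), targets)
            loss.backward()
            optimizer.step()
        return module

    # Hypothetical, separately collected datasets for two subtasks, trained in isolation.
    module_a = train_module(nn.Linear(8, 4), torch.randn(64, 8), torch.randn(64, 4))
    module_b = train_module(nn.Linear(8, 4), torch.randn(64, 8), torch.randn(64, 4))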
Robustness
Regardless of whether a large neural network is biological or artificial, it remains largely susceptible to interference at, and failure of, any one of its nodes. When subtasks are compartmentalized, failure and interference are much more readily diagnosed, and their effects on other sub-networks are eliminated because each sub-network is independent of the others.
References
- Azam, Farooq. Biologically Inspired Modular Neural Networks. PhD dissertation, Virginia Tech, 2000. http://scholar.lib.vt.edu/theses/available/etd-06092000-12150028/unrestricted/etd.pdf
- Happel, Bart and Murre, Jacob. The Design and Evolution of Modular Neural Network Architectures. Neural Networks, 7: 985-1004, 1994. http://citeseer.comp.nus.edu.sg/cache/papers/cs/3480/ftp:zSzzSzftp.mrc-apu.cam.ac.ukzSzpubzSznnzSzmurrezSznnga1.pdf/the-design-and-evolution.pdf
- Hubel, DH and Livingstone, MS. Color and contrast sensitivity in the lateral geniculate body and primary visual cortex of the macaque monkey. Journal of Neuroscience, 10: 2223-2237, 1990. http://www.jneurosci.org/cgi/content/abstract/10/7/2223
- Tahmasebi, P. and Hezarkhani, A. Application of a Modular Feedforward for Grade Estimation. Natural Resources Research, 20(1): 25-32, 2011. DOI: 10.1007/s11053-011-9135-3. http://link.springer.com/article/10.1007%2Fs11053-011-9135-3
- Engineers Solve a Biological Mystery and Boost Artificial Intelligence. http://arxiv.org/abs/1207.2743
- Tahmasebi, Pejman and Hezarkhani, Ardeshir. A fast and independent architecture of artificial neural network for permeability prediction. Journal of Petroleum Science and Engineering, 86: 118-126, 2012.