Dynamical Neural Network: Information and Topology
A neural network works as an associative memory device if it has a large storage capacity and the quality of retrieval is good enough. Both the learning and attractor abilities of the network can be measured by the mutual information (MI) between patterns and retrieval states. This paper searches for the optimal topology of a Hebb network, in the sense of maximal MI, using a small-world topology. The connectivity γ ranges from an extremely diluted network to a fully connected one; the randomness ω ranges from purely local to completely random neighbors. It is found that, while stability implies an optimal MI(γ,ω) at γ_opt(ω) → 0, for the dynamics the optimal topology is attained at some γ_opt > 0 whenever 0 ≤ ω < 0.3.
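The setup described above can be illustrated with a minimal sketch, assuming a Watts-Strogatz-style small-world graph (each neuron linked to γN ring neighbors, links rewired with probability ω), Hebbian couplings on the existing links, zero-temperature retrieval dynamics, and an empirical estimate of the MI between a stored pattern and the retrieved state. The parameter values (N, P, γ, ω, cue noise) are hypothetical and chosen only for illustration; this is not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def small_world_adjacency(N, gamma, omega):
    """Ring of N neurons, each linked to K ~ gamma*N nearest neighbours,
    with every link rewired to a random neuron with probability omega."""
    K = max(2, (int(gamma * N) // 2) * 2)        # even number of neighbours
    A = np.zeros((N, N), dtype=bool)
    for i in range(N):
        for d in range(1, K // 2 + 1):
            j = (i + d) % N
            if rng.random() < omega:             # rewire this link at random
                j = rng.integers(N)
                while j == i or A[i, j]:
                    j = rng.integers(N)
            A[i, j] = A[j, i] = True
    return A

def hebb_couplings(patterns, A):
    """Hebb rule on existing links: J_ij = (1/k_i) sum_mu xi_i^mu xi_j^mu."""
    J = patterns.T @ patterns / A.sum(axis=1, keepdims=True)
    return J * A

def retrieve(J, state, sweeps=20):
    """Zero-temperature sequential dynamics S_i <- sign(sum_j J_ij S_j)."""
    S = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(S)):
            h = J[i] @ S
            if h != 0:
                S[i] = np.sign(h)
    return S

def mutual_information(xi, S):
    """MI (bits) between stored pattern and retrieval state, from the
    empirical joint distribution of (xi_i, S_i) over sites."""
    mi = 0.0
    for a in (-1, 1):
        for b in (-1, 1):
            p_ab = np.mean((xi == a) & (S == b))
            if p_ab > 0:
                mi += p_ab * np.log2(p_ab / (np.mean(xi == a) * np.mean(S == b)))
    return mi

# Hypothetical parameter choices, for illustration only.
N, P = 400, 10
patterns = rng.choice([-1, 1], size=(P, N))
A = small_world_adjacency(N, gamma=0.1, omega=0.2)
J = hebb_couplings(patterns, A)
cue = patterns[0] * rng.choice([1, -1], size=N, p=[0.85, 0.15])  # noisy cue
S = retrieve(J, cue)
print("MI(pattern, retrieval) =", mutual_information(patterns[0], S), "bits")
```

Scanning such a run over a grid of (γ, ω) values would give an empirical MI(γ,ω) surface of the kind whose optimum the abstract discusses.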