5. Conclusion
From our exploration of the role of connectivity and symmetry in recursive neural networks, we find two regions that optimize limit-behavior storage and signal-response association: the first is composed of asymmetric/diluted networks, and the second of symmetric/fully-connected networks. We also found a third region, made of asymmetric/fully-connected networks, characterized by chaotic and glassy limit behaviors. These results leave us with the question of why adaptation and evolution selected the first region. Is it because more non-zero elements in the connectivity matrix correspond to more costly connections? Is it because fully-connected networks imply the existence of two axons between any two neurons, which would be spatially impossible, if not technically implausible, for large network sizes N? Or is it because the natural learning rules that guide the development of neural networks force them to dynamically evolve within the first region?
Lastly, to partly overcome the smallness of N, we have also analyzed the scaling properties of the main measured quantities. We found that the scaling behavior of the average number of length-1 cycles in fully-connected symmetric networks is in excellent agreement with the theoretical value found by Tanaka and Edwards (1980); thus this quantity is not biased by finite-size effects. Moreover, all the scaling laws appear highly robust over the analyzed range of small N. These observations do not guarantee that the same holds for the scaling laws of other observable quantities in diluted networks; nevertheless, they give us confidence in the extrapolation of the scaling laws observed in this paper.
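The measurement described above can be sketched with a brute-force count of fixed points (length-1 cycles). This is only a minimal illustration, not the paper's actual procedure: it assumes ±1 neurons with deterministic sign dynamics, symmetric Gaussian couplings with zero diagonal, and the Tanaka–Edwards scaling exp(alpha*N) with alpha ≈ 0.1992 for the mean number of metastable states; the model details and sample sizes used in the paper may differ.

```python
import numpy as np

def count_fixed_points(J):
    """Brute-force count of states s in {-1,+1}^N with s_i * (J s)_i > 0 for all i,
    i.e. fixed points of the parallel sign dynamics s -> sign(J s)."""
    N = J.shape[0]
    # Enumerate all 2^N spin configurations as rows of +/-1.
    configs = np.array(list(np.ndindex(*(2,) * N))) * 2 - 1
    fields = configs @ J  # local fields h_i for every configuration (J symmetric)
    return int(np.all(configs * fields > 0, axis=1).sum())

def random_symmetric(N, rng):
    """Fully-connected symmetric Gaussian couplings, zero self-coupling (assumed model)."""
    A = rng.standard_normal((N, N))
    J = (A + A.T) / 2.0
    np.fill_diagonal(J, 0.0)
    return J

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for N in (6, 8, 10):  # brute force limits us to small N
        avg = np.mean([count_fixed_points(random_symmetric(N, rng))
                       for _ in range(200)])
        print(f"N={N}: <#length-1 cycles> = {avg:.2f}, "
              f"exp(0.1992 N) = {np.exp(0.1992 * N):.2f}")
```

Comparing the sample averages across several small N against the exponential law is the kind of finite-size scaling check the text refers to; the exhaustive enumeration restricts such a sketch to N of order 20 at most.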