
Neo-neural Networks


John J. Barton.


Old Neural Networks Reviewed: Abdi, H. (1994). "A neural network primer," Journal of Biological Systems 2, 247--281.

Design dimensions in artificial neural networks (architectural)

Recurrent Neural Networks: Some Systems-Theoretic Aspects [Eduardo Sontag]

$\frac{d\mathbf{x}}{dt}(t) = \vec{\sigma}^{(n)}(\mathbf{A}\mathbf{x}(t) + \mathbf{B}\mathbf{u}(t)), \quad \mathbf{y}(t) = \mathbf{C}\mathbf{x}(t)$ — e.g., matrix A is the synapse weights, B selects input neurons, C selects output neurons.

$\vec{\sigma}^{(n)}(x_1, \ldots, x_n) = (\sigma(x_1), \ldots, \sigma(x_n))$ — a scalar activation applied componentwise; e.g., a sigmoidal function like $\tanh(x)$.

With $\mathbf{x} \in \mathbb{R}^n$, $\mathbf{u} \in \mathbb{R}^m$, $\mathbf{y} \in \mathbb{R}^p$; presumably $n \gg m, p$ usually.
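The dynamics above can be sketched numerically with forward-Euler integration. A minimal illustration — matrix sizes, the tanh activation, and the input signal are my assumptions, not Sontag's:

```python
import numpy as np

def simulate_rnn(A, B, C, u, x0, dt=0.01, steps=1000):
    """Forward-Euler integration of dx/dt = sigma(Ax + Bu), y = Cx,
    with sigma applied componentwise (here tanh)."""
    x = x0.copy()
    ys = []
    for k in range(steps):
        x = x + dt * np.tanh(A @ x + B @ u(k * dt))
        ys.append(C @ x)
    return np.array(ys)

# Illustrative sizes: n=5 state neurons, m=2 inputs, p=1 output (n >> m, p in general)
rng = np.random.default_rng(0)
A = rng.normal(scale=0.5, size=(5, 5))   # synapse weights
B = rng.normal(size=(5, 2))              # selects input neurons
C = rng.normal(size=(1, 5))              # selects output neurons
y = simulate_rnn(A, B, C, u=lambda t: np.array([np.sin(t), 1.0]), x0=np.zeros(5))
print(y.shape)  # (1000, 1)
```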

Quantum Cellular Neural Networks
We have previously proposed a way of using coupled quantum dots to construct digital computing elements - quantum-dot cellular automata (QCA). Here we consider a different approach to using coupled quantum-dot cells in an architecture which, rather than reproducing Boolean logic, uses a physical near-neighbor connectivity to construct an analog Cellular Neural Network (CNN). [Geza Toth, Craig S. Lent, P. Douglas Tougaw, Yuriy Brazhnik, Weiwen Weng, Wolfgang Porod, Ruey-Wen Liu, Yih-Fang Huang, Department of Electrical Engineering, University of Notre Dame, Notre Dame, Indiana. Published in Superlattices and Microstructures, Vol. 20, No. 4, 473 (1996)]
Synchronization: The Computational Currency of Cognition
[Finkel LH, Yen S-C, Menschik ED: Synchronization: The Computational Currency of Cognition. ICANN 98, Proceedings of the 8th International Conference on Artificial Neural Networks Skövde, Sweden, 2-4 September 1998. Niklasson L, Boden M, Ziemke T (eds.). New York: Springer-Verlag, 1998]
Spike timing and gain control.
Cortical neurons exhibit tremendous variability in the number and temporal distribution of spikes in their discharge patterns... irregular [time between successive action potentials] arises as a consequence of a specific problem that cortical neurons must solve: the problem of dynamic range or gain control....We will refer to this as a “high-input regime” to distinguish it from situations common in subcortical structures in which the activity of a few inputs determines the response of the neuron. Consequently neural nets can be way off the mark as models for brain computation. ...Perhaps this is the crossover from sensor/actuator to computation? — the price of a reasonable dynamic range is noise. Compare to Grobstein: another natural variability. [Shadlen, MN and WT Newsome (1998). The variable discharge of cortical neurons: implications for connectivity, computation, and information coding. J. Neurosci. 18(10): 3870-3896.]
Dynamic Stochastic Synapses as Computational Units
Single excitatory synapses in the mammalian cortex exhibit binary responses. At each release site, either zero or one neurotransmitter-filled vesicles is released in response to a spike from the presynaptic neuron. When a vesicle is released, its contents cross the synaptic cleft and open ion channels in the postsynaptic membrane, thereby creating an electrical pulse in the postsynaptic neuron. The probability $p_S(t_i)$ that a vesicle is released by a synapse S varies systematically with the precise timing of the spikes $t_i$ in a spike train; the mean size of the postsynaptic response, by contrast, does not vary in a systematic manner for different spikes in a spike train from the presynaptic neuron (Dobrunz & Stevens, 1997). Moreover, the release probability varies among different release sites; that is, release probability is heterogeneous... [Wolfgang Maass, Anthony M. Zador, Salk Institute, La Jolla] Notes.
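The binary, timing-dependent release described above can be sketched as follows. This is an illustrative toy, loosely in the spirit of the Maass-Zador model: the facilitation/depression terms, time constants, and parameter values are my assumptions, not theirs:

```python
import math
import random

def release_train(spike_times, p0=0.3, f_amp=0.2, tau_f=0.1, d_amp=0.5, tau_d=0.5, seed=0):
    """Binary vesicle release for a spike train: each spike releases 0 or 1
    vesicles, with a probability p_S(t_i) that depends on spike timing.
    Facilitation (f) transiently raises the release probability after each
    spike; depression (d) transiently lowers it after each release.
    Both decay exponentially between spikes."""
    rng = random.Random(seed)
    releases = []
    f, d, last = 0.0, 0.0, None
    for t in spike_times:
        if last is not None:
            dt = t - last
            f *= math.exp(-dt / tau_f)   # facilitation decays
            d *= math.exp(-dt / tau_d)   # depression decays
        p = min(1.0, max(0.0, p0 + f - d))
        released = rng.random() < p      # zero or one vesicles
        releases.append((t, released, p))
        f += f_amp                       # every spike facilitates
        if released:
            d += d_amp                   # only a release depresses
        last = t
    return releases

for t, r, p in release_train([0.00, 0.01, 0.02, 0.2, 0.8]):
    print(f"t={t:.2f}  p={p:.2f}  release={r}")
```

Note that two identical spike trains give different release sequences for different seeds — the synapse is stochastic, but its release *probability* varies systematically with spike timing.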
Redundancy Reduction and Sustained Firing with Stochastic Depressing Synapses
More generally, depressing synapses may underlie a mode of neuronal computation in which individual postsynaptic spikes emphasize specific temporal features of individual inputs, rather than responding equally to all presynaptic spikes arriving along a particular input (Dobrunz and Stevens, 1999). Whereas simpler models of neurons consider them as nonselective integrators of their inputs, a postsynaptic neuron with decorrelating synapses preferentially reflects interesting (i.e., nonredundant) features of its individual inputs. For the example of a neuron receiving saccade model input as in Figure 6, the postsynaptic spikes produced by a neuron with depressing synapses preferentially reflect the jumps in firing rate that are the most prominent feature of this input (Goldman, 2000). In this manner, decorrelating synaptic inputs can provide a neuron with a matched filter that is tuned to the statistics of its individual inputs (Abbott et al., 1997; Maass and Zador, 1999).
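The "emphasize jumps, de-emphasize sustained input" behavior can be seen in a minimal deterministic depressing-synapse sketch (Tsodyks-Markram-style resource depletion; the parameters and spike train are illustrative assumptions, not taken from the papers cited above):

```python
import math

def depressing_response(spike_times, U=0.5, tau_rec=0.8):
    """Deterministic depressing synapse: each spike uses a fraction U of the
    available resource R, which recovers toward 1 with time constant tau_rec.
    Postsynaptic amplitude = U * R, so the response is large after a pause or
    a jump in firing rate, and small during sustained high-rate input."""
    R, last, amps = 1.0, None, []
    for t in spike_times:
        if last is not None:
            R = 1.0 - (1.0 - R) * math.exp(-(t - last) / tau_rec)  # recovery
        amp = U * R
        R -= amp        # depletion
        amps.append(amp)
        last = t
    return amps

# A high-rate burst, a long pause, then another burst:
# the first spike of each burst is emphasized (decorrelation).
amps = depressing_response([0.0, 0.05, 0.1, 0.15, 2.0, 2.05, 2.1])
print([round(a, 3) for a in amps])
```

Within each burst the amplitudes shrink spike by spike; after the pause the response recovers, so the postsynaptic neuron preferentially reflects the rate jump rather than every presynaptic spike.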
Programming Time-domain neural networks
For a network of spiking neurons that encodes information in the timing of individual spike times, we derive a supervised learning rule, SpikeProp, akin to traditional error backpropagation. [Error-backpropagation in temporally encoded networks of spiking neurons, Sander M. Bohte, Joost N. Kok, Han La Poutré]
Nonlinearity in synapse as element of computation.
In a digital computer, the basic nonlinearity is of course the transistor. In the brain, however, the answer is not as clear. Among brain modelers, the conventional view, first enunciated by McCulloch and Pitts, is that the single neuron represents the basic unit. In these models, a neuron is usually represented as a device that computes a linear sum of the inputs it receives from other neurons, weighted perhaps by the strengths of synaptic connections, and then passes this sum through a static nonlinearity. ....Experimentalists have recognized for decades that a synapse is not merely a passive device whose output is a linear function of its input, but is instead a dynamic element with complex nonlinear behavior. [doi:10.1038/81432; The basic unit of computation, Anthony M. Zador] What if the integration function on the synapse is varied, from all-pass to step function?
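The question at the end of that note can be made concrete: take the conventional weighted-sum-plus-static-nonlinearity unit and give each synapse its own transfer function. Everything here (function names, the two example synapse functions, the weights) is hypothetical, for illustration only:

```python
import math

def unit(inputs, weights, synapse_fn, activation=math.tanh):
    """Conventional model: a weighted linear sum of inputs passed through a
    static nonlinearity. Here each input additionally passes through a
    per-synapse transfer function, so the synapse itself can be nonlinear."""
    s = sum(w * synapse_fn(x) for w, x in zip(weights, inputs))
    return activation(s)

all_pass = lambda x: x                      # classic linear (all-pass) synapse
step = lambda x: 1.0 if x > 0.5 else 0.0    # step-function synapse

x = [0.2, 0.9, 0.7]
w = [1.0, -0.5, 0.8]
print(unit(x, w, all_pass), unit(x, w, step))
```

Varying `synapse_fn` between these two extremes changes what the unit computes even with the weights held fixed — the synaptic transfer function becomes a design dimension of its own, which is Zador's point.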
Review: The role of single neurons in information processing
The role of neurons in these computations has evolved conceptually from that of a simple integrator of synaptic inputs until a threshold is reached and an output pulse is initiated, to a much more sophisticated processor with mixed analog-digital logic and highly adaptive synaptic elements. [doi:10.1038/81444, The role of single neurons in information processing, Christof Koch, Idan Segev] Interesting: no reference to the work of Maass?