Unless otherwise indicated,
all glossary content is:
(C) Copyright 2008-2022
Dominic John Repici
~ ALL RIGHTS RESERVED ~
No part of this Content may be copied without the
express written permission of Dominic John Repici.
Synapses are, by etymology and definition, connections. Conventionally, most synapses have been thought of as unidirectional inputs: they receive stimuli from other neurons and neuron-like processes (senses, etc.), which in turn affect the output of the neuron they belong to.
When stimulated, a synapse either increases the likelihood of a positive output on its neuron's axon (excitation) or decreases that likelihood (inhibition).
In traditional artificial neural networks (ANNs), synapses are typically implemented as signed weight values. Each weight modulates the signal arriving at its synapse, and the weighted results from many such synapses are summed to produce an axon level, which is then passed on to the neuron's output processes.
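The weighted-sum behavior described above can be sketched as follows. This is a minimal illustration, not any particular library's implementation; the function name, the simple step threshold, and the example values are all assumptions made for clarity.

```python
def neuron_output(inputs, weights, threshold=0.0):
    """Sum each input signal modulated by its signed synaptic weight.

    Positive weights excite (raise the axon level); negative weights
    inhibit (lower it). The neuron fires only if the summed axon level
    exceeds the threshold.
    """
    axon_level = sum(x * w for x, w in zip(inputs, weights))
    return 1 if axon_level > threshold else 0

# Two excitatory synapses and one inhibitory synapse:
# net axon level = 0.6 + 0.5 - 0.4 = 0.7, which exceeds 0.0, so it fires.
print(neuron_output([1.0, 1.0, 1.0], [0.6, 0.5, -0.4]))  # prints 1
```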
Netlab™ introduces a new form of synapse (or synaptic connection-point) called a multitemporal synapse. This synapse includes multiple connection-weights for a given connection, where each weight (representing a connection strength) can learn and forget at a different rate. This allows a set of fast-learning weights to quickly form detailed responses to familiar situations, driven and prompted by the blunt beginnings of correct responses maintained in the slower-learning permanent weights.
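One way to picture a multitemporal synapse is as two weights per connection, each updated and decayed at its own rate. The sketch below is hypothetical and assumes a simple linear learn/decay rule; the class name, parameters, and rates are illustrative only and do not reflect Netlab's actual implementation.

```python
class MultitemporalSynapse:
    """One connection holding two weights with different time scales:
    a fast weight that learns and forgets quickly, and a slow,
    more permanent weight that changes gradually."""

    def __init__(self, fast_rate=0.5, fast_decay=0.2,
                 slow_rate=0.01, slow_decay=0.0):
        self.fast = 0.0   # quickly formed, quickly forgotten
        self.slow = 0.0   # slowly formed, long retained
        self.fast_rate, self.fast_decay = fast_rate, fast_decay
        self.slow_rate, self.slow_decay = slow_rate, slow_decay

    def strength(self):
        # The effective connection strength combines both time scales.
        return self.fast + self.slow

    def learn(self, signal):
        # Each weight moves in response to the same signal, at its own rate.
        self.fast += self.fast_rate * signal
        self.slow += self.slow_rate * signal

    def forget(self):
        # Each weight decays toward zero at its own rate.
        self.fast *= (1.0 - self.fast_decay)
        self.slow *= (1.0 - self.slow_decay)

syn = MultitemporalSynapse()
for _ in range(5):
    syn.learn(1.0)
# Immediately after training, the fast weight dominates the response.
print(syn.fast > syn.slow)  # prints True
for _ in range(20):
    syn.forget()
# After decay, the fast detail fades while the slow weight retains
# the blunt beginnings of the learned response.
print(syn.slow > 0.0)  # prints True
```

The point of the two rates is the division of labor the entry describes: the fast weight quickly forms a detailed response to a familiar situation, while the slow weight preserves a durable trace that can prompt relearning after the fast weight has decayed.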