Unless otherwise indicated, all glossary content is:
(C) Copyright 2008-2015
Dominic John Repici
No part of this Content may be copied without the express written permission of Dominic John Repici.


Adaptive Signals

Adaptive signals are, typically, any signals carried on an axon (axon level) that is connected to the adaptive inputs of one or more neurons. More abstractly, the term can refer to any propagated signal (AL or CI) that is being used to effect changes to connection-weights in a neuron.

Adaptive Inputs (I-D)

Neurons in Netlab have two special input connection-points, where axons can be connected so that their signals can be used to effect weight-changes. These are called Adaptive Inputs.

[Figure: Schematic of a neuron showing adaptive-input synapses]

Adaptive inputs are also called Increase and Decrease inputs, or simply I-D inputs.
  • The Increase or 'I' input is generally used to control how much connection-strengths (represented by weights) are enhanced, and is denoted by an inward pointing arrow-head ('v') on schematics of neurons. The increase input is shown on the top of the above neuron-schematic, but it can be placed anywhere on the neuron's body.

  • The Decrease or 'D' input is generally used to control how much connection-strengths (represented by weights) are reduced, and is denoted by a small 'o' on the inside edge of the neuron body. It can be anywhere on the neuron body, but is shown at the bottom of the neuron in the above diagram (see the sketch following this list).
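
The following is a minimal Python sketch of a neuron whose weight changes are gated by two adaptive inputs, I and D. It is illustrative only; the specific update rule is an assumption, not Netlab's actual learning method, and for simplicity it uses the signed-value interpretation discussed below.

    # Illustrative sketch: a neuron whose weight adjustments are gated by two
    # adaptive inputs, 'I' (increase) and 'D' (decrease). The update rule is
    # an assumption, not Netlab's learning method.
    class AdaptiveNeuron:
        def __init__(self, weights, rate=0.1):
            self.weights = list(weights)   # signed connection weights
            self.rate = rate               # learning-rate constant

        def output(self, inputs):
            # Reactive behavior: weighted sum of the normal (reactive) inputs.
            return sum(w * x for w, x in zip(self.weights, inputs))

        def adapt(self, inputs, i_signal, d_signal):
            # Adaptive behavior: the I signal drives weight enhancement and the
            # D signal drives weight reduction, each scaled by the reactive
            # input associated with the weight.
            for k, x in enumerate(inputs):
                self.weights[k] += self.rate * x * (i_signal - d_signal)

    n = AdaptiveNeuron([0.5, -0.3])
    n.adapt([1.0, 1.0], i_signal=0.8, d_signal=0.2)
    print(n.weights)   # each weight nudged up by 0.1 * 1.0 * (0.8 - 0.2) = 0.06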

. . . . . . .
Language Ambiguities

In traditional ANNs, the terms "increase" and "decrease" carry some confusion, because the weight-values representing connection strengths often use their sign to denote connection-type. Positive weight values are used for excitatory connection strengths, and negative (-) weight values represent inhibitory connections. In both cases, the absolute value of the weight represents the connection strength.

Because of this, increasing (I) a weight does not necessarily make it more positive. In a traditional ANN, increasing an inhibitory (negative) weight will reduce the connection strength represented by that weight.

Normally, in Netlab's™ connection-strength-centric model, increasing an inhibitory weight is the same as making the signed weight-value more negative (increasing its absolute value). Likewise, reducing the connection-strength of an inhibitory connection in Netlab is the same as making the weight-value more positive in a traditional ANN. This is discussed in more detail in the glossary-entry for Connection Strength.
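
A small worked example (plain Python arithmetic, for illustration only) makes the difference concrete: applying the same "increase" of 0.2 to an inhibitory weight of -0.5 gives opposite results under the two interpretations.

    w = -0.5                    # inhibitory connection (negative signed weight)

    # Signed-value interpretation: "increase" means make more positive.
    signed = w + 0.2            # -0.3 -> the connection strength |w| got weaker

    # Connection-strength interpretation: "increase" means strengthen |w|.
    strength = -(abs(w) + 0.2)  # -0.7 -> the connection strength |w| got stronger

    print(signed, strength)     # -0.3 -0.7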

. . . . . . .
How Ambiguities Are Accommodated In Netlab

The ambiguity embodied in traditional neural-network practices is accommodated in Netlab's description language (Noodle™) by permitting the designer to specify whether a given learning method of a given weight-layer responds to adaptive cues in the traditional way. Specifically, the weights within the weight-layer can be specified to be trained as one of the following (a sketch contrasting the two follows this list):
  • signed-value based (denoted with 'PN') - the traditional practice, in which increase always means to make more positive, or

  • connection-strength based (denoted with 'CS') - in which increase will increase the absolute strength of the connection, regardless of whether it is inhibitory or excitatory.
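
The sketch below contrasts the two behaviors in Python. The 'PN' and 'CS' labels come from the description above; the function itself is illustrative and is not Noodle™ syntax.

    def apply_increase(weight, amount, mode):
        # 'PN': signed-value based -- increase always means more positive.
        if mode == 'PN':
            return weight + amount
        # 'CS': connection-strength based -- increase grows the absolute
        # strength of the connection, preserving its excitatory/inhibitory sign.
        if mode == 'CS':
            sign = -1.0 if weight < 0 else 1.0
            return sign * (abs(weight) + amount)
        raise ValueError("mode must be 'PN' or 'CS'")

    print(apply_increase(-0.5, 0.2, 'PN'))   # -0.3  (inhibitory connection weakened)
    print(apply_increase(-0.5, 0.2, 'CS'))   # -0.7  (inhibitory connection strengthened)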

. . . . . . .
Adaptive Signaling in Netlab

Signals carried on axons connected at adaptive inputs are called adaptive signals. Any given signal in Netlab, whether produced by a neuron or by another signal source (such as a sensor), can serve as both reactive and adaptive in a given neural network. Adaptive signaling and feedback mechanisms are roughly analogous to the signaling mechanisms used to achieve reinforcement learning in biological neural networks.

Occasionally the term reinforcement may also be used here to refer to adaptive signal-values, feedback, and mechanisms. Adaptive inputs are special inputs to a neuron, which carry signals that can be used by the neuron's learning method to produce an error value. The error value is used by the weight-adjustment algorithm to alter the value of the neuron's connection weights.

In other words, adaptive signals are normal signal values, such as those produced at the outputs of neurons, input nodes, or by sensors, that are used in a special way within the neural network. When serving in an adaptive capacity, signal values are not connected to the neuron's normal inputs to affect its output, but are instead used to effect changes to the neuron's connection weights.
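
As a rough illustration (assumed mechanics, not Netlab's API), the Python sketch below shows one signal value serving both roles: as a reactive input contributing to a neuron's output, and as an adaptive signal used as the error term in a weight adjustment.

    def neuron_output(weights, inputs):
        # Reactive role: the signal contributes to a weighted sum.
        return sum(w * x for w, x in zip(weights, inputs))

    source = 0.8                      # a signal from a neuron or sensor

    # Reactive use: the signal feeds a neuron's normal inputs.
    out = neuron_output([0.4, -0.2], [source, 1.0])

    # Adaptive use: the same value serves as the error term that the
    # weight-adjustment step uses to alter another neuron's weights.
    weights, inputs, rate = [0.3, -0.6], [1.0, 0.5], 0.1
    error = source
    weights = [w + rate * error * x for w, x in zip(weights, inputs)]

    print(out, weights)               # approximately 0.12 and [0.38, -0.56]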

. . . . . . .
Implicit Adaptive Signaling

In Netlab™, weight-changes can be effected, and affected, through a variety of means beyond the axon-level signals present at adaptive inputs. One such alternative mechanism for altering the value of weights in Netlab™ is CIs (Chemical Influences), which can either directly alter weight-values or affect how other factors alter weight strengths.
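
The toy Python sketch below illustrates those two modes under assumed semantics (the actual behavior of CIs is defined by Netlab, not by this code): a CI value altering a weight directly, and the same CI value modulating how another factor (here, an adaptive error term) alters the weight.

    def apply_ci(weight, ci, error=0.0, rate=0.1):
        weight += ci                  # CI directly altering the weight value
        weight += rate * ci * error   # CI modulating another weight-change factor
        return weight

    print(apply_ci(0.5, ci=0.05, error=0.3))   # approximately 0.5515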

Also: Reactive Feedback        

