
Unless otherwise indicated, all glossary content is:
(C) Copyright 2008-2022
Dominic John Repici
No part of this Content may be copied without the express written permission of Dominic John Repici.



Feedback

Also called Feedback Signals.

Feedback describes the practice of routing a system's output signals back to serve as input signals. These fed-back signals are usually used in tandem with other input signals, and they cause the systems that employ them to respond in Turing-indeterminate ways.

The term does not necessarily describe direct feedback, from the output of a neuron to its own input. It may also describe indirect feedback, in which the output of a given neuron is connected to other neuron(s) whose outputs eventually connect (possibly through still more neurons) back to the signal-producing neuron's input.

Finally, and possibly most importantly, the output of a given neuron or set of neurons may produce signals that affect the external environment. Changes in the external environment are then sensed by input sensors, which in turn produce signals that eventually form part of the input to the neurons that produced the initial output.
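The direct form of feedback described above can be sketched in a few lines. This is a minimal illustration, not code from Netlab: a single hypothetical neuron whose previous output is fed back as one of its inputs, used in tandem with an external input signal. The weights and bias are arbitrary assumed values.

```python
import math

def sigmoid(x):
    # Standard logistic activation function.
    return 1.0 / (1.0 + math.exp(-x))

def step(external_input, prev_output, w_in=1.0, w_fb=0.5, bias=-0.5):
    # The neuron's own previous output (prev_output) is fed back and
    # summed with the external input signal before activation.
    return sigmoid(w_in * external_input + w_fb * prev_output + bias)

# Run the neuron over several time steps; its output at step t
# becomes part of its input at step t+1.
output = 0.0
for t in range(5):
    output = step(external_input=1.0, prev_output=output)
```

Indirect and environmental feedback work the same way in principle; only the path the signal travels before returning to the neuron's input differs.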

. . . . . . .
In Neural Networks

Many traditional neural-network learning algorithms restrict network topologies to forward signaling only. That is, they do not allow feedback in the design of the neural network. This restriction produces distinct topological layers, and networks designed this way are sometimes referred to as Multilayer Perceptrons, or MLPs.
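A feedback-free, layered topology of this kind can be sketched as a forward pass in which signals move strictly from one layer to the next and are never routed back. The 2-3-1 topology and all weight values below are hypothetical, chosen only for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(layers, inputs):
    # Signals flow strictly forward, layer by layer; no neuron's
    # output is ever routed back to an earlier layer, so the
    # network graph contains no cycles.
    signals = inputs
    for weight_matrix in layers:
        signals = [sigmoid(sum(w * s for w, s in zip(row, signals)))
                   for row in weight_matrix]
    return signals

# Hypothetical 2-3-1 MLP: 2 inputs, one hidden layer of 3 neurons,
# 1 output neuron.
layers = [
    [[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]],  # hidden layer weights
    [[0.7, -0.5, 0.2]],                      # output layer weights
]
result = forward(layers, [1.0, 0.0])
```

Because each layer reads only from the layer before it, this structure is what feedback-tolerant approaches relax: they permit connections that form cycles.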

Netlab introduces a new learning algorithm, called influence learning, which is based on attraction to the influence a neuron exerts over other neurons during signal propagation. This algorithm is completely feedback-tolerant.

Also: Adaptive Feedback     Reactive Feedback     Inward and Outward Propagation


Web-based glossary software: (c) Creativyst, 2001-2022