Specifically, with regard to neural networks, it is a state that a learning network sometimes gets into, in which the weight adjustments made for one or more training patterns simply offset the adjustments made for a previously trained pattern. The previously trained pattern never reaches its ideal desired output mapping; instead it is stuck in a less-than-ideal "local" response mapping, referred to as a local minimum. This state can sometimes be avoided by randomly jogging the connection weights (weight jogging). The underlying concept of a local minimum is often spoken of in the plural: local minima.
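A rough sketch of the idea, not taken from the source: the quartic error function below, the learning rate, the jog magnitude, and the number of jog attempts are all arbitrary illustrative choices. Plain gradient descent settles into whichever minimum's basin it starts in; randomly jogging the weight and keeping the jog only when it lowers the error can shake the search out of a local minimum.

```python
import random

# A hypothetical one-dimensional "error surface" with a shallow local
# minimum near w = 1.13 and a deeper global minimum near w = -1.30.
def error(w):
    return w**4 - 3 * w**2 + w

def gradient(w):
    return 4 * w**3 - 6 * w + 1

def descend(w, steps=200, lr=0.01):
    # Plain gradient descent: slides downhill and stops wherever the
    # slope vanishes, whether or not that point is the global minimum.
    for _ in range(steps):
        w -= lr * gradient(w)
    return w

random.seed(0)
w = descend(2.0)  # this starting point leads into the local minimum's basin
print(f"stuck:  w = {w:.3f}, error = {error(w):.3f}")

# Weight jogging: randomly perturb the weight, re-descend, and keep
# the result only if it landed lower on the error surface.
for _ in range(20):
    jogged = descend(w + random.gauss(0, 1.0))
    if error(jogged) < error(w):
        w = jogged
print(f"jogged: w = {w:.3f}, error = {error(w):.3f}")
```

In an actual network the single weight w would be a vector of connection weights, and the jog would add a small random perturbation to each of them before training resumes.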
It is generally asserted that lower is better: a lower point on the error surface represents the more correct solution, or the more desired, ideal state. This leaves a lot of loose ends, of course, since adaptive systems tend to get intertwined with philosophically unsettled issues once you begin to consider notions like correctness.