de Garis, H. (1990). Building Artificial Nervous Systems Using
Genetically Programmed Neural Network Modules.
Machine Learning: Proceedings of the Seventh International
Conference, 132--139.
@InProceedings{deGaris1990,
  author    = {de Garis, Hugo},
  title     = {Building Artificial Nervous Systems Using
               Genetically Programmed Neural Network Modules},
  booktitle = {Machine Learning: Proceedings of the Seventh
               International Conference},
  pages     = {132--139},
  year      = {1990}
}
Author of the summary: Joel A. Pettis, 2006
Cite this paper for:
- Behavioural Memory is the tendency of a behaviour which was evolved in an earlier phase to persist in a later phase. [3]
- GenNets are powerful enough to provide highly time-dependent control. [2]
- By combining the actions of many GenNets, it seems likely that it will be possible to build artificial nervous systems or "nanobrains". [4]
- GenNets can be combined to form a functioning (simulated) artificial nervous system. [4]
- A Darwin Machine is a theoretical tool that is able to implement many different GenNets very quickly. [5]
- With suitable Darwin Machines, one can imagine teams of Genetic Programmers undertaking large-scale projects to build much more sophisticated nervous systems with many thousands of GenNets. [5]
- Limitations in the Genetic Programming approach include both intra-GenNet and inter-GenNet limitations. [6]
- Use Shaping to evolve the behaviour of GenNets in multiple steps rather than one big step. [6]
- Make sure the output curve has a non-zero slope during the last few cycles of a GenNet, to allow the possibility of changing the output during later cycles. [6]
- Avoid multi-function GenNets. [6]
- Keep the number of evolutionary cycles of GenNets small. [6]
- Weight your fitness measure of GenNets. [6]
- Chaos is not a problem in GenNets since low fitness scores can be quickly eliminated. [6]
- With nanotechnology, billions of nanorobots could function in parallel in a molecular environment, and report back to some central molecular processor, which determines the next generation. [6]
Genetic Programming uses the Genetic Algorithm (GA) to design neural
network modules [de Garis 1990]. The programmer specifies the network's
characteristics and input parameters, and the GA is used to find both
the signs and the values of the weights that give the network the
desired functionality. Once the weights are found, the network is deemed
a GenNet module and can be used as a component in a more complex
structure.
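As a rough illustration of this workflow (a sketch, not de Garis's actual procedure), the following Python fragment evolves the weight vector of a small fixed-architecture module against a programmer-supplied fitness function; all names and constants here are illustrative assumptions:

import random

# Sketch of Genetic Programming in de Garis's sense: the architecture is
# fixed and the GA searches only for the weights (signs and magnitudes).
N_NEURONS = 4                            # fully connected module => N*N weights
N_WEIGHTS = N_NEURONS * N_NEURONS
POP_SIZE, GENERATIONS = 50, 200

def random_individual():
    return [random.uniform(-1.0, 1.0) for _ in range(N_WEIGHTS)]

def crossover(a, b):
    cut = random.randrange(1, N_WEIGHTS)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.05):
    return [w + random.gauss(0, 0.1) if random.random() < rate else w
            for w in ind]

def evolve(fitness):
    """fitness: maps a weight vector to a score; supplied by the programmer."""
    pop = [random_individual() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP_SIZE // 2]    # keep the fitter half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)         # the evolved "GenNet" weights

Whatever weight vector survives this loop plays the role of the GenNet: the architecture never changes, only the weights do.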
GenNets are powerful enough to provide highly time-dependent control
[2]. This is shown using a simple pair of stick legs which are taught
to walk:
- The aim of the exercise is to evolve GenNets which make the stick legs move as far as possible to the right in the user-specified number of cycles and cycle time.
- The chromosomes used to evolve the weights and their signs in the GA are simple binary strings. With P binary places and N neurons, a chromosome will be N*N*(P + 1) bits long, i.e. one sign bit plus P magnitude bits for each of the N*N weights (see the sketch after this list).
- Knowing the values of the angular accelerations, and knowing the values of the angles and the angular velocities at the beginning of a cycle, one can calculate the values of the angles and angular velocities at the end of the cycle.
- The quality criterion is the velocity of the stick legs moving to the right, where rightward distances are non-negative.
- A series of experiments were performed.
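A minimal sketch of the chromosome encoding and the per-cycle physics update, assuming one sign bit plus P magnitude bits per weight and simple Euler integration; the constants and helper names are illustrative, not taken from the paper:

N, P = 5, 8                              # N neurons, P magnitude bits per weight
CHROMOSOME_LENGTH = N * N * (P + 1)      # one sign bit + P bits per weight

def decode_weights(bits):
    """Turn a binary chromosome (list of 0/1 ints) into an N x N weight matrix."""
    weights = []
    for i in range(N * N):
        chunk = bits[i * (P + 1):(i + 1) * (P + 1)]
        sign = -1.0 if chunk[0] else 1.0
        magnitude = sum(b << k for k, b in enumerate(chunk[1:])) / (2 ** P - 1)
        weights.append(sign * magnitude)
    return [weights[r * N:(r + 1) * N] for r in range(N)]

def step(angles, velocities, accelerations, dt):
    """One cycle: from the angular accelerations and the angles/velocities at
    the start of the cycle, compute their values at the end of the cycle."""
    velocities = [v + a * dt for v, a in zip(velocities, accelerations)]
    angles = [th + v * dt for th, v in zip(angles, velocities)]
    return angles, velocities

def fitness(right_distance, n_cycles, cycle_time):
    """Quality criterion: rightward velocity, with leftward motion scoring zero."""
    return max(right_distance, 0.0) / (n_cycles * cycle_time)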
In the first experiment, no constraint was imposed on the motion of
the stick legs. The resulting motion was un-lifelike but the legs
still managed to learn to move to the right.
In the second experiment, the hip joint was restricted to remain above
the floor. The evolution was slower, and the motion was almost as
un-lifelike as in the first experiment.
In the third experiment, a full set of constraints was imposed to
ensure a lifelike walking motion. The result was that the legs tried
to take the longest steps possible and in turn "did the splits", which
stopped the evolution. This was a valuable lesson on the importance of
shaping, i.e. evolving the behaviour in phases. In the next phase the
quality measure was the product of the number of net positive steps
taken to the right and the distance covered. The resulting GenNet was
then used in the third phase, which used the distance covered as the
quality measure in order to increase the step size. The result was a
definite stepping motion with long strides.
The success of these experiments raises the prospect of building
artificial nervous systems ("brain building"), i.e. combining functional
and control GenNets to build simple brains.
By combining the actions of many GenNets, it seems likely that it will
be possible to build artificial nervous systems or "nanobrains."
A concrete proposal is presented showing how GenNets can be combined
to form a functioning artificial nervous system; however, the
implementation has not been completed.
A lizard-like creature named LIZZY was chosen to illustrate this
endeavour:
- It consists of a wire-frame body with four legs and a fixed antenna where the head should be.
- It is capable of reacting to 3 kinds of creatures: mates, predators and prey.
- Each creature emits a sinusoidal signal of a characteristic frequency that is detected by LIZZY's antenna.
- The amplitudes of the signals decrease inversely as a function of distance.
- Once a signal becomes large enough, LIZZY executes an appropriate sequence of actions depending on the kind of creature detected. If it is a prey, LIZZY moves towards it, stops, and pecks at it. If it is a predator, LIZZY moves away from it, and if it is a mate, LIZZY moves towards it and mounts it.
- In order to execute these behaviours, a detailed circuit of GenNets is designed and is shown in FIG. 4.
- It is assumed that the object is always placed initially in front of LIZZY's body, so if the signal strength on the left antenna is larger than that on the right, and if the object is a prey, then LIZZY is to turn towards it by rotating clockwise (or anticlockwise if the object is a predator); a sketch of this steering rule follows the list. Eventually the two signal strengths will become approximately equal and LIZZY will move forward.
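A minimal sketch of that steering rule, assuming hypothetical left/right signal-strength inputs and an illustrative balance tolerance; the action names are placeholders for the corresponding GenNets in FIG. 4:

def steer(creature_type, left_signal, right_signal, balance_tolerance=0.05):
    """Choose LIZZY's next action from the two antenna signal strengths.
    Amplitude falls off inversely with distance, so a larger signal means the
    emitting creature is closer / more toward that side."""
    toward = creature_type in ("prey", "mate")   # approach prey and mates
    if abs(left_signal - right_signal) <= balance_tolerance:
        return "move_forward"                    # signals roughly equal: go straight
    if left_signal > right_signal:
        # Left signal stronger: turn toward the object (clockwise, following the
        # convention in the text) for prey/mates, anticlockwise for predators.
        return "rotate_clockwise" if toward else "rotate_anticlockwise"
    # Right signal stronger: mirror the rule.
    return "rotate_anticlockwise" if toward else "rotate_clockwise"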
The power and fun of GenNets is that one can evolve a motion without
specifying in detail how the motion is to be performed.
Implementing many different GenNets makes one conscious of the need
for Genetic Programming tools. Initially this could take the form of a
software package, but in the age of VLSI one can readily imagine VLSI
chips being designed to perform the same task as the software package,
but much faster. De Garis calls this tool a Darwin Machine. [5]
With suitable Darwin Machines, one can imagine teams of Genetic
Programmers undertaking large scale projects to build much more
sophisticated nervous systems with many thousands of GenNets. [5]
There are, however, some limitations in the Genetic Programming
approach. They fall into two types: intra-GenNet and inter-GenNet
limitations [6]. Lessons to be learned about intra-GenNet limitations:
- Use Shaping to evolve the behaviour in multiple steps rather than one big step (a minimal sketch follows this list) [6]
- Make sure the output curve has a non-zero slope during the last few cycles to allow the possibility of changing the output during later cycles [6]
- Avoid multi-function GenNets [6]
- Keep the number of evolutionary cycles small [6]
- Weight your fitness measure [6]
- Chaos is not a problem since low fitness scores can be quickly eliminated [6]
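The Shaping lesson can be made concrete with a small sketch: instead of evolving the whole behaviour against a single fitness measure, the population is evolved in phases, each phase seeded with the previous one and judged by its own measure (as in the stick-legs experiment: steps first, then stride length). The helper passed in as evolve_one_phase is assumed to be any single-measure GA loop, e.g. the one sketched earlier:

def shape(initial_population, phase_fitnesses, evolve_one_phase):
    """Shaping sketch: evolve the behaviour in stages rather than one big step.
    Each stage is seeded with the previous stage's population and judged by its
    own fitness measure, e.g. [steps_times_distance, distance_covered] for the
    stick legs. evolve_one_phase(population, fitness) is any GA loop."""
    population = initial_population
    for fitness in phase_fitnesses:
        population = evolve_one_phase(population, fitness)
    return population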
Inter-GenNet limitations are even more of a challenge. How would one
know how to put together a machine with billions of GenNets? De Garis
wants to try three things:
- Give GenNets the capacity to learn
- Attempt to Genetically Program robots
- Attempt to get the Genetic Algorithm itself to connect the
GenNets
With these things in place, one can imagine populations of simulated
"creatures" competing in an environment consisting of other creatures,
all of which are "constructed" with the GA. The fitness measure would
be truly Darwinian. [6]
With nanotechnology, billions of nanorobots could function in parallel
in a molecular environment, and report back to some central molecular
processor, which determines the next generation. [6]
Summary author's notes:
- the page numbers are from a preprint version.