Instinct involves inherited behavior. How can behaviors be inherited?
How exactly does a newly hatched spider weave a perfect web characteristic of its species, without ever having seen such a web, let alone having been trained to spin one? How do butterflies know what to do? Why do dogs and cats behave as dogs and cats, and rabbits as rabbits? Instinct is one possible answer. People, especially psychologists, have long considered instinct to be an important determinant of behavior. But how does instinct work? What mechanisms enable instinct to function as it does?
We currently have a good understanding of how humans learn via memory formation. This knowledge can help us understand how instinct works, because rigid instinctive behavior is the polar opposite of flexible learned human behavior. There may even be a learning-memory continuum of synaptic flexibility, with humans at one end and creatures such as spiders at the other.
Both ends of a learning-memory continuum are adaptive. Experience-dependent flexibility over an extended developmental period enables humans to acquire complex skills and intelligent behaviors. Preset rigidity avoids the risks and perils of development and parenting by enabling “adult” behavior from the start—which brings us to the topic of evolution.
Instincts obviously evolved along with the rest of the organism’s body via the same principles of variation and natural selection that drive and explain phylogenetic evolution. Here we are talking about behavioral evolution as a parallel to phylogenetic evolution. We know that DNA is the genetic mechanism that mediates phylogenetic evolution, but can it also be responsible for behavioral evolution and instinct? If so, how exactly can this occur? What mechanisms might allow behavior to be inherited?
Computational models of learning and memory enable us to better understand how the central relevant biological mechanisms function because simulations animate causal relationships. Of special interest is how artificial neural networks (ANNs) known as parallel distributed processing connectionist neural network models are trained.
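To make the idea concrete, here is a minimal sketch of a parallel distributed processing connectionist network: two layers of simulated neurons joined by connection weights. The layer sizes, weight values, and input pattern are all illustrative choices, not drawn from any particular published model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Squashes a neuron's summed input into an activation between 0 and 1.
    return 1.0 / (1.0 + np.exp(-x))

n_inputs, n_hidden, n_outputs = 3, 4, 2
W1 = rng.normal(size=(n_inputs, n_hidden))   # input-to-hidden connection weights (simulated synapses)
W2 = rng.normal(size=(n_hidden, n_outputs))  # hidden-to-output connection weights

stimulus = np.array([1.0, 0.0, 1.0])         # activity pattern on the input neurons
hidden = sigmoid(stimulus @ W1)              # processing is parallel and distributed across units
response = sigmoid(hidden @ W2)              # activity pattern on the output neurons

print(response.shape)                        # (2,)
```

Every neuron in one layer influences every neuron in the next, so the network’s response to a stimulus is determined entirely by the pattern of connection weights.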
I present the details of these models in Cognitive Neuroscience and Psychotherapy: Network Principles for a Unified Theory but summarize them below. This technology is known as machine learning because computers can simulate human and animal learning and memory. It is also referred to as deep learning, in recognition of the many layers of simulated neurons in these connectionist neural network models and their remarkable ability to simulate human cognition.
Notable achievements in this regard include IBM’s Deep Blue defeating world chess champion Garry Kasparov in 1997; IBM’s Watson defeating the best human Jeopardy champions in 2011; Google DeepMind’s AlphaGo defeating world Go champion Lee Sedol in 2016; and Carnegie Mellon’s Libratus defeating top professionals at heads-up no-limit Texas hold’em in 2017. In some of these cases, machines had to learn to “understand” natural language as well as humans do. In all of these cases, machines had to discern subtle relationships, formulate strategies, and do so more effectively than the best human experts. Bengio (2016) summarized major advances in artificial intelligence in the popular magazine Scientific American. Engelking (2017) reviewed progress at the Allen Institute for Artificial Intelligence, the nation’s largest nonprofit artificial intelligence institute, founded in 2014, in Discover magazine.
Let’s take a closer look at the relevant mechanisms that enable learning via memory formation. I confine my discussion to basic principles in order to avoid getting into too many technical details. These principles pertain to neural network simulations and to biological systems. I believe that these principles are sufficient for you to generally understand how instincts might work.
The first principle is that nervous systems are networks of neural networks consisting of many neurons. Humans are estimated to have about 100 billion neurons; even the nervous systems of spiders are estimated to have about 100,000 neurons. Each human neuron connects to many other neurons, in some cases to as many as 10,000 others, and the human brain is estimated to contain 100 trillion synapses. We know that genetics (DNA) is responsible for constructing neural networks during embryological development. We also know that different creatures have different DNA, which is responsible for their different nervous systems.
The second principle is that the neurons in all species are connected to other neurons by synapses, tiny gaps into which neurotransmitters are secreted. Some of these neurotransmitters facilitate electrical conduction from one neuron to another; others inhibit it. We know that genetics (DNA) is responsible for constructing these synapses. It therefore seems quite possible that the excitatory/inhibitory properties of synapses could be set during their construction rather than modified by experience, for example by silencing or deleting the genes that enable experience-dependent synaptic flexibility. This possibility is central to the explanation of how instincts work presented below.
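A single simulated neuron makes this principle concrete: excitatory synapses can be modeled as positive weights and inhibitory synapses as negative weights, with the neuron firing only when the weighted sum of its inputs crosses a threshold. The weight and threshold values below are purely illustrative.

```python
def neuron_fires(inputs, weights, threshold=0.5):
    # Sum each input scaled by its synaptic weight:
    # positive weights are excitatory, negative weights inhibitory.
    total = sum(i * w for i, w in zip(inputs, weights))
    return total > threshold

# One excitatory synapse (+0.8) and one inhibitory synapse (-0.6).
print(neuron_fires([1, 1], [0.8, -0.6]))  # False: inhibition cancels the excitation
print(neuron_fires([1, 0], [0.8, -0.6]))  # True: excitation alone crosses the threshold
```

Whether the neuron fires depends entirely on the signs and magnitudes of its synaptic weights, which is why how those values get set, by experience or by construction, matters so much.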
A great deal of scientific knowledge supports the view that synapses are central to learning and memory (Hell & Ehlers, 2008). Flexible human learning requires that the excitatory/inhibitory properties of synapses be set via experience-dependent synaptic plasticity mechanisms. Artificial neural networks such as those involved in the computer championships mentioned above simulate primitive nervous systems using computers or neuromorphic chips. Simulated neurons are interconnected by simulated synapses called connection weights because they mathematically connect simulated neurons to one another. Inputs to these ANNs do not initially produce meaningful outputs because the simulated synapses are not yet set to their optimal levels.

Equations that simulate biological experience-dependent plasticity mechanisms are used to progressively optimize the connection weights over many simulated learning trials until the ANN finally functions effectively. The central point of interest here is that the ability of a fully trained “adult” ANN to perform its marvelous functions depends directly upon the final levels of excitation/inhibition that characterize its simulated synapses. Setting all of the connection weights to optimal levels is far too complicated to program directly; a feedback process driven by experience is required for these simulated synaptic levels to settle into optimal states, and extensive training is typically required before an ANN performs at a high level. A similar process of experience-driven synaptic modification enables every cognitive and motor skill that people acquire through learning.
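The training process just described can be sketched in a few lines. In this illustrative example, a tiny two-layer network learns the XOR relationship: its connection weights start at random values, and an error-driven update rule (here, plain gradient descent, standing in for biological plasticity mechanisms) gradually settles them into a working state over many simulated learning trials. Network size, learning rate, and trial count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # stimuli
y = np.array([[0], [1], [1], [0]], dtype=float)              # desired responses (XOR)

W1 = rng.normal(size=(2, 8))   # input-to-hidden connection weights, initially random
W2 = rng.normal(size=(8, 1))   # hidden-to-output connection weights, initially random
lr = 0.5                       # learning rate (illustrative)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1, W2):
    h = sigmoid(X @ W1)
    return h, sigmoid(h @ W2)

_, out = forward(W1, W2)
initial_error = np.mean((out - y) ** 2)   # untrained: responses are not meaningful

for trial in range(20000):                # many simulated learning trials
    h, out = forward(W1, W2)
    err = out - y                         # feedback: how wrong was the response?
    grad_out = err * out * (1 - out)
    dW2 = h.T @ grad_out                  # adjust each simulated synapse a little,
    dW1 = X.T @ ((grad_out @ W2.T) * h * (1 - h))  # in the direction that reduces error
    W2 -= lr * dW2
    W1 -= lr * dW1

_, out = forward(W1, W2)
final_error = np.mean((out - y) ** 2)
print(initial_error, final_error)         # error shrinks as the weights settle
```

No one programs the final weight values directly; they emerge from the repeated feedback loop, which is exactly the point made above.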
Instinct appears to preset synaptic connections to “adult” values during embryology. That is, the genes that construct the synapses in the neural networks mediating instinct appear also to set their functional properties to optimal excitatory or inhibitory levels, replicating what would otherwise have been achieved through a rigorous and comprehensive developmental learning phase. DNA appears to code for final “adult” synaptic values in spiders, where instinct seems to dominate behavior. Genetics appears to exert a lesser but still noteworthy effect in what are called biologically prepared behaviors, such as our fears of heights and the dark.
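Preset synapses can be sketched the same way, by hard-coding the connection weights at “construction” time so the network behaves correctly on its very first trial, with no training phase at all. The task (XOR again) and the weight values are illustrative; the point is only that fixed weights can carry a complete, working behavior.

```python
import numpy as np

def step(z):
    # All-or-none firing: 1 if summed input exceeds the threshold, else 0.
    return (z > 0).astype(float)

# Connection weights fixed "genetically" rather than tuned by experience.
W1 = np.array([[1.0,  1.0],
               [1.0,  1.0]])
b1 = np.array([-0.5, -1.5])    # thresholds for the two hidden neurons (OR and AND detectors)
W2 = np.array([[ 1.0],
               [-1.0]])         # the AND detector's synapse is inhibitory
b2 = np.array([-0.5])

def innate_response(x):
    h = step(x @ W1 + b1)       # hidden layer fires for OR and for AND
    return step(h @ W2 + b2)    # output fires for OR-but-not-AND, i.e. XOR

for x in [[0, 0], [0, 1], [1, 0], [1, 1]]:
    print(x, int(innate_response(np.array(x, dtype=float))[0]))
    # prints 0, 1, 1, 0: correct from the first trial, with no learning
```

The learned network and this preset network end up functionally equivalent; the only difference is whether the weight values were settled by experience or specified in advance, which is precisely the continuum proposed above.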
The ability of DNA to preset the properties of individual synapses across complex neural networks explains how behaviors can be inherited. It explains how spiders can weave complex webs shortly after hatching, and why dogs and cats behave differently. Genetic variation explains individual behavioral differences; in other words, why spiders of the same species may behave somewhat differently, or why individual dogs and cats differ temperamentally.
Bengio, Y. (2016, June). Machines who learn. Scientific American, 314(6), 46-51.
Engelking, C. (2017). Cultivating common sense. Discover, 38(3), 32-39.
Hell, J. W., & Ehlers, M. D. (Eds.). (2008). Structural and functional organization of the synapse. New York: Springer Science + Business Media.