

A Fetus and Infant Developmental Scenario:

Self-organization of Goal-directed Behaviors

Based on Sensory Constraints

Yasunori Yamada*,**  Hiroki Mori**  Yasuo Kuniyoshi*,**

*The University of Tokyo, 7-3-1, Hongo, Bunkyo-ku, Tokyo, 113-8656, Japan
{y-yamada, hiroki, kuniyosh}@isi.imi.i.u-tokyo.ac.jp
**JST ERATO Asada Synergistic Intelligence Project

Abstract

To gain a synthetic understanding of the emergence of babies' goal-directed behaviors that are important for early development, we constructed an early developmental scenario for self-organization of these behaviors. This scenario is guided by a principle of sensorimotor integration based on sensory constraints. We constructed a neural model to represent our proposed scenario with simplified but essential and realistic biological features. To test this early developmental scenario, we performed simulations with musculoskeletal body models. In a series of experiments we found that sensory constraints triggered two important developmental transitions. First, in a fetus model, the non-uniform distribution of tactile sensors generated hand-face contact behaviors as a result of the self-organizing process. Second, in an infant model, the limitation of the visual field produced hand regard behaviors, also in a self-organizing manner. Our results suggest that sensory constraints may drive goal-directed behaviors and consequently trigger human early development in an embodied and self-organized manner.

1. Introduction

In observing the behavior of fetuses or infants, it is not difficult to infer underlying goals or purposes. For instance, hand-face contact behaviors have been observed from 10 weeks postmenstrual age (de Vries et al., 1982). Furthermore, it has been found that newborn infants spend up to 20 percent of their waking hours displaying these behaviors (Korner and Kraemer, 1972). In infants between 2 and 5 months of age, hand regard behaviors (i.e., moving hand(s) in front of the face) are frequently observed (Rochat, 2001; Asada et al., 2009). These behaviors can be considered to represent 'goal-directed' actions.

Converging developmental studies have emphasized the importance of these goal-directed behaviors for cognitive development. When spontaneously touching their own faces, infants and fetuses are potentially experiencing a sensorimotor and perceptual event. The intermodal and double-touch experience involved in hand-face contact behavior is important for uniquely specifying one's own body as a differentiated agent in the environment (Rochat, 2001). Hand regard behaviors provide an opportunity for exploring the relationship between vision and proprioception. Rochat suggested these types of body exploration may provide a foundation for the development of self-knowledge (Rochat, 2001).

Although recent studies have suggested the importance of goal-directed behaviors for early development, few studies have examined the underlying mechanisms generating these behaviors.

The purpose of the present study was to reveal the underlying mechanisms generating important goal-directed behaviors in human babies. We proposed an early developmental scenario for self-organization of goal-directed behaviors from sensory constraints. We examined the early developmental scenario by conducting simulations based on musculoskeletal body models. We showed that sensory information flow triggered two important developmental transitions. These results suggest that goal-directed behaviors can be generated as a result of self-organizing phenomena without any goal or purpose.

2. Early developmental scenario

2.1 Self-organization hypothesis

We hypothesized that goal-directed behaviors are the result of a process of self-organization guided by a principle of sensorimotor integration, based on sensory constraints. We believe that sensory constraints endow movements with tendencies that reflect those constraints, and guide the emergence of goal-directed behaviors in a self-organizing process. On the basis of this hypothesis, we considered that the non-uniform distribution of tactile cells, dense in the hands and face, generates hand-face contact behaviors. Similarly, limitation of the visual field may produce hand regard behaviors. These predictions are based on the notion that early human development is an adaptive process that emerges in a self-organized manner through interactions among the body, environment, and nervous system.

Johansson, B., Şahin, E. & Balkenius, C. (2010). Proceedings of the Tenth International Conference on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems. Lund University Cognitive Studies, 149.

2.2 Sensorimotor integration

Accumulating evidence from developmental research has suggested the significance of learning, beginning from the fetal period, for motor and cognitive development (Johnson, 2005). Sensorimotor integration is fundamental for this early learning process, and spontaneous movements and contingent sensations are important for this sensorimotor integration. As such, we focused on general movements (GMs) and the somatosensory modality.

GMs are spontaneous whole-body movements of human fetuses and infants (Prechtl et al., 1979), which emerge from the postmenstrual age of 9 weeks (de Vries et al., 1982). These movements consist of gross body movements of variable speed and amplitude, but lack clear patterning (Bekedam et al., 1985). Theoretically, the rich variation and complexity of GMs could provide various interactions with environments and good opportunities for sensorimotor integration with the sensory and motor information gained from the GMs. Recent studies in subjects with spinal cord injuries have suggested that neural oscillators¹ may exist in the lower spinal cord (Dimitrijevic et al., 1998). Prechtl noted the importance of the neural oscillator¹ for the neural mechanisms underlying GMs (Prechtl, 1997) because GMs have been observed in anencephalic fetuses (Visser et al., 1985). Thus, we emphasize the interaction of GMs generated by neural oscillators in the process of sensorimotor integration.

In the case of sensation, the somatosensory modality is likely to play an important role in sensorimotor integration because of the high contingency between spontaneous movement and somatosensory information, especially in a uterine environment predominantly comprised of amniotic fluid and the uterine wall. In fetal development, the somatosensory modality is the first to develop, followed by other senses such as taste, audition, and vision. By 13.5-14 weeks gestational age, reflex responses to tactile stimulation have been observed in almost all parts of the body (Bradley and Mistretta, 1975). In addition, muscle spindles (muscle length sensors) are developed by 20 weeks gestational age (Sarnat, 2003). Therefore, it might be expected that the somatosensory modality plays a central role in the early development of sensorimotor integration, starting from the fetal stage.

¹ The neural oscillator was described as a CPG (Central Pattern Generator) in the original article.

The central nervous system develops in parallel with motor and sensory development. By 17.5 days gestational age, primary afferent axons of the descending bulbospinal and corticospinal pathways are already formed, and reach the ventral horns of the lumbar spinal cord. By 20 weeks gestational age, corticospinal tract axons reach their target peripheral neurons throughout the whole body (Sarnat, 2003).

Areas of the central nervous system related to somatosensory processing include the primary somatosensory area (S1), which underlies touch and proprioception. S1 facilitates the adjustment of motor commands through projections to the primary motor area (M1), which is involved in stimulating muscle contractions. Both S1 and M1 contain somatotopic representations of the body, maintaining its spatial organization. Individual areas in these somatotopic maps are not directly proportional to the size of the part of the body they correspond to. For example, the face and hands occupy disproportionately large regions of the somatosensory and motor cortex (Penfield and Boldrey, 1937). Several human studies have revealed that afferent input patterns underlie these cortical representations, determining the cortical area dedicated to an area of the body. For instance, Pascual-Leone et al. showed that Braille readers exhibit an expansion of the sensorimotor cortical representation of the right index (reading) finger compared with that of the left index (non-reading) finger and with those of the right and left index fingers of control subjects (Pascual-Leone and Torres, 1993). Furthermore, the results of Noppeney et al. suggested that spatial attention may modulate the somatotopic organization of S1 (Noppeney et al., 1999). It has been proposed that the neuromodulator acetylcholine is responsible for the plasticity of S1 induced by spatial attention (Lamour et al., 1988). Thus, sensorimotor integration in S1 and M1 is achieved through adaptive self-organizing processes acting on these cortical representations. Moreover, this self-organizing process depends not only on afferent input from the somatosensory system, but also involves spatial attention.

In the genesis of the cortical representation, to our knowledge, the point at which the representation emerges in human development has never been directly investigated. However, Rochat suggested that newborns are capable of discriminating between external and self-stimulation in observations of rooting responses (i.e., head turn towards stimulation, with mouth open and tongue movement) (Rochat, 1997). Given the body's complexity, it is unlikely that a newborn integrates somatosensory and motor information without learning. Thus, it seems reasonable to hypothesize that sensorimotor integration starts from the fetal stage.

To summarize, GMs, the somatosensory modality, and the central nervous system, which are all important elements in the early development of sensorimotor integration, are already present at the fetal stage. The self-organization of sensorimotor integration in S1 and M1 starts at the fetal stage and continues to develop through the early developmental period, on the basis of the somatosensory and motor information arising from GMs.

2.3 Early developmental scenario for self-organizing goal-directed behaviors

We proposed an early developmental scenario for the self-organization of goal-directed behavior, guided by a principle of sensorimotor integration based on sensory constraints. A model of this early developmental scenario is described below.

Human babies are actively engaged in GMs generated by neural oscillators in the lower spinal cord, which cause various types of interaction with the environment. Somatosensory information from these interactions is projected to S1, and representations of sensory experiences are formed by a self-organizing process. Furthermore, spatial attention dynamically modulates these representations. M1 receives this information from S1, and adjusts motor activity by projecting to neurons involved in motor activity. Somatosensory and motor information is integrated in M1 through a self-organizing process based on the contingency of the information. In such a process, it must be emphasized that sensory constraints inevitably affect the self-organizing process. As a consequence, the self-organization process endows movements with tendencies that reflect the sensory constraints, and enables the emergence of goal-directed behaviors.

2.4 Neural model

We developed a neural model that represents our early developmental scenario, involving simplified but biologically realistic properties (Fig. 1). All modules are integrated with time delay (transfer lag between modules) and gain parameters. These parameters were largely determined based on previous work by Kuniyoshi et al. (Kuniyoshi and Sangawa, 2006). However, with a change of the learning rule for the connections, the gain parameters from M1 neurons to the neural oscillators and the α and γ motor neurons (G_NO, G_α and G_γ) were changed.

Figure 1: Neural model. S1: primary somatosensory area model; M1: primary motor area model; Neuromodulation: neuromodulation model; Neural oscillator: neural oscillator neuron model; S0: afferent sensory interneuron model; α: α motor neuron model; γ: γ motor neuron model; Mechanoreceptor: mechanoreceptor model; Spindle: muscular sensory organ model; Tendon: Golgi tendon organ model. Arrows and filled circles represent excitatory and inhibitory connections, respectively. Thick broken lines represent all-to-all connections with plasticity.

We adopted the muscle and spinobulbar model developed by Kuniyoshi et al. based on a biological perspective (Kuniyoshi and Sangawa, 2006). The muscle model generates power from input motor commands, and then outputs sensory information via the spindle and Golgi tendon models. Activity in the spindle model codes the length and velocity of its embedded muscle. The Golgi tendon model serves as a tension sensor. A composition element of the spinobulbar model consists of the muscle, α and γ motor neurons, an afferent sensory interneuron S0, and neural oscillator models. Though the elements are not directly connected to one another, they are mutually coupled through the embodiment, and various whole-body movements emerge from the nonlinear oscillator property of the neural oscillator model.
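As a generic illustration of the kind of nonlinear oscillator referred to above, the following sketches a two-neuron mutual-inhibition oscillator in the style of Matsuoka. This is not necessarily the oscillator model used by Kuniyoshi and Sangawa (2006); the equations and all parameter values here are assumptions for illustration only.

```python
import numpy as np

def oscillator_step(x, v, dt, tau=0.1, tau_v=0.2, beta=2.5, w=2.0, c=1.0):
    """One Euler step of a two-neuron mutual-inhibition oscillator
    (Matsuoka-style sketch). x: membrane states, v: adaptation states.
    Each neuron receives tonic input c and is inhibited by the other
    neuron's firing rate y = max(x, 0) and by its own adaptation."""
    y = np.maximum(x, 0.0)
    dx = (-x - beta * v - w * y[::-1] + c) / tau
    dv = (-v + y) / tau_v
    return x + dt * dx, v + dt * dv

x, v = np.array([0.1, 0.0]), np.zeros(2)
out = []
for _ in range(5000):                       # 5 sec. at dt = 1 ms
    x, v = oscillator_step(x, v, dt=0.001)
    out.append(max(x[0], 0.0) - max(x[1], 0.0))
```

With mutual inhibition strong enough to destabilize the symmetric equilibrium (here w = 2.0), the two neurons alternate, producing a rhythmic antagonist-like drive without any rhythmic input.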

As a tactile cell, we focused on the Merkel cell, a type of cutaneous mechanoreceptor that mainly detects pressure. We used a Merkel cell model that is a low-pass filter with a cutoff frequency of 50 Hz (Freeman and Johnson, 1982). Tactile information is projected to the S1 model through the S0 model.
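As a rough numerical illustration of such a low-pass tactile model, a first-order discrete filter with a 50 Hz cutoff can be sketched as follows; the filter order and the discretization scheme are assumptions, not details taken from the paper.

```python
import math

def lowpass_step(y_prev, x, dt, fc=50.0):
    """One step of a first-order low-pass filter with cutoff fc [Hz].
    Illustrative Merkel-cell-like response: high-frequency pressure
    fluctuations are attenuated, slow pressure changes pass through."""
    alpha = dt / (dt + 1.0 / (2.0 * math.pi * fc))
    return y_prev + alpha * (x - y_prev)

# Example: response to a unit step of pressure sampled at 1 kHz.
dt, y = 0.001, 0.0
trace = []
for _ in range(100):
    y = lowpass_step(y, 1.0, dt)
    trace.append(y)
```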

The model of S1 and M1 is based on Chen's work (Chen, 1997). This model consists of self-organizing maps with continuous dynamics, whose neurons are arranged in a faveolate structure on a plane (Fig. 2). A neuron of the S1 model receives input from all S0 neurons, whose activity codes somatosensory information. Each neuron in the M1 model is also fully connected to all neurons involved in motor activity, i.e., the neural oscillator and the α and γ motor neurons. All of these connections have plastic properties. We adopted a covariance learning rule, as shown below (Dayan and Abbott, 2001).

Figure 2: Topology of neurons of the S1 and M1 models. Red neurons belong to the nearest group N_{i,1} to neuron i, and blue neurons belong to the second nearest group N_{i,2} to neuron i.

dW/dt = η (y − ȳ)(x − x̄)^T    (1)

w_i ← w_i / ‖w_i‖    (2)

Here, W = [· · ·, w_i, · · ·]^T is the connection weight matrix, and w_i is the connection weight vector to output neuron i. x and y are the input and output vectors, x̄ and ȳ are their average vectors over 10 sec., and η is a learning coefficient. η_S0,S1 and η_M1 are the learning coefficients of the connections from S0 to S1 and from M1 to the neural oscillator, α, and γ neurons. We also updated I_i, which denotes the input to neuron i. For the input from S0 to S1, I_i is defined as follows.

I_i = (x · w_i) / ‖x‖    (3)

S1 neuron models also receive input from adjacent neurons, for which I_i is defined as follows.

I_i = (1/n_{N_{i,1}}) Σ_{k∈N_{i,1}} y_k − (1/n_{N_{i,2}}) Σ_{l∈N_{i,2}} y_l    (4)

Here, N_{i,1} and N_{i,2} are the groups of neurons first and second nearest to neuron i (Fig. 2), and n_{N_{i,1}} and n_{N_{i,2}} are the numbers of neurons belonging to N_{i,1} and N_{i,2}. In all other cases, I_i is defined as follows.

I_i = G (x · w_i) / Σ_{k∈N} |x · w_k|    (5)

Here, N is the group composed of all output neurons, and G is the gain parameter of each connection.

We modeled the plasticity of S1 under the neuromodulator as an adjustment of η_S0,S1.

η_S0,S1 ← G_η · η_S0,S1    (6)

Here, G_η is a gain parameter and is usually set to 1.0.
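The covariance rule with weight normalization (Eqs. 1-2) and the S0-to-S1 input (Eq. 3) can be sketched numerically as below. The Euler step of size one, the array shapes, and the random test data are illustrative assumptions.

```python
import numpy as np

def covariance_update(W, x, y, x_bar, y_bar, eta):
    """Covariance learning rule (Eq. 1) followed by per-row weight
    normalization (Eq. 2), applied as a single discrete update."""
    W = W + eta * np.outer(y - y_bar, x - x_bar)   # dW/dt = eta (y - ybar)(x - xbar)^T
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W / np.maximum(norms, 1e-12)            # w_i <- w_i / ||w_i||

def s1_input(W, x):
    """Input I_i to each S1 neuron from S0 (Eq. 3): x . w_i / ||x||."""
    return W @ x / max(np.linalg.norm(x), 1e-12)

rng = np.random.default_rng(0)
W = covariance_update(rng.standard_normal((4, 6)),
                      rng.standard_normal(6), rng.standard_normal(4),
                      np.zeros(6), np.zeros(4), eta=0.01)
I = s1_input(W, np.ones(6))
```

Because each w_i is renormalized to unit length, the Eq. (3) input is bounded by ±1 (Cauchy-Schwarz), which keeps the map dynamics well-scaled regardless of weight growth.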

Figure 3: 1-joint model. The blue link is the upper link; the red link is the lower link and is fixed to the ground.

Figure 4: Fetus and infant models. Red strings represent muscles and green circles represent joints.

3. Experiments

We predicted that our neural model would generate a tendency for movements and drive goal-directed behaviors based on sensory constraints, through embodied interactions in a self-organizing manner.

To examine our early developmental scenario, we conducted simulation experiments involving two goal-directed behaviors, hand-face contact and hand regard behaviors, using the Open Dynamics Engine (Smith) for simulating rigid body dynamics.

We used three musculoskeletal body models: a 1-joint model (Fig. 3) (Kuniyoshi and Sangawa, 2006), a fetus model, and an infant model (Fig. 4) (Mori and Kuniyoshi, 2010). We used the 1-joint model to represent a clearly understandable simple phenomenon. The 1-joint model contains two cylindrical rigid links, one joint, and 12 muscles. One of the links is fixed to the ground and referred to as the lower link, while the other is referred to as the upper link. The two links are connected by a spherical joint that bends freely up to 30° in any direction from vertical. In the fetus and infant models, the size, mass, and moment of inertia of each body part, and the joint angle limits, were determined to match those of corresponding human babies. The fetus and infant models were designed to correspond to a fetus of 20 weeks gestational age and an infant of 5 months of age. These models contain 198 muscles in the whole body, excluding the finger and face muscles. Furthermore, to obtain appropriate sensory information during whole-body movements including contacts, the body shape was modeled with a triangle mesh.

Amniotic fluid and the uterine wall are important features of the uterine environment in which fetal learning takes place. As such, we adopted the amniotic fluid and uterine wall models produced by Mori et al. (Mori and Kuniyoshi, 2010). The pressure at each tactile point of a body model is calculated based on these models and input to a physics simulator. Each input to a tactile cell model is a value reflecting the pressure of a contact, and in the case of the uterine environment, we added the pressures of the amniotic fluid and uterine wall models. The contact pressure at each tactile point is determined by distributing a contact force calculated by the physics simulator. When object j, having tactile point i, contacts object k at contact point l, the contact force F_{j,k,l} is calculated by the physics simulator. Let f_i denote the contact pressure of neuron i. Then f_i is defined as follows.

d_{i,k} = 0.02 − l_{i,k}  (l_{i,k} < 0.02);  0  (l_{i,k} ≥ 0.02)    (7)

c_{i,k,l} = −F_{j,k,l} · n_i A_i d_{i,k}  (F_{j,k,l} · n_i < 0);  0  (F_{j,k,l} · n_i ≥ 0)    (8)

f_i = (1/A_i) Σ_k Σ_l c_{i,k,l} / Σ_m c_{m,k,l}    (9)

Here, l_{i,k} is the distance between tactile point i and object k, and A_i and n_i are the assigned area and unit direction vector of tactile point i.
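A numerical sketch of Eqs. (7)-(9) for a single contact force is given below. The array shapes and the example values are assumptions, and the sum over multiple contacts (k, l) follows the same pattern.

```python
import numpy as np

def tactile_pressures(F, n, A, l):
    """Distribute one contact force F (3-vector) over tactile points.
    n: (m, 3) unit normals, A: (m,) assigned areas, l: (m,) distances
    from each tactile point to the contacting object. Sketch of
    Eqs. (7)-(9) for a single contact pair (one k, one l)."""
    d = np.where(l < 0.02, 0.02 - l, 0.0)          # Eq. (7): proximity weight
    dot = n @ F
    c = np.where(dot < 0.0, -dot * A * d, 0.0)     # Eq. (8): only pushing contacts count
    total = c.sum()
    if total <= 0.0:
        return np.zeros_like(c)
    return c / total / A                            # Eq. (9): normalized, per unit area

F = np.array([0.0, 0.0, -1.0])                      # force pressing straight down
n = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0], [1.0, 0.0, 0.0]])
A = np.array([1e-4, 1e-4, 1e-4])
l = np.array([0.0, 0.05, 0.0])                      # second point beyond the 0.02 m range
f = tactile_pressures(F, n, A, l)
```

Only the first point receives pressure: the second is too far from the contact (d = 0) and the third faces perpendicular to the force (F · n = 0).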

3.1 Hand-face contact behaviors

We used these models to test our hypothesis that the non-uniform distribution of tactile cells, denser in the face and hands, would generate hand-face contact behaviors as a result of self-organization. First, the results of a 1-joint model experiment showed that tactile distribution guided the organization of motion. Next, we conducted an experiment with a fetus model.

3.1.1 Experiment with 1-joint model

We investigated the effect of tactile distribution on the development of motion. In the experiment, we compared 1-joint models in a uterine environment with the uniform and non-uniform distributions of tactile sensors made by Mori et al. (Fig. 5) (Mori and Kuniyoshi, 2010). We set a two-dimensional 4 × 4 grid of neurons in the S1 and M1 models, and set parameters to G_NO = 0.5, G_α = 0.2, G_γ = 0.1, η_S0,S1 = 0.001, and η_M1 = 0.0005.

Figure 5: Tactile distribution and muscle configuration of the 1-joint model (Mori and Kuniyoshi, 2010): (a) uniform tactile distribution; (b) non-uniform tactile distribution. White circles are tactile points and green strings are muscles. The total number of tactile sensors is 96.

Figure 6: Direction probability distributions of 1-joint models with (a) uniform and (b) non-uniform distribution of tactile sensors after learning.

We sought to determine how an agent's tactile distribution might affect its emergent movement behavior. To do so, we investigated the motion generated given different tactile distributions, and measured the resulting preferences in direction of motion. This was computed from the direction of the centroid of the upper link relative to the centroid of the lower link in the horizontal plane.

The direction probability distributions taken during self-organized activity, for a simulation time period between 70,000 and 80,000 sec., for a uniform and a non-uniform tactile distribution are shown in Fig. 6. The results revealed no preferential direction in the case of a uniform tactile distribution. However, in the case of a non-uniform distribution, we observed a specific motion preference directed toward the area in which tactile sensors were denser.
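The direction preference measure described above can be sketched as a normalized histogram of the horizontal offset angle between the two centroids. The bin count and the synthetic trajectory below are illustrative assumptions.

```python
import numpy as np

def direction_histogram(upper_xy, lower_xy, n_bins=16):
    """Direction probability distribution of the upper link's centroid
    relative to the lower link's centroid in the horizontal plane."""
    v = np.asarray(upper_xy) - np.asarray(lower_xy)   # horizontal offset per sample
    angles = np.arctan2(v[:, 1], v[:, 0])             # direction in [-pi, pi]
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    return hist / hist.sum()                          # normalize to probabilities

# A trajectory biased toward +x should yield a peaked distribution.
t = np.linspace(0, 2 * np.pi, 1000)
upper = np.stack([1.0 + 0.2 * np.cos(t), 0.2 * np.sin(t)], axis=1)
p = direction_histogram(upper, np.zeros_like(upper))
```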

3.1.2 Experiment with fetus model

To further test our hypothesis, we conducted similar comparisons with fetus models (Fig. 7), using a uniform tactile distribution with constant density and a non-uniform tactile distribution whose density was based on human two-point discrimination (Fig. 8). We constructed the tactile distributions based on the work of Mori et al. (Mori and Kuniyoshi, 2010). We set a two-dimensional 6 × 12 grid of neurons in the S1 and M1 models, and set parameters to G_NO = 1.0, G_α = 1.0, G_γ = 0.5, η_S0,S1 = 0.01, and η_M1 = 0.01.

Figure 7: Overview of simulation with fetus model in uterine environment. Red strings represent muscles and the blue sphere represents the uterine wall.

Figure 8: (a) Uniform and (b) non-uniform distribution of tactile sensors in the fetus models (764 tactile sensors in total).

We investigated the time ratio of hand-face contact behaviors every 200 sec. during 0-2000 sec. and 18,000-20,000 sec. (Fig. 9). The face region was defined as in Fig. 10. When the distance between the hand centroid and the face region was less than or equal to 0.03 m, we considered the movement to constitute hand-face contact. No difference in the time ratio before and after learning was observed with a uniform tactile distribution. However, when the tactile distribution was non-uniform, the time ratios of both hands showed a clear increase over the same learning period.
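The time-ratio measure used here (fraction of time with the hand within 0.03 m of the face, per 200-sec. window) can be sketched as follows; the function name, sampling step, and synthetic positions are illustrative assumptions.

```python
import numpy as np

def contact_time_ratio(hand_pos, face_pos, dt, window=200.0, thresh=0.03):
    """Fraction of samples per window in which the distance between the
    hand centroid and the face region is <= thresh (hand-face contact)."""
    d = np.linalg.norm(np.asarray(hand_pos) - np.asarray(face_pos), axis=1)
    contact = d <= thresh
    per_window = int(round(window / dt))
    n = len(contact) // per_window
    return contact[:n * per_window].reshape(n, per_window).mean(axis=1)

# Synthetic check: hand in contact range for half of every window.
face = np.zeros((400, 3))
hand = np.zeros((400, 3))
hand[::2, 0] = 0.1                      # every other sample: hand 0.1 m away
ratios = contact_time_ratio(hand, face, dt=1.0, window=200.0)
```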

3.2 Hand regard behaviors

We examined our hypothesis that the limitation of the visual field would lead to the emergence of hand regard behaviors as a result of self-organization. We reflected this visual constraint using a neuromodulation model. Thus, we first examined whether the neuromodulation model could guide the organization of motion using a 1-joint model, and then conducted an experiment with an infant model.

Figure 9: Time ratio of hand-face contact behaviors before and after learning (0-2000 sec. vs. 18,000-20,000 sec., left and right hands) in fetus models with uniform and non-uniform distribution of tactile sensors.

Figure 10: Face area in fetus model. The area surrounded by the blue line is the face area.

Figure 11: Direction probability distributions of 1-joint models with (a) uniform and (b) non-uniform neuromodulation after learning.

3.2.1 Experiment with 1-joint model

In the experiment, we compared the 1-joint model in situations with uniform and non-uniform neuromodulation. In the uniform neuromodulation model, G_η was constantly 1.0. In contrast, in the non-uniform neuromodulation model, G_η was changed based on the upper-link direction (Fig. 11). We set a two-dimensional 3 × 3 grid of neurons in the S1 and M1 models, with parameters set to G_NO = 1.0, G_α = 0.5, G_γ = 0.2, η_S0,S1 = 0.0005, and η_M1 = 0.0001.

To determine how neuromodulation influences motion behavior, we investigated the motion generated by the two models in the same manner as in the 1-joint model experiment analysis above.

Figure 12: Overview of simulation with infant model, and the infant camera view.

The direction probability distributions emerging during self-organized activity, for a simulation time period between 490,000 and 500,000 sec., for the models with uniform and non-uniform neuromodulation are shown in Fig. 11. No preferential direction was observed in the case of uniform neuromodulation. In contrast, when the model involved non-uniform neuromodulation, we observed an increase in the probability of movement directed toward the region where plasticity was high.

3.2.2 Experiment with infant model

To examine our hypothesis, we compared infantmodels without and with a limitation of visual fieldreflected by uniform and non-uniform neuromodu-lation models (Fig. 12). To produce more clearlyunderstandable results, we simplified the model us-ing movement in both arms only, with all other bodyparts held in a fixed position. Each arm has 37 mus-cles. In the uniform neuromodulation, G! was con-stantly 1.0. When a limitation of visual field wasimplemented with non-uniform neuromodulation, wedefined visual angle as 25!. When a hand enteredview, plasticity was high, but was otherwise very low.Let ! denote the relative angle between the line ofsight and a line connecting the hand and head cen-troids, then G! was defined as follows.

G! =

!"""#

"""$

50 (! ! 5)20 (10 < ! ! 15)1 (20 < ! ! 25)0.0001 (25 < !)

(10)

We set a two-dimensional 5 × 5 grid of neurons in S1 and M1, with parameters set to G_NO = 1.0, G_β = 0.5, G_γ = 0.2, η_{S0,S1} = 0.005, and η_{M1} = 0.001.

We investigated the time ratio of hand regard behaviors every 200 sec. during the periods 0-2000 sec. and 48,000-50,000 sec. (Fig. 13). When θ was less than or equal to 25°, we considered the action to be a hand regard behavior. Tests were two-sided, and p-values p < 0.005 were regarded as statistically significant. We did not observe any difference in the time ratio before and after learning when neuromodulation was uniform. In contrast, time ratios in both hands increased over the same learning period when neuromodulation was non-uniform.

Figure 13: Time ratio of hand regard behaviors before and after learning in infant models with uniform and non-uniform neuromodulation.
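The time-ratio measure underlying Fig. 13 can be computed by thresholding the angle series at 25° and averaging within 200-sec. windows. The following is a minimal sketch; the function name and the sampling step `dt` are assumptions.

```python
import numpy as np

def hand_regard_time_ratios(theta_deg_series, dt=1.0, window_sec=200.0):
    """Per-window fraction of time spent in hand regard (theta <= 25 deg),
    evaluated over consecutive 200-sec. windows as in the Fig. 13
    analysis. The sampling step dt is assumed."""
    theta = np.asarray(theta_deg_series, dtype=float)
    in_regard = theta <= 25.0                    # hand regard criterion
    steps = int(round(window_sec / dt))
    n_windows = len(theta) // steps
    return [float(in_regard[i * steps:(i + 1) * steps].mean())
            for i in range(n_windows)]
```

Comparing early windows (0-2000 sec.) against late ones (48,000-50,000 sec.) then yields the before/after-learning contrast reported in the text.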

4. Discussion

To increase our understanding of the emergence of goal-directed behaviors in human babies, we proposed an early developmental scenario for the self-organization of these behaviors. This scenario was guided by principles of sensorimotor integration based on sensory constraints. We constructed a neural model that represented our proposed scenario, incorporating essential and realistic (although simplified) biological features.

The results of a series of computer simulation experiments revealed that sensory constraints triggered two developmental transitions. First, we showed that tactile distribution guided the organization of motion toward areas of greater tactile sensor density. Furthermore, experiments with fetus models involving non-uniform distribution of tactile sensors revealed that hand-face contact behaviors emerged as a result of a self-organizing process. Second, we showed that the neuromodulation model that adjusts the plasticity of sensory self-organization was able to modulate motion. In infant models, we found that the limitation of visual field (reflected by a non-uniform neuromodulation model) led to the emergence of hand regard behaviors in a self-organized manner. Taken together, these results suggest that sensory constraints drive the emergence of goal-directed behaviors and guide early development in an embodied and self-organized manner.

Further research with other body models incorporating various sensory constraints and other goal-directed behaviors is required to further elucidate this issue. Nevertheless, this preliminary study constitutes a first step toward understanding the emergence of goal-directed behaviors in human babies.

Acknowledgements

We are grateful to Alex Pitti for his advice on this paper.

References

Asada, M., Hosoda, K., Kuniyoshi, Y., Ishiguro, H., Inui, T., Yoshikawa, Y., Ogino, M., and Yoshida, C. (2009). Cognitive developmental robotics: A survey. IEEE Transactions on Autonomous Mental Development, 1(1):12–34.

Bekedam, D. J., Visser, G. H. A., de Vries, J. I. P., and Prechtl, H. F. R. (1985). Motor behaviour in the growth retarded fetus. Early Human Development, 12:155–165.

Bradley, R. M. and Mistretta, C. M. (1975). Fetal sensory receptors. Physiological Reviews, 55(3):352–382.

Chen, Y. (1997). A motor control model based on self-organizing feature maps. PhD thesis, University of Maryland.

Dayan, P. and Abbott, L. F. (2001). Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. The MIT Press.

de Vries, J. I. P., Visser, G. H. A., and Prechtl, H. F. R. (1982). The emergence of fetal behaviour. I. Qualitative aspects. Early Human Development, 7:301–322.

Dimitrijevic, M. R., Gerasimenko, Y., and Pinter, M. M. (1998). Evidence for a spinal central pattern generator in humans. Annals of the New York Academy of Sciences, 860:360–376.

Freeman, A. W. and Johnson, K. O. (1982). A model accounting for effects of vibratory amplitude on responses of cutaneous mechanoreceptors in macaque monkey. Journal of Physiology, 323:43–64.

Johnson, M. H. (2005). Developmental Cognitive Neuroscience. Blackwell Publishing.

Korner, A. and Kraemer, H. (1972). Individual differences in spontaneous oral behavior in neonates. In Bosma, J., (Ed.), Proceedings of the 3rd Symposium on Oral Sensation and Perception, pages 335–346.

Kuniyoshi, Y. and Sangawa, S. (2006). Early motor development from partially ordered neural-body dynamics: experiments with a cortico-spinal-musculo-skeletal model. Biological Cybernetics, 95:589–605.

Lamour, Y., Dutar, P., Jobert, A., and Dykes, R. W. (1988). An iontophoretic study of single somatosensory neurons in rat granular cortex serving the limbs: a laminar analysis of glutamate and acetylcholine effects on receptive-field properties. J Neurophysiol, 60(2):725–750.

Mori, H. and Kuniyoshi, Y. (2010). A human fetus development simulation: Self-organization of behaviors through tactile sensation. In IEEE 9th International Conference on Development and Learning, pages 82–97.

Noppeney, U., Waberski, T. D., Gobbele, R., and Buchner, H. (1999). Spatial attention modulates the cortical somatosensory representation of the digits in humans. Neuroreport, 10:3137–3141.

Pascual-Leone, A. and Torres, F. (1993). Plasticity of the sensorimotor cortex representation of the reading finger in braille readers. Brain, 116:39–52.

Penfield, W. and Boldrey, E. (1937). Somatic motor and sensory representation in the cerebral cortex of man as studied by electrical stimulation. Brain, 60:389–443.

Prechtl, H. F. R. (1997). State of the art of a new functional assessment of the young nervous system. An early predictor of cerebral palsy. Early Human Development, 50(1):1–11.

Prechtl, H. F. R., Fargel, J. M., Weinmann, H. M., and Bakker, H. H. (1979). Postures, motility and respiration of low-risk pre-term infants. Dev Med Child Neurol, 21:3–27.

Rochat, P. (1997). Differential rooting response by neonates: evidence for an early sense of self. Early Development and Parenting, 6:105–112.

Rochat, P. (2001). The Infant's World. Harvard Univ. Press.

Sarnat, H. B. (2003). Functions of the corticospinal and corticobulbar tracts in the human newborn. Journal of Pediatric Neurology, 1(1):3–8.

Smith, R. Open Dynamics Engine - ODE. http://www.ode.org/.

Visser, G. H. A., Laurini, R. N., de Vries, J. I. P., Bekedam, D. J., and Prechtl, H. F. R. (1985). Abnormal motor behaviour in anencephalic fetuses. Early Human Development, 12(2):173–182.
