Ingeniería y competitividad

Print version ISSN 0123-3033

Ing. compet. vol.19 no.2 Cali July/Dec. 2017 


Identification of movement intention of gait on various terrains -a bioinspired approach-


Oscar Campo1  , Eduardo F. Caicedo2 

1Departamento de Automática y Electrónica, Universidad Autónoma de Occidente. Cali, Colombia.

2Escuela de Ingeniería Eléctrica y Electrónica, Universidad del Valle. Cali, Colombia.


In this paper, we propose an approach to the neuromotor language of the transfemoral amputee. We do this by identifying the user’s intention from the perception of both internal and external manifestations (explained in a later section) of Fixed Action Patterns (FAPs), by making use of artificial proprioception and exteroception of the prosthesis. The formalization of a General Expression of the Rhythmic Gesture, generation procedures for artificial FAPs, and a Response Algorithm for Gesture Development are presented. By identifying the user’s intention through proprioceptive and exteroceptive information, the prosthesis discriminates between repertoires of artificial FAPs and chooses the most suitable one to meet the user’s requirements. Experimental data from tests carried out on healthy and amputee individuals showed high performance in the identification of the user’s intention (97.06% of true identifications) and good tracking of gestures such as gait, walking up and down stairs, and going up and down hills, independently of the speed of execution of the gesture.

Key words: Artificial fixed action pattern; identification algorithm; user’s intention



1. Introduction

The two-century development of the conceptualization of the brain’s motor organization has had two strong, diametrically opposed branches: the first developed by William James in 1890 1, and the second by Graham Brown around 1915 2. On the one hand, James considered the functional organization of a neural system only in terms of reflexes. This would mean that the brain acts only as a black box where inputs are operated on to produce an output driven by instantaneous demands from the environment. Thus, from this point of view, sensation drives the movement, and the movement’s generation is basically a response to an external signal 1. This idea was well received in the mid-50s and prompted the study of central reflexes, synaptic transmission, and neural integration, which together have been key concepts and have played an important role in contemporary neuroscience 3.

On the other hand, Graham Brown considered that the spinal cord did not have a basically reflex organization, but that organization of this system is self-referential and based on neural circuits. These circuits promote the generation of necessary neural patterns for organized movement 4-6. In addition, Llinas and Roy 7 have suggested that organized movement is generated internally without the need for sensory input. So, reflex activity is only necessary to modulate the organized movement and not to generate it 8. According to this theory, even though locomotion is organized in an intrinsic way, the sensory information changes the rhythm in a reflex form in such a way that there is an adaptation to the environment where the motion takes place 9.

Some authors 10,11 mention Fixed Action Patterns (FAPs) as the “automatic modules” that the brain uses to produce more complex movements. These FAPs are sets of motor actions automatically activated to produce coordinated, rhythmic, and relatively fixed movements. Gait is one such FAP 11. There are neural networks that generate relatively fixed movements, known as Central Pattern Generators (CPGs) 12,13. The results of such CPG actions are the FAPs. A FAP allows a movement to progress without supervision by the Central Nervous System (CNS), freeing the brain to concentrate on other tasks that require its attention. Some degree of consciousness comes into play to regulate a FAP when, for example, there is an obstacle in the way.

Nandi and Gupta 14 have shown that current modeling algorithms for bipedal locomotion and the CPG models used for humanoid robots have serious shortcomings. These issues are related to synchronization and stability under the wide variations in the development of locomotion gestures in humans, which make them unsuitable for prosthetic applications. These results are aligned with the vision of centralized organization of motor control from the CNS. It has been noted that the mere involvement of a CPG without regulation would not allow adaptation to changing dynamics in the development of locomotion gestures 15,16. This reinforces the idea of a pattern of self-organization, also supported by Llinas 11, who affirms that gait can be considered an expression of a FAP. In this regard, a control scheme led by FAPs and the modulation of their action is taken as a metaphorical approach to be adopted as the biological inspiration for the control strategy presented in this paper. To properly incorporate knowledge of the physical environment into the decision making of the prosthesis, a new process was required: an adequate representation of the dynamics in terms of the user’s needs, the environmental conditions, and the capabilities of the prosthesis itself.

As an objective, such a representation must be embedded in the knowledge base of the prosthesis and should be comprehensive, accessible, understandable, and comparable with the actual performance of the dynamics desired by the user. Considering this objective, coordinated control by FAPs is presented in this paper as one alternative. In this context, “coordinated” refers to the joint interaction between the user and the prosthesis. Such action was intended by using the FAP as a common language for both during the development of a gesture. This paper presents an adequate representation of the FAPs for the development of gestures derived from the intention of the user, allowing the prosthesis to decide on physically feasible, safe, and better coordinated control actions. This can lead to an improvement in the performance of the user-prosthesis system in coordinated control scenarios, and it is a novel control approach.

2. Methodology

2.1 Coordinated control by FAPs

Biological organisms show behaviors such as opening their mouths to be fed, suction, or mating displays, which are developed with a minimum of sensory experience 17. In these cases, the behavior pattern is innate, and it appears essentially complete the first time the organism encounters the appropriate stimulus 18.

These behaviors are highly stereotyped, rigid, and predictable, and are not controlled by external feedback; i.e., once such a behavior is triggered, it continues to completion 19. This type of behavior is a fixed action pattern (FAP). FAPs are specific to each species: among individuals of a given species, which share morphological and physiological features, FAPs are homogeneous in the same way as the anatomical characteristics of that species.

FAPs are initiated by external stimuli known as stimulus signals. There are certain pre-established mechanisms (innate release mechanisms) in which specific areas within the brain respond to the stimulus signal. A fixed action pattern is not evident until the body meets the appropriate stimulus signal, which then activates the innate release mechanism, putting into action the sequence of movements that represents the behavior. Based on this description, the following is a formal approach to the development of concepts that allow the structuring of an expression of the gesture according to the FAP that defines it.

Let S be the system formed by the user-prosthesis pair, in which each element is a body B (Bu for the user’s body and Bp for the prosthesis’ body) that moves jointly and in a coordinated manner to ensure the development of a gesture (walking, climbing stairs, etc.).

Figure 1 shows that both bodies, the user and the prosthesis, each have a set of internal attributes (IA(Bu) and IA(Bp)). These are attributes of a particular movement to be developed, i.e., a single gesture, as described below. For the user, the internal attributes are his/her own FAP (fap_u) and attitudes (act_u) for each gesture, which together make up the intention (I) to move. For their part, the internal attributes of the prosthesis are constituted by its own artificial FAP (fap_p). Both the user’s and the prosthesis’ FAPs are connected by a Function of Gesture Identification, Φ; this function then becomes the interpreter between the FAPs of both bodies.

Figure 1 Formalization of the gesture developed by the system. 

The set of the prosthesis’ external attributes, EA(Bp), is made up of information extracted from the particular attitudes of a gesture to be developed by the user. The extraction of this information is provided by a Function of Special Features of Gesture, Θ. From this point of view, the functions Φ and Θ (Figure 1) extract, respectively, general and particular information from the user’s intention of movement. These two types of information are the inputs of a third function that allows smooth and coordinated interaction between the user and the prosthesis: the modulating function Γ. The modulating function basically constitutes the gestural synchronization link between the internal attributes of the user (movement intention) and the internal and external attributes of the prosthesis (understanding of the intention), allowing a unified gesture developed by the system.

Each concept and element presented in Figure 1 will be explained in more detail as follows. The bodies Bu and Bp perform a movement mov_k in a joint and coordinated way to ensure the development of a gesture, so that the mov_k,s end up being movements developed by S:

MOV(S) ⊆ UP (1)

Where UP is the set of all possible configurations of the system.

Definition 1: A gesture is a movement of either the body or a part of it, which is the expression of an idea, thought, emotion or intention. The body may develop different types of movements depending on the configuration in which it is involved. A possible movement of the body in general and of a particular limb can be defined and classified as a gesture in terms of the typical characteristics of that movement.

Let G(S) be the set of possible rhythmic gestures that can be developed by S, defined as a subset of the possible movements of S; thus:

G(S) ⊆ MOV(S) (2)

Where ∃G_i,s, G_j,s ∈ G(S) ∣ G_i,s ≠ G_j,s, so

G(S) = {G_1, G_2, G_3, …, G_n}_s (3)

This defines a set of FAPs with different rhythmic patterns of motor activity, which are generators of a subset of the possible movements of the body B_α. Each fap_α is designed to guide each body to special conditions of movement, and their actions cause a well-defined dynamics in the body. So

FAP(B_α) ⊆ MOV(B_α) (4)

Where ∃fap_i,α, fap_j,α ∈ FAP(B_α) ∣ fap_i,α ≠ fap_j,α, so

FAP(B_α) = {fap_1, fap_2, fap_3, …, fap_n}_α (5)

These FAPs allow the execution of rhythmic movements in S or in each body B_α. They are also a characteristic of the body, so they are part of its internal attributes (IA). A set of requirements (RG) is also defined for each gesture (G), as a subset of the external attributes (EA) of a body B_α. The requirements of each gesture come from the particular features of the environment in which the gesture is performed.


RG(B_α) ⊆ EA(B_α) (6)

Where ∃rg_i,α, rg_j,α ∈ RG(B_α) ∣ rg_i,α ≠ rg_j,α, so

RG(B_α) = {rg_1, rg_2, rg_3, …, rg_n}_α (7)

Given these requirements, the body B_α performs an Adaptation Response (AR), which generates specific movements that are particular modifications of the rhythmic dynamics defined by the FAP of each body. So, AR is defined as a subset of the possible movements of the body B_α, i.e.,

AR(B_α) ⊆ MOV(B_α) (8)


Where ∃ar_i,α, ar_j,α ∈ AR(B_α) ∣ ar_i,α ≠ ar_j,α, so

AR(B_α) = {ar_1, ar_2, ar_3, …, ar_n}_α (9)

Therefore from Eq.(4) and Eq.(9) we obtain

MOV(S) ⊇ {FAP, AR} (10)

Each gesture (walking, climbing stairs, going down stairs, etc.) performed by the system consists of a series of Typical Movements (TM) that make it unique. This is expressed as:

∀G(S)_i, G(S)_j ∃TM(S)_i, TM(S)_j ∣ TM(S)_i ≠ TM(S)_j (11)

Then, by Eq.(10) we obtain

G(S)_i = {fap_j, ar_1, ar_2, ar_3, …, ar_q, tm_1, tm_2, tm_3, …, tm_r} (12)

Equation (12) is the General Expression of the Rhythmic Gesture, which enables the description of the particularities of such gestures by incorporating the specific elements of movement. These elements make the performance of such a movement unique to a person. At the level of the user, for each gesture G_i,u of the user there is a fap_j,u that dictates the rhythmic execution of the gesture. There will also be a series of requirements rg_k that modulate that rhythmic execution according to the particular characteristics and needs of the environment of the user’s movement. Additionally, there will be a series of typical movements tm_q of the individual that give a personal stamp to the execution of the gesture. That is:

∀G_i,u ∃fap_j,u ∧ ∃rg_k ∧ ∃tm_q (13)

Ignoring the tm_q, it is possible to relate the FAP and RG through an adequate Gesture Adaptation Function Γ, and thus it can be expressed in general terms as follows:

G ≈ Γ(fap, rg_1, rg_2, rg_3, …, rg_r) (14)

A set of intentions I is now defined as a subset of internal attributes, IA of the user Bu. That is:

I(B_u) ⊆ IA(B_u) (15)

Definition 2: The intention of the user is an intention of movement, which comes from the user’s own will to develop a gesture, and it is expressed perceptibly both internally and externally. At the internal level, it occurs through neurophysiological expressions that transmit this intention to the body kinematic chain involved in the gesture. In terms of the external level, it occurs through a position (attitude) of the body.

Then, based on this definition, it is possible to perceive the intention of the user through his/her internal and external events. Thus, for each user’s intention I_i,u, a single fap_i,u and an attitude act_i,u are defined, so that:

∀I_i,u, I_j,u ∈ I(B_u), ∃fap_i,u, fap_j,u ∣ fap_i,u ≠ fap_j,u ∧ ∃act_i,u, act_j,u ∣ act_i,u ≠ act_j,u

I(B_u) = {fap_u, act_u} (16)

Therefore, each intention is composed of a neuromotor expression and a corporal attitude. The neuromotor expression itself carries information related to the gesture, i.e., the information of the user’s FAP and its modifications according to the user’s proprioceptive and exteroceptive information. The attitude is manifested through the dynamics of the user’s execution of the gesture. The prosthesis takes the information of the intention in order to choose an artificial FAP, fap_p, and the requirements of the gesture needed to serve as inputs to the modulating function Γ.

The selection of a fap_p for each gesture is accomplished by evaluating a Function of Gesture Identification, Φ. The function Φ determines the correct identification of the rhythmic component (fap_u) of the user’s intention of movement and links it to an adequate fap_p.

The output of Φ is derived from the evaluation of the general characteristics of the user’s intention of movement. Then, it is possible to relate the fap p with the user’s intention as follows:

fap_p = Φ(fap_u, act_u) (17)

Additionally, it correlates the requirements of the gesture with the attitudes of the user through a suitable function such that:

rg = Θ(act_u) (18)

where Θ is the Function of Special Features of Gesture. The output of the function Θ is obtained from the evaluation of the special features of the user’s intention of movement. So, in general, from Eq.14 we may obtain:

G = Γ(Φ, Θ) (19)

This relation is the general expression of gesture generation from the user’s intention. The same relation is expressed graphically in Figure 1.
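To make the composition in Eq. (19) concrete, the pipeline G = Γ(Φ, Θ) can be sketched in code. This is only an illustrative sketch: the function bodies, the dictionary-based repertoire, and all variable names are hypothetical, not taken from the paper.

```python
# Illustrative sketch of G = Γ(Φ, Θ); all names and data structures are hypothetical.

def phi(fap_u, act_u, repertoire):
    """Function of Gesture Identification Φ: link the rhythmic component
    of the user's intention (fap_u) to an adequate artificial FAP (fap_p)."""
    return repertoire[fap_u]

def theta(act_u):
    """Function of Special Features of Gesture Θ: extract the gesture
    requirements rg from the user's attitude."""
    return {"rg": act_u}

def gamma(fap_p, features):
    """Modulating function Γ: couple the artificial FAP with the gesture
    requirements into a unified gesture description."""
    return {"fap_p": fap_p, **features}

# A toy repertoire of artificial FAPs, keyed by the identified user FAP.
repertoire = {"gait": "fap_gait", "up_stairs": "fap_up_stairs"}
act_u = {"speed": "normal"}
G = gamma(phi("gait", act_u, repertoire), theta(act_u))
```

The point of the sketch is only the data flow: Φ consumes the rhythmic component, Θ consumes the attitude, and Γ merges both outputs into one gesture description.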

2.2. General algorithm for the development of gestures

One of the most important functions of a prosthesis is to respond to the user’s intention of movement in a consistent manner, i.e., by assuming the development of particular actions or tasks. For this purpose, the connection between identification and response is simple: the prosthesis must “make a decision”, based on its knowledge, that a certain action or task best fulfills the needs of the user’s intention. Additionally, this response is influenced by the type of gesture demanded and the requirements to develop it. The prosthesis must then generate a behavior. In this sense, it is argued that the prosthesis can discriminate between the FAPs in the user’s intention and choose its own FAP, which governs its behavior-response.

As an example, consider a User-Prosthesis System which consists of the bodies Bu and Bp, trying to develop a coordinated gesture. It is assumed that the prosthesis Bp is able to understand the intention of the user Bu. Then, the response algorithm executed by the prosthesis for the execution of a particular gesture is:


  • if at t_1 CONSIDER (rg(G_k)_u > ρ)
  • then at t_2 LOCK B_p until (rg(G_k)_u < ρ)
  • else at t_2 EVAL (G_k)_u
  • at t_3 PERCEIVE (fap(G_k)_u; act(G_k)_u) and IDENTIFY ((Knowledge Base)_Bp; G_k)
  • at t_4 GENERATE (fap(G_k)_p)
  • at t_5 MODIFY (rg(G_k)_u; fap(G_k)_p)
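The five-step algorithm above can be sketched as a single decision function. Everything here is a hedged illustration: the scalar threshold RHO, the dictionary knowledge base, and all helper names are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the Response Algorithm for Gesture Development.
RHO = 1.0  # assumed scalar safety threshold on the gesture requirements

def response_step(rg_u, fap_u, act_u, knowledge_base):
    # t1: CONSIDER whether the requirements exceed the threshold rho
    if rg_u > RHO:
        # t2: LOCK the prosthesis until the requirements drop below rho
        return {"state": "LOCKED"}
    # t2-t3: EVAL the gesture, PERCEIVE (fap_u, act_u) and IDENTIFY it
    gesture = knowledge_base.get((fap_u, act_u), "unknown")
    # t4: GENERATE the artificial FAP for the identified gesture
    fap_p = "fap_" + gesture
    # t5: MODIFY (modulate) the artificial FAP with the requirements
    return {"state": "TRACKING", "gesture": gesture, "fap_p": fap_p, "rg": rg_u}

kb = {("rhythmic_gait", "fast"): "gait"}
result = response_step(0.4, "rhythmic_gait", "fast", kb)
```

Under this sketch, a requirements value above RHO locks the prosthesis, while any value below it proceeds through identification, generation, and modulation in order.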

2.3. Data capture

This study was developed in collaboration with five healthy individuals (1.76 ± 0.05 m tall, 26.2 ± 5.6 years old) and an individual with a medial transfemoral amputation (1.70 m tall, 38 years old). Data were acquired using a BTS SMART-e 900 motion capture system in the Laboratorio di Bioingenieria of the Don Gnocci Foundation (Milan, Italy), following the LAMB Bilateral protocol for the placement of markers (Figure 2) and cameras, as recommended 20 and described 21. Details of the tests are described in Appendix 1.

Figure 2 Placement of markers (LAMB protocol). (0)PSIS MX, (1)ASIS RX, (2)ASIS LX, (3)THIGH RX, (4)LATCON RX, (5)FH RX, (6)SHANK RX, (7)LATMAL RX, (8)HEEL RX, (9)META5 RX, (10)TOE1 RX, (11)THIGH LX, (12)LATCON LX, (13)FH, LX, (14)SHANK LX, (15)LATMAL LX, (16)HEEL LX, (17)META5 LX, (18)TOE1 LX, (19)TROCH RX, (20)TROCH LX. 

The kinetic information, the mechanomyography (MMG) signal, and the acceleration of the movements of the amputee were recorded simultaneously. The MMG and acceleration signals were captured using a sensor developed for this purpose, consisting of a coupling between a microphone and a triaxial accelerometer, inspired by the work of Silva et al. 22. The sensor was placed on the middle region of the thigh (or the stump, for the amputee) over the Rectus Femoris (Figure 3).

Figure 3 Placement of MMG sensor and reference planes used in data capture. 

The kinetic signal was recorded with a sampling frequency of 60 Hz, while the MMG and triaxial accelerometry (A3x) were recorded with a sampling frequency of 960 Hz.

2.4. Classification using fuzzy clustering

To group the gestures (developed at different free speeds) according to the intention of movement that originated the signal, a specific type of analysis of the data sets collected for each gesture is required. This problem involves discovering a clustering structure within a number of objects, and clustering algorithms are the most appropriate strategy for it.

Typically, there is a set of observations or features taken from each object. These sets of features are the inputs to the clustering algorithm, which provides a description of the grouping structure discovered within the objects. The description consists of a list containing the cluster assigned to each object. In this way, a series of seemingly different objects, from which a number of features have been extracted, can be clustered into groups of approximately similar characteristics. In the case of fuzzy clustering, the universe of discourse is the set of all objects, and the subsets defined within the universe are the clusters. Objects are not classified as belonging to a single cluster; rather, they have a degree of belonging to each cluster.

For the particular case of this research, the clustering objects are the intentions of movement, each with a set of observations or numerical features x(k). For the sets of characteristics x(k) of each intention (gait on flat terrain, walking up and down stairs, going up and down hills, standing), a fuzzy clustering algorithm around centers, known as fuzzy c-means (FCM), was used. Each group x(k) represents the coordinates of a point p in an n-dimensional space in which the values of the features obtained from the signal are registered on each axis E_i.

For each intention of movement there is a group of signals that carry information from the components of intention, i.e., the FAP and attitude, as defined in Eq.16. The MMG is the signal that carries the information of the FAP, while the attitude is recorded by a vector of acceleration in the three coordinate axes.

The characteristics analyzed in this case were the RMS of each acceleration signal and the dRMS of the MMG. The RMS value is a time-domain characteristic of the MMG which has been used in prosthetic applications 23 and is spatially independent of the location of the sensor 24. The dRMS characteristic is defined as:


so, the sets of values x(k) are obtained from


The dRMS feature was used because it yielded better results in prior classification tests.
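As a sketch of how windowed features such as the RMS can be extracted from the recorded signals, consider the following. The 960 Hz sampling rate and the 75 ms window width come from the text; the toy sine signal is purely illustrative, and the dRMS definition is omitted since its equation is not reproduced here.

```python
import numpy as np

# Sketch of windowed RMS feature extraction; the toy signal is illustrative.
FS = 960                 # MMG / accelerometry sampling frequency (Hz), from the text
WIN = int(0.075 * FS)    # 75 ms window -> 72 samples

def rms_windows(signal, win=WIN):
    """RMS over consecutive non-overlapping windows of a 1-D signal."""
    n = len(signal) // win
    chunks = signal[:n * win].reshape(n, win)
    return np.sqrt(np.mean(chunks ** 2, axis=1))

t = np.arange(2 * FS) / FS                  # two seconds of signal
mmg = np.sin(2 * np.pi * 30.0 * t)          # toy 30 Hz stand-in for the MMG
features = rms_windows(mmg)                 # one RMS value per 75 ms window
```

Each element of `features` would then contribute one coordinate of a feature vector x(k) fed to the classifier.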

The FCM assigns each point p a degree of belonging to a specific cluster, u_k(p), such that:

Σ_{k=1}^{N_C} u_k(p) = 1, with 0 ≤ u_k(p) ≤ 1 (22)

where N_C is the total number of clusters. Each cluster is built around a center defined by the average of all points, weighted by their degree of belonging to the cluster:

c_k = Σ_p [u_k(p)]^m p / Σ_p [u_k(p)]^m (23)

where m is the fuzziness exponent.
Initially, and for the signals of each gesture, 500 clusters were used in the tests. The centers of these clusters were fuzzified into a fuzzy set with the universe of discourse U = {Extreme, Higher, High, Medium, Low, Lower, Off}. Each fuzzified center represents a rule that defines the classification of each x(k) as a particular gesture. Repeated values were deleted from the rules in order to define a minimum number of rules for the classification of each movement. From this fuzzification, it was observed that the RMS value for the accelerometer signal showed a constant value “Off,” and it was therefore rejected. The “Off” value of this signal is due to the fact that the gestures were developed on a two-dimensional plane framed by the vertical axis and the direction of progression of the movement. Additionally, the number of rules was further reduced by pruning the unnecessary ones, i.e., those which, no matter their value, do not alter the final outcome of the classification. For both the healthy individuals and the amputee, the width of the sampling window was chosen to be as small as possible while providing the maximum performance of the identifier.
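A minimal fuzzy c-means implementation, under the standard formulation with fuzzifier m = 2, illustrates the clustering step. The paper does not report implementation details, so the code and the toy two-cluster data below are assumptions.

```python
import numpy as np

def fcm(points, n_clusters, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and memberships u."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                 # memberships sum to 1
    for _ in range(n_iter):
        w = u ** m                                    # fuzzified memberships
        centers = (w.T @ points) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        u = 1.0 / np.maximum(d, 1e-12) ** (2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)             # renormalize per point
    return centers, u

# Toy feature vectors x(k): two well-separated groups in a 2-D feature space.
rng = np.random.default_rng(42)
pts = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(3.0, 0.1, (20, 2))])
centers, u = fcm(pts, 2)
```

In the paper's setting, the resulting centers (500 per gesture initially) are what get fuzzified into the rule base over U = {Extreme, …, Off}.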

2.5. Response generator

The Identification Function is executed at the starting point of the swing phase, specifically right after toe-off (TO). The response generation and modulation are completed before the onset of the frustration effect caused by the slow response of the prosthesis reported in the literature; this phenomenon occurs when the response time of electronic prostheses exceeds 240 ms 25.

The FAP generator takes the gesture identification emitted by the identifier and generates an artificial FAP as a pattern of movement to be followed by the controller during the rest of the swing phase. The FAP is maintained until a new one is generated due to the identification of a new gesture.

FAP generation was obtained through the implementation of a Pattern Generator based on Artificial Neural Networks (PGANN), composed of a three-layer MLP network with eight, five, and one neurons in each layer, respectively (a 10-fold cross-validation technique was used for data partitioning during training). The input information to the network was a unique gesture identifier G(k) and a vector of requirements of the gesture:

rg = [α,t c ] (24)

where α and t_c are the knee angle and the contact time of the prosthetic limb, respectively, recorded at the time when TO occurs. The t_c is an indicator of the progression rate of the gesture, as demonstrated by Wilkenfeld 26.

Synchronization is performed by using the proprioception signal (angle α) of the prosthesis as an input signal to determine at which point of the swing phase the leg is after the identification. The PGANN delivers an artificial FAP synchronized with the current position of the leg within the rhythmic pattern typical of the gesture identified from the user’s intention.
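The 8-5-1 MLP topology described for the PGANN can be sketched as a plain forward pass. The random weights are placeholders (the real network was trained on captured gait data), and the input layout, a gesture identifier G(k) plus the requirements [α, t_c], is an assumption about how the eight inputs are filled.

```python
import numpy as np

# Forward pass of an 8-5-1 MLP; the weights are untrained placeholders.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 5)), np.zeros(5)   # input -> hidden (5 neurons)
W2, b2 = rng.normal(size=(5, 1)), np.zeros(1)   # hidden -> output (1 neuron)

def pgann(x):
    """x: 8 inputs (assumed: one-hot gesture identifier G(k) + [alpha, t_c])."""
    h = np.tanh(x @ W1 + b1)        # hidden layer activations
    return np.tanh(h @ W2 + b2)     # single output, e.g. a normalized angle sample

x = np.array([1, 0, 0, 0, 0, 0, 0.35, 0.62])  # hypothetical input vector
y = pgann(x)
```

Iterating such a forward pass over the swing phase would produce the sample-by-sample artificial FAP that the controller tracks.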

2.6. Modulator

The information in the vector of requirements rg of the gesture is the input to the modulator. Modulation is applied directly over the artificial FAP being generated by the PGANN, directly altering the generated coefficients. This modulation acts as the function Θ(.) according to Eq. (18). Thus, the modulation is implemented internally as a result of generalization by the PGANN.

3. Results and discussion

In the case of healthy individuals, the implemented classifier showed its best performance (97.06% of true identifications) with a window width of 75 ms of signal. Table 1 shows the details of the rules used to identify the intention of movement in this case. In the case of the individual with amputation, the best performance (98.04% of true identifications) was obtained using a window width of 84.4 ms of signal. Table 2 shows the details of the rules used to identify the intention of movement in that case.

Table 1 Rules of the classifier for the signal of movement intention of healthy individuals. 

Number of Clusters  Rule  RMS1    RMS2    dRMS     Class
1                   1     Off     Off     Off      S
2                   2     Low     Lower   Lower    DH
                    3     Lower   Off     Low
1                   4     Medium  Lower   Off      UH
1                   5     Higher  Higher  Higher   G
1                   6     Higher  Higher  Highest  US
1                   7     High    High    Off      DS

G: gait, US: up stairs, DS: down stairs, UH: up hill, DH: down hill, S: still.

Table 2 Rules of the classifier for the signal of movement intention of an amputee. 

Number of Clusters  Rule  RMS1    RMS2    dRMS    Class
1                   1     Lower   Lower   Low     DH
1                   2     Off     Off     Off     S
1                   3     Low     Low     Off     DS
1                   4     Medium  Medium  Medium  US
1                   5     Medium  High    Medium  G
1                   6     Higher  Higher  Higher  UH

G: gait, US: up stairs, DS: down stairs, UH: up hill, DH: down hill, S: still.

Through the implemented strategy, good tracking of the artificial FAP during the swing phase was obtained for the gait gesture on flat terrain, walking up and down stairs, and going up and down a hill. The general behavior of the controller is shown in Figure 4. Each gesture was developed at different speeds, subjectively defined as slow, normal, and fast. The starting and ending points of each signal were the TO and IC, respectively. Table 3 shows high true positive rates for all of the gestures evaluated in a group of healthy individuals. These results are independent of gait speed and of the individual. Table 4 also shows high true positive rates for an amputee, which are gait-speed independent.

Figure 4 Performance of controller (dashed line) for tracking the artificial FAP (solid line) for different gestures at three different speeds. The starting and ending points of each signal were the TO and IC, respectively. A) “Gait”, MSE=0.0828. B) “Up stairs”, MSE=0.0487. C) “Down stairs”, MSE=0.1591. D) “Up hill”, MSE=0.1192. E) “Down hill”, MSE=0.1819. 

Table 3 Percentage Confusion Matrix for movement intention classification for healthy individuals.  

     G      UP     DS     UH     DH     S
G    88.24  11.76  0      0      0      0
UP   0      94.12  5.88   0      0      0
DS   0      0      100    0      0      0
UH   0      0      0      100    0      0
DH   0      0      0      0      100    0
S    0      0      0      0      0      100

G: gait, UP: up stairs, DS: down stairs, UH: up hill, DH: down hill, S: still.

From Tables 1 to 6, it can be observed that the architecture of the classifier provides important information about the nature of the development of the evaluated gestures. It is noted that, in general, while in healthy individuals there is a low activation of the RF during the “Up hill” gesture, in the amputee this activation has the highest value, as does the accelerometer signal. This can be explained by the muscular substitution that the amputee must perform in order to replace the action of additional muscles that the healthy individual does have, focusing on the RF as the prime motor of gait.

Table 4 Percentage Confusion Matrix for movement intention classification for one amputee.  

     G      UP     DS     UH     DH     S
G    100    0      0      0      0      0
UP   5.88   94.12  0      0      0      0
DS   0      0      100    0      0      0
UH   0      0      0      100    0      0
DH   0      0      0      5.88   94.12  0
S    0      0      0      0      0      100

G: gait, UP: up stairs, DS: down stairs, UH: up hill, DH: down hill, S: still.

Table 5 Accuracy, Error Rate, True Positive Rate (TPR) and False Positive Rate (FPR) for movement intention classification for healthy individuals.  

Class Accuracy Error Rate TPR FPR
G 0.994 0.006 0.882 0.000
UP 0.997 0.003 0.941 0.000
DS 1.000 0.000 1.000 0.000
UH 1.000 0.000 1.000 0.000
DH 1.000 0.000 1.000 0.000
S 1.000 0.000 1.000 0.000

N = 50, Kappa = 0.964

G: gait, UP: up stairs, DS: down stairs, UH: up hill, DH: down hill, S: still.
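The TPR column of Table 5 follows directly from the confusion matrix of Table 3 (diagonal divided by row sums). The snippet below verifies that relation; the FPR and Kappa values additionally depend on the per-class sample counts (N = 50), which are not reproduced here, and the matrix column order is assumed to match the row order.

```python
import numpy as np

# Percentage confusion matrix of Table 3 (rows: true class, columns: predicted),
# assumed in the order G, UP, DS, UH, DH, S.
cm = np.array([[88.24, 11.76,    0,   0,   0,   0],
               [    0, 94.12, 5.88,   0,   0,   0],
               [    0,     0,  100,   0,   0,   0],
               [    0,     0,    0, 100,   0,   0],
               [    0,     0,    0,   0, 100,   0],
               [    0,     0,    0,   0,   0, 100]]) / 100.0

def true_positive_rate(cm):
    """Per-class TPR (recall): diagonal divided by row sums."""
    return np.diag(cm) / cm.sum(axis=1)

tpr = true_positive_rate(cm)
```

The result reproduces Table 5's TPR column: 0.882 for G, 0.941 for UP, and 1.000 for the remaining classes.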

Table 6 Accuracy, Error Rate, True Positive Rate (TPR) and False Positive Rate (FPR) for movement intention classification for an amputee.  

Class Accuracy Error Rate TPR FPR
G 1.000 0.006 1.000 0.000
UP 0.997 0.003 0.941 0.000
DS 1.000 0.000 1.000 0.000
UH 1.000 0.000 1.000 0.000
DH 0.997 0.003 1.000 0.000
S 1.000 0.000 1.000 0.000

N = 50, Kappa = 0.976

G: gait, UP: up stairs, DS: down stairs, UH: up hill, DH: down hill, S: still.

Additionally, the accelerometer signal shows a tendency to increase the mechanical energy gained through inertia by accelerating the thigh, thereby supporting the muscular action of the RF. Proof of this assertion is the noticeable difference in the development of the gait gesture: here, the RMS2-to-dRMS relationship is higher than in the healthy individual, i.e., RMS2 > dRMS, which indicates the tendency of the amputee to use inertia to optimize the gait. Similarly, it can be observed that the “Down Stairs” gesture was performed by the healthy individual at a higher speed than by the amputee. Regarding the width of the window used, it can be concluded (based on the previous statement) that the muscle activation (and thus the emission of the neuromotor message) is slightly delayed in the amputee. This is related to the adaptation to handling the dynamics of the prosthesis that the amputee carries out to modulate the stages of development of each gesture.

Furthermore, the artificial FAP generated by the PGANN is tracked well. Minor tracking errors (Mean Square Error, MSE) were observed at the limits of maximum flexion or maximum extension, without exceeding 8.28% MSE for gait, 4.87% and 15.91% MSE for going up and down stairs, and 11.9% and 18.19% MSE for going up and down hills. The highest errors occurred during the “Down Stairs” and “Down Hill” gestures due to inertia, and these errors grow as the execution speed of the gesture increases. These results are comparable to or better than those obtained in similar works based on EMG 27,28, ground reaction force measurements 29, or multisensor fusion 30 (Table 7).
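The tracking errors above are reported as percentage MSE between the generated artificial FAP and the measured trajectory. A minimal sketch of such a measure, assuming normalization by the squared range of the reference trajectory (the study's exact normalization is not stated in this section):

```python
def percent_mse(reference, tracked):
    """MSE between a reference trajectory and its tracking, expressed
    as a percentage of the squared range of the reference."""
    assert len(reference) == len(tracked) and len(reference) > 0
    span = max(reference) - min(reference)
    mse = sum((r - t) ** 2
              for r, t in zip(reference, tracked)) / len(reference)
    return 100.0 * mse / span ** 2

# Illustrative: a constant 1-unit offset over a 4-unit range.
print(percent_mse([0, 1, 2, 3, 4], [1, 2, 3, 4, 5]))  # 6.25
```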

Table 7 Comparison of accuracy identification results from various authors with the proposal of this paper.  

Source Subjects UH DH UP DS FG SG
Jin et al. (27) 13 healthy ind.+ 95.7 97.0 98.0 98.1 94.6 86.1
  1 amputee 98.5 88.8 75.0 85.7 92.3 84.6
Chen et al. (28) 8 healthy ind.+ - - 98.1 97.8 94.8* 94.8*
  5 amputees+ - - 97.1 92.1 99.3* 99.3*
Yuan et al. (30) 5 healthy ind.+ 99.3 99.8 97.9 99.1 98.9* 98.9*
  1 amputee 95.6 99.3 95.6 98.6 98.5* 98.5*
This paper 5 healthy ind.** 100 100 94.1 100 88.2* 88.2*
  1 amputee 100 94.1 94.1 100 100* 100*

UH: up hill, DH: down hill, UP: up stairs, DS: down stairs, FG: Fast Gait, SG: Slow Gait. Notes:

+: Average values.

*: Results are independent of gesture speed.

**: Results are independent of subject.

4. Conclusions

This paper proposed an approach to understanding the user's neuromotor language by identifying the user's intention from the perception of both internal and external manifestations of the FAP through the proprioception and exteroception of the prosthesis. By identifying the user's intention through proprioceptive and exteroceptive information, the prosthesis discriminates within a repertory of artificial FAPs and chooses the most suitable one to meet the user's requirements.

The results indicate that the general conceptualization of the intention, according to Eq. 16, is a valid approach for addressing the problem of identifying movement intention, and that it is applicable to both healthy individuals and amputees.

The approach proposed in this paper yields a procedure with high classification performance for gestures in amputees and in healthy individuals, independently of the speed at which the gesture is performed. It can be implemented in real time and is unlikely to harm the user through sensor-skin contact, since the device is embedded in a silicone capsule. The values of the sampling window used in this study represent a further advantage over similar work in which the authors proposed methods to determine the optimal window size from the EMG signal, obtaining optimal window lengths between 150 ms and 250 ms 31.

It is important to note that the formalization presented here is valid provided that the gestures developed are rhythmic and repetitive, so that an artificial FAP can be assigned to them. Additionally, the approach addresses the total gesture of the system and does not take into account the movements that give the execution of the gesture its particular individual character for each subject.

5. References

1. James W. Principles of psychology. New York: Dover; 1950 (1890). [ Links ]

2. Brown G. On the activities of the central nervous system of the un-born fœtus of the cat; with a discussion of the question whether progression (walking, etc.) is a "learnt" complex. Journal of Physiology. 1915;49(4):208-215. [ Links ]

3. Pearson K, Gordon J. Spinal reflexes. In: Kandel ER, Schwartz JH, Jessell TM. Principles of neural science. New York, NY: McGraw-Hill; 2000. p. 713-736. [ Links ]

4. Brown G. On the nature of the fundamental activity of nervous centers; together with an analysis of the conditioning of rhythmic activity in progression and a theory of the evolution of the nervous system. Journal of Physiology. 1914;48(1):18-46. [ Links ]

5. Buzsáki G, Peyrache A, Kubie J. Emergence of cognition from action. Cold Spring Harbor Symposia on Quantitative Biology. 2015;79:41-50. [ Links ]

6. Alnajjar F, Itkonen M, Berenz V, Tournier M, Nagai C, Shimoda S. Sensory synergy as environmental input integration. Frontiers in Neuroscience. 2015;8:436. [ Links ]

7. Llinas R, Roy S. The ' prediction imperative ' as the basis for self-awareness. Philosophical transactions of the Royal Society of London. Series B, Biological sciences. 2009;364 (1521):1301-1307. [ Links ]

8. Buzsáki G. Neural syntax: cell assemblies, synapsembles, and readers. Neuron. 2010;68(3):362-385. [ Links ]

9. Watson BO, Buzsáki G. Sleep, Memory & Brain Rhythms. Journal of Neurophysiology. 2015;144(1):67-82. [ Links ]

10. Beer R, Chiel H, Sterling L. A biological perspective on autonomous agent design. Robotics and Autonomous Systems. 1990;6(1-2):169-186. [ Links ]

11. Llinas R. I of the vortex: From neurons to self. Cambridge: MIT press; 2002. 302 p. [ Links ]

12. Guertin PA. Central pattern generator for locomotion: anatomical, physiological, and pathophysiological considerations. Frontiers in Neuroscience. 2013;8(3):183. [ Links ]

13. Aoi S, Ogihara N, Funato T, Sugimoto Y, Tsuchiya K. Evaluating functional roles of phase resetting in generation of adaptive human bipedal walking with a physiologically based model of the spinal pattern generator. Biological Cybernetics. 2010;102(5):373-387. [ Links ]

14. Nandi GC, Gupta B. Bio-inspired control methodology of walking for intelligent prosthetic knee. In: Proceedings of the 3rd International Conference of Informatics in Control, Automation and Robotics, ICINCO. Nice, France. IEEE; 2006. p. 2368-2373. [ Links ]

15. Bauer C, Braun S, Chen Y, Jakob W, Mikut R. Optimization of artificial central pattern generators with evolutionary algorithms. In: Mikut R, ed. Proceedings of the 18th Workshop Computational Intelligence; Dortmund, Germany: Universitätsverlag Karlsruhe; 2006. p. 40-54. [ Links ]

16. Ambroise M, Levi T, Joucla S, Yvert B, Saïghi S. Real-time biomimetic Central Pattern Generators in an FPGA for hybrid experiments. Frontiers in Neuroscience. 2013;7:215. [ Links ]

17. Alcock J. Evolution, Nervous Systems, and Behavior. In: Alcock J. Animal Behavior: An Evolutionary Approach (10th edition). Sunderland, Massachusetts: Sinauer Associates, Inc.; 1998. p. 363-9. [ Links ]

18. Mazur JE. Learning and Behavior (6th Edition). New Jersey: Prentice Hall; 2005. 448 p. [ Links ]

19. Campbell NA. Animal behavior. In: Biology (9 ed). New York: Benjamin Cummings; 1996. p. 11-19. [ Links ]

20. Wu G, Siegler S, Allard P, Kirtley C, Leardini A, Rosenbaum D, et al. ISB recommendation on definitions of joint coordinate system of various joints for the reporting of human joint motion - part I: ankle, hip, and spine. Journal of Biomechanics. 2002;35(4):543-8. [ Links ]

21. Ferrari A, Benedetti MG, Pavan E, Frigo C, Bettinelli D, Rabuffetti M, et al. Quantitative comparison of five current protocols in gait analysis. Gait Posture. 2008;28(2):207-16. [ Links ]

22. Silva J, Chau T, Naumann S, Heim W. Systematic characterisation of silicon-embedded accelerometers for mechanomyography. Medical and Biological Engineering and Computing. 2003;41(3):290-5. [ Links ]

23. Silva J. Mechanomyography sensor design and multisensor fusion for upper-limb prosthesis control. Master thesis. Toronto: Mechanical Engineering Department, University of Toronto; 2004. [ Links ]

24. Madeleine P, Cescon C, Farina D. Spatial and force dependency of mechanomyographic signal features. Journal of Neuroscience Methods. 2006;158(1):89-99. [ Links ]

25. Scott RN. An introduction to myoelectric prostheses, Volume 1. U.N.B. Monographs on Myoelectric Prostheses. Bio-Engineering Institute, University of New Brunswick; 1984. p. 17. [ Links ]

26. Wilkenfeld AJ. Biologically inspired autoadaptive control of a knee prosthesis. Master thesis. Massachusetts, USA: Massachusetts Institute of Technology; 2000. [ Links ]

27. Jin D, Yang J, Zhang R, Wang R, Zhang J. Terrain identification for prosthetic knees based on electromyographic signal features. Tsinghua Science and Technology. 2006;11(1):74-9. [ Links ]

28. Chen B, Zheng E, Fan X, Liang T, Wang Q, Wei K, et al. Locomotion mode classification using a wearable capacitive sensing system. IEEE Transactions on Neural Systems and Rehabilitation Engineering. 2013;21(5):744-55. [ Links ]

29. Wang F, Su J, Xie H, Xu X. Terrain identification of intelligent bionic leg based on ground reaction force. In: IEEE International Conference on Integration Technology, ICIT-07; Shenzhen, China: IEEE; 2007. p. 609-13. [ Links ]

30. Yuan K, Sun S, Wang Z, Wang Q, Wang L. A fuzzy logic based terrain identification approach to prosthesis control using multi-sensor fusion. In: IEEE International Conference on Robotics and Automation (ICRA); Karlsruhe, Germany: IEEE; 2013. p. 3376-81. [ Links ]

31. Smith L, Hargrove L, Lock B, Kuiken T. Determining the optimal window length for pattern recognition-based myoelectric control: Balancing the competing effects of classification error and controller delay. IEEE Transactions on Neural Systems and Rehabilitation Engineering. 2011;19(2):186-92. [ Links ]

Appendix 1

Data Capture Protocol

Test with Coupled Sensor for MMG signal on SMART DAQ system


1. Power Supply: 5V

2. Signal Cable: 4 signals + 1 ground = 5 wires

3. Microphone Freq. Range: 0-100 Hz

4. Accelerometer Freq. Range: 0-100 Hz


1. Marker Array and Camera Protocol: LAMB-Bilateral


1. Level overground: 3 tests at 3 free cadences (low, mid, high). Total: 9 tests

2. Up Slope-overground walking: 3 tests at 3 free cadences (low, mid, high). Total: 9 tests

3. Down Slope-overground walking: 3 tests at 3 free cadences (low, mid, high). Total: 9 tests

4. Stepping upstairs: 3 tests at 3 free cadences (low, mid, high). Total: 9 tests

5. Stepping downstairs: 3 tests at 3 free cadences (low, mid, high). Total: 9 tests

6. Standing and Calibration.

Subtotal: 45 tests per subject.

Subjects: 5 healthy persons for control, 1 transfemoral amputee.

Total: 180 tests

Scheduled dates: May 11th, 14th and 15th, from 10:30 am

Technician: Paulo.

Assistant: Francesco.

Head of Lab.: M. Ferrarin, PhD.

Received: July 19, 2016; Accepted: December 16, 2016

Creative Commons License This is an open-access article distributed under the terms of the Creative Commons Attribution License