Open-access article, peer reviewed

Synthetized inertial measurement units (IMUs) to evaluate the placement of wearable sensors on human body for motion recognition

2022; Institution of Engineering and Technology; Volume: 2022; Issue: 5; Language: English

10.1049/tje2.12137

ISSN

2051-3305

Authors

Damien Hoareau, Gurvan Jodin, Pierre-Antoine Chantal, Sara Bretin, Jacques Prioux, Florence Razan

Topic(s)

Indoor and Outdoor Localization Technologies

Abstract

Movement data from athletes are useful to quantify performance or, more specifically, the workload. Inertial measurement units (IMUs) are useful sensors for quantifying body movements. Sensor placement on the human body is still an open question, and it is the focus of this paper. A method that generates synthesized inertial data is proposed for determining optimal sensor placement. A comparison between virtual and real inertial data is carried out. Training a motion recognition algorithm on synthesized and real inertial data exhibits less than 7% difference. This method highlights the ability of the numerical model, using virtual sensors, to determine relevant IMU placements on the human body for a motion recognition algorithm.

The Journal of Engineering, Volume 2022, Issue 5, pp. 536-543. Original Research, Open Access. First published: 22 March 2022. Correspondence: Damien Hoareau, CSEE, SATIE UMR CNRS 8029, École normale supérieure de Rennes, Bruz, France. Email: damien.hoareau@ens-rennes.fr

1 INTRODUCTION

Sensors are widely used in sports [1]: they allow performance to be measured and monitored in order to prevent injuries or adjust training content. Sensors evaluating physical and physiological parameters can quantify the workload (WL) or training load of athletes. In addition, measurements related to the movements and actions performed by the athlete help to estimate the external WL. Automatic activity assessment is possible via several tools such as video monitoring [2, 3] or signal processing algorithms [4]. However, video and radio-based local positioning systems require equipping the gym, which is restrictive. Individual wearable sensors overcome these gym constraints and bring more flexibility to sports monitoring.
Synthesizing data such as the accelerations and angular velocities associated with human body movements brings real added value in terms of experimentation time and possibilities. The use of technologies such as inertial measurement units (IMUs) allows the recognition of movements and actions. An IMU is a sensor that collects at least acceleration and angular velocity data. IMUs are widely used because of their integration capabilities, low cost and simple implementation [5, 6]. Manufacturers provide IMUs with the appropriate bindings [7]. The combination of several IMUs reduces the inherent sensor errors and improves the information about the performed actions [8]. When these data are intended to train classification algorithms for motion recognition purposes, an extensive dataset covering all possible positions is needed to evaluate the best sensor placement [9]. The difficulty thus comes from the combinatorial explosion of the number of trials over the different activities and the different sensor positions considered. Xu et al. [12] propose a method for co-recognition of the human activity and the positioning of an inertial unit, wearable units being more advantageous than cameras. The strength of the approach is to estimate both the sensor location on the human body and the motion performed. Only one inertial unit was used; 14 activities were performed with 10 repetitions by three subjects, giving a total of 98 recorded activity-sensor-location combinations. Not all sensor locations were investigated because of the prohibitive number of experiments required. Another work [11] presents a similar study. In this case, two IMUs are used at the same time and the effect of sensor positioning on human activity recognition is illustrated. The study is conducted in the context of health monitoring and sedentary activities. Different sensor positions are investigated, but the experimental limitation persists: not all combinations and positions are tested. However, there are methods based on biomechanical models that can overcome these limitations. Ay and Karakose, from Firat University [10], developed an approach to recognize human activity based on biomechanical analysis. Their method uses a camera (Kinect) sampled at 30 frames per second (FPS) to build a labelled pose database with three different movements repeated 30 times by five subjects. The Kinect biomechanical model is used to extract 2D locations and velocities of 15 markers on the human body, and these data are used for motion classification. This approach, based on a biomechanical model to extract motion data for classification, is thereby validated. Nevertheless, this study does not determine the best locations for real wearable IMUs, and accelerations are not evaluated. Moreover, the method requires constraining experimental conditions, such as the distance from the camera, so it is not applicable to all situations. In this context, our study goes further by recovering, via a biomechanical model, inertial unit data, called synthesized inertial data (SID), over the whole human body with a minimum number of experiments. This allows us to test several combinations of sensors and to study various positions on the human body in order to recognize the activity. In the first part, a biomechanical model is established via motion capture and synthesized data are generated over the whole human body.
As the subject is also equipped with real IMUs, this paper compares the real and synthesized data in the time and frequency domains. Since the main objective is to use IMUs for motion recognition, the two types of data are also compared through statistical features typical of motion recognition. In the last part, we present an application of these synthesized data to motion recognition.

2 MATERIALS AND METHODS

Finding the optimal IMU placement on the human body requires a design method, and this method must be validated. The proposed tool synthesizes IMU data, named synthesized inertial data (SID), over the whole human body. The core of the tool is a biomechanical model based on motion capture. SID are compared to real inertial data (RID) through statistical and frequency feature analysis. The limitations of the proposed tool are assessed with a motion recognition algorithm trained on both SID and RID.

2.1 Synthesized inertial data (SID)

A biomechanical model is established through motion capture experiments. First, the data from the inertial units of the Xsens MVN Link suit worn by the subject are captured over a wireless link to establish the biomechanical model. The Xsens suit has 17 sensors, fixed on the body with self-gripping bands. These data are then processed by the Xsens MVN Animate software. Proprietary fusion and reconstruction algorithms generate accurate numerical avatar motions at 240 FPS [13]. The result is an osteoarticular model composed of 23 rigid segments and 22 kinematic links with 6 degrees of freedom [14]. Once the biomechanical model is created, it is exported from the Xsens environment to be analysed and processed with CusToM (Customizable Toolbox for Musculoskeletal simulation) [15], a MathWorks MATLAB library for the simulation of biomechanical models. The segments of this model are linearly discretized to increase the number of points where IMU data are synthesized. This discretized model is presented in Figure 2. To the authors' knowledge, such discretized modelling of IMUs has not been studied before. The SID system is calibrated between each set of 10 repetitions.

2.2 Real inertial data (RID)

The RID come from a set of 14 IMUs from Delsys Incorporated (https://delsys.com/). The IMU range is 16 g for the accelerometer and 2000 deg/s for the gyroscope, and the data are sampled at 370 Hz. These IMUs are placed according to Figure 1.

FIGURE 1. Subject equipped with inertial measurement units (IMUs). Sensor locations: hands, lower arms, upper arms, feet, lower legs, upper legs, pelvis and sternum.

FIGURE 2. Discretized biomechanical model. (a) Synthesized inertial data (SID) positions, where each green circle represents a sensor location; (b) real inertial data (RID) sensor locations with the corresponding numbering.

2.3 Data collection procedure

Experiments are performed in a gym. The subject performs 10 repetitions of three types of movements:

- Countermovement vertical jump: the subject jumps on the spot trying to go as high as possible; the movement starts with a squat.
- Left-side sprint: the subject starts from a static standing position; at the start, he performs a sprint with great acceleration in the left lateral direction from his initial position.
- Right-side sprint: the subject starts from a static standing position; at the start, he performs a sprint with great acceleration in the right lateral direction from his initial position.

The final database is therefore composed of 30 trials for SID and RID.
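To make the synthesis step of Section 2.1 more concrete, the sketch below shows one possible way of deriving accelerometer and gyroscope signals from the trajectory and orientation of a single discretized point of a biomechanical model. This is a minimal illustration only, not the CusToM/MATLAB pipeline used in this work: the Python/NumPy form, the function name, the frame conventions and the finite-difference scheme are assumptions made for illustration.

```python
# Minimal sketch (assumption): synthesizing accelerometer and gyroscope signals
# from the trajectory and orientation of one discretized point of the model.
import numpy as np

GRAVITY_W = np.array([0.0, 0.0, -9.81])  # gravity in the world frame (m/s^2)

def synthesize_imu(p_w, R_wb, fs=240.0):
    """p_w: (T, 3) point positions in the world frame (m).
    R_wb: (T, 3, 3) rotations from the body (sensor) frame to the world frame.
    fs: model frame rate (240 FPS for the Xsens avatar).
    Returns specific force and angular velocity expressed in the sensor frame."""
    dt = 1.0 / fs
    v_w = np.gradient(p_w, dt, axis=0)   # linear velocity by finite differences
    a_w = np.gradient(v_w, dt, axis=0)   # linear acceleration
    # Accelerometer measures specific force: f_b = R_wb^T (a_w - g_w)
    f_b = np.einsum('tij,tj->ti', R_wb.transpose(0, 2, 1), a_w - GRAVITY_W)
    # Gyroscope: [omega_b]_x = R_wb^T dR_wb/dt (body-frame angular velocity)
    dR = np.gradient(R_wb, dt, axis=0)
    S = np.einsum('tij,tjk->tik', R_wb.transpose(0, 2, 1), dR)
    omega_b = np.stack([S[:, 2, 1], S[:, 0, 2], S[:, 1, 0]], axis=1)
    return f_b, omega_b
```

Repeating such a computation for every green point of Figure 2 would yield the full set of synthesized sensor signals.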
2.4 SID and RID comparison

SID and RID are compared to validate the use of SID alone for assessing the relevance of a sensor location for motion recognition. SID and RID acquisitions are processed as follows. The data are temporally synchronized to match the beginning and the end of the SID and RID acquisitions, and each acquisition is labelled with the corresponding movement name (countermovement vertical jump, left-side sprint or right-side sprint). Differences between the two data sets are evaluated through statistical and frequency parameters called features. The considered features, listed in Table 1, correspond to the most widely used features in the literature.

TABLE 1. List of features used for signal characterization (PSD: power spectral density)

1. Mean
2. Standard deviation
3. Root mean square
4. Max
5. Min
6. Skewness of PSD
7. Kurtosis of PSD
8. First quartile
9. Second quartile
10. Third quartile
11. Mean crossing rate
12. Mean of PSD
13. Standard deviation of PSD

These features can be applied to the temporal data of each axis of the IMUs, but the results presented in this paper focus on features applied to the norms of acceleration and angular velocity. Because the alignment between axes is not constant during the experiments, using the norm makes the comparison insensitive to possible misalignment between the SID and RID reference frames. A 0.5 s window with 50% overlap is selected for feature extraction. The frequency study is done with the fast Fourier transform (FFT). Equation (1) defines the normalized relative error between SID and RID:

\begin{equation}
\mathrm{RMSE} = \frac{\sqrt{\frac{1}{N}\sum_{N}\left(\mathrm{RID} - \mathrm{SID}\right)^{2}}}{\sqrt{\frac{1}{N}\sum_{N}\mathrm{RID}^{2}}}
\tag{1}
\end{equation}

where N stands for the number of windows over all samples.

2.5 Motion recognition algorithm

As a proof of concept, a very common and simple support vector machine (SVM) classifier with a second-order polynomial kernel is implemented. The classifier is trained for each sensor location, and its inputs are the features of Table 1. The motion recognition is evaluated by fivefold cross-validation. To determine the classifier accuracy, the classification score is computed as 1 minus the average classification loss over all folds, where the classification loss is the misclassification rate. The data set is composed of 10 repetitions of three movements. For each sample, consisting of a 0.5 s window of the norm of acceleration and the norm of angular velocity, the 13 previously defined features are extracted, leading to a total of 26 features. The resulting dataset is composed of 702 samples: 226 for the countermovement jump, 234 for the left-side sprint and 242 for the right-side sprint. The duration of each movement differs, which is why a different number of samples is obtained after extraction of the features with the 0.5 s windowing.
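As an illustration of the feature extraction and evaluation pipeline of Sections 2.4 and 2.5, the sketch below computes a subset of the Table 1 features on sliding windows of a signal norm and evaluates a degree-2 polynomial-kernel SVM with fivefold cross-validation. The paper's implementation is in MATLAB; this Python/NumPy/scikit-learn version, the feature subset, the standardization step and the variable names are assumptions for illustration only.

```python
# Minimal sketch (assumption): windowed feature extraction on the signal norm and
# evaluation of a degree-2 polynomial-kernel SVM with fivefold cross-validation.
import numpy as np
from scipy.stats import skew, kurtosis
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

def window_features(signal_xyz, fs, win_s=0.5, overlap=0.5):
    """Extract a subset of the Table 1 features from the norm of a 3-axis signal."""
    norm = np.linalg.norm(signal_xyz, axis=1)
    size = int(win_s * fs)
    step = max(1, int(size * (1.0 - overlap)))
    feats = []
    for start in range(0, len(norm) - size + 1, step):
        w = norm[start:start + size]
        _, psd = welch(w, fs=fs, nperseg=min(size, 64))
        feats.append([w.mean(), w.std(), np.sqrt(np.mean(w ** 2)), w.max(), w.min(),
                      np.quantile(w, 0.25), np.quantile(w, 0.5), np.quantile(w, 0.75),
                      skew(psd), kurtosis(psd), psd.mean(), psd.std()])
    return np.array(feats)

def classification_score(X, y):
    """Classification score = 1 - average misclassification rate over 5 folds."""
    clf = make_pipeline(StandardScaler(), SVC(kernel='poly', degree=2))
    return cross_val_score(clf, X, y, cv=5).mean()
```

The score returned by the last function corresponds to 1 minus the average misclassification rate over the five folds, as described in Section 2.5.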
3 RESULTS

Temporal and frequency studies focus on the entire signals. The features used for comparison and classification are then calculated on windowed signals.

3.1 Temporal waveforms

Figure 3 shows SID and RID temporal waveforms. The data correspond to the norm of the acceleration of an IMU located on the left hand during a jump, a classical movement in motion classification. The SID and RID accelerations match, as visible in Figure 3. This is a general result for all sensors. Correlation in the frequency domain was also checked.

FIGURE 3. Norm of acceleration of an inertial measurement unit (IMU) on the left hand for a jump, for synthesized inertial data (SID) and real inertial data (RID).

3.2 Frequency domain

Figure 4 shows the FFT of the norm of acceleration on the left hand for a jump. While the signals match at low frequencies, the RID signals have a larger frequency bandwidth than the SID signals.

FIGURE 4. Fast Fourier transform (FFT) of the norm of acceleration of an IMU on the left hand for a jump, for synthesized inertial data (SID) and real inertial data (RID), computed on the entire signal.

However, the frequency content of human motions lies mainly between 4 and 26 Hz [16]. To compare each sample of the two signals in the time domain, RID and SID are filtered using a fifth-order Butterworth IIR 10 Hz low-pass filter. Figure 5 illustrates the difference between the filtered RID and SID data with a Bland-Altman plot, which presents, in log scale and for each time sample, the instantaneous difference between RID and SID as a function of their average value.

FIGURE 5. Bland-Altman plot of the filtered RID and SID acceleration data. The observed distribution is not normal; the 2.75% and 97.5% quantiles were therefore calculated.

The average difference between RID and SID acceleration is about 0.1 g, and 95% of the values lie between the plotted quantiles.

3.3 Features comparison

Figure 7 shows the normalized error of the features between SID and RID for each sensor. For a non-relevant feature such as the mean of the PSD, the error can reach 1000% (Figure 7a), whereas Figure 7b shows a more relevant feature with errors below 20%. Despite this diversity of errors, the results of the classification comparison on SID and RID show the same trend, as will be shown later in Figure 9. We recall that the main objective of this paper is to evaluate the effect of sensor position on motion recognition; the proposed method is thus robust to the errors between the features extracted from SID and RID.

FIGURE 7. Box plot of the normalized error of the norm of acceleration (top) and angular velocity (bottom) between synthesized inertial data (SID) and real inertial data (RID). (a) Plot for a non-relevant feature (mean of PSD, on all sensors); (b) plot for a relevant feature (skewness, on all sensors). The sensor numbering of Figure 2 applies.
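For reference, the sketch below shows one possible implementation of the 10 Hz low-pass filtering used in Section 3.2 and of the normalized error of Equation (1). It is a hedged illustration: the paper's processing is done in MATLAB, whereas this sketch uses Python/SciPy, and the zero-phase filtfilt call, the function names and the sampling-rate argument are assumptions rather than the authors' exact implementation.

```python
# Minimal sketch (assumption): fifth-order Butterworth 10 Hz low-pass filtering
# and the normalized error of Equation (1).
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_10hz(x, fs, order=5, fc=10.0):
    """Fifth-order Butterworth low-pass filter (fc = 10 Hz), applied zero-phase."""
    b, a = butter(order, fc, btype='low', fs=fs)
    return filtfilt(b, a, x, axis=0)

def normalized_rmse(rid, sid):
    """Equation (1): RMSE of (RID - SID) normalized by the RMS of RID."""
    rid, sid = np.asarray(rid), np.asarray(sid)
    return np.sqrt(np.mean((rid - sid) ** 2)) / np.sqrt(np.mean(rid ** 2))
```

Applied to the windowed feature values of SID and RID, such a normalized error corresponds to the percentages discussed for Figure 7.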
3.4 Application for motion recognition

SVM classifiers have been trained for each SID position, using the previously mentioned features on the norms of acceleration and angular velocity. Figure 8 presents the results on an avatar based on the biomechanical model. The sensor locations are indicated by circles whose colours indicate the classification scores, so that a classification-score colour map of the human body is obtained. The classification scores of 154 synthesized IMU locations are represented, which assesses a significantly larger number of locations than the 14 real sensors.

FIGURE 8. Synthesized inertial data (SID) classification score on the biomechanical model for 154 sensor locations.

SID and RID are also compared (Figure 9): the difference in classification scores for each sensor is calculated with Equation (2) and stays below 7%:

\begin{equation}
\mathrm{RMSE\_C} = \sqrt{\frac{1}{N}\sum_{N}\left(\mathrm{RID} - \mathrm{SID}\right)^{2}}
\tag{2}
\end{equation}

Equation (2) computes the absolute error; it is related to Equation (1) through Equation (3):

\begin{equation}
\mathrm{RMSE} = \frac{\mathrm{RMSE\_C}}{\sqrt{\frac{1}{N}\sum_{N}\mathrm{RID}^{2}}}
\tag{3}
\end{equation}

FIGURE 9. RMSE_C of the classification score between synthesized inertial data (SID) and real inertial data (RID) for the different sensor locations.

Regardless of the quality of the motion recognition, the trends on SID and RID are similar. Thus, the sensor locations with the best performance are close for the two approaches. In this study, the sensors placed on the right leg are the best for motion recognition.

4 DISCUSSION

The differences between synthesized data and real data can be explained by elements of biomechanics, signal processing and machine learning. First, the method is based on a digital biomechanical model relying on assumptions. Indeed, the model does not consider soft bodies. This may explain differences on the thigh, where skin and flesh lead to artefacts, as does sensor misplacement. Soft tissue can add up to 50% additive noise on acceleration and orientation measurements compared to the actual bone motion [17]. Some joints are approximated by combinations of kinematic joints, as it is an osteoarticular model; for instance, shoulders are described by two rotations, while their kinematics are much more complex [18]. The pelvis is the first kinematic node: it defines the global location of the body and is sensitive to numerical noise coming from the solver that minimizes the distances between real and virtual markers for the whole body. These locations therefore require special attention if selected as targets for real sensors.

Second, the biomechanical model computations imply errors. While the signals match at low frequencies, the RID signals have a larger frequency bandwidth than the SID signals. This is caused by the various filters that stabilize and smooth the trajectories of the biomechanical model, which are generally low-pass filters. This is responsible for differences in dynamic-sensitive features such as minimum, maximum and standard deviation, as well as in power spectral density (PSD)-based features such as kurtosis or skewness. One should keep in mind that highly dynamic-sensitive features are useful to discriminate motions, while they match poorly between SID and RID.

Finally, the classifier used in this paper for motion recognition is deliberately simple, so the results could be improved significantly, but this is out of the scope of this publication. The SVM classifier with a second-order polynomial kernel is selected because it is a classic and widely used algorithm. The modelling of the motion, that is, considering statistical features on 0.5 s windows, is not the best fit for motions that can last several seconds and decompose into multiple sub-steps. The results may benefit from considering multiple sensors, from more advanced classification algorithms such as Markov chains that take data history into account, or from the latest artificial intelligence algorithms such as deep artificial neural networks. Other limitations that could be discussed in more detail are that only one sensor is considered at a time, the kernel bias, and the unbalanced database with a small amount of data and limited motion diversity.

5 CONCLUSION

Sensors such as IMUs are increasingly used to assess the physical activity of the human body.
Finding the optimal placement of sensors is an important issue. In this paper, a new method providing synthesized IMU data, based on motion capture, is proposed. The comparison between synthesized data (SID) and real data (RID) validates the use of synthesized IMU data for motion classification based on statistical features. This numerical model is applied to a classification algorithm for motion recognition. A classification-score colour map of the human body has been produced for 10 times more synthesized sensors than the real available IMUs. The motion recognition is also performed on real data; this comparison, with an RMSE below 7%, definitively validates synthesized IMUs and opens many perspectives for the future. This method avoids the many experiments otherwise necessary to evaluate the optimal placement of sensors for the design of wearable systems. This is a novelty in the state of the art which, applied to sports movements, will allow the development of wearable sensor systems to monitor and improve sports performance.

CONFLICT OF INTEREST

The authors declare no conflict of interest.

DATA AVAILABILITY STATEMENT

The data that support the findings of this study are available from the corresponding author upon reasonable request.

REFERENCES

1. Aroganam, G., Manivannan, N., Harrison, D.: Review on wearable technology sensors used in consumer sport applications. Sensors 19(9), 1983 (2019). https://doi.org/10.3390/s19091983
2. Colyer, S.L., et al.: A review of the evolution of vision-based motion analysis and the integration of advanced computer vision methods towards developing a markerless system. Sports Med. Open 4(1), 1-15 (2018). https://doi.org/10.1186/s40798-018-0139-y
3. Van Der Kruk, E., Reijne, M.M.: Accuracy of human motion capture systems for sport applications; state-of-the-art review. Eur. J. Sport Sci. 18(6), 806-819 (2018). https://doi.org/10.1080/17461391.2018.1463397
4. Cust, E.E., et al.: Machine and deep learning for sport-specific movement recognition: A systematic review of model development and performance. J. Sports Sci. 37(5), 568-600 (2019). https://doi.org/10.1080/02640414.2018.1521769
5. Yurur, O., Liu, C.-H., Moreno, W.: Unsupervised posture detection by smartphone accelerometer. Electron. Lett. 49(8), 562-564 (2013)
6. Guo, H., et al.: Smartphone-based activity recognition independent of device orientation and placement. Int. J. Commun. Syst. 29(16), 2403-2415 (2016). https://doi.org/10.1002/dac.3010
7. Li, A., et al.: Physical activity classification using a single triaxial accelerometer based on HMM. In: International Conference on Wireless Sensor Network 2010 (IET-WSN 2010), Beijing, China, 15-17 November 2010. https://doi.org/10.1049/cp.2010.1045
8. Brock, H., Ohgi, Y.: Assessing motion style errors in ski jumping using inertial sensor devices. IEEE Sens. J. 17(12), 3794-3804 (2017). https://doi.org/10.1109/JSEN.2017.2699162
9. Raiano, L., et al.: A PCA-based method to select the number and the body location of piezoresistive sensors in a wearable system for respiratory monitoring. IEEE Sens. J. 21(5), 6847-6855 (2020). https://doi.org/10.1109/JSEN.2020.3043140
10. Ay, B., Karakose, M.: Motion classification approach based on biomechanical analysis of human activities. In: 2013 IEEE International Conference on Computational Intelligence and Computing Research, Enathi, India, 28-30 December 2013. https://doi.org/10.1109/ICCIC.2013.6724198
11. Aziz, O., Robinovitch, S.N., Park, E.J.: Identifying the number and location of body worn sensors to accurately classify walking, transferring and sedentary activities. In: 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16-20 August 2016. https://doi.org/10.1109/EMBC.2016.7591851
12. Xu, W., Zhang, M., Sawchuk, A.A., et al.: Co-recognition of human activity and sensor location via compressed sensing in wearable body sensor networks. In: 2012 Ninth International Conference on Wearable and Implantable Body Sensor Networks, London, UK, 9-12 May 2012. https://doi.org/10.1109/BSN.2012.14
13. Benjaminse, A., et al.: A validity study comparing Xsens with Vicon. ISBS Proc. Arch. 38(1), 752 (2020)
14. Roetenberg, D., Luinge, H., Slycke, P.: Xsens MVN: Full 6DOF human motion tracking using miniature inertial sensors. Xsens Motion Technologies BV, Tech. Rep., vol. 1 (2009)
15. Muller, A., et al.: CusToM: A Matlab toolbox for musculoskeletal simulation. J. Open Source Software 4(33), 1-3 (2019). https://doi.org/10.21105/joss.00927
16. Skogstad, S., Nymoen, K., Høvin, M., Holm, S., Jensenius, A.: Filtering motion capture data for real-time applications (2013). http://urn.nb.no/URN:NBN:no-40245
17. Solav, D., et al.: Bone orientation and position estimation errors using Cosserat point elements and least squares methods: Application to gait. J. Biomech. 62, 110-116 (2017). https://doi.org/10.1016/j.jbiomech.2017.01.026
18. Lugo, R., Kung, P., Ma, C.B.: Shoulder biomechanics. Eur. J. Radiol. 68(1), 16-24 (2008). https://doi.org/10.1016/j.ejrad.2008.02.051
