Segmentation boundaries in accelerometer data of arm motion induced by music: Online computation and perceptual assessment

Juan Ignacio Mendoza Garay
University of Jyväskylä
ORCID: 0000-0003-3996-7537

Published: 2022-12-28

Keywords: gestural interface, perceptual evaluation, temporal segmentation, accelerometer, bodily motion, similarity


Segmentation is a cognitive process involved in the understanding of information perceived through the senses. Likewise, the automatic segmentation of data captured by sensors may be used for the identification of patterns. This study is concerned with the segmentation of dancing motion captured by accelerometry and its possible applications, such as pattern learning and recognition, or gestural control of devices. To that end, an automatic segmentation system was formulated and tested. Two participants were asked to ‘dance with one arm’ while their motion was measured by an accelerometer. The performances were recorded on video and later manually segmented by six annotators. The annotations were used to optimize the automatic segmentation system, maximizing a novel similarity score between computed and annotated segmentations. The computed segmentations with the highest similarity to each annotation were then manually assessed by the annotators, resulting in Precision between 0.71 and 0.89, and Recall between 0.82 and 1.
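The Precision and Recall figures reported above imply matching each computed boundary against the human annotations within some temporal tolerance. The paper's own similarity score and tolerance are not given here, so the following is only a minimal sketch of that kind of evaluation: boundaries are timestamps in seconds, the greedy one-to-one matching and the `tolerance` value are assumptions, not the author's method.

```python
def boundary_precision_recall(computed, annotated, tolerance=0.5):
    """Match annotated boundaries (seconds) to unmatched computed
    boundaries within `tolerance` seconds, then return
    (precision, recall) over the matched pairs."""
    used = set()   # indices of computed boundaries already matched
    matched = 0
    for a in annotated:
        best, best_d = None, tolerance
        for i, c in enumerate(computed):
            d = abs(c - a)
            if i not in used and d <= best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            matched += 1
    precision = matched / len(computed) if computed else 0.0
    recall = matched / len(annotated) if annotated else 0.0
    return precision, recall
```

For example, with computed boundaries `[1.0, 3.0, 5.0, 9.0]` and annotations `[1.2, 5.1, 7.0]`, two boundaries match within 0.5 s, giving Precision 0.5 and Recall 2/3. A stricter evaluation might use optimal bipartite matching instead of this greedy pass.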





How to Cite

Mendoza Garay, J. I. (2022). Segmentation boundaries in accelerometer data of arm motion induced by music: Online computation and perceptual assessment. Human Technology, 18(3), 250–266.