Date: 2022-06-17
Paper number: ICES-2022-111
Handle: https://hdl.handle.net/2346/89649

Authors:
Linh Vu, Aegis Aerospace (NASA Anthropometry & Biomechanics Facility), US
Han Kim, Leidos (NASA Anthropometry & Biomechanics Facility), US
Alex Gordon, KBR (NASA Anthropometry & Biomechanics Facility), US
Sudhakar Rajulu, NASA, US

Session: ICES400: Extravehicular Activity: Space Suits
The 51st International Conference on Environmental Systems was held in Saint Paul, Minnesota, US, on 10 July 2022 through 14 July 2022.

Abstract:
Extravehicular Activity (EVA) spacesuits exhibit unique movement patterns due to their design characteristics. Mobility assessments using traditional motion capture systems are cost-prohibitive and not feasible in some training conditions (e.g., simulated outdoor lunar terrain). This paper presents the ongoing development of machine-learning solutions for quantifying suit motions from conventional video without special sensors or hardware. Given the rapid growth of deep- and machine-learning technologies, external expertise was sought from open-source communities, which was expected to accelerate development and yield more cost-effective, time-saving solutions. This work was selected as a NASA Crowdsourcing project through an agency-wide solicitation. Partnerships were formed with the NASA JSC Center of Excellence for Collaborative Innovation and a crowdsourcing execution platform partner to solicit framework developments from external contenders. NASA provided contenders with video clips of spacesuits and motion capture data measured simultaneously during EVA simulation tasks. The contenders used these data to train and develop generalized algorithms for predicting motions. At the end of the crowdsourcing event, five solutions were selected from 250 submissions. Each submission was tested and scored using video clips not previously disclosed to the contenders.
The scoring metrics measured how well each algorithm detected the suit shape, the accuracy of 2D suit joint detection, and the accuracy of 3D joint detection. The winning solution achieved roughly 85% prediction accuracy (a weighted combination of the scoring metrics). Overall, the algorithms could efficiently detect various types of spacesuits and motions across different EVA simulation environments, such as the Neutral Buoyancy Laboratory (NBL). However, 3D joint identification was less reliable when parts of the suit were occluded in the image. After continued improvement and validation, the fully developed system will enable EVA stakeholders to quantify suit kinematic patterns, which can help optimize suit, hardware, and task designs.

Format: application/pdf
Language: English
Keywords: Spacesuit kinematics; EVA postures; Biomechanics
Title: Machine-learning Solution for Automatic Spacesuit Motion Recognition and Measurement from Conventional Video
Type: Presentation
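The abstract does not give the exact scoring formulas. A minimal sketch of how such a weighted score might be computed, assuming a PCK-style metric for 2D joints (fraction of joints within a pixel threshold), an MPJPE-based score for 3D joints (mean per-joint position error mapped onto 0-1), and a shape-detection score such as an IoU; all thresholds and weights below are illustrative assumptions, not the competition's actual values:

```python
import math

def pck_2d(pred, gt, threshold_px=10.0):
    """Fraction of predicted 2D joints within threshold_px pixels of ground
    truth (PCK). pred/gt are lists of (x, y) tuples; the threshold is an
    illustrative assumption."""
    hits = sum(1 for (px, py), (gx, gy) in zip(pred, gt)
               if math.hypot(px - gx, py - gy) <= threshold_px)
    return hits / len(gt)

def score_3d(pred, gt, max_err_mm=150.0):
    """Map mean per-joint 3D position error (MPJPE, in mm) onto a 0-1 score.
    pred/gt are lists of (x, y, z) tuples; max_err_mm is an illustrative
    normalization constant, not NASA's actual cutoff."""
    mpjpe = sum(math.dist(p, g) for p, g in zip(pred, gt)) / len(gt)
    return max(0.0, 1.0 - mpjpe / max_err_mm)

def weighted_score(shape_score, acc_2d, acc_3d, weights=(0.2, 0.4, 0.4)):
    """Weighted combination of the three metrics; the weights are
    hypothetical placeholders for the competition's weighting."""
    total = sum(weights)
    return (weights[0] * shape_score
            + weights[1] * acc_2d
            + weights[2] * acc_3d) / total
```

A submission scoring perfectly on all three metrics would receive 1.0; the winning solution's roughly 85% accuracy would correspond to a combined score near 0.85 under whatever weighting the competition actually used.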