  • TTU DSpace Home
  • ThinkTech
  • International Conference on Environmental Systems
  • View Item

Development of an Augmented Reality System for Human Space Operations

View/Open
ICES-2019-108.pdf (7.646 MB)
Date
2019-07-07
Author
Pinedo, Carlos
Dixon, Jordan
Chang, Christine
Auguste, Donna
Brewer, Mckenna
Desilva, Devin
Hill, Chris
Jensen, Cassidy
Jones, Amanda
Voss, James
Anderson, Allison
Metadata
Show full item record
Abstract
In this work we develop an augmented reality (AR) heads-up display for astronauts during extravehicular activity (EVA) operations. This work takes advantage of recent advances in commercial heads-up display (HUD) technology to simulate information delivery for an EVA astronaut in microgravity. The primary design objectives were to increase situation awareness, provide timely information to the user and supporting personnel, and facilitate communication among all system elements (user, ground control, and intravehicular astronaut). The design includes a visual interface that provides on-demand information in both egocentric (fixed to the user) and exocentric (fixed to the environment) perspectives. The displayed information includes astronaut bioinformatics, spacesuit informatics, checklist procedures, communication information, and basic navigation. The design also includes an audio interface that receives verbal commands from the user and provides auditory feedback and information. A novel method of interacting with the AR system was explored: electromyography (EMG). EMG captures the electrical signal output of muscle groups on the user's body and maps those signals to specific inputs to the AR system. In this way, the user's hands and voice remain free to complete other tasks as necessary while still maintaining a mode of communication with the AR device. To aid communication among all elements, remote display control via telestration (the ability of a remote user, such as ground control or an IVA astronaut, to draw over a still or video image) was included. This provided a means of visual communication to facilitate task completion, aid in emergency situations, highlight anomalies, increase user situation awareness, and decrease workload. Additional capability was provided for object-tool recognition and basic navigation assistance.
Preliminary subject testing highlighted the potential benefits of the following critical design elements: minimalistic visual display, redundancy of interaction through modalities, and continuity between internal and external display elements.
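The EMG interaction described above maps muscle activation signals to discrete AR inputs. A minimal illustrative sketch of that idea, assuming a simple moving-RMS envelope and an amplitude threshold (the channel data, threshold value, and command name here are hypothetical, not the authors' implementation):

```python
# Hypothetical sketch: mapping a rectified EMG envelope to a discrete AR command.
# The synthetic signal, threshold, and "NEXT_CHECKLIST_STEP" label are
# illustrative assumptions, not the system described in the paper.

def moving_rms(samples, window=3):
    """Root-mean-square envelope of a raw EMG sample stream."""
    envelope = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1):i + 1]
        envelope.append((sum(x * x for x in chunk) / len(chunk)) ** 0.5)
    return envelope

def classify(rms_value, threshold=0.3):
    """Map envelope amplitude above threshold to an AR command."""
    return "NEXT_CHECKLIST_STEP" if rms_value > threshold else None

signal = [0.02, 0.05, 0.8, 0.9, 0.85, 0.1]   # synthetic, pre-normalized EMG
commands = [classify(v) for v in moving_rms(signal)]
```

A real system would add per-user calibration and debouncing so that one sustained contraction produces a single command rather than one per sample.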
Citable Link
https://hdl.handle.net/2346/84611
Collections
  • International Conference on Environmental Systems

DSpace software copyright © 2002-2016  DuraSpace
Contact Us
TDL
Theme by 
Atmire NV