Human standing balance requires continuously nulling the tendency to fall like an inverted pendulum, and this process relies on the convergence of information from multiple sensory modalities, including vestibular, visual, auditory, proprioceptive, and somatosensory signals. Dynamic control of unstable balance is also a critical and difficult task in aviation and spaceflight: pilots may lose orientation and vehicle control during unstable maneuvers, such as helicopter hovering in a degraded visual environment or when vestibular signals are ambiguous. It is therefore imperative to determine whether training with combinations of sensory modalities can aid the ability to balance in operational environments where some sensory signals are insufficient or distorted.

The goal of this work during the Link Foundation Fellowship was to determine whether training protocols using auditory environments that provide location, velocity, and acceleration information about self-motion can enhance alignment with the direction of balance (DOB) and the direction of gravity (DOG). Studies at the Graybiel Laboratory have shown that when the static DOG and dynamic pendulum DOB are experimentally uncoupled, self-balancing involves synergistic, dissociable vestibular/somatosensory mechanisms for orienting to each direction (Panic et al., 2015). The Graybiel Laboratory Multi-Axis Rotation System (MARS) was programmed with parameters that simulated inverted pendulum dynamics about the roll axis. Subjects used a manual joystick to control the MARS during supine (90° pitch) roll balancing, where semicircular canal cues about the dynamic DOB were present and otolith cues about the DOG were absent.
Orientation to the DOG could involve determining and regulating the angular error between the body's angular position and the gravitational upright; dynamic stabilization without any reference to the DOG could instead be accomplished using angular acceleration signals contingent on the direction and magnitude of deviation from the DOB. Experimentally, it has been observed that without a reference to gravity, subjects have more difficulty orienting to the DOB in the supine orientation, a phenomenon termed positional drifting (Panic et al., 2015). We used the supine roll condition to assess the efficacy of auditory cues in suppressing positional drifting without a gravity reference. Auditory cues took the form of gated white noise bursts with interaural time differences proportional to angular deviation from the DOB. The experimental group received auditory cues while balancing; the control group was not provided with any auditory cues. The direction of balance was set to 0°. It was expected that, with auditory training and without relevant gravitational cues, subjects would use the interaural time and level differences to determine the direction of balance and would demonstrate better performance and less positional drifting than the control group. The results of this study could potentially be used to enhance the design of multi-sensory training simulators.

A refined Neuro Kinetics Multi-Axis Rotation System (MARS) was programmed with parameters that simulated inverted pendulum dynamics about the roll axis, corresponding to the equation φ̈ = kP sin φ, where φ is the angular deviation (in degrees) from the direction of balance and kP is the pendulum constant. Subjects used a Logitech Freedom 2.4 joystick with a light spring loaded towards the central position. The MARS rotated in roll about a vertical axis in the mid-sagittal plane through the subject's center of mass. A speaker was placed behind the MARS at the direction of balance.
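To make the pendulum dynamics concrete, the equation φ̈ = kP sin φ can be integrated numerically. The sketch below is illustrative only: the proportional-derivative "subject" standing in for joystick control is a hypothetical controller, not a model fit to any data from this study.

```python
import math

def simulate_pendulum(kp, control, phi0=1.0, dt=0.01, steps=500):
    """Semi-implicit Euler integration of phi'' = kp*sin(phi) + u, with phi
    in degrees and u a control acceleration (deg/s^2) from the joystick."""
    phi, phi_dot = phi0, 0.0  # angular deviation (deg) and velocity (deg/s)
    trace = []
    for _ in range(steps):
        u = control(phi, phi_dot)
        phi_ddot = kp * math.sin(math.radians(phi)) + u
        phi_dot += phi_ddot * dt
        phi += phi_dot * dt
        trace.append(phi)
    return trace

# Hypothetical proportional-derivative controller: push against the fall.
pd_control = lambda phi, phi_dot: -(3.0 * phi + 1.5 * phi_dot)

stabilized = simulate_pendulum(kp=60.0, control=pd_control)
uncontrolled = simulate_pendulum(kp=60.0, control=lambda p, v: 0.0)
```

With active control the deviation decays toward the DOB at 0°; with no control input, even a 1° initial deviation grows past the 30° boundary within a few seconds, which is why continuous joystick corrections are required.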
The speaker produced a range of frequencies (white noise) with a burst length of 1 ms and a 30 ms latency between bursts. Microphones were placed on the left and right sides of the MARS, 18 cm apart, with a distance of 13.5 cm between each microphone and the speaker. This setup allowed participants to gain polarity and dynamic cues from interaural time differences (ITD) and interaural level differences (ILD). For example, if the participant moved to the left of the sound source, the sound would have been louder and would have arrived sooner at the right ear than at the left.

Prior to experimentation, subjects were informed that the MARS was programmed to exhibit inverted pendulum dynamics. They were instructed to move the joystick in the direction opposite the MARS movement to remain at the center of balance. Subjects were also informed that 30° boundaries were programmed. After instructions, subjects put on noise-cancelling headphones and blindfolds and were secured in the MARS. Subjects were not given practice trials prior to experimentation. Subjects in both the control and experimental groups balanced in the supine orientation on both days. The first and second days of experimentation each consisted of five blocks, with each block consisting of four trials. Each trial continued until the subject accumulated 100 seconds of balance time or the trial period reached 120 seconds. After every four trials, subjects were given a 1-minute break and asked to rate their degree of nausea on a scale of 0 to 10. Control participants were not provided with auditory or visual cues at any point during experimentation. Participants in the experimental group were provided with auditory white noise clicks without visual cues. On the first day, subjects balanced the MARS at a pendulum constant that was incrementally increased from 60 °/s² to 600 °/s² in increments of 60 °/s².
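The ITD cue described above can be sketched from the stated geometry (microphones 18 cm apart, each 13.5 cm from the speaker at the DOB). Placing the speaker on the perpendicular bisector of the microphone pair is an assumption for illustration; the actual MARS mounting may differ.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, at room temperature
MIC_SEPARATION = 0.18   # m, left/right microphones on the MARS
SPEAKER_DIST = 0.135    # m, speaker to each microphone at the DOB

half = MIC_SEPARATION / 2
# Assumed layout: speaker on the perpendicular bisector of the mic pair,
# so its offset from the mic axis follows from the two stated distances.
offset = math.sqrt(SPEAKER_DIST**2 - half**2)

def itd_for_deviation(theta_deg):
    """ITD in seconds when the mic pair rotates theta_deg about its center.

    Positive ITD means the left path is longer, i.e. the sound arrives
    at the right ear first."""
    th = math.radians(theta_deg)
    left = (-half * math.cos(th), -half * math.sin(th))
    right = (half * math.cos(th), half * math.sin(th))
    speaker = (0.0, offset)
    return (math.dist(left, speaker) - math.dist(right, speaker)) / SPEED_OF_SOUND
```

Under these assumptions the ITD is zero at the DOB, antisymmetric about it, and on the order of tens of microseconds for a 10° deviation, so its sign gives the polarity cue and its magnitude grows with deviation.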
On the second day of experimentation, subjects were tested at 600 °/s² in all trials. The direction of balance was set to 0°. If a subject's position in the MARS deviated more than 30° from the direction of balance, the subject received an auditory message indicating that control was lost and that the machine was resetting.
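The trial-termination rules above (100 s of cumulative balance, a 120 s cap, and the 30° boundary) can be sketched as a simple loop. How reset periods count toward the trial clock is not specified in the report, so treating out-of-bounds samples as non-balancing time is an assumption of this sketch.

```python
BOUNDARY_DEG = 30.0       # deviation triggering the "control lost" reset
TARGET_BALANCE_S = 100.0  # cumulative balance time that ends a trial
TRIAL_CAP_S = 120.0       # hard limit on trial duration

def run_trial(step_fn, dt=0.01):
    """step_fn(dt) -> current roll deviation in degrees.

    Ends when the subject accumulates 100 s of in-bounds balancing or the
    trial reaches 120 s, whichever comes first. One boundary crossing
    corresponds to one "control lost... resetting" message."""
    elapsed = balanced = 0.0
    resets = 0
    was_out = False
    while elapsed < TRIAL_CAP_S and balanced < TARGET_BALANCE_S:
        phi = step_fn(dt)
        elapsed += dt
        out = abs(phi) > BOUNDARY_DEG
        if out and not was_out:
            resets += 1           # new boundary crossing
        if not out:
            balanced += dt        # only in-bounds time counts as balancing
        was_out = out
    return {"elapsed": elapsed, "balanced": balanced, "resets": resets}
```

For a subject who stays at the DOB throughout, the trial ends at 100 s with no resets; a subject pinned beyond the boundary accrues no balance time and the trial runs to the 120 s cap.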
Fakharzadeh, Lila Naheed, "Auditory and Vestibular Control of Inverted Pendulum Dynamics in Spatial Orientation" (2019). Link Foundation Modeling, Simulation and Training Fellowship Reports. 17.