In a typical yoga class, students watch an instructor to learn how to properly hold a position. But for people who are blind or can't see well, it can be frustrating to participate in these types of exercises. Now, a team of University of Washington computer scientists has created a software program that watches a user's movements and gives spoken feedback on what to change to accurately complete a yoga pose.
"My hope for this technology is for people who are blind or low-vision to be able to try it out, and help give a basic understanding of yoga in a more comfortable setting," said project lead Kyle Rector, a UW doctoral student in computer science and engineering.
The program, called Eyes-Free Yoga, uses Microsoft Kinect software to track body movements and offer auditory feedback in real time for six yoga poses, including
Warrior4 I and II, Tree and Chair poses. Rector and her collaborators published their
methodology(方法学) in the conference
proceedings5 of the Association for
Computing6 Machinery's SIGACCESS International Conference on Computers and Accessibility in Bellevue, Wash., Oct. 21-23.
Rector wrote programming code that instructs the Kinect to read a user's body angles, then gives verbal feedback on how to adjust his or her arms, legs, neck or back to complete the pose. For example, the program might say: "Rotate your shoulders left," or "Lean sideways toward your left."
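To make the idea concrete, here is a minimal sketch in Python of the kind of check described above: compute the angle at a joint from three tracked points, compare it to a target range, and produce a spoken-style correction. The joint names, target angles, and feedback phrases are illustrative assumptions, not the project's actual code or rules.

```python
# Hypothetical sketch: measure a joint angle from Kinect-style skeleton
# points and return a verbal correction if it falls outside a target range.
import math

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by points a-b-c, each an (x, y, z) tuple."""
    ab = tuple(a[i] - b[i] for i in range(3))
    cb = tuple(c[i] - b[i] for i in range(3))
    dot = sum(ab[i] * cb[i] for i in range(3))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / norm))

def check_front_knee(hip, knee, ankle, target=(85.0, 95.0)):
    """Return a correction if the front-knee angle misses the target range."""
    angle = joint_angle(hip, knee, ankle)
    low, high = target
    if angle < low:
        return "Straighten your front leg slightly."
    if angle > high:
        return "Bend your front knee more."
    return "Good, hold that position."

# Example with made-up skeleton coordinates (meters, camera space).
print(check_front_knee(hip=(0.0, 1.0, 2.0),
                       knee=(0.1, 0.55, 2.0),
                       ankle=(0.3, 0.1, 2.0)))
```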
The result is an accessible yoga "exergame" -- a video game used for exercise -- that allows people without sight to interact verbally with a simulated yoga instructor. Rector and collaborators Julie Kientz, a UW assistant professor in Computer Science & Engineering and in Human Centered Design & Engineering, and Cynthia Bennett, a research assistant in computer science and engineering, believe this can transform a typically visual activity into something that blind people can also enjoy.
"I see this as a good way of
helping people who may not know much about yoga to try something on their own and feel comfortable and confident doing it," Kientz said. "We hope this acts as a gateway to encouraging people with visual impairments to try exercise on a broader scale."
Each of the six poses has about 30 different commands for improvement based on a dozen rules deemed essential for each yoga position. Rector worked with a number of yoga
instructors to put together the criteria for reaching the correct alignment in each pose. The Kinect first checks a person's core and suggests alignment changes, then moves to the head and neck area, and finally the arms and legs. It also gives positive feedback when a person is holding a pose correctly.
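The ordered, rule-based feedback described here can be sketched roughly as a prioritized checklist: core rules are evaluated first, then head and neck, then limbs, and only the highest-priority failing rule is spoken, with praise when everything passes. The rule predicates and messages below are placeholders, assumed for illustration rather than taken from the real rule set.

```python
# Hypothetical sketch of ordered, rule-based pose feedback:
# evaluate rules in priority order and report the first failure.

def check_pose(measurements, rules):
    """rules: list of (predicate, correction) pairs in priority order.
    Each predicate takes the measurements dict and returns True when satisfied."""
    for passes, correction in rules:
        if not passes(measurements):
            return correction  # first failing rule wins
    return "Great, you are holding the pose correctly."

# Illustrative rules for a single pose, ordered core -> head/neck -> limbs.
tree_pose_rules = [
    (lambda m: abs(m["torso_lean_deg"]) < 5, "Straighten your back."),
    (lambda m: abs(m["neck_tilt_deg"]) < 10, "Level your head."),
    (lambda m: m["arm_raise_deg"] > 150, "Raise your arms higher above your head."),
]

print(check_pose({"torso_lean_deg": 2, "neck_tilt_deg": 3, "arm_raise_deg": 120},
                 tree_pose_rules))
# -> "Raise your arms higher above your head."
```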
Rector practiced a lot of yoga as she developed this technology. She tested and tweaked each aspect by
deliberately making mistakes while performing the exercises. The result is a program that she believes is robust and useful for people who are blind.
"I tested it all on myself so I felt comfortable having someone else try it," she said.