Faculty, students collaborate on augmented reality app for phonetic transcription practice

Baldwin Wallace faculty and undergraduate students in the Communication Sciences and Disorders and Computer Science departments are currently working together to design an augmented reality (AR) app to help speech-language pathology students hone phonetic transcription skills essential to their future careers.

Christie Needham, Associate Professor of Communication Sciences and Disorders, and Dr. Andrew Watkins, Assistant Professor of Computer Science, are the faculty leaders for the project. The student team includes communication sciences and disorders juniors Jamie Braden and Katie Geyer, along with computer science majors Haley Brown, a senior, and Katherine Cendrowski, a junior.

Needham and Watkins first came up with the idea for the app after Needham returned from an education technology conference inspired to find ways to use AR technology in BW's Communication Sciences and Disorders and Speech-Language Pathology (SLP) programs.

“I had attended a conference called Educause and came back and talked to Andrew Watkins, and we were discussing ways to use augmented reality in either teaching Communication Sciences and Disorders or for use in our speech clinic for therapy,” Needham said.  “So we worked together to come up with the idea of using it for Phonetics.”

Phonetics is a critical course for future speech-language pathologists, teaching the skill of phonetically transcribing others' speech, but it is not always very engaging, said Needham.

“It takes a lot of practice and can be kind of rote and boring, so we thought maybe this would be a great way to motivate students to practice phonetic transcription,” Needham said.

The AR transcription app project was scoped so that undergraduates interested in working on it could contribute meaningfully, while the finished app would be useful to BW's SLP students, Watkins said.

“We were looking for something that would be achievable and something that our undergraduate students could do that would sort of build, hopefully, to bigger systems,” said Watkins.  “And we landed on this gamification of learning phonetics.”

The AR component of the app will likely consist of a logo or graphic that appears on a device's screen as the user scans their surroundings with the camera, said Watkins. When the logo appears, users will be able to tap it to receive an IPA transcription challenge that they must beat in order to unlock more difficult challenges.

The International Phonetic Alphabet (IPA) is a system of symbols that represent the sounds of human speech and can be used across languages. The ability to transcribe patients' speech in IPA is an essential skill for speech-language pathologists, said Needham.
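The challenge mechanic described above can be pictured as a simple check-and-advance loop. The sketch below, in Python, is a hypothetical illustration of that idea only, not the team's actual code; the words, target transcriptions, difficulty levels, and function names are all assumptions made for the example.

```python
# Illustrative sketch of an IPA transcription challenge check.
# All challenge data below is placeholder content, not from the app.
CHALLENGES = [
    {"word": "cat", "ipa": "kæt", "level": 1},
    {"word": "ship", "ipa": "ʃɪp", "level": 1},
    {"word": "thought", "ipa": "θɔt", "level": 2},
]

def normalize(answer: str) -> str:
    """Trim whitespace and the optional /slashes/ around a transcription."""
    return answer.strip().strip("/")

def check_transcription(challenge: dict, answer: str) -> bool:
    """Return True when the learner's transcription matches the target."""
    return normalize(answer) == challenge["ipa"]

def next_level(challenges: list, current_level: int) -> list:
    """Challenges unlocked after beating the current difficulty tier."""
    return [c for c in challenges if c["level"] == current_level + 1]
```

In this sketch, a learner who taps the AR marker would be served a level-1 challenge such as "cat"; answering `/kæt/` passes the check, and clearing a tier unlocks the next one.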

The computer science students have worked on developing the app and creating an early prototype, said Watkins. Meanwhile, the communication sciences and disorders students have captured the recordings for the app and worked out how the transcriptions will function, said Needham. The two sides of the project meet weekly to check in and plan their next steps.

“It’s very much a collaboration,” said Watkins.  “When we have questions about what they need, they’re willing to answer.  When they have suggestions for the way the technology should work, we’re very willing to take those on.”

Both professors said that all members of the project team benefited from the interdepartmental team-up.

“I think the collaboration has helped my students see what goes into some of the technology we use in therapy and in classes,” Needham said.  “And I think it’s helped the computer science students see the thought process that needs to go behind the language that we use.”

Needham presented the project's progress at the American Speech-Language-Hearing Association conference in Los Angeles under the title "Phonetic GO: Using Augmented Reality for Teaching in Speech-Language Pathology."  The project team is now planning its next steps for the app, said Watkins.

“We’re hoping that we can build upon a variety of different interfaces to do a whole bunch of different things,” Watkins said.  “Maybe it’s learning this material.  Maybe it’s helping clinicians train.  It just depends on where we can go from there, but for right now, this is sort of our beginning project on how we could use augmented reality technology in Speech-Language Pathology training.”