Ultrasound technology offers unique opportunities for studying language. The APIL Lab conducts research on the collection, analysis, and sharing of ultrasound data, particularly for phonetics and phonology.

We are currently developing software to help researchers in their ultrasound work. This includes UltraCapture, which synchronizes ultrasound, video, and three-dimensional Kinect data; UltraPraat, a modification of Praat that allows ultrasound to be visualized alongside spectrograms; and AutoTrace, which extracts tongue contours from ultrasound frames for further analysis. All of this software is available, in alpha versions, on the APIL GitHub site.
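For readers unfamiliar with the synchronization problem, the sketch below illustrates one simple way recording streams could be aligned: pairing frames whose timestamps are closest. It is a hypothetical simplification, not a description of how UltraCapture is actually implemented, and the stream names and timestamps are assumptions.

```python
from bisect import bisect_left

def align_streams(ultrasound_times, video_times):
    """Pair each ultrasound frame with the video frame whose timestamp
    (in seconds) is nearest. video_times must be sorted. Illustrative only;
    real capture software must also handle clock drift and dropped frames."""
    pairs = []
    for i, t in enumerate(ultrasound_times):
        j = bisect_left(video_times, t)
        # Consider the neighbours on either side of the insertion point.
        candidates = [k for k in (j - 1, j) if 0 <= k < len(video_times)]
        best = min(candidates, key=lambda k: abs(video_times[k] - t))
        pairs.append((i, best))
    return pairs

# Toy timestamps in seconds
us = [0.00, 0.02, 0.04, 0.06]
vid = [0.00, 0.033, 0.067]
print(align_streams(us, vid))  # [(0, 0), (1, 1), (2, 1), (3, 2)]
```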

These are some of our current activities:

Scottish Gaelic:
We are analyzing data collected on the Isle of Skye in June 2013 to better understand the articulation of Initial Consonant Mutations, palatal and palatalized consonants, and epenthetic vowels. Key personnel: Sam Johnston, Jae-Hyun Sung.

A Cross-linguistic study of palatalization:
This dissertation project examines lexical, derived, and post-lexical palatalization in Korean, English, and Scottish Gaelic. Key personnel: Jae-Hyun Sung.

Data management:
We are developing a database structure for managing ultrasound image frames and their associated traces. Key personnel: Mohsen Mahdavi.
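As a rough sketch of what such a structure might look like, the example below defines two related tables, one for frames and one for traced contour points. The table and column names are illustrative assumptions, not the lab's actual schema.

```python
import sqlite3

# Hypothetical schema: one row per ultrasound frame, many trace
# points per frame. Not the lab's actual design.
conn = sqlite3.connect("ultrasound.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS frames (
    frame_id   INTEGER PRIMARY KEY,
    recording  TEXT NOT NULL,        -- session or file identifier
    frame_time REAL NOT NULL,        -- seconds from recording start
    image_path TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS trace_points (
    point_id  INTEGER PRIMARY KEY,
    frame_id  INTEGER NOT NULL REFERENCES frames(frame_id),
    x         REAL NOT NULL,         -- pixel coordinates of the contour
    y         REAL NOT NULL,
    source    TEXT NOT NULL          -- e.g. 'manual' or 'automatic'
);
""")
conn.commit()
```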

Data analysis:
We are refining AutoTrace, originally developed by Ian Fasel and Jeff Berry, to improve the automated identification of tongue contours in ultrasound images. Key personnel: Gus Hahn-Powell.
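For illustration only, the sketch below shows a deliberately naive baseline for this task: picking the brightest pixel in each column of a region of interest and smoothing the result. This is not AutoTrace's method, and the region and parameters are arbitrary assumptions.

```python
import numpy as np

def naive_contour(frame, col_range=(60, 260), smooth=5):
    """Very rough tongue-contour guess from a grayscale ultrasound
    frame (2-D numpy array): take the brightest row in each column
    of a region of interest, then apply a moving average.
    Illustrative baseline only."""
    cols = np.arange(*col_range)
    rows = frame[:, cols].argmax(axis=0)           # brightest pixel per column
    kernel = np.ones(smooth) / smooth
    rows = np.convolve(rows, kernel, mode="same")  # light smoothing
    return np.column_stack([cols, rows])           # (x, y) points

# Example with a synthetic frame
frame = np.random.rand(480, 640)
print(naive_contour(frame)[:3])
```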

Data collection:
We are exploring the use of the Kinect to track head movement during data collection, in order to minimize the use of helmets and head-immobilizing technologies. Key personnel: Rolando Coto.
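The underlying geometric idea is a rigid-body correction: if a tracker reports the head's rotation and translation at each moment, measurements can be re-expressed in a head-fixed frame. The sketch below illustrates that correction in isolation; the pose convention and variable names are assumptions, not a description of the lab's pipeline.

```python
import numpy as np

def to_head_frame(points, R, t):
    """Map Nx3 points from the camera/world frame into a head-fixed
    frame, given the head pose (rotation matrix R, translation t)
    reported by a tracker. Inverse of p_world = R @ p_head + t."""
    return (points - t) @ R          # equivalent to (R.T @ (p - t).T).T

# Toy example: head rotated 10 degrees about the vertical axis
theta = np.radians(10)
R = np.array([[np.cos(theta), 0, np.sin(theta)],
              [0, 1, 0],
              [-np.sin(theta), 0, np.cos(theta)]])
t = np.array([0.0, 0.05, 0.0])
points = np.array([[0.1, 0.2, 0.3]])
print(to_head_frame(points, R, t))
```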

You may also be interested in the ultrasound laboratories at these other universities: