Understanding and Implementing Audio on Android

As app developers, we rarely think about how sounds are processed on Android devices because we simply let the Android system handle them for us. In this session, we will take a deep dive into how audio is processed by the Android system. By understanding the way sounds are played by Android, we can then look at what it takes to play sounds with low latency and minimal silence. We’ll also look at the different options you have when implementing sound in your app, such as SoundPool, MediaPlayer, or ExoPlayer, and examine the APIs available in each to see what is and isn’t possible. By the end of the session, attendees should have a clear understanding of how audio is processed on Android, and how to best implement audio playback for their own application’s needs.
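To give a flavor of the kind of trade-offs the session covers, here is a minimal Kotlin sketch contrasting SoundPool (short, low-latency clips that are decoded up front) with MediaPlayer (longer tracks streamed from a resource). The context variable and the R.raw resource names are placeholders for illustration, not part of the session material.

    import android.content.Context
    import android.media.AudioAttributes
    import android.media.MediaPlayer
    import android.media.SoundPool

    fun playShortClip(context: Context) {
        val attributes = AudioAttributes.Builder()
            .setUsage(AudioAttributes.USAGE_GAME)
            .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
            .build()

        val soundPool = SoundPool.Builder()
            .setMaxStreams(4)               // how many short clips may overlap
            .setAudioAttributes(attributes)
            .build()

        // SoundPool decodes the clip into memory; play() only works once loading finishes.
        soundPool.load(context, R.raw.click, /* priority = */ 1)
        soundPool.setOnLoadCompleteListener { pool, sampleId, status ->
            if (status == 0) {
                pool.play(sampleId, /* leftVolume = */ 1f, /* rightVolume = */ 1f,
                          /* priority = */ 1, /* loop = */ 0, /* rate = */ 1f)
            }
        }
    }

    fun playBackgroundTrack(context: Context) {
        // MediaPlayer is better suited to longer audio that shouldn't be held in memory.
        val player = MediaPlayer.create(context, R.raw.background_music)
        player.isLooping = true
        player.start()
    }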

Caren Chang, June

Caren is an Android developer currently at June, helping build an intelligent oven that recognizes and cooks your food. She is most interested in how Android APIs work under the hood and in building cool animations. Caren graduated from UC Berkeley and has been based in the Bay Area ever since.