A working sound-playing app sketch - works as of May 27, 2018


Thanks to ‘kfrajer’ for his help in getting sound files to play in an Android app created with Processing. Much appreciated!

Here’s a sketch that demonstrates how to do it:

// works with Processing 3.3.7 as of May 27 2018
// PHEW!
// 2018 - Gord Payne - with help from 'kfrajer'
// on the discourse.processing.org/c/processing-android forum

import android.media.MediaPlayer;
import android.content.res.AssetFileDescriptor;
import android.content.Context;
import android.app.Activity;
import java.io.IOException; // for the catch block below

MediaPlayer s1, s2, s3, s4; // for sound MediaPlayer objects
Context context; // these two objects determine the file access path to the sound files
Activity act;
AssetFileDescriptor af1, af2, af3, af4; // 4 file path descriptors

void setup() {
  act = this.getActivity();
  context = act.getApplicationContext();
  try {
    s1 = new MediaPlayer();
    s2 = new MediaPlayer();
    s3 = new MediaPlayer();
    s4 = new MediaPlayer();
    // all your .mp3 or other sound files go in the 'data' folder in your project folder
    af1 = context.getAssets().openFd("fogleg09.mp3");
    af2 = context.getAssets().openFd("bd04.mp3");
    af3 = context.getAssets().openFd("bugs11.mp3");
    af4 = context.getAssets().openFd("daffy10.mp3");

    // the offset and length tell us where each file starts and how long it is.
    // if these parameters are not specified, all sounds in the data folder will be played in sequence
    // even if you only want one
    s1.setDataSource(af1.getFileDescriptor(), af1.getStartOffset(), af1.getLength());
    s2.setDataSource(af2.getFileDescriptor(), af2.getStartOffset(), af2.getLength());
    s3.setDataSource(af3.getFileDescriptor(), af3.getStartOffset(), af3.getLength());
    s4.setDataSource(af4.getFileDescriptor(), af4.getStartOffset(), af4.getLength());

    // prepare each sound for playing/usage
    s1.prepare();
    s2.prepare();
    s3.prepare();
    s4.prepare();
  }
  catch (IOException e) {
    println("Error loading sound files: " + e);
  }

  // draw some buttons to mousePress on screen
  fill(200, 0, 0); // red
  rect(100, 100, 50, 50);
  fill(0, 200, 0); // green
  rect(100, 200, 50, 50);
  fill(0, 0, 200); // blue
  rect(200, 100, 50, 50);
  fill(200, 0, 200); // magenta
  rect(200, 200, 50, 50);
}

void draw() {
  // nothing to animate; the buttons are drawn once in setup()
}

void mousePressed() { // check which screen zone you picked and play that sound
  if ((mouseX>100)&&(mouseX<150)&&(mouseY>100)&&(mouseY<150)) {
    s1.start();
  }
  if ((mouseX>100)&&(mouseX<150)&&(mouseY>200)&&(mouseY<250)) {
    s2.start();
  }
  if ((mouseX>200)&&(mouseX<250)&&(mouseY>100)&&(mouseY<150)) {
    s3.start();
  }
  if ((mouseX>200)&&(mouseX<250)&&(mouseY>200)&&(mouseY<250)) {
    s4.start();
  }
}
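As an aside, the four rectangle checks can also be factored into a single hit-test helper. A plain-Java sketch of the idea (the ZoneDemo class, the zoneAt name, and the return convention are mine, not part of the original sketch):

```java
public class ZoneDemo {
  // Maps a touch point to a button index (0-3), or -1 if no button was hit.
  // The rectangles match the four 50x50 buttons drawn in setup():
  // 0 = red, 1 = green, 2 = blue, 3 = magenta.
  static int zoneAt(int x, int y) {
    int col = -1, row = -1;
    if (x > 100 && x < 150) col = 0;
    else if (x > 200 && x < 250) col = 1;
    if (y > 100 && y < 150) row = 0;
    else if (y > 200 && y < 250) row = 1;
    if (col < 0 || row < 0) return -1;
    return col * 2 + row;
  }

  public static void main(String[] args) {
    System.out.println(zoneAt(120, 120)); // inside the red button
    System.out.println(zoneAt(50, 50));   // outside every button
  }
}
```

Combined with an array of players (see the array tip further down the thread), mousePressed could then shrink to something like: int z = zoneAt(mouseX, mouseY); if (z >= 0) { play sound z; }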

// housekeeping: release each MediaPlayer when the app is paused or stopped

public void pause() {
  if (s1 != null) {
    s1.release();
    s1 = null;
  }
  if (s2 != null) {
    s2.release();
    s2 = null;
  }
  if (s3 != null) {
    s3.release();
    s3 = null;
  }
  if (s4 != null) {
    s4.release();
    s4 = null;
  }
}

public void stop() {
  if (s1 != null) {
    s1.release();
    s1 = null;
  }
  if (s2 != null) {
    s2.release();
    s2 = null;
  }
  if (s3 != null) {
    s3.release();
    s3 = null;
  }
  if (s4 != null) {
    s4.release();
    s4 = null;
  }
}

A lil’ cool trick: when we’ve got a sequence of numbered variables such as these:

MediaPlayer s1, s2, s3, s4;
AssetFileDescriptor af1, af2, af3, af4;

It’s highly advisable to turn those variable sequences into arrays:

static final int PLAYERS = 4;
final MediaPlayer[] players = new MediaPlayer[PLAYERS];
final AssetFileDescriptor[] descriptors = new AssetFileDescriptor[PLAYERS];


Also notice that both array variables, players[] and descriptors[], have the same length, equal to PLAYERS.

And when accessing them with the [] operator, the index used on one array is the same index used when accessing the other.
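For instance, one loop replaces four near-identical statements. A plain-Java sketch of the pattern, with simple Strings standing in for the MediaPlayer and AssetFileDescriptor objects so it runs anywhere (the ParallelArrays class and pair method are illustrative names of mine):

```java
public class ParallelArrays {
  // Pairs each "player" slot with its file: one index addresses both arrays.
  static String[] pair(String[] files) {
    String[] labels = new String[files.length];
    for (int i = 0; i < files.length; i++) {
      labels[i] = "player " + i + " -> " + files[i];
    }
    return labels;
  }

  public static void main(String[] args) {
    // The same four files as in the sketch above.
    String[] files = { "fogleg09.mp3", "bd04.mp3", "bugs11.mp3", "daffy10.mp3" };
    for (String label : pair(files)) {
      System.out.println(label);
    }
  }
}
```

In the real sketch the loop body would instead create each MediaPlayer, open each descriptor, and call setDataSource() and prepare(), all at the same index i.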

Well, in that situation, given that the same index applies to both arrays, you can go even further and make a class that stores both as fields:

class Player {
  final MediaPlayer player;
  final AssetFileDescriptor descriptor;

  Player(final MediaPlayer play, final AssetFileDescriptor descript) {
    player = play;
    descriptor = descript;
  }
}

And now, rather than multiple arrays, you only need one to access both via that class:

static final int PLAYERS = 4;
final Player[] players = new Player[PLAYERS];

You can even add methods to that class if you wish.


@gpteacher – thank you for sharing with the forum!


Thank you for sharing your code with us.

Where do you put the data folder? I work in Android Studio, and my projects are located in a folder called “AndroidStudioProjects”, with my sketch project in “MySketch”.

Do I have to put the data folder in MySketch or MySketch/src? Thank you for your help.


That’s a cool audio file you have there. Can you add sound synthesis to it, like playing a sine wave, white noise, and so on? Audio files probably aren’t sufficient for sound effects, especially if it takes too long to buffer them.

Related to Yassin’s comment, does anyone know how to make Android apps with Android Studio from Processing sketches? What’s the best way to build Android apps in Processing code? Is there a definitive, easy-to-follow resource for this?


The ‘data’ folder just goes inside the Processing sketch folder, at the same level as your sketch’s .pde file.


Hi Birb:

I worked with the sineTo method on a music app a few years back. But it uses Minim, so that’s a no-go for Android.

You’ll have to explore libraries that do tone generation and work alongside MediaPlayer, is my best guess.
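One route that needs no extra library is to compute the samples yourself. Here is a plain-Java sketch that fills a buffer with 16-bit PCM sine-wave samples; on Android, such a buffer could then be handed to something like android.media.AudioTrack. The ToneSketch class, the sineWave name, and its parameters are mine, just an illustration:

```java
public class ToneSketch {
  // Generates a buffer of 16-bit PCM samples for a sine tone.
  // freq in Hz, sampleRate in samples per second, seconds of duration.
  static short[] sineWave(double freq, int sampleRate, double seconds) {
    int n = (int) (sampleRate * seconds);
    short[] samples = new short[n];
    for (int i = 0; i < n; i++) {
      double t = (double) i / sampleRate;          // time of sample i
      samples[i] = (short) (Math.sin(2 * Math.PI * freq * t) * Short.MAX_VALUE);
    }
    return samples;
  }

  public static void main(String[] args) {
    // Half a second of A4 (440 Hz) at CD sample rate.
    short[] tone = sineWave(440.0, 44100, 0.5);
    System.out.println(tone.length + " samples, first = " + tone[0]);
  }
}
```

White noise is even simpler: fill the same kind of buffer with random short values instead of sine samples.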

Hope that gives you a place to start.


go to

I have links to processing.org’s how-to pages for android app development with some clarifications I made to help understand some of the steps.

Hope this helps.



Hello gpteacher, thank you for your clarification.

I do know where to put the data folder when I’m working in the PDE; the thing is that I’m working in Android Studio, and it’s not the same, as there is no .pde file in the project.

I was wondering if you had any ideas on the subject.


Hi Yassin. I don’t have experience with Android Studio.

Putting out your call for assistance here is a great start. I wish I could be of more help.

Best wishes



@Yassin : In Android Studio mp3/ogg/wav files are to be placed here :


And image files go here :


@GoToLoop :

It’s highly advisable to turn those variable sequences into arrays:

Just wondering: for the sake of speed, or for readability?


I believe this poster gave that comment as a suggestion for efficient grouping of data of the same type and purpose. My original version was aimed at Grade 10 high-school learners who haven’t learned about arrays yet, so my presentation is meant for readability for new learners. Either approach is fine.


@yassin Try this: create an empty sketch in the PDE in Android mode. Add an image of your choice to the data folder. Now export the sketch (you need to save it first). You will find inside your sketch folder a folder called export, and inside there is a folder labeled App. That is where Processing places your files. As described by @Zappy, your image will be located in the assets folder. So after you export the sketch, you can open it directly in AS and it will work without making any changes.




Working with AS (and supposing you have imported the P5 file), you have to distinguish your resources:

  • .html, .txt…: are to be put into the raw folder (create it if it does not exist)

  • images are to be put into the drawable folders, according to size and screen density (ldpi, mdpi, xhdpi, xxhdpi, xxxhdpi); if size does not matter, into a general folder called “drawable”

  • .mp3, .mp4, .3gp… are to be put into the assets folder.



I really should have mentioned that I’m a beginner, and that the paths I gave above for image and sound files worked for my app, this “my app” being the only one I have written so far…

The images I use are .svg files; maybe that’s the reason why I couldn’t use the drawable folder(s) for them to be displayed.

As for the sound files (ogg format), I simply followed the Android guide for MediaPlayer.

So much to learn! Thanks to you and @gpteacher :=)


Everybody begins… and with coding there is no end to it…
As for files like .svg, I suppose they have to be put in the assets folder when using P5; in AS they have to be imported into drawable via the Vector Asset import command.


Hello! Thanks for sharing the code for how to play an audio file on Android.

I’ve tried the Cassette library but sometimes it stops working; this seems to be working fine.

Is there any way to play an audio file again once you stop it?

I’m putting s1.stop(), but then when I put s1.play() it won’t start again. Any suggestions?


@Jpupper If you are referring to the Cassette library: after you call stop(), you need to assign the resources again before you call play() again, if I remember correctly. I haven’t used it in a while. Cassette is a nice library for small demos and quick prototyping, but I have to say it has limited functionality. MediaPlayer should be your second, and better, option.
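With plain MediaPlayer, the documented state rules are the likely culprit: after stop(), the player sits in the Stopped state, and start() is only valid again after another prepare() (or prepareAsync()). So the fix is stop(), then prepare(), then start(). A tiny plain-Java state machine sketching those rules; this is a simplified model with names of mine, not real Android code (the real MediaPlayer also allows start() from the Paused and PlaybackCompleted states):

```java
public class PlayerState {
  enum State { PREPARED, STARTED, STOPPED }
  State state = State.STOPPED;

  // prepare() is legal from Stopped and moves us to Prepared.
  boolean prepare() {
    if (state == State.STOPPED) { state = State.PREPARED; return true; }
    return false;
  }

  // start() only succeeds from Prepared in this simplified model.
  boolean start() {
    if (state == State.PREPARED) { state = State.STARTED; return true; }
    return false;
  }

  // stop() from Started goes back to Stopped; start() now fails until prepare().
  boolean stop() {
    if (state == State.STARTED) { state = State.STOPPED; return true; }
    return false;
  }

  public static void main(String[] args) {
    PlayerState p = new PlayerState();
    p.prepare(); p.start(); p.stop();
    System.out.println("start right after stop: " + p.start());  // false
    p.prepare();
    System.out.println("start after prepare: " + p.start());     // true
  }
}
```

In the sketch above, that would mean calling s1.stop(), then s1.prepare() (inside a try/catch for IOException), then s1.start() to replay the sound.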