Saturday, December 7, 2013

Audio scope tutorial - Capturing audio and displaying it in real time in Android

This is an Android example that displays a rolling (drifting) oscilloscope graph with the envelope of the microphone audio. Very simple, no adjustments. Also, to demonstrate NDK use, we will do a portion of the tasks (the processing of the samples) in two ways, in Java and in C, using JNI. Get the code here.

In the interest of readability, I limit this post to the top level of the app architecture, but you can click below and see specific posts explaining each of the topics used in it:
The top-level structure is shown in the picture below:

The main activity holds a circular queue (Audio Buffer) to store the audio samples. A separate thread (Record Thread) fills it at a rate of 44100 samples per second. That thread runs on its own and is actually blocked (that is why it should be a separate thread) every time we do a "read" from the internal "hardware buffer".
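
On its own, the two-bytes-to-one-sample step that the Record Thread performs on each chunk can be sketched like this (little-endian 16-bit PCM; the class name is mine, not part of the app):

```java
public class Pcm16Decode {
    // Decode little-endian 16-bit PCM bytes into signed int samples.
    public static int[] decode(byte[] bytes) {
        int[] samples = new int[bytes.length / 2];
        for (int i = 0; i + 1 < bytes.length; i += 2) {
            int lo = bytes[i] & 0xFF;   // low byte is unsigned
            int hi = bytes[i + 1];      // high byte carries the sign
            samples[i / 2] = (hi << 8) | lo;
        }
        return samples;
    }
}
```

The same masking trick (`& 0xFF`) replaces the "add 256 if negative" dance you will see in the listing below.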

The same activity has a display (managed by SurfaceView, see extensive explanation on this object here) which will launch two more threads, the Processor Thread and the Scope Thread.

The Processor Thread is "activated" every time it receives a message indicating that there are new samples in the Audio Buffer. The Record Thread is the one sending those messages (depicted above by a thin dotted line). Once it receives the message from the Record Thread (see a detailed explanation here), the thread breaks the input samples into groups of 200, finds the maximum of each group, and places that value in the Scope Buffer, which holds the samples that are going to be displayed. I.e., each pixel on the screen represents the maximum of 200 audio samples.
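
The core of that envelope step, stripped of the queues and the leftover handling, can be sketched as below (class and method names are mine; the real code also carries over any incomplete tail group to the next batch):

```java
public class Envelope {
    // Collapse each full group of `speed` samples into its maximum.
    // Any incomplete tail group is simply dropped here (the app saves
    // it as a "reminder" for the next batch instead).
    public static int[] decimateMax(int[] audio, int speed) {
        int groups = audio.length / speed;
        int[] out = new int[groups];
        for (int g = 0; g < groups; g++) {
            int max = Integer.MIN_VALUE;
            for (int j = 0; j < speed; j++)
                max = Math.max(max, audio[g * speed + j]);
            out[g] = max;
        }
        return out;
    }
}
```

With speed=200 and 44100 samples per second, this produces about 220 scope pixels per second of audio.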

Notice that I have a flag (no, not really a compiler flag, as I was planning to control this from a GUI option but never got to it...) that chooses whether the processing is done in Java or in C (through the NDK). Check this post where I look into the C code as an example of handling arrays in C (changing dimension, copying, appending...).

Asynchronously to this, every so often the second thread started by SurfaceView, the Scope Thread, draws everything that is in the Scope Buffer onto the screen. Basically, the rightmost point on the screen corresponds to the latest data the Processor Thread added.
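
The mapping from an envelope value to a screen y coordinate mirrors the scaling used later in ScopeThread's draw() (a 10-pixel margin at top and bottom; the wrapper class name here is mine):

```java
public class ScopeMapping {
    // Map a 16-bit envelope value (0..32767) to a y pixel, leaving a
    // 10-pixel margin at the bottom and ~1 pixel at the top.
    public static int sampleToY(int sample, int canvasHeight) {
        return canvasHeight - sample * (canvasHeight - 11) / 32767 - 10;
    }
}
```

So silence sits 10 pixels above the bottom edge, and a full-scale sample reaches the top of the box.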

Notice that the buffers are of the (custom) Q object type. There is one pointer to write and one to read. Writing is in general sequential: every time you write, you advance the pointer. For reading, you can set the read pointer to any position and choose what to read... I am pretty sure this can be done in a cleaner way, maybe with some already built-in class, but I thought I would just build one myself to learn...
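
A minimal sketch of that same idea (a fixed-size array plus wrapping write/read pointers) looks like this; the class and method names are mine, and this leaves out Q's overrun flag and its settable read pointer:

```java
public class Ring {
    private final int[] data;   // fixed-size backing store
    private int w = 0, r = 0;   // write and read positions

    public Ring(int n) { data = new int[n]; }

    // Writing is sequential: store and advance, wrapping at the end.
    public void put(int v) { data[w] = v; w = (w + 1) % data.length; }

    // Reading advances its own pointer independently of the writer.
    public int take() { int v = data[r]; r = (r + 1) % data.length; return v; }

    // How many unread elements sit between the two pointers.
    public int available() { return (w - r + data.length) % data.length; }
}
```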

So, without further delay, here is the code. Scope2.java includes the audio recording thread:
 package com.cell0907.scope2;  
 import android.media.AudioRecord;  
 import android.os.Bundle;  
 import android.os.Message;  
 import android.util.Log;  
 import android.app.Activity;  
 public class Scope extends Activity {  
      public static final boolean JNI=true; // Tells us if we want to use JNI C function or java  
      // RECORDING VARIABLES  
      private AudioRecord AR=null;  
      public int BufferSize;                    // Length of the chunks read from the hardware audio buffer  
      private Thread Record_Thread=null; // The thread filling up the audio buffer (queue)  
      private boolean isRecording = false;  
      public Q audio_buffer=new Q(20000); // Record_Thread read the AR and puts it in here.  
      private static final int AUDIO_SOURCE=android.media.MediaRecorder.AudioSource.MIC;  
      public static final int SAMPLE_RATE = 44100;  
      private static final int CHANNEL_CONFIG = android.media.AudioFormat.CHANNEL_IN_MONO;  
      private static final int AUDIO_FORMAT = android.media.AudioFormat.ENCODING_PCM_16BIT;       
      private ScopeSurfaceView scope_screen_view;  
   @Override  
      protected void onCreate(Bundle savedInstanceState) {  
           super.onCreate(savedInstanceState);  
           Log.d("MyActivity", "onCreate");  
   }  
   @Override  
   public void onPause(){  
        Log.d("MyActivity", "onPause");  
           // TODO Auto-generated method stub  
           if (null != AR) {  
          isRecording = false;  
          boolean retry = true;  
          while (retry) {  
               try {  
                    Record_Thread.join();  
                    retry = false;  
               } catch (InterruptedException e) {}     
          }  
          AR.stop();  
          AR.release();  
          AR = null;  
          //recordingThread = null;  
           }  
           scope_screen_view.surfaceDestroyed(scope_screen_view.getHolder());  
        super.onPause();  
   }  
      protected void onResume(){  
           super.onResume();  
           Log.d("MyActivity", "onResume");  
           scope_screen_view=new ScopeSurfaceView(this,audio_buffer);  
           setContentView(scope_screen_view);  
           BufferSize=AudioRecord.getMinBufferSize(SAMPLE_RATE, CHANNEL_CONFIG, AUDIO_FORMAT);                 
           isRecording=true;  
           Record_Thread=new Thread(new Runnable() {  
          public void run() {  
            Capture_Audio();  
          }  
        },"AudioRecord Thread");  
           Record_Thread.start();  
      }  
      /*  
       * This runs in a separate thread reading the data from the AR buffer and dumping it  
       * into the queue (circular buffer) for processing (in java or C).  
       */  
      public void Capture_Audio(){  
           byte[] AudioBytes=new byte[BufferSize]; //Array containing the audio data bytes  
           int[] AudioData=new int[BufferSize/2]; //Array containing the audio samples  
           try {  
                AR = new AudioRecord(AUDIO_SOURCE,SAMPLE_RATE,CHANNEL_CONFIG,AUDIO_FORMAT,BufferSize);  
                try {  
                     AR.startRecording();  
                } catch (IllegalStateException e){  
                     System.out.println("This didn't work very well");  
                     return;  
                     }  
                } catch(IllegalArgumentException e){  
                     System.out.println("This didn't work very well");  
                     return;  
                     }  
           while (isRecording)  
           {  
                AR.read(AudioBytes, 0, BufferSize); // This is the guy reading the bytes out of the buffer!!  
                //First we will pass the 2 bytes into one sample   
                //It's an extra loop but avoids repeating the same sum many times later  
                //during the filter  
                 int r=0;  
                 for (int i=0; i+1<AudioBytes.length; i+=2)  
                 {  
                      // Little-endian 16-bit PCM: the low byte is unsigned  
                      // (hence the mask), the high byte carries the sign  
                      AudioData[r]=(AudioBytes[i] & 0xFF)+256*AudioBytes[i+1];  
                      r++;  
                 }                      
                // Write on the QUEUE            
                synchronized(audio_buffer){  
                     audio_buffer.put(AudioData);  
                }       
                // Not a very pretty way to hit the handler (declaring everything public)  
                // but just for the sake of demo.  
                Message.obtain(scope_screen_view.ProcessorThread.mHandler,   
                          scope_screen_view.ProcessorThread.DO_PROCESSING, "").sendToTarget();  
           }  
           Log.d("MyActivity", "Record_Thread stopped");  
      }  
   @Override  
      protected void onStop() {  
           super.onStop();  
   }   
 }  

The ScopeSurfaceView.java (to create the display):
 package com.cell0907.scope2;  
 import android.content.Context;  
 import android.util.Log;  
 import android.view.SurfaceHolder;  
 import android.view.SurfaceView;  
 // We extend SurfaceView. Internally (private) SurfaceView creates an object SurfaceHolder  
 // effectively defining the methods of the SurfaceHolder interface. Notice that it does  
 // not create a new class or anything, it just defines it right there. When we extend  
 // the SurfaceView with the SurfaceHolder.Callback interface, we need to add in that extension  
 // the methods of that interface.  
 public class ScopeSurfaceView extends SurfaceView implements SurfaceHolder.Callback {  
      private SurfaceHolder holder;     // This is no instantiation. Just saying that holder  
                                              // will be of a class implementing SurfaceHolder  
      private ScopeThread ScopeThread;// The thread that displays the data  
      public ProcessorThread ProcessorThread; // The thread that reads audio and creates the  
                                              // scope samples  
      private Q source_buffer;          // Audio data  
      private Q scope_buffer=null;      // Buffer for the screen of the scope  
      public ScopeSurfaceView(Context context){  
           super(context);  
      }  
      public ScopeSurfaceView(Context context, Q source) {  
           super(context);  
           source_buffer=source;          // Where to get the samples to display  
           holder = getHolder();          // Holder is now the internal/private mSurfaceHolder object   
                                              // in the SurfaceView object, which is from an anonymous  
                                              // class implementing SurfaceHolder interface.  
           holder.addCallback(this);  
      }  
      @Override  
      public void surfaceCreated(SurfaceHolder holder) {  
      }  
      @Override  
      // This is always called at least once, after surfaceCreated  
      public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {  
        if (scope_buffer==null){  
             scope_buffer=new Q(width);              
                ScopeThread = new ScopeThread(holder,scope_buffer);  
                ScopeThread.setRunning(true);  
                ScopeThread.setSurfaceSize(width, height);  
                ScopeThread.start();  
                Log.v("MyActivity","Screen width "+width);  
                ProcessorThread = new ProcessorThread(source_buffer, scope_buffer);  
                ProcessorThread.setRunning(true);  
                ProcessorThread.start();  
        }  
      }  
      @Override  
      public void surfaceDestroyed(SurfaceHolder holder) {  
           Log.d("MyActivity", "DESTROY SURFACE");  
           boolean retry = true;  
           ScopeThread.setRunning(false);  
           while (retry) {  
                try {  
                     ScopeThread.join();  
                     retry = false;  
                } catch (InterruptedException e) {}  
           }  
           retry = true;  
           Log.d("MyActivity", "Going to stop the processor thread");  
           ProcessorThread.setRunning(false);  
           while (retry) {  
                try {  
                     ProcessorThread.mHandler.getLooper().quit();  
                     ProcessorThread.join();  
                     retry = false;  
                     Log.d("MyActivity", "FAIL");  
                } catch (InterruptedException e) {  
                     Log.d("MyActivity", "FULL FAIL");  
                }  
           }  
   }  
   public Thread getThread() {  
     return ScopeThread;  
   }  
 }  
The ProcessorThread.java (which takes 200 audio samples and creates a pixel for the screen):
 package com.cell0907.scope2;  
 import java.util.Arrays;  
 import android.os.Handler;  
 import android.os.Looper;  
 import android.os.Message;  
 import android.util.Log;  
 public class ProcessorThread extends Thread {  
      private Q source_buffer;                // Where to get the samples from  
      private Q destiny_buffer;               // Where to put the samples to  
   private int speed=200;                 // Number of samples in one pixel  
   private boolean running = true;  
   private int[] reminder;  
   public Handler mHandler;               // Processor handler  
   public static final int DO_PROCESSING = 1; // Message that the audio capture send to  
                                                          // the processing thread.  
      ProcessorThread(Q source, Q destiny){  
     this.source_buffer=source;  
     this.destiny_buffer=destiny;  
     reminder=new int[0];  
      }  
      @Override  
   public void run() {  
           Looper.prepare();  
           mHandler = new Handler()  
           {  
                public void handleMessage(Message msg)  
                {  
                     switch (msg.what)  
                     {  
                     case DO_PROCESSING:  
                          // GOT NEW DATA. ANALYZE  
                          int[] intermediate_buffer;  
                          int[] intermediate_buffer_2;  
                          synchronized(source_buffer){  
                               intermediate_buffer=source_buffer.get();  
                          }  
                          if (Scope.JNI==false) process(intermediate_buffer);  
                          else {  
                               intermediate_buffer_2=processjni(intermediate_buffer);  
                         synchronized(destiny_buffer){  
                              destiny_buffer.put(intermediate_buffer_2);  
                         }  
                          }   
                          break;  
                     }  
                }   
           };  
           Looper.loop(); // Blocks here dispatching messages until quit() is called  
           Log.d("MyActivity", "Processor Thread stopped");  
   }  
      private void process(int[] audio){  
           int x=0;  
           int maximum;   
           // speed is the number of original audio samples that form one  
           // pixel in the screen. As long as we got enough for one, we write it  
           // in.  
           audio=concat(reminder,audio);  
           int i=audio.length;  
           while (i>=speed){  
                maximum=0;  
          for (int j=0;j<speed;j++)  
               if (audio[x+j]>maximum) maximum=audio[x+j];  
               synchronized(destiny_buffer){  
                    destiny_buffer.put(maximum);  
               }  
          x+=speed;  
          i-=speed;  
           }  
           // Save the leftover samples (an incomplete group) for the next batch.  
           // After the loop, x points at the first unconsumed sample and i holds  
           // how many are left.  
           reminder=Arrays.copyOfRange(audio, x, x+i);  
      }  
      public static int[] concat(int[] first, int[] second) {  
            int[] result = Arrays.copyOf(first, first.length + second.length);  
            System.arraycopy(second, 0, result, first.length, second.length);  
            return result;  
           }  
      private native int[] processjni(int[] audio);  
      static {  
     System.loadLibrary("JNImodule");  
   }    
 }  
The ProcessorThread can do its function in Java (see the code) or by calling an external C function, using JNI and the NDK. JNIlib.c:
 #include <jni.h>  
 #include <stdlib.h>  
 #include <string.h>  // memcpy  
 jintArray  
 Java_com_cell0907_scope2_ProcessorThread_processjni (JNIEnv *env, jobject thisObj, jintArray inJNIArray) {  
      int speed=200;  
      static int reminder[200];      // The reminder samples can't be longer than  
                                         // the speed  
      static int left=0;  
   // Step 1: Convert the incoming JNI jintarray to C's jint[]  
   jint *audioin = (*env)->GetIntArrayElements(env, inJNIArray, NULL);  
   if (NULL == audioin) return NULL;  
   jsize length = (*env)->GetArrayLength(env, inJNIArray);  
   // Put together whatever was left from the past (see below)  
   // plus the new audio samples  
   int audio[(int)length+left];  
   memcpy(audio,reminder,left*sizeof(int));  
   memcpy(&audio[left],audioin,(int)length*sizeof(int));  
   // Step 2: Perform its intended operations  
   // speed is the number of original audio samples that form one  
   // pixel in the screen. As long as we got enough for one, we write it  
   // in.  
   jint outCArray[((int)length+left)/speed+1];  
   int x=0,i=0,j;  
   int maximum;  
   left+=(int)length; // Total available: leftover from the last call plus the new batch  
   while (left>=speed){  
        maximum=0;  
        for (j=0;j<speed;j++)  
             if (audio[x+j]>maximum) maximum=audio[x+j];  
        outCArray[i]=maximum;  
        x+=speed;  
        left-=speed;  
        i++;  
   }  
   // Whatever was left (an incomplete group), save it for the next call;  
   // x points at the first unconsumed sample  
   memcpy(reminder,&audio[x],left*sizeof(int));  
   (*env)->ReleaseIntArrayElements(env, inJNIArray, audioin, JNI_ABORT); // release resources; nothing to copy back  
   // Step 3: Convert the native jint[] to a JNI jintArray, and return  
   jintArray outJNIArray = (*env)->NewIntArray(env, i); // allocate  
   if (NULL == outJNIArray) return NULL;  
   (*env)->SetIntArrayRegion(env, outJNIArray, 0 , i, outCArray); // copy  
   return outJNIArray;  
 }  
You will need this Android.mk to compile the C function using the NDK:
 # Copyright (C) 2009 The Android Open Source Project  
 #  
 # Licensed under the Apache License, Version 2.0 (the "License");  
 # you may not use this file except in compliance with the License.  
 # You may obtain a copy of the License at  
 #  
 #   http://www.apache.org/licenses/LICENSE-2.0  
 #  
 # Unless required by applicable law or agreed to in writing, software  
 # distributed under the License is distributed on an "AS IS" BASIS,  
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  
 # See the License for the specific language governing permissions and  
 # limitations under the License.  
 #  
 LOCAL_PATH := $(call my-dir)  
 include $(CLEAR_VARS)  
 LOCAL_MODULE  := JNImodule  
 LOCAL_SRC_FILES := JNIlib.c  
 include $(BUILD_SHARED_LIBRARY)  
And finally the thread that takes the samples created by the Processor Thread and displays them on the screen, ScopeThread.java:
 package com.cell0907.scope2;  
 import android.graphics.Canvas;  
 import android.graphics.Color;  
 import android.graphics.Paint;  
 import android.graphics.Paint.Style;  
 import android.graphics.RectF;  
 import android.util.Log;  
 import android.view.SurfaceHolder;  
 public class ScopeThread extends Thread {  
   private int mCanvasWidth;  
   private int mCanvasHeight;  
      private Q scope_buffer;           // Circular buffer for the scope in the screen  
      int[] intermediate_buffer;               // Will store the display  
      private SurfaceHolder holder;  
   private boolean running = true;  
   private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);  
   private final int refresh_rate=20;      // How often we update the scope screen, in ms  
   public ScopeThread(SurfaceHolder holder, Q scope_buffer) {  
     this.holder = holder;  
     this.scope_buffer=scope_buffer;  
        paint.setColor(Color.BLUE);  
        paint.setStyle(Style.STROKE);  
        paint.setTextSize(50);  
   }  
   @Override  
   public void run() {  
     long previousTime, currentTime;  
        previousTime = System.currentTimeMillis();  
     Canvas canvas = null;  
     while(running) {  
       currentTime=System.currentTimeMillis();  
       while ((currentTime-previousTime)<refresh_rate){  
            currentTime=System.currentTimeMillis();  
       }  
       previousTime=currentTime;  
       // Paint right away, so, it is as smooth as can be...  
        try {  
          canvas = holder.lockCanvas(); // May return null if the surface is gone  
          if (canvas != null) {  
               synchronized (holder) {  
                     draw(canvas);           
               }  
          }  
        }  
       finally {  
            if (canvas != null) {  
                 holder.unlockCanvasAndPost(canvas);  
                 }  
       }  
       // Update the scope buffer with info from the audio buffer  
                synchronized(scope_buffer){  
                  scope_buffer.set_r_pointer(scope_buffer.get_w_pointer()+1);  
                  intermediate_buffer=scope_buffer.get();     // Reads full buffer  
                }  
                try {  
                     Thread.sleep(refresh_rate-5); // Wait some time till I need to display again  
                } catch (InterruptedException e) {  
                     // TODO Auto-generated catch block  
                     e.printStackTrace();  
                }       
     }  
     Log.d("MyActivity", "Scope Thread stopped");  
   }  
    // Should be called every ~20ms (refresh_rate)  
   private void draw(Canvas canvas)  
   {  
        int current, previous=mCanvasHeight-10;  
        canvas.drawColor(Color.BLACK);  
        paint.setColor(Color.WHITE);  
        canvas.drawRect(new RectF(1,1,mCanvasWidth-1,mCanvasHeight-1), paint);  
        paint.setColor(Color.RED);  
        for(int x=1;x<mCanvasWidth-2;x++){  
             current=mCanvasHeight-intermediate_buffer[x]*(mCanvasHeight-11)/32767-10;  
             canvas.drawLine(x,previous,x+1,current,paint);  
             previous=current;  
        }  
   }  
   public void setRunning(boolean b) {  
     running = b;  
   }  
   public void setSurfaceSize(int width, int height) {  
         synchronized (holder){  
          mCanvasWidth = width;  
          mCanvasHeight = height;  
        }  
        intermediate_buffer=new int[mCanvasWidth];  
   }  
 }  
And yeah, we can't forget the brains behind the management of the circular buffers (the audio samples and the scope pixels), done using the Q object, Q.java:
 package com.cell0907.scope2;  
 import java.util.Arrays;  
 public class Q {  
      private int[] Data;           // Circular buffer  
      private int Buffer_length;     // How long the buffer is before it wraps around  
      private int w_pointer;          // Position to the next element to write in the buffer   
      private int r_pointer;          // Position to the next element to read from the buffer  
      Q(int length){  
           Buffer_length=length;  
           Data=new int[Buffer_length];  
           w_pointer=0;  
           r_pointer=Buffer_length-1;  
      }  
      int get_w_pointer(){  
           return w_pointer;  
      }  
      void set_r_pointer(int pointer){  
           if (pointer<0)  
                this.r_pointer=Buffer_length+pointer;  
           else if (pointer>=Buffer_length)  
                this.r_pointer=pointer%Buffer_length;  
           else  
                this.r_pointer=pointer;  
      }  
      int get_r_pointer(){  
           return r_pointer;  
      }  
      /*  
       * Places an amount of data in the buffer and returns true  
       * if there is a buffer over run (write pointer goes over the  
       * read pointer)  
       */  
      public boolean put(int[] data_in){  
           int i=0;  
           boolean error=false;  
           while (i<data_in.length)  
           {  
                if (w_pointer==r_pointer) error=true;  
                Data[w_pointer]=data_in[i];  
                i++;  
                w_pointer++;  
                if (w_pointer>Buffer_length-1) w_pointer=0;       
           }  
           return error;  
      }  
      /*  
       * Places a single element of data in the buffer and returns true  
       * if there is a buffer over run (write pointer goes over the  
       * read pointer)  
       */       
      public boolean put(int data_in){  
           boolean error=false;  
           if (w_pointer==r_pointer) error=true;  
           Data[w_pointer]=data_in;  
           w_pointer++;  
           if (w_pointer>Buffer_length-1) w_pointer=0;  
           return error;  
      }  
      /*  
       * Returns all the data available in the buffer,  
       * basically, from r_pointer to w_pointer at the time of the call.  
       */  
      public int[] get(){  
           int[] data_out=new int[Buffer_length];  
           int i=0;       
           while (r_pointer!=w_pointer) // Reads till the end of the buffer  
           {  
                data_out[i]=Data[r_pointer];  
                i++;  
                r_pointer++;  
                if (r_pointer>Buffer_length-1) r_pointer=0;  
           }  
           return Arrays.copyOf(data_out,i);  
      }  
      // Reads n elements or none. In the 2nd case, it will still return whatever could   
      // read but will not move the r_pointer  
      public int[] get(int n){  
           int[] data_out=new int[Buffer_length];  
           int i=0;  
           int r_pointer_backup = r_pointer;  
           while ((r_pointer!=w_pointer) && (i<n))   // Reads till the end of the buffer  
                                                              // or till we get n elements  
           {  
                data_out[i]=Data[r_pointer];  
                i++;  
                r_pointer++;  
                if (r_pointer>Buffer_length-1) r_pointer=0;  
           }  
           if (i<n) r_pointer=r_pointer_backup; // As you couldn't read the whole array, go back  
           return Arrays.copyOf(data_out,i);  
      }  
      // Reads one element.  
      public int getone(){  
           int i=Data[r_pointer];  
           r_pointer++;  
           if (r_pointer>Buffer_length-1) r_pointer=0;  
           return i;  
      }  
 }  
And the AndroidManifest.xml:
 <?xml version="1.0" encoding="utf-8"?>  
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"  
   package="com.cell0907.scope2"  
   android:versionCode="1"  
   android:versionName="1.0" >  
   <uses-sdk  
     android:minSdkVersion="9"  
     android:targetSdkVersion="17" />  
   <uses-permission  
           android:name="android.permission.RECORD_AUDIO" />  
   <application  
     android:allowBackup="true"  
     android:icon="@drawable/ic_launcher"  
     android:label="@string/app_name"  
     android:theme="@style/AppTheme" android:debuggable="true">  
     <activity  
       android:name="com.cell0907.scope2.Scope"  
       android:label="@string/app_name"   
       android:screenOrientation="landscape">  
       <intent-filter>  
         <action android:name="android.intent.action.MAIN" />  
         <category android:name="android.intent.category.LAUNCHER" />  
       </intent-filter>  
     </activity>  
   </application>  
 </manifest>  

And if you made it this far, you deserve a brand new icon :P



Anyhow, hopefully it is easy to follow. Do not forget to check the links in the post, as some of them explain what I consider the tougher pieces of the code.

Cheers!

PS.: Please, click here to see an index of other posts on Android.

4 comments:

  1. this is very useful. how should i download project..

     1. Sorry, but I didn't put this in any repository anywhere... Let me see if I can do that. In the meantime, just cut and paste... Will post back once I get it somewhere...

     2. Ok. I put it here: https://github.com/cell0907/Audio-scope
        Give it a shot. First time using GitHub, so keep me posted if any issues... Will update the post too to show the link above.

     3. Thank you so much for your help; it was very helpful for my project.
        One more thing I have to clear: I want to display the same waves as in this project for the currently playing song from the media player..
        please help me out with this problem..