Wednesday, December 25, 2013

Presenting images in Android fast

The idea here was to display a slideshow of all the images in a directory as fast as possible. In principle I didn't want to be loading from flash/disk while displaying, so I wanted to keep the images in memory.

At the top level, the app simply stores the images in some kind of in-memory structure and then periodically displays one. The periodicity is handled by an AsyncTask acting as a timer. I.e., we start the task, the task waits 50ms (Thread.sleep(50);), and when done it sends a message back to the UI, which displays the next image, starts the task again, and so on... See the code below...

For the display of the image, we simply assign a Bitmap to the ImageView object using ImageView.setImageBitmap.

So, finally, the question is how to store all the images in memory for quick retrieval when we need them. I show two methods here; either one seems to work fine. The key is to be aware that memory is limited and that Bitmaps can be quite big. To check the available resources use:
  1. final int maxMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);
  2. Log.v("MyActivity","maxMemory: "+maxMemory);
which in my case (HTC One) returns 196608 (that is, 192 MBytes, since the value is in kilobytes).

If we start loading the pictures that the same phone takes, each one takes 2688*1520*4 bytes! We can see this in LogCat when running the apps below: a message says "Grow heap to XXX for 16343056-byte allocation". So, bottom line, either of the two methods below works only if the number of images times the size of each one is kept within bounds... You can, of course, store the files at full resolution and downscale them inside your app, or save them at a lower resolution to begin with (which is what I did here, using 640x360).
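If you keep the files at full resolution, a common way to load a scaled-down copy is to decode with BitmapFactory.Options and inSampleSize. Below is a minimal sketch (my own helper, not part of the app; the 640x360 target simply mirrors what I used):
 // Sketch: decode a downsampled Bitmap instead of the full-resolution file.  
 private static Bitmap decodeScaled(String path, int reqWidth, int reqHeight) {  
      BitmapFactory.Options options = new BitmapFactory.Options();  
      options.inJustDecodeBounds = true;  // read the dimensions only, no pixel allocation  
      BitmapFactory.decodeFile(path, options);  
      int inSampleSize = 1;               // powers of two decode fastest  
      while (options.outWidth / (inSampleSize * 2) >= reqWidth  
                && options.outHeight / (inSampleSize * 2) >= reqHeight) {  
           inSampleSize *= 2;  
      }  
      options.inJustDecodeBounds = false;  
      options.inSampleSize = inSampleSize;  
      return BitmapFactory.decodeFile(path, options);  
 }  
Calling decodeScaled(file_list[i].getPath(), 640, 360) in the load loop below would then keep each cached Bitmap near the screen size instead of the full camera resolution.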

So, now to the two methods to store the pics. The first one uses simply an array of Bitmaps (duh!). PicActivity.java:
 package com.cell0907.pic;  
 import java.io.File;  
 import java.util.Random;  
 import com.cell0907.pic.R;  
 import android.os.Bundle;  
 import android.os.Environment;  
 import android.os.Handler;  
 import android.app.Activity;  
 import android.graphics.Bitmap;  
 import android.graphics.BitmapFactory;  
 import android.util.Log;  
 import android.widget.ImageView;  
 public class PicActivity extends Activity {  
      private Bitmap[] mMemoryCache; // A place to store our pics       
      private ImageView picture;  
      public static final int DONE=1;  
      private int numberofitems;  
      int i;  
      long startTime,stopTime;  
      @Override  
      protected void onCreate(Bundle savedInstanceState) {  
           super.onCreate(savedInstanceState);  
           setContentView(R.layout.activity_pic);  
           String root = Environment.getExternalStorageDirectory().toString();  
           File myDir = new File(root + "/DCIM/3D");   
     picture=(ImageView)findViewById(R.id.imageView1);  
     // FOR LOOP TO LOAD ALL THE PICS IN CACHE  
        File[] file_list = myDir.listFiles();   
        numberofitems=file_list.length;  
        mMemoryCache=new Bitmap[numberofitems];  
        Log.v("MyActivity","items: "+numberofitems);  
        for (int i=0;i<numberofitems;i++){  
             mMemoryCache[i]=BitmapFactory.decodeFile(file_list[i].getPath());  
        }        
     // RANDOM ACCESS TO PRESENT THE PICS VERY FAST  
        // We do this in a separate task, when finishes sends a message, the handler  
        // presents the image and send the task again...  
        i=0;  
        new Timer(getApplicationContext(),threadHandler).execute();  
      }  
      ////////////////////////////////////thread Handler///////////////////////////////////////  
   private Handler threadHandler = new Handler() {  
        public void handleMessage(android.os.Message msg) {       
             switch(msg.what){  
                     case DONE:  
                          //Random r = new Random();  
                       //int i=r.nextInt(numberofitems);   
                          startTime = System.nanoTime();  
                       picture.setImageBitmap(mMemoryCache[i]);  
                       i++;  
                       if (i==numberofitems) i=0;  
                       //if (i==4) i=0;  
                       long endTime = System.nanoTime();  
                       System.out.println(String.format("Elapsed time: %.2f ms", (float)(endTime - startTime)/1000000));  
                          new Timer(getApplicationContext(),threadHandler).execute();  
                          break;                           
             }  
        }  
   };  
 }  

And for the timer portion (Timer.java):
 package com.cell0907.pic;  
 import android.content.Context;  
 import android.os.Handler;  
 import android.os.Message;  
 import android.util.Log;  
 import android.os.AsyncTask;  
 public class Timer extends AsyncTask<Void, Void, Void> {  
   Context mContext;  
      private Handler threadHandler;  
   public Timer(Context context,Handler threadHandler) {  
     super();  
     this.threadHandler=threadHandler;  
     mContext = context;  
       }  
   @Override  
      protected Void doInBackground(Void...params) {   
        try {  
                Thread.sleep(50);  
           } catch (InterruptedException e) {  
                // TODO Auto-generated catch block  
                e.printStackTrace();  
           }   
        Message.obtain(threadHandler, PicActivity.DONE, "").sendToTarget();   
         return null;  
   }  
 }  

AndroidManifest.xml:
 <?xml version="1.0" encoding="utf-8"?>  
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"  
   package="com.cell0907.pic"  
   android:versionCode="1"  
   android:versionName="1.0" >  
   <uses-sdk  
     android:minSdkVersion="12"  
     android:targetSdkVersion="17" />  
   <application  
     android:allowBackup="true"  
     android:icon="@drawable/ic_launcher"  
     android:label="@string/app_name"  
     android:theme="@android:style/Theme.NoTitleBar.Fullscreen" >  
     <activity  
       android:name="com.cell0907.pic.PicActivity"  
       android:label="@string/app_name"   
       android:screenOrientation="landscape">  
       <intent-filter>  
         <action android:name="android.intent.action.MAIN" />  
         <category android:name="android.intent.category.LAUNCHER" />  
       </intent-filter>  
     </activity>  
   </application>  
 </manifest>  

And the layout activity_pic.xml:
 <RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"  
   xmlns:tools="http://schemas.android.com/tools"  
   android:layout_width="match_parent"  
   android:layout_height="match_parent"  
   tools:context=".PicActivity" >  
   <ImageView  
     android:id="@+id/imageView1"  
     android:layout_width="match_parent"  
     android:layout_height="match_parent"  
     android:layout_centerHorizontal="true"  
     android:layout_centerVertical="true"  
     android:src="@drawable/ic_launcher" />  
 </RelativeLayout>  

We profile the execution to see if there is any difference between this method and the next:
12-25 14:03:25.469: I/System.out(15336): Elapsed time: 0.70 ms
12-25 14:03:25.519: I/System.out(15336): Elapsed time: 0.24 ms
12-25 14:03:25.579: I/System.out(15336): Elapsed time: 0.61 ms
12-25 14:03:25.629: I/System.out(15336): Elapsed time: 0.18 ms
12-25 14:03:25.679: I/System.out(15336): Elapsed time: 0.21 ms
12-25 14:03:25.729: I/System.out(15336): Elapsed time: 0.18 ms
12-25 14:03:25.779: I/System.out(15336): Elapsed time: 0.15 ms
12-25 14:03:25.839: I/System.out(15336): Elapsed time: 0.18 ms
12-25 14:03:25.889: I/System.out(15336): Elapsed time: 0.18 ms
12-25 14:03:25.940: I/System.out(15336): Elapsed time: 0.15 ms
12-25 14:03:25.990: I/System.out(15336): Elapsed time: 0.15 ms
12-25 14:03:26.040: I/System.out(15336): Elapsed time: 0.18 ms
12-25 14:03:26.090: I/System.out(15336): Elapsed time: 0.18 ms
12-25 14:03:26.140: I/System.out(15336): Elapsed time: 0.18 ms
12-25 14:03:26.190: I/System.out(15336): Elapsed time: 0.18 ms
12-25 14:03:26.240: I/System.out(15336): Elapsed time: 0.18 ms
12-25 14:03:26.300: I/System.out(15336): Elapsed time: 0.21 ms

For the second method, I wanted to use the LruCache class. The new PicActivity.java:
 package com.cell0907.pic;  
 import java.io.File;  
 import java.util.Random;  
 import com.cell0907.pic.R;  
 import android.os.Bundle;  
 import android.os.Environment;  
 import android.os.Handler;  
 import android.app.Activity;  
 import android.graphics.Bitmap;  
 import android.graphics.BitmapFactory;  
 import android.support.v4.util.LruCache;  
 import android.util.Log;  
 import android.widget.ImageView;  
 public class PicActivity extends Activity {  
      private LruCache<String, Bitmap> mMemoryCache; // A place to store our pics       
      private ImageView picture;  
      public static final int DONE=1;  
      private int numberofitems;  
      int i;  
      long startTime,endTime;  
      @Override  
      protected void onCreate(Bundle savedInstanceState) {  
           super.onCreate(savedInstanceState);  
           setContentView(R.layout.activity_pic);  
           String root = Environment.getExternalStorageDirectory().toString();  
           File myDir = new File(root + "/DCIM/3D");   
     picture=(ImageView)findViewById(R.id.imageView1);  
     // SETUP THE CACHE  
        // Get max available VM memory, exceeding this amount will throw an  
        // OutOfMemory exception. Stored in kilobytes as LruCache takes an  
        // int in its constructor.  
        final int maxMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);  
        Log.v("MyActivity","maxMemory: "+maxMemory);  
        // Use half of the available memory at max for this memory cache.  
        final int cacheSize = maxMemory / 2;  
        mMemoryCache = new LruCache<String, Bitmap>(cacheSize) {  
          @Override  
          protected int sizeOf(String key, Bitmap bitmap) {  
            // The cache size will be measured in kilobytes rather than  
            // number of items.  
            return bitmap.getByteCount() / 1024;  
          }  
        };  
     // FOR LOOP TO LOAD ALL THE PICS IN CACHE  
        File[] file_list = myDir.listFiles();   
        numberofitems=file_list.length;  
        Log.v("MyActivity","items: "+numberofitems);  
        String imageKey;  
        for (int i=0;i<numberofitems;i++){  
             imageKey = String.valueOf(i);  
             addBitmapToMemoryCache(imageKey,BitmapFactory.decodeFile(file_list[i].getPath()));  
        }        
     // RANDOM ACCESS TO PRESENT THE PICS VERY FAST  
        // We do this in a separate task, when finishes sends a message, the handler  
        // presents the image and send the task again...  
        i=0;  
        new Timer(getApplicationContext(),threadHandler).execute();  
      }  
      public void addBitmapToMemoryCache(String key, Bitmap bitmap) {  
        if (getBitmapFromMemCache(key) == null) {  
          mMemoryCache.put(key, bitmap);  
        }  
      }  
      public Bitmap getBitmapFromMemCache(String key) {  
        return mMemoryCache.get(key);  
      }  
      ////////////////////////////////////thread Handler///////////////////////////////////////  
   private Handler threadHandler = new Handler() {  
        public void handleMessage(android.os.Message msg) {       
             switch(msg.what){  
                     case DONE:  
                       //Random r = new Random();  
                       //int i=r.nextInt(numberofitems);  
                       startTime = System.nanoTime();  
                       picture.setImageBitmap(getBitmapFromMemCache(String.valueOf(i)));  
                       i++;  
                       if (i==numberofitems) i=0;  
                       //if (i==4) i=0;  
                       long endTime = System.nanoTime();  
                       System.out.println(String.format("Elapsed time: %.2f ms", (float)(endTime - startTime)/1000000));  
                       new Timer(getApplicationContext(),threadHandler).execute();  
                          break;  
             }  
        }  
   };  
 }  

For the timer, manifest and layout we use the same files as in the first case. Notice that in both cases I made provision to display the pics in random order; I left that code commented out, just for reference...

Profiling this second method shows that, curiously, it is actually slower than the simple array of Bitmaps!
12-25 13:58:43.408: I/System.out(14553): Elapsed time: 0.37 ms
12-25 13:58:43.458: I/System.out(14553): Elapsed time: 0.37 ms
12-25 13:58:43.508: I/System.out(14553): Elapsed time: 0.34 ms
12-25 13:58:43.558: I/System.out(14553): Elapsed time: 0.92 ms
12-25 13:58:43.608: I/System.out(14553): Elapsed time: 0.34 ms
12-25 13:58:43.668: I/System.out(14553): Elapsed time: 0.43 ms
12-25 13:58:43.718: I/System.out(14553): Elapsed time: 0.37 ms
12-25 13:58:43.768: I/System.out(14553): Elapsed time: 0.34 ms
12-25 13:58:43.819: I/System.out(14553): Elapsed time: 0.46 ms

Oh well, good to know... It does make me wonder, though, why anybody would use the cache approach.
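For what it's worth, one plausible answer: the LruCache earns its keep when the images do not all fit in memory. Once the size budget is exceeded it evicts the least-recently-used bitmaps, so a get-or-reload pattern keeps memory bounded at the cost of an occasional decode. A sketch of that pattern (the decode-on-miss fallback is my addition, not in the app above):
 // Sketch: fetch from the LruCache, re-decoding from disk on a miss (i.e., after an eviction).  
 public Bitmap getBitmap(String key, File file) {  
      Bitmap bitmap = getBitmapFromMemCache(key);  
      if (bitmap == null) {  
           bitmap = BitmapFactory.decodeFile(file.getPath());  
           addBitmapToMemoryCache(key, bitmap);  
      }  
      return bitmap;  
 }  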
Cheers!!

PS.: Please, click here to see an index of other posts on Android. 

Monday, December 23, 2013

From Mat to BufferedImage

There have been a couple of comments on the posts around the Mat and BufferedImage classes (see here). So, I took a step back to understand them better. These names refer to the array of pixels of an image in OpenCV (Mat) or in Java (BufferedImage), and the question is how to go from one to the other efficiently. The main reason I needed this was that, to display an image processed in OpenCV from Java, I had to transform it to a BufferedImage first (there is no OpenCV imshow available).

Here is everything you might want to know about Mat... but it doesn't say much about what you put inside (the image itself), so for an easier ride (hopefully), keep reading :)

You can read one element with get(row,col), where row/col is the position of the element, or a group of elements with get(row,col,byte[]), where (row,col) is the position at which the read starts (so 0,0 reads from the origin, i.e., the full array if the buffer is large enough). The result gets stored in the byte[] argument. You can change elements with put, which works the same way in reverse... See an example here.
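A minimal sketch of those accessors (my own example, using the OpenCV Java bindings):
 // Sketch: element access on an 8-bit, 3-channel (BGR) Mat.  
 // Uses org.opencv.core.Mat and org.opencv.core.CvType.  
 Mat m = new Mat(480, 640, CvType.CV_8UC3);  
 double[] onePixel = m.get(0, 0);            // one element: its B, G, R values  
 byte[] all = new byte[(int) (m.total() * m.channels())];  
 m.get(0, 0, all);                           // bulk read, starting at the origin  
 all[0] = (byte) 255;                        // tweak the blue channel of pixel (0,0)  
 m.put(0, 0, all);                           // write the whole buffer back  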

On the other side, BufferedImage is a Java class (a subclass of java.awt.Image) that describes an image with an accessible buffer of image data. A BufferedImage comprises a ColorModel and a Raster of image data; the latter holds the image pixels. See how to access them here.
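And, for symmetry, a sketch of pixel access on the Java side (standard java.awt.image API):
 // Sketch: pixel access on a BufferedImage.  
 BufferedImage img = new BufferedImage(640, 480, BufferedImage.TYPE_3BYTE_BGR);  
 int argb = img.getRGB(10, 20);              // one pixel, packed as 0xAARRGGBB  
 int[] samples = img.getRaster().getPixel(10, 20, (int[]) null); // per-band samples  
 img.getRaster().setPixel(10, 20, samples);  // write a pixel back through the Raster  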

To answer how to read/set the pixels in those structures, we will try to answer the title of this post, i.e., how to go from Mat (the resulting OpenCV image) to BufferedImage (for display), in the most efficient way. So we benchmark a few methods (see the full code here). Hint: if you want to skip all the reading, jump to Method 4 :)

Method 1: go through a JPG. Although it looks like the cleanest code (it was actually suggested by one commenter), another commenter thought it would take the most computation, which basically triggered this whole post :).

public boolean MatToBufferedImage(Mat matrix) {  
       long startTime = System.nanoTime();  
       MatOfByte mb=new MatOfByte();  
       Highgui.imencode(".jpg", matrix, mb);  
       try {  
            image = ImageIO.read(new ByteArrayInputStream(mb.toArray()));  
       } catch (IOException e) {  
       // TODO Auto-generated catch block  
            e.printStackTrace();  
            return false; // Error  
       }  
       long endTime = System.nanoTime();  
       System.out.println(String.format("Elapsed time: %.2f ms", (float)(endTime - startTime)/1000000));  
       return true; // Successful  
}  

Detected 2 faces
Elapsed time: 25.94 ms
Detected 1 faces
Elapsed time: 27.56 ms
Detected 2 faces
Elapsed time: 27.37 ms
Detected 1 faces
Elapsed time: 26.96 ms
Detected 1 faces
Elapsed time: 35.70 ms
Detected 1 faces
Elapsed time: 27.32 ms

Method 2: extract the data from the Mat into an array, flip the blue and red channels, and store the result in the BufferedImage.

 public boolean MatToBufferedImage(Mat matrix) {  
        long startTime = System.nanoTime();  
        int cols = matrix.cols();  
        int rows = matrix.rows();  
        int elemSize = (int)matrix.elemSize();  
        byte[] data = new byte[cols * rows * elemSize];  
        int type;  
        matrix.get(0, 0, data);  
        switch (matrix.channels()) {  
          case 1:  
            type = BufferedImage.TYPE_BYTE_GRAY;  
            break;  
          case 3:   
            type = BufferedImage.TYPE_3BYTE_BGR;  
            // bgr to rgb  
            byte b;  
            for(int i=0; i<data.length; i=i+3) {  
              b = data[i];  
              data[i] = data[i+2];  
              data[i+2] = b;  
            }  
            break;  
          default:  
            return false; // Error  
        }  
        image = new BufferedImage(cols, rows, type);  
        image.getRaster().setDataElements(0, 0, cols, rows, data);  
        long endTime = System.nanoTime();  
        System.out.println(String.format("Elapsed time: %.2f ms", (float)(endTime - startTime)/1000000));  
        return true; // Successful  
}  

Detected 2 faces
Elapsed time: 2.27 ms
Detected 2 faces
Elapsed time: 2.81 ms
Detected 2 faces
Elapsed time: 2.25 ms
Detected 2 faces
Elapsed time: 2.75 ms
Detected 2 faces
Elapsed time: 2.22 ms

Substantial (10x!!) improvement, as the anonymous commenter had anticipated...

Method 3: do the color conversion in OpenCV (see here for color conversions within OpenCV) and then copy into the BufferedImage:

public boolean MatToBufferedImage(Mat matrix) {  
        long startTime = System.nanoTime();  
        Imgproc.cvtColor(matrix, matrix, Imgproc.COLOR_BGR2RGB);   
        int cols = matrix.cols();  
        int rows = matrix.rows();  
        int elemSize = (int)matrix.elemSize();  
        byte[] data = new byte[cols * rows * elemSize];  
        matrix.get(0, 0, data);  
        image = new BufferedImage(cols, rows, BufferedImage.TYPE_3BYTE_BGR);  
        image.getRaster().setDataElements(0, 0, cols, rows, data);  
        long endTime = System.nanoTime();  
        System.out.println(String.format("Elapsed time: %.2f ms", (float)(endTime - startTime)/1000000));  
        return true; // Successful  
}  

Detected 2 faces
Elapsed time: 2.19 ms
Detected 2 faces
Elapsed time: 12.68 ms
Detected 3 faces
Elapsed time: 2.04 ms
Detected 2 faces
Elapsed time: 2.91 ms
Detected 3 faces
Elapsed time: 2.05 ms
Detected 2 faces
Elapsed time: 2.84 ms

Maybe slightly faster than doing the swap by hand, but... Notice also the long 12 ms case above. I observed those outliers with the other methods too, so I suspect the processor was simply busy with something else at that moment...

Method 4: finally, the most efficient method. It goes straight from the BGR data to the BufferedImage's backing array.

public boolean MatToBufferedImage(Mat matBGR){  
      long startTime = System.nanoTime();  
      int width = matBGR.width(), height = matBGR.height(), channels = matBGR.channels() ;  
      byte[] sourcePixels = new byte[width * height * channels];  
      matBGR.get(0, 0, sourcePixels);  
      // create new image and get reference to backing data  
      image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);  
      final byte[] targetPixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();  
      System.arraycopy(sourcePixels, 0, targetPixels, 0, sourcePixels.length);  
      long endTime = System.nanoTime();  
      System.out.println(String.format("Elapsed time: %.2f ms", (float)(endTime - startTime)/1000000));  
      return true;  
}  

This 4th method, suggested by the anonymous commenter, gives the best results, up to a 50x improvement with respect to Method 1:
Detected 2 faces
Elapsed time: 0.51 ms
Detected 2 faces
Elapsed time: 1.22 ms
Detected 1 faces
Elapsed time: 0.47 ms
Detected 1 faces
Elapsed time: 1.32 ms
Detected 1 faces
Elapsed time: 0.48 ms
Detected 1 faces
Elapsed time: 1.77 ms

I still need to understand why this method does not require flipping R and B (presumably because TYPE_3BYTE_BGR already stores its pixels in BGR order, matching OpenCV's layout, so no swap is needed), but hey, it works... So, thank you, sir! :)
Cheers!!

PS.: This other post talks about how to get the picture in a BufferedImage into an array, and does a nice job benchmarking two methods: using getRGB or using ((DataBufferByte) bufferedImage.getRaster().getDataBuffer()).getData();
PS2.: More theory on BufferedImages.
PS3.: By the way, the method might be nicer if it returned the image instead of writing it into a field of the enclosing class, but anyhow...
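Something like this, reusing the Method 4 body (just a sketch of the same code returning its result):
 public static BufferedImage matToBufferedImage(Mat matBGR) {  
      int width = matBGR.width(), height = matBGR.height(), channels = matBGR.channels();  
      byte[] sourcePixels = new byte[width * height * channels];  
      matBGR.get(0, 0, sourcePixels);  
      BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);  
      byte[] targetPixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();  
      System.arraycopy(sourcePixels, 0, targetPixels, 0, sourcePixels.length);  
      return image;  // the caller decides where to store or display it  
 }  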

Useful Software

Just writing down some free/shareware software that I use and find pretty handy. Please consider donating/supporting these folks for their nice work!
  1. Binary file viewer
  2. Space Sniffer: nice way to visualize what is taking space in your hard drive.
  3. Firefox Web browser and some of the plug-ins I use.


Saturday, December 7, 2013

Audio scope tutorial - Capturing audio and displaying it real time in Android

This is an Android example that displays a rolling (drifting) oscilloscope graph with the envelope of the microphone audio. Very simple, no adjustments. Also, to demonstrate use of the NDK, we do a portion of the work (the processing of the samples) in two ways, in Java and in C, using JNI. Get the code here.

In the interest of readability, I limit this post to the top-level app architecture, but you can follow the links in the text to specific posts explaining each of the topics used in it.

The top-level structure is shown in the picture below:

The main activity holds a circular queue (the Audio Buffer) to store the audio samples. A separate thread (the Record Thread) fills it at a rate of 44100 samples per second. That thread runs on its own and is actually blocked (that is why it must be a separate thread) every time we do a "read" from the internal hardware buffer.

The same activity has a display (managed by a SurfaceView; see an extensive explanation of this object here) which launches two more threads, the Processor Thread and the Scope Thread.

The Processor Thread is activated every time it receives a message indicating that there are new samples in the Audio Buffer; the Record Thread is the one sending those messages (depicted above by a thin dotted line). Once it receives the message from the Record Thread (see a detailed explanation here), the thread breaks the input samples into groups of 200, finds the maximum of each group, and places that value in a Scope Buffer, which holds the samples that are going to be displayed. I.e., each pixel on the screen represents the maximum of 200 audio samples.

Notice that I have a flag (well, not really a compiler flag; I was planning to control this from a GUI option but never got to it...) that chooses whether the processing is done in Java or in C (through the NDK). Check this post where I look into the C code as an example of handling arrays in C (changing dimensions, copying, appending...).

Asynchronously to this, at regular intervals the second thread started by the SurfaceView, the Scope Thread, draws everything in the Scope Buffer onto the screen. The rightmost point on the screen corresponds to the latest data the Processor Thread added.

Notice that the buffers are of the (custom) Q type. There is a write pointer and a read pointer. Writing is sequential: every write advances the pointer. For reading, you can set the read pointer to any position and choose what to read... I am pretty sure this could be done more cleanly, maybe with some built-in class, but I thought I would build one myself to learn...
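As a quick usage sketch (the Q class itself is listed near the end of this post), the writer and reader sides look roughly like this:
 // Sketch: how the two threads use the Q circular buffer defined later in the post.  
 // newSamples is an illustrative int[] of freshly captured audio.  
 Q audio_buffer = new Q(20000);          // capacity, in samples  
 // Writer side (Record Thread):  
 synchronized (audio_buffer) {  
      audio_buffer.put(newSamples);      // returns true on a buffer overrun  
 }  
 // Reader side (Processor Thread):  
 int[] chunk;  
 synchronized (audio_buffer) {  
      chunk = audio_buffer.get();        // everything between r_pointer and w_pointer  
 }  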

So, without further delay, here is the code. Scope.java (in package com.cell0907.scope2) contains the activity and the audio recording thread:
 package com.cell0907.scope2;  
 import android.media.AudioRecord;  
 import android.os.Bundle;  
 import android.os.Message;  
 import android.util.Log;  
 import android.app.Activity;  
 public class Scope extends Activity {  
      public static final boolean JNI=true; // Tells us if we want to use JNI C function or java  
      // RECORDING VARIABLES  
      private AudioRecord AR=null;  
      public int BufferSize;                    // Length of the chunks read from the hardware audio buffer  
      private Thread Record_Thread=null; // The thread filling up the audio buffer (queue)  
      private boolean isRecording = false;  
      public Q audio_buffer=new Q(20000); // Record_Thread read the AR and puts it in here.  
      private static final int AUDIO_SOURCE=android.media.MediaRecorder.AudioSource.MIC;  
      public static final int SAMPLE_RATE = 44100;  
      private static final int CHANNEL_CONFIG = android.media.AudioFormat.CHANNEL_IN_MONO;  
      private static final int AUDIO_FORMAT = android.media.AudioFormat.ENCODING_PCM_16BIT;       
      private ScopeSurfaceView scope_screen_view;  
   @Override  
      protected void onCreate(Bundle savedInstanceState) {  
           super.onCreate(savedInstanceState);  
           Log.d("MyActivity", "onCreate");  
   }  
   @Override  
   public void onPause(){  
        Log.d("MyActivity", "onPause");  
           // TODO Auto-generated method stub  
           if (null != AR) {  
          isRecording = false;  
          boolean retry = true;  
          while (retry) {  
               try {  
                    Record_Thread.join();  
                    retry = false;  
               } catch (InterruptedException e) {}     
          }  
          AR.stop();  
          AR.release();  
          AR = null;  
          //recordingThread = null;  
           }  
           scope_screen_view.surfaceDestroyed(scope_screen_view.getHolder());  
        super.onPause();  
   }  
      protected void onResume(){  
           super.onResume();  
           Log.d("MyActivity", "onResume");  
           scope_screen_view=new ScopeSurfaceView(this,audio_buffer);  
           setContentView(scope_screen_view);  
           BufferSize=AudioRecord.getMinBufferSize(SAMPLE_RATE, CHANNEL_CONFIG, AUDIO_FORMAT);                 
           isRecording=true;  
           Record_Thread=new Thread(new Runnable() {  
          public void run() {  
            Capture_Audio();  
          }  
        },"AudioRecord Thread");  
           Record_Thread.start();  
      }  
      /*  
       * This runs in a separate thread reading the data from the AR buffer and dumping it  
       * into the queue (circular buffer) for processing (in java or C).  
       */  
      public void Capture_Audio(){  
           byte[] AudioBytes=new byte[BufferSize]; //Array containing the audio data bytes  
           int[] AudioData=new int[BufferSize/2]; //Array containing the audio samples  
           try {  
                AR = new AudioRecord(AUDIO_SOURCE,SAMPLE_RATE,CHANNEL_CONFIG,AUDIO_FORMAT,BufferSize);  
                try {  
                     AR.startRecording();  
                } catch (IllegalStateException e){  
                     System.out.println("This didn't work very well");  
                     return;  
                     }  
                } catch(IllegalArgumentException e){  
                     System.out.println("This didn't work very well");  
                     return;  
                     }  
           while (isRecording)  
           {  
                AR.read(AudioBytes, 0, BufferSize); // This is the guy reading the bytes out of the buffer!!  
                //First we will pass the 2 bytes into one sample   
                //It's an extra loop but avoids repeating the same sum many times later  
                //during the filter  
                int r=0;       
                for (int i=0; i<AudioBytes.length-2;i+=2)  
                {// Before the 8 we had the end of the previous data  
                     if (AudioBytes[i]<0)   
                          AudioData[r]=AudioBytes[i]+256;   
                     else   
                          AudioData[r]=AudioBytes[i];  
                     AudioData[r]=AudioData[r]+256*AudioBytes[i+1];       
                     r++;  
                }                      
                // Write on the QUEUE            
                synchronized(audio_buffer){  
                     audio_buffer.put(AudioData);  
                }       
                // Not a very pretty way to hit the handler (declaring everything public)  
                // but just for the sake of demo.  
                Message.obtain(scope_screen_view.ProcessorThread.mHandler,   
                          scope_screen_view.ProcessorThread.DO_PROCESSING, "").sendToTarget();  
           }  
           Log.d("MyActivity", "Record_Thread stopped");  
      }  
   @Override  
      protected void onStop() {  
           super.onStop();  
   }   
 }  

The ScopeSurfaceView.java (to create the display):
 package com.cell0907.scope2;  
 import android.content.Context;  
 import android.util.Log;  
 import android.view.SurfaceHolder;  
 import android.view.SurfaceView;  
 // We extend SurfaceView. Internally (private) SurfaceView creates an object SurfaceHolder  
 // effectively defining the methods of the SurfaceHolder interface. Notice that it does  
 // not create a new class or anything, it just defines it right there. When we extend  
 // the SurfaceView with the SurfaceHolder.Callback interface, we need to add in that extension  
 // the methods of that interface.  
 public class ScopeSurfaceView extends SurfaceView implements SurfaceHolder.Callback {  
      private SurfaceHolder holder;     // This is no instantiation. Just saying that holder  
                                              // will be of a class implementing SurfaceHolder  
      private ScopeThread ScopeThread;// The thread that displays the data  
      public ProcessorThread ProcessorThread; // The thread that reads audio and creates the  
                                              // scope samples  
      private Q source_buffer;          // Audio data  
      private Q scope_buffer=null;      // Buffer for the screen of the scope  
      public ScopeSurfaceView(Context context){  
           super(context);  
      }  
      public ScopeSurfaceView(Context context, Q source) {  
           super(context);  
           source_buffer=source;          // Where to get the samples to display  
           holder = getHolder();          // Holder is now the internal/private mSurfaceHolder object   
                                              // in the SurfaceView object, which is from an anonymous  
                                              // class implementing SurfaceHolder interface.  
           holder.addCallback(this);  
      }  
      @Override  
      public void surfaceCreated(SurfaceHolder holder) {  
      }  
      @Override  
      // This is always called at least once, after surfaceCreated  
      public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {  
        if (scope_buffer==null){  
             scope_buffer=new Q(width);              
                ScopeThread = new ScopeThread(holder,scope_buffer);  
                ScopeThread.setRunning(true);  
                ScopeThread.setSurfaceSize(width, height);  
                ScopeThread.start();  
                Log.v("MyActivity","Screen width "+width);  
                ProcessorThread = new ProcessorThread(source_buffer, scope_buffer);  
                ProcessorThread.setRunning(true);  
                ProcessorThread.start();  
        }  
      }  
      @Override  
      public void surfaceDestroyed(SurfaceHolder holder) {  
           Log.d("MyActivity", "DESTROY SURFACE");  
           boolean retry = true;  
           ScopeThread.setRunning(false);  
           while (retry) {  
                try {  
                     ScopeThread.join();  
                     retry = false;  
                } catch (InterruptedException e) {}  
           }  
           retry = true;  
           Log.d("MyActivity", "Going to stop the processor thread");  
           ProcessorThread.setRunning(false);  
           while (retry) {  
                try {  
                     ProcessorThread.mHandler.getLooper().quit();  
                     ProcessorThread.join();  
                     retry = false;  
                     Log.d("MyActivity", "FAIL");  
                } catch (InterruptedException e) {  
                     Log.d("MyActivity", "FULL FAIL");  
                }  
           }  
   }  
   public Thread getThread() {  
     return ScopeThread;  
   }  
 }  
The ProcessorThread.java (which takes 200 audio samples and creates a pixel for the screen):
 package com.cell0907.scope2;  
 import java.util.Arrays;  
 import android.os.Handler;  
 import android.os.Looper;  
 import android.os.Message;  
 import android.util.Log;  
 public class ProcessorThread extends Thread {  
      private Q source_buffer;                // Where to get the samples from  
      private Q destiny_buffer;               // Where to put the samples to  
   private int speed=200;                 // Number of samples in one pixel  
   private boolean running = true;  
   private int[] reminder;  
   public Handler mHandler;               // Processor handler  
   public static final int DO_PROCESSING = 1; // Message that the audio capture send to  
                                                          // the processing thread.  
      ProcessorThread(Q source, Q destiny){  
     this.source_buffer=source;  
     this.destiny_buffer=destiny;  
     reminder=new int[0];  
      }  
      @Override  
   public void run() {  
           Looper.prepare();  
           mHandler = new Handler()  
           {  
                public void handleMessage(Message msg)  
                {  
                     switch (msg.what)  
                     {  
                     case DO_PROCESSING:  
                          // GOT NEW DATA. ANALYZE  
                          int[] intermediate_buffer;  
                          int[] intermediate_buffer_2;  
                          synchronized(source_buffer){  
                               intermediate_buffer=source_buffer.get();  
                          }  
                          if (Scope.JNI==false) process(intermediate_buffer);  
                          else {  
                               intermediate_buffer_2=processjni(intermediate_buffer);  
                         synchronized(destiny_buffer){  
                              destiny_buffer.put(intermediate_buffer_2);  
                         }  
                          }   
                          break;  
                     }  
                }   
           };  
           Looper.loop();   
           while(running){ // I do not think you actually need this anymore  
           }  
           Log.d("MyActivity", "Processor Thread stopped");  
   }  
      private void process(int[] audio){  
           int x=0;  
           int maximum;   
           // speed is the number of original audio samples that form one  
           // pixel in the screen. As long as we got enough for one, we write it  
           // in.  
           audio=concat(reminder,audio);  
           int i=audio.length;  
           while (i>=speed){  
                maximum=0;  
          for (int j=0;j<speed;j++)  
               if (audio[x+j]>maximum) maximum=audio[x+j];  
               synchronized(destiny_buffer){  
                    destiny_buffer.put(maximum);  
               }  
          x+=speed;  
          i-=speed;  
           }  
           reminder=Arrays.copyOf(reminder, i); // Resize reminder  
           System.arraycopy(audio, x, reminder, 0, i); // Copy the unprocessed tail (it starts at x)  
      }  
      public static int[] concat(int[] first, int[] second) {  
            int[] result = Arrays.copyOf(first, first.length + second.length);  
            System.arraycopy(second, 0, result, first.length, second.length);  
            return result;  
           }  
      private native int[] processjni(int[] audio);  
      static {  
     System.loadLibrary("JNImodule");  
   }    
 }  
The ProcessorThread can do its function in Java (see the code) or by calling an external C function, using JNI and the NDK. JNIlib.c:
 #include <jni.h>  
 #include <stdlib.h>  
 jintArray  
 Java_com_cell0907_scope2_ProcessorThread_processjni (JNIEnv *env, jobject thisObj, jintArray inJNIArray) {  
      int speed=200;  
      static int reminder[200];      // The reminder samples can't be longer than  
                                         // the speed  
      static int left=0;  
   // Step 1: Convert the incoming JNI jintarray to C's jint[]  
   jint *audioin = (*env)->GetIntArrayElements(env, inJNIArray, NULL);  
   if (NULL == audioin) return NULL;  
   jsize length = (*env)->GetArrayLength(env, inJNIArray);  
   // Put together whatever was left from the past (see below)  
   // plus the new audio samples  
   int audio[(int)length+left];  
   memcpy(audio,reminder,left*sizeof(int));  
   memcpy(&audio[left],audioin,(int)length*sizeof(int));  
   // Step 2: Perform its intended operations  
   // speed is the number of original audio samples that form one  
   // pixel in the screen. As long as we got enough for one, we write it  
   // in.  
   jint outCArray[(int)length/speed+1];  
   int x=0,i=0,j;  
   int maximum;  
   left+=(int)length;     // the new samples plus whatever was left from the previous call  
   while (left>=speed){  
        maximum=0;  
        for (j=0;j<speed;j++)  
             if (audio[x+j]>maximum) maximum=audio[x+j];  
        outCArray[i]=maximum;  
        x+=speed;  
        left-=speed;  
        i++;  
   }  
   // Whatever was left, save it for next  
   memcpy(reminder,&audio[x],left*sizeof(int)); // x now points at the unprocessed tail  
   (*env)->ReleaseIntArrayElements(env, inJNIArray, audioin, 0); // release the array pinned by GetIntArrayElements  
   // Step 3: Convert the C's Native jdouble[] to JNI jdoublearray, and return  
   jintArray outJNIArray = (*env)->NewIntArray(env, i); // allocate  
   if (NULL == outJNIArray) return NULL;  
   (*env)->SetIntArrayRegion(env, outJNIArray, 0 , i, outCArray); // copy  
   return outJNIArray;  
 }  
You will need this Android.mk to compile the C function using the NDK:
 # Copyright (C) 2009 The Android Open Source Project  
 #  
 # Licensed under the Apache License, Version 2.0 (the "License");  
 # you may not use this file except in compliance with the License.  
 # You may obtain a copy of the License at  
 #  
 #   http://www.apache.org/licenses/LICENSE-2.0  
 #  
 # Unless required by applicable law or agreed to in writing, software  
 # distributed under the License is distributed on an "AS IS" BASIS,  
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  
 # See the License for the specific language governing permissions and  
 # limitations under the License.  
 #  
 LOCAL_PATH := $(call my-dir)  
 include $(CLEAR_VARS)  
 LOCAL_MODULE  := JNImodule  
 LOCAL_SRC_FILES := JNIlib.c  
 include $(BUILD_SHARED_LIBRARY)  
And finally, the thread that takes the samples created by the Processor Thread and displays them on the screen, ScopeThread.java:
 package com.cell0907.scope2;  
 import android.graphics.Canvas;  
 import android.graphics.Color;  
 import android.graphics.Paint;  
 import android.graphics.Paint.Style;  
 import android.graphics.RectF;  
 import android.util.Log;  
 import android.view.SurfaceHolder;  
 public class ScopeThread extends Thread {  
   private int mCanvasWidth;  
   private int mCanvasHeight;  
      private Q scope_buffer;           // Circular buffer for the scope in the screen  
      int[] intermediate_buffer;               // Will store the display  
      private SurfaceHolder holder;  
   private boolean running = true;  
   private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);  
   private final int refresh_rate=20;      // How often we update the scope screen, in ms  
   public ScopeThread(SurfaceHolder holder, Q scope_buffer) {  
     this.holder = holder;  
     this.scope_buffer=scope_buffer;  
        paint.setColor(Color.BLUE);  
        paint.setStyle(Style.STROKE);  
        paint.setTextSize(50);  
   }  
   @Override  
   public void run() {  
     long previousTime, currentTime;  
        previousTime = System.currentTimeMillis();  
     Canvas canvas = null;  
     while(running) {  
       currentTime=System.currentTimeMillis();  
       while ((currentTime-previousTime)<refresh_rate){  
            currentTime=System.currentTimeMillis();  
       }  
       previousTime=currentTime;  
       // Paint right away, so, it is as smooth as can be...  
       try {  
         canvas = holder.lockCanvas();  
         synchronized (holder) {  
               draw(canvas);           
         }  
       }  
       finally {  
            if (canvas != null) {  
                 holder.unlockCanvasAndPost(canvas);  
                 }  
       }  
       // Update the scope buffer with info from the audio buffer  
                synchronized(scope_buffer){  
                  scope_buffer.set_r_pointer(scope_buffer.get_w_pointer()+1);  
                  intermediate_buffer=scope_buffer.get();     // Reads full buffer  
                }  
                try {  
                     Thread.sleep(refresh_rate-5); // Wait some time till I need to display again  
                } catch (InterruptedException e) {  
                     // TODO Auto-generated catch block  
                     e.printStackTrace();  
                }       
     }  
     Log.d("MyActivity", "Scope Thread stopped");  
   }  
   // Should be called every ~20 ms (refresh_rate)  
   private void draw(Canvas canvas)  
   {  
        int current, previous=mCanvasHeight-10;  
        canvas.drawColor(Color.BLACK);  
        paint.setColor(Color.WHITE);  
        canvas.drawRect(new RectF(1,1,mCanvasWidth-1,mCanvasHeight-1), paint);  
        paint.setColor(Color.RED);  
        for(int x=1;x<mCanvasWidth-2;x++){  
             current=mCanvasHeight-intermediate_buffer[x]*(mCanvasHeight-11)/32767-10;  
             canvas.drawLine(x,previous,x+1,current,paint);  
             previous=current;  
        }  
   }  
   public void setRunning(boolean b) {  
     running = b;  
   }  
   public void setSurfaceSize(int width, int height) {  
        synchronized (holder){  
          mCanvasWidth = width;  
          mCanvasHeight = height;  
        }  
        intermediate_buffer=new int[mCanvasWidth];  
   }  
 }  
And yeah, we can't forget the brains behind the management of the circular buffers (the audio samples and the scope pixels), handled by the Q class, Q.java:
 package com.cell0907.scope2;  
 import java.util.Arrays;  
 public class Q {  
      private int[] Data;           // Circular buffer  
      private int Buffer_length;     // How long the buffer is before it wraps around  
      private int w_pointer;          // Position to the next element to write in the buffer   
      private int r_pointer;          // Position to the next element to read from the buffer  
      Q(int length){  
           Buffer_length=length;  
           Data=new int[Buffer_length];  
           w_pointer=0;  
           r_pointer=Buffer_length-1;  
      }  
      int get_w_pointer(){  
           return w_pointer;  
      }  
      void set_r_pointer(int pointer){  
           if (pointer<0)  
                this.r_pointer=Buffer_length+pointer;  
           else if (pointer>=Buffer_length)  
                this.r_pointer=pointer%Buffer_length;  
           else  
                this.r_pointer=pointer;  
      }  
      int get_r_pointer(){  
           return r_pointer;  
      }  
      /*  
       * Places an amount of data in the buffer and returns true  
       * if there is a buffer over run (write pointer goes over the  
       * read pointer)  
       */  
      public boolean put(int[] data_in){  
           int i=0;  
           boolean error=false;  
           while (i<data_in.length)  
           {  
                if (w_pointer==r_pointer) error=true;  
                Data[w_pointer]=data_in[i];  
                i++;  
                w_pointer++;  
                if (w_pointer>Buffer_length-1) w_pointer=0;       
           }  
           return error;  
      }  
      /*  
       * Places an single element of data in the buffer and returns true  
       * if there is a buffer over run (write pointer goes over the  
       * read pointer)  
       */       
      public boolean put(int data_in){  
           boolean error=false;  
           if (w_pointer==r_pointer) error=true;  
           Data[w_pointer]=data_in;  
           w_pointer++;  
           if (w_pointer>Buffer_length-1) w_pointer=0;  
           return error;  
      }  
      /*  
       * Returns all the data available in the buffer,  
       * basically, from r_pointer to w_pointer at the time of the call.  
       */  
      public int[] get(){  
           int[] data_out=new int[Buffer_length];  
           int i=0;       
           while (r_pointer!=w_pointer) // Reads till the end of the buffer  
           {  
                data_out[i]=Data[r_pointer];  
                i++;  
                r_pointer++;  
                if (r_pointer>Buffer_length-1) r_pointer=0;  
           }  
           return Arrays.copyOf(data_out,i);  
      }  
      // Reads n elements or none. In the 2nd case it still returns whatever it could  
      // read, but it does not move the r_pointer.  
      public int[] get(int n){  
           int[] data_out=new int[Buffer_length];  
           int i=0;  
           int r_pointer_backup = r_pointer;  
           while ((r_pointer!=w_pointer) && (i<n))     // Reads till the end of the buffer  
                                                             // or till we get n elements  
           {  
                data_out[i]=Data[r_pointer];  
                i++;  
                r_pointer++;  
                if (r_pointer>Buffer_length-1) r_pointer=0;  
           }  
           if (i<n) r_pointer=r_pointer_backup; // As you couldn't read the whole array, go back  
           return Arrays.copyOf(data_out,i);  
      }  
      // Reads one element.  
      public int getone(){  
           int i=Data[r_pointer];  
           r_pointer++;  
           if (r_pointer>Buffer_length-1) r_pointer=0;  
           return i;  
      }  
 }  
And the AndroidManifest.xml:
 <?xml version="1.0" encoding="utf-8"?>  
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"  
   package="com.cell0907.scope2"  
   android:versionCode="1"  
   android:versionName="1.0" >  
   <uses-sdk  
     android:minSdkVersion="9"  
     android:targetSdkVersion="17" />  
   <uses-permission  
           android:name="android.permission.RECORD_AUDIO" />  
   <application  
     android:allowBackup="true"  
     android:icon="@drawable/ic_launcher"  
     android:label="@string/app_name"  
     android:theme="@style/AppTheme" android:debuggable="true">  
     <activity  
       android:name="com.cell0907.scope2.Scope"  
       android:label="@string/app_name"   
       android:screenOrientation="landscape">  
       <intent-filter>  
         <action android:name="android.intent.action.MAIN" />  
         <category android:name="android.intent.category.LAUNCHER" />  
       </intent-filter>  
     </activity>  
   </application>  
 </manifest>  

And if you made it till here, you deserve a brand new icon :P



Anyhow, hopefully it is easy to follow. Don't forget to check the links in the post, as some of them explain what I consider the tougher pieces of the code.

Cheers!

PS.: Please, click here to see an index of other posts on Android.

Communication between two threads in Android

The source code for this explanation can be found here. That is a bigger example, so I focus here on the communication portion between two threads. Notice that one of the threads could be the UI thread, but it does not have to be (in our case it's not). The direction does not matter either; this is general enough.

In the example, one thread pulls data from the microphone and stores it in a buffer. It then wants to let another thread, in charge of processing the data, know that data is available. That way the processing thread does not have to poll/check all the time.

So, to the bottom line... to do this:

1. We declare a handler. All it does is listen for incoming messages, so it has to be created in the receiving thread. We do that in this line of ProcessorThread.java:
  public Handler mHandler;               // Processor handler  
2. We send messages to that handler. As we will likely do this from a different thread than the receiving one, we need a reference to the handler. As any time you want to access an object from another object, there are several ways: one is to pass a reference to it, for instance through the constructor; the other (used here) is to simply declare the object public, so it is visible from outside its container (see the line of code above). Then, to send a message, all the Recording Thread needs to do is spell out the path to the handler:
Message.obtain(scope_screen_view.ProcessorThread.mHandler,   
                          scope_screen_view.ProcessorThread.DO_PROCESSING, "").sendToTarget();
The first parameter is the handler reference; the second is which message to send (also defined, in this case, in the receiving thread). It is just a constant, so it could be defined anywhere, but from an overall design standpoint it makes sense for it to live in the receiver.

3. Finally, we create the handler itself, i.e., the code that decides what to do with the received messages. That, again, is done within the ProcessorThread:
 Looper.prepare();  
 mHandler = new Handler()  
 {  
     public void handleMessage(Message msg)  
     {  
         switch (msg.what)  
         {  
             case DO_PROCESSING:  
                 // GOT NEW DATA. ANALYZE  
                 int[] intermediate_buffer;  
                 int[] intermediate_buffer_2;  
                 synchronized(source_buffer){  
                     intermediate_buffer=source_buffer.get();  
                 }  
                 if (Scope.JNI==false) process(intermediate_buffer);  
                 else {  
                        intermediate_buffer_2=processjni(intermediate_buffer);  
                        synchronized(destiny_buffer){  
                              destiny_buffer.put(intermediate_buffer_2);  
                         }  
                 }   
                 break;  
         }  
     }   
};  
Looper.loop();
The key things here are:
a/ The handleMessage method, which receives the message and selects what to do with it, here with a switch-case (since we only have one message, an "if" would have worked too). Based on the case, we execute some piece of code (in this case, processing the new audio samples that were placed in the Audio Buffer).
b/ The Looper, the class used to run a message loop for a thread. Threads, unlike the main UI thread, do not have a message loop by default, so that is what we set up here: we create the handler between Looper.prepare() and Looper.loop().
Note: if you are creating a handler on the main UI thread, you just declare/create it; no Looper calls are needed. For instance, to launch different AsyncTasks from the main UI depending on the incoming messages, this is all you would do in the UI:
 private Handler threadHandler = new Handler() {  
           public void handleMessage(android.os.Message msg) {            
           switch(msg.what){  
                case ACK_PLAYED:  
                     new async_recorder(getApplicationContext(),threadHandler,audio_buffer).execute();  
                     break;  
                case ACK_RECORDED:  
                     new async_processor(getApplicationContext(),threadHandler,audio_buffer).execute();
                     break;       
                case ACK_LOADED:  
                     new async_processor(getApplicationContext(),threadHandler,audio_buffer).execute();  
                     break;  
                case ACK_PROCESSED:  
                     mResult.setText((String)msg.obj);  
                     break;  
                case ACK_SAVED:  
                     mResult.setText("SAVED");  
                     break;  
           }            
           }  
      };  

4/ That is it. Nevertheless, there is one small tricky thing related to threads. When you have a (handler-less) thread running and want to stop it (for instance, when your app moves to the background), the traditional way is a variable, inside the thread, that gates execution and can be manipulated from outside: you put the thread code within a "while (running) {//code}" loop and then use a user-defined method on the thread to set running to true (to run) or false (to exit). The example code does that. Within the thread to be stopped we have:
 public void run() {  
           Looper.prepare();  
           mHandler = new Handler()  
           {  
                public void handleMessage(Message msg)  
                {  
                     switch (msg.what)  
                     {  
                     case DO_PROCESSING:  
                          // Do whatever... 
                          break; 
                     case "other one":  
                          // Do whatever else... 
                          break;   
                     }  
                }   
           };  
           Looper.loop();   
           while(running){ // <<< This is what we are talking about
           }  
   }  
Although the "running" variable is private, we can externally change it through it with a public method that the class exposes:
   public void setRunning(boolean b) {  
        Log.d("MyActivity", "RUNNING "+b);  
     running = b;  
   } 
So, when we want to stop it, for instance when the surface is destroyed, we just do:
public void surfaceDestroyed(SurfaceHolder holder) {  
           boolean retry = true;  
           ProcessorThread.setRunning(false);  
           while (retry) {  
                try {  
                     ProcessorThread.join();  
                     retry = false;  
                } catch (InterruptedException e) {}  
           }  
      }  
Nevertheless, in our case we have the Looper waiting for messages to come in. While the looper is running, the thread will simply not stop; to stop it, you have to stop the looper. That happens when surfaceDestroyed calls:
ProcessorThread.mHandler.getLooper().quit();
So I think that in this case you basically don't need the "running=false" approach... I tested it and that seems to be the case, but if anybody knows otherwise, please let me know...
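Putting the pieces together, the whole shutdown path for a looper thread reduces to something like this (a sketch using the same calls as above):
 ProcessorThread.mHandler.getLooper().quit(); // Looper.loop() returns, so run() can finish  
 boolean retry = true;  
 while (retry) {  
      try {  
           ProcessorThread.join();             // wait for the thread to actually exit  
           retry = false;  
      } catch (InterruptedException e) {}  
 }  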
Cheers!

PS.: Please, click here to see an index of other posts on Android.

Thursday, December 5, 2013

Cell0907 blog topics

News: I have split some of the content onto another blog (cell0908), with the purpose of helping someone with little background in biotech: identifying opportunities for contribution and covering the knowledge map as fast as possible (the 80% of the learning curve :) ).

As for the main subject of this blog, below is the programming stuff you may be looking for. I haven't worked on organizing the material in a very long time. Click here to get the most recent posts (everything).

Latest stuff (quite old, I know):
  1. June 9th, 2014. Using face tracking, I created a 3D display, which seems to use a technique similar to what Amazon will be using on their phone.
For the main topics of this blog:
  1. For OpenCV index, please click here.
  2. For Android index, please click here.
  3. For my tourist guide to Barcelona, please click here.
  4. For these and all other posts, please select a tab from the top.
Thank you for stopping by!

Sunday, December 1, 2013

Android Debugging

A quick overview of some features for debugging Android apps under Eclipse. I'll add more as I discover them...

If you look at the bottom right of your IDE (at least that's where it is for me), you can see the following panel. If for any reason you prefer to edit in full-screen mode (double-clicking on the tab of the given source file), this panel will pop up when you run the app.


There are several tabs. If you click on Devices, you can see whatever devices are connected to the computer; on the same tab you can also take a snapshot of the selected device's screen. Now let's look at the LogCat tab, which has to do with Android logging. You can have the program output messages (logs) to your console as it executes, which is very useful for tracing what is happening. Simply use something like Log.v("TAG", "Stuff to log");
  1. The "v" in log v indicates "verbose". You can use others (see the link) but I seldom do. 
  2. TAG is whatever you want to put so that you can group/filter the logs/messages. TAG can be just text that you put right there, between "" or some folks will make it part of a constants class and use Log.d(Constants.LOG, "started recording"); In our case/picture above, TAG=MyActivity.
  3. "Stuff to log" is the message itself (something that you write to yourself). You can see things like "Scoped Thread Stopped"
An older method to do this was System.out.println("started recording"); but the one above is proper logging, with all its advantages (filtering, etc...). See here for the difference between the two.
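For illustration, the typical pattern is to define a TAG constant per class and log against it (the class name and message here are just examples):
 public class MyActivity extends Activity {  
      private static final String TAG = "MyActivity"; // the tag you filter on in LogCat  
      @Override  
      protected void onCreate(Bundle savedInstanceState) {  
           super.onCreate(savedInstanceState);  
           Log.v(TAG, "onCreate called"); // shows up under the MyActivity tag  
      }  
 }  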

Also, be careful with the use of BuildConfig.DEBUG. This is intended to let you disable the logging once you are ready for release, but some folks say that it doesn't work, although others say it does.
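The usual way it is meant to be used is as a guard around the log calls, something like:
 if (BuildConfig.DEBUG) {  
      Log.v(TAG, "debug only message"); // BuildConfig.DEBUG should be false on release builds  
 }  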

While all this debugging is done over a USB cable, there may be a method for debugging over Wi-Fi.
The link above didn't make it all that clear, so I found this other one. Unfortunately, I never got it to work (they say that you may need a rooted device). I run adb kill-server; then adb tcpip 5555 tells me that the daemon started successfully, but it gets stuck on "restarting in TCP mode port: 5555".

Anyhow, not a great post but hope it was useful to somebody... I'll update this as I get it to work...

Android Index of Posts and Useful Links

Index of my posts and links to other useful stuff on Android, initially a bit disorganized as I keep adding things, but I think it will do the job for some time :)

Installation/Environment
  1. An overview of the tools around for programming in Android the "official/standard/free" way.
  2. The install guide is very well documented by Google. Nevertheless, as I was installing OpenCV in Android, I put together this step-by-step tutorial on the installation of both. Check it if, for any reason, you are having problems following the official instructions.
  3. Copy a project in Eclipse
  4. Some debugging tips like logging/screen capture...
  5. 10 ADB commands (external link) and all (?) commands.
Programming
  1. Tutorial for Drawing in Android - Views/Layout. Basic stuff, nothing dynamic...
  2. Tutorial for advanced Drawing in Android -SurfaceView and SurfaceHolder
  3. Displaying bitmaps as fast as possible (or as I possibly can :) )
  4. Importing fonts
  5. Retrieving all files in a directory
  6. Loading jpg
  7. Saving to a file in Android
  8. Flash light.
  9. Camera capture
  10. AsyncTask  
  11. Communication between threads.
  12. LruCache
  13. Creating a timer
  14. NDK: Android Native Development.
  15. Scope application (displays the audio vs time in a rolling graph in the screen)  
  16. Using some of the techniques above and OpenCV I created a 3D display, which seems to use a technique similar to what Amazon is going to be using on their phone.
Other Code Examples
  1. For a list of examples in OpenCV, please see this index.
  2. Basic4Android example on a Baby Flashcards app.

Flash light control in Android

I am not really satisfied with this. I am just turning it on and off, but I wanted to be able to adjust the intensity. See some links all the way at the bottom that may help with that. Got to do more research. My email to HTC was completely fruitless... :(

Also, sorry I didn't clean it up much. It is pretty straightforward. The only other thing besides learning to turn on the LED is how to throw an alert to the user (see below, for the case where the phone does not have a camera flash light). But anyhow, here is the code as I have it (Led.java): 
 package cell0907.example.led;  
 import android.hardware.Camera;  
 import android.hardware.Camera.Parameters;  
 import android.os.Bundle;  
 import android.app.Activity;  
 import android.app.AlertDialog;  
 import android.content.DialogInterface;  
 import android.content.pm.PackageManager;  
 import android.util.Log;  
 import android.view.Menu;  
 import android.view.View;  
 import android.widget.Button;  
 import android.widget.Toast;  
 public class Led extends Activity {  
      public static Camera cam = null;// has to be static, otherwise onDestroy() destroys it  
      private Button light_switch;  
   private boolean isFlashOn;  
   private boolean hasFlash;  
   Parameters p;  
   static final String TAG="MyActivity";  
      @Override  
      protected void onCreate(Bundle savedInstanceState) {  
           super.onCreate(savedInstanceState);  
           setContentView(R.layout.activity_led);  
           light_switch=(Button)this.findViewById(R.id.button1);  
           /*  
            * First check if device is supporting flashlight or not  
            */  
           hasFlash = getApplicationContext().getPackageManager()  
               .hasSystemFeature(PackageManager.FEATURE_CAMERA_FLASH);  
           if (!hasFlash) {  
             // device doesn't support flash  
             // Show alert message and close the application  
             AlertDialog alert = new AlertDialog.Builder(Led.this)  
                 .create();  
             alert.setTitle("Error");  
             alert.setMessage("Sorry, your device doesn't support flash light!");  
             alert.setButton(DialogInterface.BUTTON_POSITIVE, "OK", new DialogInterface.OnClickListener() {  
               public void onClick(DialogInterface dialog, int which) {  
                 // closing the application  
                 finish();  
               }  
             });  
             alert.show();  
             return;  
           }  
           light_switch.setOnClickListener(new View.OnClickListener() {  
                @Override  
                public void onClick(View v) {  
               if (isFlashOn) {  
                 // turn off flash  
                    flashLightOff();  
                    light_switch.setText("Turn on");  
               } else {  
                 // turn on flash  
                    light_switch.setText("Turn off");  
                    flashLightOn();  
               }  
                }  
           });  
      }  
      public void flashLightOn() {  
           Log.v(TAG, "ON");  
        try {  
             getcamera();  
       p.setFlashMode(Parameters.FLASH_MODE_TORCH);  
       cam.setParameters(p);  
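       // Note: on some devices the preview apparently has to be running for FLASH_MODE_TORCH to take effect  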
       cam.startPreview();  
       isFlashOn=true;  
        } catch (Exception e) {  
          e.printStackTrace();  
          Toast.makeText(getBaseContext(), "Exception flashLightOn()",  
              Toast.LENGTH_SHORT).show();  
        }  
      }  
      public void getcamera(){  
           if (cam==null){  
                try{  
                     Log.v(TAG, "CAMERA CONFIG");  
               cam = Camera.open();  
               p = cam.getParameters();  
                }  
                catch (RuntimeException e) {  
         Log.e("Camera Error. Failed to Open. Error: ", e.getMessage());  
                }  
      }  
      }  
      public void flashLightOff() {  
           Log.v(TAG, "OFF");  
        if (cam!=null) try {  
            cam.stopPreview();  
            cam.release();  
            cam = null;  
               isFlashOn=false;  
        } catch (Exception e) {  
          e.printStackTrace();  
          Toast.makeText(getBaseContext(), "Exception flashLightOff",  
              Toast.LENGTH_SHORT).show();  
        }  
      }       
      @Override  
      public boolean onCreateOptionsMenu(Menu menu) {  
           // Inflate the menu; this adds items to the action bar if it is present.  
           getMenuInflater().inflate(R.menu.led, menu);  
           return true; // return true, otherwise the menu is not displayed  
      }  
      @Override  
      protected void onPause() {  
        super.onPause();  
        // on pause turn off the flash  
        flashLightOff();  
      }  
 }  


The other thing to remember is to add the following lines to the Manifest so that the app gets the right permissions:
<uses-permission android:name="android.permission.CAMERA"/> 
<uses-permission android:name="android.permission.FLASHLIGHT"/>


Others:
  1. A hint on the xda-developers site. In my case I had a brightness file with 0 inside and a max_brightness file with 255 inside. I even tried setting the second one to zero, but my flashlight app just kept working... 
  2. This other link points to something similar and explains a way to test it quickly. Got to try it.
  3. A much more elaborate flashlight example.
  4. A completely different method without using the SDK. It may be worth exploring whether that can control the intensity.
  5. Full source code of another flashlight app.
  6. Answered question in Stackoverflow.
  7. Unanswered question in Stackoverflow.
  8. Please, click here to see an index of other posts on Android.