Thursday, January 30, 2014

Engineering quotes that I agree with...

“The path to the CEO's office should not be through the CFO's office, and it should not be through the marketing department. It needs to be through engineering and design.”
-- Elon Musk, CEO and chief product architect, Tesla Motors

“The worst thing in the world that can happen to you if you’re an engineer that has given life to something is for someone to rip it off and put their name on it.”
-- Tim Cook, CEO of Apple Inc.

“So here we have pi squared, which an engineer would call 10.”
-- Frank King, cartoonist, creator of Gasoline Alley

“You go to a technology conference or an engineering conference, there are very few women there. At the same time, it’s a blessing in the fact that you do get noticed. People tend to remember you as the only woman in the room ‘who said that’ or the only woman in the room who was an engineer.”
-- Padmasree Warrior, CTO of Cisco Systems, former CTO of Motorola Inc.

“Let’s face it: Engineering companies in general have more men than women. Google has tried really hard to recruit women. On the other hand, we have a standard. Google tries to recruit the best engineers.”
-- Susan Wojcicki, senior vice president in charge of product management and engineering at Google

Source: I got these, along with many others, from the following link.

A few more:
"I do not know one millionth part of one percent about anything"
-- Thomas Edison


Saturday, January 25, 2014

Detect faces with Android SDK, no OpenCV

By now, we have tackled this in many ways :), usually with OpenCV. See a list of posts here...
This time we are going to take this post, where we detected the faces with OpenCV in Android, and swap the face detection portion (CascadeClassifier + detectMultiScale) for the FaceDetector.findFaces method in the Android SDK. Everything else is the same as in that post; the only things we need to do are:
  1. Convert Mat to Bitmap on the right format for the face detection method (RGB_565).
  2. Do the face detection with the Android SDK.
We also reduce the original image resolution so that the detection happens much faster, as we did in the original post. Although I copy all the code here so you don't have to go back and forth, the part that changes is what goes after VIEW_MODE_GRAY inside the "public Mat onCameraFrame(CvCameraViewFrame inputFrame)" method. Notice that we still use the OpenCV framework to capture and display the image, not the Android SDK approach.
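To see the swap in isolation before the full listing, this is roughly the replaced step (a minimal sketch; low_res is the quarter-size Mat from the code below):

 // Convert the downscaled Mat to an RGB_565 Bitmap, which is what
 // android.media.FaceDetector requires, then run the detection on it.
 Bitmap bmp = Bitmap.createBitmap(low_res.width(), low_res.height(),
         Bitmap.Config.RGB_565);
 Utils.matToBitmap(low_res, bmp);
 int maxNumFaces = 1;
 FaceDetector fd = new FaceDetector(bmp.getWidth(), bmp.getHeight(), maxNumFaces);
 FaceDetector.Face[] faces = new FaceDetector.Face[maxNumFaces];
 int numFacesFound = fd.findFaces(bmp, faces); // synchronous call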

Note: it seems that the FaceDetector used here is not the one my built-in camera app is using. I know this from simple performance evaluation. For instance, if I rotate the camera, the camera app still detects my face but FaceDetector loses it. There seems to be one more way in the SDK to detect faces, starting from Android 4.0, which I have not tried (so I don't know its performance). Still, that is probably not what is used in the camera app either. This post here points to the same and has no answer, in case anybody wants to earn some StackOverflow points :). I agree, though, that relying on the built-in camera app would likely tie my software to my particular phone.
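For reference, and purely as an untested sketch on my part (I have not run this), that newer way is the face detection built into android.hardware.Camera (API 14+). It would look something like this:

 // Untested sketch of the Android 4.0+ hardware face detection path.
 // Assumes an open android.hardware.Camera with an active preview.
 camera.setFaceDetectionListener(new Camera.FaceDetectionListener() {
      @Override
      public void onFaceDetection(Camera.Face[] faces, Camera camera) {
           for (Camera.Face face : faces) {
                // face.rect comes in driver coordinates (-1000..1000) and
                // must be mapped to view coordinates before drawing.
                Log.v("MyActivity", "Face at " + face.rect);
           }
      }
 });
 camera.startPreview();
 if (camera.getParameters().getMaxNumDetectedFaces() > 0)
      camera.startFaceDetection(); // only works if the hardware supports it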

_3DActivity.java
 /*  
  * Working demo of face detection (remember to put the camera in horizontal)  
  * using the Android SDK FaceDetector (capture/display still done with OpenCV).  
  * Adapted from http://cell0907.blogspot.com/2014/01/detecting-faces-in-android-with-opencv.html  
  */  
 package com.cell0907.td1;  
 import org.opencv.android.BaseLoaderCallback;  
 import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;  
 import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;  
 import org.opencv.android.LoaderCallbackInterface;  
 import org.opencv.android.OpenCVLoader;  
 import org.opencv.android.Utils;  
 import org.opencv.core.Core;  
 import org.opencv.core.CvException;  
 import org.opencv.core.CvType;  
 import org.opencv.core.Mat;  
 import org.opencv.core.Scalar;  
 import org.opencv.core.Size;  
 import org.opencv.imgproc.Imgproc;  
 import org.opencv.core.Point;  
 import android.app.Activity;  
 import android.graphics.Bitmap;  
 import android.graphics.PointF;  
 import android.media.FaceDetector;  
 import android.media.FaceDetector.Face;  
 import android.os.Bundle;  
 import android.util.Log;  
 import android.view.Menu;  
 import android.view.MenuItem;  
 import android.view.SurfaceView;  
 import android.view.WindowManager;  
 public class _3DActivity extends Activity implements CvCameraViewListener2 {  
   private static final int         VIEW_MODE_CAMERA  = 0;  
   private static final int         VIEW_MODE_GRAY   = 1;  
   private static final int         VIEW_MODE_FACES  = 2;  
   private MenuItem             mItemPreviewRGBA;  
   private MenuItem             mItemPreviewGrey;  
   private MenuItem             mItemPreviewFaces;  
   private int               mViewMode;  
   private Mat               mRgba;  
   private Mat               mGrey;  
   private int                              screen_w, screen_h;  
   private Tutorial3View            mOpenCvCameraView;   
   private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {  
     @Override  
     public void onManagerConnected(int status) {  
       switch (status) {  
         case LoaderCallbackInterface.SUCCESS:  
         {  
           // Load native library after(!) OpenCV initialization  
           mOpenCvCameraView.enableView();        
         } break;  
         default:  
         {  
           super.onManagerConnected(status);  
         } break;  
       }  
     }  
   };  
   public _3DActivity() {  
   }  
   /** Called when the activity is first created. */  
   @Override  
   public void onCreate(Bundle savedInstanceState) {  
     super.onCreate(savedInstanceState);  
     getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);  
     setContentView(R.layout.tutorial2_surface_view);  
     mOpenCvCameraView = (Tutorial3View) findViewById(R.id.tutorial2_activity_surface_view);  
     mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);  
     mOpenCvCameraView.setCvCameraViewListener(this);  
   }  
   @Override  
   public void onPause()  
   {  
     super.onPause();  
     if (mOpenCvCameraView != null)  
       mOpenCvCameraView.disableView();  
   }  
   @Override  
   public void onResume()  
   {  
     super.onResume();  
     OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this, mLoaderCallback);  
   }  
   public void onDestroy() {  
     super.onDestroy();  
     if (mOpenCvCameraView != null)  
       mOpenCvCameraView.disableView();  
   }  
   public void onCameraViewStarted(int width, int height) {  
        screen_w=width;  
        screen_h=height;  
     mRgba = new Mat(screen_w, screen_h, CvType.CV_8UC4);  
     mGrey = new Mat(screen_w, screen_h, CvType.CV_8UC1);  
     Log.v("MyActivity","Height: "+height+" Width: "+width);  
   }  
   public void onCameraViewStopped() {  
     mRgba.release();  
     mGrey.release();  
   }  
   public Mat onCameraFrame(CvCameraViewFrame inputFrame) {  
        long startTime = System.nanoTime();  
        long endTime;  
        boolean show=true;  
        mRgba=inputFrame.rgba();  
        if (mViewMode==VIEW_MODE_CAMERA) {  
             endTime = System.nanoTime();  
          if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mRgba;  
        }     
        if (mViewMode==VIEW_MODE_GRAY){  
          Imgproc.cvtColor( mRgba, mGrey, Imgproc.COLOR_BGR2GRAY);           
          endTime = System.nanoTime();  
          if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mGrey;  
        }  
        // REDUCE THE RESOLUTION TO EXPEDITE THINGS  
        Mat low_res = new Mat(screen_w, screen_h, CvType.CV_8UC4);  
        Imgproc.resize(mRgba,low_res,new Size(),0.25,0.25,Imgproc.INTER_LINEAR);  
        Bitmap bmp = null;  
        try {  
          bmp = Bitmap.createBitmap(low_res.width(), low_res.height(), Bitmap.Config.RGB_565);  
          Utils.matToBitmap(low_res, bmp);  
        }  
        catch (CvException e){Log.v("MyActivity",e.getMessage());}  
           int maxNumFaces = 1; // Set this to whatever you want  
           FaceDetector fd = new FaceDetector((int)(screen_w/4),(int)(screen_h/4),  
                  maxNumFaces);  
           Face[] faces = new Face[maxNumFaces];  
           try {  
                  int numFacesFound = fd.findFaces(bmp, faces);  
                  if (numFacesFound<maxNumFaces) maxNumFaces=numFacesFound;  
                  for (int i = 0; i < maxNumFaces; ++i) {  
                       Face face = faces[i];  
                       PointF MidPoint = new PointF();  
                 face.getMidPoint(MidPoint);  
 /*                      Log.v("MyActivity","Face " + i + " found with " + face.confidence() + " confidence!");  
                       Log.v("MyActivity","Face " + i + " eye distance " + face.eyesDistance());  
                       Log.v("MyActivity","Face " + i + " midpoint (between eyes) " + MidPoint);*/  
                       Point center= new Point(4*MidPoint.x, 4*MidPoint.y);  
                       Core.ellipse( mRgba, new Point(center.x,center.y), new Size(8*face.eyesDistance(), 8*face.eyesDistance()), 0, 0, 360, new Scalar( 255, 0, 255 ), 4, 8, 0 );  
                  }  
             } catch (IllegalArgumentException e) {  
                  // From Docs:  
                  // if the Bitmap dimensions don't match the dimensions defined at initialization   
                  // or the given array is not sized equal to the maxFaces value defined at   
                  // initialization  
                  Log.v("MyActivity","Argument dimensions wrong");  
             }  
           if (mViewMode==VIEW_MODE_FACES) {  
                endTime = System.nanoTime();  
                if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mRgba;  
                //return low_res;  
        }          
           return mRgba;  
    }  
   @Override  
   public boolean onCreateOptionsMenu(Menu menu) {  
     mItemPreviewRGBA = menu.add("RGBA");  
     mItemPreviewGrey = menu.add("Grey");  
     mItemPreviewFaces = menu.add("Faces");  
     return true;  
   }  
   public boolean onOptionsItemSelected(MenuItem item) {  
     if (item == mItemPreviewRGBA) {  
       mViewMode = VIEW_MODE_CAMERA;  
     } else if (item == mItemPreviewGrey) {  
       mViewMode = VIEW_MODE_GRAY;  
     } else if (item == mItemPreviewFaces) {  
       mViewMode = VIEW_MODE_FACES;  
     }  
     return true;  
   }    
 }  

For the rest of the files, please check the original post.

Just a final note... Somebody may wonder why I am mixing OpenCV with face detection from the Android SDK. I am using OpenCV because it allows me to display something completely unrelated to what the camera is capturing (I need that for my final app). Although I found a method to do that without OpenCV, I think it doesn't work on all the devices out there. And I use the face detection from the Android SDK because I think it is more robust: it still works, for instance, when turning the head... The weird thing is that it doesn't seem to work as well as the one in my camera app (the one that comes from the factory with my phone, an HTC One). I also thought it would be faster, but it actually looks slower...

PS.: Reference I used...

Monday, January 20, 2014

Detecting faces in Android with OpenCV

Here we are going to detect faces in images captured by the camera of an Android phone. Notice that Android has built-in support for this specific function (see my other post on that), but this example can be applied to anything else where we want to use CascadeClassifier. You can also see an example of the same thing for the PC using Java/OpenCV. 

A few things to highlight from this post:

1. We wanted to load the .xml cascade classifier from the raw resources, not from a file somewhere. As CascadeClassifier::load needs a file path and raw resources do not have one, the only way I found around this is to save the XML to a file (at the beginning of execution) and load it from there. Explained here. Other stuff I tried is documented at the end, but didn't work...
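Condensed, the workaround is just this (the full load_cascade method appears in the code below):

 // Copy the raw resource to a private file so CascadeClassifier gets a real path.
 InputStream is = getResources().openRawResource(R.raw.lbpcascade_frontalface);
 File cascadeFile = new File(getDir("cascade", Context.MODE_PRIVATE),
         "lbpcascade_frontalface.xml");
 FileOutputStream os = new FileOutputStream(cascadeFile);
 byte[] buffer = new byte[4096];
 int bytesRead;
 while ((bytesRead = is.read(buffer)) != -1) os.write(buffer, 0, bytesRead);
 is.close();
 os.close();
 face_cascade = new CascadeClassifier(cascadeFile.getAbsolutePath());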

2. Also, notice that unlike with the plain Android camera, in general, and as we have seen in another example, what you capture and what you display do not have to be the same. Basically, the onCameraFrame method is called automatically (a callback) when a frame from the camera is ready. This method returns the Mat that is to be presented on the display (automatically). So, inside the method we can take what the camera captured (an object of the CvCameraViewFrame class that represents a camera frame) and return whatever we want, related or not to the capture, which will then be displayed on the screen. Note: if you just want to "cut and paste" something that gets your Android camera going and capturing frames to process in OpenCV, you can use the explanation here.
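As a toy illustration of that decoupling (not part of the app below; a real app would reuse or release the Mat instead of allocating one every frame), the callback could ignore the capture and return, say, a solid green frame:

 // Toy example: display something unrelated to what the camera captured.
 public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
      Mat captured = inputFrame.rgba(); // process this however we like...
      // ...but hand back a solid green frame for display instead
      return new Mat(captured.size(), CvType.CV_8UC4, new Scalar(0, 255, 0, 255));
 }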

3. Finally, notice how we choose a smaller capture size (setting the camera to a lower resolution) in order to expedite the analysis. Using full resolution, the processing was taking 750ms on my HTC One! Obviously, we can't work with that... The issue in this case is that we display that same image, so the display quality is not good. We could, of course, choose to capture at full resolution, scale down, detect the faces, highlight them in the original capture and display it (at full resolution). The first code below uses the first method, while the second code below uses the second method.
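In other words, the only bookkeeping the second method needs is the scale factor: detect on the quarter-size image, then multiply the coordinates by 4 (the inverse of 0.25) before drawing on the full-resolution frame. This is the core of what the second code below does:

 // Detect on a quarter-size copy, then map results back to full resolution.
 Imgproc.resize(mGrey, low_res, new Size(), 0.25, 0.25, Imgproc.INTER_LINEAR);
 face_cascade.detectMultiScale(low_res, faces);
 for (Rect rect : faces.toArray()) {
      // Multiply by 4 to recover full-resolution coordinates.
      Point center = new Point(4*(rect.x + rect.width*0.5), 4*(rect.y + rect.height*0.5));
      Core.ellipse(mRgba, center, new Size(rect.width*2, rect.height*2),
                0, 0, 360, new Scalar(255, 0, 255), 4, 8, 0);
 }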

_3DActivity.java
 /*  
  * Working demo of face detection (remember to put the camera in horizontal)  
  * using OpenCV/CascadeClassifier. Capture at lower resolution  
  */  
 package com.cell0907.td1;  
 import java.io.File;  
 import java.io.FileOutputStream;  
 import java.io.IOException;  
 import java.io.InputStream;  
 import java.util.Arrays;  
 import java.util.List;  
 import org.opencv.android.BaseLoaderCallback;  
 import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;  
 import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;  
 import org.opencv.android.LoaderCallbackInterface;  
 import org.opencv.android.OpenCVLoader;  
 import org.opencv.android.Utils;  
 import org.opencv.core.Core;  
 import org.opencv.core.CvType;  
 import org.opencv.core.Mat;  
 import org.opencv.core.MatOfRect;  
 import org.opencv.core.Scalar;  
 import org.opencv.core.Size;  
 import org.opencv.imgproc.Imgproc;  
 import org.opencv.objdetect.CascadeClassifier;  
 import org.opencv.core.Point;  
 import org.opencv.core.Rect;  
 import android.app.Activity;  
 import android.content.Context;  
 import android.graphics.Bitmap;  
 import android.graphics.BitmapFactory;  
 import android.os.Bundle;  
 import android.os.Environment;  
 import android.util.Log;  
 import android.view.Menu;  
 import android.view.MenuItem;  
 import android.view.SurfaceView;  
 import android.view.WindowManager;  
 public class _3DActivity extends Activity implements CvCameraViewListener2 {  
   private static final int         VIEW_MODE_CAMERA  = 0;  
   private static final int         VIEW_MODE_GRAY   = 1;  
   private static final int         VIEW_MODE_FACES  = 2;  
   private MenuItem             mItemPreviewRGBA;  
   private MenuItem             mItemPreviewGrey;  
   private MenuItem             mItemPreviewFaces;  
   private int               mViewMode;  
   private Mat               mRgba;  
   private Mat               mGrey;  
   private int                              screen_w, screen_h;  
   private CascadeClassifier           face_cascade;  
   private Tutorial3View            mOpenCvCameraView;   
   private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {  
     @Override  
     public void onManagerConnected(int status) {  
       switch (status) {  
         case LoaderCallbackInterface.SUCCESS:  
         {  
           // Load native library after(!) OpenCV initialization  
           mOpenCvCameraView.enableView();        
           load_cascade();  
         } break;  
         default:  
         {  
           super.onManagerConnected(status);  
         } break;  
       }  
     }  
   };  
   public _3DActivity() {  
   }  
   /** Called when the activity is first created. */  
   @Override  
   public void onCreate(Bundle savedInstanceState) {  
     super.onCreate(savedInstanceState);  
     getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);  
     setContentView(R.layout.tutorial2_surface_view);  
     mOpenCvCameraView = (Tutorial3View) findViewById(R.id.tutorial2_activity_surface_view);  
     mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);  
     mOpenCvCameraView.setCvCameraViewListener(this);  
   }  
   @Override  
   public void onPause()  
   {  
     super.onPause();  
     if (mOpenCvCameraView != null)  
       mOpenCvCameraView.disableView();  
   }  
   @Override  
   public void onResume()  
   {  
     super.onResume();  
     OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this, mLoaderCallback);  
   }  
   public void onDestroy() {  
     super.onDestroy();  
     if (mOpenCvCameraView != null)  
       mOpenCvCameraView.disableView();  
   }  
   android.hardware.Camera.Size set_resolution(){  
        List<android.hardware.Camera.Size> mResolutionList = mOpenCvCameraView.getResolutionList();  
     int lowest= mResolutionList.size()-1;  
     android.hardware.Camera.Size resolution = mResolutionList.get(lowest);  
     // This has worked fine with my phone, but not sure the resolutions are sorted  
     mOpenCvCameraView.setResolution(resolution);  
     return resolution;   
   }  
   public void onCameraViewStarted(int width, int height) {  
        android.hardware.Camera.Size r=set_resolution();   
        //Do we know if the two of them (view and camera) match  
        screen_w=r.width;  
        screen_h=r.height;  
     mRgba = new Mat(screen_w, screen_h, CvType.CV_8UC4);  
     mGrey = new Mat(screen_w, screen_h, CvType.CV_8UC1);  
     Log.v("MyActivity","Height: "+r.height+" Width: "+r.width);  
   }  
   public void onCameraViewStopped() {  
     mRgba.release();  
     mGrey.release();  
   }  
   public Mat onCameraFrame(CvCameraViewFrame inputFrame) {  
        long startTime = System.nanoTime();  
        long endTime;  
        boolean show=true;  
        MatOfRect faces = new MatOfRect();  
        mRgba=inputFrame.rgba();  
        if (mViewMode==VIEW_MODE_CAMERA) {  
             endTime = System.nanoTime();  
          if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mRgba;  
        }  
        Imgproc.cvtColor( mRgba, mGrey, Imgproc.COLOR_BGR2GRAY);   
        if (mViewMode==VIEW_MODE_GRAY){             
             endTime = System.nanoTime();  
          if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mGrey;  
        }  
        Imgproc.equalizeHist( mGrey, mGrey );   
           face_cascade.detectMultiScale(mGrey, faces);  
           if (show==true) Log.v("MyActivity","Detected "+faces.toArray().length+" faces");  
           for(Rect rect:faces.toArray())  
           {  
                Point center= new Point(rect.x + rect.width*0.5, rect.y + rect.height*0.5 );  
                Core.ellipse( mRgba, center, new Size( rect.width*0.5, rect.height*0.5), 0, 0, 360, new Scalar( 255, 0, 255 ), 4, 8, 0 );  
           }  
           if (mViewMode==VIEW_MODE_FACES) {  
                endTime = System.nanoTime();  
                if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mRgba;  
        }  
           return mRgba;  
    }  
   @Override  
   public boolean onCreateOptionsMenu(Menu menu) {  
     mItemPreviewRGBA = menu.add("RGBA");  
     mItemPreviewGrey = menu.add("Grey");  
     mItemPreviewFaces = menu.add("Faces");  
     return true;  
   }  
   public boolean onOptionsItemSelected(MenuItem item) {  
     if (item == mItemPreviewRGBA) {  
       mViewMode = VIEW_MODE_CAMERA;  
     } else if (item == mItemPreviewGrey) {  
       mViewMode = VIEW_MODE_GRAY;  
     } else if (item == mItemPreviewFaces) {  
       mViewMode = VIEW_MODE_FACES;  
     }  
     return true;  
   }    
   private void load_cascade(){  
        try {  
             // LOAD FROM ASSET  
             InputStream is = getResources().openRawResource(R.raw.lbpcascade_frontalface);  
             //InputStream is = getResources().openRawResource(R.raw.haarcascade_frontalface_alt);  
             File cascadeDir = getDir("cascade", Context.MODE_PRIVATE);  
             File mCascadeFile = new File(cascadeDir, "lbpcascade_frontalface.xml");  
             FileOutputStream os = new FileOutputStream(mCascadeFile);  
             byte[] buffer = new byte[4096];  
             int bytesRead;  
             while ((bytesRead = is.read(buffer)) != -1) {  
                  os.write(buffer, 0, bytesRead);  
             }  
             is.close();  
             os.close();  
             face_cascade = new CascadeClassifier(mCascadeFile.getAbsolutePath());  
             if(face_cascade.empty())  
             {  
                  Log.v("MyActivity","--(!)Error loading A\n");  
                  return;  
             }  
             else  
             {  
                  Log.v("MyActivity",  
                            "Loaded cascade classifier from " + mCascadeFile.getAbsolutePath());  
             }  
        } catch (IOException e) {  
             e.printStackTrace();  
             Log.v("MyActivity", "Failed to load cascade. Exception thrown: " + e);  
        }  
   }  
 }  

I've read somewhere that picking the lowest resolution this way is not guaranteed to work, as the list may not be in order. I have not looked further into it and it works for me, but just a warning...
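If you want to play it safe, a variant of set_resolution (untested sketch) could pick the smallest size explicitly by pixel count instead of trusting the list order:

 // Defensive variant: pick the smallest preview size by pixel count,
 // instead of assuming getSupportedPreviewSizes() is sorted.
 android.hardware.Camera.Size set_resolution(){
      List<android.hardware.Camera.Size> sizes = mOpenCvCameraView.getResolutionList();
      android.hardware.Camera.Size smallest = sizes.get(0);
      for (android.hardware.Camera.Size s : sizes)
           if (s.width * s.height < smallest.width * smallest.height)
                smallest = s;
      mOpenCvCameraView.setResolution(smallest);
      return smallest;
 }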

Using the second method, we basically remove the code that checks/sets the camera resolution and use Imgproc.resize to get a lower-resolution Mat to process instead of the full-resolution one. Then we put the results back on the original full-resolution Mat, to present it nicely:
 /*  
  * Working demo of face detection (remember to put the camera in horizontal)  
  * using OpenCV/CascadeClassifier.  
  */  
 package com.cell0907.td1;  
 import java.io.File;  
 import java.io.FileOutputStream;  
 import java.io.IOException;  
 import java.io.InputStream;  
 import java.util.Arrays;  
 import java.util.List;  
 import org.opencv.android.BaseLoaderCallback;  
 import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;  
 import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;  
 import org.opencv.android.LoaderCallbackInterface;  
 import org.opencv.android.OpenCVLoader;  
 import org.opencv.android.Utils;  
 import org.opencv.core.Core;  
 import org.opencv.core.CvType;  
 import org.opencv.core.Mat;  
 import org.opencv.core.MatOfRect;  
 import org.opencv.core.Scalar;  
 import org.opencv.core.Size;  
 import org.opencv.imgproc.Imgproc;  
 import org.opencv.objdetect.CascadeClassifier;  
 import org.opencv.core.Point;  
 import org.opencv.core.Rect;  
 import android.app.Activity;  
 import android.content.Context;  
 import android.graphics.Bitmap;  
 import android.graphics.BitmapFactory;  
 import android.os.Bundle;  
 import android.os.Environment;  
 import android.util.Log;  
 import android.view.Menu;  
 import android.view.MenuItem;  
 import android.view.SurfaceView;  
 import android.view.WindowManager;  
 public class _3DActivity extends Activity implements CvCameraViewListener2 {  
   private static final int         VIEW_MODE_CAMERA  = 0;  
   private static final int         VIEW_MODE_GRAY   = 1;  
   private static final int         VIEW_MODE_FACES  = 2;  
   private MenuItem             mItemPreviewRGBA;  
   private MenuItem             mItemPreviewGrey;  
   private MenuItem             mItemPreviewFaces;  
   private int               mViewMode;  
   private Mat               mRgba;  
   private Mat               mGrey;  
   private int                              screen_w, screen_h;  
   private CascadeClassifier           face_cascade;  
   private Tutorial3View            mOpenCvCameraView;   
   private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {  
     @Override  
     public void onManagerConnected(int status) {  
       switch (status) {  
         case LoaderCallbackInterface.SUCCESS:  
         {  
           // Load native library after(!) OpenCV initialization  
           mOpenCvCameraView.enableView();        
           load_cascade();  
         } break;  
         default:  
         {  
           super.onManagerConnected(status);  
         } break;  
       }  
     }  
   };  
   public _3DActivity() {  
   }  
   /** Called when the activity is first created. */  
   @Override  
   public void onCreate(Bundle savedInstanceState) {  
     super.onCreate(savedInstanceState);  
     getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);  
     setContentView(R.layout.tutorial2_surface_view);  
     mOpenCvCameraView = (Tutorial3View) findViewById(R.id.tutorial2_activity_surface_view);  
     mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);  
     mOpenCvCameraView.setCvCameraViewListener(this);  
   }  
   @Override  
   public void onPause()  
   {  
     super.onPause();  
     if (mOpenCvCameraView != null)  
       mOpenCvCameraView.disableView();  
   }  
   @Override  
   public void onResume()  
   {  
     super.onResume();  
     OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this, mLoaderCallback);  
   }  
   public void onDestroy() {  
     super.onDestroy();  
     if (mOpenCvCameraView != null)  
       mOpenCvCameraView.disableView();  
   }  
   public void onCameraViewStarted(int width, int height) {  
        screen_w=width;  
        screen_h=height;  
     mRgba = new Mat(screen_w, screen_h, CvType.CV_8UC4);  
     mGrey = new Mat(screen_w, screen_h, CvType.CV_8UC1);  
     Log.v("MyActivity","Height: "+height+" Width: "+width);  
   }  
   public void onCameraViewStopped() {  
     mRgba.release();  
     mGrey.release();  
   }  
   public Mat onCameraFrame(CvCameraViewFrame inputFrame) {  
        long startTime = System.nanoTime();  
        long endTime;  
        boolean show=true;  
        MatOfRect faces = new MatOfRect();  
        mRgba=inputFrame.rgba();  
        if (mViewMode==VIEW_MODE_CAMERA) {  
             endTime = System.nanoTime();  
          if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mRgba;  
        }  
        Imgproc.cvtColor( mRgba, mGrey, Imgproc.COLOR_BGR2GRAY);   
        if (mViewMode==VIEW_MODE_GRAY){             
             endTime = System.nanoTime();  
          if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mGrey;  
        }  
        Mat low_res = new Mat(screen_w, screen_h, CvType.CV_8UC1);  
        // 1280 x 720  
        Log.v("MyActivity","width: "+screen_w+" height: "+screen_h);  
        Imgproc.resize(mGrey,low_res,new Size(),0.25,0.25,Imgproc.INTER_LINEAR);  
        Imgproc.equalizeHist( low_res, low_res );   
           face_cascade.detectMultiScale(low_res, faces);  
           if (show==true) Log.v("MyActivity","Detected "+faces.toArray().length+" faces");  
           for(Rect rect:faces.toArray())  
           {  
                Point center= new Point(4*rect.x + 4*rect.width*0.5, 4*rect.y + 4*rect.height*0.5 );  
                Core.ellipse( mRgba, new Point(center.x,center.y), new Size( rect.width*2, rect.height*2), 0, 0, 360, new Scalar( 255, 0, 255 ), 4, 8, 0 );  
           }  
           if (mViewMode==VIEW_MODE_FACES) {  
                endTime = System.nanoTime();  
                if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mRgba;  
                //return low_res;  
        }  
           return mRgba;  
    }  
   @Override  
   public boolean onCreateOptionsMenu(Menu menu) {  
     mItemPreviewRGBA = menu.add("RGBA");  
     mItemPreviewGrey = menu.add("Grey");  
     mItemPreviewFaces = menu.add("Faces");  
     return true;  
   }  
   public boolean onOptionsItemSelected(MenuItem item) {  
     if (item == mItemPreviewRGBA) {  
       mViewMode = VIEW_MODE_CAMERA;  
     } else if (item == mItemPreviewGrey) {  
       mViewMode = VIEW_MODE_GRAY;  
     } else if (item == mItemPreviewFaces) {  
       mViewMode = VIEW_MODE_FACES;  
     }  
     return true;  
   }    
   private void load_cascade(){  
        try {  
             // LOAD FROM ASSET  
             InputStream is = getResources().openRawResource(R.raw.lbpcascade_frontalface);  
             //InputStream is = getResources().openRawResource(R.raw.haarcascade_frontalface_alt);  
             File cascadeDir = getDir("cascade", Context.MODE_PRIVATE);  
             File mCascadeFile = new File(cascadeDir, "lbpcascade_frontalface.xml");  
             FileOutputStream os = new FileOutputStream(mCascadeFile);  
             byte[] buffer = new byte[4096];  
             int bytesRead;  
             while ((bytesRead = is.read(buffer)) != -1) {  
                  os.write(buffer, 0, bytesRead);  
             }  
             is.close();  
             os.close();  
             face_cascade = new CascadeClassifier(mCascadeFile.getAbsolutePath());  
             if(face_cascade.empty())  
             {  
                  Log.v("MyActivity","--(!)Error loading A\n");  
                  return;  
             }  
             else  
             {  
                  Log.v("MyActivity",  
                            "Loaded cascade classifier from " + mCascadeFile.getAbsolutePath());  
             }  
        } catch (IOException e) {  
             e.printStackTrace();  
             Log.v("MyActivity", "Failed to load cascade. Exception thrown: " + e);  
        }  
   }  
 }  

For either of the two, the following files are the same. Tutorial3View.java:
 package com.cell0907.td1;  
 import java.io.FileOutputStream;  
 import java.util.List;  
 import org.opencv.android.JavaCameraView;  
 import android.content.Context;  
 import android.hardware.Camera;  
 import android.hardware.Camera.PictureCallback;  
 import android.hardware.Camera.Size;  
 import android.util.AttributeSet;  
 import android.util.Log;  
 public class Tutorial3View extends JavaCameraView implements PictureCallback {  
   private static final String TAG = "MyActivity";  
   private String mPictureFileName;  
   public Tutorial3View(Context context, AttributeSet attrs) {  
     super(context, attrs);  
   }  
   public List<String> getEffectList() {  
     return mCamera.getParameters().getSupportedColorEffects();  
   }  
   public boolean isEffectSupported() {  
     return (mCamera.getParameters().getColorEffect() != null);  
   }  
   public String getEffect() {  
     return mCamera.getParameters().getColorEffect();  
   }  
   public void setEffect(String effect) {  
     Camera.Parameters params = mCamera.getParameters();  
     params.setColorEffect(effect);  
     mCamera.setParameters(params);  
   }  
   public List<Size> getResolutionList() {  
     return mCamera.getParameters().getSupportedPreviewSizes();  
   }  
   public void setResolution(Size resolution) {  
     disconnectCamera();  
     mMaxHeight = resolution.height;  
     mMaxWidth = resolution.width;  
     connectCamera(getWidth(), getHeight());  
   }  
   public Size getResolution() {  
     return mCamera.getParameters().getPreviewSize();  
   }  
   public void takePicture(final String fileName) {  
     Log.i(TAG, "Taking picture");  
     this.mPictureFileName = fileName;  
     // Postview and jpeg are sent in the same buffers if the queue is not empty when performing a capture.  
     // Clear up buffers to avoid mCamera.takePicture to be stuck because of a memory issue  
     mCamera.setPreviewCallback(null);  
     // PictureCallback is implemented by the current class  
     mCamera.takePicture(null, null, this);  
   }  
   @Override  
   public void onPictureTaken(byte[] data, Camera camera) {  
     Log.i(TAG, "Saving a bitmap to file");  
     // The camera preview was automatically stopped. Start it again.  
     mCamera.startPreview();  
     mCamera.setPreviewCallback(this);  
     // Write the image in a file (in jpeg format)  
     try {  
       FileOutputStream fos = new FileOutputStream(mPictureFileName);  
       fos.write(data);  
       fos.close();  
     } catch (java.io.IOException e) {  
       Log.e("PictureDemo", "Exception in photoCallback", e);  
     }  
   }  
 }  

The layout, tutorial2_surface_view.xml:
 <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"  
   xmlns:tools="http://schemas.android.com/tools"  
   xmlns:opencv="http://schemas.android.com/apk/res-auto"  
   android:layout_width="match_parent"  
   android:layout_height="match_parent" >  
   <com.cell0907.td1.Tutorial3View  
     android:id="@+id/tutorial2_activity_surface_view"  
     android:layout_width="match_parent"  
     android:layout_height="match_parent"  
     opencv:camera_id="1"  
     opencv:show_fps="false" />  
 </LinearLayout>  

AndroidManifest.xml:
 <?xml version="1.0" encoding="utf-8"?>  
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"  
      package="com.cell0907.td1"  
      android:versionCode="21"  
      android:versionName="2.1">  
       <supports-screens android:resizeable="true"  
            android:smallScreens="true"  
            android:normalScreens="true"  
            android:largeScreens="true"  
            android:anyDensity="true" />  
   <uses-sdk android:minSdkVersion="8"   
               android:targetSdkVersion="10" />  
   <uses-permission android:name="android.permission.CAMERA"/>  
   <uses-feature android:name="android.hardware.camera" android:required="false"/>  
   <uses-feature android:name="android.hardware.camera.autofocus" android:required="false"/>  
   <uses-feature android:name="android.hardware.camera.front" android:required="false"/>  
   <uses-feature android:name="android.hardware.camera.front.autofocus" android:required="false"/>  
   <application  
     android:label="@string/app_name"  
     android:icon="@drawable/icon"  
     android:theme="@android:style/Theme.NoTitleBar.Fullscreen"  
     android:allowBackup="false">  
     <activity android:name="_3DActivity"  
          android:label="@string/app_name"  
          android:screenOrientation="landscape"  
          android:configChanges="keyboardHidden|orientation">  
       <intent-filter>  
         <action android:name="android.intent.action.MAIN" />  
         <category android:name="android.intent.category.LAUNCHER" />  
       </intent-filter>  
     </activity>  
   </application>  
 </manifest>  

Cheers!

PS1.: Click here to see the index of these series of posts on OpenCV
PS2.: Here is other stuff I found in various places and "randomly" tried after reading that it is not possible to get a file path to a resource; sure enough, it didn't work... But just for documentation's sake:

    Uri uri=Uri.parse("android.resource://com.example.td1/raw/lbpcascade_frontalface");
    File myFile = new File(uri.toString());
    face_cascade=new CascadeClassifier(myFile.getAbsolutePath());

Or this:                                       

    face_cascade=new CascadeClassifier("file:///android_asset/lbpcascade_frontalface.xml");
   
Or this:

    face_cascade=new CascadeClassifier("android.resource://com.example.td1/raw/lbpcascade_frontalface.xml");

Thursday, January 16, 2014

Android Camera capture without display/user interface preview

Here I am going to capture the scene with the Android camera without showing on the screen what the camera is seeing, displaying instead something completely unrelated to the captured image. This has several uses. In my case, I plan to do image processing on the pics but display something completely unrelated.

If you want to show what the camera is seeing, that is standard and you can find plenty of documentation out there. I provide some links all the way down.

Not showing it, which one would think should be straightforward, seems to be unsupported in Android. Some say that may be because of privacy concerns (you can tell the camera is on by seeing the preview on the display). I kind of doubt it because, as I'll show, it can actually be done easily. It is just not documented.

So far I have found two approaches, which suggest a third one that I still need to research:
  1. Directly, using the Android camera API. I found the solution here, which also pointed here (the same).
  2. Using OpenCV (I'll show that in a different post). 
  3. One would think that if OpenCV in Android can do it, there may be an Android API way using the same trick that OpenCV is using. Got to research that... (my untested guess is sketched right after this list).
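For the record, here is my guess for #3: newer Android versions (API 11+) let you point the preview at an off-screen SurfaceTexture, which never appears in the UI. This is an assumption, not something I have verified:

 // Untested guess: route the preview into an off-screen SurfaceTexture so
 // that nothing is drawn on screen. I suspect OpenCV's JavaCameraView does
 // something along these lines internally.
 camera = Camera.open(cameraId);
 SurfaceTexture dummy = new SurfaceTexture(10); // arbitrary texture name
 camera.setPreviewTexture(dummy);
 camera.startPreview(); // preview is running, but nowhere visible
 camera.takePicture(null, null, jpegCallback); // jpegCallback as in the code below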
So, anyhow, let's look at the first method, directly, using Android camera API:

The top level steps are:
  1. Check if there is a camera
  2. If there is, find the id of the one you want (front or back...). Check the findFrontFacingCamera routine in the code below.
  3. Open it (camera.open API call). See safeCameraOpen.
  4. Create a fake SurfaceView and set it for the camera. Usually this is used for the camera preview (again, see all the way below for the typical use), but here we just trick the camera, as we never actually display it. This, together with #5, is the key difference with respect to displaying the preview...
  5. When we want to take a picture, triggered in any way (somebody presses a button or touches the screen or, in my case, a timer), call camera.takePicture. The arguments of the method are callback functions that are invoked along the process of taking the picture. From the documentation:
Triggers an asynchronous image capture. The camera service will initiate a series of callbacks to the application as the image capture progresses. The shutter callback occurs after the image is captured. This can be used to trigger a sound to let the user know that image has been captured. The raw callback occurs when the raw image data is available (NOTE: the data will be null if there is no raw image callback buffer available or the raw image callback buffer is not large enough to hold the raw image). The postview callback occurs when a scaled, fully processed postview image is available (NOTE: not all hardware supports this). The jpeg callback occurs when the compressed image is available. If the application does not need a particular callback, a null can be passed instead of a callback method.
This method is only valid when preview is active (after startPreview()). Preview will be stopped after the image is taken; callers must call startPreview() again if they want to re-start preview or take more pictures. This should not be called between start() and stop().
After calling this method, you must not call startPreview() or take another picture until the JPEG callback has returned.
  6. The shutter callback is mostly used to trigger a shutter sound. The raw callback would be the ideal one to use in my case, for image processing. Nevertheless, many posts warn about its poor documentation and manufacturer-specific issues, so, for the time being, I go for the safe/next one, which uses JPEG encoding, unfortunately wasting precious processor bandwidth encoding/decoding the image and losing some quality (although I don't think the latter is so critical for me).
  7. Parallel to that, we have an ImageView that we use to display whatever we want, related or not to the camera.
  8. So, that would be it. Notice that in my code I trigger the capture with a timer done with an AsyncTask. After a certain time, the task sends a message that makes the handler call takePicture, setting the callback to get a JPEG from the camera. When done, it sends another message back and the handler presents that picture on the display, restarting the whole process. Notice that:
    1. I present the picture because I want to (just to show it's working), but I didn't have to. I can choose to present whatever I want in the ImageView. That was the whole point...
    2. In my case, I actually want to take pictures as fast as possible, so the timer is irrelevant. I could just delete it and call takePicture directly, but I leave it there to show how I would do it with a delay in the middle...
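Distilled to its essence (leaving out error handling and the timer), the hidden-preview trick in the code below comes down to this:

 // Core of the hidden-preview capture; see MainActivity below for the full
 // version with the timer loop and error handling.
 camera = Camera.open(cameraId);
 SurfaceView fake = new SurfaceView(this); // never added to the layout
 try {
      camera.setPreviewDisplay(fake.getHolder()); // camera believes it has a preview
 } catch (IOException e) { e.printStackTrace(); }
 camera.startPreview();
 camera.takePicture(null, null, new Camera.PictureCallback() {
      public void onPictureTaken(byte[] data, Camera camera) {
           Bitmap photo = BitmapFactory.decodeByteArray(data, 0, data.length);
           // ...do whatever with the Bitmap; the preview stops after a capture,
           // so call startPreview() again before the next takePicture().
           camera.startPreview();
      }
 });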
MainActivity.java
 package com.example.camera1;  
 import java.io.IOException;  
 import android.os.Bundle;  
 import android.os.Handler;  
 import android.os.Message;  
 import android.app.Activity;  
 import android.content.pm.PackageManager;  
 import android.graphics.Bitmap;  
 import android.graphics.BitmapFactory;  
 import android.hardware.Camera;  
 import android.hardware.Camera.CameraInfo;  
 import android.util.Log;  
 import android.view.SurfaceView;  
 import android.widget.ImageView;  
 import android.widget.Toast;  
 public class MainActivity extends Activity {  
      public static final int DONE=1;  
      public static final int NEXT=2;  
      public static final int PERIOD=1;   
      private Camera camera;  
      private int cameraId = 0;  
      private ImageView display;  
      private Timer timer;  
      @Override  
      public void onCreate(Bundle savedInstanceState) {  
           super.onCreate(savedInstanceState);  
        setContentView(R.layout.activity_main);  
        display=(ImageView)findViewById(R.id.imageView1);  
        // do we have a camera?  
        if (!getPackageManager()  
          .hasSystemFeature(PackageManager.FEATURE_CAMERA)) {  
         Toast.makeText(this, "No camera on this device", Toast.LENGTH_LONG)  
           .show();  
        } else {  
         cameraId = findFrontFacingCamera();  
         if (cameraId < 0) {  
          Toast.makeText(this, "No front facing camera found.",  
            Toast.LENGTH_LONG).show();  
         } else {  
              safeCameraOpen(cameraId);   
         }  
        }         
        // THIS IS JUST A FAKE SURFACE TO TRICK THE CAMERA PREVIEW  
        // http://stackoverflow.com/questions/17859777/how-to-take-pictures-in-android-  
        // application-without-the-user-interface  
        SurfaceView view = new SurfaceView(this);  
        try {  
                camera.setPreviewDisplay(view.getHolder());  
           } catch (IOException e) {  
                // TODO Auto-generated catch block  
                e.printStackTrace();  
           }  
        camera.startPreview();  
        Camera.Parameters params = camera.getParameters();  
        params.setJpegQuality(100);  
        camera.setParameters(params);  
        // We need something to trigger periodically the capture of a  
        // picture to be processed  
        timer=new Timer(getApplicationContext(),threadHandler);  
        timer.execute();  
        }  
      ////////////////////////////////////thread Handler///////////////////////////////////////  
      private Handler threadHandler = new Handler() {  
           public void handleMessage(android.os.Message msg) {       
                 switch(msg.what){  
                 case DONE:  
                     // Trigger camera callback to take pic  
                      camera.takePicture(null, null, mCall);  
                      break;  
                 case NEXT:  
                      timer=new Timer(getApplicationContext(),threadHandler);  
                      timer.execute();  
                      break;  
                 }  
                 }  
            };  
       Camera.PictureCallback mCall = new Camera.PictureCallback() {  
            public void onPictureTaken(byte[] data, Camera camera) {  
               //decode the data obtained by the camera into a Bitmap  
                  //display.setImageBitmap(photo);  
                  Bitmap bitmapPicture  
                  = BitmapFactory.decodeByteArray(data, 0, data.length);  
                  display.setImageBitmap(bitmapPicture);  
                  Message.obtain(threadHandler, MainActivity.NEXT, "").sendToTarget();   
                  //Log.v("MyActivity","Length: "+data.length);  
             }        
       };  
      private int findFrontFacingCamera() {  
           int cameraId = -1;  
           // Search for the front facing camera  
           int numberOfCameras = Camera.getNumberOfCameras();  
           for (int i = 0; i < numberOfCameras; i++) {  
                CameraInfo info = new CameraInfo();  
                Camera.getCameraInfo(i, info);  
                if (info.facing == CameraInfo.CAMERA_FACING_FRONT) {  
                     Log.v("MyActivity", "Camera found");  
               cameraId = i;  
               break;  
              }  
             }  
             return cameraId;  
            }  
      @Override  
      protected void onPause() {  
           if (timer!=null){  
                timer.cancel(true);  
           }  
        releaseCamera();  
        super.onPause();  
       }       
      // I think Android Documentation recommends doing this in a separate  
      // task to avoid blocking main UI  
      private boolean safeCameraOpen(int id) {  
        boolean qOpened = false;  
        try {  
          releaseCamera();  
          camera = Camera.open(id);  
          qOpened = (camera != null);  
        } catch (Exception e) {  
          Log.e(getString(R.string.app_name), "failed to open Camera");  
          e.printStackTrace();  
        }  
        return qOpened;    
      }  
      private void releaseCamera() {  
        if (camera != null) {  
             camera.stopPreview();  
          camera.release();  
          camera = null;  
        }  
      }  
 }  

Timer.java
 package com.example.camera1;  
 import android.content.Context;  
 import android.os.Handler;  
 import android.os.Message;  
 import android.os.AsyncTask;  
 public class Timer extends AsyncTask<Void, Void, Void> {  
   Context mContext;  
      private Handler threadHandler;  
   public Timer(Context context,Handler threadHandler) {  
     super();  
     this.threadHandler=threadHandler;  
     mContext = context;  
       }  
   @Override  
      protected Void doInBackground(Void...params) {   
        try {  
                Thread.sleep(MainActivity.PERIOD);  
           } catch (InterruptedException e) {  
                // TODO Auto-generated catch block  
                e.printStackTrace();  
           }   
        Message.obtain(threadHandler, MainActivity.DONE, "").sendToTarget();   
         return null;  
   }  
 }  

AndroidManifest.xml
 <?xml version="1.0" encoding="utf-8"?>  
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"  
   package="com.example.camera1"  
   android:versionCode="1"  
   android:versionName="1.0" >  
   <uses-sdk  
     android:minSdkVersion="9"  
     android:targetSdkVersion="17" />  
      <uses-permission android:name="android.permission.CAMERA"/>  
   <application  
     android:label="@string/app_name"  
     android:icon="@drawable/ic_launcher"  
     android:theme="@android:style/Theme.NoTitleBar.Fullscreen"  
     android:allowBackup="false">  
     <activity android:name="MainActivity"  
          android:label="@string/app_name"  
          android:screenOrientation="landscape"  
          android:configChanges="keyboardHidden|orientation">  
       <intent-filter>  
         <action android:name="android.intent.action.MAIN" />  
         <category android:name="android.intent.category.LAUNCHER" />  
       </intent-filter>  
     </activity>  
   </application>  
 </manifest>  

And the layout activity_main.xml:
 <RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"  
   android:layout_width="match_parent"  
   android:layout_height="wrap_content" >  
   <ImageView  
     android:id="@+id/imageView1"  
     android:layout_width="wrap_content"  
     android:layout_height="wrap_content"  
     android:layout_alignParentBottom="true"  
     android:layout_alignParentLeft="true"  
     android:layout_alignParentRight="true"  
     android:layout_alignParentTop="true"  
     android:src="@drawable/ic_launcher" />  
 </RelativeLayout>  

Cheers!

PS.: Check out this video on the Android 3D display I built based on this stuff, and the details here.
PS1.: Please, click here to see an index of other posts on Android.
PS2.: Other camera examples:
http://www.vogella.com/articles/AndroidCamera/article.html
http://android-er.blogspot.com.es/2010/12/implement-takepicture-function-of.html
http://android.codota.com/scenarios/518915fdda0a68487f6039c2/android.hardware.Camera?tag=out_2013_05_05_07_19_34&fullSource=1

Sunday, January 12, 2014

USB Pen / Flash Drive: "there is no disk in the drive"

So, I plug in the USB pen with all my work on it (a few days since the last backup... I know!) and when I try to access it, I get the pop-up window saying "there is no disk in the drive". The command prompt says something similar: "The device is not ready".

Straight to the solution: here. Easiest fix ever! Just give it a name (volume label): right-click on the drive - Properties.

And to think that I had already opened up the drive and was ready to start probing nodes!! LOL!!

PS1.: The first thing I did after getting access was to copy all the data to another drive. While doing that, I discovered that some files (a very small fraction) were damaged. No big deal, as I had backups and none of them were the new ones...
PS2.: Unfortunately, I went back to an old SD flash card I had discarded for similar reasons (I was getting an error saying "G:\ is not accessible. The file or directory is corrupted and unreadable") and this did not fix it... I can't actually rename it. Oh well...