Monday, January 20, 2014

Detecting faces in Android with OpenCV

Here we are going to detect faces in images captured by the camera of an Android phone. Note that Android has built-in support for this specific function (see my other post on that), but this example can be applied to anything else where we want to use CascadeClassifier. You can also see a PC example of this using Java/OpenCV.

A few things to highlight from this post:

1. We want to load the .xml cascade classifier from the raw resources, not from a file somewhere. As CascadeClassifier.load needs a file path and raw resources do not have one, the only way I found around this is to save the XML to a file (at the beginning of execution) and load it from there. Explained here. Other things I tried are documented at the end, but they didn't work...

2. Also, notice that, unlike with the standard Android camera, and as we have seen in another example, what you capture and what you display do not have to be the same. Basically, the onCameraFrame method is called automatically (a callback) when a frame from the camera is ready. This method returns the Mat that is to be presented on the display (automatically). So, inside the method we can take what the camera captured (an object of the CvCameraViewFrame class, representing a frame from the camera) and return whatever we want, related or not to the capture, which will then be displayed on the screen. Note: if you just want to "cut and paste" something that gets your Android camera going and capturing for processing in OpenCV, you can use the explanation here.

3. Finally, notice how we choose a smaller capture size (setting the camera to a lower resolution) in order to speed up the analysis. At full resolution the processing was taking 750 ms on my HTC One! Obviously, we can't work with that... The issue in this case is that we display that same image, so the display quality is not good. We could, of course, choose to capture at full resolution, scale down, detect the faces, highlight them in the original capture, and display it (at full resolution). The first code below uses the first method, while the second code uses the second one; a condensed sketch of the second approach follows this list.
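For orientation before the full listings, here is a condensed sketch of the second approach (my own summary, not verbatim from the code below), assuming mGrey, mRgba and face_cascade are already set up as in the listings:

 // Detect on a quarter-scale copy, then map the results back to the
 // full-resolution frame (scale factor 1/0.25 = 4)
 Mat low_res = new Mat();
 Imgproc.resize(mGrey, low_res, new Size(), 0.25, 0.25, Imgproc.INTER_LINEAR);
 MatOfRect faces = new MatOfRect();
 face_cascade.detectMultiScale(low_res, faces);
 for (Rect rect : faces.toArray()) {
      Point center = new Point(4*(rect.x + rect.width*0.5), 4*(rect.y + rect.height*0.5));
      Core.ellipse(mRgba, center, new Size(rect.width*2, rect.height*2),
                0, 0, 360, new Scalar(255, 0, 255), 4, 8, 0);
 }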

_3DActivity.java
 /*  
  * Working demo of face detection (remember to put the camera in horizontal)  
  * using OpenCV/CascadeClassifier. Capture at lower resolution  
  */  
 package com.cell0907.td1;  
 import java.io.File;  
 import java.io.FileOutputStream;  
 import java.io.IOException;  
 import java.io.InputStream;  
 import java.util.List;  
 import org.opencv.android.BaseLoaderCallback;  
 import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;  
 import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;  
 import org.opencv.android.LoaderCallbackInterface;  
 import org.opencv.android.OpenCVLoader;  
 import org.opencv.core.Core;  
 import org.opencv.core.CvType;  
 import org.opencv.core.Mat;  
 import org.opencv.core.MatOfRect;  
 import org.opencv.core.Scalar;  
 import org.opencv.core.Size;  
 import org.opencv.imgproc.Imgproc;  
 import org.opencv.objdetect.CascadeClassifier;  
 import org.opencv.core.Point;  
 import org.opencv.core.Rect;  
 import android.app.Activity;  
 import android.content.Context;  
 import android.os.Bundle;  
 import android.util.Log;  
 import android.view.Menu;  
 import android.view.MenuItem;  
 import android.view.SurfaceView;  
 import android.view.WindowManager;  
 public class _3DActivity extends Activity implements CvCameraViewListener2 {  
   private static final int         VIEW_MODE_CAMERA  = 0;  
   private static final int         VIEW_MODE_GRAY   = 1;  
   private static final int         VIEW_MODE_FACES  = 2;  
   private MenuItem             mItemPreviewRGBA;  
   private MenuItem             mItemPreviewGrey;  
   private MenuItem             mItemPreviewFaces;  
   private int               mViewMode;  
   private Mat               mRgba;  
   private Mat               mGrey;  
   private int                              screen_w, screen_h;  
   private CascadeClassifier           face_cascade;  
   private Tutorial3View            mOpenCvCameraView;   
   private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {  
     @Override  
     public void onManagerConnected(int status) {  
       switch (status) {  
         case LoaderCallbackInterface.SUCCESS:  
         {  
           // Load native library after(!) OpenCV initialization  
           mOpenCvCameraView.enableView();        
           load_cascade();  
         } break;  
         default:  
         {  
           super.onManagerConnected(status);  
         } break;  
       }  
     }  
   };  
   public _3DActivity() {  
   }  
   /** Called when the activity is first created. */  
   @Override  
   public void onCreate(Bundle savedInstanceState) {  
     super.onCreate(savedInstanceState);  
     getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);  
     setContentView(R.layout.tutorial2_surface_view);  
     mOpenCvCameraView = (Tutorial3View) findViewById(R.id.tutorial2_activity_surface_view);  
     mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);  
     mOpenCvCameraView.setCvCameraViewListener(this);  
   }  
   @Override  
   public void onPause()  
   {  
     super.onPause();  
     if (mOpenCvCameraView != null)  
       mOpenCvCameraView.disableView();  
   }  
   @Override  
   public void onResume()  
   {  
     super.onResume();  
     OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this, mLoaderCallback);  
   }  
   public void onDestroy() {  
     super.onDestroy();  
     if (mOpenCvCameraView != null)  
       mOpenCvCameraView.disableView();  
   }  
   android.hardware.Camera.Size set_resolution(){  
        List<android.hardware.Camera.Size> mResolutionList = mOpenCvCameraView.getResolutionList();  
     int lowest= mResolutionList.size()-1;  
     android.hardware.Camera.Size resolution = mResolutionList.get(lowest);  
     // This has worked fine with my phone, but not sure the resolutions are sorted  
     mOpenCvCameraView.setResolution(resolution);  
     return resolution;   
   }  
   public void onCameraViewStarted(int width, int height) {  
        android.hardware.Camera.Size r=set_resolution();   
        //Do we know if the two of them (view and camera) match  
        screen_w=r.width;  
        screen_h=r.height;  
    mRgba = new Mat(screen_h, screen_w, CvType.CV_8UC4); // Mat takes (rows, cols), i.e. (height, width)
    mGrey = new Mat(screen_h, screen_w, CvType.CV_8UC1);
     Log.v("MyActivity","Height: "+r.height+" Width: "+r.width);  
   }  
   public void onCameraViewStopped() {  
     mRgba.release();  
     mGrey.release();  
   }  
   public Mat onCameraFrame(CvCameraViewFrame inputFrame) {  
        long startTime = System.nanoTime();  
        long endTime;  
        boolean show=true;  
        MatOfRect faces = new MatOfRect();  
        mRgba=inputFrame.rgba();  
        if (mViewMode==VIEW_MODE_CAMERA) {  
             endTime = System.nanoTime();  
          if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mRgba;  
        }  
       Imgproc.cvtColor( mRgba, mGrey, Imgproc.COLOR_RGBA2GRAY); // the frame comes in as RGBA
        if (mViewMode==VIEW_MODE_GRAY){             
             endTime = System.nanoTime();  
          if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mGrey;  
        }  
        Imgproc.equalizeHist( mGrey, mGrey );   
           face_cascade.detectMultiScale(mGrey, faces);  
           if (show==true) Log.v("MyActivity","Detected "+faces.toArray().length+" faces");  
           for(Rect rect:faces.toArray())  
           {  
                Point center= new Point(rect.x + rect.width*0.5, rect.y + rect.height*0.5 );  
                Core.ellipse( mRgba, center, new Size( rect.width*0.5, rect.height*0.5), 0, 0, 360, new Scalar( 255, 0, 255 ), 4, 8, 0 );  
           }  
           if (mViewMode==VIEW_MODE_FACES) {  
                endTime = System.nanoTime();  
                if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mRgba;  
        }  
           return mRgba;  
    }  
   @Override  
   public boolean onCreateOptionsMenu(Menu menu) {  
     mItemPreviewRGBA = menu.add("RGBA");  
     mItemPreviewGrey = menu.add("Grey");  
     mItemPreviewFaces = menu.add("Faces");  
     return true;  
   }  
   public boolean onOptionsItemSelected(MenuItem item) {  
     if (item == mItemPreviewRGBA) {  
       mViewMode = VIEW_MODE_CAMERA;  
     } else if (item == mItemPreviewGrey) {  
       mViewMode = VIEW_MODE_GRAY;  
     } else if (item == mItemPreviewFaces) {  
       mViewMode = VIEW_MODE_FACES;  
     }  
     return true;  
   }    
   private void load_cascade(){  
        try {  
             // LOAD FROM ASSET  
             InputStream is = getResources().openRawResource(R.raw.lbpcascade_frontalface);  
             //InputStream is = getResources().openRawResource(R.raw.haarcascade_frontalface_alt);  
             File cascadeDir = getDir("cascade", Context.MODE_PRIVATE);  
             File mCascadeFile = new File(cascadeDir, "lbpcascade_frontalface.xml");  
             FileOutputStream os = new FileOutputStream(mCascadeFile);  
             byte[] buffer = new byte[4096];  
             int bytesRead;  
             while ((bytesRead = is.read(buffer)) != -1) {  
                  os.write(buffer, 0, bytesRead);  
             }  
             is.close();  
             os.close();  
             face_cascade = new CascadeClassifier(mCascadeFile.getAbsolutePath());  
             if(face_cascade.empty())  
             {  
                  Log.v("MyActivity","--(!)Error loading A\n");  
                  return;  
             }  
             else  
             {  
                  Log.v("MyActivity",  
                            "Loaded cascade classifier from " + mCascadeFile.getAbsolutePath());  
             }  
        } catch (IOException e) {  
             e.printStackTrace();  
             Log.v("MyActivity", "Failed to load cascade. Exception thrown: " + e);  
        }  
   }  
 }  

I've read somewhere that picking the lowest resolution this way is not guaranteed to work, as the list may not be in order. I haven't dug further into this and it works for me, but consider yourself warned...
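If you want to play it safe, here is a small defensive sketch (my addition, not in the original code) that picks the smallest preview size by comparing areas, using the getResolutionList/setResolution helpers from Tutorial3View below:

 // Scan the supported preview sizes and keep the one with the smallest
 // area, instead of trusting the order of the list
 android.hardware.Camera.Size smallest = null;
 for (android.hardware.Camera.Size s : mOpenCvCameraView.getResolutionList()) {
      if (smallest == null || s.width * s.height < smallest.width * smallest.height)
           smallest = s;
 }
 mOpenCvCameraView.setResolution(smallest);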

Using the second method, we basically remove the code that checks/sets the camera resolution and use Imgproc.resize to get a lower-resolution Mat to process, instead of the full-resolution one. Then we put the results back in the original, full-resolution Mat, to present it nicely:
 /*  
  * Working demo of face detection (remember to put the camera in horizontal)  
  * using OpenCV/CascadeClassifier.  
  */  
 package com.cell0907.td1;  
 import java.io.File;  
 import java.io.FileOutputStream;  
 import java.io.IOException;  
 import java.io.InputStream;  
 import org.opencv.android.BaseLoaderCallback;  
 import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;  
 import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;  
 import org.opencv.android.LoaderCallbackInterface;  
 import org.opencv.android.OpenCVLoader;  
 import org.opencv.core.Core;  
 import org.opencv.core.CvType;  
 import org.opencv.core.Mat;  
 import org.opencv.core.MatOfRect;  
 import org.opencv.core.Scalar;  
 import org.opencv.core.Size;  
 import org.opencv.imgproc.Imgproc;  
 import org.opencv.objdetect.CascadeClassifier;  
 import org.opencv.core.Point;  
 import org.opencv.core.Rect;  
 import android.app.Activity;  
 import android.content.Context;  
 import android.os.Bundle;  
 import android.util.Log;  
 import android.view.Menu;  
 import android.view.MenuItem;  
 import android.view.SurfaceView;  
 import android.view.WindowManager;  
 public class _3DActivity extends Activity implements CvCameraViewListener2 {  
   private static final int         VIEW_MODE_CAMERA  = 0;  
   private static final int         VIEW_MODE_GRAY   = 1;  
   private static final int         VIEW_MODE_FACES  = 2;  
   private MenuItem             mItemPreviewRGBA;  
   private MenuItem             mItemPreviewGrey;  
   private MenuItem             mItemPreviewFaces;  
   private int               mViewMode;  
   private Mat               mRgba;  
   private Mat               mGrey;  
   private int                              screen_w, screen_h;  
   private CascadeClassifier           face_cascade;  
   private Tutorial3View            mOpenCvCameraView;   
   private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {  
     @Override  
     public void onManagerConnected(int status) {  
       switch (status) {  
         case LoaderCallbackInterface.SUCCESS:  
         {  
           // Load native library after(!) OpenCV initialization  
           mOpenCvCameraView.enableView();        
           load_cascade();  
         } break;  
         default:  
         {  
           super.onManagerConnected(status);  
         } break;  
       }  
     }  
   };  
   public _3DActivity() {  
   }  
   /** Called when the activity is first created. */  
   @Override  
   public void onCreate(Bundle savedInstanceState) {  
     super.onCreate(savedInstanceState);  
     getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);  
     setContentView(R.layout.tutorial2_surface_view);  
     mOpenCvCameraView = (Tutorial3View) findViewById(R.id.tutorial2_activity_surface_view);  
     mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);  
     mOpenCvCameraView.setCvCameraViewListener(this);  
   }  
   @Override  
   public void onPause()  
   {  
     super.onPause();  
     if (mOpenCvCameraView != null)  
       mOpenCvCameraView.disableView();  
   }  
   @Override  
   public void onResume()  
   {  
     super.onResume();  
     OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this, mLoaderCallback);  
   }  
   public void onDestroy() {  
     super.onDestroy();  
     if (mOpenCvCameraView != null)  
       mOpenCvCameraView.disableView();  
   }  
   public void onCameraViewStarted(int width, int height) {  
        screen_w=width;  
        screen_h=height;  
    mRgba = new Mat(screen_h, screen_w, CvType.CV_8UC4); // Mat takes (rows, cols), i.e. (height, width)
    mGrey = new Mat(screen_h, screen_w, CvType.CV_8UC1);
     Log.v("MyActivity","Height: "+height+" Width: "+width);  
   }  
   public void onCameraViewStopped() {  
     mRgba.release();  
     mGrey.release();  
   }  
   public Mat onCameraFrame(CvCameraViewFrame inputFrame) {  
        long startTime = System.nanoTime();  
        long endTime;  
        boolean show=true;  
        MatOfRect faces = new MatOfRect();  
        mRgba=inputFrame.rgba();  
        if (mViewMode==VIEW_MODE_CAMERA) {  
             endTime = System.nanoTime();  
          if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mRgba;  
        }  
       Imgproc.cvtColor( mRgba, mGrey, Imgproc.COLOR_RGBA2GRAY); // the frame comes in as RGBA
        if (mViewMode==VIEW_MODE_GRAY){             
             endTime = System.nanoTime();  
          if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mGrey;  
        }  
       Mat low_res = new Mat(); // Imgproc.resize below allocates it at the scaled size
        // 1280 x 720  
        Log.v("MyActivity","width: "+screen_w+" height: "+screen_h);  
        Imgproc.resize(mGrey,low_res,new Size(),0.25,0.25,Imgproc.INTER_LINEAR);  
        Imgproc.equalizeHist( low_res, low_res );   
           face_cascade.detectMultiScale(low_res, faces);  
           if (show==true) Log.v("MyActivity","Detected "+faces.toArray().length+" faces");  
           // The rectangles are in quarter-scale coordinates, so scale the centers
           // by 4 (=1/0.25); the full-scale half-axes are rect.width*0.5*4 = rect.width*2
           for(Rect rect:faces.toArray())
           {
                Point center= new Point(4*(rect.x + rect.width*0.5), 4*(rect.y + rect.height*0.5));
                Core.ellipse( mRgba, center, new Size( rect.width*2, rect.height*2), 0, 0, 360, new Scalar( 255, 0, 255 ), 4, 8, 0 );
           }  
           if (mViewMode==VIEW_MODE_FACES) {  
                endTime = System.nanoTime();  
                if (show==true) Log.v("MyActivity","Elapsed time: "+ (float)(endTime - startTime)/1000000+"ms");  
             return mRgba;  
                //return low_res;  
        }  
           return mRgba;  
    }  
   @Override  
   public boolean onCreateOptionsMenu(Menu menu) {  
     mItemPreviewRGBA = menu.add("RGBA");  
     mItemPreviewGrey = menu.add("Grey");  
     mItemPreviewFaces = menu.add("Faces");  
     return true;  
   }  
   public boolean onOptionsItemSelected(MenuItem item) {  
     if (item == mItemPreviewRGBA) {  
       mViewMode = VIEW_MODE_CAMERA;  
     } else if (item == mItemPreviewGrey) {  
       mViewMode = VIEW_MODE_GRAY;  
     } else if (item == mItemPreviewFaces) {  
       mViewMode = VIEW_MODE_FACES;  
     }  
     return true;  
   }    
   private void load_cascade(){  
        try {  
             // LOAD FROM ASSET  
             InputStream is = getResources().openRawResource(R.raw.lbpcascade_frontalface);  
             //InputStream is = getResources().openRawResource(R.raw.haarcascade_frontalface_alt);  
             File cascadeDir = getDir("cascade", Context.MODE_PRIVATE);  
             File mCascadeFile = new File(cascadeDir, "lbpcascade_frontalface.xml");  
             FileOutputStream os = new FileOutputStream(mCascadeFile);  
             byte[] buffer = new byte[4096];  
             int bytesRead;  
             while ((bytesRead = is.read(buffer)) != -1) {  
                  os.write(buffer, 0, bytesRead);  
             }  
             is.close();  
             os.close();  
             face_cascade = new CascadeClassifier(mCascadeFile.getAbsolutePath());  
             if(face_cascade.empty())  
             {  
                  Log.v("MyActivity","--(!)Error loading A\n");  
                  return;  
             }  
             else  
             {  
                  Log.v("MyActivity",  
                            "Loaded cascade classifier from " + mCascadeFile.getAbsolutePath());  
             }  
        } catch (IOException e) {  
             e.printStackTrace();  
             Log.v("MyActivity", "Failed to load cascade. Exception thrown: " + e);  
        }  
   }  
 }  

For either of the two, the following files are the same. Tutorial3View.java:
 package com.cell0907.td1;  
 import java.io.FileOutputStream;  
 import java.util.List;  
 import org.opencv.android.JavaCameraView;  
 import android.content.Context;  
 import android.hardware.Camera;  
 import android.hardware.Camera.PictureCallback;  
 import android.hardware.Camera.Size;  
 import android.util.AttributeSet;  
 import android.util.Log;  
 public class Tutorial3View extends JavaCameraView implements PictureCallback {  
   private static final String TAG = "MyActivity";  
   private String mPictureFileName;  
   public Tutorial3View(Context context, AttributeSet attrs) {  
     super(context, attrs);  
   }  
   public List<String> getEffectList() {  
     return mCamera.getParameters().getSupportedColorEffects();  
   }  
   public boolean isEffectSupported() {  
     return (mCamera.getParameters().getColorEffect() != null);  
   }  
   public String getEffect() {  
     return mCamera.getParameters().getColorEffect();  
   }  
   public void setEffect(String effect) {  
     Camera.Parameters params = mCamera.getParameters();  
     params.setColorEffect(effect);  
     mCamera.setParameters(params);  
   }  
   public List<Size> getResolutionList() {  
     return mCamera.getParameters().getSupportedPreviewSizes();  
   }  
   public void setResolution(Size resolution) {  
     disconnectCamera();  
     mMaxHeight = resolution.height;  
     mMaxWidth = resolution.width;  
     connectCamera(getWidth(), getHeight());  
   }  
   public Size getResolution() {  
     return mCamera.getParameters().getPreviewSize();  
   }  
   public void takePicture(final String fileName) {  
     Log.i(TAG, "Taking picture");  
     this.mPictureFileName = fileName;  
     // Postview and jpeg are sent in the same buffers if the queue is not empty when performing a capture.  
     // Clear up buffers to avoid mCamera.takePicture to be stuck because of a memory issue  
     mCamera.setPreviewCallback(null);  
     // PictureCallback is implemented by the current class  
     mCamera.takePicture(null, null, this);  
   }  
   @Override  
   public void onPictureTaken(byte[] data, Camera camera) {  
     Log.i(TAG, "Saving a bitmap to file");  
     // The camera preview was automatically stopped. Start it again.  
     mCamera.startPreview();  
     mCamera.setPreviewCallback(this);  
     // Write the image in a file (in jpeg format)  
     try {  
       FileOutputStream fos = new FileOutputStream(mPictureFileName);  
       fos.write(data);  
       fos.close();  
     } catch (java.io.IOException e) {  
       Log.e("PictureDemo", "Exception in photoCallback", e);  
     }  
   }  
 }  

The layout, tutorial2_surface_view.xml:
 <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"  
   xmlns:tools="http://schemas.android.com/tools"  
   xmlns:opencv="http://schemas.android.com/apk/res-auto"  
   android:layout_width="match_parent"  
   android:layout_height="match_parent" >  
   <com.cell0907.td1.Tutorial3View  
     android:id="@+id/tutorial2_activity_surface_view"  
     android:layout_width="match_parent"  
     android:layout_height="match_parent"  
     opencv:camera_id="1"  
     opencv:show_fps="false" />  
 </LinearLayout>  

AndroidManifest.xml:
 <?xml version="1.0" encoding="utf-8"?>  
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"  
      package="com.cell0907.td1"  
      android:versionCode="21"  
      android:versionName="2.1">  
       <supports-screens android:resizeable="true"  
            android:smallScreens="true"  
            android:normalScreens="true"  
            android:largeScreens="true"  
            android:anyDensity="true" />  
   <uses-sdk android:minSdkVersion="8"   
               android:targetSdkVersion="10" />  
   <uses-permission android:name="android.permission.CAMERA"/>  
   <uses-feature android:name="android.hardware.camera" android:required="false"/>  
   <uses-feature android:name="android.hardware.camera.autofocus" android:required="false"/>  
   <uses-feature android:name="android.hardware.camera.front" android:required="false"/>  
   <uses-feature android:name="android.hardware.camera.front.autofocus" android:required="false"/>  
   <application  
     android:label="@string/app_name"  
     android:icon="@drawable/icon"  
     android:theme="@android:style/Theme.NoTitleBar.Fullscreen"  
     android:allowBackup="false">  
     <activity android:name="_3DActivity"  
          android:label="@string/app_name"  
          android:screenOrientation="landscape"  
          android:configChanges="keyboardHidden|orientation">  
       <intent-filter>  
         <action android:name="android.intent.action.MAIN" />  
         <category android:name="android.intent.category.LAUNCHER" />  
       </intent-filter>  
     </activity>  
   </application>  
 </manifest>  

Cheers!

PS1.: Click here to see the index of this series of posts on OpenCV.
PS2.: Here is other stuff I found in various places and "randomly" tried, after reading that it is not possible to get a file path to a resource. Sure enough, it didn't work, but just for documentation's sake:

    Uri uri=Uri.parse("android.resource://com.example.td1/raw/lbpcascade_frontalface");
    File myFile = new File(uri.toString());
    face_cascade=new CascadeClassifier(myFile.getAbsolutePath());

Or this:                                       

    face_cascade=new CascadeClassifier("file:///android_asset/lbpcascade_frontalface.xml");
   
Or this:

    face_cascade=new CascadeClassifier("android.resource://com.example.td1/raw/lbpcascade_frontalface.xml");
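What does work, for reference, is the same copy-to-file trick used in load_cascade above, and it also applies if you prefer to keep the XML in assets/ instead of res/raw. A rough sketch, assuming lbpcascade_frontalface.xml sits in the assets folder:

 try {
      // Copy the classifier out of assets/ into app-private storage,
      // then load it from its real file path
      InputStream is = getAssets().open("lbpcascade_frontalface.xml");
      File cascadeFile = new File(getDir("cascade", Context.MODE_PRIVATE),
                "lbpcascade_frontalface.xml");
      FileOutputStream os = new FileOutputStream(cascadeFile);
      byte[] buffer = new byte[4096];
      int bytesRead;
      while ((bytesRead = is.read(buffer)) != -1)
           os.write(buffer, 0, bytesRead);
      is.close();
      os.close();
      face_cascade = new CascadeClassifier(cascadeFile.getAbsolutePath());
 } catch (IOException e) {
      Log.v("MyActivity", "Failed to load cascade from assets: " + e);
 }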

5 comments:

  1. Hello,
    Nice job!
    I'm developing an app for facial recognition.
    Do you know if it's possible to do it without displaying the surface view? (in a service)

    1. Hi hi... Sorry for the late answer... I seldom monitor/work on these pages any more... I don't really have the full answer, but you can read my other post:
      http://cell0907.blogspot.com/2014/01/android-camera-capture-without.html
      I guess things have changed since I wrote that, so there may be a better way to do it now.

      Cheers!

  2. I have a problem in the line...
    InputStream is = getResources().openRawResource(R.raw.lbpcascade_frontalface);
    What is "raw"? And how can I solve it?

    1. I tried to use this:
      InputStream is = getResources().openRawResource(android.R.raw.lbpcascade_frontalface);
      But I still have the problem. Maybe my project doesn't have lbpcascade_frontalface.xml.

    2. raw is actually a resource folder which sometimes gets created automatically, like layout, values, etc.
      That error may be because you don't have that folder; create it and copy the required file into it.
