2025-02-14 Update, from SLTechnology News & Howtos (Shulou), Internet Technology
Shulou (Shulou.com) 06/01 report
This article explains in detail the whole process of implementing witness (face-to-ID) comparison with a face recognition SDK. I hope you will come away with a solid understanding of the topic after reading it.
Witness comparison is everywhere in today's society: high-speed rail stations, airports, hotel check-in, even the entrances of scenic spots run all kinds of witness applications, and face recognition SDKs have sprung up like bamboo shoots after a spring rain, from Baidu, SenseTime, Face++, ArcSoft, and others. After trying various SDKs, my favorite is ArcSoft's, and one of the most direct reasons is ArcSoft's promise that it stays free forever. I have been using it since version 2.0, and the real-world results are genuinely good. Just last month I heard that ArcFace 3.0 had been released; never one to pass up something free, I was not going to miss this update. After getting started with 3.0, I found it brings the following new features:
Feature comparison now supports model selection, with both a life-photo comparison model and a witness comparison model.
A 64-bit SDK has been added for the Android platform.
A new way to input image data has been added.
I. Gains and losses of the ArcFace 3.0 SDK interface changes
Advantages of the interface changes:
1. Greater freedom in business logic
Take witness 2.0 as an example: we could only feed data in and get results out, while intermediate products such as facial features were out of reach. With ArcFace 3.0 the fixed pipeline is gone, and detection, feature extraction, and comparison are each under our own control.
2. Life-photo and witness comparison can now live in the same project
The witness SDK and the ArcFace SDK used to conflict and could not be used together. If we wanted both witness and life-photo comparison, we had to write two projects whose flows differed in places. Now we only need to select the model in the interface to switch modes, so witness and life-photo functionality can be integrated in a single project.
3. Code reusability
In ArcFace 3.0 the only difference between witness comparison and life-photo comparison is the model selected in the compare interface; everything else is identical. Most code can therefore be reused, which greatly improves development efficiency.
Disadvantages of the interface changes:
1. Every interface changes
Gains come with losses. Because ArcFace 3.0 no longer encapsulates the witness workflow, every interface call has to be changed during the upgrade, something no programmer is happy to see.
2. Implementation gets harder
For the same reason, some of the flows and callbacks that used to ship inside the API now have to be implemented by hand, which is not very friendly to beginners.
Summary:
Despite the shortcomings listed above, I still support this upgrade. Every product overhaul brings some disruption, but against that, the unified interface and recognition pipeline improve the program's applicability and business freedom. Compared with witness 2.0, I believe this "strong man cutting off his own wrist" move will prove worth it in the long run.
II. Integrating the ArcFace 3.0 SDK into the witness 2.0 demo
We saw above that, because of the interface changes, every interface call in the witness 2.0 program has to be modified. Next, taking the witness 2.0 demo as the example, I will walk through how I upgraded it with the ArcFace 3.0 SDK.
1. Witness 2.0 Demo project configuration
Considering that some readers may not be familiar with the 2.0 demo, here is a brief introduction to configuring and using the official demo.
First put the witness engine into the demo as shown in the figure, then fill in APP_ID and SDK_KEY in the Constants class; both values, like the witness engine itself, are obtained from the open platform on the official website. Then place a picture named "sample.jpg" in the device's SD card root directory as the simulated ID photo input (the image path can be changed via the SAMPLE_FACE variable in MainActivity). The picture below is a screenshot of a run after configuration.
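As a minimal sketch of that configuration step, the Constants class might look like the following; the field names follow the demo, but the values here are placeholders you must replace with the credentials issued for your own account:

```java
public class Constants {
    // Placeholder credentials: replace with the APP_ID and SDK_KEY
    // issued for your account on the open platform.
    public static final String APP_ID = "your_app_id_here";
    public static final String SDK_KEY = "your_sdk_key_here";
}
```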
2. ArcFace 3.0 SDK replacement
First obtain the ArcFace 3.0 SDK, which is also available on the open platform. Replace the old SDK libraries with the new ones; the project directory after replacement is shown in the figure below.
3. ArcFace 3.0 interface replacement
As mentioned above, all interfaces changed in the 3.0 overhaul, so we have to replace every 2.0 interface call with its 3.0 counterpart.
3.1 Engine activation:
The activation interface parameters are unchanged.
Witness 2.0:
```java
IdCardVerifyManager.getInstance().active(Context context, String appId, String sdkKey)
```
ArcFace 3.0:
```java
FaceEngine.active(Context context, String appId, String sdkKey)
```

3.2 Engine initialization:
Initialization differs considerably between witness 2.0 and ArcFace 3.0: witness 2.0 listened for ID card data and camera data, while 3.0 drops this listener mechanism. I will not go through the parameters one by one here; the official documentation describes them in detail.
Witness 2.0:
```java
IdCardVerifyManager.getInstance().init(Context context, IdCardVerifyListener listener)
```
ArcFace 3.0:
```java
FaceEngine.init(Context context, DetectMode detectMode,
                DetectFaceOrientPriority detectFaceOrientPriority,
                int detectFaceScaleVal, int detectFaceMaxNum, int combinedMask)
```

3.3 Activation and initialization demo:
Below is my code before and after the replacement, for reference:
Witness 2.0:
```java
private void initEngine() {
    int result = IdCardVerifyManager.getInstance().init(this, idCardVerifyListener);
    LogUtils.dTag(TAG, "initResult: " + result);
    if (result == IdCardVerifyError.MERR_ASF_NOT_ACTIVATED) {
        Executors.newSingleThreadExecutor().execute(new Runnable() {
            @Override
            public void run() {
                int activeResult = IdCardVerifyManager.getInstance()
                        .active(MainActivity.this, APP_ID, SDK_KEY);
                runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        LogUtils.dTag(TAG, "activeResult: " + activeResult);
                        if (activeResult == IdCardVerifyError.OK) {
                            int initResult = IdCardVerifyManager.getInstance()
                                    .init(MainActivity.this, idCardVerifyListener);
                            LogUtils.dTag(TAG, "initResult: " + initResult);
                            if (initResult != IdCardVerifyError.OK) {
                                toast("witness engine initialization failed, error code: " + initResult);
                            }
                        } else {
                            toast("witness engine activation failed, error code: " + activeResult);
                        }
                    }
                });
            }
        });
    } else if (result != IdCardVerifyError.OK) {
        toast("witness engine initialization failed, error code: " + result);
    }
}
```
ArcFace 3.0:
```java
private void initEngine() {
    int result = faceEngine.init(this, DetectMode.ASF_DETECT_MODE_VIDEO,
            DetectFaceOrientPriority.ASF_OP_ALL_OUT, 16, 1,
            FaceEngine.ASF_FACE_DETECT | FaceEngine.ASF_FACE_RECOGNITION);
    LogUtils.dTag(TAG, "initResult: " + result);
    if (result == ErrorInfo.MERR_ASF_NOT_ACTIVATED) {
        Executors.newSingleThreadExecutor().execute(() -> {
            int activeResult = FaceEngine.active(MainActivity.this, Constants.APP_ID, Constants.SDK_KEY);
            runOnUiThread(() -> {
                LogUtils.dTag(TAG, "activeResult: " + activeResult);
                if (activeResult == ErrorInfo.MOK) {
                    int initResult = faceEngine.init(this, DetectMode.ASF_DETECT_MODE_VIDEO,
                            DetectFaceOrientPriority.ASF_OP_ALL_OUT, 16, 1,
                            FaceEngine.ASF_FACE_DETECT | FaceEngine.ASF_FACE_RECOGNITION);
                    LogUtils.dTag(TAG, "initResult: " + initResult);
                    if (initResult != ErrorInfo.MOK) {
                        toast("witness engine initialization failed, error code: " + initResult);
                    }
                } else {
                    toast("witness engine activation failed, error code: " + activeResult);
                }
            });
        });
    } else if (result != ErrorInfo.MOK) {
        toast("witness engine initialization failed, error code: " + result);
    }
}
```

3.4 ID photo detection and feature extraction
For the ID photo, the image preprocessing that used to be built into the 2.0 engine has to be replaced with the ArcSoftImageUtil methods in the 3.0 package. At the same time, since the callback fired after successful feature extraction was removed from the engine, this callback now has to be written by ourselves. Being lazy here, I copied faceHelper and its FaceListener from the ArcFace 3.0 demo into the witness 2.0 demo to serve as the callback; you can of course implement your own.
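The dimension constraint driving the cropping in the 2.0 snippet below (and getAlignedBitmap in 3.0) is simple: the input width must be a multiple of 4 and the height a multiple of 2. A standalone sketch of that arithmetic (the class and method names are mine, not the SDK's):

```java
public class AlignCrop {
    // The NV21/BGR input buffers expect width % 4 == 0 and height % 2 == 0,
    // mirroring the cropping loop in the witness 2.0 demo.
    public static int alignedWidth(int width) {
        return width - (width % 4);
    }

    public static int alignedHeight(int height) {
        return height - (height % 2);
    }

    public static void main(String[] args) {
        // e.g. a 1283x721 bitmap would be cropped down to 1280x720
        System.out.println(alignedWidth(1283) + "x" + alignedHeight(721));
    }
}
```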
Witness 2.0:
```java
private void inputIdCard() {
    if (bmp == null) {
        return;
    }
    int width = bmp.getWidth();
    int height = bmp.getHeight();
    // crop the image so that width % 4 == 0 and height % 2 == 0
    boolean needAdjust = false;
    while (width % 4 != 0) {
        width--;
        needAdjust = true;
    }
    if (height % 2 != 0) {
        height--;
        needAdjust = true;
    }
    if (needAdjust) {
        bmp = ImageUtils.imageCrop(bmp, new Rect(0, 0, width, height));
    }
    // convert to the NV21 data format
    byte[] nv21Data = ImageUtils.getNV21(width, height, bmp);
    // input the ID card image data
    DetectFaceResult result = IdCardVerifyManager.getInstance().inputIdCardData(nv21Data, width, height);
    LogUtils.dTag(TAG, "inputIdCardData result: " + result.getErrCode());
}
```
ArcFace 3.0:
```java
private void inputIdCard() {
    if (bmp == null) {
        return;
    }
    // crop the image to 4-byte alignment
    bmp = ArcSoftImageUtil.getAlignedBitmap(bmp, true);
    int width = bmp.getWidth();
    int height = bmp.getHeight();
    // convert to the BGR24 format
    byte[] bgrData = ArcSoftImageUtil.createImageData(bmp.getWidth(), bmp.getHeight(), ArcSoftImageFormat.BGR24);
    int translateResult = ArcSoftImageUtil.bitmapToImageData(bmp, bgrData, ArcSoftImageFormat.BGR24);
    // conversion succeeded
    if (translateResult == ArcSoftImageUtilError.CODE_SUCCESS) {
        List<FaceInfo> faceInfoList = new ArrayList<>();
        // VIDEO mode is not suitable for still images, so a separate idFaceEngine is
        // created here with the detect mode set to IMAGE; its other parameters match faceEngine
        int detectResult = idFaceEngine.detectFaces(bgrData, width, height, FaceEngine.CP_PAF_BGR24, faceInfoList);
        if (detectResult == ErrorInfo.MOK && faceInfoList.size() > 0) {
            // -2 here is the trackId: the camera and the ID photo share one faceHelper,
            // and the trackId tells the callback which source the data came from
            faceHelper.requestFaceFeature(bgrData, faceInfoList.get(0), width, height, FaceEngine.CP_PAF_BGR24, -2);
        }
    } else {
        LogUtils.dTag(TAG, "translate Error result: " + translateResult);
    }
}
```

3.5 Camera recognition and feature extraction
Witness 2.0's onPreviewData interface actually had feature-extraction protection built in: a new extraction could not start until the previous one finished. 3.0 exposes no such protection, so we have to control it ourselves. The basic strategy keys on trackId: a feature is extracted for a given trackId only if it has not been extracted yet, or the previous attempt failed.
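That per-trackId gate can be sketched in isolation like this; the status map and enum mirror the demo's requestFeatureStatusMap and RequestFeatureStatus, but the class itself is my own illustration, not SDK code:

```java
import java.util.HashMap;
import java.util.Map;

public class ExtractGate {
    public enum Status { SEARCHING, SUCCEED, FAILED }

    private final Map<Integer, Status> statusMap = new HashMap<>();

    // Returns true if extraction should start for this trackId:
    // only when it has never been tried, or the last attempt failed.
    public boolean tryStart(int trackId) {
        Status s = statusMap.get(trackId);
        if (s == null || s == Status.FAILED) {
            statusMap.put(trackId, Status.SEARCHING);
            return true;
        }
        return false;
    }

    // Called from the feature callback once extraction finishes.
    public void finish(int trackId, boolean ok) {
        statusMap.put(trackId, ok ? Status.SUCCEED : Status.FAILED);
    }
}
```

In the demo's terms, onPreview would call tryStart(trackId) before requestFaceFeature, and onFaceFeatureInfoGet would call finish(...) with the extraction result.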
Witness 2.0:
```java
public void onPreview(byte[] nv21, Camera camera) {
    if (faceRectView != null) {
        faceRectView.clearFaceInfo();
    }
    if (nv21 == null) {
        return;
    }
    // pass in the preview data
    DetectFaceResult result = IdCardVerifyManager.getInstance()
            .onPreviewData(nv21, previewSize.width, previewSize.height, true);
    Rect rect = result.getFaceRect();
    if (faceRectView != null && drawHelper != null && rect != null) {
        // draw the real-time face rectangle
        drawHelper.draw(faceRectView, new DrawInfo(drawHelper.adjustRect(rect), "", Color.YELLOW));
    }
}
```
ArcFace 3.0:
```java
public void onPreview(byte[] nv21, Camera camera) {
    if (faceRectView != null) {
        faceRectView.clearFaceInfo();
    }
    if (nv21 == null) {
        return;
    }
    List<FaceInfo> faceInfoList = new ArrayList<>();
    int ftResult = faceEngine.detectFaces(nv21, previewSize.width, previewSize.height,
            FaceEngine.CP_PAF_NV21, faceInfoList);
    // in this scene only the largest face matters, so just take the first one;
    // adjust this for other scenes
    if (ftResult == ErrorInfo.MOK && faceInfoList.size() > 0) {
        Rect rect = faceInfoList.get(0).getRect();
        if (faceRectView != null && drawHelper != null && rect != null) {
            drawHelper.draw(faceRectView, new DrawInfo(drawHelper.adjustRect(rect), "", Color.YELLOW));
        }
        // once the ID card data is ready, start extracting features from the camera
        // data, using the trackId to prevent repeated extraction
        int trackId = faceInfoList.get(0).getFaceId();
        if (isIdCardReady && requestFeatureStatusMap != null && requestFeatureStatusMap.containsKey(trackId)) {
            // extract only if this face has not been extracted yet or the last attempt failed
            if (requestFeatureStatusMap.get(trackId) == null
                    || requestFeatureStatusMap.get(trackId) == RequestFeatureStatus.FAILED) {
                requestFeatureStatusMap.put(trackId, RequestFeatureStatus.SEARCHING);
                faceHelper.requestFaceFeature(nv21, faceInfoList.get(0), previewSize.width,
                        previewSize.height, FaceEngine.CP_PAF_NV21, faceInfoList.get(0).getFaceId());
            }
        }
    }
}
```

3.6 Camera and idCard data callbacks
As mentioned above, the witness 2.0 engine had separate input interfaces for camera data and ID card data, with two callbacks handling the two streams. In ArcFace 3.0 not only are the callbacks gone, the camera data and the ID card data also share one detect and one extractFaceFeature, so we use the trackId to tell them apart. And because of the engine changes, feature values are no longer stored inside the engine, so the features obtained from both data sources have to be recorded by ourselves.
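The dispatch convention boils down to "requestId of -2 means ID card data, anything else means camera data". A tiny sketch of that routing (the class is my own illustration; the real callback additionally stores the feature and triggers the comparison):

```java
public class SourceRouter {
    // The demo reserves -2 as the requestId for ID-photo extractions;
    // camera faces use their non-negative trackIds.
    public static final int ID_CARD_REQUEST_ID = -2;

    public static String sourceOf(int requestId) {
        return requestId == ID_CARD_REQUEST_ID ? "idCard" : "camera";
    }
}
```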
Witness 2.0:
```java
private IdCardVerifyListener idCardVerifyListener = new IdCardVerifyListener() {
    @Override
    public void onPreviewResult(DetectFaceResult detectFaceResult, byte[] bytes, int i, int i1) {
        runOnUiThread(() -> {
            // preview face feature extraction succeeded
            if (detectFaceResult.getErrCode() == IdCardVerifyError.OK) {
                isCurrentReady = true;
                compare();
            }
        });
    }

    @Override
    public void onIdCardResult(DetectFaceResult detectFaceResult, byte[] bytes, int i, int i1) {
        LogUtils.dTag(TAG, "onIdCardResult: " + detectFaceResult.getErrCode());
        runOnUiThread(() -> {
            // ID card face feature extraction succeeded
            if (detectFaceResult.getErrCode() == IdCardVerifyError.OK) {
                isIdCardReady = true;
                restartHandler.removeCallbacks(restartRunnable);
                readHandler.postDelayed(readRunnable, READ_DELAY);
                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                bmp.compress(Bitmap.CompressFormat.PNG, 80, baos);
                byte[] bmpBytes = baos.toByteArray();
                Glide.with(MainActivity.this).load(bmpBytes).into(ivIdCard);
                compare();
            }
        });
    }
};
```
ArcFace 3.0:
```java
FaceListener faceListener = new FaceListener() {
    @Override
    public void onFail(Exception e) {
    }

    @Override
    public void onFaceFeatureInfoGet(@Nullable FaceFeature faceFeature, Integer requestId,
                                     Integer errorCode, long frTime, byte[] nv21) {
        // if extraction failed, mark this requestId as failed so it can be retried
        if (ErrorInfo.MOK != errorCode) {
            requestFeatureStatusMap.put(requestId, RequestFeatureStatus.FAILED);
            return;
        }
        // a requestId of -2 means the data came from the ID card
        if (requestId == -2) {
            isIdCardReady = true;
            // the interface change means features can no longer be stored inside
            // the engine, so keep them in a global variable
            idFaceFeature = faceFeature;
            restartHandler.removeCallbacks(restartRunnable);
            readHandler.postDelayed(readRunnable, 5000);
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            bmp.compress(Bitmap.CompressFormat.PNG, 100, baos);
            runOnUiThread(() -> {
                Glide.with(MainActivity.this).load(bmp).into(ivIdCard);
                compare();
            });
        } else {
            // camera-side feature, likewise kept in a global variable
            MainActivity.this.faceFeature = faceFeature;
            isCurrentReady = true;
            runOnUiThread(() -> compare());
        }
    }
};
```

3.7 The compare interface
The compare interface needs far fewer changes than the interfaces above; just take care to switch the comparison model to ID_CARD mode.
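The pass/fail decision reduces to comparing the similarity score against the witness threshold of 0.82 used in the demo. A sketch of just that decision (the helper class is my own, not an SDK call; the threshold value comes from the demo code):

```java
public class WitnessDecision {
    // 0.82 is the witness-model threshold used in the demo;
    // the life-photo model would typically use a different value.
    public static final float ID_CARD_THRESHOLD = 0.82f;

    public static boolean isSamePerson(float score) {
        return score > ID_CARD_THRESHOLD;
    }
}
```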
Witness 2.0:
```java
private void compare() {
    // ...
    // witness feature comparison interface
    CompareResult compareResult = IdCardVerifyManager.getInstance().compareFeature(THRESHOLD);
    LogUtils.dTag(TAG, "compareResult: result " + compareResult.getResult()
            + ", isSuccess " + compareResult.isSuccess()
            + ", errCode " + compareResult.getErrCode());
    if (compareResult.isSuccess()) {
        playSound(R.raw.compare_success);
        ivCompareResult.setBackgroundResource(R.mipmap.compare_success);
        tvCompareTip.setText(name);
    } else {
        playSound(R.raw.compare_fail);
        ivCompareResult.setBackgroundResource(R.mipmap.compare_fail);
        tvCompareTip.setText(R.string.tip_retry);
    }
    // ...
}
```
ArcFace 3.0:
```java
private void compare() {
    // ...
    // witness feature comparison interface
    FaceSimilar compareResult = new FaceSimilar();
    faceEngine.compareFaceFeature(idFaceFeature, faceFeature, CompareModel.ID_CARD, compareResult);
    // the witness comparison threshold is 0.82
    if (compareResult.getScore() > 0.82) {
        playSound(R.raw.compare_success);
        ivCompareResult.setBackgroundResource(R.mipmap.compare_success);
        tvCompareTip.setText(name);
    } else {
        playSound(R.raw.compare_fail);
        ivCompareResult.setBackgroundResource(R.mipmap.compare_fail);
        tvCompareTip.setText(R.string.tip_retry);
    }
    // ...
}
```

3.8 Result display
At this point, after deleting the now-unused witness 2.0 code from the demo, the upgrade from 2.0 to 3.0 is complete. Here is a screenshot of a successful run.
III. Turning the ArcFace 3.0 demo into a witness program
Compared with upgrading from witness 2.0 to ArcFace 3.0, modifying the ArcFace 3.0 demo directly is much easier: there is no need to touch every interface, and all we have to add is the witness data input, the witness callback, and the comparison logic. I therefore strongly recommend starting from ArcFace 3.0; barring special reasons, modifying 3.0 is much faster than building on witness 2.0.
Choosing the Activity to modify
First we have to choose an Activity in the demo as the template for our modification. After a look through, RegisterAndRecognizeActivity is the best fit, because its camera comparison pipeline is already complete. On top of it we need to do two things:
Adding the ID card data input source
For the ID card input we simulate the certificate data the same way the witness demo does, so the inputIdCard method can be reused almost verbatim:
```java
public void onClickIdCard(View view) {
    // simulated ID card image data source; replace with a real reader as needed
    FileInputStream fis;
    bmp = null;
    try {
        fis = new FileInputStream(SAMPLE_FACE);
        bmp = BitmapFactory.decodeStream(fis);
        fis.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    inputIdCard();
}

private void inputIdCard() {
    if (bmp == null) {
        return;
    }
    // crop the image to 4-byte alignment
    bmp = ArcSoftImageUtil.getAlignedBitmap(bmp, true);
    int width = bmp.getWidth();
    int height = bmp.getHeight();
    // convert to the BGR24 format
    byte[] bgrData = ArcSoftImageUtil.createImageData(bmp.getWidth(), bmp.getHeight(), ArcSoftImageFormat.BGR24);
    int translateResult = ArcSoftImageUtil.bitmapToImageData(bmp, bgrData, ArcSoftImageFormat.BGR24);
    // conversion succeeded
    if (translateResult == ArcSoftImageUtilError.CODE_SUCCESS) {
        List<FaceInfo> faceInfoList = new ArrayList<>();
        // VIDEO mode is not suitable for still images; frEngine is used to detect the
        // ID photo, with FaceEngine.ASF_FACE_DETECT added to its mask at initialization
        int detectResult = frEngine.detectFaces(bgrData, width, height, FaceEngine.CP_PAF_BGR24, faceInfoList);
        if (detectResult == ErrorInfo.MOK && faceInfoList.size() > 0) {
            // -2 is the trackId marking ID card data, since the camera and the
            // ID photo extraction share one faceHelper
            faceHelper.requestFaceFeature(bgrData, faceInfoList.get(0), width, height, FaceEngine.CP_PAF_BGR24, -2);
        }
    } else {
        LogUtils.dTag(TAG, "translate Error result: " + translateResult);
    }
}
```
Modifying the comparison logic
In most scenarios, witness comparison is 1:1, so the onFaceFeatureInfoGet callback needs adjusting. First, the inputIdCard method above paves the way by using -2 as the trackId that marks ID card data. Second, the ID card feature and the camera feature to be compared both need to be recorded; here I use global variables. Finally, since the two features can arrive in either order, readiness is recorded with status flags (alternatively, you could check whether both feature objects hold data and keep that state in sync). Once both sides are ready, the comparison runs.
```java
@Override
public void onFaceFeatureInfoGet(@Nullable final FaceFeature faceFeature,
                                 final Integer requestId, final Integer errorCode) {
    // feature extraction succeeded
    if (faceFeature != null) {
        // ID card data received
        if (requestId == -2) {
            isIdCardReady = true;
            // keep the feature in a global variable
            idFaceFeature = faceFeature;
            compare();
            return;
        }
        // Log.i(TAG, "onPreview: fr end = " + System.currentTimeMillis() + " trackId = " + requestId);
        Integer liveness = livenessMap.get(requestId);
        // liveness detection disabled: compare directly
        if (!livenessDetect) {
            isCurrentReady = true;
            // prevent extracting the same face repeatedly
            requestFeatureStatusMap.put(requestId, RequestFeatureStatus.SUCCEED);
            compare();
            // searchFace(faceFeature, requestId);
        }
        // liveness check passed: compare
        else if (liveness != null && liveness == LivenessInfo.ALIVE) {
            isCurrentReady = true;
            RegisterAndRecognizeActivity.this.faceFeature = faceFeature;
            requestFeatureStatusMap.put(requestId, RequestFeatureStatus.SUCCEED);
            compare();
            // searchFace(faceFeature, requestId);
        }
        // no liveness result yet, or not a live face: retry after a delay
        else {
            // ...
        }
    }
    // feature extraction failed
    else {
        // ...
    }
}

@Override
public void onFaceLivenessInfoGet(@Nullable LivenessInfo livenessInfo, final Integer requestId, Integer errorCode) {
    // ...
}
```
Compare function:
```java
private void compare() {
    if (isCurrentReady && isIdCardReady) {
        FaceSimilar similar = new FaceSimilar();
        int compareResult = frEngine.compareFaceFeature(idFaceFeature, faceFeature, CompareModel.ID_CARD, similar);
        if (compareResult == ErrorInfo.MOK && similar.getScore() > 0.82) {
            Log.i(TAG, "compare: success");
        } else {
            Log.i(TAG, "compare: fail");
        }
        // reset the comparison state
        isIdCardReady = false;
        isCurrentReady = false;
        // allow the same face to be extracted again for another attempt
        requestFeatureStatusMap.clear();
    }
}
```

Summary
Working from ArcFace 3.0, the modification feels noticeably smoother: on top of the existing code we only need to handle the ID card data input and the logic around the comparison, and the comparison itself is almost trivial, just a simple interface call. What I wrote here is fairly minimal; some business logic is left out, such as a validity window for the ID card data and an enforced order for the two data sources, and no UI display is done beyond logging the comparison result. This article only offers ideas for reference, and the business logic still has to be added on your own. Finally, here is the log of a successful comparison after the modification.
That covers the whole process of implementing witness comparison with a face recognition SDK. I hope the content above is of some help, and if you found the article useful, feel free to share it.