2025-03-31 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/03 Report--
This article introduces how to use Java to implement face recognition. Many people run into difficulties when working through real cases, so let this walkthrough show you how to handle those situations. Read it carefully and you should come away with something useful!
1. Download the demo project
Github address: https://github.com/xinzhfiu/ArcSoftFaceDemo. Build the database locally and create the table user_face_info. This table stores facial features; its key field face_feature uses the binary blob type to hold the extracted feature data.
SET NAMES utf8mb4;
SET FOREIGN_KEY_CHECKS = 0;

-- ----------------------------
-- Table structure for user_face_info
-- ----------------------------
DROP TABLE IF EXISTS `user_face_info`;
CREATE TABLE `user_face_info` (
  `id` int(11) NOT NULL AUTO_INCREMENT COMMENT 'primary key',
  `group_id` int(11) DEFAULT NULL COMMENT 'group id',
  `face_id` varchar(31) DEFAULT NULL COMMENT 'unique face id',
  `name` varchar(63) DEFAULT NULL COMMENT 'name',
  `age` int(3) DEFAULT NULL COMMENT 'age',
  `email` varchar(255) DEFAULT NULL COMMENT 'email address',
  `gender` smallint(1) DEFAULT NULL COMMENT 'gender, 1 = male, 2 = female',
  `phone_number` varchar(11) DEFAULT NULL COMMENT 'phone number',
  `face_feature` blob COMMENT 'facial feature',
  `create_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT 'creation time',
  `update_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP COMMENT 'update time',
  `fpath` varchar(255) COMMENT 'photo path',
  PRIMARY KEY (`id`) USING BTREE,
  KEY `GROUP_ID` (`group_id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb4 ROW_FORMAT=DYNAMIC;

SET FOREIGN_KEY_CHECKS = 1;
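The ArcSoft SDK hands back each feature as an opaque byte[], which is why the blob column works directly. As an aside, if you ever need to store a float[] feature vector in a blob yourself, a fixed-width binary encoding keeps it round-trippable; this is an illustrative sketch (the class and method names are my own, not part of the demo project):

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

// Illustrative only: pack/unpack a float feature vector for a BLOB column.
public class FeatureCodec {

    // pack a float feature vector into bytes suitable for a BLOB column
    public static byte[] toBytes(float[] feature) {
        ByteBuffer buf = ByteBuffer.allocate(feature.length * Float.BYTES);
        for (float f : feature) {
            buf.putFloat(f);
        }
        return buf.array();
    }

    // unpack the BLOB bytes back into the original float vector
    public static float[] fromBytes(byte[] blob) {
        ByteBuffer buf = ByteBuffer.wrap(blob);
        float[] feature = new float[blob.length / Float.BYTES];
        for (int i = 0; i < feature.length; i++) {
            feature[i] = buf.getFloat();
        }
        return feature;
    }

    public static void main(String[] args) {
        float[] feature = {0.12f, -0.5f, 0.98f};
        byte[] blob = toBytes(feature);
        System.out.println(Arrays.equals(feature, fromBytes(blob)) ? "round-trip ok" : "mismatch");
    }
}
```

Because every float occupies exactly Float.BYTES bytes, the decode side can recover the vector length from the blob size alone, with no extra header needed.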
2. Modify application.properties file
The project is fairly complete; you only need to change a few configuration values to start it. There are a few points to pay attention to, which are emphasized below.
config.arcface-sdk.sdk-lib-path: the directory holding the three .dll files from the SDK package
config.arcface-sdk.app-id: the APP ID from the developer center
config.arcface-sdk.sdk-key: the SDK Key from the developer center
config.arcface-sdk.sdk-lib-path=d:/arcsoft_lib
config.arcface-sdk.app-id=8XMHMu71Dmb5UtAEBpPTB1E9ZPNTw2nrvQ5bXxBobUA8
config.arcface-sdk.sdk-key=BA8TLA9vVwK7G6btJh3A2FCa8ZrC6VWZLNbBBFctCz5R
# druid local database address
spring.datasource.druid.url=jdbc:mysql://127.0.0.1:3306/xin-master?useUnicode=true&characterEncoding=utf-8&useSSL=false&serverTimezone=UTC
spring.datasource.druid.username=junkang
spring.datasource.druid.password=junkang
3. Create a lib folder in the root directory
Create a folder named lib in the project root directory and copy arcsoft-sdk-face-2.2.0.1.jar from the downloaded SDK package into it.
4. Introduce arcsoft dependency package
<dependency>
    <groupId>com.arcsoft.face</groupId>
    <artifactId>arcsoft-sdk-face</artifactId>
    <version>2.2.0.1</version>
    <scope>system</scope>
    <systemPath>${basedir}/lib/arcsoft-sdk-face-2.2.0.1.jar</systemPath>
</dependency>
The includeSystemScope attribute must be configured in the pom.xml file; otherwise arcsoft-sdk-face-2.2.0.1.jar will not be packaged and cannot be referenced at runtime.
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <fork>true</fork>
        <includeSystemScope>true</includeSystemScope>
    </configuration>
</plugin>
5. Start the project
At this point the configuration is complete; run the Application class to start the project.
Test it at http://127.0.0.1:8089/demo. If the demo page loads, the project started successfully.
Operation
1. Input face images
Enter a name on the page, click the camera button to open the local camera, and submit the current frame to the backend, which detects the face, extracts its features, and saves them to the database.
2. Face comparison
After registering a face, test whether recognition works: submit a new image of the same face, and the match comes back with a similarity of 92%. But programmers should be skeptical of everything; could the result simply be hard-coded on the page?
To verify further, cover the face and try again: this time the prompt "face does not match" appears, proving that a real comparison is happening.
Source code analysis
Take a brief look at the project source code and analyze the implementation process:
The page and its JS were written by back-end programmers, so don't ask why it looks the way it does; if you know, you know, ha ~
1. JS opens the local camera, takes a photo, and uploads the image as a base64 string
function getMedia() {
    $("#mainDiv").empty();
    let videoComp = "";
    $("#mainDiv").append(videoComp);
    let constraints = {
        video: { width: 500, height: 500 },
        audio: true
    };
    // get the video element in the camera area
    let video = document.getElementById("video");
    // getUserMedia returns a Promise; its success callback receives a
    // MediaStream as its parameter. then() runs asynchronously after the
    // preceding call completes, which avoids reading the stream before
    // it is available.
    let promise = navigator.mediaDevices.getUserMedia(constraints);
    promise.then(function (mediaStream) {
        video.srcObject = mediaStream;
        video.play();
    });
    // var t1 = window.setTimeout(function () {
    //     takePhoto();
    // }, 2000);
}

// photo event
function takePhoto() {
    let mainComp = $("#mainDiv");
    if (mainComp.has('video').length) {
        let userNameInput = $("#userName").val();
        if (userNameInput == "") {
            alert("name cannot be empty!");
            return false;
        }
        // draw the current video frame onto the canvas
        let video = document.getElementById("video");
        let canvas = document.getElementById("canvas");
        let ctx = canvas.getContext('2d');
        ctx.drawImage(video, 0, 0, 500, 500);
        var formData = new FormData();
        var base64File = canvas.toDataURL();
        var userName = $("#userName").val();
        formData.append("file", base64File);
        formData.append("name", userName);
        formData.append("groupId", "101");
        $.ajax({
            type: "post",
            url: "/faceAdd",
            data: formData,
            contentType: false,
            processData: false,
            async: false,
            success: function (text) {
                var res = JSON.stringify(text);
                if (text.code == 0) {
                    alert("registered successfully");
                } else {
                    alert(text.message);
                }
            },
            error: function (error) {
                alert(JSON.stringify(error));
            }
        });
    } else {
        // no camera: read the chosen file and upload its base64 content
        var formData = new FormData();
        let userName = $("#userName").val();
        formData.append("groupId", "101");
        var file = $("#file0")[0].files[0];
        var reader = new FileReader();
        reader.readAsDataURL(file);
        reader.onload = function () {
            var base64 = reader.result;
            formData.append("file", base64);
            formData.append("name", userName);
            $.ajax({
                type: "post",
                url: "/faceAdd",
                data: formData,
                contentType: false,
                processData: false,
                async: false,
                success: function (text) {
                    var res = JSON.stringify(text);
                    if (text.code == 0) {
                        alert("registered successfully");
                    } else {
                        alert(text.message);
                    }
                },
                error: function (error) {
                    alert(JSON.stringify(error));
                }
            });
            location.reload();
        };
    }
}
2. Analyze the picture and extract the portrait feature in the background
The backend parses the image sent by the front end and stores the extracted facial features in the database. Feature extraction relies on the FaceEngine engine; following the source code all the way down, the underlying algorithm remains opaque.
/**
 * face add
 */
@RequestMapping(value = "/faceAdd", method = RequestMethod.POST)
@ResponseBody
public Result faceAdd(@RequestParam("file") String file,
                      @RequestParam("groupId") Integer groupId,
                      @RequestParam("name") String name) {
    try {
        // decode the base64 image
        byte[] decode = Base64.decode(base64Process(file));
        ImageInfo imageInfo = ImageFactory.getRGBData(decode);
        // extract facial features
        byte[] bytes = faceEngineService.extractFaceFeature(imageInfo);
        if (bytes == null) {
            return Results.newFailedResult(ErrorCodeEnum.NO_FACE_DETECTED);
        }
        UserFaceInfo userFaceInfo = new UserFaceInfo();
        userFaceInfo.setName(name);
        userFaceInfo.setGroupId(groupId);
        userFaceInfo.setFaceFeature(bytes);
        userFaceInfo.setFaceId(RandomUtil.randomString(10));
        // insert the facial features into the database
        userFaceInfoService.insertSelective(userFaceInfo);
        logger.info("faceAdd:" + name);
        return Results.newSuccessResult("");
    } catch (Exception e) {
        logger.error("", e);
    }
    return Results.newFailedResult(ErrorCodeEnum.UNKNOWN);
}
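The base64Process helper is not shown in the excerpt. Presumably it strips the "data:image/...;base64," prefix that canvas.toDataURL() prepends, leaving raw base64 for the decoder. A minimal sketch under that assumption (class and method names are illustrative):

```java
import java.util.Base64;

// Sketch of what base64Process likely does: strip a data-URL header,
// if present, so only the base64 payload remains.
public class Base64ProcessSketch {

    public static String base64Process(String dataUrl) {
        int comma = dataUrl.indexOf(',');
        // keep only the payload after the data-URL header, if one is present
        return comma >= 0 ? dataUrl.substring(comma + 1) : dataUrl;
    }

    public static void main(String[] args) {
        String dataUrl = "data:image/png;base64,"
                + Base64.getEncoder().encodeToString("hi".getBytes());
        byte[] decoded = Base64.getDecoder().decode(base64Process(dataUrl));
        System.out.println(new String(decoded)); // prints "hi"
    }
}
```

This also explains why the file-upload branch of the JS works unchanged: FileReader.readAsDataURL produces the same data-URL shape as canvas.toDataURL.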
3. Comparison of portrait features
Face recognition: after extracting the features of the submitted image, the backend compares them against the face records already in the database.
/**
 * face recognition
 */
@RequestMapping(value = "/faceSearch", method = RequestMethod.POST)
@ResponseBody
public Result faceSearch(String file, Integer groupId) throws Exception {
    byte[] decode = Base64.decode(base64Process(file));
    BufferedImage bufImage = ImageIO.read(new ByteArrayInputStream(decode));
    ImageInfo imageInfo = ImageFactory.bufferedImage2ImageInfo(bufImage);
    // extract facial features
    byte[] bytes = faceEngineService.extractFaceFeature(imageInfo);
    if (bytes == null) {
        return Results.newFailedResult(ErrorCodeEnum.NO_FACE_DETECTED);
    }
    // compare against the stored features of the group
    List userFaceInfoList = faceEngineService.compareFaceFeature(bytes, groupId);
    if (CollectionUtil.isNotEmpty(userFaceInfoList)) {
        FaceUserInfo faceUserInfo = userFaceInfoList.get(0);
        FaceSearchResDto faceSearchResDto = new FaceSearchResDto();
        BeanUtil.copyProperties(faceUserInfo, faceSearchResDto);
        List processInfoList = faceEngineService.process(imageInfo);
        if (CollectionUtil.isNotEmpty(processInfoList)) {
            // face detection: draw a red rectangle around the detected face
            List faceInfoList = faceEngineService.detectFaces(imageInfo);
            int left = faceInfoList.get(0).getRect().getLeft();
            int top = faceInfoList.get(0).getRect().getTop();
            int width = faceInfoList.get(0).getRect().getRight() - left;
            int height = faceInfoList.get(0).getRect().getBottom() - top;
            Graphics2D graphics2D = bufImage.createGraphics();
            graphics2D.setColor(Color.RED); // red
            BasicStroke stroke = new BasicStroke(5f);
            graphics2D.setStroke(stroke);
            graphics2D.drawRect(left, top, width, height);
            ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
            ImageIO.write(bufImage, "jpg", outputStream);
            byte[] bytes1 = outputStream.toByteArray();
            faceSearchResDto.setImage("data:image/jpeg;base64," + Base64Utils.encodeToString(bytes1));
            faceSearchResDto.setAge(processInfoList.get(0).getAge());
            faceSearchResDto.setGender(processInfoList.get(0).getGender().equals(1) ? "female" : "male");
        }
        return Results.newSuccessResult(faceSearchResDto);
    }
    return Results.newFailedResult(ErrorCodeEnum.FACE_DOES_NOT_MATCH);
}

That concludes "how to use Java to achieve face recognition". Thank you for reading. If you want to learn more about the industry, follow the website; the editor will keep publishing practical articles for you!
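The SDK's compareFaceFeature is a black box, so the "92%" similarity reported earlier comes from ArcSoft's internal scoring. Conceptually, such scores resemble a cosine similarity between two feature vectors; the following is an illustrative scoring function (my own sketch, not ArcSoft's algorithm):

```java
// Illustrative similarity scoring between two feature vectors.
public class SimilaritySketch {

    // cosine similarity of two equal-length feature vectors, in [-1, 1]
    public static double cosineSimilarity(float[] a, float[] b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    public static void main(String[] args) {
        float[] same = {0.3f, 0.8f, 0.1f};
        float[] other = {-0.7f, 0.1f, 0.9f};
        System.out.printf("identical: %.2f%n", cosineSimilarity(same, same));
        System.out.printf("different: %.2f%n", cosineSimilarity(same, other));
    }
}
```

A real system then applies a threshold (e.g. accept only scores above some cutoff), which is why the covered-face test correctly returned "face does not match".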