LiveDetection Web API
POST /livedetection
Performs a face liveness detection on two images (see Face Liveness Detection).
To perform a live detection, exactly two live recorded images are required. They are first sent to the quality check, which among other things performs the face detection. If the images are suitable, the live detection is executed.
To perform a live detection with challenge-response, please use the LivenessDetection API or one of the methods Enroll, Verify or Identify.
Request Information
Authentication
This API call requires Basic Authentication, i.e. you have to provide an HTTP Authorization header using the authorization method Basic and the base64 encoded string App-ID:App-Secret. To receive the necessary BWS Web API access data (App-ID and App-Secret), you have to register your application on the BWS Portal first. This requires a valid BWS subscription.
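Using .NET's HttpClient, the required header can be set up as in the following minimal sketch (APP_IDENTIFIER and APP_SECRET are placeholders for your own access data):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

// Placeholders for the access data received from the BWS Portal.
const string APP_IDENTIFIER = "your-app-id";
const string APP_SECRET = "your-app-secret";

// Set up the Basic Authentication header from the base64 encoded
// string App-ID:App-Secret.
var client = new HttpClient();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
    "Basic",
    Convert.ToBase64String(Encoding.ASCII.GetBytes($"{APP_IDENTIFIER}:{APP_SECRET}")));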
Parameters
This API has no required parameters. An optional state parameter can be provided, which is then simply passed through to the BWS log and to the returned object.
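The documentation above does not show how the state value is transported; the sketch below assumes it is appended to the request URL as a query string parameter, which should be verified against your BWS endpoint:

using System;

// Assumption: the optional state value is passed as a query string
// parameter on the LiveDetection URL (verify against your endpoint).
// The state value itself is a hypothetical example.
const string ENDPOINT = "https://bws.bioid.com/extension/";
string state = Uri.EscapeDataString("my-livedetection-state");
string requestUri = ENDPOINT + "livedetection?state=" + state;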
Body
The body contains the two live images, each encoded into a Data-URL using the data URI scheme as described in RFC 2397 (see also Wikipedia), e.g. using the application/json media type:
{
"liveimage1": "data:image...",
"liveimage2": "data:image..."
}
or using the application/x-www-form-urlencoded media type:
liveimage1=data:image...&liveimage2=data:image...
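If the live images are available as raw PNG bytes or files, such a Data-URL can be assembled by prepending the media type prefix to the base64 encoded image data, as sketched here in C# (mirroring the Java sample further below):

using System;
using System.IO;

// Build an RFC 2397 Data-URL from a PNG image file: the
// "data:image/png;base64," prefix is followed by the base64
// encoded image bytes.
static string ToDataUrl(string pngFilePath)
{
    byte[] imageBytes = File.ReadAllBytes(pngFilePath);
    return "data:image/png;base64," + Convert.ToBase64String(imageBytes);
}

For JPEG images, use the data:image/jpeg;base64, prefix accordingly.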
Response
If both provided images could be processed successfully, this method returns the OK HTTP status code (200) with true or false in the body content, indicating whether the submitted images prove that they are recorded from a live person or not.
If a state argument has also been provided, a result object is returned instead that primarily contains the flag Success, indicating whether the live detection succeeded or not:
| Field | Description |
|---|---|
| Success | Boolean flag indicating whether the submitted images prove that they are recorded from a live person. |
| JobID | A unique ID to identify this LiveDetection job with the BWS log. |
| State | The provided state string that is also added to the BWS log. |
| Errors | A list of reported (liveness detection) errors. |
| Samples | Array of quality check and liveness detection results for the provided live images. Each sample contains: |
| Samples.Errors | List of problems detected with this live image. |
| Samples.EyeCenters | Coordinates of the left and right eye centers, if a face was found. |
An example result object might look as follows:
{ "Success": true, "JobID": "3e4ed0e8-5eb4-4684-930a-bdae875b854e", "State": "details", "Samples": [{ "Errors": [{ "Code": "ImageTooSmall", "Message": "The part of the image containing the found face is too small.", "Details": "The found face (with an eye-distance of 55 pixels) does not have the required eye-distance of at least 60 pixels." } ], "EyeCenters": { "RightEyeX": 290.188, "RightEyeY": 251.479, "LeftEyeX": 345.9, "LeftEyeY": 251.182 } }, { "Errors": [{ "Code": "ImageTooBlurry", "Message": "The image is too blurry, i.e. it is not sharp enough.", "Details": "An image blurring of 10.45% was calculated, where up to 9.00% is allowed. Note that compression artifacts might be the reason for this fuzziness as they reduce the objective sharpness more than the subjective sharpness." } ], "EyeCenters": { "RightEyeX": 280.599, "RightEyeY": 242.184, "LeftEyeX": 333.96, "LeftEyeY": 241.687 } } ] }
In case something goes wrong, an error HTTP status code is returned together with some additional information if available.
Response HTTP Status Codes
The call returns one of the standard HTTP status codes:
| Status Code | Description |
|---|---|
| 200 OK | The response body contains a boolean value or the result object as described above. |
| 400 Bad Request | Invalid samples have been uploaded or they could not be processed successfully, e.g. no face found. The response body typically has a Message field containing the error code. |
| 401 Unauthorized | Basic Authentication is required. |
| 415 Unsupported Media Type | You probably forgot to specify the media type, e.g. application/json. |
| 500 Internal Server Error | A server-side exception occurred. The content may contain a Message and an ExceptionMessage. |
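The additional error information can be read from the response body of a failed call, for example with a helper like this sketch (the ErrorInfo class is an assumption based on the Message and ExceptionMessage fields mentioned above; actual error bodies may vary):

using System;
using System.Net;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

// Assumed error payload; only Message and ExceptionMessage are mentioned
// in the table above, anything else is server dependent.
class ErrorInfo
{
    public string Message { get; set; }
    public string ExceptionMessage { get; set; }
}

static async Task ReportErrorAsync(HttpResponseMessage response)
{
    if (response.StatusCode == HttpStatusCode.OK) return;

    string body = await response.Content.ReadAsStringAsync();
    try
    {
        // Try to read the JSON error details.
        var error = JsonSerializer.Deserialize<ErrorInfo>(body);
        Console.WriteLine($"{(int)response.StatusCode} {response.StatusCode}: " +
                          $"{error?.Message} {error?.ExceptionMessage}");
    }
    catch (JsonException)
    {
        // The body is not JSON (e.g. plain text); print it as-is.
        Console.WriteLine($"{(int)response.StatusCode} {response.StatusCode}: {body}");
    }
}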
Implementation Note
You need to send two images between which the user has moved slightly. Therefore, please implement the following:
- Capture the first image with the person looking straight at the camera.
- Use our BioID Motion Detection to automatically trigger the capturing of the second image as soon as the person has turned their head far enough (a generic motion-trigger sketch follows below).
- Make sure your users actually perform this movement; otherwise the liveness detection will inevitably fail.
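The following C# sketch illustrates a generic movement trigger based on frame differencing; it is not the BioID Motion Detection component, and the threshold is an arbitrary assumption that would need tuning:

using System;

// Generic sketch (not BioID Motion Detection): trigger the second capture
// once two consecutive frames differ enough, i.e. the mean absolute pixel
// difference exceeds a threshold. Frames are assumed to be equally sized
// 8-bit grayscale buffers; the default threshold is an assumption.
static bool MovementDetected(byte[] previousFrame, byte[] currentFrame, double threshold = 12.0)
{
    if (previousFrame.Length != currentFrame.Length)
        throw new ArgumentException("Frames must have the same size.");

    long totalDifference = 0;
    for (int i = 0; i < currentFrame.Length; i++)
        totalDifference += Math.Abs(currentFrame[i] - previousFrame[i]);

    double meanDifference = (double)totalDifference / currentFrame.Length;
    return meanDifference > threshold;
}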
Sample Code
private static async Task<bool> LiveDetectionAsync(string dataUrlLiveImage1, string dataUrlLiveImage2)
{
    using (var client = new HttpClient())
    {
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic",
            Convert.ToBase64String(Encoding.ASCII.GetBytes($"{APP_IDENTIFIER}:{APP_SECRET}")));

        string mediaType, images;

        // using the application/json media type
        mediaType = "application/json";
        images = $@"{{""liveimage1"":""{dataUrlLiveImage1}"""
               + $@",""liveimage2"":""{dataUrlLiveImage2}""}}";

        // or using application/x-www-form-urlencoded
        /*
        mediaType = "application/x-www-form-urlencoded";
        images = string.Format("liveimage1={0}&liveimage2={1}",
            HttpUtility.UrlEncode(dataUrlLiveImage1), HttpUtility.UrlEncode(dataUrlLiveImage2));
        */

        using (var content = new StringContent(images, Encoding.UTF8, mediaType))
        using (var response = await client.PostAsync(ENDPOINT + "livedetection", content))
        {
            Console.Write("LiveDetection Response... ");
            string result = await response.Content.ReadAsStringAsync();
            if (response.StatusCode == HttpStatusCode.OK)
            {
                if (bool.TryParse(result, out var isLive))
                {
                    Console.WriteLine(isLive);
                    return isLive;
                }
            }
            Console.WriteLine(response.StatusCode.ToString());
            Console.WriteLine(result);
            return false;
        }
    }
}
// using org.json.JSONObject from JSON-java library
JSONObject requestBody = new JSONObject();
requestBody.put("liveimage1", "data:image/png;base64," + Base64.getEncoder().encodeToString(png1AsByteArray));
requestBody.put("liveimage2", "data:image/png;base64," + Base64.getEncoder().encodeToString(png2AsByteArray));

// using OkHttpClient from the OkHttp library
Request request = new Request.Builder()
        .url("https://bws.bioid.com/extension/livedetection")
        .addHeader("Authorization", Credentials.basic(APP_IDENTIFIER, APP_SECRET))
        .post(RequestBody.create(MediaType.parse("application/json"), requestBody.toString()))
        .build();
OkHttpClient client = new OkHttpClient();
Response response = client.newCall(request).execute();
if (response.code() == 200) {
    if (response.body().string().equals("true")) {
        System.out.println("recorded from a live person");
    } else {
        System.out.println("not recorded from a live person");
    }
}