Adding Facial Recognition to Your Apps


The Microsoft Cognitive Services Face API is a cloud-based service built on advanced machine learning algorithms. It can detect faces in images along with their attributes, extract features such as gender and age, organize images into groups, and identify individuals. The Face API can detect up to 64 faces in an image with high-precision face location. You can supply the image as bytes or as a valid image URL. Cognitive Services even has an API for detecting individuals' emotions based on their facial expressions. Microsoft currently offers a free plan limited to 30,000 transactions per month; to calculate pricing beyond that number of free transactions, please visit the Azure Cognitive Services pricing page.

The Face API uses the JSON format for data exchange. By using Cognitive Services APIs, you can add face-detection authentication to your mobile app. To consume the Face API, you need to register on the Microsoft Azure Cognitive Services portal and obtain the subscription keys. I will cover the subscription steps later in this article.

The following features are supported by the Cognitive Services Face API.

Face Detection

A developer can detect one or more human faces in an image and get face attributes that contain machine learning-based predictions of facial features. Each detected face is marked with a rectangle. The face attributes include age, emotion, gender, pose, smile, and facial hair, along with 27 landmarks for each face in the image. After detecting faces, we can take the face rectangle and pass it to the Emotion API to speed up processing.

Recognizing a face
Figure 1: Recognizing a face
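
The detection step described above boils down to a single POST request. The following sketch assembles such a request with placeholder values (the region, subscription key, and image URL are illustrative, not real); the endpoint path and header names follow the Face API's documented v1.0 conventions.

```python
import json

# Minimal sketch of assembling a Face API "Detect" request.
# The region, key, and image URL passed below are placeholders.
def build_detect_request(region, subscription_key, image_url,
                         attributes="age,gender,smile,facialHair,headPose,emotion"):
    """Assemble the URL, headers, and JSON body for a Detect call."""
    url = ("https://{0}.api.cognitive.microsoft.com/face/v1.0/detect"
           "?returnFaceId=true&returnFaceLandmarks=true"
           "&returnFaceAttributes={1}").format(region, attributes)
    headers = {
        "Content-Type": "application/json",          # image given by URL
        "Ocp-Apim-Subscription-Key": subscription_key,
    }
    body = json.dumps({"url": image_url})
    return url, headers, body

url, headers, body = build_detect_request(
    "westus", "YOUR_SUBSCRIPTION_KEY", "https://example.com/photo.jpg")
# The request itself would then be sent with, for example:
#   requests.post(url, headers=headers, data=body)
```

The actual network call is left commented out; once you have a valid subscription key (created later in this article), sending the request returns the JSON response shown in the Face API Service Call section.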

Face Verification

Face verification is based on a deep machine learning algorithm. It quickly and precisely detects facial features, recognizing faces with an impressively high rate of accuracy. The Face API checks the likelihood that two faces belong to the same person and returns a confidence score.

Verifying a face
Figure 2: Verifying a face

Face Identification

A developer can search and identify faces, tag people and groups with user-provided data, and then search those for a match with previously unseen faces.
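
An Identify call returns, for each query face, a ranked list of candidate persons with confidence scores. The helper below picks the strongest candidate; the response dictionary shown is a simplified illustration of that shape, and the person IDs are made up.

```python
# Sketch: choose the best match from an Identify result. The result shape
# (a "candidates" list of {"personId", "confidence"} entries) follows the
# documented response; the IDs used in the example are hypothetical.
def best_candidate(identify_result):
    """Return (personId, confidence) of the strongest candidate, or None."""
    candidates = identify_result.get("candidates", [])
    if not candidates:
        return None
    top = max(candidates, key=lambda c: c["confidence"])
    return top["personId"], top["confidence"]

result = {"faceId": "f1", "candidates": [
    {"personId": "p-alice", "confidence": 0.88},
    {"personId": "p-bob", "confidence": 0.41}]}
print(best_candidate(result))  # ('p-alice', 0.88)
```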

Similar Face Search

Given a new face and a collection of faces to search, this API returns a collection of faces that look similar to the new face.

Face Grouping

Face grouping organizes many unidentified faces into groups based on their visual similarity. A group contains a collection of similar faces, identified by their face IDs. In Face API terms, the PersonGroup is the image classifier unit, and it is trained from the faces of the Person objects it contains. Faces that are not similar to any others are placed in a group called the messy group.
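
A Group call therefore returns two things: clusters of similar face IDs and the messy group of faces that matched nothing. The sketch below summarizes such a response; the face IDs are placeholders, and the `groups`/`messyGroup` keys follow the documented response shape.

```python
# Sketch: summarize a Group response consisting of face-ID clusters plus a
# "messyGroup" of unmatched faces. The sample IDs ("a".."f") are placeholders.
def summarize_groups(group_response):
    """Count groups, grouped faces, and ungrouped (messy) faces."""
    groups = group_response.get("groups", [])
    messy = group_response.get("messyGroup", [])
    return {"groupCount": len(groups),
            "groupedFaces": sum(len(g) for g in groups),
            "ungroupedFaces": len(messy)}

sample = {"groups": [["a", "b", "c"], ["d", "e"]], "messyGroup": ["f"]}
print(summarize_groups(sample))
# {'groupCount': 2, 'groupedFaces': 5, 'ungroupedFaces': 1}
```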

Creating a New Face API Key in Azure Portal

To use Face API, you first need to create an account or use an existing account to log in to the Azure portal.

Logging in to the Azure portal
Figure 3: Logging in to the Azure portal

From the Azure dashboard, click '+ NEW' and then select 'AI + Cognitive Services'; you will get a list of the available Cognitive APIs.

A list of available APIs
Figure 4: A list of available APIs

If you have not subscribed to an Azure service before, you will be asked for your account name, subscription type (select free), and other details. After subscribing, select Face API as the API type from the list of available APIs.

For existing subscription holders, select Face API, and continue.

Selecting the Face API
Figure 5: Selecting the Face API

Next, you need to choose the location, pricing tier, and resource group, and accept the license terms. To pin the account to the Azure portal dashboard, check the 'Pin to Dashboard' option and click Create.

Accepting the licensing terms
Figure 6: Accepting the licensing terms

A few minutes later, your Cognitive Services account will be successfully deployed and the Face API dashboard will appear. Click the tile in the dashboard to view the account information.

Viewing the account information
Figure 7: Viewing the account information

Use the API endpoint URL, and start making Face API calls from your applications.

Starting to make Face API calls
Figure 8: Starting to make Face API calls

Face API Service Call

Unlike other Cognitive APIs, the Face API is currently available in the following five Azure regions:

  • West US
  • East US 2
  • West Central US
  • West Europe
  • Southeast Asia

Five Face API operations are available. The POST request formats of these operations are as follows:

  • Detect: https://[location][?returnFaceId][&returnFaceLandmarks][&returnFaceAttributes]
  • Find Similar Faces: https://[location]
  • Group: https://[location]
  • Identify: https://[location]
  • Verify: https://[location]
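
Since the base URL differs only by region, and each operation is a path segment under the same version prefix, the URLs above can be assembled programmatically. The sketch below assumes the standard `face/v1.0` path; the region identifiers mirror the five regions listed earlier.

```python
# Sketch: build a region-specific Face API endpoint URL. Region identifiers
# mirror the five regions listed above; the face/v1.0 path prefix follows the
# service's documented URL scheme.
FACE_API_REGIONS = ["westus", "eastus2", "westcentralus",
                    "westeurope", "southeastasia"]

def face_api_url(region, operation):
    """Return the full endpoint URL for an operation (detect, verify, ...)."""
    if region not in FACE_API_REGIONS:
        raise ValueError("Face API is not offered in region: " + region)
    return "https://{0}.api.cognitive.microsoft.com/face/v1.0/{1}".format(
        region, operation)

print(face_api_url("westus", "detect"))
# https://westus.api.cognitive.microsoft.com/face/v1.0/detect
```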

The following JSON is a sample successful response to a Detect call, returned with HTTP status code 200.

Detection result:
         "FaceRectangle": {
            "Top": 277,
            "Left": 295,
         "Width": 215,
         "Height": 215
      "FaceAttributes": {
         "Hair": {
         "Bald": 0.02,
         "Invisible": false,
         "HairColor": [
                  "Color": "black",
                  "Confidence": 1.0
                  "Color": "gray",
                  "Confidence": 0.75
                  "Color": "other",
                  "Confidence": 0.52
                  "Color": "red",
                  "Confidence": 0.17
                  "Color": "brown",
                  "Confidence": 0.11
                  "Color": "blond",
                  "Confidence": 0.04
         "Smile": 0.008,
         "HeadPose": {
            "Pitch": 0.0,
            "Roll": -5.6,
            "Yaw": -2.7
         "Gender": "male",
         "Age": 34.1,
         "FacialHair": {
            "Moustache": 0.4,
            "Beard": 0.3,
            "Sideburns": 0.3
         "Glasses": "NoGlasses",
         "Makeup": {
            "EyeMakeup": false,
            "LipMakeup": false
         "Emotion": {
            "Anger": 0.0,
            "Contempt": 0.002,
            "Disgust": 0.0,
            "Fear": 0.0,
            "Happiness": 0.008,
            "Neutral": 0.989,
            "Sadness": 0.0,
            "Surprise": 0.0
         "Occlusion": {
            "ForeheadOccluded": false,
            "EyeOccluded": false,
            "MouthOccluded": false
         "Accessories": [],
         "Blur": {
            "BlurLevel": "medium",
            "Value": 0.61
         "Exposure": {
            "ExposureLevel": "goodExposure",
            "Value": 0.71
         "Noise": {
            "NoiseLevel": "medium",
            "Value": 0.69
      "FaceLandmarks": {
         "PupilLeft": {
            "X": 353.1,
            "Y": 338.6
         "PupilRight": {
            "X": 441.8,
            "Y": 329.4
         "NoseTip": {
            "X": 405.2,
            "Y": 396.2
         "MouthLeft": {
            "X": 360.1,
            "Y": 440.1
         "MouthRight": {
            "X": 451.9,
            "Y": 426.9
         "EyebrowLeftOuter": {
            "X": 318.8,
            "Y": 331.7
         "EyebrowLeftInner": {
            "X": 375.0,
            "Y": 323.6
         "EyeLeftOuter": {
            "X": 339.2,
            "Y": 342.5
         "EyeLeftTop": {
            "X": 351.5,
            "Y": 334.4
         "EyeLeftBottom": {
            "X": 354.0,
            "Y": 346.5
         "EyeLeftInner": {
            "X": 368.0,
            "Y": 340.0
         "EyebrowRightInner": {
            "X": 418.6,
            "Y": 317.0
         "EyebrowRightOuter": {
            "X": 473.3,
            "Y": 310.9
         "EyeRightInner": {
            "X": 429.1,
            "Y": 334.8
         "EyeRightTop": {
            "X": 440.9,
            "Y": 325.3
         "EyeRightBottom": {
            "X": 444.0,
            "Y": 338.1
         "EyeRightOuter": {
            "X": 456.4,
            "Y": 328.7
         "NoseRootLeft": {
            "X": 385.8,
            "Y": 340.9
         "NoseRootRight": {
            "X": 414.1,
            "Y": 339.3
         "NoseLeftAlarTop": {
            "X": 380.6,
            "Y": 380.0
         "NoseRightAlarTop": {
            "X": 427.1,
            "Y": 375.4
         "NoseLeftAlarOutTip": {
            "X": 372.0,
            "Y": 399.6
         "NoseRightAlarOutTip": {
            "X": 436.8,
            "Y": 391.6
         "UpperLipTop": {
            "X": 405.8,
            "Y": 428.7
         "UpperLipBottom": {
            "X": 406.4,
            "Y": 437.6
         "UnderLipTop": {
            "X": 405.8,
            "Y": 440.6
         "UnderLipBottom": {
            "X": 406.9,
            "Y": 453.8


I hope this article gave you insight into how to get started with the Face API, how it works, how to call the API functions, and how to create an Azure account for an API subscription. In my next article, I will create an application using the Universal Windows Platform (UWP) to demonstrate Face API functionalities.
