# Face analysis

A set of ML models that accurately detects human faces and predicts key characteristics such as facial landmarks, expression, ethnicity, age, and gender.

### Overview <a href="#overview" id="overview"></a>

The Face Analyzer consists of several cutting-edge ML models. Its primary function is to detect visible human faces in images and predict a set of key facial characteristics.

Leveraging state-of-the-art deep learning algorithms and neural networks, the Face Analyzer accurately identifies and analyzes faces, extracting the following information for each detected face:

* position in the image (bounding box);
* facial landmarks (coordinates of points that map to specific facial structures on the face);
* expression classification (happy, angry, sad, etc.);
* ethnicity classification;
* age estimation;
* gender classification.
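Grounded in the example API responses further down this page, the per-face payload can be modeled as a small typed structure. This is a sketch: field names are taken verbatim from the example responses, while the `[x1, y1, x2, y2]` ordering of the bounding box is an assumption the documentation does not state explicitly.

```python
from typing import Dict, List, TypedDict

class Face(TypedDict):
    """Shape of one face entry, as seen in the example API responses."""
    box: List[int]               # bounding box, assumed [x1, y1, x2, y2]
    emotions: Dict[str, float]   # per-class confidence, in percent
    dominant_emotion: str
    races: Dict[str, float]      # per-class confidence, in percent
    dominant_race: str
    age: int
    gender: str

# Values copied from face_0 of the first example response below
# (emotions/races abridged to two classes each).
face: Face = {
    "box": [467, 269, 576, 416],
    "emotions": {"happy": 98.85, "neutral": 0.16},
    "dominant_emotion": "happy",
    "races": {"white": 34.26, "middle eastern": 31.55},
    "dominant_race": "white",
    "age": 34,
    "gender": "Male",
}
```

Note that `dominant_emotion` and `dominant_race` simply name the highest-scoring class in the corresponding confidence map.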

### Use cases <a href="#use-cases" id="use-cases"></a>

Use cases for automatic facial analysis include:

* **Image tagging and organization** - Automatic facial analysis enables users to easily categorize and index images based on expressions, ethnicities, age groups and genders. This streamlines content management, making it easier to locate specific images for various purposes.
* **Inclusive representation** - The Face Analyzer can facilitate inclusive representation in media content. By analyzing facial ethnicities and genders, content creators can ensure diversity and cultural representation in their visual assets, thus promoting inclusivity.
* **Search optimization** - Automatic tagging and categorization of images based on facial characteristics allow users to find specific faces, expressions, or ethnicities with ease.

### API endpoints <a href="#api-endpoints" id="api-endpoints"></a>

An up-to-date reference with all API endpoints is available here:

[Scaleflex API for Digital Asset Management (DAM), Visual AI and Media Optimization](https://developers.scaleflex.com/#4979d84f-7d02-47b5-b849-4fbdec3022eb)
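As a rough illustration of what a call might look like, the sketch below assembles a request URL with the standard library. The base URL, endpoint path, and query-parameter name are hypothetical placeholders; the Postman reference linked above is the authoritative source for the actual endpoint definition and authentication.

```python
import urllib.parse

# Hypothetical values -- substitute the real ones from the API reference.
API_BASE = "https://api.scaleflex.com"
FACE_ANALYSIS_PATH = "/v2/face-analysis"

def build_request_url(image_url: str) -> str:
    """Assemble a face-analysis request URL for a publicly reachable image."""
    query = urllib.parse.urlencode({"url": image_url})
    return f"{API_BASE}{FACE_ANALYSIS_PATH}?{query}"

# A live call would then be a plain GET (authentication omitted):
#   import json, urllib.request
#   with urllib.request.urlopen(build_request_url("https://example.com/friends.jpg")) as r:
#       faces = json.load(r)["faces"]
```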

### Example API responses <a href="#example-api-responses" id="example-api-responses"></a>

<table data-full-width="true"><thead><tr><th>Input image</th><th>Input image</th></tr></thead><tbody><tr><td><img src="/files/aua46HbjckIxW461rb9u" alt="" data-size="original"></td><td><img src="/files/kYTpsiuwW8smvd8cT0z7" alt="" data-size="original"></td></tr><tr><td><strong>Output preview</strong></td><td><strong>Output preview</strong></td></tr><tr><td><img src="/files/mTpZUEMYntk3VQPDc6Hp" alt="" data-size="original"></td><td><img src="/files/zFQzOwDUjXVdeucH72yf" alt="" data-size="original"></td></tr><tr><td><strong>API response</strong></td><td><strong>API response</strong></td></tr><tr><td><pre class="language-json"><code class="lang-json">{
    "status": "success",
    "version": "2.9.3",
    "request_uuid": "4f74f084-39a3-48f7-829a-336b55e8fd13",
    "result": "https://fdocs.filerobot.com/https://ask.filerobot.com/deliver/fdocs/4f74f084-39a3-48f7-829a-336b55e8fd13.png",
    "faces": {
        "face_0": {
            "box": [
                467,
                269,
                576,
                416
            ],
            "emotions": {
                "angry": 0.07,
                "disgust": 0.03,
                "fear": 0.02,
                "happy": 98.85,
                "sad": 0.3,
                "surprise": 0.58,
                "neutral": 0.16
            },
            "dominant_emotion": "happy",
            "races": {
                "asian": 0.89,
                "indian": 3.23,
                "black": 0.42,
                "white": 34.26,
                "middle eastern": 31.55,
                "latino hispanic": 29.64
            },
            "dominant_race": "white",
            "age": 34,
            "gender": "Male"
        },
        "face_1": {
            "box": [
                907,
                53,
                1017,
                207
            ],
            "emotions": {
                "angry": 1.92,
                "disgust": 0.13,
                "fear": 1.65,
                "happy": 31.76,
                "sad": 9.86,
                "surprise": 3.49,
                "neutral": 51.2
            },
            "dominant_emotion": "neutral",
            "races": {
                "asian": 0.01,
                "indian": 0.02,
                "black": 0.0,
                "white": 89.42,
                "middle eastern": 5.25,
                "latino hispanic": 5.3
            },
            "dominant_race": "white",
            "age": 25,
            "gender": "Male"
        },
        "face_2": {
            "box": [
                62,
                254,
                172,
                398
            ],
            "emotions": {
                "angry": 4.9,
                "disgust": 0.56,
                "fear": 0.16,
                "happy": 4.81,
                "sad": 22.33,
                "surprise": 0.13,
                "neutral": 67.1
            },
            "dominant_emotion": "neutral",
            "races": {
                "asian": 0.01,
                "indian": 0.0,
                "black": 0.0,
                "white": 99.67,
                "middle eastern": 0.12,
                "latino hispanic": 0.2
            },
            "dominant_race": "white",
            "age": 32,
            "gender": "Female"
        },
        "face_3": {
            "box": [
                746,
                346,
                844,
                474
            ],
            "emotions": {
                "angry": 0.55,
                "disgust": 0.29,
                "fear": 0.15,
                "happy": 97.94,
                "sad": 0.09,
                "surprise": 0.92,
                "neutral": 0.06
            },
            "dominant_emotion": "happy",
            "races": {
                "asian": 3.73,
                "indian": 9.26,
                "black": 1.63,
                "white": 27.46,
                "middle eastern": 26.19,
                "latino hispanic": 31.73
            },
            "dominant_race": "latino hispanic",
            "age": 36,
            "gender": "Male"
        },
        "face_4": {
            "box": [
                268,
                143,
                367,
                267
            ],
            "emotions": {
                "angry": 0.91,
                "disgust": 0.01,
                "fear": 0.11,
                "happy": 7.24,
                "sad": 16.02,
                "surprise": 0.69,
                "neutral": 75.03
            },
            "dominant_emotion": "neutral",
            "races": {
                "asian": 0.0,
                "indian": 0.0,
                "black": 0.0,
                "white": 99.97,
                "middle eastern": 0.02,
                "latino hispanic": 0.02
            },
            "dominant_race": "white",
            "age": 23,
            "gender": "Male"
        },
        "face_5": {
            "box": [
                1131,
                272,
                1225,
                400
            ],
            "emotions": {
                "angry": 0.04,
                "disgust": 3.63,
                "fear": 0.02,
                "happy": 95.44,
                "sad": 0.2,
                "surprise": 0.05,
                "neutral": 0.63
            },
            "dominant_emotion": "happy",
            "races": {
                "asian": 0.0,
                "indian": 0.0,
                "black": 0.0,
                "white": 99.88,
                "middle eastern": 0.07,
                "latino hispanic": 0.05
            },
            "dominant_race": "white",
            "age": 24,
            "gender": "Female"
        }
    },
    "file_downloaded": "friends.jpg"
}
</code></pre></td><td><pre class="language-json"><code class="lang-json">{
    "status": "success",
    "version": "2.9.3",
    "request_uuid": "cf655b9a-7467-4d2c-8425-a12fbb600af4",
    "result": "https://fdocs.filerobot.com/https://ask.filerobot.com/deliver/fdocs/cf655b9a-7467-4d2c-8425-a12fbb600af4.png",
    "faces": {
        "face_0": {
            "box": [
                630,
                161,
                940,
                573
            ],
            "emotions": {
                "angry": 0.9,
                "disgust": 0.29,
                "fear": 0.09,
                "happy": 2.27,
                "sad": 3.59,
                "surprise": 0.09,
                "neutral": 92.79
            },
            "dominant_emotion": "neutral",
            "races": {
                "asian": 0.03,
                "indian": 0.06,
                "black": 0.0,
                "white": 84.59,
                "middle eastern": 7.62,
                "latino hispanic": 7.7
            },
            "dominant_race": "white",
            "age": 26,
            "gender": "Female"
        }
    },
    "file_downloaded": "face.png"
}
</code></pre></td></tr></tbody></table>
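The `faces` object in these responses can be consumed with a few lines of standard-library Python. The sketch below parses an excerpt of the first example response above, assuming the `box` array is ordered `[x1, y1, x2, y2]` (the documentation does not state the ordering explicitly).

```python
import json

# Excerpt of the first example API response above (face_0 only).
response_text = """
{
  "status": "success",
  "faces": {
    "face_0": {
      "box": [467, 269, 576, 416],
      "emotions": {"angry": 0.07, "disgust": 0.03, "fear": 0.02,
                   "happy": 98.85, "sad": 0.3, "surprise": 0.58, "neutral": 0.16},
      "dominant_emotion": "happy",
      "dominant_race": "white",
      "age": 34,
      "gender": "Male"
    }
  }
}
"""

data = json.loads(response_text)
for name, face in data["faces"].items():
    x1, y1, x2, y2 = face["box"]  # assumed [x1, y1, x2, y2] order
    top_emotion = max(face["emotions"], key=face["emotions"].get)
    print(f"{name}: {top_emotion}, age {face['age']}, {face['gender']}, "
          f"box {x2 - x1}x{y2 - y1}px")
```

Recomputing the top emotion from the confidence map, as above, always agrees with the `dominant_emotion` field returned by the API.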


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.scaleflex.com/visual-ai/visual-ai/images/classification-models/faces/face-analysis.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
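Building such a query URL is a one-liner with the standard library; the sketch below URL-encodes the question into the `ask` parameter (the page URL is the one documented above):

```python
import urllib.parse

PAGE_URL = ("https://docs.scaleflex.com/visual-ai/visual-ai/images/"
            "classification-models/faces/face-analysis.md")

def ask_url(question: str) -> str:
    """Build a documentation-query URL for a natural-language question."""
    return PAGE_URL + "?" + urllib.parse.urlencode({"ask": question})

# A GET on the resulting URL returns the answer, e.g.:
#   import urllib.request
#   urllib.request.urlopen(ask_url("Which emotions does the model classify?"))
```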
