# NSFW Detection

> Learn how to use the JigsawStack NSFW Detection API to identify inappropriate images

## Overview

The NSFW (Not Safe For Work) Detection API helps you identify potentially inappropriate images, including those containing nudity, gore, or other explicit content. This API is essential for applications that need to maintain content standards, protect users from offensive material, or comply with platform guidelines.

* High-accuracy detection of explicit visual content
* Detection across multiple content categories (nudity, gore, etc.)
* Fast analysis response times
* Simple integration with any image source
* Confidence scores for detailed risk assessment

## API Endpoint

```
POST /v1/validate/nsfw
```
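If you prefer not to use the SDK, the endpoint can also be called directly over HTTP. A minimal sketch, assuming the API accepts a JSON body with a `url` field and authenticates via an `x-api-key` header (check the API reference for the exact request contract):

```javascript JavaScript
// Direct HTTP call to the NSFW validation endpoint (no SDK).
// Assumptions: JSON body with a `url` field, `x-api-key` auth header.
async function checkNSFW(imageUrl, apiKey) {
  const response = await fetch("https://api.jigsawstack.com/v1/validate/nsfw", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-api-key": apiKey,
    },
    body: JSON.stringify({ url: imageUrl }),
  });
  if (!response.ok) {
    throw new Error(`NSFW check failed with status ${response.status}`);
  }
  return response.json();
}
```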

## Quick Start

```javascript JavaScript
import { JigsawStack } from "jigsawstack";

const jigsawstack = JigsawStack({
  apiKey: "your-api-key",
});

const url = "https://jigsawstack.com/preview/nsfw-example.jpg";
const result = await jigsawstack.validate.nsfw(url);

console.log(result);
```

## Response Example

```json
{
  "success": true,
  "nsfw": false,
  "nudity": false,
  "gore": false,
  "nudity_score": 0.005777647718787193,
  "nsfw_score": 0.004729619482532144,
  "gore_score": 0.003681591246277094
}
```
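The boolean fields give a ready-made verdict, while the `*_score` fields let you apply your own risk tolerance. A minimal sketch of threshold-based filtering (the `isFlagged` helper and the `0.5` default are illustrative, not part of the API):

```javascript JavaScript
// Hypothetical helper: flag an image when any category score exceeds a threshold.
function isFlagged(result, threshold = 0.5) {
  if (!result.success) {
    throw new Error("NSFW check did not complete successfully");
  }
  return (
    result.nsfw_score >= threshold ||
    result.nudity_score >= threshold ||
    result.gore_score >= threshold
  );
}

// Using the response shown above:
const result = {
  success: true,
  nsfw: false,
  nudity: false,
  gore: false,
  nudity_score: 0.005777647718787193,
  nsfw_score: 0.004729619482532144,
  gore_score: 0.003681591246277094,
};

console.log(isFlagged(result)); // → false (all scores well below 0.5)
```

Lowering the threshold trades more false positives for fewer missed detections; tune it to your platform's policy.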

<Note>Find more information on NSFW Detection API [here](/docs/api-reference/validate/nsfw)</Note>
