AI Content Moderation v2 via API

The AI Content Moderation API offers a powerful solution for analyzing video content to detect various categories of inappropriate material. Leveraging state-of-the-art AI models, this API ensures real-time analysis and flagging of sensitive or restricted content types, making it an essential tool for platforms requiring stringent content moderation.

A new single method aggregates the available analysis models and lets you run any of them through one request. The following model categories are supported:

  • nsfw: Fast algorithm that detects pornographic material and classifies content as "not-safe-for-work" or normal.

  • hard_nudity: Detailed video analysis that detects explicit nudity involving genitalia.

  • soft_nudity: Detailed video analysis that reveals both explicit and partial nudity, including the presence of male and female faces and other uncovered body parts.

  • child_pornography: Detects content related to child pornography.

  • sport: Recognizes various sporting activities.

  • weapon: Identifies the presence of weapons in the video content.

API: https://api.gcore.com/docs/streaming#tag/AI/operation/post_ai_contentmoderation
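
The sketch below shows how a moderation task for one of these categories might be submitted from Python. The endpoint path, the field names (category, origin_url), and the authorization header format are assumptions for illustration only; confirm them against the API reference linked above.

```python
# Minimal sketch of submitting a video for AI content moderation.
# The endpoint path, request fields, and auth scheme below are assumptions --
# check the API reference (post_ai_contentmoderation) for the exact schema.
import requests

API_URL = "https://api.gcore.com/streaming/ai/tasks"  # hypothetical path, see docs
API_KEY = "YOUR_API_KEY"

payload = {
    # one of: nsfw, hard_nudity, soft_nudity, child_pornography, sport, weapon
    "category": "nsfw",
    # publicly accessible URL of the source video to analyze
    "origin_url": "https://example.com/video.mp4",
}

response = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"APIKey {API_KEY}"},
)
response.raise_for_status()

task = response.json()
print(task)  # typically a task identifier you can poll for the analysis result
```

Because analysis runs asynchronously, the response is expected to contain a task reference rather than the final verdict; the result is then retrieved once processing completes.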