Griffeye Brain CSA Classifier is a forensics tool that aims to detect and rescue children pictured in child sexual exploitation material.

The filtering techniques can detect skin tones — even in low light or poor-quality video — then map connected regions containing skin tones to determine whether a subject is nude.
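Griffeye has not published its algorithm, but the general idea can be sketched as a two-step process: classify each pixel as skin-toned or not, then group skin pixels into connected regions. The sketch below uses a classic rule-of-thumb RGB skin test and a breadth-first flood fill; all function names are illustrative, and a production system would use a trained model robust to low light rather than fixed thresholds.

```python
from collections import deque

def is_skin(r, g, b):
    """Classic rule-based RGB skin test (an illustrative heuristic only;
    real classifiers are trained models robust to lighting conditions)."""
    return (r > 95 and g > 40 and b > 20 and
            r > g and r > b and abs(r - g) > 15)

def skin_regions(image):
    """Group skin-toned pixels into 4-connected regions.
    `image` is a 2-D grid of (r, g, b) tuples; returns region sizes,
    largest first, which a downstream stage could compare against body
    shapes to estimate nudity."""
    h, w = len(image), len(image[0])
    mask = [[is_skin(*px) for px in row] for row in image]
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # breadth-first flood fill over the skin mask
                size, queue = 0, deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    size += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                sizes.append(size)
    return sorted(sizes, reverse=True)
```

Mapping connected regions rather than counting raw skin pixels is what lets this kind of filter distinguish a large contiguous area of exposed skin from scattered small patches such as a face and hands.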

This technique will be paired with systems that can extract facial features and perform spatial and textural analysis to determine if the face belongs to an adult or a child.

The system will automatically assess body motions to determine whether the content is explicit. The AI scans through previously unseen footage and suggests images that it believes depict child sexual abuse content. The AI outputs a score that investigators can use to judge whether a file is pertinent to the investigation.
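The scoring step can be pictured as a simple triage: files whose classifier score exceeds a threshold are queued for human review, ranked so the most likely matches appear first. The threshold value, field layout, and function name below are assumptions for illustration, not Griffeye's actual API.

```python
def triage(files, threshold=0.8):
    """Split scored files into (flagged, cleared) lists.
    `files` is an iterable of (filename, score) pairs, where score is the
    classifier's 0-1 estimate that the file is pertinent. Flagged files
    are returned highest-score first so reviewers see the most likely
    matches at the top of the queue."""
    flagged = sorted((f for f in files if f[1] >= threshold),
                     key=lambda f: f[1], reverse=True)
    cleared = [f for f in files if f[1] < threshold]
    return flagged, cleared
```

In practice the threshold trades recall against reviewer workload: lowering it surfaces more borderline files at the cost of more material for a human examiner to clear.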

More information on the software can be found here.


Oyoty App
Online Tools

Oyoty is a chatbot aiming to empower children to be safer online. The app runs on the child’s phone and detects threats and risks on smart devices and social networks using Artificial Intelligence (AI). Oyoty alerts the child to risks and th...

Stop Slavery Resource Hub
Online Tools

This resource hub is intended to provide a central repository for resources on understanding the risk of modern slavery to business, modern slavery policy and legislation, and how this can be applied to the hotel industry.

Stop the Traffik App
Online Tools

A tool to enable people across the world to join the fight against human trafficking and modern slavery. More information on the app can be found here.

Project Arachnid
Online Tools

The Project Arachnid platform was initially designed to crawl links on sites previously reported to Cybertip.ca that contained CSAM and detect where these images/videos are being made publicly available. Once child sexual abuse material is detected...