Mirror of https://github.com/roostorg/awesome-safety-tools


Content Safety API: expand details/rationale

Since Coop is released as open source and includes Content Safety API integration, it makes more sense to add this entry now, so long as we mention that rationale.

Signed-off-by: Cassidy James Blaede <cassidyjames@roost.tools>

Authored by Cassidy James Blaede, committed by GitHub

acf069f7 0dc9ff3d

+3 -2
README.md
```diff
@@ -36,8 +36,9 @@
 ## Classification
 
 * [Content Safety API by Google](https://protectingchildren.google/tools-for-partners/#learn-about-our-tools)
-  * uses machine learning to detect child sexual abuse material (CSAM), nudity, and sexually explicit content in images and videos
-  * free service, but requires registration and not open source
+  * uses machine learning to detect novel CSAM, nudity, and sexually explicit content in images and videos
+  * free service, but requires registration
+  * not open source itself, but can be [used via Coop](https://roostorg.github.io/coop/SIGNALS.html#content-safety-api-by-google), which is open source
 * [CoPE by Zentropi](https://huggingface.co/zentropi-ai/cope-a-9b)
   * small language model trained for accurate, fast, steerable content classification based on developer-defined content policies
 * [Detoxify by Unitary AI](https://github.com/unitaryai/detoxify)
```