Build a model to identify toxic statements and reduce bias in classification
Identification and Classification of Toxic comments using Machine Learning
NLP deep learning model for multilingual toxicity detection in text 📚
The repo contains notebooks for the Jigsaw Unintended Bias in Toxicity Classification contest hosted on Kaggle
A code solution to the Kaggle competition using basic classification techniques.
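As a rough illustration of the "basic classification techniques" such a solution might use, the sketch below trains a TF-IDF + logistic regression pipeline. The comments and labels are toy placeholders, not the actual Jigsaw competition data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-in comments; the real contest uses the Jigsaw dataset.
comments = [
    "you are an idiot", "I hate you so much", "what a stupid post",
    "have a great day", "thanks for sharing this", "interesting analysis",
]
labels = [1, 1, 1, 0, 0, 0]  # 1 = toxic, 0 = non-toxic

# TF-IDF features feeding a plain logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

print(model.predict(["have a great day"]))
```

Competition entries typically swap the toy data for the full training CSV and report ROC AUC via `sklearn.metrics.roc_auc_score` on held-out comments.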
This repository contains all the code for my Bachelor's thesis on the detection of toxic comments.
A REST API for detecting toxicity in a sentence, using TensorFlow.js in the backend to predict labels such as identity_attack, insult, obscene, severe_toxicity, sexual_explicit, and threat.
Toxic comment classification using Tensorflow and react.js
UoT-UWF-PartAI at SemEval2021 Task 5: Toxic Span Detection
This repository contains code for the paper: Cisco at SemEval-2021 Task 5: What’s Toxic?: Leveraging Transformers for Multiple Toxic Span Extraction from Online Comments
Final project for Laura Alonso Alemany's "Text Mining" course at FaMAF, UNC. 2021.
Application of a SparseEA-based feature selection method to toxicity classification
This project applies classification models to automate the detection of toxic comments on social media. The best-performing model is then served as a web app built with HuggingFace + Streamlit.
An AI to Scan for Toxic Tweets
VK bot with a neural network for toxic comment classification
A trained deep learning model that predicts different levels of toxicity in comments, such as threats, obscenity, insults, and identity-based hate.
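Predicting several toxicity types at once is a multi-label problem: each comment gets an independent binary indicator per label. A minimal sketch using scikit-learn's one-vs-rest wrapper (a stand-in for the repo's deep learning model; the label names and toy data here are illustrative assumptions):

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline

LABELS = ["threat", "obscene", "insult", "identity_hate"]  # mirrors the levels above

texts = [
    "I will hurt you",          # threat
    "this is obscene filth",    # obscene
    "you are a moron",          # insult
    "go back to your country",  # identity_hate
    "lovely weather today",     # clean
]
# One binary indicator column per label; a row can have several 1s.
y = np.array([
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [0, 0, 0, 0],
])

# One-vs-rest trains an independent binary classifier per toxicity label.
clf = make_pipeline(
    TfidfVectorizer(),
    OneVsRestClassifier(LogisticRegression()),
)
clf.fit(texts, y)

probs = clf.predict_proba(["you are a moron"])[0]
print(dict(zip(LABELS, probs.round(2))))
```

The deep learning version keeps the same output shape (one sigmoid per label) but replaces TF-IDF features with a learned text encoder.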
This repository contains the code for the paper: "DeToxy: A Large-Scale Multimodal Dataset for Toxicity Classification in Spoken Utterances"
Model that determines the level of toxicity of Russian and English messages
Anonymous bot with model for determining the level of toxicity of sentences in Russian and English