Hyper-HaBERTor: a light-weight pretrained hatespeech language detector using hypercomplex space

Author: Thanh Tran; Kyumin Lee

Advisor: Kyumin Lee

Category: Graduate

The occurrence of hatespeech has been increasing. One reason is that social media is gradually replacing traditional media as many people's main source of information. Social media makes it easy to reach a large audience quickly, which increases the temptation for inappropriate behaviors such as hatespeech and the potential damage to social systems. In particular, hatespeech interferes with civil discourse and drives good people away. Furthermore, hatespeech in the virtual world can lead to physical violence against certain groups in the real world, and thus should not be ignored on the grounds of freedom of speech.
To detect hatespeech, researchers have developed classifiers based on hand-crafted features and proposed deep neural network architectures. However, these approaches may not capture all of the features important for hatespeech detection, ignore the language understanding available in pretrained language models, or rely on uni-directional language models that read text only left-to-right or right-to-left.
Recently, the BERT (Bidirectional Encoder Representations from Transformers) model has achieved tremendous success in Natural Language Processing. However, even the pretrained BERT-base model is heavy and expensive in the inference phase.
Hence, in this work, we propose to pretrain a Hateful Language Model (HLM) that (i) better understands hatespeech-related language by being pretrained from scratch on a hateful corpus, and (ii) is light-weight thanks to the hypercomplex space. As a result, our HLM contains only 18M parameters, compared to 110M in the BERT-base model, while achieving significantly better performance than fine-tuning BERT-base on hatespeech datasets.
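The parameter savings from the hypercomplex space can be illustrated with a generic quaternion linear layer, where four small weight blocks are shared across the full weight matrix via the Hamilton product layout. This is only a minimal sketch of the general technique, not the authors' exact architecture; all names and dimensions below are illustrative.

```python
import numpy as np

def quaternion_weight(r, i, j, k):
    # Assemble the full weight matrix from four shared blocks using the
    # Hamilton product layout: each block is reused in four positions,
    # so the layer stores only 1/4 of the parameters of a dense layer.
    return np.block([
        [r, -i, -j, -k],
        [i,  r, -k,  j],
        [j,  k,  r, -i],
        [k, -j,  i,  r],
    ])

in_dim, out_dim = 64, 128  # both must be divisible by 4
br, bi, bj, bk = (np.random.randn(in_dim // 4, out_dim // 4) for _ in range(4))

W = quaternion_weight(br, bi, bj, bk)          # shape (in_dim, out_dim)
dense_params = in_dim * out_dim                # parameters of a standard linear layer
quat_params = 4 * (in_dim // 4) * (out_dim // 4)  # 4x fewer stored parameters

x = np.random.randn(2, in_dim)  # a toy batch of two input vectors
y = x @ W                       # forward pass, shape (2, out_dim)
```

Applied throughout a Transformer's linear projections, this kind of weight sharing is what makes an 18M-parameter model plausible against a 110M-parameter dense baseline.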
