Cloud-Based Offensive Code Mixed Text Classification Using Hierarchical Attention Network

Durga Karthik, Rajeswari Natarajan, R. Bhavani, D. Rajalakshmi
Copyright: © 2024 | Pages: 14
ISBN13: 9798369316948 | ISBN13 Softcover: 9798369344507 | EISBN13: 9798369316955
DOI: 10.4018/979-8-3693-1694-8.ch012
Cite Chapter

MLA

Karthik, Durga, et al. "Cloud-Based Offensive Code Mixed Text Classification Using Hierarchical Attention Network." Advanced Applications in Osmotic Computing, edited by G. Revathy, IGI Global, 2024, pp. 224-237. https://doi.org/10.4018/979-8-3693-1694-8.ch012

APA

Karthik, D., Natarajan, R., Bhavani, R., & Rajalakshmi, D. (2024). Cloud-Based Offensive Code Mixed Text Classification Using Hierarchical Attention Network. In G. Revathy (Ed.), Advanced Applications in Osmotic Computing (pp. 224-237). IGI Global. https://doi.org/10.4018/979-8-3693-1694-8.ch012

Chicago

Karthik, Durga, et al. "Cloud-Based Offensive Code Mixed Text Classification Using Hierarchical Attention Network." In Advanced Applications in Osmotic Computing, edited by G. Revathy, 224-237. Hershey, PA: IGI Global, 2024. https://doi.org/10.4018/979-8-3693-1694-8.ch012

Abstract

The use of code-mixed language on social media has increased, making the detection of abusive and offensive content a pressing need. A hierarchical attention network (HAN) is employed to classify offensive content at both the word and sentence levels. Tweets from the ThingSpeak cloud containing annotated Tamil-English code-mixed text are used as the training set for the HAN model. The attention mechanism captures significance at both the word and sentence levels. The model, trained with a cross-entropy loss function and the backpropagation algorithm, classifies offensive code-mixed text with an accuracy of 0.58. The same model can also be employed to classify text in other mixed languages.
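
The abstract describes word-level and sentence-level attention trained with cross-entropy loss and backpropagation. The sketch below is a minimal, hypothetical PyTorch rendering of such a hierarchical attention network; the layer sizes, bidirectional GRU encoders, binary output, and dummy batch are illustrative assumptions, not the authors' implementation.

# Minimal HAN sketch in PyTorch. All dimensions and names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Attention(nn.Module):
    """Additive attention pooling over a sequence of hidden states."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, h):                       # h: (batch, seq_len, hidden_dim)
        u = torch.tanh(self.proj(h))
        a = F.softmax(self.context(u), dim=1)   # attention weights over the sequence
        return (a * h).sum(dim=1)               # weighted sum: (batch, hidden_dim)

class HAN(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.word_gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.word_attn = Attention(2 * hidden_dim)
        self.sent_gru = nn.GRU(2 * hidden_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.sent_attn = Attention(2 * hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, docs):                    # docs: (batch, num_sents, num_words) word ids
        b, s, w = docs.shape
        words = self.embed(docs.view(b * s, w))             # encode each sentence's words
        word_h, _ = self.word_gru(words)
        sent_vecs = self.word_attn(word_h).view(b, s, -1)   # word attention -> sentence vectors
        sent_h, _ = self.sent_gru(sent_vecs)
        doc_vec = self.sent_attn(sent_h)                    # sentence attention -> document vector
        return self.classifier(doc_vec)                     # logits: offensive / not offensive

# Training step with cross-entropy loss and backpropagation, as in the abstract.
model = HAN(vocab_size=20000)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

docs = torch.randint(1, 20000, (8, 5, 20))    # dummy batch: 8 docs, 5 sentences, 20 words
labels = torch.randint(0, 2, (8,))
loss = criterion(model(docs), labels)
loss.backward()
optimizer.step()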
