We have released our MentalBERTa model and pretraining data on the Hugging Face Hub.
pip install -r docker/requirements.txt
All experiments were run on a single V100 GPU (32 GB).
@inproceedings{garcia-etal-2023-deeplearningbrasil,
    title = "{D}eep{L}earning{B}rasil@{LT}-{EDI}-2023: Exploring Deep Learning Techniques for Detecting Depression in Social Media Text",
    author = "Garcia, Eduardo and
      Gomes, Juliana and
      Barbosa Junior, Adalberto Ferreira and
      Borges, Cardeque Henrique Bittes de Alvarenga and
      da Silva, Nadia F{\'e}lix Felipe",
    editor = "Chakravarthi, Bharathi R. and
      Bharathi, B. and
      Griffith, Josephine and
      Bali, Kalika and
      Buitelaar, Paul",
    booktitle = "Proceedings of the Third Workshop on Language Technology for Equality, Diversity and Inclusion",
    month = sep,
    year = "2023",
    address = "Varna, Bulgaria",
    publisher = "INCOMA Ltd., Shoumen, Bulgaria",
    url = "https://aclanthology.org/2023.ltedi-1.42",
    pages = "272--278",
}
This work was supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial, CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG).