
Papers (572)

[2022-09-16] Today's NLP: CNN-Trans-Enc: A CNN-Enhanced Transformer-Encoder On Top Of Static BERT representations for Document Classification. Although BERT achieves remarkable results in text classification tasks, it is not yet fully exploited, since only the last layer is used as the representation output for downstream classifiers. The most recent studies on the nature of linguistic features learned by BERT suggest that differen..
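The abstract's point, that classifiers usually see only the last BERT layer, can be made concrete with a small sketch. The PyTorch code below is not the paper's CNN-Trans-Enc model; it is a minimal CNN classifier over frozen ("static") hidden states taken from several BERT layers, with random tensors standing in for cached BERT outputs, and the layer count, filter sizes, and class count are assumed values.

```python
# Minimal PyTorch sketch (not the paper's implementation): a 1-D CNN classifier
# that reads frozen BERT hidden states from several layers instead of only the
# last one. Shapes follow BERT-base: hidden size 768; 4 layers are assumed here.
import torch
import torch.nn as nn

class CNNOverStaticBERT(nn.Module):
    def __init__(self, hidden_size=768, num_layers_used=4, num_classes=2,
                 num_filters=128, kernel_sizes=(3, 4, 5)):
        super().__init__()
        in_channels = hidden_size * num_layers_used  # concatenate selected layers
        self.convs = nn.ModuleList(
            nn.Conv1d(in_channels, num_filters, k) for k in kernel_sizes
        )
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, layer_states):
        # layer_states: (batch, num_layers_used, seq_len, hidden_size),
        # precomputed once with a frozen BERT (hence "static" representations).
        b, l, t, h = layer_states.shape
        x = layer_states.permute(0, 1, 3, 2).reshape(b, l * h, t)  # channels-first
        feats = [torch.relu(conv(x)).max(dim=-1).values for conv in self.convs]
        return self.classifier(torch.cat(feats, dim=-1))

# Dummy tensors standing in for cached BERT outputs: 8 documents, 64 tokens each.
model = CNNOverStaticBERT()
logits = model(torch.randn(8, 4, 64, 768))   # -> (8, 2)
```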
[2022-09-15] Today's NLP: Non-Parametric Temporal Adaptation for Social Media Topic Classification. User-generated social media data is constantly changing as new trends influence online discussion, causing distribution shift in the test data for social media NLP applications. In addition, training data is often subject to change as user data is deleted. Most current NLP systems are static and rely on fixed training data. As ..
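As a rough illustration of the non-parametric idea hinted at in the abstract, predictions drawn from an updatable datastore rather than from fixed weights, here is a hedged Python sketch of a k-nearest-neighbour topic classifier whose memory can be extended with new trends and purged of deleted user data without retraining. The encoder, similarity measure, and label names are assumptions, not the paper's method.

```python
# Hedged sketch of a non-parametric classifier: predictions come from a kNN
# lookup over an external store of labelled embeddings, so new examples can be
# appended and deleted user data removed without retraining any weights.
# Random vectors stand in for a frozen sentence encoder's embeddings.
import numpy as np

class KNNTopicClassifier:
    def __init__(self, k=5):
        self.k = k
        self.ids, self.keys, self.labels = [], [], []

    def add(self, example_id, embedding, label):
        self.ids.append(example_id)
        self.keys.append(embedding)
        self.labels.append(label)

    def delete(self, example_ids):
        # Honour deletion requests by dropping entries; no retraining needed.
        keep = [i for i, eid in enumerate(self.ids) if eid not in example_ids]
        self.ids = [self.ids[i] for i in keep]
        self.keys = [self.keys[i] for i in keep]
        self.labels = [self.labels[i] for i in keep]

    def predict(self, embedding):
        keys = np.stack(self.keys)
        sims = keys @ embedding / (
            np.linalg.norm(keys, axis=1) * np.linalg.norm(embedding) + 1e-9
        )
        top = np.argsort(-sims)[: self.k]
        votes = [self.labels[i] for i in top]
        return max(set(votes), key=votes.count)   # majority vote

rng = np.random.default_rng(0)
clf = KNNTopicClassifier(k=3)
for i in range(20):
    clf.add(i, rng.normal(size=64), f"topic_{i % 4}")
print(clf.predict(rng.normal(size=64)))
clf.delete({0, 1, 2})                 # simulate deleted user data
```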
[2022-09-14] Today's NLP: Stability of Syntactic Dialect Classification Over Space and Time. This paper analyses the degree to which dialect classifiers based on syntactic representations remain stable over space and time. While previous work has shown that the combination of grammar induction and geospatial text classification produces robust dialect models, we do not know what influence both changing grammars and changi..
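One simple way to make "stability over time" concrete is to fit a dialect classifier on one time slice and score it on a later one; the drop from in-period to cross-period accuracy is a basic stability signal. The sketch below does this with synthetic vectors standing in for induced syntactic features, so it is illustrative only and not the paper's evaluation protocol.

```python
# Illustrative sketch only: train on one time slice, evaluate on a later one.
# Synthetic "syntactic feature" vectors replace induced grammar features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_slice(n=400, n_feats=50, drift=0.0):
    # Two "dialects" whose feature distribution shifts over time (drift).
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, n_feats)) + y[:, None] * 0.8 + drift
    return X, y

X_early, y_early = make_slice(drift=0.0)
X_late, y_late = make_slice(drift=0.3)

clf = LogisticRegression(max_iter=1000).fit(X_early, y_early)
print("same period accuracy :", clf.score(X_early, y_early))
print("later period accuracy:", clf.score(X_late, y_late))
```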
[2022-09-13] Today's NLP: Joint Alignment of Multi-Task Feature and Label Spaces for Emotion Cause Pair Extraction. Emotion cause pair extraction (ECPE), as one of the derived subtasks of emotion cause analysis (ECA), shares rich inter-related features with emotion extraction (EE) and cause extraction (CE). Therefore, EE and CE are frequently utilized as auxiliary tasks for better feature learning, modeled via multi-task l..
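The multi-task setup described in the abstract, with EE and CE as auxiliary tasks alongside the main pairing task, can be sketched as a shared clause encoder feeding three heads trained under a joint loss. The PyTorch code below is an assumed generic architecture (the encoder type, head design, and loss weights are placeholders), not the paper's model.

```python
# Minimal multi-task sketch: a shared clause encoder feeds three heads, so the
# auxiliary EE/CE losses shape the features used for emotion-cause pairing.
import torch
import torch.nn as nn

class SharedECPE(nn.Module):
    def __init__(self, in_dim=768, hid=256):
        super().__init__()
        self.encoder = nn.GRU(in_dim, hid, batch_first=True, bidirectional=True)
        self.ee_head = nn.Linear(2 * hid, 1)                 # emotion clause?
        self.ce_head = nn.Linear(2 * hid, 1)                 # cause clause?
        self.pair_head = nn.Bilinear(2 * hid, 2 * hid, 1)    # clause i paired with clause j?

    def forward(self, clause_embs):
        h, _ = self.encoder(clause_embs)                     # (batch, clauses, 2*hid)
        ee = self.ee_head(h).squeeze(-1)
        ce = self.ce_head(h).squeeze(-1)
        n = h.size(1)
        hi = h.unsqueeze(2).expand(-1, -1, n, -1).contiguous()  # clause i over j
        hj = h.unsqueeze(1).expand(-1, n, -1, -1).contiguous()  # clause j over i
        pairs = self.pair_head(hi, hj).squeeze(-1)
        return ee, ce, pairs

model = SharedECPE()
docs = torch.randn(2, 6, 768)                       # 2 documents, 6 clauses each
ee_logits, ce_logits, pair_logits = model(docs)     # pair_logits: (2, 6, 6)

# Joint loss: main pairing loss plus weighted auxiliary EE/CE losses (dummy targets).
bce = nn.BCEWithLogitsLoss()
loss = bce(pair_logits, torch.zeros(2, 6, 6)) \
     + 0.5 * (bce(ee_logits, torch.zeros(2, 6)) + bce(ce_logits, torch.zeros(2, 6)))
```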