
Natural Language Processing (572 posts)

[2022-11-16] Today's NLP: "MT4SSL: Boosting Self-Supervised Speech Representation Learning by Integrating Multiple Targets" In this paper, we provide a new perspective on self-supervised speech models based on how the self-training targets are obtained. We generalize the targets extractor into an Offline Targets Extractor (Off-TE) and an Online Targets Extractor (On-TE), without caring about specific pretext tasks. Based on this, we..
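The Off-TE/On-TE split is easiest to see as an interface. The sketch below is a guess at what such a split could look like in PyTorch, assuming a frozen model for the offline case and an EMA teacher for the online case; only the Off-TE/On-TE names come from the abstract, everything else (class names, methods, the EMA choice) is an assumption, not the paper's actual design.

```python
import copy
import torch
import torch.nn as nn


class OfflineTargetsExtractor:
    """Off-TE (assumed reading): targets come from a frozen model, fixed before training."""

    def __init__(self, frozen_model: nn.Module):
        self.model = frozen_model.eval()

    @torch.no_grad()
    def extract(self, speech: torch.Tensor) -> torch.Tensor:
        return self.model(speech)


class OnlineTargetsExtractor:
    """On-TE (assumed reading): targets come from an EMA copy of the student, updated each step."""

    def __init__(self, student: nn.Module, decay: float = 0.999):
        self.teacher = copy.deepcopy(student).eval()
        self.decay = decay

    @torch.no_grad()
    def update(self, student: nn.Module) -> None:
        # Exponential moving average of the student's parameters.
        for t, s in zip(self.teacher.parameters(), student.parameters()):
            t.mul_(self.decay).add_(s, alpha=1.0 - self.decay)

    @torch.no_grad()
    def extract(self, speech: torch.Tensor) -> torch.Tensor:
        return self.teacher(speech)


# Toy usage: both extractors expose the same extract() call, so a pretext
# task can consume targets without knowing how they were produced.
student = nn.Linear(80, 32)                      # stand-in for a speech encoder
off_te = OfflineTargetsExtractor(nn.Linear(80, 32))
on_te = OnlineTargetsExtractor(student)

speech = torch.randn(4, 80)                      # (batch, features), a toy input
off_targets = off_te.extract(speech)
on_te.update(student)
on_targets = on_te.extract(speech)
```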
[2022-11-15] Today's NLP: "Helping the Weak Makes You Strong: Simple Multi-Task Learning Improves Non-Autoregressive Translators" Recently, non-autoregressive (NAR) neural machine translation models have received increasing attention due to their efficient parallel decoding. However, the probabilistic framework of NAR models necessitates a conditional independence assumption on target sequences, falling short of characterizi..
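The conditional independence assumption the abstract refers to is concrete enough to show in a toy example. Below, a random tensor stands in for a decoder's one-pass output and a deterministic toy function stands in for an autoregressive decoder; neither is the paper's model, the point is only the factorization difference between p(y|x) = ∏ p(y_t|x) (parallel, independent positions) and p(y|x) = ∏ p(y_t|y_<t, x) (sequential).

```python
import torch

torch.manual_seed(0)
vocab, tgt_len = 8, 5

# Stand-in for a NAR decoder's output: logits for every target position,
# produced in a single parallel pass.
parallel_logits = torch.randn(tgt_len, vocab)

# Non-autoregressive decoding: each position is argmax'ed independently.
# No position sees what the others chose; that is the conditional
# independence assumption on target sequences.
nar_tokens = parallel_logits.argmax(dim=-1)
print("NAR (one parallel step):", nar_tokens.tolist())

# Autoregressive decoding: each step re-reads the prefix, so position t
# depends on positions < t. The toy step function is a deterministic
# stand-in for a real decoder.
def toy_step(prefix):
    torch.manual_seed(len(prefix))
    return torch.randn(vocab)

ar_tokens = []
for _ in range(tgt_len):
    ar_tokens.append(int(toy_step(ar_tokens).argmax()))
print("AR (sequential steps):  ", ar_tokens)
```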
[2022-11-14] Today's NLP: "BERT on a Data Diet: Finding Important Examples by Gradient-Based Pruning" Current pre-trained language models rely on large datasets for achieving state-of-the-art performance. However, past research has shown that not all examples in a dataset are equally important during training. In fact, it is sometimes possible to prune a considerable fraction of the training set while maintaining the test ..
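The gradient-based pruning idea can be sketched as scoring each training example by the norm of its loss gradient and keeping only the highest-scoring fraction. The sketch below assumes a GraNd-style score and uses a toy linear classifier in place of BERT; the paper's exact criterion and pruning schedule may differ.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy classifier and data, standing in for BERT and its training set.
model = nn.Linear(16, 4)
loss_fn = nn.CrossEntropyLoss()
X, y = torch.randn(100, 16), torch.randint(0, 4, (100,))


def grad_norm_score(x_i, y_i):
    """Norm of the loss gradient w.r.t. the parameters for one example
    (a GraNd-style importance score; an assumption, not the paper's exact one)."""
    model.zero_grad()
    loss_fn(model(x_i.unsqueeze(0)), y_i.unsqueeze(0)).backward()
    return torch.cat([p.grad.flatten() for p in model.parameters()]).norm().item()


scores = torch.tensor([grad_norm_score(X[i], y[i]) for i in range(len(X))])

# Prune the lowest-scoring half of the training set and keep the rest.
keep = scores.argsort(descending=True)[: len(X) // 2]
X_pruned, y_pruned = X[keep], y[keep]
print(f"kept {len(keep)} of {len(X)} examples")
```

In practice the score would be computed on the full model once (or a few times early in training), which is what makes pruning cheaper than training on the whole set.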
[2022-11-13] Today's NLP: "An Inclusive Notion of Text" Natural language processing researchers develop models of grammar, meaning, and human communication based on written text. Due to task and data differences, what is considered text can vary substantially across studies. A conceptual framework for systematically capturing these differences is lacking. We argue that clarity on the notion of text is crucial for reproducib..