
[2022-10-12] Today's NLP
Robustification of Multilingual Language Models to Real-world Noise with Robust Contrastive Pretraining
Advances in neural modeling have achieved state-of-the-art (SOTA) results on public natural language processing (NLP) benchmarks, at times surpassing human performance. However, there is a gap between public benchmarks and real-world applications, where noise such as typos or grammatical mistakes …
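The core trick named in the title is contrastive pretraining against synthetic noise. Below is a minimal PyTorch sketch of that general idea: pull a clean sentence and a typo-noised copy of it together in embedding space, with the other sentences in the batch as negatives. The noising function, the choice of `xlm-roberta-base`, mean pooling, and the InfoNCE temperature are all illustrative assumptions, not the paper's exact recipe.

```python
# Illustrative sketch (not the paper's method): contrastive pretraining that
# pulls a clean sentence and a typo-noised copy together in embedding space.
import random
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def add_typo_noise(text: str, p: float = 0.1) -> str:
    """Randomly drop or swap adjacent characters to mimic real-world typos."""
    chars, out, i = list(text), [], 0
    while i < len(chars):
        r = random.random()
        if r < p / 2:                        # drop this character
            i += 1
        elif r < p and i + 1 < len(chars):   # swap with the next character
            out += [chars[i + 1], chars[i]]
            i += 2
        else:
            out.append(chars[i])
            i += 1
    return "".join(out)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModel.from_pretrained("xlm-roberta-base")

def embed(sentences):
    """Mean-pooled sentence embeddings from the encoder."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    hidden = model(**batch).last_hidden_state        # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)     # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)

sentences = [
    "The quick brown fox jumps over the lazy dog.",
    "Public benchmarks rarely contain typos.",
]
clean = embed(sentences)
noisy = embed([add_typo_noise(s) for s in sentences])

# InfoNCE: each clean sentence should score highest against its own noised
# copy; the other rows in the batch serve as in-batch negatives.
logits = F.cosine_similarity(clean.unsqueeze(1), noisy.unsqueeze(0), dim=-1) / 0.05
loss = F.cross_entropy(logits, torch.arange(len(sentences)))
```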
[2022-10-11] Today's NLP
PARAGEN: A Parallel Generation Toolkit
PARAGEN is a PyTorch-based NLP toolkit for further development on parallel generation. PARAGEN provides thirteen types of customizable plugins, helping users experiment quickly with novel ideas across model architectures, optimization, and learning strategies. We implement various features, such as unlimited data loading and automatic model selection, …
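The abstract does not spell out how the plugins work, but toolkits of this kind typically expose components through a registry that decorators populate and a config-driven factory reads. Here is a hypothetical sketch of that pattern; `register`, `build`, and the class names are illustrative inventions, not PARAGEN's actual API.

```python
# Hypothetical sketch of the registry pattern plugin-based toolkits commonly
# build on; these names are illustrative, not PARAGEN's real API.
from typing import Callable, Dict, Type

REGISTRIES: Dict[str, Dict[str, Type]] = {}

def register(kind: str, name: str) -> Callable[[Type], Type]:
    """Class decorator: file a component under a plugin kind and name."""
    def wrap(cls: Type) -> Type:
        REGISTRIES.setdefault(kind, {})[name] = cls
        return cls
    return wrap

@register("model", "transformer")
class TransformerModel:
    def __init__(self, hidden_size: int = 512):
        self.hidden_size = hidden_size

@register("optimizer", "adam")
class AdamWrapper:
    def __init__(self, lr: float = 1e-4):
        self.lr = lr

def build(kind: str, name: str, **kwargs):
    """Instantiate a registered plugin from a config entry."""
    return REGISTRIES[kind][name](**kwargs)

model = build("model", "transformer", hidden_size=1024)
```

The payoff of this design is that a new model, optimizer, or learning strategy becomes available to the config system by adding one decorated class, with no changes to the core training loop.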
[2022-10-10] Today's NLP
Detecting Narrative Elements in Informational Text
Automatic extraction of narrative elements from text, combining narrative theories with computational models, has been receiving increasing attention over the last few years. Previous works have utilized the oral narrative theory by Labov and Waletzky to identify various narrative elements in personal stories. Instead, we direct our focus …
[2022-10-09] Today's NLP
Retrieval of Soft Prompt Enhances Zero-Shot Task Generalization
During zero-shot inference with language models (LMs), hard prompts alone may not fully describe the target task. In this paper, we explore how the retrieval of soft prompts obtained through prompt tuning can assist hard prompts in zero-shot task generalization. Specifically, we train soft prompt embeddings for each …
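A minimal sketch of the retrieval step the abstract hints at: keep a library of soft prompts trained per source task, embed the incoming query, pick the prompt whose task key is most similar, and prepend it to the input embeddings before running the frozen LM. The tensor shapes, the use of cosine similarity, and the random stand-in data are assumptions for illustration, not the paper's exact design.

```python
# Illustrative sketch: retrieve a trained soft prompt for an unseen task and
# prepend it to the input embeddings. Shapes and similarity are assumptions.
import torch
import torch.nn.functional as F

hidden = 768        # LM embedding width
prompt_len = 20     # tokens per soft prompt

# Library built during prompt tuning: one key and one soft prompt per source task.
task_keys = torch.randn(100, hidden)                 # task embeddings (stand-ins)
soft_prompts = torch.randn(100, prompt_len, hidden)  # tuned prompt embeddings

def retrieve_prompt(query_embedding: torch.Tensor) -> torch.Tensor:
    """Return the soft prompt whose task key is most similar to the query."""
    sims = F.cosine_similarity(task_keys, query_embedding.unsqueeze(0), dim=-1)
    return soft_prompts[sims.argmax()]

# At inference, concatenate the retrieved soft prompt with the token embeddings
# of the hard-prompted input before running the frozen LM.
query_embedding = torch.randn(hidden)      # would come from encoding the instance
input_embeds = torch.randn(1, 50, hidden)  # token embeddings of the hard prompt
prompt = retrieve_prompt(query_embedding).unsqueeze(0)  # (1, prompt_len, hidden)
lm_inputs = torch.cat([prompt, input_embeds], dim=1)    # (1, prompt_len+50, hidden)
```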