

[2022-12-26] Today's NLP Esports Data-to-commentary Generation on Large-scale Data-to-text Dataset Esports, a sports competition using video games, has become one of the most important sporting events in recent years. Although the amount of esports data is larger than ever, only a small fraction of that data is accompanied by text commentary that helps the audience retrieve and understand the plays. Therefore, in this stud.. 2022. 12. 26.
[2022-12-25] Today's NLP GENIE: Large Scale Pre-training for Text Generation with Diffusion Model In this paper, we propose a large-scale language pre-training approach for text GENeration with a dIffusion modEl, named GENIE. GENIE is a pre-trained sequence-to-sequence text generation model that combines a Transformer and diffusion. The diffusion model accepts the latent information from the encoder, which is used to gui.. 2022. 12. 25.
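The GENIE preview describes an encoder whose latent states condition a diffusion model over the target text. Below is a minimal sketch of that general pattern, not the authors' code: a Transformer encoder conditions a denoising network over noisy target embeddings via cross-attention. All dimensions, module names, and the noise schedule are assumptions for illustration.

```python
# Hedged sketch of an encoder-conditioned text diffusion step (not the GENIE implementation).
import torch
import torch.nn as nn

class EncoderConditionedDenoiser(nn.Module):
    def __init__(self, vocab_size=32000, d_model=256, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        dec_layer = nn.TransformerDecoderLayer(d_model, n_heads, batch_first=True)
        self.denoiser = nn.TransformerDecoder(dec_layer, n_layers)
        # Simple timestep embedding (assumption; the paper's conditioning may differ).
        self.time_mlp = nn.Sequential(nn.Linear(1, d_model), nn.SiLU(), nn.Linear(d_model, d_model))

    def forward(self, src_ids, noisy_tgt_emb, t):
        # Encode the source text once; its latent states condition every denoising step.
        memory = self.encoder(self.embed(src_ids))
        # Inject the diffusion timestep, then predict the denoised target embeddings.
        h = noisy_tgt_emb + self.time_mlp(t.view(-1, 1, 1).float())
        return self.denoiser(h, memory)

# Toy usage: one training-style denoising step on random data.
model = EncoderConditionedDenoiser()
src = torch.randint(0, 32000, (2, 16))        # source token ids
x0 = torch.randn(2, 24, 256)                  # "clean" target embeddings
t = torch.randint(0, 1000, (2,))              # diffusion timesteps
noise = torch.randn_like(x0)
alpha = 1 - t.view(-1, 1, 1) / 1000.0         # toy linear schedule (assumption)
x_t = alpha.sqrt() * x0 + (1 - alpha).sqrt() * noise
pred = model(src, x_t, t)
loss = nn.functional.mse_loss(pred, x0)       # regress the clean embeddings
print(loss.item())
```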
[2022-12-24] Today's NLP Evaluation for Change Evaluation is the central means for assessing, understanding, and communicating about NLP models. In this position paper, we argue that evaluation should be more than that: it is a force for driving change, carrying a sociological and political character beyond its technical dimensions. As a force, evaluation's power arises from its adoption: under our view, evaluation succeeds .. 2022. 12. 24.
[2022-12-23] Today's NLP Training language models for deeper understanding improves brain alignment Building systems that achieve a deeper understanding of language is one of the central goals of natural language processing (NLP). Towards this goal, recent works have begun to train language models on narrative datasets that require extracting the most critical information by integrating information across long contexts. However, i.. 2022. 12. 23.