
Today's NLP (572 posts)

[2022-08-24] Today's NLP: Few-Shot Table-to-Text Generation with Prefix-Controlled Generator. Neural table-to-text generation approaches are data-hungry, which limits their adoption in low-resource, real-world applications. Previous work mostly resorts to Pre-trained Language Models (PLMs) to generate fluent summaries of a table. However, these summaries often contain hallucinated content due to the uncontrolled nature of PLMs. Moreo.. 2022. 8. 24.
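A minimal sketch of the general idea of prefix-controlled table-to-text generation, not the paper's exact method (which learns continuous prefix vectors): linearize a table, prepend a hard text prefix that constrains which attributes the PLM should verbalize, and decode with a pre-trained seq2seq model. The model name "t5-small" and the prefix wording are illustrative assumptions.

```python
# Sketch only: hard-prompt approximation of prefix-controlled table-to-text.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

table = {"name": "Ada Lovelace", "occupation": "mathematician", "born": "1815"}

# Linearize the table into "attribute: value" pairs.
linearized = " | ".join(f"{k}: {v}" for k, v in table.items())

# A content-planning prefix telling the model which attributes to cover,
# intended to discourage hallucinated content outside the listed cells.
prefix = "describe only: name, occupation."
inputs = tokenizer(f"{prefix} table: {linearized}", return_tensors="pt")

output_ids = model.generate(**inputs, max_new_tokens=40, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```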
[2022-08-24] Today's NLP: Generalized Attention Mechanism and Relative Position for Transformer. In this paper, we propose a generalized attention mechanism (GAM) by first suggesting a new interpretation of the self-attention mechanism of Vaswani et al. Following this interpretation, we describe different variants of the attention mechanism, which together form GAM. Further, we propose a new relative position repres.. 2022. 8. 24.
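To make the relative-position idea concrete, here is a rough NumPy illustration of self-attention with an additive relative-position bias (in the spirit of Shaw et al.). The paper's GAM generalizes further, so the shapes and the bias form below are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def relative_self_attention(X, Wq, Wk, Wv, rel_bias):
    """X: (T, d_model); rel_bias: (2T-1,) bias indexed by offset j - i."""
    T, _ = X.shape
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])                   # content term
    offsets = np.arange(T)[None, :] - np.arange(T)[:, None]   # j - i
    scores = scores + rel_bias[offsets + (T - 1)]             # position-dependent bias
    return softmax(scores) @ V

T, d = 5, 8
rng = np.random.default_rng(0)
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
rel_bias = rng.normal(size=(2 * T - 1,))
print(relative_self_attention(X, Wq, Wk, Wv, rel_bias).shape)  # (5, 8)
```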
[2022-08-23] Today's NLP: Beyond Text Generation: Supporting Writers with Continuous Automatic Text Summaries. We propose a text editor that helps users plan, structure, and reflect on their writing process. It provides continuously updated paragraph-wise summaries as margin annotations, using automatic text summarization. Summary levels range from the full text, to selected (central) sentences, down to a collection of keywords. .. 2022. 8. 23.
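A toy sketch of the multi-level summary idea for a single paragraph: pick a "central" sentence by lexical overlap with the rest of the text, and compress further to frequent content words as keywords. The actual editor uses proper automatic summarization models; the stopword list and scoring here are illustrative assumptions.

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "on", "is", "are", "we", "it"}

def tokens(text):
    return [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]

def central_sentences(paragraph, k=1):
    """Medium summary level: the k sentences most lexically central to the paragraph."""
    sentences = re.split(r"(?<=[.!?])\s+", paragraph.strip())
    doc_counts = Counter(tokens(paragraph))
    scored = sorted(sentences, key=lambda s: sum(doc_counts[w] for w in tokens(s)), reverse=True)
    return scored[:k]

def keywords(paragraph, k=5):
    """Most compressed level: the k most frequent content words."""
    return [w for w, _ in Counter(tokens(paragraph)).most_common(k)]

paragraph = ("We propose a text editor to help users plan, structure and reflect "
             "on their writing. It provides continuously updated paragraph-wise "
             "summaries as margin annotations.")
print(central_sentences(paragraph))
print(keywords(paragraph))
```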
[2022-08-22] Today's NLP: Analyzing Robustness of End-to-End Neural Models for Automatic Speech Recognition. We investigate the robustness properties of pre-trained neural models for automatic speech recognition. Real-life data in machine learning is usually very noisy and almost never clean, which can be attributed to various domain-dependent factors, e.g. outliers, random noise, and adversarial noise. Therefore, the m.. 2022. 8. 22.
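A minimal sketch of one common way to probe ASR robustness: inject Gaussian noise at a target SNR and track word error rate as the SNR drops. The `transcribe` callable is a hypothetical stand-in for whatever pre-trained ASR model is being evaluated; it is not an API from the paper.

```python
import numpy as np

def add_noise(waveform, snr_db):
    """Return waveform plus white Gaussian noise scaled to the given SNR (dB)."""
    signal_power = np.mean(waveform ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = np.random.normal(0.0, np.sqrt(noise_power), size=waveform.shape)
    return waveform + noise

def wer(reference, hypothesis):
    """Word error rate via Levenshtein distance over word sequences."""
    r, h = reference.split(), hypothesis.split()
    d = np.zeros((len(r) + 1, len(h) + 1), dtype=int)
    d[:, 0] = np.arange(len(r) + 1)
    d[0, :] = np.arange(len(h) + 1)
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            d[i, j] = min(d[i - 1, j] + 1, d[i, j - 1] + 1, d[i - 1, j - 1] + cost)
    return d[len(r), len(h)] / max(len(r), 1)

def robustness_curve(transcribe, waveform, reference, snr_levels=(20, 10, 0)):
    """WER at each SNR level; rising WER at low SNR indicates lower robustness."""
    return {snr: wer(reference, transcribe(add_noise(waveform, snr)))
            for snr in snr_levels}
```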