Papers (572)

[2023-02-03] Today's NLP: An Empirical Study on the Transferability of Transformer Modules in Parameter-Efficient Fine-Tuning. Parameter-efficient fine-tuning approaches have recently garnered a lot of attention. With a considerably lower number of trainable weights, these methods offer scalability and computational efficiency. In this paper, we look for optimal sub-networks and investigate the capability of ..
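The excerpt is truncated, but the core idea of parameter-efficient fine-tuning, updating only a small subset of a pretrained model's weights while the rest stay frozen, can be sketched briefly. The PyTorch snippet below is a hypothetical illustration, not the authors' code: it freezes every parameter of a toy transformer encoder, re-enables gradients only for one layer's feed-forward block, and reports the trainable fraction. Which modules transfer best when unfrozen is exactly the question the paper studies.

```python
import torch.nn as nn

# Toy stand-in for a pretrained transformer encoder (hypothetical sizes).
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=4, batch_first=True),
    num_layers=6,
)

# Freeze everything first, as parameter-efficient methods typically do.
for p in encoder.parameters():
    p.requires_grad = False

# Re-enable gradients only for a chosen sub-network, here the last
# layer's feed-forward linears (an arbitrary example choice).
for name, p in encoder.named_parameters():
    if name.startswith("layers.5.linear"):
        p.requires_grad = True

trainable = sum(p.numel() for p in encoder.parameters() if p.requires_grad)
total = sum(p.numel() for p in encoder.parameters())
print(f"trainable: {trainable}/{total} ({100 * trainable / total:.1f}%)")
```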
[2023-02-02] Today's NLP: Adaptive Machine Translation with Large Language Models. Consistency is a key requirement of high-quality translation. It is especially important to adhere to pre-approved terminology and corrected translations in domain-specific projects. Machine translation (MT) has achieved significant progress in the area of domain adaptation. However, real-time adaptation remains challenging. Large-scale lan..
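The abstract cuts off, but the approach commonly explored under this heading is real-time adaptation by showing an LLM similar approved translations at inference time. The helper below is a minimal sketch of that retrieval step, with a made-up translation memory and no model call: it fuzzy-matches the new source sentence against stored entries and assembles the closest ones into a few-shot prompt.

```python
from difflib import SequenceMatcher

# Hypothetical in-domain translation memory: (source, approved translation).
tm = [
    ("The battery is low.", "La batterie est faible."),
    ("Restart the device.", "Redémarrez l'appareil."),
    ("The screen stays black.", "L'écran reste noir."),
]

def build_prompt(source: str, k: int = 2) -> str:
    """Few-shot prompt from the k fuzzy matches most similar to `source`."""
    scored = sorted(
        tm,
        key=lambda pair: SequenceMatcher(None, source, pair[0]).ratio(),
        reverse=True,
    )
    shots = "\n".join(f"English: {s}\nFrench: {t}" for s, t in scored[:k])
    return f"{shots}\nEnglish: {source}\nFrench:"

print(build_prompt("The battery drains quickly."))
```

Feeding the resulting prompt to an LLM biases its output toward the pre-approved terminology in the matched examples, which is what makes the adaptation "real-time": no fine-tuning is needed when the translation memory changes.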
[2023-02-01] Today's NLP: Knowledge Transfer from Pre-trained Language Models to Cif-based Speech Recognizers via Hierarchical Distillation. Large-scale pre-trained language models (PLMs) with powerful language modeling capabilities have been widely used in natural language processing. For automatic speech recognition (ASR), leveraging PLMs to improve performance has also become a promising research trend. However, most p..
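Again the excerpt is truncated, but the general mechanism behind distilling a PLM into a speech recognizer is aligning the student's hidden representations with the teacher's. The code below is an illustrative guess, not the paper's method: it projects acoustic-model states into the teacher's embedding space and penalizes the distance at two levels (token and utterance) as a simple stand-in for the "hierarchical" part; all dimensions and the loss form are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical dimensions: student (acoustic) 320-d, teacher PLM 768-d.
proj = nn.Linear(320, 768)

def distill_loss(student_h, teacher_h):
    """Two-level representation distillation (illustrative sketch).

    student_h: (batch, seq, 320) acoustic-model hidden states
    teacher_h: (batch, seq, 768) PLM hidden states for the same tokens
    """
    s = proj(student_h)
    token_loss = F.mse_loss(s, teacher_h)                        # token level
    utt_loss = F.mse_loss(s.mean(dim=1), teacher_h.mean(dim=1))  # utterance level
    return token_loss + utt_loss

loss = distill_loss(torch.randn(2, 10, 320), torch.randn(2, 10, 768))
loss.backward()
```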
[2023-01-31] Today's NLP: Reading and Reasoning over Chart Images for Evidence-based Automated Fact-Checking. Evidence data for automated fact-checking (AFC) can be in multiple modalities such as text, tables, images, audio, or video. While there is increasing interest in using images for AFC, previous works mostly focus on detecting manipulated or fake images. We propose a novel task, chart-based fact-checking, and intro..