
Today's NLP (572 posts)

[2023-03-16] Today's NLP — Input-length-shortening and text generation via attention values. Identifying words that impact a task's performance more than others is a challenge in natural language processing. Transformer models have recently addressed this issue by incorporating an attention mechanism that assigns greater attention (i.e., relevance) scores to some words than others. Because of the attention mechanism's hig.. 2023. 3. 16.
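The "relevance scores" mentioned above come from softmax-normalized dot products between a query and each word's key vector. A minimal sketch of that computation (toy hand-made embeddings, not the paper's actual model or data):

```python
import math

def attention_weights(query, keys):
    """Scaled dot-product attention: turn query-key similarities into a
    probability distribution over words (higher weight = more relevant)."""
    d = len(query)
    logits = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# toy 3-word sentence with 2-d embeddings (illustrative values only)
keys = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]
query = [1.0, 0.0]
w = attention_weights(query, keys)
# words whose keys align with the query receive most of the weight
```

Input-length shortening as described in the title would then amount to dropping the words with the lowest weights before generation.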
[2023-03-15] Today's NLP — Diffusion Models for Non-autoregressive Text Generation: A Survey. Non-autoregressive (NAR) text generation has attracted much attention in the field of natural language processing: it greatly reduces inference latency but sacrifices generation accuracy. Recently, diffusion models, a class of latent variable generative models, have been introduced into NAR text generation, showin.. 2023. 3. 15.
[2023-03-13] Today's NLP — Making a Computational Attorney. This "blue sky idea" paper outlines the opportunities and challenges in data mining and machine learning involved in making a computational attorney -- an intelligent software agent capable of helping human lawyers with a wide range of complex, high-level legal tasks, such as drafting legal briefs for the prosecution or defense in court. In particular, we discuss what.. 2023. 3. 13.
[2023-03-12] Today's NLP — ChatGPT is on the horizon: Could a large language model be all we need for Intelligent Transportation? ChatGPT, developed by OpenAI, is one of the largest Large Language Models (LLMs), with over 175 billion parameters. ChatGPT has demonstrated the impressive capabilities of LLMs, particularly in the field of natural language processing (NLP). With the emergence of the discussion and application of .. 2023. 3. 12.