
[2022-12-22] Today's NLP ZEROTOP: Zero-Shot Task-Oriented Semantic Parsing using Large Language Models We explore the use of large language models (LLMs) for zero-shot semantic parsing. Semantic parsing involves mapping natural language utterances to task-specific meaning representations. Language models are generally trained on the publicly available text and code and cannot be expected to directly generalize to domain.. 2022. 12. 22.
[2022-12-22] Today's NLP Semantically-informed Hierarchical Event Modeling Prior work has shown that coupling sequential latent variable models with semantic ontological knowledge can improve the representational capabilities of event modeling approaches. In this work, we present a novel, doubly hierarchical, semi-supervised event modeling framework that provides structural hierarchy while also accounting for ontologica.. 2022. 12. 22.
[2022-12-21] Today's NLP Resolving Open-textured Rules with Templated Interpretive Arguments Open-textured terms in written rules are typically settled through interpretive argumentation. Ongoing work has attempted to catalogue the schemes used in such interpretive argumentation. But how can the use of these schemes affect the way in which people actually use and reason over the proper interpretations of open-textured te.. 2022. 12. 21.
[2022-12-20] Today's NLP Fast Rule-Based Decoding: Revisiting Syntactic Rules in Neural Constituency Parsing Most recent studies on neural constituency parsing focus on encoder structures, while few developments are devoted to decoders. Previous research has demonstrated that probabilistic statistical methods based on syntactic rules are particularly effective in constituency parsing, whereas syntactic rules are not use.. 2022. 12. 20.