Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/132126
Type: Conference paper
Title: A short survey of pre-trained language models for conversational AI-A new age in NLP
Author: Zaib, M.
Sheng, Q.Z.
Zhang, W.
Citation: Proceedings of the Australasian Computer Science Week (ACSW'20), 2020, pp.1-4
Publisher: Association for Computing Machinery
Publisher Place: online
Issue Date: 2020
ISBN: 9781450376976
ISSN: 2153-1633
Conference Name: Australasian Computer Science Week (ACSW) (3 Feb 2020 - 7 Feb 2020 : Melbourne, Australia)
Statement of Responsibility: Munazza Zaib, Quan Z. Sheng, Wei Emma Zhang
Abstract: Building a dialogue system that can communicate naturally with humans is a challenging yet interesting problem of agent-based computing. Rapid progress in this area is usually hindered by the long-standing problem of data scarcity, as these systems are expected to learn syntax, grammar, decision making, and reasoning from insufficient amounts of task-specific data. The recently introduced pre-trained language models have the potential to address the issue of data scarcity and bring considerable advantages by generating contextualized word embeddings. These models are considered the NLP counterpart of ImageNet and have been shown to capture different facets of language such as hierarchical relations, long-term dependency, and sentiment. In this short survey paper, we discuss the recent progress made in the field of pre-trained language models. We also discuss how the strengths of these language models can be leveraged in designing more engaging and more eloquent conversational agents. This paper, therefore, intends to establish whether these pre-trained models can overcome the challenges pertinent to dialogue systems, and how their architecture could be exploited to overcome them. Open challenges in the field of dialogue systems are also discussed.
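
As a minimal sketch of what the "contextualized word embeddings" mentioned in the abstract look like in practice (assuming the Hugging Face transformers library and the publicly available bert-base-uncased checkpoint; the sentences and variable names below are illustrative, not taken from the paper):

    from transformers import AutoTokenizer, AutoModel
    import torch

    # Load a pre-trained BERT encoder (assumed checkpoint: bert-base-uncased).
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # The same surface word "bank" receives a different vector in each context,
    # which is what distinguishes contextualized embeddings from static ones.
    sentences = ["She sat on the river bank.", "He deposited cash at the bank."]
    inputs = tokenizer(sentences, return_tensors="pt", padding=True)
    with torch.no_grad():
        outputs = model(**inputs)

    # One 768-dimensional vector per token, conditioned on the whole sentence.
    contextual_embeddings = outputs.last_hidden_state
    print(contextual_embeddings.shape)  # torch.Size([2, max_seq_len, 768])

Such token-level representations can be fed to a downstream dialogue model, which is one way pre-training helps when task-specific conversational data is scarce.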
Keywords: Agent-based computing; dialogue systems; pre-trained language models; natural language processing; intelligent agents
Rights: © 2020 Association for Computing Machinery.
DOI: 10.1145/3373017.3373028
Published version: https://dl.acm.org/doi/proceedings/10.1145/3373017
Appears in Collections: Computer Science publications

Files in This Item:
There are no files associated with this item.

