You are required to read and agree to the below before accessing a full-text version of an article in the IDE article repository.

The full-text document you are about to access is subject to national and international copyright laws. In most cases (but not necessarily all) the consequence is that personal use is allowed, provided that the copyright owner is duly acknowledged and respected. All other uses typically require explicit permission (often in writing) from the copyright owner.

For the reports in this repository we specifically note that

  • the use of articles under IEEE copyright is governed by the IEEE copyright policy (available at http://www.ieee.org/web/publications/rights/copyrightpolicy.html)
  • the use of articles under ACM copyright is governed by the ACM copyright policy (available at http://www.acm.org/pubs/copyright_policy/)
  • technical reports and other articles issued by Mälardalen University are free for personal use. For other use, the explicit consent of the authors is required
  • in other cases, please contact the copyright owner for detailed information

By accepting I agree to acknowledge and respect the rights of the copyright owner of the document I am about to access.

If you are in doubt, feel free to contact webmaster@ide.mdh.se

Backward-Forward Sequence Generative Network for Multiple Lexical Constraints

Authors:

Seemab Latif , Sarmad Bashir, Mir Muntasar Ali Agha , Rabia Latif

Publication Type:

Conference/Workshop Paper

Venue:

16th International Conference on Artificial Intelligence Applications and Innovations

Publisher:

Springer

DOI:

10.1007/978-3-030-49186-4_4


Abstract

Advancements in Long Short Term Memory (LSTM) Networks have shown remarkable success in various Natural Language Generation (NLG) tasks. However, generating sequences from pre-specified lexical constraints is a new, challenging and less researched area in NLG. Lexical constraints are words that must appear in the language model’s output, which should remain fluent and meaningful. Most previous approaches address this problem by enforcing the pre-specified lexical constraints during the decoding process, which increases the decoding complexity exponentially or linearly with the number of constraints. Moreover, some previous approaches can only deal with a single constraint. In this paper, we propose a novel neural probabilistic architecture based on a backward-forward language model and a word embedding substitution method that can handle multiple lexical constraints to generate quality sequences. Experiments show that our proposed architecture outperforms previous methods in terms of intrinsic evaluation.
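The backward-forward idea described in the abstract can be illustrated with a minimal sketch: starting from a constraint word, a backward language model extends the sequence to the left, then a forward language model completes it to the right. The toy lookup tables and the function name below are illustrative stand-ins, not the paper's actual trained LSTM models or API.

```python
# Hedged sketch of backward-forward generation around a single lexical
# constraint. Real implementations would use trained backward and forward
# LSTM language models; these toy tables are hypothetical placeholders.

BACKWARD_LM = {  # maps a word to its most likely *preceding* word
    "cat": "the",
    "the": "<s>",
}
FORWARD_LM = {  # maps a word to its most likely *next* word
    "cat": "sat",
    "sat": "</s>",
}

def backward_forward_generate(constraint, max_len=10):
    """Grow a sentence outward from a pre-specified constraint word."""
    seq = [constraint]
    # Backward pass: prepend words until start-of-sequence is predicted.
    while len(seq) < max_len:
        prev = BACKWARD_LM.get(seq[0], "<s>")
        if prev == "<s>":
            break
        seq.insert(0, prev)
    # Forward pass: append words until end-of-sequence is predicted.
    while len(seq) < max_len:
        nxt = FORWARD_LM.get(seq[-1], "</s>")
        if nxt == "</s>":
            break
        seq.append(nxt)
    return seq

print(backward_forward_generate("cat"))  # ['the', 'cat', 'sat']
```

Because the constraint word anchors the sequence from the start, it is guaranteed to appear in the output without modifying the decoder's search procedure, which is what lets the approach avoid the per-constraint decoding overhead mentioned above.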

Bibtex

@inproceedings{Latif6598,
author = {Seemab Latif and Sarmad Bashir and Mir Muntasar Ali Agha and Rabia Latif},
title = {Backward-Forward Sequence Generative Network for Multiple Lexical Constraints},
pages = {39--50},
month = {May},
year = {2020},
booktitle = {16th International Conference on Artificial Intelligence Applications and Innovations},
publisher = {Springer},
url = {http://www.es.mdu.se/publications/6598-}
}