NLtoPDDL: One-Shot Learning of PDDL Models from Natural Language Process Manuals

Shivam Miglani, Neil Yorke-Smith

Research output: Conference contribution (chapter in book/conference proceedings/edited volume) · Scientific · peer-reviewed


Abstract

Existing automated domain acquisition approaches require large amounts of structured data, in the form of plans or plan traces, to converge. Further, automatically-generated domain models can be incomplete, error-prone, and hard to understand or modify. To mitigate these issues, we take advantage of readily-available natural language data: existing process manuals. We present a domain-authoring pipeline called NLtoPDDL, which takes as input a plan written in natural language and outputs a corresponding PDDL model. We employ a two-stage approach: stage one advances the state of the art in action sequence extraction by utilizing transfer learning via pre-trained contextual language models (BERT and ELMo). Stage two employs an interactive modification of an object-centric algorithm that keeps a human in the loop to learn a PDDL model one-shot from the extracted plan. We show that NLtoPDDL is an effective and flexible domain-authoring tool by using it to learn five real-world planning domains of varying complexities and evaluating them for their completeness, soundness, and quality.
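For readers unfamiliar with the target representation, the pipeline's output is a PDDL domain model: a set of action schemas with typed parameters, preconditions, and effects. The following hand-written blocks-world-style action is a generic illustration of that format, not an example taken from the paper's five evaluated domains:

```pddl
;; Illustrative PDDL action schema (hypothetical, not from the paper):
;; pick up a block that is clear and resting on the table.
(:action pick-up
  :parameters (?b - block)
  :precondition (and (clear ?b) (ontable ?b) (handempty))
  :effect (and (not (ontable ?b))
               (not (clear ?b))
               (not (handempty))
               (holding ?b)))
```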
Original language: English
Title of host publication: Working Notes of the ICAPS'20 Workshop on Knowledge Engineering for Planning and Scheduling (KEPS'20)
Publisher: ICAPS
Number of pages: 9
Publication status: Published - 2020
Event: ICAPS’20 Workshop on Knowledge Engineering for Planning and Scheduling (KEPS’20) - Nancy, France
Duration: 1 Nov 2020 – 1 Nov 2020

Workshop

Workshop: ICAPS’20 Workshop on Knowledge Engineering for Planning and Scheduling (KEPS’20)
Abbreviated title: KEPS'20
Country/Territory: France
City: Nancy
Period: 1/11/20 – 1/11/20

