Learning Contextualized Knowledge Structures for Commonsense Reasoning

Abstract

Recently, neural-symbolic models have achieved noteworthy success in leveraging knowledge graphs (KGs) for commonsense reasoning tasks such as question answering (QA). However, fact sparsity, inherent in human-annotated KGs, can hinder such models from retrieving task-relevant knowledge. To address this issue, we propose the Hybrid Graph Network (HGN), a neural-symbolic model that reasons over both extracted (human-labeled) and generated facts within the same learned graph structure. Given a KG subgraph of extracted facts, HGN is jointly trained to generate complementary facts, encode relational information in the resulting “hybrid” subgraph, and filter out task-irrelevant facts. We demonstrate HGN’s ability to produce contextually pertinent subgraphs through considerable performance gains on four commonsense reasoning benchmarks and a user study of fact validity and helpfulness.
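To make the core mechanism concrete, here is a minimal PyTorch sketch of the idea described above, not the paper’s actual implementation: message passing over a hybrid subgraph whose extracted and generated edges receive learned relevance weights that act as a soft filter on task-irrelevant facts. All class, variable, and dimension names are illustrative assumptions.

```python
import torch
import torch.nn as nn


class HybridGraphNetwork(nn.Module):
    """Sketch: score every (head, relation, tail) edge in a "hybrid"
    subgraph (extracted + generated facts) so that task-irrelevant
    facts are softly down-weighted during message passing."""

    def __init__(self, node_dim: int, rel_dim: int):
        super().__init__()
        # Scores each edge's relevance to the reasoning task.
        self.edge_scorer = nn.Sequential(
            nn.Linear(2 * node_dim + rel_dim, node_dim),
            nn.ReLU(),
            nn.Linear(node_dim, 1),
        )
        # Transforms the message passed along each edge.
        self.message = nn.Linear(2 * node_dim + rel_dim, node_dim)

    def forward(self, node_feats, rel_feats, edges):
        # node_feats: [num_nodes, node_dim] entity embeddings
        # rel_feats:  [num_edges, rel_dim] relation embeddings for
        #             extracted and generated edges alike
        # edges:      [num_edges, 2] (head index, tail index) pairs
        heads, tails = edges[:, 0], edges[:, 1]
        edge_repr = torch.cat(
            [node_feats[heads], rel_feats, node_feats[tails]], dim=-1
        )
        # Soft filter: a weight near 0 suppresses an irrelevant fact.
        weights = torch.sigmoid(self.edge_scorer(edge_repr))
        msgs = weights * self.message(edge_repr)
        # Aggregate weighted messages into each edge's tail node.
        out = torch.zeros_like(node_feats)
        out.index_add_(0, tails, msgs)
        return out, weights
```

In this sketch the extracted and generated facts share one parameterization, so filtering and relational encoding are trained jointly, mirroring the joint training objective summarized in the abstract.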

Publication
Findings of ACL 2021
Aaron Chan