Poster
Neural-Symbolic Recursive Machine for Systematic Generalization
Qing Li · Yixin Zhu · Yitao Liang · Yingnian Wu · Song-Chun Zhu · Siyuan Huang
Halle B
Current learning models often fall short of human-like systematic generalization: learning compositional rules from limited data and extrapolating them to unseen combinations. To address this, we introduce the Neural-Symbolic Recursive Machine (NSR), a model whose core representation is a Grounded Symbol System (GSS) with combinatorial syntax and semantics that emerge entirely from the training data. NSR adopts a modular design, integrating neural perception, syntactic parsing, and semantic reasoning, with the three modules learned jointly through a deduction-abduction algorithm. We establish that NSR is expressive enough to handle a wide range of sequence-to-sequence tasks and achieves superior systematic generalization, thanks to the inductive biases of equivariance and recursiveness embedded in each module. We evaluate NSR on four challenging benchmarks of systematic generalization: SCAN for semantic parsing, PCFG for string manipulation, HINT for arithmetic reasoning, and a compositional machine translation task. Our results show that NSR surpasses existing neural and hybrid models in both generalization and transferability.
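To make the three-module pipeline concrete, below is a minimal, hypothetical sketch of the perception → parsing → reasoning flow on an arithmetic example, in the spirit of HINT. All function names, the toy grammar, and the hand-coded semantics here are illustrative assumptions; in NSR each module is a learned component trained jointly via deduction-abduction, not hand-written rules.

```python
# Hypothetical sketch of the NSR pipeline; the stubs below stand in for
# NSR's learned modules and are NOT the authors' implementation.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Node:
    """A node in the grounded symbol system's parse tree."""
    symbol: str                           # grounded symbol, e.g. "3", "+", "*"
    children: List["Node"] = field(default_factory=list)

def perceive(raw_inputs: List[str]) -> List[str]:
    # Neural perception: ground raw inputs (e.g. image crops) to symbols.
    # Stubbed as the identity; NSR learns this grounding from data.
    return list(raw_inputs)

def parse(symbols: List[str]) -> Node:
    # Syntactic parsing: build a tree over the symbol sequence.
    # Stubbed with fixed precedence climbing; NSR learns its parser.
    PREC = {"+": 1, "-": 1, "*": 2}

    def parse_expr(i: int, min_prec: int) -> Tuple[Node, int]:
        node = Node(symbols[i])
        i += 1
        while i < len(symbols) and PREC.get(symbols[i], -1) >= min_prec:
            op = symbols[i]
            rhs, i = parse_expr(i + 1, PREC[op] + 1)
            node = Node(op, [node, rhs])
        return node, i

    tree, _ = parse_expr(0, 0)
    return tree

def reason(node: Node) -> int:
    # Semantic reasoning: recursively evaluate the tree bottom-up.
    # Stubbed with fixed arithmetic semantics; NSR induces the program
    # attached to each symbol.
    if not node.children:
        return int(node.symbol)
    left, right = (reason(c) for c in node.children)
    return {"+": left + right, "-": left - right, "*": left * right}[node.symbol]

# End-to-end: "2 + 3 * 4" evaluates to 14 through the full chain.
print(reason(parse(perceive(["2", "+", "3", "*", "4"]))))
```

The recursion in `reason`, which evaluates every subtree by the same rule regardless of depth, mirrors the recursiveness bias the abstract credits for extrapolation to unseen combinations.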