A Grammar-Based Method for Instilling Empirical Dependency Structure in LLMs

Date

2025-03

Publisher

University of Tartu Library

Abstract

We investigate whether synthetic pretraining data generated from a formal grammar modeling syntactic dependencies can improve English language models. Building on the structured pretraining data approach of Papadimitriou and Jurafsky (2023), we develop a grammar that more closely mirrors empirical dependency structures. Our results are negative: this type of pretraining significantly degrades model performance, with both our and their pretraining approaches performing worse than no pretraining at all. We analyze potential explanations for these findings and discuss implications for future work on structured-data pretraining.
