Mandar Sharma, Nikhil Muralidhar, Naren Ramakrishnan

Abstract

The field of Math-NLP has witnessed significant growth in recent years, motivated by the desire to extend LLM capabilities to the learning of non-linguistic notions (numerals, and subsequently, arithmetic reasoning). However, non-linguistic skill injection typically comes at a cost for LLMs: it leads to catastrophic forgetting of core linguistic skills, a consequence that often remains unaddressed in the literature. While Math-NLP has produced LLMs that closely approximate the mathematical skills of a grade schooler or the arithmetic reasoning of a calculator, the practicality of these models fails if they concomitantly shed their linguistic capabilities. In this work, we take a closer look at the phenomenon of catastrophic forgetting as it pertains to LLMs and subsequently offer a novel framework for non-linguistic skill injection into LLMs, based on information-theoretic interventions and skill-specific losses, that enables the learning of strict arithmetic reasoning. Our model outperforms the state-of-the-art both on the injected non-linguistic skills and on linguistic knowledge retention, and does so with a fraction (1/4) of the non-linguistic training data and zero additional synthetic linguistic training data.

Mandar Sharma, Nikhil Muralidhar, Naren Ramakrishnan: Learning Non-linguistic Skills without Sacrificing Linguistic Proficiency. ACL (1) 2023: 6178-6191

People

Mandar Sharma

Naren Ramakrishnan


Publication Details

Date of publication:
July 9, 2023
Conference:
61st Annual Meeting of the Association for Computational Linguistics (ACL 2023)
Page number(s):
6178-6191