Poster in Workshop on Spurious Correlation and Shortcut Learning: Foundations and Solutions
Excessive Supervision and Shortcuts Prevent In-domain Learning of Trivial Graph Search
Arvid Frydenlund
Keywords: [ Task Decomposition ] [ Language Models ] [ Shortcuts ] [ Graph Search ]
Presentation: Sun 27 Apr, 5:30 p.m. to 3 a.m. PDT
Abstract:
This work concerns the path-star task, a minimal example of searching over a graph. The graph, $G$, is star-shaped with $D$ arms radiating from a start node, $s$. A language model (LM) is given $G$, $s$, and a target node, $t$, which ends one of the arms, and is tasked with generating the arm containing $t$. The minimal nature of this task means only a single choice needs to be made: which of the $D$ arms contains $t$? Decoder-only LMs fail to solve this simple task above $1/D$ chance due to a learned shortcut that absorbs training supervision. We show how this pathology is caused by excess supervision and present a series of solutions demonstrating that the task is solvable by decoder-only LMs. We find that the task's minimal nature causes its difficulty, as it prevents task decomposition. Our solutions provide insight into the pathology and its implications for LMs trained via next-token prediction.
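To make the setup concrete, below is a minimal Python sketch of how a path-star instance could be constructed. The function name, graph sizes, and edge-list serialization are illustrative assumptions, not the paper's exact data generation or tokenization.

```python
# Minimal sketch of a path-star task instance (assumed serialization:
# shuffled edge list plus a (start, target) query and a gold arm).
import random


def make_path_star_instance(num_arms: int = 5, arm_len: int = 4, seed: int = 0):
    """Build a star graph with `num_arms` arms of `arm_len` nodes each,
    radiating from a start node s. Returns the shuffled edge list, the
    query (s, t), and the gold output: the arm whose leaf is t."""
    rng = random.Random(seed)

    # Unique node labels; node 0 is the start node s.
    labels = list(range(1, num_arms * arm_len + 1))
    rng.shuffle(labels)
    s = 0
    arms = [labels[i * arm_len:(i + 1) * arm_len] for i in range(num_arms)]

    # Edges from s to the first node of each arm, then along each arm.
    edges = []
    for arm in arms:
        edges.append((s, arm[0]))
        edges.extend(zip(arm, arm[1:]))
    rng.shuffle(edges)  # present the edges in random order

    target_arm = rng.choice(arms)
    t = target_arm[-1]           # the target node ends one of the arms
    gold = [s] + target_arm      # the model must generate this arm

    return edges, (s, t), gold


if __name__ == "__main__":
    edges, (s, t), gold = make_path_star_instance()
    print("edges:", edges)
    print("query: start =", s, "target =", t)
    print("gold arm:", gold)
```

Note the single decision the model faces: only the first generated node after $s$ disambiguates which of the $D$ arms to follow; the rest of the arm can be copied from the edge list.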