

Poster

On the Joint Interaction of Models, Data, and Features

Yiding Jiang · Christina Baek · J Kolter

Halle B
Fri 10 May 1:45 a.m. PDT — 3:45 a.m. PDT
Oral presentation: Oral 7C
Fri 10 May 1 a.m. PDT — 1:45 a.m. PDT

Abstract:

Learning features from data is one of the defining characteristics of deep learning, but our theoretical understanding of the role features play in deep learning is still rudimentary. To address this gap, we introduce a new tool, the interaction tensor, for empirically analyzing the interaction between data and model through features. With the interaction tensor, we make several key observations about how features are distributed in data and how models with different random seeds learn different features. Based on these observations, we propose a conceptual framework for feature learning. Under this framework, the expected accuracy for a single hypothesis and agreement for a pair of hypotheses can both be derived in closed-form. We demonstrate that the proposed framework can explain empirically observed phenomena, including the recently discovered Generalization Disagreement Equality (GDE) that allows for estimating the generalization error with only unlabeled data. Further, our theory also provides explicit construction of natural data distributions that break the GDE. Thus, we believe this work provides valuable new insight into our understanding of feature learning.
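The GDE mentioned in the abstract states that, under suitable conditions (well-calibrated ensembles), the expected disagreement rate between two independently trained models on unlabeled data equals the expected test error of a single model. The following is a minimal numerical sketch of that equality, not the paper's code; the per-point confidence distribution and the calibration construction below are assumptions chosen purely for illustration.

```python
# Toy illustration of the Generalization Disagreement Equality (GDE):
# if the ensemble is calibrated, disagreement (label-free) matches error.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# For each point, q is the probability that a randomly drawn model from
# the ensemble predicts class 1; calibration here means the true label
# is also 1 with probability q. (This toy distribution is an assumption.)
q = rng.random(n)
y = (rng.random(n) < q).astype(int)    # true labels (hidden at estimation time)
h1 = (rng.random(n) < q).astype(int)   # predictions of model 1 (seed A)
h2 = (rng.random(n) < q).astype(int)   # predictions of model 2 (seed B)

disagreement = np.mean(h1 != h2)   # computable with unlabeled data only
error = np.mean(h1 != y)           # requires labels

print(f"disagreement = {disagreement:.3f}")
print(f"test error   = {error:.3f}")
# Under this construction both quantities concentrate around
# E[2q(1-q)] = 1/3, so disagreement tracks the true error.
```

Breaking the calibration assumption (for instance, making both models' errors concentrate on the same points) would decouple the two quantities, which is the flavor of the GDE-breaking constructions the abstract refers to.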
