
HCII PhD Thesis Defense: Napol Rachatasumrit


Meaningful Models: Unlocking Insights Through Model Interpretations in Educational Data Mining
Napol Rachatasumrit
HCII PhD Thesis Defense

Time and Location

Monday, May 19, 2025 @ 3:30 pm ET

Newell-Simon Hall (NSH) 3305

Zoom Link: https://cmu.zoom.us/j/94940483907?pwd=NRkIYk4IX5zqFH7BqnsPvTsfprCZ8M.1&jst=2

Meeting ID: 94940483907
Passcode: 631197

 

Thesis Committee

Kenneth Koedinger (Co-chair), HCII, CMU
Paulo Carvalho (Co-chair), HCII, CMU
Kenneth Holstein, HCII, CMU
Adam Sales, Mathematical Sciences, WPI

Abstract

The conventional wisdom in Educational Data Mining (EDM) suggests that a superior model is one that fits the data better. However, this perspective overlooks a critical aspect: the value of machine learning models lies not merely in their predictive power, but fundamentally in their use. Models that prioritize prediction accuracy often fail to provide scientifically or practically meaningful interpretations. Meaningful interpretations are crucial for scientific insight and useful for practical applications, especially from a human-centered perspective. For example, Deep Knowledge Tracing (DKT) has been shown to have superior predictive power for student performance; however, its parameters are not associated with any latent constructs, so no scientific insights or practical applications have resulted from it. In contrast, the Additive Factor Model (AFM) often underperforms DKT in prediction accuracy, but its parameter estimates have meaningful interpretations (e.g., the slope reflects the learning rate of a knowledge component) that lead to new scientific insights (e.g., improved cognitive model discovery) and useful practical applications (e.g., intelligent tutoring system redesign).
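
For reference, a standard formulation of AFM from the EDM literature (the dissertation's own notation may differ) models the log-odds of student i answering item j correctly as

\[
\operatorname{logit}(p_{ij}) = \theta_i + \sum_{k} q_{jk}\,\bigl(\beta_k + \gamma_k T_{ik}\bigr),
\]

where \(\theta_i\) is the student's proficiency, \(q_{jk}\) indicates whether item j exercises knowledge component (KC) k, \(\beta_k\) is the easiness of KC k, \(T_{ik}\) is the student's count of prior practice opportunities on KC k, and \(\gamma_k\) is the learning-rate slope referred to above.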

 

In this thesis, I argue that meaningful interpretations, rather than post-hoc explanations or uninterpreted interpretable models, are what we need, especially in the context of EDM. I explore the concept of "meaningful models": inherently interpretable models whose parameters and outputs are not only transparent but actively interpreted, and whose interpretations lead to useful and actionable insights for stakeholders. I illustrate the benefits of meaningful models through examples where existing mechanisms or models are insufficient to produce meaningful interpretations, and I demonstrate how enhancements can yield scientifically or practically valuable insights. For example, Performance Factor Analysis (PFA) has been shown to outperform AFM, but we show that PFA's parameters are confounded, resulting in ambiguous interpretations. We then propose improved models that not only de-confound the parameters but also provide meaningful interpretations, leading to insights about the associated knowledge component model and suggesting instructional improvements. Overall, this thesis highlights the essential role of meaningful models in EDM, emphasizing that only through meaningful interpretations can models effectively drive practical improvements in educational practice and advance scientific understanding.
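
For reference, a standard formulation of PFA from the literature (again, the dissertation's notation may differ) replaces AFM's single opportunity count with separate counts of prior successes and failures:

\[
\operatorname{logit}(p_{ij}) = \sum_{k} q_{jk}\,\bigl(\beta_k + \gamma_k s_{ik} + \rho_k f_{ik}\bigr),
\]

where \(s_{ik}\) and \(f_{ik}\) are student i's prior correct and incorrect attempts on KC k, and \(\gamma_k\) and \(\rho_k\) weight prior successes and failures, respectively.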
 

Dissertation: https://drive.google.com/file/d/1t1-TUXZdbuMb0LFMn_JoLzvind7aMx-r/view?usp=sharing

Best, 

Napol Rachatasumrit