Identifying Core Concepts in Educational Resources     
James Foster; Md. Arafat Sultan; Holly Devaul; Ifeyinwa Okoye; Tamara Sumner

This paper describes the results of a study designed to assess human expert ratings of educational concept features for use in automatic core concept extraction systems. Digital library resources provided the content base for human experts to annotate automatically extracted concepts on seven dimensions: coreness, local importance, topic, content, phrasing, structure, and function. We use the non-coreness dimensions as predictors of concept coreness and show that they significantly improve prediction accuracy over a baseline classifier.
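The prediction setup described above can be sketched as a supervised classification task: the six non-coreness annotation dimensions serve as features, and coreness is the binary label. The following is a minimal illustrative sketch, not the authors' actual system; the synthetic ratings, the 1-to-5 rating scale, the rule linking coreness to local importance and topic, and the logistic-regression model are all assumptions made for demonstration. It compares a majority-class baseline against a feature-based classifier, mirroring the comparison reported in the abstract.

```python
import math
import random

random.seed(0)

# The six non-coreness annotation dimensions from the study
DIMENSIONS = ["local_importance", "topic", "content",
              "phrasing", "structure", "function"]

def make_example():
    """Generate one synthetic annotated concept (hypothetical data).

    Ratings are drawn on an assumed 1-5 scale; the label rule
    (coreness driven by local importance + topic) is invented
    purely so the sketch has learnable structure.
    """
    x = [random.randint(1, 5) for _ in DIMENSIONS]
    y = 1 if x[0] + x[1] >= 7 else 0
    return x, y

data = [make_example() for _ in range(400)]
train, test = data[:300], data[300:]

# Baseline: always predict the majority class seen in training
majority = round(sum(y for _, y in train) / len(train))
baseline_acc = sum(y == majority for _, y in test) / len(test)

# Feature-based model: logistic regression trained by SGD
w = [0.0] * len(DIMENSIONS)
b = 0.0
lr = 0.1
for _ in range(200):
    for x, y in train:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        z = max(min(z, 30.0), -30.0)        # clamp to avoid overflow
        p = 1.0 / (1.0 + math.exp(-z))      # predicted P(core)
        g = p - y                           # gradient of log loss
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z >= 0 else 0

model_acc = sum(predict(x) == y for x, y in test) / len(test)
print(f"baseline accuracy: {baseline_acc:.2f}")
print(f"feature-based accuracy: {model_acc:.2f}")
```

On this synthetic data the feature-based classifier beats the majority baseline, which is the shape of the result the abstract reports; the actual study's features, data, and classifier differ.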