Abstract: Viewing worked examples before problem solving has been shown to improve learning efficiency for novice programmers. Example-based feedback seeks to present smaller, adaptive worked-example steps during problem solving. We present a method for automatically generating and selecting adaptive, example-based programming feedback using historical student data. Our data-driven, feature-based (DDF) example generation method automatically learns program features from data and selects example pairs based on when students complete each feature. We performed an experiment comparing three example generation methods: examples drawn directly from student trace data (Student), our DDF method, and expert-authored examples (Expert). Two experts rated the quality of the feedback from each generator, judging both the Expert and DDF example feedback as significantly more relevant to students’ goals than the Student example feedback; there were no significant differences between the DDF and Expert examples. We then compared these approaches to one that combines DDF with an Interactive Selection step (DDF-IS), in which the user (in this case, an expert) selects their preferred data-driven feature before an example is selected. DDF-IS produced significantly more relevant examples than all other approaches, with significantly higher overall example quality than DDF. These results suggest that our DDF approach selects more relevant examples than existing approaches, and that we may be able to leverage interactivity with the student to further improve example quality.
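The abstract only sketches the DDF selection step, so the following Python sketch illustrates the underlying idea under stated assumptions; it is not the paper's implementation. All names and the trace format here are hypothetical: we assume each historical trace is a list of code snapshots plus a map recording the snapshot index at which each learned feature was first completed. The sketch orders features by when historical students typically completed them, then returns a before/after snapshot pair for the first feature the current student has not yet completed.

```python
from statistics import median


def order_features(traces):
    """Order features by the median fraction of the trace (0-1) at
    which historical students first completed them."""
    completion = {}  # feature -> list of fractional completion points
    for trace in traces:
        n = len(trace["snapshots"])
        for feature, idx in trace["feature_completed_at"].items():
            completion.setdefault(feature, []).append(idx / n)
    return sorted(completion, key=lambda f: median(completion[f]))


def select_example_pair(traces, student_features):
    """Select a (before, after) snapshot pair demonstrating the next
    feature the current student has not yet completed."""
    for feature in order_features(traces):
        if feature in student_features:
            continue  # the student has already completed this feature
        # Find a historical trace that completed the feature and return
        # the snapshots just before and just after its completion.
        for trace in traces:
            idx = trace["feature_completed_at"].get(feature)
            if idx:  # idx 0 is skipped: it has no "before" snapshot
                snaps = trace["snapshots"]
                return feature, snaps[idx - 1], snaps[idx]
    return None  # student has completed every known feature


# Hypothetical usage: snapshots are code strings; in the real system a
# learned feature detector, not hand labels, would assign the features.
traces = [
    {"snapshots": ["", "def f():", "def f():\n  loop", "def f():\n  loop\n  cond"],
     "feature_completed_at": {"define_fn": 1, "add_loop": 2, "add_cond": 3}},
]
print(select_example_pair(traces, student_features={"define_fn"}))
```

In the DDF-IS variant described above, the first uncompleted feature would not be chosen automatically; instead, the ordered list of candidate features would be shown to the user, and the selected example pair would demonstrate whichever feature they pick.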