Abstract: Every year, millions of students learn how to write programs. Learning activities for beginners almost always include programming tasks that require a student to write a program to solve a particular problem. When working on such a task, many students need feedback on their previous actions and hints on how to proceed. In the case of programming, the feedback should take into account the steps a student has taken towards implementing a solution, and the hints should help a student complete or improve a possibly partial solution. Only a limited number of learning environments for programming give feedback and hints on the intermediate steps students take towards a solution, and little is known about the quality of the feedback provided. To determine the quality of the feedback such tools give, and to help develop them further, we create and curate data sets that show what kinds of steps students take when solving beginners' programming exercises, and what kinds of feedback and hints should be provided. This working group aims to 1) select or create several data sets containing the steps students take to solve programming tasks, 2) introduce a method to annotate students' steps in these data sets, 3) attach feedback and hints to these steps, 4) set up a method to utilise these data sets in various learning environments for programming, and 5) analyse the quality of the hints and feedback in these learning environments.