Abstract: Novice programmers often struggle on assignments, and timely help, such as a hint on what to do next, can help students continue to progress and learn rather than give up. However, in large programming classrooms, it is difficult for instructors to provide such real-time support to every student. Researchers have therefore put tremendous effort into developing algorithms that generate automated, data-driven hints to help students at scale. Despite this, few controlled studies have directly evaluated the impact of such hints on students' performance and learning, and it remains unclear which specific design features make hints more or less effective. In this work, we present iSnap, a block-based programming environment that provides novices with data-driven, next-step hints in real time. This paper describes our improvements to iSnap over four years, including its ``enhanced'' next-step hints with three design features: textual explanations, self-explanation prompts, and an adaptive hint display. Moreover, we conducted a controlled study in an authentic classroom setting over several weeks to evaluate the impact of iSnap's enhanced hints on students' performance and learning. We found that students who received the enhanced hints performed better on in-class assignments and had higher programming efficiency on homework assignments than those who did not receive hints, but that the hints did not significantly impact students' learning. We also discuss the challenges of classroom studies and compare the impact of enhanced hints to prior evaluations in laboratory settings, which is essential for validating the efficacy of next-step hints in a real classroom experience.