In this paper, we introduce Crowdclass, a novel method that integrates the learning of advanced scientific concepts with the crowdsourcing microtask of image classification. In Crowdclass, we design questions that serve simultaneously as a learning experience and as a scientific classification. This differs from conventional citizen science platforms, which decompose high-level questions into a series of simple microtasks that require no scientific background knowledge to complete. We facilitate learning within the microtask through scaffolded learning, providing content appropriate to each participant's level of knowledge. We conduct a between-subjects study of 93 participants on Amazon Mechanical Turk comparing Crowdclass to the popular citizen science project Galaxy Zoo. We find that the scaffolded presentation of content enables learning of more challenging concepts. By quantifying the trade-off between user motivation, learning, and performance, we derive general design principles for learning-as-an-incentive interventions applicable to other crowdsourcing applications.