In recent years the field has improved the standards for replicators to follow to help ensure that a replication attempt, whether it succeeds or fails, will be informative. When these standards are not followed, false claims can result and opportunities to learn are missed, which can undermine the larger scientific enterprise and hinder the accumulation of knowledge. In the case addressed here, Li and Bates' (in press) attempt to replicate Mueller and Dweck's (1998) findings on the effects of ability versus effort praise on post-failure performance, the replicating authors did not follow best practices in the design or analysis of the study. Correcting even the simplest deviations from standard procedures yielded a clear replication of the original results. Li and Bates' data therefore provided one of the strongest possible types of evidence in support of Mueller and Dweck's (1998) findings: an independent replication by a researcher who is on record as being skeptical of the phenomenon. The present paper highlights the wisdom of upholding the field's rigorous standards for replication research. It also highlights the importance of moving beyond yes/no thinking in replication studies and toward an approach that values collaboration, generalization, and the systematic identification of boundary conditions.