"We have some 30 papers in peer-reviewed journals where we are actually sure that they are fake, and there are more to come," says Pim Levelt, chair of the committee that investigated Stapel's work at the university.Peer-reviewed,
I think a lot of people don't know what peer-review actually involves. It does not involve rerunning the other scientist's experiments. It involves reading the paper and sniffing to determine whether it passes the "smell test". In this case, there were some malodorous clues:
The data were also suspicious, the report says: effects were large; missing data and outliers were rare; and hypotheses were rarely refuted. Journals publishing Stapel's papers did not question the omission of details about where the data came from. "We see that the scientific checks and balances process has failed at several levels," Levelt says.

Note that some of these issues pertain to trends in his work as a whole, rather than being obvious in any particular paper. As noted in the article, his work seemed too good to be true.
So remember: if you're going to fake your data, do it in moderation!
Nothing succeeds like success,
but nothing exceeds like excess.