KINGSTON - Queen's University developmental psychology professor Stanka Fitneva has co-authored a study in the journal Science that explores, for the first time at this scale, the replicability of psychology research.
The Reproducibility Project: Psychology, launched nearly four years ago, is one of the first crowdsourced research studies in the field. The researchers' most important finding was that, regardless of the analytic method or criteria used, fewer than half of the replication attempts produced the same findings as the original studies.
"This is a unique project in psychology, and maybe in all of science," says Dr. Fitneva. "It's the first crowdsourcing project where a number of labs from universities all around the world are involved in an effort to see to what extent findings that are published in major journals can be replicated by independent labs."
The 270 researchers in the study, at facilities around the globe, re-examined studies from the 2008 issues of three peer-reviewed journals and attempted to reproduce each study's results. While the project hypothesized a reproducibility rate approaching 80 per cent, the authors were surprised to discover that fewer than half of the target findings were reproduced. Dr. Fitneva's team attempted to reproduce the results of an earlier study into the effects of language on children's object representation.
Dr. Fitneva and her co-authors propose three possible reasons for the surprising lack of reproducibility they encountered: small differences in how the studies were carried out; a failure, by random chance, to detect the original result; or the possibility that the original finding itself was a false positive. In addition, they highlight another possibility - the pre-eminence placed on new and innovative discoveries has incentivized researchers to aim for "new" rather than "reproducible" findings.
"Publication bias in science is a major issue and, in the last couple of years, more and more has surfaced about the detrimental consequences of this bias," says Dr. Fitneva. "Just like in any aspect of human activity, there are incentives that influence the conduct of research. Our journals have been prioritizing the publication of, and thus rewarding researchers for, novel and surprising findings.
"When we find something surprising, it catches the imagination of the public and the media just as much as it catches the imagination of researchers and journal editors. We need to balance the verification processes in science against the drive for innovation," she adds. "Assessing the reproducibility of findings is essential for scientific progress, but currently researchers receive few rewards for engaging in this practice."
The full results of the Reproducibility Project: Psychology have been published in the journal Science.