Research in the PILab focuses on developing and applying new methods for acquiring, organizing, and synthesizing psychological data. Our projects span multiple fields of psychology, including cognitive neuroimaging, personality psychology, psycholinguistics, judgment & decision-making, and working memory & executive control. The common theme uniting research in the lab is a belief in the transformative power of good methods to reveal novel insights into the nature of the human mind and brain. Some of our current lines of research include:
Neurosynth is a software framework for large-scale, automated synthesis of functional neuroimaging data. The framework uses text mining, meta-analysis, and machine learning techniques to distill the results of nearly 6,000 published fMRI articles and make them available to the neuroimaging community via a web interface (http://neurosynth.org). We introduced Neurosynth in a 2011 Nature Methods paper demonstrating that relatively simple text mining and machine learning methods can reproduce fMRI meta-analyses that previously required considerable manual effort. We also showed that the framework supports quantitative reverse inference, that is, inferring cognitive function from patterns of brain activation in a statistically principled manner. In a more recent application (Chang et al., 2012), we used the Neurosynth framework to 'decode' the cognitive functions associated with distinct insula networks, revealing greater functional specificity than is readily apparent with conventional approaches.
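To make the decoding idea concrete, here is a minimal, self-contained sketch of one common approach: ranking terms by the spatial correlation between an input activation map and each term's meta-analysis map. This is an illustration only, not the actual Neurosynth implementation; the term names and 'voxel' data below are synthetic.

```python
import numpy as np

def decode(activation_map, term_maps):
    """Rank terms by Pearson correlation between an input activation
    map and each term's meta-analysis map (both 1-D voxel vectors here).
    Returns (term, r) pairs sorted from most to least similar."""
    scores = {}
    for term, term_map in term_maps.items():
        r = np.corrcoef(activation_map, term_map)[0, 1]
        scores[term] = r
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Synthetic example: 1,000 'voxels' and three made-up term maps.
rng = np.random.default_rng(0)
signal = rng.normal(size=1000)
term_maps = {
    "working memory": signal + rng.normal(scale=0.5, size=1000),
    "emotion": rng.normal(size=1000),
    "language": rng.normal(size=1000),
}
ranked = decode(signal, term_maps)
```

In this toy case the "working memory" map is constructed to share signal with the input, so it ranks first; real decoding works the same way but over whole-brain meta-analysis images derived from the literature.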
Current work on Neurosynth focuses on expanding the codebase, developing novel interactive visualization tools, and adding real-time, web-based meta-analysis and decoding capabilities. The code, data, and resulting images are all freely available online, and we're always looking for new collaborators. Work on Neurosynth is generously supported by an R01 grant from NIMH.
Traditional pre-publication peer review of scientific output is a slow, inefficient, and unreliable process. A growing movement within the scientific community seeks to ensure that science is done out in the open, facilitating rapid communication, evaluation, and replication of research findings. One line of research in the PILab focuses on designing and promoting alternative evaluation platforms based on the recommender systems and collaborative filtering algorithms widely used in commercial web applications. For instance, in Yarkoni (2012), we joined a chorus of other commentators in calling for a shift from pre-publication review to post-publication evaluation. We proposed an evaluation platform modeled largely on social news sites like reddit and Q&A sites like Stack Overflow, and argued that successful implementation of open evaluation platforms could dramatically improve both the pace and the quality of scientific publication and evaluation.
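The collaborative filtering idea behind such platforms can be sketched in a few lines: predict how a given reader would rate an unseen paper as a similarity-weighted average of other readers' ratings. This is a generic user-based collaborative filtering toy, not a design from Yarkoni (2012); the ratings matrix is invented for illustration.

```python
import numpy as np

def predict_rating(ratings, user, item):
    """User-based collaborative filtering: predict `user`'s rating of
    `item` as a cosine-similarity-weighted average of ratings from other
    users who have rated that item. `ratings` is a users x items matrix
    in which 0 marks 'not yet rated'."""
    target = ratings[user]
    sims, vals = [], []
    for other in range(ratings.shape[0]):
        if other == user or ratings[other, item] == 0:
            continue
        # Compare the two users only on items both have rated.
        both = (target > 0) & (ratings[other] > 0)
        if both.sum() < 2:
            continue
        a, b = target[both], ratings[other][both]
        sims.append(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        vals.append(ratings[other, item])
    if not sims:
        return None  # no overlapping raters to draw on
    sims = np.array(sims)
    return float(sims @ np.array(vals) / sims.sum())

# Toy data: 4 readers x 4 papers, ratings on a 1-5 scale, 0 = unrated.
R = np.array([
    [5, 4, 0, 1],
    [4, 5, 5, 1],
    [5, 4, 4, 2],
    [1, 2, 1, 5],
])
pred = predict_rating(R, user=0, item=2)
```

Because reader 0's tastes closely track readers 1 and 2 (who rated paper 2 highly) and diverge from reader 3, the prediction lands near the high raters' scores; commercial recommender systems elaborate on exactly this weighting scheme.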