Publications Database

Welcome to the new Schulich Peer-Reviewed Publication Database!

The database is currently in beta testing and will gain additional features over time. In the meantime, stakeholders are welcome to explore our faculty’s many publications. The left-hand panel lets you search by the following:

  • Faculty Member’s Name;
  • Area of Expertise;
  • Whether the Publication is Open-Access (free for public download);
  • Journal Name; and
  • Date Range.

At present, the database covers publications from 2012 to 2020; coverage will extend further back in the future. In addition to listing publications, the database includes two types of impact metrics: Altmetrics and Plum. It will be updated annually with our faculty’s most recent publications.

If you have any questions or input, please don’t hesitate to get in touch.


Search Results

Schweinsberg, M., Madan, N., Vianello, M., Sommer, S. A., Jordan, J., Tierney, W., Awtrey, E., Zhu, L., … and Uhlmann, E. L. (2016). "The Pipeline Project: Prepublication Independent Replications of a Single Laboratory’s Research Pipeline", Journal of Experimental Social Psychology, 66, 55-67.

View Paper

Abstract: This crowdsourced project introduces a collaborative approach to improving the reproducibility of scientific research, in which findings are replicated in qualified independent laboratories before (rather than after) they are published. Our goal is to establish a non-adversarial replication process with highly informative final results. To illustrate the Pre-Publication Independent Replication (PPIR) approach, 25 research groups conducted replications of all ten moral judgment effects which the last author and his collaborators had “in the pipeline” as of August 2014. Six findings replicated according to all replication criteria, one finding replicated but with a significantly smaller effect size than the original, one finding replicated consistently in the original culture but not outside of it, and two findings failed to find support. In total, 40% of the original findings failed at least one major replication criterion. Potential ways to implement and incentivize pre-publication independent replication on a large scale are discussed.