Publications Database
Welcome to the new Schulich Peer-Reviewed Publication Database!
The database is currently in beta testing and will gain more features over time. In the meantime, stakeholders are free to explore our faculty's numerous works. The left-hand panel lets you search by the following:
- Faculty Member’s Name;
- Area of Expertise;
- Whether the Publication is Open-Access (free for public download);
- Journal Name; and
- Date Range.
At present, the database covers publications from 2012 to 2020, but it will be extended further back in the future. In addition to listing publications, the database includes two types of impact metrics: Altmetrics and Plum. The database will be updated annually with the most recent publications from our faculty.
If you have any questions or input, please don’t hesitate to get in touch.
Search Results
Miloš Fišar, Ben Greiner, Christoph Huber, Elena Katok, Ali I. Ozkes, and the Management Science Reproducibility Collaboration (2024). "Reproducibility in Management Science", Management Science, 70(3), 1343-1356.
Abstract
With the help of more than 700 reviewers, we assess the reproducibility of nearly 500 articles published in the journal Management Science before and after the introduction of a new Data and Code Disclosure policy in 2019. When considering only articles for which data accessibility and hardware and software requirements were not an obstacle for reviewers, the results of more than 95% of articles under the new disclosure policy could be fully or largely computationally reproduced. However, for 29% of articles, at least part of the data set was not accessible to the reviewer. Considering all articles in our sample reduces the share of reproduced articles to 68%. These figures represent a significant increase compared with the period before the introduction of the disclosure policy, where only 12% of articles voluntarily provided replication materials, of which 55% could be (largely) reproduced. Substantial heterogeneity in reproducibility rates across different fields is mainly driven by differences in data set accessibility. Other reasons for unsuccessful reproduction attempts include missing code, unresolvable code errors, weak or missing documentation, and software and hardware requirements and code complexity. Our findings highlight the importance of journal code and data disclosure policies and suggest potential avenues for enhancing their effectiveness.
Schweinsberg, M., Madan, N., Vianello, M., Sommer, S. A., Jordan, J., Tierney, W., Awtrey, E., Zhu, L., … and Uhlmann, E. L. (2016). "The Pipeline Project: Prepublication Independent Replications of a Single Laboratory's Research Pipeline", Journal of Experimental Social Psychology, 66, 55-67.
Abstract
This crowdsourced project introduces a collaborative approach to improving the reproducibility of scientific research, in which findings are replicated in qualified independent laboratories before (rather than after) they are published. Our goal is to establish a non-adversarial replication process with highly informative final results. To illustrate the Pre-Publication Independent Replication (PPIR) approach, 25 research groups conducted replications of all ten moral judgment effects which the last author and his collaborators had "in the pipeline" as of August 2014. Six findings replicated according to all replication criteria, one finding replicated but with a significantly smaller effect size than the original, one finding replicated consistently in the original culture but not outside of it, and two findings failed to find support. In total, 40% of the original findings failed at least one major replication criterion. Potential ways to implement and incentivize pre-publication independent replication on a large scale are discussed.
Packard, G. with 48 other authors (2014). "Investigating Variation in Replicability: A "Many Labs" Replication Project", Social Psychology, 45(3), 142-152.