Publications Database

Welcome to the new Schulich Peer-Reviewed Publication Database!

The database is currently in beta testing and will be updated with additional features over time. In the meantime, stakeholders are free to explore our faculty’s numerous works. The left-hand panel allows you to search by the following:

  • Faculty Member’s Name;
  • Area of Expertise;
  • Whether the Publication is Open-Access (free for public download);
  • Journal Name; and
  • Date Range.

At present, the database covers publications from 2012 to 2020; coverage will be extended further back in the future. In addition to listing publications, the database includes two types of impact metrics: Altmetrics and Plum. The database will be updated annually with the most recent publications from our faculty.

If you have any questions or input, please don’t hesitate to get in touch.


Search Results

Yeomans, J.S. (2018). "Computationally Testing the Efficacy of a Modelling-to-Generate-Alternatives Procedure for Simultaneously Creating Solutions", Journal of Computer Engineering, 20(1), 38-45.

Open Access Download

Abstract “Real world” applications tend to contain complex performance specifications riddled with contradictory performance elements. This state arises because policymaking naturally involves multifaceted problems that are riddled with competing performance objectives and contain incompatible design requirements which are very problematic – if not impossible – to capture at the time that the requisite decision models are constructed. There are invariably unmodelled components, not readily apparent during model formulation, which could greatly impact the suitability of the model’s solutions. Consequently, it proves preferable to generate a number of dissimilar alternatives that provide multiple, distinct perspectives to the problem. These different options should all possess close-to-optimal measures with respect to the specified objective(s), but be maximally different from each other in the decision space. These maximally different solution construction approaches have been referred to as modelling-to-generate-alternatives (MGA). This study provides a procedure that simultaneously generates multiple, maximally different alternatives by employing the metaheuristic, Firefly Algorithm. The efficacy of this efficient algorithmic optimization approach is demonstrated on a commonly-tested engineering benchmark problem.
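To make the MGA idea described in this abstract concrete, below is a minimal illustrative sketch in Python. It uses plain random search on a hypothetical objective function, not the Firefly-Algorithm-based procedure from the paper; the function, bounds, and parameter values are assumptions chosen purely for illustration.

```python
import math
import random

# Illustrative sketch of the modelling-to-generate-alternatives (MGA) idea:
# keep solutions that are close to optimal in objective value but maximally
# different from one another in decision space. This is NOT the paper's
# Firefly Algorithm procedure; it uses simple random search for clarity.

def objective(x):
    # Hypothetical objective to minimize (sphere function), for illustration only.
    return sum(xi ** 2 for xi in x)

def distance(a, b):
    # Euclidean distance in decision space.
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def random_point(dim, lo=-5.0, hi=5.0):
    return [random.uniform(lo, hi) for _ in range(dim)]

def generate_alternatives(dim=2, n_alternatives=3, tolerance=0.5, samples=20000):
    # Step 1: estimate a near-optimal objective value by random search.
    best = min((random_point(dim) for _ in range(samples)), key=objective)
    target = objective(best) + tolerance  # acceptable objective band

    # Step 2: among samples that stay within the band, greedily keep points
    # that are as far as possible (in decision space) from those already kept.
    alternatives = [best]
    candidates = [p for p in (random_point(dim) for _ in range(samples))
                  if objective(p) <= target]
    for _ in range(n_alternatives - 1):
        if not candidates:
            break
        pick = max(candidates,
                   key=lambda p: min(distance(p, a) for a in alternatives))
        candidates.remove(pick)
        alternatives.append(pick)
    return alternatives

if __name__ == "__main__":
    for i, alt in enumerate(generate_alternatives()):
        print(f"alternative {i}: x = {alt}, f(x) = {objective(alt):.3f}")
```

In the papers listed here, the random search would be replaced by the Firefly Algorithm metaheuristic, but the underlying selection idea is the same: accept alternatives whose objective values remain within a tolerance of the best-known solution while maximizing their distance from alternatives already generated.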

Imanirad, R., Yang, X.S. and Yeomans, J.S. (2013). "Modelling-to-Generate-Alternatives Via the Firefly Algorithm", Journal of Applied Operational Research, 5(1), 14-21.

Open Access Download

Abstract “Real world” decision-making often involves complex problems that are riddled with incompatible and inconsistent performance objectives. These problems typically possess competing design requirements which are very difficult – if not impossible – to quantify and capture at the time that any supporting decision models are constructed. There are invariably unmodelled design issues, not apparent during the time of model construction, which can greatly impact the acceptability of the model’s solutions. Consequently, when solving many practical mathematical programming applications, it is generally preferable to formulate numerous quantifiably good alternatives that provide very different perspectives to the problem. These alternatives should possess near-optimal objective measures with respect to all known modelled objectives, but be fundamentally different from each other in terms of the system structures characterized by their decision variables. This solution approach is referred to as modelling-to-generate-alternatives (MGA). This study demonstrates how the nature-inspired, Firefly Algorithm can be used to efficiently create multiple solution alternatives that both satisfy required system performance criteria and yet are maximally different in their decision spaces.

Imanirad, R. and Yeomans, J.S. (2012). "A Computationally Efficient Modelling-to-Generate-Alternatives Method Using the Firefly Algorithm", Lecture Notes in Management Science, 4, 30-36.

Open Access Download

Abstract In solving many practical mathematical programming applications, it is preferable to formulate numerous quantifiably good alternatives that provide very different perspectives to the problem. This is because decision-making typically involves complex problems that are riddled with incompatible and inconsistent performance objectives and possess competing design requirements which are very difficult – if not impossible – to quantify and capture at the time that the supporting decision models are constructed. There are invariably unmodelled design issues, not apparent at the time of model construction, which can greatly impact the acceptability of the model’s solutions. Consequently, it is preferable to generate several alternatives that provide multiple, disparate perspectives to the problem. These alternatives should possess near-optimal objective measures with respect to all known modelled objective(s), but be fundamentally different from each other in terms of the system structures characterized by their decision variables. This solution approach is referred to as modelling-to-generate-alternatives (MGA). This study demonstrates how the biologically-inspired, Firefly algorithm can be used to efficiently create multiple solution alternatives that both satisfy required system performance criteria and yet are maximally different in their decision spaces.

Imanirad, R., Yang, X.S. and Yeomans, J.S. (2012). "A Co-Evolutionary, Nature-Inspired Algorithm for the Concurrent Generation of Alternatives", Journal on Computing, 2(3), 101-106.

Open Access Download

Abstract Engineering optimization problems usually contain multifaceted performance requirements that can be riddled with unquantifiable specifications and incompatible performance objectives. Such problems typically possess competing design requirements which are very difficult – if not impossible – to quantify and capture at the time of model formulation. There are invariably unmodelled design issues, not apparent at the time of model construction, which can greatly impact the acceptability of the model’s solutions. Consequently, when solving many “real life” mathematical programming applications, it is generally preferable to formulate several quantifiably good alternatives that provide very different perspectives to the problem. These alternatives should possess near-optimal objective measures with respect to all known modelled objective(s), but be fundamentally different from each other in terms of the system structures characterized by their decision variables. This solution approach is referred to as modelling-to-generate-alternatives (MGA). This study demonstrates how the nature-inspired, Firefly Algorithm can be used to concurrently create multiple solution alternatives that both satisfy required system performance criteria and yet are maximally different in their decision spaces. This new co-evolutionary approach is very computationally efficient, since it permits the concurrent generation of multiple, good solution alternatives in a single computational run rather than the multiple implementations required in previous MGA procedures.