Carnegie Mellon University
January 18, 2023

Tepper Winter/Spring Research Rundown

A collection of cutting-edge research happening at the Tepper School.


A Mechanism to Allocate Seats in Courses to Students Efficiently and Fairly 

Alexey Kushnir, Associate Professor of Economics

Every academic term, thousands of U.S. post-secondary institutions assign course schedules to millions of students. In a new article, researchers propose a mechanism for allocating course seats to students based on the idea of a competitive market that uses a fake currency. The new mechanism ensures that course seats are allocated efficiently and fairly, subject to course priority constraints based on factors such as students’ seniority and majors, and it leaves students with only a limited ability to manipulate the system.
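
As a purely illustrative sketch of the fake-currency idea (not the authors’ actual mechanism, which computes approximate competitive-equilibrium prices under priority constraints), the toy Python example below gives students roughly equal budgets of fake money, lets each student buy the best course bundle they can afford, and nudges seat prices up or down until demand roughly matches capacity. The course names, utilities, budgets, and price-update rule are all hypothetical.

from itertools import combinations

def demand(prices, budgets, utilities, max_courses=2):
    """Each student buys the affordable bundle (up to max_courses seats) with the highest total utility."""
    chosen = {}
    for student, budget in budgets.items():
        best_bundle, best_value = (), 0.0
        courses = list(utilities[student])
        for r in range(1, max_courses + 1):
            for bundle in combinations(courses, r):
                cost = sum(prices[c] for c in bundle)
                value = sum(utilities[student][c] for c in bundle)
                if cost <= budget and value > best_value:
                    best_bundle, best_value = bundle, value
        chosen[student] = best_bundle
    return chosen

def clear_market(capacity, budgets, utilities, steps=500, eta=0.05):
    prices = {c: 1.0 for c in capacity}
    for _ in range(steps):
        chosen = demand(prices, budgets, utilities)
        for c in capacity:
            enrolled = sum(c in bundle for bundle in chosen.values())
            # Raise prices of over-demanded courses; lower prices of under-demanded ones.
            prices[c] = max(0.0, prices[c] + eta * (enrolled - capacity[c]))
    return prices, demand(prices, budgets, utilities)

capacity = {"Econ": 1, "Stats": 2}                        # seats per course (hypothetical)
budgets = {"Ann": 10.0, "Bob": 10.5}                      # approximately equal fake-money budgets
utilities = {"Ann": {"Econ": 5, "Stats": 3},
             "Bob": {"Econ": 4, "Stats": 4}}
prices, assignment = clear_market(capacity, budgets, utilities)
print(prices)
print(assignment)

In this toy run, the price of the over-demanded course rises until demand equals its capacity; the slightly unequal budgets are used only to break ties in the illustration.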

Using simulations and theoretical analysis, the researchers show how university registrars who adopt the mechanism can improve course allocation compared to state-of-the-art approaches. The mechanism also provides registrars with valuable information on student demand, making it easier to adjust class sizes, timing, and sections to boost student satisfaction. The article, Undergraduate Course Allocation Through Competitive Markets, appears as a preprint on the SSRN website and is authored by Kornbluth, D (Carnegie Mellon University), and Kushnir, A (Carnegie Mellon University). Copyright 2022. All rights reserved.

A New Way to Help Auditors and Risk-Management Professionals Address Anomalies

Pierre Jinghong Liang, Professor of Accounting

How can money laundering be spotted automatically in large-scale accounting data? How can temporal correlation patterns be discovered in time-evolving graph datasets? And when is the best time to begin analyzing these issues? In a new article, researchers propose AutoAudit, a new way to help auditors and risk-management professionals address these questions. After testing the method on real-world datasets, where it identified nearly 100 percent of the anomalies in actual data, the authors conclude that the approach is general enough to be easily modified to solve problems in other domains.

The article, AutoAudit: Mining Time-Evolving and Accounting Graphs, appears in KDD’20 and is authored by Lee, M-C (National Chiao Tung University), Wang, A (Carnegie Mellon University), Liang, PJ (Carnegie Mellon University), Akoglu, L (Carnegie Mellon University), Tseng, VS (National Chiao Tung University), and Faloutsos, C (Carnegie Mellon University). Copyright 2020 Association for Computing Machinery. All rights reserved.

Detecting Anomalous Graphs in Labeled Multi-Graph Databases

Pierre Jinghong Liang, Professor of Accounting

Corporations can have hundreds of thousands of annual transaction records, making it difficult to identify the abnormal ones, some of which may indicate entry errors or employee misconduct. Also challenging is spotting anomalous daily e-mail and call interactions or software programs with bugs. To address these problems, in a new article, researchers introduce a technique for detecting anomalies called CODEtect.

Although they developed the system for business accounting, it can also be used in other domains. The authors apply the technique to annual transaction records from three corporations, ranging from small to large scale, and also consider the Enron scandal. In doing so, they show that CODEtect outperforms existing techniques in detecting injected anomalies that mimic known malicious schemes in accounting.

The article, Detecting Anomalous Graphs in Labeled Multi-Graph Databases, appears in ACM Transactions on Knowledge Discovery from Data and is authored by Nguyen, HG (Princeton University), Liang, PJ (Carnegie Mellon University), and Akoglu, L (Carnegie Mellon University). Copyright 2022 Association for Computing Machinery. All rights reserved.

Bookkeeping Graphs: Computational Theory and Applications

Pierre Jinghong Liang, Professor of Accounting

Although it is 500 years old, double-entry bookkeeping remains a foundation of the financial infrastructure of any modern organization. In a new monograph, Pierre Liang of Carnegie Mellon University’s Tepper School of Business describes the graph or network representation of double-entry bookkeeping in both theory and practice. The representation serves as the intellectual basis for applied computational work on pattern recognition and anomaly detection in corporate journal-entry audit settings.

Liang discusses the computational theory of pattern recognition and anomaly detection as it is currently practiced in machine learning, especially the so-called Minimum Description Length (MDL) approach. The monograph concludes with a description of how the computational MDL theory is applied to recognize patterns and detect anomalous transactions in graphs representing the journal entries of a large set of transactions extracted from real-world corporate entities’ bookkeeping data.
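
As a hedged, minimal illustration of the MDL intuition only (not the method developed in the monograph), the short Python sketch below builds a frequency table of journal-entry patterns and scores each entry by its description cost in bits; rare patterns are expensive to encode and surface as candidate anomalies. The account names and entries are made up.

import math
from collections import Counter

# Hypothetical journal entries, each summarized as a (debit account, credit account) pattern.
entries = [
    ("Cash", "Revenue"), ("Cash", "Revenue"), ("Inventory", "AccountsPayable"),
    ("Cash", "Revenue"), ("Inventory", "AccountsPayable"),
    ("Cash", "ShareholderLoan"),   # rare pattern -> high description cost
]

counts = Counter(entries)
total = sum(counts.values())

def description_cost(entry):
    # Bits needed under an idealized optimal code for the observed pattern
    # frequencies: -log2 of the empirical probability of the pattern.
    return -math.log2(counts[entry] / total)

for entry in counts:
    print(entry, round(description_cost(entry), 2), "bits")
# The rare ("Cash", "ShareholderLoan") entry has the largest description cost.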

The monograph, Bookkeeping Graphs: Computational Theory and Applications, is currently in preparation to appear in Foundations and Trends in Accounting, published by now publishers (the essence of knowledge), and is authored by Liang, PJ (Carnegie Mellon University). Copyright 2022 The Author. All rights reserved.

Summarizing Labeled Multi-Graphs

Pierre Jinghong Liang, Professor of Accounting

Real-world graphs can be difficult to interpret and visualize beyond a certain size. To address this issue, graph summarization aims to simplify and shrink a graph while maintaining its high-level structure and characteristics. Most summarization methods are designed for homogeneous, undirected, simple graphs; however, many real-world graphs are more complicated. In a new article, researchers propose TG-sum, a versatile yet rigorous graph summarization model that can handle complex graphs. The authors’ experiments demonstrate that TG-sum facilitates the visualization of real-world complex graphs, revealing interpretable structures and high-level relationships. The method also achieves a better tradeoff between compression rate and running time than existing methods in comparable settings.
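
Purely as an illustration of the basic summarization idea (not the TG-sum model itself, which handles directed, labeled multi-graphs and chooses the grouping automatically), the short Python sketch below groups nodes into supernodes and keeps only aggregate super-edge counts; the node names and grouping are hypothetical.

from collections import Counter

edges = [("a1", "b1"), ("a1", "b2"), ("a2", "b1"), ("a2", "b2"), ("a1", "a2")]
supernode = {"a1": "A", "a2": "A", "b1": "B", "b2": "B"}   # hypothetical grouping

# Collapse each edge to the pair of supernodes it connects and count multiplicities.
superedges = Counter((supernode[u], supernode[v]) for u, v in edges)
print(superedges)                                  # Counter({('A', 'B'): 4, ('A', 'A'): 1})
print(len(superedges), "super-edges summarize", len(edges), "original edges")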

The article, Summarizing Labeled Multi-Graphs, appears in the Proceedings of the 2022 European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases and is authored by Berberidis, D (Carnegie Mellon University), Liang, PJ (Carnegie Mellon University), and Akoglu, L (Carnegie Mellon University). Copyright 2022. All rights reserved.

Shifting Determinants of Homeownership

Robert A. Miller, Richard M. Cyert and Morris DeGroot Professor of Economics and Statistics; Professor of Economics and Strategy

Homeownership has declined in the last 50 years as a result of individuals postponing the purchase of their first home. This delay has coincided with postponements of marriage and fertility, and with increases in women’s participation in the labor force. Researchers developed a dynamic model of female labor supply, fertility (i.e., timing of births), and transition from renting to owning a first home, to create a unifying framework that integrates these joint decisions.

Data came from the Panel Study of Income Dynamics (1968 to 1993), a longitudinal survey of U.S. families that measures economic, social, and health factors over multiple generations. Higher house prices and increased wage rates for women caused many households to postpone the purchase of a first home because leisure and fertility complement homeownership. Education and women’s participation in the workforce reinforce and raise the value of owning a home. The researchers’ estimates suggest that the effects of rising house prices and wage rates more than offset the effects of higher levels of education and workforce participation.

The article, American Dream Delayed: Shifting Determinants of Homeownership, appears in International Economic Review and is authored by Khorunzhina, N (Copenhagen Business School), and Miller, RA (Carnegie Mellon University). Copyright 2021 the Economics Department of the University of Pennsylvania and the Osaka University Institute of Social and Economic Research Association.

Examining the Effect of the Sarbanes-Oxley Act on CEOs’ Compensation and Incentives

Robert A. Miller, Richard M. Cyert and Morris DeGroot Professor of Economics and Statistics; Professor of Economics and Strategy

The Sarbanes-Oxley Act was passed in 2002 to mandate practices in U.S. public companies’ financial record keeping and reporting. The law was passed in response to several accounting scandals that resulted in the dismissal of CEOs and, in some cases, prosecution for fraud, conviction, and imprisonment. In a new study, researchers used data from S&P 1500 firms to examine the effect of the law on CEOs’ compensation plans and incentives. The authors use four measures to quantify the law’s impact on the inherent conflict of interest between shareholders and their CEO and on the cost of resolving that conflict through incentives embedded in the CEO’s compensation package.

The study found that the law 1) reduced the conflict of interest between shareholders and their CEOs, mainly by reducing shareholders’ losses from CEOs deviating from the goal of expected value maximization; 2) increased the cost of agency, or the risk premium CEOs are paid to align their interests with those of their shareholders; 3) raised administrative costs in the primary sector (including utilities and energy), while in the other sectors studied (services and consumer goods) the effect was more nuanced; and 4) had no effect on CEOs’ attitudes toward risk.

The article, Was Sarbanes-Oxley Costly? Evidence from Optimal Contracting on CEO Compensation, appears in the Journal of Accounting Research and is authored by Gayle, G-L (Washington University in St. Louis and Federal Reserve Bank of St. Louis), Li, C (New York University-Shanghai), and Miller, RA (Carnegie Mellon University). Copyright 2022 The Chookaszian Accounting Research Center at the University of Chicago Booth School of Business. All rights reserved.

Markets With Within-Type Adverse Selection

Anh Nguyen, Assistant Professor of Economics

In economics, adverse selection generally refers to a situation in which sellers possess information about the quality of a product that buyers do not, or vice versa. This imbalance—called asymmetric information—can occur in a variety of transactions, including sales of used cars or insurance, and can contribute to market failures.

In a new study, researchers examined bilateral trades in which a seller owns multiple units of a product, each of potentially different quality and indistinguishable by the buyer before the transaction. The study explains why a bulk purchase contract is often used in markets with these features. More generally, the study found that every optimal allocation that takes into account the buyer’s incentive to sell bad-quality products first must involve a trade with a threshold property: if the seller has more than some threshold number of units, the entire endowment is traded, but if the seller has fewer than that threshold, they trade all but the high-quality units.
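
Purely as a hedged numerical illustration of this threshold property (the threshold, endowment sizes, and number of high-quality units below are made up; the paper derives them from the model’s primitives), a small Python sketch:

def traded_units(endowment, high_quality_units, threshold):
    """Units traded under a threshold allocation of the kind described above."""
    if endowment >= threshold:
        return endowment                              # the entire endowment is traded
    return max(endowment - high_quality_units, 0)     # keep the high-quality units

for endowment in range(1, 7):
    print(endowment, "units owned ->",
          traded_units(endowment, high_quality_units=2, threshold=4), "units traded")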

The article, Markets With Within-Type Adverse Selection, appears in The American Economic Journal: Macroeconomics and is authored by Nguyen, A (Carnegie Mellon University), and Tan, TY (University of Nebraska-Lincoln). Copyright 2022. All rights reserved.

Technical and Intellectual Forces That Have Generated the “Standardization Trap” in Macroeconomics

Stephen Spear, Professor of Economics

In macroeconomic analysis, two competing dynamic stochastic general equilibrium models have evolved as workhorse models. They are used in monetary economics, business cycle theory, economic growth, public finance/optimal taxation, and fiscal policy analysis. In a new book, Overlapping Generations: Methods, Models, and Morphology, authors Stephen Spear and Warren Young explain the technical and intellectual forces that have generated the current situation, in which macroeconomics seems to be caught in a “standardization trap” based on a problematic choice of models.

The first model, the Infinite Lived Agent (ILA) model, was developed in 1928 as a way to model economic growth; the second, the Overlapping Generations (OLG) model, first appeared in 1947 and contained features that contradicted the first. The specific differences come from the way the two models view families. In the ILA model, families are treated as dynasties that last forever. In the OLG model, families live finite lives and make only accidental bequests to their offspring. Both models have been refined over the years, and both have been the subject of numerous articles and other writing, with the ILA model carrying more weight in the literature than the OLG. In their book, the researchers examine key papers in the literature on the development of the two models and show how and why the OLG model became less popular over time while the ILA framework grew in popularity, even though the hypothesis that parents optimally plan out their children’s economic lives has been thoroughly refuted by the data.

The book, Overlapping Generations: Methods, Models, and Morphology, is authored by Stephen Spear (Carnegie Mellon University) and Warren Young (Bar Ilan University) and will be published by Emerald Press as a monograph in the International Symposia in Economic Theory and Econometrics series, William A. Barnett, ed. Copyright 2022. All rights reserved.

How Split Liver Transplantation Could Benefit Waiting Patients

Yanhan (Savannah) Tang, Ph.D. Student, Operations Management; Andrew Li, Assistant Professor of Operations Research; Alan Scheller-Wolf, Richard M. Cyert Professor of Operations Management; Sridhar R. Tayur, University Professor of Operations Management

Individuals gain proficiency through experience-based learning, also called “learning by doing.” In a new study, researchers formulated a multi-armed bandit model (a problem in which a fixed set of resources must be allocated between competing choices to maximize expected gain) in which learning curves were embedded in reward functions to capture experience-based learning. They focused on learning how to do complex surgeries, such as split liver transplantation (SLT), which involves making two parts of the same liver available for transplantation, helping two patients instead of just one.

The United States lacks an adequate number of donor livers, so increased use of SLT could significantly benefit waiting patients. Donated livers (the fixed resource) must be allocated across transplant centers, which gain proficiency as they perform more surgeries and which differ in their maximum attainable proficiency, so as to maximize patient welfare.

The model includes provisions ensuring that the choices of arms (or resources) are subject to fairness constraints (to ensure equity across patient classes), incorporate queueing dynamics (to capture waiting times as patients await transplants), and allow for arm dependence (to capture learning by surgical teams across similar surgeries).
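
Purely as a hedged illustration of the “learning curves embedded in reward functions” ingredient (fairness, queueing, and arm dependence are omitted, and the algorithm shown is a plain epsilon-greedy baseline of the kind the paper’s algorithms are designed to beat, not the authors’ method), a toy Python sketch with hypothetical parameters:

import random

class LearningArm:
    """An arm (e.g., a transplant center) whose success probability improves with experience."""
    def __init__(self, start, ceiling, rate):
        self.pulls, self.start, self.ceiling, self.rate = 0, start, ceiling, rate

    def pull(self):
        # Saturating learning curve: success probability rises from `start` toward `ceiling`.
        p = self.ceiling - (self.ceiling - self.start) * (1 - self.rate) ** self.pulls
        self.pulls += 1
        return 1.0 if random.random() < p else 0.0

def epsilon_greedy(arms, horizon=5000, eps=0.1):
    totals, counts, reward = [0.0] * len(arms), [0] * len(arms), 0.0
    for _ in range(horizon):
        if random.random() < eps:
            i = random.randrange(len(arms))
        else:
            # Pick the arm with the best empirical success rate (optimistic when untried).
            i = max(range(len(arms)), key=lambda j: totals[j] / counts[j] if counts[j] else 1.0)
        r = arms[i].pull()
        totals[i] += r
        counts[i] += 1
        reward += r
    return reward, counts

random.seed(0)
centers = [LearningArm(start=0.3, ceiling=0.9, rate=0.01),   # slow start, high ultimate proficiency
           LearningArm(start=0.5, ceiling=0.6, rate=0.05)]   # fast start, low ultimate proficiency
print(epsilon_greedy(centers))

A myopic rule like this tends to favor the arm that starts strong and can under-invest in the arm with the higher ultimate proficiency, which is the kind of tradeoff the researchers’ algorithms are built to manage.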

When applying the model to SLT, the researchers’ algorithms performed better than standard bandit algorithms in a setting with experience-based learning, fairness, queueing, and arm dependence. The findings have managerial implications, including in helping evaluate strategies to boost the proliferation of this type of transplant and other technically difficult medical procedures.

The article, Multi-Armed Bandits with Endogenous Learning Curves and Queueing: An Application to Split Liver Transplantation, appears as a working paper and is authored by Tang, YS (Carnegie Mellon University), Li, A (Carnegie Mellon University), Scheller-Wolf, A (Carnegie Mellon University), and Tayur, S (Carnegie Mellon University). Copyright 2022. All rights reserved.

Optimal Scheduling in Multiserver-Job Systems Under Heavy Traffic

Isaac Grosof, Ph.D. Student, Computer Science; Mor Harchol-Balter, Professor of Computer Science; Alan Andrew Scheller-Wolf, Richard M. Cyert Professor of Operations Management

Multiserver-job systems, in which jobs require concurrent service at many servers, are becoming widespread; consider, for example, the computing centers of companies like Google, Amazon, and Microsoft. Much work on multiserver-job systems focuses on maximizing utilization, but scheduling must also be coordinated to optimize response time: servers should not be left unnecessarily idle, so minimizing response time requires prioritizing small jobs while maintaining high throughput. In a new study, researchers sought to answer the question of how to achieve these joint objectives. They devised a scheduling policy called ServerFilling-SRPT, the first strategy shown to minimize mean response time in the multiserver-job model in heavy traffic. ServerFilling-SRPT outperformed all existing scheduling policies at all loads, with improvements of orders of magnitude at higher loads.
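
As a hedged sketch of the two ingredients named above only, and not the exact ServerFilling-SRPT policy from the paper, the short Python example below considers jobs in shortest-remaining-processing-time order and admits them greedily until the servers are filled; the job data are hypothetical.

def select_jobs(jobs, num_servers):
    """jobs: list of (job_id, remaining_time, servers_needed)."""
    running, free = [], num_servers
    for job_id, remaining, need in sorted(jobs, key=lambda j: j[1]):   # SRPT order
        if need <= free:
            running.append(job_id)
            free -= need
        if free == 0:
            break
    return running

jobs = [("A", 2.0, 4), ("B", 5.0, 2), ("C", 1.0, 2), ("D", 9.0, 8)]
print(select_jobs(jobs, num_servers=8))   # ['C', 'A', 'B'] -- all 8 servers busy, small jobs first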

The article, Optimal Scheduling in the Multiserver-Job Model Under Heavy Traffic, appears in ACM Sigmetrics (Conference) and is authored by Grosof, I (Carnegie Mellon University), Scully, Z (University of California Berkeley, Carnegie Mellon University, Cornell University), Harchol-Balter, M (Carnegie Mellon University), and Scheller-Wolf, A (Carnegie Mellon University). Copyright 2022 The Authors. All rights reserved.