NYU Stern Department of Economics
Welcome to the homepage of the Department of Economics at NYU Stern. Please use the links above to access information about the professionals affiliated with the department, recent research, news, and our involvement with Stern's educational programs. The best way to stay in close touch is to follow us on X @NYUSternEcon.
News
2025–2026 Job Market Candidates
Stern Economics PhD students currently on the job market:
Agata Farina, job market paper "Communicating with Data-Generating Processes: An Experimental Analysis"
Brandon Kaplowitz, job market paper "Reinforcement Learning and Consumption-Savings Behavior"
Giulia Brancaccio awarded the 2025 Carlo Alberto Medal
Assistant Professor of Economics Giulia Brancaccio has been awarded the 2025 Carlo Alberto Medal. This biennial award by the Collegio Carlo Alberto recognizes an Italian scholar under the age of 40 for outstanding research contributions to the field of economics.
The award announcement noted Brancaccio’s innovative research at the intersection of industrial organization and market design that combines theoretical modeling with empirical analysis....
Felix Montag is a recipient of this year's "Yuki Arai Faculty Research Prize in Finance"
Felix Montag, Assistant Professor of Economics at NYU Stern, is one of the recipients of this year's Yuki Arai Faculty Research Prize in Finance.
The prize, established by alumnus Yuki Arai (MBA ’10), aims to recognize and promote excellence in research by Stern faculty. The award is bestowed annually on the best research paper in Finance by a Stern faculty member.
The committee recognized Felix's paper, "Mergers, Foreign Competition, and Jobs: Evidence from the U.S. Appliance Industry."
Petra Moser cited in 2025 Economic Report of the President
Work by Petra Moser (with B. Biasi and D. Deming) is cited in the just-released Economic Report of the President. The paper "Education and Innovation" (2022) is cited in Chapter 7, "The K-12 Education System: Economic Impacts and Opportunities for Innovation" (pp. 244–245).
The report cites the findings of Biasi, Deming, and Moser as evidence of the importance of K-12 education in preparing students for higher education and becoming future innovators.
Research

Abdou Ndiaye, Meghana Gaur, John R. Grigsby, and Jonathon Hazell's "Bonus Question: Does Flexible Incentive Pay Dampen Unemployment Dynamics?"
We introduce dynamic incentive contracts into a model of unemployment dynamics and present three results. First, wage cyclicality from incentives does not dampen unemployment dynamics: the response of unemployment to shocks is first-order equivalent in an economy with flexible incentive pay and without bargaining, vis-à-vis an economy with rigid wages. Second, wage cyclicality from bargaining dampens unemployment dynamics through the standard mechanism. Third, our calibrated model suggests 46%...

Adam Brandenburger, Patricia Contreras-Tejada, Pierfrancesco La Mura, Giannicola Scarpa, and Kai Steverson's "Agreement and Disagreement in a Non-Classical World," published in Philosophical Transactions of the Royal Society A
The Agreement Theorem (Aumann, 1976) states that if two Bayesian agents start with a common prior, then they cannot have common knowledge that they hold different posterior probabilities of some underlying event of interest. In short, the two agents cannot "agree to disagree." This result holds in the classical domain, where classical probability theory applies. But in non-classical domains, such as the quantum world, classical probability theory does not apply. Inspired principally by their...
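For readers less familiar with the classical result, one standard formalization (our notation, not necessarily the paper's) has two agents share a prior P on a state space Ω, with information partitions I_1 and I_2, and an event E of interest:
\[
\text{common knowledge at } \omega \ \text{of} \ \big\{\, P(E \mid \mathcal{I}_1) = q_1 \ \text{and} \ P(E \mid \mathcal{I}_2) = q_2 \,\big\} \;\Longrightarrow\; q_1 = q_2 .
\]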

Adam Brandenburger, Amanda Friedenberg, and H. Jerome Keisler's "The Relationship Between Strong Belief and Assumption," published in Synthese
We define two maps: one from the set of conditional probability systems (CPSs) onto the set of lexicographic probability systems (LPSs), and another from the set of LPSs with full support onto the set of CPSs. We use these maps to establish a relationship between strong belief (defined on CPSs) and assumption (defined on LPSs). This yields a relationship at the abstract level between these two widely used notions of belief in an extended probability-theoretic setting.

Chris Conlon, Nathan Miller, Tsolmon Otgon, and Yi Yao's "Rising Markups, Rising Prices?" published in the American Economic Association Papers and Proceedings
The rise in markups and market power documented by De Loecker, Eeckhout, and Unger (2020) has recently generated much discussion in economics. We measure the correlation between the change in firm-level markups and the change in industry-level prices as measured by the Producer Price Index and find little to no relationship, both for 1980–2018 and for 2018–present. While a "false negative" result due to mismeasurement is possible, it also raises the possibility that firms have not passed along...
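As a rough sketch of the object being measured (our notation; the paper's exact aggregation and estimator may differ), the markup in the De Loecker–Eeckhout–Unger tradition is the ratio of price to marginal cost, and the exercise above asks whether industry-level PPI price changes co-move with markup changes:
\[
\mu_{it} = \frac{P_{it}}{MC_{it}},
\qquad
\text{the question being whether } \operatorname{Corr}\!\big(\Delta \log \mu_{jt},\ \Delta \log P^{\mathrm{PPI}}_{jt}\big) \text{ is large, where } j \text{ indexes industries.}
\]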

Paul Scott, Adrian Torchiana, Ted Rosenbaum, and Eduardo Souza-Rodrigues' "Improving Estimates of Transitions from Satellite Data: A Hidden Markov Model Approach," published in The Review of Economics and Statistics
Satellite-based image classification facilitates low-cost measurement of the Earth's surface composition. However, misclassified imagery can lead to misleading conclusions about transition processes. We propose a correction for transition rate estimates based on the econometric measurement error literature to extract the signal (truth) from its noisy measurement (satellite-based classifications). No ground-truth data is required in the implementation. Our proposed correction produces...
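A stylized version of the kind of misclassification correction suggested by the measurement-error literature (our notation; the paper's hidden Markov model estimator may differ in detail): if classification errors are independent over time conditional on the true land-cover state, the observed joint distribution of classified pairs is a known transformation of the true one,
\[
J^{\mathrm{obs}} = \Pi^{\top} J^{\mathrm{true}}\,\Pi,
\qquad
J^{\mathrm{true}}_{k\ell} = \Pr(s_t = k,\ s_{t+1} = \ell),
\qquad
\Pi_{km} = \Pr(\hat{s}_t = m \mid s_t = k),
\]
so that, when \(\Pi\) is invertible, \(J^{\mathrm{true}} = (\Pi^{\top})^{-1} J^{\mathrm{obs}}\,\Pi^{-1}\), and corrected transition rates follow by row-normalizing \(J^{\mathrm{true}}\).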

Adam Brandenburger, Ye Jin, and Zhen Zhou's "Coordination via delay: Theory and experiment," published in Games and Economic Behavior
This paper studies the effect of introducing an option of delay in coordination games—that is, of allowing players to wait and then choose between the risk-dominant and payoff-dominant actions. The delay option enables forward-induction reasoning to operate, whereby a player's waiting and not choosing the risk-dominant action right away signals an intention to choose the payoff-dominant action later. If players have ϵ-social preferences—they help others if they can do so at no cost to...
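For readers unfamiliar with the terminology, a textbook stag-hunt game (purely illustrative payoffs, not taken from the paper) distinguishes the two actions: (P, P) is the payoff-dominant equilibrium, while (R, R) is risk-dominant because the product of the players' losses from unilateral deviation is larger there (7 × 7 versus 2 × 2).
\[
\begin{array}{c|cc}
 & P & R \\ \hline
P & 9,\,9 & 0,\,7 \\
R & 7,\,0 & 7,\,7
\end{array}
\]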

Adam Brandenburger and Stefan Bucher's "Divisive Normalization is an Efficient Code for Multivariate Pareto-Distributed Environments," published in PNAS
Divisive normalization is a canonical computation in the brain, observed across neural systems, that is often considered to be an implementation of the efficient coding principle. We provide a theoretical result that makes the conditions under which divisive normalization is an efficient code analytically precise: We show that, in a low-noise regime, encoding an n-dimensional stimulus via divisive normalization is efficient if and only if its prevalence in the environment is described by...
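For context, a common textbook form of divisive normalization (generic notation; the paper's exact specification may differ) divides each input by a weighted sum of all inputs plus a constant:
\[
y_i = \frac{x_i}{\sigma + \sum_{j=1}^{n} w_j x_j}, \qquad i = 1, \dots, n,
\]
where \(\sigma > 0\) and the weights \(w_j \ge 0\) are parameters of the normalization pool.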

Adam Brandenburger, Pierfrancesco La Mura, and Stuart Zoble's "Rényi Entropy, Signed Probabilities, and the Qubit," published in Entropy
The states of the qubit, the basic unit of quantum information, are 2×2 positive semi-definite Hermitian matrices with trace 1. We contribute to the program to axiomatize quantum mechanics by characterizing these states in terms of an entropic uncertainty principle formulated on an eight-point phase space. We do this by employing Rényi entropy (a generalization of Shannon entropy) suitably defined for the signed phase-space probability distributions that arise in representing quantum states.
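For reference, the standard Rényi entropy of order \(\alpha\) for an ordinary probability distribution \(p = (p_1, \dots, p_n)\) is given below; the paper works with a suitably extended version for the signed phase-space probabilities that arise in representing qubit states.
\[
H_{\alpha}(p) = \frac{1}{1-\alpha}\,\log \sum_{i=1}^{n} p_i^{\alpha}, \qquad \alpha > 0,\ \alpha \neq 1,
\]
which recovers the Shannon entropy \(-\sum_i p_i \log p_i\) in the limit \(\alpha \to 1\).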

Luis Cabral and Gabriel Natividad's "Bundling Sequentially Released Durable Goods," published in the Journal of Industrial Economics
Suppose two durables are sequentially released and suppose that consumer valuations of these goods are positively correlated. By the time the second good is released, high-valuation buyers are out of the market for the first good. Therefore, a bundle can be targeted at the low-valuation consumers without violating the high-valuation consumers' incentive compatibility constraint. We test the model's predictions on data from retail DVD sales in the 2000s. Consistent with theory, our estimates...

Larry White's "The Dead Hand of Cellophane and the Federal Google and Facebook Antitrust Cases: Market Delineation Will Be Crucial," published in The Antitrust Bulletin
The U.S. Department of Justice (DOJ) and Federal Trade Commission (FTC) monopolization cases against Google and Facebook, respectively, represent the most important federal nonmerger antitrust initiatives since (at least) the 1990s. As in any monopolization case, market delineation will be a central feature of both cases—as it was in the du Pont Cellophane case of sixty-five years ago. Without a delineated market, how can one determine whether a company has engaged in monopolization?