I am a PhD student in financial economics at the Yale School of Management. My research focuses on asset pricing, financial econometrics, text data, and high-dimensional Bayesian statistics. Before Yale, I was a researcher at the Booth School of Business, received a master's degree in statistics from the University of Michigan, and completed my undergraduate degree in economics at the University of Chicago.
(With Bryan Kelly and Yinan Su)
We seek fundamental risks from news text. Conceptually, news is closely related to the idea of systematic risk, in particular the "state variables" in the ICAPM. News captures investors' concerns about future investment opportunities and hence drives the current pricing kernel. This paper demonstrates a way to extract a parsimonious set of risk factors, and ultimately a univariate pricing kernel, from news text. The state variables are selected and dimension-reduced from variation in the attention allocated to different news narratives. As a result, the risk factors attain clear text-based interpretability as well as top-of-the-line asset pricing performance. The empirical method integrates topic modeling (LDA), latent factor analysis (IPCA), and variable selection (group lasso).
Code for our penalized IPCA method can be found here: https://github.com/lbybee/ipca_reg
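As a minimal illustration of the variable-selection step, the sketch below implements block soft-thresholding, the proximal operator behind the group-lasso penalty, in plain NumPy. The grouping and coefficient values are hypothetical; the actual penalized IPCA estimator in the repository is considerably more involved.

```python
import numpy as np

def group_soft_threshold(beta, groups, lam):
    """Block soft-thresholding: the proximal operator of the group-lasso
    penalty. Coefficients are zeroed out a whole group at a time, which is
    how an entire news theme can drop out of the factor model."""
    out = beta.copy()
    for g in set(groups):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        norm = np.linalg.norm(beta[idx])
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        out[idx] = scale * beta[idx]
    return out

# Hypothetical coefficients spread over three topic groups
beta = np.array([0.05, -0.02, 1.5, 0.8, 0.01])
groups = [0, 0, 1, 1, 2]
shrunk = group_soft_threshold(beta, groups, lam=0.1)
```

Groups whose coefficient norm falls below the penalty level are set exactly to zero, while surviving groups are shrunk proportionally.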
(With Bryan Kelly, Asaf Manela, and Dacheng Xiu)
We propose an approach to measuring the state of the economy via textual analysis of business news. From the full text of 800,000 Wall Street Journal articles for 1984–2017, we estimate a topic model that summarizes business news into interpretable topical themes and quantifies the proportion of news attention allocated to each theme over time. News attention closely tracks a wide range of economic activities and explains 25% of aggregate stock market returns. A text-augmented VAR demonstrates the large incremental role of news text in modeling macroeconomic dynamics. We use this model to retrieve the narratives that underlie business cycle fluctuations.
Data available here: http://structureofnews.com/
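To illustrate the mechanics of the VAR component, here is a minimal NumPy sketch that fits a VAR(1) by equation-by-equation least squares on synthetic attention-style series. The dimensions, coefficients, and data are made up for illustration; the paper's text-augmented VAR is far larger and uses the estimated news-attention series.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for a small system of macro and news-attention series
T, k = 200, 3
A_true = np.array([[0.5, 0.2, 0.0],
                   [0.1, 0.4, 0.1],
                   [0.0, 0.3, 0.6]])
X = np.zeros((T, k))
for t in range(1, T):
    X[t] = X[t - 1] @ A_true.T + 0.1 * rng.standard_normal(k)

# VAR(1) by least squares: X_t = A X_{t-1} + e_t
Y, Z = X[1:], X[:-1]
A_hat = np.linalg.lstsq(Z, Y, rcond=None)[0].T
```

With a stationary system and moderate noise, the least-squares estimate recovers the transition matrix closely; adding text-based series to `X` is what makes the VAR "text-augmented".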
(With Yves Atchadé)
Graphical models with change-points are computationally challenging to fit, particularly in cases where the number of observation points and the number of nodes in the graph are large. Focusing on Gaussian graphical models, we introduce an approximate majorize-minimize (MM) algorithm that can be useful for computing change-points in large graphical models. The proposed algorithm is an order of magnitude faster than a brute force search. Under some regularity conditions on the data generating process, we show that with high probability, the algorithm converges to a value that is within statistical error of the true change-point. A fast implementation of the algorithm using Markov chain Monte Carlo is also introduced. The performance of the proposed algorithms is evaluated on synthetic data sets, and the algorithm is also used to analyze structural changes in the S&P 500 over the period 2000–2016.
An implementation of our method is available on CRAN: https://cran.r-project.org/web/packages/changepointsHD/index.html
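As a rough illustration of the problem the MM algorithm accelerates, the sketch below runs the brute-force single-change-point search it is benchmarked against, scoring each candidate split by a Gaussian log-likelihood with plain sample covariances rather than the penalized graphical-model fit used in the paper. All data here are synthetic.

```python
import numpy as np

def gaussian_split_loglik(X, tau):
    """Sum of Gaussian log-likelihoods when the sample is split at tau,
    fitting a separate covariance on each segment. Constants are dropped,
    and np.cov's ddof=1 is close enough to the MLE for a sketch."""
    ll = 0.0
    for seg in (X[:tau], X[tau:]):
        n, p = seg.shape
        S = np.cov(seg, rowvar=False) + 1e-6 * np.eye(p)  # ridge for stability
        sign, logdet = np.linalg.slogdet(S)
        ll += -0.5 * n * (logdet + p)
    return ll

rng = np.random.default_rng(1)
p, n1, n2 = 5, 120, 120
Sigma1 = np.eye(p)                                  # independent before the break
Sigma2 = 0.6 * np.ones((p, p)) + 0.4 * np.eye(p)    # correlated after the break
X = np.vstack([rng.multivariate_normal(np.zeros(p), Sigma1, n1),
               rng.multivariate_normal(np.zeros(p), Sigma2, n2)])

# Brute-force search over admissible split points (the baseline the MM
# algorithm is an order of magnitude faster than)
candidates = range(30, n1 + n2 - 30)
tau_hat = max(candidates, key=lambda t: gaussian_split_loglik(X, t))
```

Evaluating the likelihood at every candidate split is what makes the brute-force search expensive when the number of observations and nodes grows, which is the regime the MM algorithm targets.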