I study how the criteria used to evaluate technological innovations emerge, persist, and change.

I am on the 2026–2027 academic job market.

In many industries, what counts as a successful innovation is defined by criteria that seem objective but are shaped by strategic choices. When those criteria fall short of capturing the actual performance of innovations in market deployment, who proposes new evaluation criteria — and why are some firms able to do this while most are not?

My dissertation develops a theory of evaluative evolution: how evaluation criteria of innovations change, which firms drive that change, and what happens to the technological trajectories and competitive dynamics of innovations following that change.

I study these questions in the pharmaceutical industry, where clinical trial endpoints determine which therapies reach patients — and where the wrong criteria can mean approved drugs that don't actually help. I bring training in law, biostatistics, and strategic management, which leads me to see evaluation criteria as simultaneously technical design choices and sites of strategic contestation. To study these questions at scale, I build custom AI/LLM research pipelines — including retrieval-augmented systems that outperform state-of-the-art general-purpose models on domain-specific tasks. I have been invited to present my research at major pharmaceutical firms.

Research Interests

  • Innovation Strategy
  • Evaluation of Innovations
  • Organizational Learning
  • Pharmaceutical Industry
  • Strategic Management of Intellectual Property

Methods

  • Causal Inference / Econometrics
  • Natural Language Processing
  • LLM/RAG Pipeline Design
  • Mathematical Modeling
  • Medical Concept Classification Systems

Dissertation: Evaluative Evolution

The dissertation consists of three interconnected studies. The first asks which firms recognize and propose new criteria when existing ones fail. The second examines how evaluators decide which proposals to endorse — a problem structurally harder than evaluating innovations under fixed criteria, because the very dimensions being proposed are new. The third traces what happens after endorsement: how new criteria redirect technological trajectories, and why the firms most likely to adopt are not always the ones who benefit most.

Because newly endorsed criteria are themselves satisficing — good enough to guide decisions, but never fully capturing real-world performance — evaluative evolution is recursive, and each new criterion sows the seeds of its own eventual revision. These three studies open a broader research program — including how evaluators learn through the evaluative evolution process, the durability of industry convergence around new criteria, the dynamics of successive evaluative cycles, and the extension of evaluative evolution to medical devices, financial regulation, and environmental standards.

Working Papers

Vision or Delusion? How Evaluation Sequence Anchors the Assessment of Novelty

Yunxiang Bai, Subrina Shen, & Melody Chang

In preparation for submission to Strategic Management Journal

Organizations select against novel ventures even when they explicitly seek novelty. The literature diagnoses this as a problem of obscured vision — evaluators fail to see the upside. But evaluators do score both upside and risk. This study argues that the penalty arises not only from how they see each dimension, but also from how they synthesize conflicting dimensions into an overall judgment.

Evaluating novel ventures is challenging because evaluators struggle to reconcile a venture's upside potential with its execution risks. While prior work focuses on how evaluation procedures shape evaluators' relative attention to these opposing dimensions, we examine the sequence of key evaluation criteria and how it shapes the overall assessment of novel venture ideas. Using proprietary data from a startup evaluation platform and two pre-registered experiments, we find that when evaluators consider upside potential before risk, their evaluation anchors on upside potential, thus favoring novelty. Conversely, when risks are considered first, their evaluation anchors on risks, thus disadvantaging novelty. This paper contributes to research on idea evaluation and entrepreneurship by highlighting the role of a key structural component, the sequence of criteria, in evaluating novel ideas.

On Giants' Shoulders While Keeping Others Off of Yours: Engagement in Science and Firm Generative Appropriability

Francisco Polidoro & Yunxiang Bai

Presented at SMS Annual Conference, Istanbul, 2024

Engaging in public science creates knowledge that rivals can freely use — so does it ultimately help or hurt the publishing firm? The literature has treated this as a single tradeoff, but tracing four decades of knowledge flows reveals that the answer depends on a temporal distinction the literature has not drawn.

Research on science and innovation highlights how firms' scientific engagement shapes the knowledge flows that determine who captures returns to innovation. Yet whether science tilts these flows toward the publishing firm or its rivals has not been directly tested. This study abductively explores this question by tracing patent citation flows for 170 biopharmaceutical firms over four decades. In contrast with existing literature treating the appropriability implications of science as a single tradeoff, this study reveals that the answer depends on temporal perspective: under a retrospective lens, firms sustaining ongoing science capture roughly twice the benefit rivals do, while under a prospective lens, science at invention creates contested opportunities whose advantage to the firm materializes only at longer horizons. Exploratory evidence suggests science helps firms retrieve knowledge from spillovers.

Publications

Mitigating Nonattendance Using Clinic-Resourced Incentives Can Be Mutually Beneficial: A Contingency Management-Inspired Partially Observable Markov Decision Process Model

Yunxiang Bai & Björn P. Berg

Value in Health, 24(8), 1102–1110, 2021

I am prepared to teach courses in strategic management, innovation strategy, and research methods.

General Management & Strategy

Instructor of Record

Undergraduate / Master's · UT Austin McCombs · Summer 2024

Instructor rating: 5.00 / 5.00 · Course rating: 4.89 / 5.00

Biostatistical Literacy

Teaching Assistant

University of Minnesota · 2019–2020

Education

Ph.D. in Management, University of Texas at Austin (Expected 2027)

M.S. in Biostatistics, University of Minnesota (2021)

LL.B., Tsinghua University (2018)

Selected Awards

Outstanding Graduate Research Fellowship (2026–2027)

McCombs Dean's Fellowship (2025–2026)

Cooper Fellowship (2025–2026)

Graduate School Continuing Fellowship (2024–2025)

Conference Presentations

Strategic Management Society Annual Conference (2024, 2023)

Download full CV (PDF)
