
Scaling laws tell us “more is more”: brute-force scaling of data and compute smoothly enhances AI capabilities. Yet in reality this path is becoming increasingly unsustainable, calling for algorithmic innovations that bend the scaling laws and deliver more compute-efficient progress. In this talk, I will discuss our recent efforts in this spirit, including gradient-based methods for enhancing synthetic data, symbolic search algorithms for test-time reasoning, test-time training that squeezes out extra learning even during testing, and a new tokenization algorithm that allows for better and faster inference.
Bio: Yejin Choi is the incoming Dieter Schwarz Foundation HAI Professor, a Professor of Computer Science, and a Stanford HAI Senior Fellow. Previously, she was a professor at the Paul G. Allen School of Computer Science & Engineering at the University of Washington, an adjunct in the Linguistics department, and an affiliate of the Center for Statistics and the Social Sciences. She is also a senior research manager at the Allen Institute for Artificial Intelligence. She is a co-recipient of the Marr Prize (best paper award) at ICCV 2013, a recipient of the Borg Early Career Award (BECA) in 2018, and was named among IEEE AI’s 10 to Watch in 2016. She received her Ph.D. in Computer Science from Cornell University (advisor: Prof. Claire Cardie) and her B.S. in Computer Science and Engineering from Seoul National University in Korea.
Sponsor
Event organized by AI Lab