I am currently a Member of Technical Staff at EvolutionaryScale, building large-scale foundation models for biology. I received my PhD from Princeton, where I was advised by Ryan P. Adams. I am particularly interested in machine learning and its applications to problems in science and engineering. In Summer 2023 I worked at NVIDIA with the Toronto AI Lab, specifically at the intersection of numerical simulation and machine learning, and in Summer 2022 I interned at Google Brain Princeton with Elad Hazan. Previously, I was a Google AI Resident, working at the intersection of applied information theory and machine learning.

I did my BS and MEng in Computer Science at MIT, where I worked with Carl Vondrick and Antonio Torralba.

Representative Publications:

Simulating 500 million years of evolution with a language model
Thomas Hayes*, Roshan Rao*, Halil Akin*, Nicholas J Sofroniew*, Deniz Oktay*, Zeming Lin*, Robert Verkuil*, Vincent Q Tran, Jonathan Deaton, Marius Wiggert, Rohil Badkundri, Irhum Shafkat, Jun Gong, Alexander Derry, Raúl Santiago Molina, Neil Thomas, Yousuf A Khan, Chetan Mishra, Carolyn Kim, Liam J Bartie, Matthew Nemeth, Patrick D Hsu, Tom Sercu, Salvatore Candido, Alexander Rives
Preprint [Link] (* denotes equal contribution)

Neuromechanical Autoencoders: Learning to Couple Elastic and Neural Network Nonlinearity
Deniz Oktay, Mehran Mirramezani, Eder Medina, Ryan P. Adams
ICLR 2023 (Notable, Top 25% - Spotlight) [Link] [Press Article]

Randomized Automatic Differentiation
Deniz Oktay, Nick McGreivy, Joshua Aduol, Alex Beatson, Ryan P. Adams
ICLR 2021 (Oral presentation) [Link]
Oral presentation at NeurIPS 2020 Beyond Backpropagation Workshop [Link]

Scalable Model Compression by Entropy Penalized Reparameterization
Deniz Oktay, Johannes Ballé, Saurabh Singh, Abhinav Shrivastava
ICLR 2020 [Link] [Official Tensorflow Tutorial]

Other Publications:

Fiber Monte Carlo
Nick Richardson, Deniz Oktay, Yaniv Ovadia, James C Bowden, Ryan P. Adams
ICLR 2024 [Link]

JAX FDM: A differentiable solver for inverse form-finding
Rafael Pastrana, Deniz Oktay, Ryan P. Adams, Sigrid Adriaenssens
ICML 2023 Differentiable Almost Everything Workshop [Link]

A rapid and automated computational approach to the design of multistable soft actuators
Mehran Mirramezani, Deniz Oktay, Ryan P. Adams
Computer Physics Communications [Link]

Minuscule corrections to near-surface solar internal rotation using mode-coupling
Srijan Bharati Das*, Samarth G. Kashyap*, Deniz Oktay, Shravan M. Hanasoge, Jeroen Tromp
Astrophysical Journal Supplement Series (ApJS) [Link]

Meta-PDE: Learning to Solve PDEs Quickly Without a Mesh
Tian Qin, Alex Beatson, Deniz Oktay, Nick McGreivy, Ryan P. Adams
In submission [Link]

On Predictive Information in RNNs
Zhe Dong, Deniz Oktay, Ben Poole, Alex Alemi
NeurIPS 2019 Workshop on Information Theory and Machine Learning [Link]

Predicting Motivations of Actions by Leveraging Text
Carl Vondrick, Deniz Oktay, Hamed Pirsiavash, Antonio Torralba
CVPR 2016 [Link]

Open Source Code:

For my PhD research, I developed Varmint, a JAX-based nonlinear solid mechanics simulator that is fully differentiable and can easily interact with neural networks.
Check it out at: https://github.com/PrincetonLIPS/Varmint
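The function and parameter names below are illustrative, not Varmint's actual API; this is just a minimal sketch of the general pattern a fully differentiable simulator enables in JAX: solve for a mechanical equilibrium by minimizing an energy, then differentiate the solution with respect to material or design parameters.

```python
import jax
import jax.numpy as jnp

# Toy "simulator": equilibrium of nonlinear (cubic-hardening) springs.
# Given stiffness parameters theta, find the displacement u that
# minimizes the total potential energy under a unit load.

def energy(u, theta, load=1.0):
    # Elastic strain energy minus work done by the external load.
    return jnp.sum(0.5 * theta * u**2 + 0.25 * u**4) - load * jnp.sum(u)

def solve(theta, steps=200, lr=0.1):
    # Inner solver: plain gradient descent on the energy.
    u = jnp.zeros_like(theta)
    g = jax.grad(energy)
    for _ in range(steps):
        u = u - lr * g(u, theta)
    return u

def compliance(theta):
    # Outer objective: total displacement at equilibrium.
    return jnp.sum(solve(theta))

theta = jnp.ones(3)
# Because the whole pipeline is JAX, the equilibrium solution is
# differentiable with respect to the stiffness parameters; a neural
# network producing theta could be trained end-to-end the same way.
dc_dtheta = jax.grad(compliance)(theta)
print(dc_dtheta)
```

Stiffer springs displace less under the same load, so each component of the gradient comes out negative; swapping the toy energy for a finite-element model is what a simulator like Varmint provides.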

Industry Experience:

I have also spent time at Hudson River Trading, Google X on Project Loon, and Yelp working on MOE. Check out my LinkedIn for my full industry experience.