About Me
I am a graduate student at the Massachusetts Institute of Technology, working on Scientific Machine Learning.
In Summer 2022, I was a Student Researcher at Google AI, where I worked on differentiable wildfire simulators in JAX with Dr. Andrey Zhmoginov and Dr. Lily Hu. I completed my undergraduate studies in Computer Science and Engineering at the Indian Institute of Technology Kanpur.
News
Experience
Current Position
Massachusetts Institute of Technology
Education
Massachusetts Institute of Technology
Indian Institute of Technology Kanpur
Past Positions
Ongoing Projects
Boundary Value Problems
- Developing fast and accurate solvers for Boundary Value Problems (BVPs) in Julia (a minimal usage sketch follows this list).
- Exploiting structural sparsity with automatic differentiation for fast Jacobian computation for BVPs.
- Embedding non-linear equality constraints in neural network dynamics using BVPs.
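To give a flavour of what these solvers look like from the user's side, here is a minimal sketch using the SciML BVProblem interface, adapted from the standard simple-pendulum tutorial example; the solver choice (MIRK4) and the step size are illustrative only:

```julia
using BoundaryValueDiffEq

# Simple pendulum written as a first-order system: u = (θ, θ').
const g = 9.81
const L = 1.0
tspan = (0.0, pi / 2)

function pendulum!(du, u, p, t)
    θ, dθ = u
    du[1] = dθ
    du[2] = -(g / L) * sin(θ)
end

# Two-point boundary conditions, expressed as residuals that must vanish.
function bc!(residual, u, p, t)
    residual[1] = u[end ÷ 2][1] + pi / 2  # state at the midpoint of tspan
    residual[2] = u[end][1] - pi / 2      # state at the final time
end

bvp = BVProblem(pendulum!, bc!, [pi / 2, pi / 2], tspan)
sol = solve(bvp, MIRK4(); dt = 0.05)
```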
Complementarity Problems
- Developing solvers for Complementarity Problems (CPs) in Julia.
- Embedding general inequality constraints in neural network dynamics using CPs.
- Developing adjoint equations for efficient back-propagation through solutions of CPs (the general form is sketched after this list).
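On the smooth branches of such problems, the adjoint reduces to the usual implicit-function-theorem rule for a parametrised root. A rough sketch in my own notation, writing z*(θ) for a solution of F(z, θ) = 0 and ℓ for a downstream loss (the complementarity-specific active-set handling is deliberately omitted):

```latex
% Implicit-function-theorem adjoint for z^*(\theta) solving F(z, \theta) = 0.
% One transposed linear solve gives \lambda; dz^*/d\theta is never materialised.
\left(\frac{\partial F}{\partial z}\Big|_{z^*}\right)^{\top} \lambda
    = \left(\frac{\partial \ell}{\partial z}\Big|_{z^*}\right)^{\top},
\qquad
\frac{\mathrm{d}\ell}{\mathrm{d}\theta}
    = -\lambda^{\top} \frac{\partial F}{\partial \theta}\Big|_{z^*}.
```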
Neural BVPs
- Developing adjoint methods for differentiating through solutions of Boundary Value Problems.
- Scaling these adjoints to large-scale Neural Networks (see the sketch after this list).
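That same structure is what makes the adjoints scale: once a collocation discretisation reduces the BVP to a nonlinear system G(U, θ) = 0, the gradient of a loss needs only a single transposed linear solve, independent of the number of network parameters. A hypothetical sketch (the names bvp_adjoint_gradient, dGdU, dGdθ, and dldU are placeholders, and the Jacobians are assumed to come from an AD pass):

```julia
using LinearAlgebra

# Hypothetical inputs from a collocation discretisation G(U, θ) = 0:
#   dGdU :: n×n Jacobian ∂G/∂U at the solution U*
#   dGdθ :: n×m Jacobian ∂G/∂θ at U*
#   dldU :: length-n gradient ∂ℓ/∂U at U*
function bvp_adjoint_gradient(dGdU, dGdθ, dldU)
    λ = dGdU' \ dldU        # one adjoint linear solve, independent of m
    return -(dGdθ' * λ)     # dℓ/dθ, a length-m vector
end

# e.g. with placeholder Jacobians:
# bvp_adjoint_gradient(rand(4, 4), rand(4, 10), rand(4))
```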
Research
My main research interest is Scientific Machine Learning -- combining Deep Learning with Differential Equations. I focus on applying these methods to address the scalability issues of standard scientific computing models.
Occasionally, I explore domains outside my major area of focus and have published a few works in (Multi-Agent) Reinforcement Learning, Differentiable Graphics, Deep Learning Systems, etc.
Most Significant Bits
NonlinearSolve.jl: High-Performance and Robust Solvers for Systems of Nonlinear Equations in Julia
Locally Regularized Neural Differential Equations: Some Black Boxes Were Meant to Remain Closed!
Continuous Deep Equilibrium Models: Training Neural ODEs faster by integrating them to Infinity
Less Significant Bits
Stably Accelerating Stiff Quantitative Systems Pharmacology Models: Continuous-Time Echo State Networks as Implicit Machine Learning
Composing Modeling and Simulation with Machine Learning in Julia
TorchGAN: A Flexible Framework for GAN Training and Evaluation
Fashionable Modelling with Flux
Teaching
Spring 2021
Open Source Software
For a list of the open-source software I have written and maintain, see my GitHub Profile.
Talks
Lux.jl: Explicit Parameterization of Neural Networks in Julia
JuliaCon 2022
Mixing Implicit and Explicit Deep Learning with Skip DEQs
SciMLCon 2022
Differentiable Rendering and its Applications in Deep Learning
JuliaCon 2019