
About Me

I am a Ph.D. Candidate in Electrical Engineering and Computer Science at MIT. I work in the Julia Lab under the supervision of Dr. Alan Edelman and Dr. Christopher Rackauckas. My research interests lie in the application of numerical methods and scientific computing to deep learning.

In Summer '22, I was a Student Researcher at Google AI, where I worked on differentiable wildfire simulators in JAX with Dr. Andrey Zhmoginov and Dr. Lily Hu. I completed my undergraduate studies in Computer Science and Engineering at the Indian Institute of Technology Kanpur.


News

Mar. '24
New preprint on NonlinearSolve.jl released on arXiv.
Feb. '24
Paper on GPU Parallelized Hybrid Particle Swarm Optimization (using NonlinearSolve.jl) accepted at ICLR 2024 Workshop on AI4DifferentialEquations In Science.
Jan. '24
I will be joining Intel Labs for Summer '24 as a Research Intern.
Sept. '23
Infinite-Time Neural ODE work was awarded the Best Student Paper Award at IEEE HPEC 2023!
Aug. '23
My S.M. Thesis is now available on MIT DSpace!




Experience

Current Position

Massachusetts Institute of Technology

Electrical Engineering and Computer Science
Ph.D. Candidate
Advisors: Dr. Alan Edelman, Dr. Chris Rackauckas
Sept. '21 - Dec. '25 (est.)
Cambridge, USA
GPA:   4.9 / 5.0

Education

Massachusetts Institute of Technology

Electrical Engineering and Computer Science
Master of Science (S.M.)
Advisors: Dr. Alan Edelman, Dr. Chris Rackauckas
Sept. '21 - May '23
GPA:   4.8 / 5.0

Indian Institute of Technology Kanpur

Computer Science and Engineering
Bachelor of Technology (B.Tech.)
July '17 - May '21
GPA:   9.9 / 10.0

Past Positions

Summer '24
Graduate Research Intern
Santa Clara, USA
Summer '22
Student Researcher
Mountain View, USA
Jan - July '21
Engineering Simulation Intern
Remote
Jan - Nov '20
Research Intern
Toronto, CAN
Summer '18, '19
Google Summer of Code Participant
Remote
Summer '18
Software Engineering Intern
Kanpur, IND

Ongoing Projects

Boundary Value Problems

  • Developing fast and accurate solvers for Boundary Value Problems (BVPs) in Julia.
  • Exploiting structural sparsity with automatic differentiation for fast Jacobian computation for BVPs.
  • Embedding nonlinear equality constraints in neural network dynamics using BVPs.
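As a flavor of what a two-point BVP solve looks like, here is a minimal sketch using SciPy's `solve_bvp` as a Python stand-in for the Julia solvers developed in this project; the toy problem (y'' = -y with Dirichlet boundary conditions, exact solution sin(x)) is an illustrative choice, not from the project itself.

```python
import numpy as np
from scipy.integrate import solve_bvp

# Two-point BVP: y'' = -y with y(0) = 0, y(pi/2) = 1.
# Exact solution: y(x) = sin(x).
def ode(x, y):
    # First-order form: y[0] = y, y[1] = y'.
    return np.vstack([y[1], -y[0]])

def bc(ya, yb):
    # Residuals of the boundary conditions at x = 0 and x = pi/2.
    return np.array([ya[0], yb[0] - 1.0])

x = np.linspace(0.0, np.pi / 2, 11)
y0 = np.zeros((2, x.size))  # flat initial guess on the mesh
sol = solve_bvp(ode, bc, x, y0)
print(np.max(np.abs(sol.sol(x)[0] - np.sin(x))))  # small error vs. sin(x)
```

Under the hood, collocation methods like this repeatedly solve large, structurally sparse nonlinear systems, which is where the sparsity-exploiting Jacobian computation in the second bullet pays off.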

Complementarity Problems

  • Developing solvers for Complementarity Problems (CPs) in Julia.
  • Embedding general inequality constraints in neural network dynamics using CPs.
  • Developing adjoint equations for efficient back-propagation through solutions of CPs.
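To make the complementarity setting concrete, here is a hedged sketch of a linear complementarity problem (find x >= 0 with w = Mx + q >= 0 and x . w = 0) solved via the standard Fischer-Burmeister reformulation with SciPy's root finder; the matrix `M` and vector `q` are made-up illustrative data, and this is not the project's Julia implementation.

```python
import numpy as np
from scipy.optimize import root

# LCP data (illustrative): find x >= 0 with w = M x + q >= 0, x . w = 0.
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -1.0])

def fb_residual(x):
    # Fischer-Burmeister function phi(a, b) = sqrt(a^2 + b^2) - a - b,
    # which is zero iff a >= 0, b >= 0, and a * b = 0.
    w = M @ x + q
    return np.sqrt(x**2 + w**2) - x - w

sol = root(fb_residual, np.ones(2))
print(sol.x)  # approx [1/3, 1/3], where w = M x + q = 0
```

Reformulations like this turn the complementarity conditions into a (semismooth) root-finding problem, which is also the form through which adjoint equations for backpropagation are typically derived.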

Neural BVPs

  • Developing adjoint methods for differentiating through solutions of Boundary Value Problems.
  • Scaling these adjoints to large-scale neural networks.

Research

My main research interest is Scientific Machine Learning: combining deep learning with differential equations. I focus on applying these methods to tackle the scalability issues of standard scientific computing models.

Occasionally, I explore domains outside my major area of focus and have published a few works in (Multi-Agent) Reinforcement Learning, Differentiable Graphics, Deep Learning Systems, etc.

Most Significant Bits

NonlinearSolve.jl: High-Performance and Robust Solvers for Systems of Nonlinear Equations in Julia

Preprint
  



Efficient GPU Accelerated Global Optimization for Inverse Problems

ICLR 2024 Workshop on AI4DifferentialEquations In Science
  



Locally Regularized Neural Differential Equations: Some Black Boxes Were Meant to Remain Closed!

International Conference on Machine Learning (ICML) 2023
  



Continuous Deep Equilibrium Models: Training Neural ODEs faster by integrating them to Infinity

IEEE High Performance Extreme Computing (HPEC) 2023 (Oral Presentation)
  



Opening the Blackbox: Accelerating Neural Differential Equations by Regularizing Internal Solver Heuristics

International Conference on Machine Learning (ICML) 2021
  



Emergent Road Rules In Multi-Agent Driving Environments

International Conference on Learning Representations (ICLR) 2021
  



Less Significant Bits

Stably Accelerating Stiff Quantitative Systems Pharmacology Models: Continuous-Time Echo State Networks as Implicit Machine Learning

International Federation of Automatic Control (IFAC) 2021
  



Composing Modeling and Simulation with Machine Learning in Julia

International Modelica Conference 2021
  



Humor@IITK at SemEval-2021 Task 7: Language Models for Quantifying Humor And Offensiveness

Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval) 2021
  



TorchGAN: A Flexible Framework for GAN Training and Evaluation

Avik Pal & Aniket Das
Journal of Open Source Software (JOSS) 2021
  



RayTracer.jl: A Differentiable Renderer that supports Parameter Optimization for Scene Reconstruction

Avik Pal
Proceedings of the JuliaCon Conferences 2019
  



Fashionable Modelling with Flux

NeurIPS Workshop on Systems for Machine Learning 2019
  




Teaching


Open Source Software

For a list of open-source software I have written and maintain, see my GitHub profile.


Talks

Lux.jl: Explicit Parameterization of Neural Networks in Julia


JuliaCon 2022

Mixing Implicit and Explicit Deep Learning with Skip DEQs


SciMLCon 2022

Differentiable Rendering and its Applications in Deep Learning


JuliaCon 2019
