About Me


Avik Pal

I am a 5th-year Ph.D. candidate in Electrical Engineering and Computer Science at MIT CSAIL. I work in the Julia Lab under the supervision of Dr. Alan Edelman and Dr. Christopher Rackauckas. My research interests lie at the intersection of AI compilers and AI for science, where I focus on performance optimizations for neural network training and inference in the context of scientific computing. For representative examples of my work, see Publications and Presentations.

News

Aug. '25
New publication on Neural Surrogates for Hypersonic Flow Predictions accepted at AIAA SciTech 2025.
Jul. '25
NonlinearSolve.jl has been accepted for publication in ACM Transactions on Mathematical Software!
May '25
New pre-print on Semi-Explicit Neural DAEs released on arXiv.
Nov. '24
New publication on B-Spline KANs accepted at NeurIPS Workshop on Science for Deep Learning.
Mar. '24
New pre-print on NonlinearSolve.jl released on arXiv.

Experience

Current Position


Massachusetts Institute of Technology
Electrical Engineering and Computer Science
Ph.D. Candidate
Advisor(s): Dr. Alan Edelman
Sep. '21 - May '26 (est.)
Cambridge, MA
GPA: 4.9 / 5.0

Education


Massachusetts Institute of Technology
Electrical Engineering and Computer Science
Master of Science (S.M.)
Advisor(s): Dr. Alan Edelman
Sep. '21 - May '23
GPA: 4.8 / 5.0

Indian Institute of Technology Kanpur
Computer Science and Engineering
Bachelor of Technology (B.Tech.)
Jul. '17 - May '21
GPA: 9.9 / 10.0

Past Positions


Summer '25 • Student Researcher, XLA:TPU Optimizations • Google Cloud • New York City, NY
Summer '24 • Graduate Research Intern, Parallel Computing Lab • Intel Labs • Santa Clara, CA
Summer '22 • Student Researcher • Google AI • Mountain View, CA
Jan. - Jul. '21 • Engineering Simulation Intern • Julia Computing • Remote
Jan. - Nov. '20 • Research Intern • Vector Institute • Toronto, CAN
Summer '18, '19 • Google Summer of Code Participant • JuliaLang Organization • Remote
Summer '18 • Software Engineering Intern • IIT Kanpur NYO Office • Kanpur, IND

Publications

My research interests lie at the intersection of AI compilers and AI for science, where I focus on performance optimizations for neural network training and inference in the context of scientific computing. Occasionally, I explore domains outside this focus and have published a few works in (multi-agent) reinforcement learning, differentiable graphics, and related areas.

Conference Proceedings


Geometry & Mesh Invariant Neural Surrogates for Hypersonic Flows
AIAA SciTech Forum (Accepted, To Appear) • 2025
Locally Regularized Neural Differential Equations: Some Black Boxes Were Meant to Remain Closed!
International Conference on Machine Learning (ICML) • 2023
Continuous Deep Equilibrium Models: Training Neural ODEs Faster by Integrating Them to Infinity
IEEE High Performance Extreme Computing (HPEC) • 2023 • Best Student Paper Award
Opening the Blackbox: Accelerating Neural DEs by Regularizing Internal Solver Heuristics
International Conference on Machine Learning (ICML) • 2021
Stably Accelerating Stiff Quantitative Systems Pharmacology Models: Continuous-Time ESNs as Implicit ML
International Federation of Automatic Control (IFAC) • 2021
Composing Modeling and Simulation with Machine Learning in Julia
International Modelica Conference • 2021
Humor@IITK at SemEval-2021 Task 7: Language Models for Quantifying Humor And Offensiveness
Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval) • 2021
TorchGAN: A Flexible Framework for GAN Training and Evaluation
Journal of Open Source Software (JOSS) • 2021
Avik Pal & Aniket Das
RayTracer.jl: A Differentiable Renderer that supports Parameter Optimization for Scene Reconstruction
Proceedings of the JuliaCon Conferences • 2019
Avik Pal

Journal Papers


NonlinearSolve.jl: High-Performance and Robust Solvers for Systems of Nonlinear Equations
ACM Transactions on Mathematical Software (TOMS) (Accepted, To Appear) • 2024

(Peer-Reviewed) Workshop Papers


Understanding the Limitations of KANs: Convergence Dynamics and Computational Efficiency
NeurIPS Workshop on Science for Deep Learning • 2024
Avik Pal & Dipankar Das
Fashionable Modelling with Flux
NeurIPS Workshop on Systems for Machine Learning • 2019

Pre-prints


Making Waves in the Cloud: A Paradigm-Shift for Scientific Computing and Ocean Modeling through Compiler Technology
Under Review • 2025
Semi-Explicit Neural DAEs: Learning Long-Horizon Dynamical Systems with Algebraic Constraints
Under Review • 2025
Differentiable Programming for Differential Equations: A Review
Under Review at SIAM Review • 2024

Presentations

Learned Cost Models for TPU Window Config Ranking • Google ML Compilers Reading Group • 2025
Accelerating Machine Learning in Julia using Lux & Reactant • JuliaCon • 2025
Accelerating Machine Learning in Julia using Lux & Reactant • Annual CSAIL Alliances Workshop • 2025
Semi-Explicit Neural DAEs: Learning Long Horizon Constrained Dynamical Systems • SIAM CSE • 2025
The Tricks Required for Scientific Machine Learning to Work on Real Data • SIAM CSE • 2025
Accelerating Physics Informed Machine Learning in Julia using Reactant and Lux • AAAI • 2025
Mesh & Geometry Invariant Neural Surrogates for Hypersonic Flows • AIA Annual Meeting • 2025
Lux.jl: Explicit Parameterization of Neural Networks in Julia • JuliaCon • 2022
Mixing Implicit and Explicit Deep Learning with Skip DEQs • SciMLCon • 2022
Differentiable Rendering and its Applications in Deep Learning • JuliaCon • 2019
