Posts by Collection
portfolio
publications
Advanced Fluid Reduced Order Models for Compressible Flow
Published in Sandia National Laboratories Report, SAND No. 2017-10335, 2017
Chapter 5: Structure preservation in finite volume ROMs via physics-based constraints
Recommended citation: Tezaur, I.K., Fike, J., Carlberg, K., Barone, M., Maddix, D.C., Mussoni, E., Balajewicz, M. (2017). "Advanced Fluid Reduced Order Models for Compressible Flow." Sandia National Laboratories Report, SAND No. 2017-10335. https://www.osti.gov/servlets/purl/1395816
Numerical Artifacts in the Generalized Porous Medium Equation: Why harmonic averaging itself is not to blame
Published in Journal of Computational Physics, 2018
Recommended citation: Maddix, D.C., Sampaio, L., Gerritsen, M. (2018). "Numerical Artifacts in the Generalized Porous Medium Equation: Why harmonic averaging itself is not to blame." Journal of Computational Physics. 361:280-298. https://arxiv.org/abs/1709.02581
Numerical Artifacts in the discontinuous Generalized Porous Medium Equation: How to avoid spurious temporal oscillations
Published in Journal of Computational Physics, 2018
Recommended citation: Maddix, D.C., Sampaio, L., Gerritsen, M. (2018). "Numerical Artifacts in the discontinuous Generalized Porous Medium Equation: How to avoid spurious temporal oscillations." Journal of Computational Physics. 368:277-298. https://arxiv.org/abs/1712.00132
Deep Factors for Forecasting
Published in Proceedings of the 36th International Conference on Machine Learning (ICML), 2019
Our Deep Factor code is in GluonTS.
Recommended citation: Wang, Y., Smola, A., Maddix, D.C., Gasthaus, J., Foster, D. (2019). "Deep Factors for Forecasting." Proceedings of the 36th International Conference on Machine Learning (ICML), PMLR. 97:6607-6617. http://proceedings.mlr.press/v97/wang19k/wang19k.pdf
GluonTS: Probabilistic and Neural Time Series Modeling in Python
Published in Journal of Machine Learning Research (JMLR), 2020
Recommended citation: Alexandrov, A., Benidis, K., Bohlke-Schneider, M., Flunkert, V., Gasthaus, J., Januschowski, T., Maddix, D.C., Rangapuram, S., Salinas, D., Schulz, J., Stella, L., Türkmen, A.C., Wang, Y. (2020). "GluonTS: Probabilistic and Neural Time Series Modeling in Python." Journal of Machine Learning Research (JMLR). 21(116):1-6. https://www.jmlr.org/papers/v21/19-820.html
Bridging Physics-based and Data-driven modeling for Learning Dynamical Systems
Published in Proceedings of the 3rd Conference on Learning for Dynamics and Control (L4DC), 2021
Our AutoODE code is on github.
Recommended citation: Wang, R., Maddix, D.C., Faloutsos, C., Wang, Y., Yu, R. (2021). "Bridging Physics-based and Data-driven modeling for Learning Dynamical Systems." Proceedings of the 3rd Conference on Learning for Dynamics and Control (L4DC), PMLR. 144:385-398. http://proceedings.mlr.press/v144/wang21a/wang21a.pdf
Learning Quantile Functions without Quantile Crossing for Distribution-free Time Series Forecasting
Published in Proceedings of the 25th International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
Our Incremental Quantile Function (IQF) code is incorporated into the MQ-CNN Estimator in GluonTS.
Recommended citation: Park, Y., Maddix, D.C., Aubet, FX., Kan, K., Gasthaus, J., Wang, Y. (2022). "Learning Quantile Functions without Quantile Crossing for Distribution-free Time Series Forecasting." Proceedings of the 25th International Conference on Artificial Intelligence and Statistics (AISTATS), PMLR. 151:8127-8150. https://proceedings.mlr.press/v151/park22a.html
Domain Adaptation for Time Series Forecasting via Attention Sharing
Published in Proceedings of the 39th International Conference on Machine Learning (ICML), 2022
Recommended citation: Jin, X., Park, Y., Maddix, D.C., Wang, H., Wang, Y. (2022). "Domain Adaptation for Time Series Forecasting via Attention Sharing." Proceedings of the 39th International Conference on Machine Learning (ICML), PMLR. 162:10280-10297. https://proceedings.mlr.press/v162/jin22d/jin22d.pdf
Deep Learning For Time Series Forecasting: Tutorial and Literature Survey
Published in ACM Computing Surveys, 2022
Recommended citation: Benidis, K., Rangapuram, S., Flunkert, V., Wang, Y., Maddix, D.C., Türkmen, C., Gasthaus, J., Bohlke-Schneider, M., Salinas, D., Stella, L., Aubet, FX., Callot, L., Januschowski, T. (2022). "Deep Learning for Time Series Forecasting: Tutorial and Literature Survey." ACM Computing Surveys. 55(6):1-36. https://arxiv.org/pdf/2004.10240
Guiding Continuous Operator Learning through Physics-based boundary constraints
Published in Proceedings of the International Conference on Learning Representations (ICLR), 2023
Our Boundary enforcing Operator Network (BOON) code is on the amazon-science github.
Recommended citation: Saad, N.*, Gupta, G.*, Alizadeh, S., Maddix, D.C. (2023). "Guiding Continuous Operator Learning through Physics-based boundary constraints." Proceedings of the International Conference on Learning Representations (ICLR). https://www.amazon.science/publications/guiding-continuous-operator-learning-through-physics-based-boundary-constraints
Theoretical Guarantees of Learning Ensembling Strategies with Applications to Time Series Forecasting
Published in Proceedings of the 40th International Conference on Machine Learning (ICML), 2023
Recommended citation: Hasson, H., Maddix, D.C., Wang, Y., Park, Y., Gupta, G. (2023). "Theoretical Guarantees of Learning Ensembling Strategies with Applications to Time Series Forecasting." Proceedings of the 40th International Conference on Machine Learning (ICML), PMLR. 202:12616-12632. https://proceedings.mlr.press/v202/hasson23a/hasson23a.pdf
Learning Physical Models that Can Respect Conservation Laws
Published in Proceedings of the 40th International Conference on Machine Learning (ICML), 2023
Recommended citation: Hansen, D., Maddix, D.C., Alizadeh, S., Gupta, G., Mahoney, M.W. (2023). "Learning Physical Models that Can Respect Conservation Laws." Proceedings of the 40th International Conference on Machine Learning (ICML), PMLR. 202:12469-12510. http://proceedings.mlr.press/v202/hansen23b/hansen23b.pdf
PreDiff: Precipitation Nowcasting with Latent Diffusion Models
Published in Proceedings of the 37th Conference on Neural Information Processing Systems (NeurIPS), 2023
Our PreDiff code is available on github.
Recommended citation: Gao, Z., Shi, X., Han, B., Wang, H., Jin, X., Maddix, D.C., Zhu, Y., Li, M., Wang, Y. (2023). "PreDiff: Precipitation Nowcasting with Latent Diffusion Models." Proceedings of the 37th Conference on Neural Information Processing Systems (NeurIPS). https://proceedings.neurips.cc/paper_files/paper/2023/file/f82ba6a6b981fbbecf5f2ee5de7db39c-Paper-Conference.pdf
Learning Physical Models that Can Respect Conservation Laws
Published in Physica D: Nonlinear Phenomena, 2024
Our ProbConserv code is on the amazon-science github.
Recommended citation: Hansen, D.*, Maddix, D.C.*, Alizadeh, S., Gupta, G., Mahoney, M.W. (2024). "Learning Physical Models that Can Respect Conservation Laws." Physica D: Nonlinear Phenomena, 457 (133952), (*Equal contributions). https://doi.org/10.1016/j.physd.2023.133952
Machine Learning for Road Vehicle Aerodynamics Simulation
Published in Society of Automotive Engineers (SAE) Technical Paper, 2024
Recommended citation: Ananthan, V., Ashton, N., Chadwick, N., Lizarraga, M., Maddix, D.C., et al. (2024). "Machine Learning for Road Vehicle Aerodynamics Simulation." Society of Automotive Engineers (SAE) Technical Paper. https://www.sae.org/publications/technical-papers/content/2024-01-2529/
Chronos: Learning the Language of Time Series
Published in Transactions on Machine Learning Research (TMLR), 2024
Our Chronos code is on the amazon-science github.
Recommended citation: Ansari, A.F., Stella, L., Turkmen, C., Zhang, X., Mercado, P., Shen, H., Shchur, O., Rangapuram, S.S., Arango, S.A., Kapoor, S., Zschiegner, J., Maddix, D.C., et al. (2024). "Chronos: Learning the Language of Time Series." Transactions on Machine Learning Research (10/2024). https://www.stat.berkeley.edu/~mmahoney/pubs/2619_Chronos_Learning_the_Lang.pdf
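Since the entry above notes that the Chronos code is released on the amazon-science github, here is a minimal, hedged usage sketch based on the publicly released chronos-forecasting package; the import path, the ChronosPipeline interface, and the amazon/chronos-t5-small model id are assumptions recalled from that repository's documentation, not details taken from this page.

```python
# Minimal sketch (assumed API of the public chronos-forecasting package;
# import path, class name, and model id are assumptions, not from this page).
import numpy as np
import torch
from chronos import ChronosPipeline  # assumed: pip install chronos-forecasting

# Load a pretrained Chronos model (assumed Hugging Face model id).
pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",
    device_map="cpu",
    torch_dtype=torch.float32,
)

# Any 1-D historical series works as context; this one is synthetic.
context = torch.tensor(np.sin(np.linspace(0, 20, 200)), dtype=torch.float32)

# Sample probabilistic forecasts; output shape is [series, samples, prediction_length].
forecast = pipeline.predict(context, prediction_length=24)
low, median, high = np.quantile(forecast[0].numpy(), [0.1, 0.5, 0.9], axis=0)
print(median)
```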
Transferring Knowledge from Large Foundation Models to Small Downstream Models
Published in Proceedings of the 41st International Conference on Machine Learning (ICML), 2024
Recommended citation: Qiu, S., Han, B., Maddix, D.C., Zhang, S., Wang, Y., Wilson, A.G. (2024). "Transferring Knowledge from Large Foundation Models to Small Downstream Models." Proceedings of the 41st International Conference on Machine Learning (ICML), PMLR. 235. https://arxiv.org/abs/2406.07337
Using Uncertainty Quantification to Characterize and Improve Out-of-Domain Learning for PDEs
Published in Proceedings of the 41st International Conference on Machine Learning (ICML), 2024
Our Operator-ProbConserv code is on the amazon-science github.
Recommended citation: Mouli, S.C., Maddix, D.C., Alizadeh, S., Gupta, G., Wang, Y., Stuart, A., Mahoney, M.W. (2024). "Using Uncertainty Quantification to Characterize and Improve Out-of-Domain Learning for PDEs." Proceedings of the 41st International Conference on Machine Learning (ICML), PMLR. 235. https://arxiv.org/abs/2403.10642
Comparing and Contrasting Deep Learning Weather Prediction Backbones on Navier-Stokes and Atmospheric Dynamics
Published in Technical Report, Preprint arXiv:2407.14129, 2024
A shorter version on Navier-Stokes dynamics was accepted at the ICLR 2024 Workshop on AI4DifferentialEquations In Science.
Recommended citation: Karlbauer, M., Maddix, D.C., Ansari, A.F., Han, B., Gupta, G., Wang, Y., Stuart, A., Mahoney, M.W. (2024). "Comparing and Contrasting Deep Learning Weather Prediction Backbones on Navier-Stokes and Atmospheric Dynamics." Technical Report, Preprint arXiv:2407.14129, Under Review. https://arxiv.org/abs/2407.14129
WindsorML: High-Fidelity Computational Fluid Dynamics Dataset For Automotive Aerodynamics
Published in Proceedings of the 38th Conference on Neural Information Processing Systems (NeurIPS), Datasets and Benchmarks Track, 2024
Recommended citation: Ashton, N., Angel, J.B., Ghate, A.S., Kenway, G.K.W., Wong, M.L., Kiris, C., Walle, A., Maddix, D.C., Page, G. (2024). "WindsorML: High-Fidelity Computational Fluid Dynamics Dataset For Automotive Aerodynamics." Proceedings of the 38th Conference on Neural Information Processing Systems (NeurIPS), Datasets and Benchmarks Track. https://arxiv.org/pdf/2407.19320
DrivAerML: High-Fidelity Computational Fluid Dynamics Dataset for Road-Car External Aerodynamics
Published in Technical Report, Preprint arXiv:2408.11969, 2024
Recommended citation: Ashton, N., Mockett, C., Fuchs, M., Fliessbach, L., Hetmann, H., Knacke, T., Schönwald, N., Skaperdas, V., Fotiadis, G., Walle, A., Hupertz, B., Maddix, D.C., Yu, P. (2024). "DrivAerML: High-Fidelity Computational Fluid Dynamics Dataset For Road-Car External Aerodynamics." Technical Report, Preprint arXiv:2408.11969, Under Review. https://arxiv.org/pdf/2408.11969
Hard Constraint Guided Flow Matching for Gradient-free Generation of PDE Solutions
Published in Technical Report, Preprint arXiv:2412.01786, 2024
Recommended citation: Cheng, C., Han, B., Maddix, D.C., Ansari, A.F., Stuart, A., Mahoney, M.W., Wang, Y. (2024). "Hard Constraint Guided Flow Matching for Gradient-free Generation of PDE Solutions." Technical Report, Preprint arXiv:2412.01786, Under Review. https://arxiv.org/abs/2412.01786
Early Warning of Complex Climate Risk with Integrated Artificial Intelligence
Published in Nature Communications, 2024
Recommended citation: Reichstein, M., Benson, V., Blunk, J., Camps-Valls, G., Creutzig, F., Fearnley, C., Han, B., Kornhuber, K., Rahaman, N., Schölkopf, B., Tárraga, J.M., Vinuesa, R., Dall, K., Denzler, J., Frank, D., Martini, G., Nganga, N., Maddix, D.C., Weldemariam, K. (2024). "Early Warning of Complex Climate Risk with Integrated Artificial Intelligence." Nature Communications, Accepted for Publication.
talks
ICME Xpo Research Symposium
I presented posters on my research at the ICME Xpo Research Symposium from 2015-2018 on the following topics:
- Numerical Artifacts in the Generalized Porous Medium Equation and Solutions, PhD Thesis Research, 2017-2018
- Sparse Matrix Vector Multiplication Using the Merge Path, NVIDIA, 2016
  - Won the ICME Xpo Best Poster Presenter Award
- Applications of the Voronoi Implicit Interface Method for Shape Optimization Problems Involving Interconnected Regions, Lawrence Berkeley National Laboratory, 2015
Temporal oscillations in the porous medium equation: why harmonic averaging itself is not to blame
PhD Thesis Research on finite-volume averaging-based methods for nonlinear porous media flow.
Neural Time Series Models with GluonTS
ICML Time Series Workshop talk on GluonTS, with the corresponding YouTube link.
Mathematics in Science
Please find my talk here.
Physics-constrained Machine Learning for Scientific Computing
My invited talk on “Physics-constrained Machine Learning for Scientific Computing” at the Machine Learning and Dynamical Systems Seminar at the Alan Turing Institute covers three of our research works on physics-constrained machine learning.
Women in Science at Amazon: A Conversation with our Amazonians Panelists
In this panel, we discuss challenges for women in STEM and how to persevere, internship mentoring opportunities, and the future uses of machine learning in science.
Physics-constrained Machine Learning for Earth and Sustainability Science
My talk on “Physics-constrained Machine Learning for Earth and Sustainability Science” at the 2nd AI for Good Webinar Series for AI for Earth and Sustainability Science covers the following four of our research works, spanning scientific disciplines from epidemiology to weather and climate:
- Bridging Physics-based and Data-driven modeling for Learning Dynamical Systems, L4DC, 2021.
- Learning Physical Models that Can Respect Conservation Laws, Physica D: Nonlinear Phenomena, 2024 (ICML, 2023).
- Guiding Continuous Operator Learning through Physics-based boundary constraints, ICLR, 2023.
- PreDiff: Precipitation Nowcasting with Latent Diffusion Models, NeurIPS, 2023.
Physics-constrained Machine Learning for Scientific Computing
I gave a guest lecture in Professor Krishnapriyan’s advanced graduate course, Computer Science 294: Physics-Inspired Deep Learning, at the University of California, Berkeley.
Advances in Scientific Machine Learning (SciML) in Industry
I gave an invited talk at the Institute for Computational and Experimental Research in Mathematics (ICERM)’s Workshop on the Industrialization of SciML at Brown University.
Physics-constrained Machine Learning for Scientific Computing
I spoke at the Ansys Monthly Seminar on our physics-constrained machine learning works with applications to weather and climate forecasting and aerodynamics.
UC Berkeley Math Career Panel
In this panel, I discussed my journey and career in mathematics after graduating from UC Berkeley with my bachelor’s degree in Applied Mathematics and then earning my PhD in Computational and Mathematical Engineering from Stanford University. I advised aspiring young mathematics students on the vast career opportunities available with a mathematics degree.
ICME’s 20th Anniversary Celebration Event
In this panel, I discussed my PhD experiences at ICME and how they prepared me for a research career in industry. See the recording here.
Intersection of Foundation Models and Scientific Computing
I will be speaking at the Foundation Models for Science: Progress, Opportunities, and Challenges Workshop at NeurIPS 2024.
teaching
Undergraduate Student Instructor (UGSI) for Math 16B, 54, 1B
Undergraduate course, University of California, Berkeley, 2011
I was one of a few undergraduate student instructors selected by the University of California, Berkeley’s mathematics department from 2011-2012.
Lecturer for Math 54: Linear Algebra and Differential Equations, University of California, Berkeley
Undergraduate course, University of California, Berkeley, 2012
I taught the undergraduate Math 54 course at the University of California, Berkeley.
ICME Numerical Linear Algebra Refresher Course
Graduate course, Institute of Computational and Mathematical Engineering (ICME), Stanford University, 2015
I taught the Numerical Linear Algebra ICME refresher course for incoming ICME graduate students to help them prepare for their first year core graduate courses.
ICME Data Science Summer Workshop Instructor and Organizer
Professional and student course, Institute of Computational and Mathematical Engineering, Stanford University, 2016
I was the organizer of the 2016 ICME Data Science Workshops, which covered the fundamentals of data science for Stanford students and professionals in industry. I was also the instructor of the Advanced MATLAB workshop.
CME 292 Advanced MATLAB for Scientific Computing
Graduate course, Stanford University, 2016
I taught and developed an Advanced MATLAB course aimed at graduate student scientists and engineers, covering topics including data structures, memory management, advanced graphics in higher dimensions, code optimization and debugging, object-oriented programming, compiled MATLAB (MEX files and MATLAB Coder), and the optimization, parallel computing, symbolic math, and PDE toolboxes.
Machine Learning University (MLU): Accelerated Time Series Forecasting
Professional course, Amazon Web Services (AWS), 2022
I taught and developed the materials for a course on operational time series forecasting covering classical local state space models (e.g., ARIMA and ETS) and deep learning models (e.g., DeepAR). In addition, we covered how to use the GluonTS time series toolkit.
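As a companion to the course description above, here is a minimal sketch of the kind of GluonTS workflow the course walks through (training DeepAR and generating probabilistic forecasts). It assumes the PyTorch-based DeepAREstimator and the bundled m4_hourly benchmark dataset; exact import paths and arguments may differ across GluonTS versions.

```python
# Minimal sketch of a GluonTS DeepAR workflow (import paths and arguments
# are assumptions and may vary across GluonTS versions).
from gluonts.dataset.repository.datasets import get_dataset
from gluonts.torch.model.deepar import DeepAREstimator
from gluonts.evaluation import make_evaluation_predictions

# A small benchmark dataset bundled with GluonTS.
dataset = get_dataset("m4_hourly")

# Train a DeepAR model with a short training budget for illustration.
estimator = DeepAREstimator(
    freq=dataset.metadata.freq,
    prediction_length=dataset.metadata.prediction_length,
    trainer_kwargs={"max_epochs": 5},
)
predictor = estimator.train(dataset.train)

# Draw sample paths on the test set to obtain probabilistic forecasts.
forecast_it, ts_it = make_evaluation_predictions(
    dataset=dataset.test, predictor=predictor, num_samples=100
)
forecasts = list(forecast_it)
print(forecasts[0].mean[:5])  # first few points of the mean forecast
```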
ICME Data Science Summer Workshop Introduction to Statistics Instructor
Professional and student course, Institute of Computational and Mathematical Engineering, Stanford University, 2023
I taught the Introduction to Statistics course at the 2023 ICME Summer Workshops: Fundamentals of Data Science, which covered the fundamentals of data science for Stanford students and industry professionals internationally.