Bio
This is a page not in the main menu.
Published:
Featured blog by Amazon Science on the research efforts that I am leading on physics-constrained machine learning for scientific computing and computational sciences.
Published:
Blog; see pages 6-7 for how my parents inspired my love of math and science.
Published:
Featured blog by Amazon Science on our award-winning paper on COVID-19 forecasting, for which I advised our summer PhD student intern Rui (Ray) Wang.
Published:
Blog on our Best Paper Award at the Machine Learning and Public Health NeurIPS workshop in 2020.
Published:
Blog on the launch of our open-source library GluonTS for probabilistic time series forecasting.
Published:
Blog: Selected interview at the Women in Data Science (WiDS) conference at Stanford University.
Published:
Stanford spotlight on my love of teaching mathematics and Advanced MATLAB for Scientific Computing in the ICME Summer Data Science workshops.
Published in Sandia National Laboratories Report, Sand No. 2017-10335, 2017
Chapter 5: Structure preservation in finite volume ROMs via physics-based constraints
Recommended citation: Tezaur, I.K., Fike, J., Carlberg, K., Barone, M., Maddix, D.C., Mussoni, E., Balajewicz, M. (2017). "Advanced Fluid Reduced Order Models for Compressible Flow." Sandia National Laboratories Report, Sand No. 2017-10335. https://www.osti.gov/servlets/purl/1395816
Published in Journal of Computational Physics, 2018
Recommended citation: Maddix, D.C., Sampaio, L., Gerritsen, M. (2018). "Numerical Artifacts in the Generalized Porous Medium Equation: Why harmonic averaging itself is not to blame." Journal of Computational Physics. 361:280-298. https://arxiv.org/abs/1709.02581
Published in Journal of Computational Physics, 2018
Recommended citation: Maddix, D.C., Sampaio, L., Gerritsen, M. (2018). "Numerical Artifacts in the discontinuous Generalized Porous Medium Equation: How to avoid spurious temporal oscillations." Journal of Computational Physics. 368:277-298. https://arxiv.org/abs/1712.00132
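To make the harmonic-averaging discussion concrete, here is a minimal finite-volume sketch for the generalized porous medium equation u_t = (u^m u_x)_x with harmonic averaging of the face diffusion coefficients. It is an illustration of the standard scheme analyzed in these papers, not the code behind the published experiments; the grid, time step, and initial condition are assumptions chosen for readability.

```python
import numpy as np

def gpme_step(u, dx, dt, m=2):
    """One explicit finite-volume step for u_t = (u^m u_x)_x on a periodic grid.

    Face diffusion coefficients use the harmonic average of the two neighboring
    cell values of k(u) = u^m (zero if either neighbor is zero).
    """
    k = u**m
    k_left = np.roll(k, 1)                      # k_{i-1}
    denom = k + k_left                          # harmonic average at the i-1/2 faces
    k_face = np.where(denom > 0, 2.0 * k * k_left / np.maximum(denom, 1e-14), 0.0)
    flux = -k_face * (u - np.roll(u, 1)) / dx   # F_{i-1/2} = -k_{i-1/2} (u_i - u_{i-1}) / dx
    # conservative update: u_i^{n+1} = u_i^n - dt/dx (F_{i+1/2} - F_{i-1/2})
    return u - dt / dx * (np.roll(flux, -1) - flux)

# usage: degenerate initial condition with a sharp front
x = np.linspace(0, 1, 200, endpoint=False)
u = np.where(x < 0.5, 1.0, 0.0).astype(float)
for _ in range(500):
    u = gpme_step(u, dx=x[1] - x[0], dt=1e-5)
```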
Published in Proceedings of the 36th International Conference on Machine Learning (ICML), 2019
Our Deep Factor code is in GluonTS.
Recommended citation: Wang, Y., Smola, A., Maddix, D.C., Gasthaus, J., Foster, D. (2019). "Deep Factors for Forecasting." Proceedings of the 36th International Conference on Machine Learning (ICML), PMLR. 97:6607-6617. http://proceedings.mlr.press/v97/wang19k/wang19k.pdf
Published in Journal of Machine Learning Research (JMLR), 2020
Recommended citation: Alexandrov, A., Benidis, K., Bohlke-Schneider, M., Flunkert, V., Gasthaus, J., Januschowski, T., Maddix, D.C., Rangapuram, S., Salinas, D., Schulz, J., Stella, L., Türkmen, A.C., Wang, Y. (2020). "GluonTS: Probabilistic and Neural Time Series Modeling in Python." Journal of Machine Learning Research (JMLR). 21(116):1-6. https://www.jmlr.org/papers/v21/19-820.html
Published in Proceedings of the 3rd Conference on Learning for Dynamics and Control (L4DC), 2021
Our AutoODE code is on GitHub.
Recommended citation: Wang, R., Maddix, D.C., Faloutsos, C., Wang, Y., Yu, R. (2021). "Bridging Physics-based and Data-driven modeling for Learning Dynamical Systems." Proceedings of the 3rd Conference on Learning for Dynamics and Control (L4DC), PMLR. 144:385-398. http://proceedings.mlr.press/v144/wang21a/wang21a.pdf
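The released AutoODE code covers the full COVID-19 setup; the sketch below only illustrates the underlying idea of bridging physics-based and data-driven modeling by fitting ODE parameters (here, a simple SIR model) through a differentiable RK4 integrator in PyTorch. The model, step size, and optimizer settings are illustrative assumptions, not the paper's configuration.

```python
import torch

def sir_rhs(y, beta, gamma):
    """Right-hand side of a simple SIR model; S, I, R are population fractions."""
    S, I, R = y
    return torch.stack([-beta * S * I, beta * S * I - gamma * I, gamma * I])

def rk4_rollout(y0, beta, gamma, dt, steps):
    """Integrate the ODE with classical RK4, keeping the graph for autograd."""
    ys, y = [y0], y0
    for _ in range(steps):
        k1 = sir_rhs(y, beta, gamma)
        k2 = sir_rhs(y + 0.5 * dt * k1, beta, gamma)
        k3 = sir_rhs(y + 0.5 * dt * k2, beta, gamma)
        k4 = sir_rhs(y + dt * k3, beta, gamma)
        y = y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        ys.append(y)
    return torch.stack(ys)

# synthetic "observations" from known parameters, then recover them by gradient descent
y0 = torch.tensor([0.99, 0.01, 0.0])
with torch.no_grad():
    target = rk4_rollout(y0, torch.tensor(0.3), torch.tensor(0.1), dt=1.0, steps=60)

log_beta = torch.zeros((), requires_grad=True)    # unconstrained; beta = exp(log_beta) > 0
log_gamma = torch.zeros((), requires_grad=True)
opt = torch.optim.Adam([log_beta, log_gamma], lr=0.05)
for _ in range(300):
    opt.zero_grad()
    pred = rk4_rollout(y0, log_beta.exp(), log_gamma.exp(), dt=1.0, steps=60)
    loss = torch.mean((pred - target) ** 2)
    loss.backward()
    opt.step()
```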
Published in Proceedings of the 25th International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
Our Incremental Quantile Function (IQF) code is incorporated into the MQ-CNN Estimator in GluonTS.
Recommended citation: Park, Y., Maddix, D.C., Aubet, FX., Kan, K., Gasthaus, J., Wang, Y. (2022). "Learning Quantile Functions without Quantile Crossing for Distribution-free Time Series Forecasting." Proceedings of the 25th International Conference on Artificial Intelligence and Statistics (AISTATS), PMLR. 151:8127-8150. https://proceedings.mlr.press/v151/park22a.html
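For readers who want the gist of the Incremental Quantile Function idea without digging into GluonTS, the sketch below shows one standard way to guarantee non-crossing quantiles: predict the lowest quantile plus non-negative (softplus) increments and take a cumulative sum. The layer and feature dimensions are hypothetical; the production implementation lives in the MQ-CNN Estimator in GluonTS.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MonotoneQuantileHead(nn.Module):
    """Predict quantiles at fixed levels without crossing.

    The network outputs the lowest quantile plus non-negative increments
    (via softplus); a cumulative sum guarantees q(tau_1) <= ... <= q(tau_K).
    """
    def __init__(self, in_dim, num_quantiles):
        super().__init__()
        self.proj = nn.Linear(in_dim, num_quantiles)

    def forward(self, h):
        raw = self.proj(h)                         # (..., K)
        base = raw[..., :1]                        # lowest quantile, unconstrained
        increments = F.softplus(raw[..., 1:])      # non-negative gaps between levels
        return torch.cat([base, base + increments.cumsum(dim=-1)], dim=-1)

def pinball_loss(pred_q, y, taus):
    """Average quantile (pinball) loss over the K levels."""
    diff = y.unsqueeze(-1) - pred_q                # (..., K)
    return torch.mean(torch.maximum(taus * diff, (taus - 1) * diff))

# usage with hypothetical feature dimension and quantile levels
taus = torch.tensor([0.1, 0.5, 0.9])
head = MonotoneQuantileHead(in_dim=32, num_quantiles=3)
h = torch.randn(8, 32)                             # encoder features for 8 series
y = torch.randn(8)                                 # observed targets
loss = pinball_loss(head(h), y, taus)
```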
Published in Proceedings of the 39th International Conference on Machine Learning (ICML), 2022
Recommended citation: Jin, X., Park, Y., Maddix, D.C., Wang, H., Wang, Y. (2022). "Domain Adaptation for Time Series Forecasting via Attention Sharing." Proceedings of the 39th International Conference on Machine Learning (ICML), PMLR. 162:10280-10297. https://proceedings.mlr.press/v162/jin22d/jin22d.pdf
Published in ACM Computing Surveys, 2022
Recommended citation: Benidis, K., Rangapuram, S., Flunkert, V., Wang, Y., Maddix, D.C., Türkmen, C., Gasthaus, J., Bohlke-Schneider, M., Salinas, D., Stella, L., Aubet, FX., Callot, L., Januschowski, T. (2022). "Deep Learning for Time Series Forecasting: Tutorial and Literature Survey." ACM Computing Surveys. 55(6):1-36. https://arxiv.org/pdf/2004.10240
Published in Proceedings of the International Conference on Learning Representations (ICLR), 2023
Our Boundary enforcing Operator Network (BOON) code is on the amazon-science GitHub.
Recommended citation: Saad, N.*, Gupta, G.*, Alizadeh, S., Maddix, D.C. (2023). "Guiding Continuous Operator Learning through Physics-based boundary constraints." Proceedings of the International Conference on Learning Representations (ICLR). https://www.amazon.science/publications/guiding-continuous-operator-learning-through-physics-based-boundary-constraints
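BOON enforces boundary conditions by correcting the learned operator itself; as a simpler point of comparison, the sketch below shows the generic way to hard-enforce Dirichlet boundary values on a plain network output with a boundary "lift" plus a factor that vanishes on the boundary. It is an illustrative baseline under assumed boundary values, not the BOON method or the released code.

```python
import torch
import torch.nn as nn

class DirichletConstrainedNet(nn.Module):
    """Illustrative hard enforcement of Dirichlet boundary conditions on [0, 1].

    The raw network output is multiplied by x(1 - x), which vanishes at the
    boundary, and a linear lift interpolating the prescribed boundary values
    is added, so u(0) = g_left and u(1) = g_right exactly, by construction.
    """
    def __init__(self, g_left, g_right, width=64):
        super().__init__()
        self.g_left, self.g_right = g_left, g_right
        self.net = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x):                       # x: (N, 1) points in [0, 1]
        lift = self.g_left * (1 - x) + self.g_right * x
        return lift + x * (1 - x) * self.net(x)

# usage: the prediction satisfies u(0) = 1 and u(1) = 0 regardless of the weights
model = DirichletConstrainedNet(g_left=1.0, g_right=0.0)
x = torch.linspace(0, 1, 101).unsqueeze(-1)
u = model(x)
```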
Published in Proceedings of the 40th International Conference on Machine Learning (ICML), 2023
Recommended citation: Hasson, H., Maddix, D.C., Wang, Y., Park, Y., Gupta, G., (2023). "Theoretical Guarantees of Learning Ensembling Strategies with Applications to Time Series Forecasting." Proceedings of the 40th International Conference on Machine Learning (ICML), PMLR. 202:12616-12632. https://proceedings.mlr.press/v202/hasson23a/hasson23a.pdf
Published in Proceedings of the 40th International Conference on Machine Learning (ICML), 2023
Recommended citation: Hansen, D., Maddix, D.C., Alizadeh, S., Gupta, G., Mahoney, M.W. (2023). "Learning Physical Models that Can Respect Conservation Laws." Proceedings of the 40th International Conference on Machine Learning (ICML), PMLR. 202:12469-12510. http://proceedings.mlr.press/v202/hansen23b/hansen23b.pdf
Published in Proceedings of the 37th Conference on Neural Information Processing Systems (NeurIPS), 2023
Our PreDiff code is available on GitHub.
Recommended citation: Gao, Z., Shi, X., Han, B., Wang, H., Jin, X., Maddix, D.C., Zhu, Y., Li, M., Wang, Y. (2023). "PreDiff: Precipitation Nowcasting with Latent Diffusion Models." Proceedings of the 37th Conference on Neural Information Processing Systems (NeurIPS). https://proceedings.neurips.cc/paper_files/paper/2023/file/f82ba6a6b981fbbecf5f2ee5de7db39c-Paper-Conference.pdf
Published in Physica D: Nonlinear Phenomena, 2024
Our ProbConserv code is on the amazon-science GitHub.
Recommended citation: Hansen, D.*, Maddix, D.C.*, Alizadeh, S., Gupta, G., Mahoney, M.W. (2024). "Learning Physical Models that Can Respect Conservation Laws." Physica D: Nonlinear Phenomena, 457 (133952), (*Equal contributions). https://doi.org/10.1016/j.physd.2023.133952
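The sketch below illustrates the kind of linear-constraint update at the heart of ProbConserv: given an unconstrained Gaussian prediction, project its mean and covariance onto a linear conservation constraint G u = b. The grid, quadrature weights, and covariance here are toy assumptions; the full method and experiments are in the amazon-science repository mentioned above.

```python
import numpy as np

def constrain_gaussian(mu, Sigma, G, b):
    """Project a Gaussian prediction N(mu, Sigma) onto the linear constraint G u = b.

    Returns the constrained mean and covariance; for a conservation law,
    G can be a single row of quadrature weights and b the conserved total.
    """
    S = G @ Sigma @ G.T                               # (m, m), small
    K = Sigma @ G.T @ np.linalg.solve(S, np.eye(S.shape[0]))
    mu_c = mu + K @ (b - G @ mu)
    Sigma_c = Sigma - K @ G @ Sigma
    return mu_c, Sigma_c

# usage: enforce that the cell averages on a uniform grid integrate to 1
n, dx = 50, 1.0 / 50
mu = np.random.rand(n)                                # unconstrained mean prediction
Sigma = 0.01 * np.eye(n)                              # predictive covariance
G = dx * np.ones((1, n))                              # quadrature rule: integral of u
b = np.array([1.0])                                   # conserved mass
mu_c, Sigma_c = constrain_gaussian(mu, Sigma, G, b)
assert np.isclose(G @ mu_c, b).all()                  # constraint holds exactly
```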
Published in Society of Automotive Engineers (SAE) Technical Paper, 2024
Recommended citation: Ananthan, V., Ashton, N., Chadwick, N., Lizarraga, M., Maddix, D.C., et al. (2024). "Machine Learning for Road Vehicle Aerodynamics Simulation." Society of Automotive Engineers (SAE) Technical Paper. https://www.sae.org/publications/technical-papers/content/2024-01-2529/
Published in Transactions on Machine Learning Research (TMLR), 2024
Our Chronos code is on the amazon-science GitHub.
Recommended citation: Ansari, A.F., Stella, L., Turkmen, C., Zhang, X., Mercado, P., Shen, H., Shchur, O., Rangapuram, S.S., Arango, S.A., Kapoor, S., Zschiegner, J., Maddix, D.C., et al. (2024). "Chronos: Learning the Language of Time Series." Transactions on Machine Learning Research (10/2024). https://www.stat.berkeley.edu/~mmahoney/pubs/2619_Chronos_Learning_the_Lang.pdf
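As a rough illustration of the "language of time series" idea, the sketch below mean-scales a series and quantizes it into integer tokens that a language model could consume, then maps tokens back to values via bin centers. The bin range, vocabulary size, and handling of edge cases are assumptions for illustration; they do not match the released Chronos tokenizer exactly.

```python
import numpy as np

def tokenize_series(y, num_bins=4094, low=-15.0, high=15.0):
    """Mean-scale a univariate series and quantize it into integer tokens.

    Illustrative only: the real Chronos tokenizer also handles missing values,
    special tokens, and a specific vocabulary layout.
    """
    scale = np.mean(np.abs(y)) + 1e-8          # mean scaling by the context
    scaled = np.clip(y / scale, low, high)
    bins = np.linspace(low, high, num_bins + 1)
    tokens = np.digitize(scaled, bins[1:-1])   # integer ids in [0, num_bins - 1]
    return tokens, scale

def detokenize(tokens, scale, num_bins=4094, low=-15.0, high=15.0):
    """Map token ids back to (approximate) real values via bin centers."""
    edges = np.linspace(low, high, num_bins + 1)
    centers = (edges[:-1] + edges[1:]) / 2
    return centers[tokens] * scale

y = np.sin(np.linspace(0, 20, 200)) * 10 + 50
tokens, scale = tokenize_series(y)
y_approx = detokenize(tokens, scale)           # close to y up to quantization error
```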
Published in Proceedings of the 41st International Conference on Machine Learning (ICML), 2024
Recommended citation: Qiu, S., Han, B., Maddix, D.C., Zhang, S., Wang, Y., Wilson, A.G. (2024). "Transferring Knowledge from Large Foundation Models to Small Downstream Tasks." Proceedings of the 41st International Conference on Machine Learning (ICML), PMLR. 235. https://arxiv.org/abs/2406.07337
Published in Proceedings of the 41st International Conference on Machine Learning (ICML), 2024
Our Operator-ProbConserv code is on the amazon-science GitHub.
Recommended citation: Mouli, S.C., Maddix, D.C., Alizadeh, S., Gupta, G., Wang, Y., Stuart, A., Mahoney, M.W. (2024). "Using Uncertainty Quantification to Characterize and Improve Out-of-Domain Learning for PDEs." Proceedings of the 41st International Conference on Machine Learning (ICML), PMLR. 235. https://arxiv.org/abs/2403.10642
Published in Technical Report, Preprint arXiv:2407.14129, 2024
A shorter version on Navier-Stokes dynamics was accepted at the ICLR 2024 Workshop on AI4DifferentialEquations In Science.
Recommended citation: Karlbauer, M., Maddix, D.C., Ansari, A.F., Han, B., Gupta, G., Wang, Y., Stuart, A., Mahoney, M.W., (2024). "Comparing and Contrasting Deep Learning Weather Prediction Backbones on Navier-Stokes and Atmospheric Dynamics." Technical Report, Preprint arXiv:2407.14129, Under Review. https://arxiv.org/abs/2407.14129
Published in Technical Report, Preprint arXiv:2407.19320, 2024
Recommended citation: Ashton, N., Angel, J.B., Ghate, A.S., Kenway, G.K.W., Wong, M.L., Kiris, C., Walle, A., Maddix, D.C., Page, G., (2024). "WindsorML: High-Fidelity Computational Fluid Dynamics Dataset For Automotive Aerodynamics." Technical Report, Preprint arXiv:2407.19320, NeurIPS Datasets and Benchmarks Track, Accepted for Publication. https://arxiv.org/pdf/2407.19320
Published in Technical Report, Preprint arXiv:2408.11969, 2024
Recommended citation: Ashton, N., Mockett, C., Fuchs, M., Fliessbach, L., Hetmann, H., Knacke, T., Schönwald, N., Skaperdas, V., Fotiadis, G., Walle, A., Hupertz, B. Maddix, D.C., Yu, P., (2024). "DrivAerML: High-Fidelity Computational Fluid Dynamics Dataset For Road-Car External Aerodynamics." Technical Report, Preprint arXiv:2408.11969, Under Review. https://arxiv.org/pdf/2408.11969
Under Review, 2024
Recommended citation: Cheng, C., Han, B., Maddix, D.C., Ansari, A.F., Stuart, A., Mahoney, M.W., Wang, Y., (2024). "Gradient-Free Generation for Hard-Constrained Systems." Under Review.
Published:
I presented posters on my research at the ICME Xpo Research Symposium from 2015-2018 on the following topics:
Published:
PhD thesis research on finite-volume averaging-based methods for nonlinear porous media flow
Published:
ICML Time Series Workshop talk on GluonTS, with the corresponding YouTube link.
Published:
Please find my talk here.
Published:
My invited talk on “Physics-constrained Machine Learning for Scientific Computing” at the Machine Learning and Dynamical Systems Seminar at the Alan Turing Institute covers our following three research works:
Published:
In this panel, we discuss challenges for women in STEM and how to persevere, internship mentoring opportunities, and the future uses of machine learning in science.
Published:
My talk on “Physics-constrained Machine Learning for Earth and Sustainability Science” at the 2nd AI for Good Webinar Series for AI for Earth and Sustainability Science covers our following four research works in various scientific disciplines from epidemiology to weather and climate:
Published:
I gave a guest lecture in Professor Krishnapriyan’s advanced Computer Science 294 graduate course on Physics Inspired Deep Learning at the University of California, Berkeley.
Published:
I gave an invited talk at the Institute for Computational and Experimental Research in Mathematics (ICERM)’s Workshop on the Industrialization of SciML at Brown University.
Published:
I spoke at the Ansys Monthly Seminar on our physics-constrained machine learning works with applications to weather and climate forecasting and aerodynamics.
Published:
In this panel, I discussed my journey and career in mathematics after graduating from UC Berkeley with my bachelor's in Applied Mathematics and then earning my PhD in Computational and Mathematical Engineering from Stanford University. I advised aspiring young mathematics students on the vast career opportunities available with a mathematics degree.
Published:
In this panel, I will be discussing my PhD experiences at ICME and how they prepared me for a research career in industry.
Published:
I will be speaking at the Foundation Models for Science: Progress, Opportunities, and Challenges workshop at NeurIPS 2024.
Undergraduate course, University of California, Berkeley, 2011
I was one of a few undergraduate student instructors selected in the University of California, Berkeley’s mathematics department from 2011-2012.
Undergraduate course, University of California, Berkeley, 2012
I taught the undergraduate Math 54 course at the University of California, Berkeley.
Graduate course, Institute of Computational and Mathematical Engineering (ICME), Stanford University, 2015
I taught the Numerical Linear Algebra ICME refresher course for incoming ICME graduate students to help them prepare for their first year core graduate courses.
Professional and student course, Institute of Computational and Mathematical Engineering, Stanford University, 2016
I was the organizer of the 2016 ICME Data Science Workshops, which covered the fundamentals of data science for Stanford students and professionals in industry. I was also the instructor of the Advanced MATLAB workshop.
Graduate course, Stanford University, 2016
I taught and developed an Advanced MATLAB course aimed at graduate student scientists and engineers, covering topics including data structures, memory management, advanced graphics in higher dimensions, code optimization and debugging, object-oriented programming, compiled MATLAB (MEX files and MATLAB Coder), and the optimization, parallel computing, symbolic math, and PDE toolboxes.
Professional course, Amazon Web Services (AWS), 2022
I taught and developed the materials for a course on operational time series forecasting covering classical local state-space models (e.g., ARIMA and ETS) and deep learning models (e.g., DeepAR). In addition, we covered how to use the GluonTS time series toolkit.
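A minimal GluonTS usage sketch in the spirit of that course material is below, assuming the PyTorch-based DeepAR estimator and a recent GluonTS release; exact module paths and trainer arguments can differ across versions.

```python
from gluonts.dataset.repository import get_dataset
from gluonts.torch import DeepAREstimator
from gluonts.evaluation import make_evaluation_predictions

dataset = get_dataset("m4_hourly")                     # built-in benchmark dataset
estimator = DeepAREstimator(
    freq=dataset.metadata.freq,
    prediction_length=dataset.metadata.prediction_length,
    trainer_kwargs={"max_epochs": 5},                  # kept small for illustration
)
predictor = estimator.train(dataset.train)             # fit the global model

# probabilistic forecasts on the held-out test windows
forecasts, targets = make_evaluation_predictions(dataset.test, predictor=predictor)
first = next(iter(forecasts))
print(first.quantile(0.5)[:5])                         # median forecast, first 5 steps
```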
Professional and student course, Institute of Computational and Mathematical Engineering, Stanford University, 2023
I taught the Introduction to Statistics course at the 2023 ICME Summer Workshops: Fundamentals of Data Science, which cover the fundamentals of data science for Stanford students and professionals in industry worldwide.