This webpage covers many aspects of our team's work on multi-fidelity fusion.
By leveraging multi-fidelity data, a surrogate model can be trained on many low-fidelity samples, which are cheap to generate, together with a few high-fidelity samples, and still predict the output of the high-fidelity simulation accurately.
FidelityFusion focuses on tractable multi-fidelity fusion methods, which are easy to optimize and scale to high-dimensional outputs with strong generalization and robustness.
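For concreteness, the snippet below is a minimal, hypothetical illustration of this data regime (it is not the FidelityFusion API): many cheap low-fidelity samples and only a few expensive high-fidelity ones, with the high-fidelity inputs taken as a subset of the low-fidelity inputs (the "subset-structured" setting referred to in the list below). The toy response functions are assumptions made purely for illustration.

```python
# Hypothetical sketch of subset-structured multi-fidelity data (not FidelityFusion code).
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: a cheap, biased solver and an expensive, accurate solver.
low_fidelity = lambda x: np.sin(8.0 * x)
high_fidelity = lambda x: 1.5 * np.sin(8.0 * x) + (x - 0.5) ** 2

x_low = rng.uniform(0.0, 1.0, size=64)             # many low-fidelity runs
x_high = rng.choice(x_low, size=8, replace=False)  # few high-fidelity runs, a subset of x_low

y_low, y_high = low_fidelity(x_low), high_fidelity(x_high)
print(y_low.shape, y_high.shape)  # (64,) (8,): the cheap data dominates
```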
FidelityFusion includes the following algorithms:
- AR0: the classic autoregression model by M. C. Kennedy and A. O'Hagan. Tractable model, applicable to single-output and subset-structured multi-fidelity data (a minimal sketch of its two-fidelity recursion is given after this list).
- NAR: the classic nonstationary autoregression model by G. E. Karniadakis' team. Nontractable model, applicable to single/low-dimensional-output and subset-structured multi-fidelity data.
- DC: Deep Coregionalization. Nontractable model, applicable to high-dimensional-output / spatial-temporal field output and subset-structured multi-fidelity data.
- ResGP: Residual Gaussian Process. Tractable model, applicable to high-dimensional-output / spatial-temporal field output and subset-structured multi-fidelity data.
- GAR [Slides]: Generalized autoregression model. Possibly the most powerful tractable model, applicable to high-dimensional-output / spatial-temporal field output that is nonaligned (the dimensionality differs across fidelities) and arbitrary-structured multi-fidelity data.
- CIGAR [Slides]: Conditional independent generalized autoregression, a simplified version of GAR that leverages autokrigeability. Tractable model, applicable to ultra-high-dimensional-output / spatial-temporal field output that is nonaligned (the dimensionality differs across fidelities) and arbitrary-structured multi-fidelity data.
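As a concrete reference point, the classic Kennedy-O'Hagan autoregression behind AR0 models the high-fidelity response as a scaled low-fidelity response plus a residual, f_high(x) ≈ rho * f_low(x) + delta(x). The snippet below is a minimal two-fidelity sketch of that decomposition written with scikit-learn Gaussian processes; the toy simulators and hyper-parameters are assumptions, and it illustrates the idea only, not the FidelityFusion implementation or its API.

```python
# Minimal two-fidelity autoregression sketch (illustration only, not FidelityFusion).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def simulate_low(x):
    # Cheap, biased approximation of the true response (toy assumption).
    return np.sin(8.0 * x)

def simulate_high(x):
    # Expensive, accurate response: scaled low-fidelity signal plus a smooth discrepancy.
    return 1.5 * simulate_low(x) + (x - 0.5) ** 2

# Subset-structured data: a few high-fidelity inputs drawn from the many low-fidelity inputs.
x_low = rng.uniform(0.0, 1.0, size=(64, 1))
x_high = x_low[rng.choice(len(x_low), size=8, replace=False)]
y_low = simulate_low(x_low).ravel()
y_high = simulate_high(x_high).ravel()

kernel = ConstantKernel(1.0) * RBF(length_scale=0.2)

# 1) GP surrogate of the low-fidelity simulator, trained on the abundant cheap data.
gp_low = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x_low, y_low)

# 2) Estimate the scale rho and fit a GP on the residual (discrepancy) at the high-fidelity inputs.
y_low_at_high = simulate_low(x_high).ravel()  # cheap to re-evaluate; matches observed values under the subset structure
rho = np.dot(y_low_at_high, y_high) / np.dot(y_low_at_high, y_low_at_high)
gp_delta = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(
    x_high, y_high - rho * y_low_at_high
)

# 3) High-fidelity prediction: scaled low-fidelity GP plus the residual GP.
x_test = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y_pred = rho * gp_low.predict(x_test) + gp_delta.predict(x_test)

rmse = np.sqrt(np.mean((y_pred - simulate_high(x_test).ravel()) ** 2))
print(f"rho = {rho:.2f}, test RMSE = {rmse:.3f}")
```

The models listed above build on this kind of decomposition and extend it to high-dimensional, nonaligned, and arbitrary-structured multi-fidelity data.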
FidelityFusion is developed and maintained mainly by Wei W. Xing at IceLab-X and Zen Xing at Rockchips. A non-exhaustive but growing list of contributors includes Yuxing Wang and Guanjie Wang at BUAA.
Please cite our paper if you find it helpful :)
@inproceedings{
wang2022gar,
title={{GAR}: Generalized Autoregression for Multi-Fidelity Fusion},
author={Yuxin Wang and Zheng Xing and WEI W. XING},
booktitle={Advances in Neural Information Processing Systems},
editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
year={2022},
url={https://openreview.net/forum?id=aLNWp0pn1Ij}
}