Table 1 Comparison of relative MSE and KL (Kullback-Leibler) divergence for the toy model

From: Optimized sparse polynomial chaos expansion with entropy regularization

Results          OMP       Ent-OMP   HEnt-OMP
Relative MSE     0.237     0.217     0.249
KL Divergence    912.14    706.64    365.34
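The two metrics compared above can be sketched as follows. The source does not give their exact definitions, so this is a minimal illustration assuming the standard ones: relative MSE as mean squared error normalized by the variance of the reference output, and KL divergence estimated from samples via shared-bin histograms; the function names and the binning scheme are assumptions.

```python
import numpy as np

def relative_mse(y_true, y_pred):
    # Relative MSE: mean squared error normalized by the reference variance
    # (assumed definition; the paper may normalize differently).
    return np.mean((y_true - y_pred) ** 2) / np.var(y_true)

def kl_divergence(p_samples, q_samples, bins=50):
    # Histogram-based estimate of KL(p || q) from two sample sets,
    # using shared bin edges so the densities are comparable.
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(p_samples, bins=edges, density=True)
    q, _ = np.histogram(q_samples, bins=edges, density=True)
    eps = 1e-12  # regularize empty bins to avoid log(0) and division by zero
    p, q = p + eps, q + eps
    width = edges[1] - edges[0]
    return np.sum(p * np.log(p / q)) * width
```

A perfect surrogate gives a relative MSE of 0, and identical output distributions give a KL divergence near 0; lower is better for both, which is why the lower KL of HEnt-OMP indicates the best distributional match despite its slightly higher relative MSE.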