Features at-a-glance
  • Reliability-based design optimization 

  • Bayesian inference for model calibration and inverse problems

  • UQLib: a collection of general-purpose open-source MATLAB libraries

  • Local and global sensitivity analysis

  • UQLink: universal connection to third-party software

  • Support vector machines for classification and regression

  • Advanced probabilistic modeling (copulas)

  • Seamless connection with MATLAB-based models

  • (Sparse) polynomial chaos expansions

  • Advanced Kriging (Gaussian process modeling)

  • Polynomial chaos-Kriging (PC-Kriging)

  • Canonical low-rank tensor approximations

  • Reliability analysis (rare event estimation)

 
Reliability-based design optimization

The reliability-based design optimization (RBDO) module offers a set of state-of-the-art algorithms to solve various types of optimization problems under probabilistic constraints. They include:

  • Reliability index approach (RIA)

  • Performance measure approach (PMA)

  • Single loop approach (SLA)

  • Sequential optimization and reliability assessment (SORA)

On top of these well-known algorithms, the modular design of the RBDO module allows the user to set up customized solution schemes by combining all of the reliability, surrogate modeling, and optimization techniques available in UQLab.
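
As an illustration, a minimal sketch of setting up an RBDO analysis is given below. Except for the analysis type, the option names are assumptions that should be checked against the RBDO user manual.

  % Illustrative sketch only: option names other than .Type are assumptions,
  % see the RBDO user manual for the exact fields
  RBDOOpts.Type = 'RBDO';
  RBDOOpts.Method = 'SORA';   % alternatively RIA, PMA or SLA
  % ... definition of the cost function, probabilistic constraints,
  %     design variables and target reliability goes here ...
  myRBDO = uq_createAnalysis(RBDOOpts);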

 
Bayesian inference for model calibration and inverse problems

Bayesian inference is a powerful tool for probabilistic model calibration and inverse problems. UQLab offers a flexible and intuitive way to set up and solve Bayesian inverse problems.

  • Intuitive definition of prior knowledge, forward model and data

  • State-of-the-art Markov Chain Monte Carlo (MCMC) algorithms

  • Customizable discrepancy between model and measurements

  • Support for user-specified custom likelihood

  • Support for multiple forward models and multiple discrepancy models (joint inversion)

  • Fully integrated with UQLab (e.g. surrogate models, complex priors, etc.)
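
As a minimal sketch, a Bayesian inverse problem can be set up as follows, assuming a forward MODEL object myForwardModel, measurements y, and a prior INPUT object have already been defined (the exact option names are documented in the Bayesian inversion user manual):

  BayesOpts.Type = 'Inversion';
  BayesOpts.ForwardModel.Model = myForwardModel;  % previously defined forward model
  BayesOpts.Data.y = y;                           % measurements to calibrate against
  BayesOpts.Solver.Type = 'MCMC';                 % solve with Markov Chain Monte Carlo
  BayesOpts.Solver.MCMC.Sampler = 'AIES';         % affine-invariant ensemble sampler
  myBayesianAnalysis = uq_createAnalysis(BayesOpts);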

 
UQLib

UQLib is a collection of general-purpose open-source MATLAB libraries that are useful in the context of uncertainty quantification. These functions are currently used across the scientific modules of UQLab, but they are designed for generic use.

  • Optimization (e.g., cross-entropy optimization, the covariance matrix adaptation evolution strategy (CMA-ES) and its constrained variant)

  • Differentiation (e.g., gradient computation)

  • Kernel (stationary and non-stationary kernel functions)

  • Input/output processing (e.g., subsampling)
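
As an illustration, the sketch below minimizes the Rosenbrock function with the CMA-ES optimizer. The function name uq_cmaes and its argument list (objective, starting point, initial step size, bounds) are assumptions here; the exact interface is documented in the UQLib user manual.

  % Assumed interface, for illustration only
  rosenbrock = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
  [xopt, fopt] = uq_cmaes(rosenbrock, [-1 -1], 0.3, [-5 -5], [5 5]);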

Rosenbrock's function minimization
 
Sensitivity analysis

The sensitivity analysis module contains sample-based, linearization, and global sensitivity analysis methods that quantitatively measure the importance of each input parameter.

  • Sample-based methods (input/output correlation and standard regression coefficients,
    with their rank-based versions)

  • Linearization (perturbation) method

  • Screening methods (Morris' elementary effects, Cotter indices)

  • Moment-independent global method (Borgonovo indices)

  • Sobol' indices computed by Monte Carlo simulation or analytically
    (from polynomial chaos expansions and low-rank tensor approximations)

  • Generalization of Sobol' indices for dependent input parameters
    (Kucherenko and ANCOVA indices)
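
A minimal sketch of a Monte Carlo-based Sobol' analysis, assuming an INPUT and a MODEL have already been defined in the current UQLab session:

  SobolOpts.Type = 'Sensitivity';
  SobolOpts.Method = 'Sobol';
  SobolOpts.Sobol.Order = 2;          % compute indices up to second order
  SobolOpts.Sobol.SampleSize = 1e4;   % Monte Carlo sample size
  mySobolAnalysis = uq_createAnalysis(SobolOpts);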

Sampling- vs. PCE-based Sobol' indices
 
UQLink

UQLink allows the seamless connection of third-party software to UQLab using universal "wrapping" of external codes through templates and a mark-up system.

  • Based on automated text input file generation

  • Available for all platforms (Windows, Linux, macOS)

  • Examples with simple C/C++ code and commercial finite element software
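
A minimal sketch of wrapping a third-party solver, assuming a hypothetical executable mysolver that reads input.inp and writes results.out, and a user-supplied MATLAB parser read_output.m (see the UQLink user manual for the full set of options):

  ModelOpts.Type = 'UQLink';
  ModelOpts.Command = 'mysolver input.inp';    % command line that runs the external code
  ModelOpts.Template = 'input.inp.tpl';        % templated input file with mark-up for the inputs
  ModelOpts.Output.FileName = 'results.out';   % output file produced by the solver
  ModelOpts.Output.Parser = 'read_output';     % MATLAB function that reads the quantities of interest
  myExternalModel = uq_createModel(ModelOpts);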

 
Support vector machines

Support vector machines (SVM) are machine-learning techniques for building predictive models from data. In the context of uncertainty quantification, SVM for regression (SVR) can be used to build surrogate models of complex simulators from designs of computer experiments, while SVM for classification (SVC) can be used in the context of reliability analysis.

  • L1-SVR and L2-SVR formulation for regression

  • Soft-margin classification

  • Anisotropic and user-defined kernels

  • Leave-one-out error and span approximations

  • Multiple optimization algorithms (grid search, BFGS, cross-entropy, CMA-ES, etc.)
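
A minimal sketch of an SVR surrogate built from an existing experimental design (X, Y); the kernel option shown is indicative and should be checked against the SVR user manual:

  MetaOpts.Type = 'Metamodel';
  MetaOpts.MetaType = 'SVR';
  MetaOpts.ExpDesign.X = X;              % existing design of computer experiments
  MetaOpts.ExpDesign.Y = Y;
  MetaOpts.Kernel.Family = 'Gaussian';   % assumed option name for the kernel family
  mySVR = uq_createModel(MetaOpts);
  Yhat = uq_evalModel(mySVR, Xnew);      % predictions at new points Xnew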

 
Advanced probabilistic modeling tools

In many uncertainty quantification problems, the sources of uncertainty are represented by complex probabilistic models (random vectors). UQLab offers a powerful and extensible set of tools to represent, infer, and sample from complex multivariate distributions.

  • Extensive library of marginal distributions

  • Modeling dependence with Gaussian and Vine copulas

  • Statistical inference of marginals and copulas from data

  • Advanced space-filling sampling strategies, including Monte Carlo sampling, optimized Latin hypercube sampling (LHS), and low-discrepancy sequences (Sobol' and Halton)

  • Sampling enrichment (nested LHS)

  • Support for custom-defined and bounded marginals

  • Isoprobabilistic transform facilities
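
A minimal sketch of a two-dimensional random vector with a Gaussian copula, sampled by Latin hypercube sampling (field names as in the Input module user manual):

  uqlab                                            % initialize the framework
  InputOpts.Marginals(1).Type = 'Gaussian';
  InputOpts.Marginals(1).Moments = [0 1];          % mean and standard deviation
  InputOpts.Marginals(2).Type = 'Lognormal';
  InputOpts.Marginals(2).Moments = [1 0.2];
  InputOpts.Copula.Type = 'Gaussian';
  InputOpts.Copula.Parameters = [1 0.7; 0.7 1];    % correlation matrix of the Gaussian copula
  myInput = uq_createInput(InputOpts);
  X = uq_getSample(1000, 'LHS');                   % 1,000 space-filling samples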

Random vector with Gaussian copula
 
Easy plug-in of MATLAB-based models

Uncertainty quantification aims at quantifying the impact of input parameter uncertainties on the predictions of a computational model. UQLab offers a simple infrastructure to handle analytical models and MATLAB-based computational models (solvers).

  • Built-in support for functions defined as strings, function handles and m-files

  • Intuitive integration of more complex codes
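
As a minimal example, a vectorized analytical model can be declared in a single line from a string (function handles and m-files are handled analogously):

  ModelOpts.mString = 'X(:,1).*sin(X(:,2)) + X(:,3)';   % each row of X is one realization of the inputs
  ModelOpts.isVectorized = true;
  myModel = uq_createModel(ModelOpts);
  Y = uq_evalModel(myModel, X);                         % evaluate the model on a sample X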

Complex model response
 
Polynomial chaos expansions

Polynomial chaos expansions (PCE) are a metamodeling tool that enables the fast construction of surrogate models, which can then be used for efficient moment and sensitivity analysis.

  • Full and sparse polynomial chaos expansions

  • Advanced truncation strategies (hyperbolic norms, max interaction, custom basis specification)

  • Full and sparse Gaussian quadrature (based on Smolyak grids)

  • Ordinary least squares, Least Angle Regression (LARS) and Orthogonal Matching Pursuit regression algorithms

  • Degree-adaptive sparse polynomial chaos expansions

  • Polynomials orthogonal to arbitrary distributions (via Stieltjes construction)
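
A minimal sketch of a degree-adaptive sparse PCE computed with LARS from 100 model evaluations, assuming an INPUT and a MODEL have already been defined:

  MetaOpts.Type = 'Metamodel';
  MetaOpts.MetaType = 'PCE';
  MetaOpts.Method = 'LARS';            % sparse regression ('OLS', 'OMP', 'Quadrature' also available)
  MetaOpts.Degree = 1:10;              % candidate degrees, selected by cross-validation error
  MetaOpts.ExpDesign.NSamples = 100;   % size of the experimental design
  myPCE = uq_createModel(MetaOpts);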

Full vs. Sparse PCE
True vs. PCE response
 
Kriging (Gaussian process modeling)

Gaussian process modeling is a flexible and robust technique to build fast surrogate models based on small experimental designs.

  • Simple, ordinary, and universal Kriging 

  • Highly customizable trend and correlation functions

  • Maximum-likelihood- and cross-validation-based hyperparameter estimation

  • Gradient-based, global, and hybrid optimization methods

  • Interpolation (noise-free response) and regression (noisy response) modes
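
A minimal sketch of an ordinary Kriging model with a Matérn 5/2 correlation and maximum-likelihood hyperparameter estimation, built from an existing experimental design (X, Y):

  MetaOpts.Type = 'Metamodel';
  MetaOpts.MetaType = 'Kriging';
  MetaOpts.Trend.Type = 'ordinary';
  MetaOpts.Corr.Family = 'matern-5_2';
  MetaOpts.EstimMethod = 'ML';                      % or 'CV' for cross-validation
  MetaOpts.ExpDesign.X = X;
  MetaOpts.ExpDesign.Y = Y;
  myKriging = uq_createModel(MetaOpts);
  [Ymu, Yvar] = uq_evalModel(myKriging, Xnew);      % prediction mean and variance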

 
Polynomial Chaos-Kriging (PC-Kriging)

Polynomial Chaos-Kriging combines the global approximation behavior of polynomial chaos expansions with the local accuracy of Kriging to provide a highly accurate surrogate model at low computational cost.

  • Support for sequential and optimal construction of PC-Kriging

  • Full control over both levels of approximation: polynomial chaos expansions and Kriging directly use the corresponding dedicated UQLab modules

  • Support for sparse, adaptive and arbitrary polynomial chaos expansions

  • Gradient-based, global and hybrid optimization methods for Kriging
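
A minimal sketch of a sequential PC-Kriging surrogate; the Mode option shown is indicative ('optimal' construction is also available) and should be checked against the PC-Kriging user manual:

  MetaOpts.Type = 'Metamodel';
  MetaOpts.MetaType = 'PCK';
  MetaOpts.Mode = 'sequential';        % assumed option name for the construction mode
  MetaOpts.ExpDesign.NSamples = 50;
  myPCK = uq_createModel(MetaOpts);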

1D PC-Kriging example
 
Canonical low-rank tensor polynomial approximations

Canonical low-rank approximations (LRA) are a powerful alternative to polynomial chaos expansions, particularly effective in high-dimensional problems.

  • Low-rank basis construction based on orthonormal polynomials

  • Adaptive identification of maximum rank and polynomial degree via cross-validation

  • Alternating least-squares calculation of basis elements and coefficients

  • Polynomials orthogonal to arbitrary distributions (via Stieltjes construction)
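
A minimal sketch of an LRA surrogate with adaptive rank and degree selection; the Rank and Degree option names follow the usual metamodeling pattern but should be verified against the LRA user manual:

  MetaOpts.Type = 'Metamodel';
  MetaOpts.MetaType = 'LRA';
  MetaOpts.Rank = 1:10;                % candidate ranks, selected by cross-validation
  MetaOpts.Degree = 1:10;              % candidate polynomial degrees
  MetaOpts.ExpDesign.NSamples = 200;
  myLRA = uq_createModel(MetaOpts);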

True vs LRA response
 
Reliability analysis (rare event estimation)

When the performance of a system is affected by uncertainties in its characteristics and/or its environment, its reliability can be assessed by computing probabilities of failure.

  • FORM/SORM approximation methods

  • Sampling methods (Monte Carlo, importance sampling, subset simulation)

  • Kriging-based adaptive methods (AK-MCS, APCK-MCS)
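
A minimal sketch of an adaptive Kriging Monte Carlo simulation (AK-MCS) analysis, assuming a limit-state MODEL and an INPUT have already been defined (failure corresponds to a non-positive limit-state value by default):

  AnalysisOpts.Type = 'Reliability';
  AnalysisOpts.Method = 'AKMCS';       % 'FORM', 'SORM', 'MCS', 'IS' and 'Subset' also available
  myReliability = uq_createAnalysis(AnalysisOpts);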

Importance Sampling