
This thesis is a study of the Gaussian Markov Random Field (GMRF) method for random sample generation, together with the Multilevel Monte Carlo (MLMC) method for reducing the computational cost of uncertainty quantification studies. The GMRF method can be used to generate Gaussian fields with a Matérn-type covariance function and reduces the computational requirements for large-scale problems: speedups of as much as 1000 over the standard Cholesky decomposition approach to sample generation are observed, even for relatively small problem sizes. The MLMC method is shown to be at least 6 times faster than the standard Monte Carlo method, and the speedup increases with grid size. For any Monte Carlo-type method, a Krylov subspace solver combined with a suitable preconditioner is almost always recommended for robust sampling.

The GMRF method is implemented in several programming environments in order to evaluate the potential performance gains at varying levels of language abstraction, ranging from the language most commonly used by mathematicians (MATLAB), to a more performance-oriented environment (C++ with PETSc/MPI), to one of the newest programming concepts (ExaStencils); the GMRF method featured in this thesis is also one of the earliest applications to be implemented in ExaStencils. This thesis also studies the ease of implementing these methods at each level of abstraction. The parallel implementation additionally supports single-core execution, in which case it falls back to ordinary serial matrix multiplication, since there is no work to distribute across ranks. In order to overlap communication with computation, MPI provides pairs of functions for performing non-blocking send and receive operations.
