Denoising applied to spectroscopies – Part II: Decreasing computation time
2019; Taylor & Francis; Volume: 55; Issue: 3; Language: English
DOI: 10.1080/05704928.2018.1559851
ISSN: 1520-569X
Authors: Guillaume Laurent, Pierre-Aymeric Gilles, William Woelffel, Virgile Barret-Vivin, Emmanuelle Gouillart, Christian Bonhomme
Topic(s): Spectroscopy Techniques in Biomedical and Chemical Research
Abstract: Spectroscopies are of fundamental importance but can suffer from low sensitivity. Singular value decomposition (SVD) is a powerful mathematical tool that can be combined with low-rank approximation to denoise spectra and increase sensitivity. SVD also underlies data mining through principal component analysis (PCA). In this paper, we focus on optimizing the duration of SVD, which is a time-consuming computation. Both Intel processors (CPUs) and Nvidia graphics cards (GPUs) were benchmarked. A 100-fold speedup was achieved by combining the divide-and-conquer algorithm, the Intel Math Kernel Library (MKL), SSE3 (Streaming SIMD Extensions) hardware instructions, and single precision. In that case, the CPU can outperform a GPU driven by CUDA technology. These results provide a strong foundation for optimizing SVD computation at the user scale.
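The denoising principle summarized in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it builds a synthetic low-rank data matrix (the assumed rank of 3 and noise level are arbitrary choices for the example), computes its SVD with NumPy, and reconstructs a truncated low-rank approximation, which reduces the noise relative to the raw data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectroscopy" data: a few underlying components plus Gaussian noise.
# The dimensions and rank are illustrative assumptions, not from the paper.
n_spectra, n_points, rank = 64, 256, 3
components = rng.standard_normal((rank, n_points))
weights = rng.standard_normal((n_spectra, rank))
clean = weights @ components
noisy = clean + 0.5 * rng.standard_normal(clean.shape)

# SVD followed by low-rank (truncated) reconstruction to denoise the matrix.
U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
denoised = (U[:, :rank] * s[:rank]) @ Vt[:rank]

# The truncated reconstruction should lie closer to the clean signal.
err_noisy = np.linalg.norm(noisy - clean)
err_denoised = np.linalg.norm(denoised - clean)
print(err_denoised < err_noisy)
```

Note that `np.linalg.svd` already calls a LAPACK divide-and-conquer routine (`gesdd`), one of the optimizations benchmarked in the paper; casting the input to `np.float32` before the call corresponds to the single-precision option discussed there.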