The use of 3D imaging is growing rapidly in many fields, such as the remote sensing of environments, where 3D geometries are now acquired directly and at ever finer levels of accuracy. Coupled with in-situ observations, it produces data so voluminous, and so hard to manage with conventional computers, that representing them compactly and in an exploitable form becomes essential. This is where high-performance computing plays a key role: it makes it possible to reduce this so-called massive data through adapted meshes, in particular for running realistic numerical simulations.
This is the objective of the Real2HPC project, led by Hugues Digonnet, a researcher at the Institut de calcul intensif of the Ecole Centrale de Nantes. Started in 2015, the project aims to design a tool for building a mesh capable of representing real massive data, in 2D or 3D, exactly and in a reusable format.
But designing such a tool is far from as simple as it seems: "Our mesh has to be able to adapt itself, before the calculation, to the configurations it will handle, and then to distribute them dynamically across the computing cores in a way that optimizes computation time. We also need to push the calculation methods further to adapt them to the volume of data and, finally, to take post-processing into account, especially graphics processing," Hugues Digonnet explains.
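The two ideas in the quote, refining a mesh where the solution demands it and then balancing the resulting cells across cores, can be illustrated with a toy sketch. This is not the Real2HPC code: the 1D intervals, the error estimator, the tolerance and the per-cell cost model below are all invented for illustration.

```python
# Toy sketch (not the Real2HPC code): error-driven mesh refinement
# followed by a greedy redistribution of cells across compute cores.

def refine(cells, error, tol):
    """Split any interval whose error estimate exceeds tol."""
    out = []
    for (a, b) in cells:
        if error(a, b) > tol:
            m = 0.5 * (a + b)
            out.extend([(a, m), (m, b)])   # refine: replace cell by two halves
        else:
            out.append((a, b))             # keep the cell as-is
    return out

def partition(cells, n_cores, cost=lambda c: c[1] - c[0]):
    """Greedy load balancing: assign each cell to the least-loaded core."""
    loads = [0.0] * n_cores
    buckets = [[] for _ in range(n_cores)]
    for c in sorted(cells, key=cost, reverse=True):  # biggest cells first
        i = loads.index(min(loads))
        buckets[i].append(c)
        loads[i] += cost(c)
    return buckets, loads

# Invented error estimator: pretend the solution varies sharply near x = 0,
# so refinement concentrates cells there.
err = lambda a, b: (b - a) / (1e-3 + abs(a))

cells = [(i / 8, (i + 1) / 8) for i in range(8)]   # uniform mesh on [0, 1]
for _ in range(3):                                 # a few refinement passes
    cells = refine(cells, err, tol=2.0)

buckets, loads = partition(cells, n_cores=4)
print(len(cells), [len(b) for b in buckets], max(loads) - min(loads))
```

The sketch shows the general shape of the problem: each refinement pass changes the cell distribution, so the partitioning has to be redone dynamically, which at HPC scale is the hard part the project addresses.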
These large-scale computations required one million core hours on GENCI's resources Curie, Turing and Occigen. They focused in particular on computations over urban environments, a first step towards simulating dynamic flows, heat transfer or even the dispersion of pollutants.
3D reconstruction of a neighbourhood of Nantes in 1900, obtained by scanning an existing model at the Château des Ducs de Bretagne and produced by the Institut de recherche en sciences et techniques de la ville (IRSTV)
© Ecole Centrale de Nantes