High-Performance Computing or HPC
High-performance computing (HPC), also known as intensive computing, lies at the crossroads of major scientific, societal and economic issues. In France, the objective assigned to computing policy is first and foremost to advance science in the service of knowledge, thanks to the possibilities offered by digital simulation, artificial intelligence, the computational processing of massive data and, very soon, quantum computing technologies.
Strategic from a sovereignty perspective, the use of digital simulation and high-performance computing has become indispensable for fundamental and applied research, as well as in a growing number of industrial sectors, in order to foster innovation and reduce the time it takes to design, validate and bring a product to market.
GENCI, in conjunction with the computing centers, meets the needs of the scientific communities. The acquisition, hosting, commissioning and operation of an Exascale machine in France represent a major challenge for French and European research. The bid of the Jules Verne consortium, supported by France, was selected in 2023.
Digital simulation and supercomputers
The third pillar of science alongside theory and experimentation, digital simulation is also a strategic tool for industry, enabling, among other things, shorter design and validation times while accelerating innovation.
Digital simulation aims to predict the behavior of objects that are complex or inaccessible because they are too small, too large, or too far away in space or time. How, for example, can we understand the evolution of the climate, the development of prions, the instabilities of certain supernovae or the formation of the primordial universe without recourse to simulation? Numerical simulation is used in a great many fields of scientific research and engineering, such as climate modeling, biology and health, fluid mechanics, chemical reactions, the synthesis of new materials and many others.
Experiments under real or full-scale conditions are often very costly or risky. One can think of experiments under ultra-high pressure or at ultra-high temperature, but also of cosmetics, whose non-toxicity must now be demonstrated without recourse to animal testing. Simulation has therefore become indispensable for designing and optimizing complex objects or products that interact with living beings.
Digital simulation makes it possible to reproduce "in the laboratory" experiments that are complex, difficult or even impossible, a fortiori when they would be dangerous (the simulation of an industrial incident, for example), costly (aircraft design), long-lasting, dependent on many parameters (climatology) or inaccessible on a human scale (astrophysics). Its development has led to that of another type of "calculating machine": the supercomputer.
In practical terms, digital simulation involves running a computer program on this very special type of computer. A supercomputer is indeed a very large computer, made up of a few thousand servers linked together by very high-speed networks, each server containing dozens of computing units. Supercomputers are used to study the operation and properties of a system or phenomenon and to predict its evolution, for example the resistance of an oil rig to swell or the fatigue of a material subjected to vibrations.
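To make this concrete, here is a purely illustrative sketch in Python (using the mpi4py library; the oscillator model, parameters and file name are invented for the example and do not come from any particular computing center). It distributes a toy study of "a material subjected to vibrations", a damped driven oscillator swept over drive frequencies, across the processes of a parallel machine: each process integrates its own share of the cases and process 0 collects the worst-case response.

    # Illustrative sketch only: a parallel parameter sweep with mpi4py.
    # Run with, for example:  mpirun -n 8 python sweep_sketch.py
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    def max_amplitude(damping, drive_freq, steps=20000, dt=1e-3):
        """Integrate x'' + c*x' + x = sin(w*t) and return the peak |x|."""
        x, v, peak = 0.0, 0.0, 0.0
        for i in range(steps):
            a = np.sin(drive_freq * i * dt) - damping * v - x
            v += a * dt
            x += v * dt
            peak = max(peak, abs(x))
        return peak

    # Each process handles its own slice of the drive-frequency range.
    freqs = np.linspace(0.1, 2.0, 64)
    local = [(f, max_amplitude(0.05, f)) for f in freqs[rank::size]]

    # Process 0 gathers the partial results and reports the worst case.
    results = comm.gather(local, root=0)
    if rank == 0:
        flat = [pair for chunk in results for pair in chunk]
        worst = max(flat, key=lambda p: p[1])
        print(f"strongest response near frequency {worst[0]:.2f} (amplitude {worst[1]:.1f})")

Launched on many more processes, these same few lines would spread the sweep over hundreds or thousands of computing units; dividing the work in this way across many interconnected servers is precisely what a supercomputer is designed for.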
Definitions
High-Performance Computing or HPC
Strictly speaking, the term High-Performance Computing (HPC) refers to computational activities performed on a supercomputer. It is a field of computing concerned with the use of systems specifically designed to solve complex, computationally demanding problems. These systems are built to perform operations at very high speed, in particular for numerical simulation and for the pre-training of artificial intelligence models.
The appeal of HPC for scientific research lies in the sheer volume of data it can process and its ability to carry out complex simulations in a short space of time. Thanks to HPC, researchers can speed up their work, test hypotheses rapidly and obtain more accurate results. A genuine lever for accelerating scientific activity, HPC paves the way for new discoveries.
By using supercomputers based on advanced parallel architectures, scientists can model natural phenomena, simulate virtual experiments, analyze genetic sequences, study the behavior of materials at the atomic scale, or carry out highly complex multi-domain, multi-scale simulations, without having to resort to experiments that would be too costly, inaccessible, destructive, polluting or risky.
Artificial intelligence
Artificial intelligence (AI) refers to the simulation of human cognitive processes by machines, particularly computers. AI aims to create systems capable of performing tasks that normally require human intelligence. These tasks can include problem solving, learning, pattern recognition, natural language understanding, decision making, and many others.
AI encompasses a variety of techniques and approaches, including:
- Machine learning
- Deep learning
- Natural language processing (NLP)
- Computer vision
- Planning and decision-making
- Expert systems
The convergence of artificial intelligence (AI) and high-performance computing (HPC) has become increasingly important. It enables the computing power of supercomputers to be fully exploited to accelerate the development and application of AI in a wide range of fields, from scientific research to business and security. It offers considerable advantages in terms of efficiency, performance and innovation.
HPC tools can act as accelerators for AI projects through the computing power they provide, for example in the processing of massive data, in the machine learning process, and in the training of deep neural networks. Conversely, AI can be used to extract meaningful information from large volumes of data and from complex simulations, to optimize simulation models, and to automate tasks such as resource management, job scheduling and performance monitoring.
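As an illustration of the first direction (HPC accelerating the training of AI models), the following minimal sketch uses PyTorch's DistributedDataParallel; the network, data and hyperparameters are invented for the example, but the pattern shown, one model replica per computing unit with gradients averaged over the interconnect at every step, is the usual way large neural networks are trained on supercomputers.

    # Illustrative data-parallel training sketch (synthetic model and data).
    # Launch with, for example:  torchrun --nproc_per_node=4 train_sketch.py
    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    dist.init_process_group(backend="gloo")   # "nccl" on GPU nodes
    rank = dist.get_rank()

    # A deliberately small model; real workloads use far larger networks.
    model = torch.nn.Sequential(
        torch.nn.Linear(16, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1))
    ddp_model = DDP(model)
    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    loss_fn = torch.nn.MSELoss()

    # Each process trains on its own shard of (synthetic) data; on a real
    # machine it would read its slice of a massive dataset from the parallel
    # file system.
    torch.manual_seed(rank)
    inputs, targets = torch.randn(256, 16), torch.randn(256, 1)

    for step in range(100):
        optimizer.zero_grad()
        loss = loss_fn(ddp_model(inputs), targets)
        loss.backward()      # gradients are averaged across all processes here
        optimizer.step()
        if rank == 0 and step % 20 == 0:
            print(f"step {step:3d}  loss {loss.item():.4f}")

    dist.destroy_process_group()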
Quantum computing
Quantum computing can be defined as a programmable quantum mechanical experiment given computational meaning. It should make it possible to solve certain problems (e.g. quantum chemistry, combinatorial optimization) whose complexity is beyond the reach of the most powerful supercomputers. In the coming decades, quantum computing could contribute to decisive technological revolutions in many fields, such as the development of new medical treatments, the modeling of infectious agents, the capture of solar energy or CO2, the development of new materials, or the optimization of transport (maritime, air and road).
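To give a very small taste of what "a programmable quantum mechanical experiment given computational meaning" can look like, the sketch below uses ordinary linear algebra in Python/NumPy to simulate the preparation of a two-qubit entangled (Bell) state; on real quantum hardware the same two gate operations would be physical manipulations of qubits rather than matrix products.

    import numpy as np

    # Quantum gates as unitary matrices: Hadamard, identity and CNOT.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    # Start in |00>, put the first qubit in superposition, then entangle the pair.
    state = np.zeros(4)
    state[0] = 1.0
    state = np.kron(H, I) @ state   # Hadamard on the first qubit
    state = CNOT @ state            # CNOT controlled by the first qubit

    # Measurement probabilities: 50% |00> and 50% |11>, the signature of entanglement.
    for basis, amplitude in zip(["00", "01", "10", "11"], state):
        print(f"|{basis}>  probability {abs(amplitude) ** 2:.2f}")

A classical simulation like this one needs a vector of 2^n amplitudes for n qubits, which is exactly why even the most powerful supercomputers are overwhelmed beyond a few dozen qubits, and why a quantum computer could go further.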
Major applications
High-Performance Computing is indispensable today in a great many academic and industrial fields:
- environment and climate for weather forecasting or the assessment of natural risks such as cyclones or tsunamis;
- automotive for modeling crash tests, engine combustion to reduce consumption and pollution, or the aerodynamic design of new vehicles;
- aeronautics and space to reduce design and validation times for certain components;
- chemistry, medicine and biology to develop highly targeted drugs;
- materials physics to qualify new concepts or measure their strength;
- energy to optimize oil prospecting or design tomorrow's facilities;
- finance to assess risks on certain complex products;
- multimedia to develop 3D sequences;
- decision support in times of crisis, whether natural disasters (tsunami, earthquake, flood, drought, etc.) or biological, nuclear or chemical threats.