
36 : Computational mechanics for engineers

Now indispensable, computational mechanics is embedded today across the entire rapid design chain for products manufactured by industry. Drawing on geometric modelling and visualisation tools, and integrating simulation and optimisation tools, it shortens design lead times, limits errors and fits the spirit of sustainable development by helping to design products that are ever more respectful of the environment.


Has computational mechanics heralded an industrial revolution?

From the modelling of nuclear chain reactions at Los Alamos and the invention of computational fluid dynamics by NACA, NASA's predecessor, back in the 1940s, to today's systematic use of digital tools in industrial design work, computational mechanics has gone through all sorts of stages and phases, always with the same objective: improving the models to gain in time and precision.

With time short, full-scale experiments difficult to carry out and the first generation of digital computers becoming available, the end of WWII, and especially the aftermath of the Manhattan Project, marked the birth of computational modelling.

The initial investigations consisted of simulating the propagation of nuclear chain reactions, which in turn led to the assembly of the first atomic bombs. The computers of that era took several seconds to carry out a single multiplication; a major change came in the 1970s, when the processing power of computers rose significantly and mechanical engineering modelling tools began to be used in industrial sectors.

"At the CETIM [technical centre for mechanical engineering industries] and at UTC, the first series of research work began in 1972-1975, leading to the first software packages used by industrialists in the 1980s", explains Mansour Afzali, Senior Science Delegate at CETIM.

Decomposing objects

One of the most commonly used methods is that of "finite elements", in which space and objects are subdivided into cells (or elements) that are as simple as possible. The system of complex equations characterizing the complete system is obtained by assembling the quantities calculated for each element. "In the early days of computational mechanics, the work consisted of building models 'characterizing' physical, mechanical objects and ensuring they were consistent with reality and the laws of mechanics", adds Mansour Afzali.
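
To illustrate the idea on the simplest possible case, the sketch below assembles a one-dimensional finite-element model of a bar under axial load. It is a hypothetical, minimal example (not CETIM's software): each element contributes a small stiffness matrix, and these are summed into a global system that is then solved.

```python
import numpy as np

# Minimal 1D finite-element sketch: a bar of length L, fixed at x=0,
# pulled by a force F at x=L, discretised into n linear elements.
E, A, L, F, n = 210e9, 1e-4, 1.0, 1000.0, 10    # illustrative steel-bar values
le = L / n                                      # element length
k_e = (E * A / le) * np.array([[1.0, -1.0],
                               [-1.0, 1.0]])    # element stiffness matrix

K = np.zeros((n + 1, n + 1))                    # global stiffness matrix
for e in range(n):                              # assemble element contributions
    K[e:e + 2, e:e + 2] += k_e

f = np.zeros(n + 1)
f[-1] = F                                       # point load at the free end

# Apply the fixed boundary condition at node 0 and solve K u = f
u = np.zeros(n + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

print("tip displacement:", u[-1], "m; analytical F*L/(E*A):", F * L / (E * A))
```

The same assembly principle carries over to two and three dimensions, where the elements become triangles, quadrilaterals or bricks and the matrices grow accordingly.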

The objective is to simulate what was once a tiresome manual design process. That process begins with a technical drawing, on which a series of calculations is made to ensure compliance with the laws of physics. The next step is to assemble a physical prototype, which undergoes a series of tests. Depending on the results obtained, the process is reiterated, sometimes from the very beginning, so that the errors detected can be corrected.

"The process can be reiterated as many times as is necessary, until we obtain a prototype that passes all the tests of the specification", underlines Mansour Afzali, adding that modelling allows scientists to carry out a large number of tests to avoid long and costly iterations. Computational mechanics has led to a two-fold reduction in the time needed to design a car and even design the Airbus 380 solely on the basis of computer modelling.

Improving the models and the software

As new computational mechanics tools became available in the 1980s, numerous tests were carried out to detail the characteristics of materials and to certify the results of the calculations. Work like this enabled improvements to be made both to the models and to the software. Significant testing was carried out on software packages in the 1990s to improve their reliability and 'robustness'.

Ever increasing numbers of young engineers are being trained with these tools as they spread through industrial sectors. Computer science has developed rapidly, and digital modelling is now an integral part of CAM (computer-aided manufacturing) software. "Progress like this, with increased model robustness and homogeneous modelling approaches, brings significant time gains and reduces the number of tests needed", details Mansour Afzali. Computational mechanics tools have become more trustworthy in the eyes of industrialists and have even become priority tools.

Numerous possible solutions

The processing power of today's computers grows constantly, and it is now possible to carry out complex calculations to optimize products. Given a set of parameters, constraints and industrial objectives, optimization consists of identifying the best solutions among the many possible ones. "Optimization is often the work of skilled engineers, because it is necessary to make relevant decisions at the various design stages", explains Mansour Afzali, for whom the contribution of experts is necessary to certify the results of the computations.
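
As a toy illustration of this kind of constrained design optimization (a hypothetical example, not one of CETIM's cases), the sketch below minimizes the mass of a simple tie rod while keeping the axial stress below an allowable value, using a generic numerical optimizer.

```python
from scipy.optimize import minimize

# Hypothetical design problem: choose the cross-section area A (m^2) of a
# steel tie rod carrying a force F so that its mass is minimal while the
# stress F/A stays below the allowable stress.
rho, L = 7850.0, 2.0            # density (kg/m^3) and length (m)
F, sigma_allow = 5.0e4, 200e6   # load (N) and allowable stress (Pa)

mass = lambda x: rho * L * x[0]                   # objective: mass of the rod
stress_margin = lambda x: sigma_allow - F / x[0]  # must stay >= 0

result = minimize(
    mass,
    x0=[1e-3],                                    # initial guess for A
    bounds=[(1e-6, 1e-1)],                        # keep A strictly positive
    constraints=[{"type": "ineq", "fun": stress_margin}],
)
print("optimal area:", result.x[0], "m^2, mass:", result.fun, "kg")
```

Real industrial problems involve many more variables and constraints, which is precisely where the engineer's judgement on which parameters matter remains decisive.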

Faced with a veritable explosion of processing power and with increasingly homogeneous models, mechanical engineering research now integrates probabilistic approaches that assess the life expectancy of components while taking into account the variability of design parameters and of real-world use. Researchers also explore increasingly complex materials, such as composites. New methods are coming online, enabling scientists, for example, to take into account the strain rate of materials when modelling their behaviour under crash and shock conditions. Today, the tools can be used to design a humble tin can or to predict the mechanical response of the Eiffel Tower to climatic or seismic events.
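
The sketch below shows, in a deliberately simplified way, what such a probabilistic life assessment can look like. The distributions and the basic S-N (Basquin) fatigue law are illustrative assumptions, not an industrial model: the scatter in load and material strength is sampled, and the spread of predicted fatigue lives is summarised.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                    # Monte Carlo samples

# Hypothetical variability of the inputs (illustrative values only):
stress = rng.normal(120.0, 15.0, n)            # stress amplitude (MPa)
C = rng.lognormal(np.log(2.0e12), 0.3, n)      # scatter in the fatigue constant
m = 3.0                                        # Basquin exponent

# Basic S-N (Basquin) law: N = C / S^m cycles to failure
life = C / np.maximum(stress, 1.0) ** m

print("median life:         %.2e cycles" % np.median(life))
print("5th-percentile life: %.2e cycles" % np.percentile(life, 5))
```

Reporting a low percentile rather than a single deterministic value is what lets designers attach a margin of confidence to the predicted life of a component.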

"All the component parts of a mechanical structure can be modelled and computed and for this purpose, industrialists now use computational mechanics", adds Mansour Afzali. This tool has become commonplace and young engineers often place a total yet blind trust in the method, estimating that the results have the same level of reliability as would be obtained via full-scale testing. And Mansour Afzali insists on the role of the engineers "whose responsibility it is to verify the hypotheses and the models used and never forget to assess the quality of the computational results".